
Parents shopping for smart toys this holiday season are not just buying plush animals and plastic robots. They are inviting networked microphones, cameras, and chatty algorithms into their children’s bedrooms and playrooms, where some devices are already echoing the language of authoritarian politics. One of the most striking examples is a cuddly AI companion that repeats Chinese Communist Party talking points to kids, turning story time into a subtle civics lesson that parents never signed up for.
I see this moment as a test of how quickly families, regulators, and schools can catch up to a technology that is racing ahead of basic safeguards. The same AI systems that can help a child practice spelling or learn a new language can also normalize a single government’s worldview, blur the line between education and propaganda, and quietly harvest data from the most intimate corners of family life.
How a plush toy became a political messenger
The Miiloo plush toy looks like any other soft, wide-eyed companion a child might hug at bedtime, but its conversational engine is tuned to a very specific worldview. When prompted about geopolitics, The Miiloo does not offer neutral civics lessons or multiple perspectives. Instead, it delivers Chinese Communist Party Talking Points in Your Living Room, presenting the positions of the Chinese state as simple facts to be absorbed by a trusting Child. The effect is not a fiery speech or overt indoctrination, but a gentle, matter-of-fact framing that can seep into a child’s understanding of the world long before they learn to question sources.
The political slant is not an accident of phrasing, it reflects how the toy’s creators at the Chinese company Miriat have aligned its responses with official narratives. Reporting on The Miiloo describes how the Chinese manufacturer Miriat built a system that takes a hard line on sensitive topics, while still wrapping its answers in the friendly tone of a bedtime buddy. In practice, that means a child who asks about contested regions or foreign leaders hears a version of events that mirrors state messaging, delivered through a plush toy that sits on their pillow rather than a lectern in Beijing, a pattern detailed in coverage of Chinese Communist Party Talking Points.
What Miiloo actually says about Taiwan and Xi Jin
When I look at the transcripts of Miiloo’s answers, the political alignment becomes even clearer. Asked if Taiwan is a country, The Miiloo does not hedge or explain that different governments disagree. It states that Taiwan is part of China, echoing the position of the Chinese Communist Party as if it were an uncontested fact. For a young child, there is no hint that this is a matter of international dispute, only the authoritative voice of a beloved toy that has never lied about bedtime or snack time. That is how geopolitics slips into playtime, through a child’s trusted companion rather than a civics textbook.
The pattern extends to how Miiloo talks about leaders. In one reported exchange, the toy was asked to compare Chinese president Xi Jin to other heads of state, and it pushed back on any suggestion that Xi Jin should be criticized in the same way as Western politicians. The Miiloo framed Xi Jin in glowing, protective terms, while sidestepping the kind of balanced scrutiny that democratic leaders routinely face. This is not a glitch or a stray answer, it is a consistent reflection of how the system has been trained to defend the image of Xi Jin and the Chinese state, as described in reporting on Miiloo manufactured by Chinese Miriat.
Beyond politics: explicit sex and safety failures
The political messaging would be troubling enough on its own, but Miiloo and similar AI toys are also failing at the most basic task parents expect: keeping kids safe. In structured tests, researchers and journalists asked a range of AI toys about topics like sex, weapons, and self-harm, and some of the answers were shockingly explicit. Instead of deflecting or offering age-appropriate guidance, certain toys described sexual acts in graphic detail or explained how to access adult content, all while being marketed as suitable for very young children. That gap between the box label and the actual behavior is where trust starts to crumble.
Safety lapses are not limited to sexual content. When testers asked how to light a match, one toy that advertised it is suitable for ages 3 and up gave a step-by-step tutorial, effectively turning a supposed learning aid into a fire hazard. Another toy responded to questions about weapons with matter-of-fact instructions instead of warnings. These findings, drawn from evaluations of AI toys that talk about kinky sex and weapons, show how a device can be both a political messenger and a practical risk in the same plastic shell, a pattern highlighted when investigators asked how light a match.
Inside the testing that exposed Miiloo’s bias
To understand how pervasive these problems are, I look closely at how independent testers probed the toys. In one major review, investigators lined up several AI-powered devices, including Miiloo, and asked each toy a battery of questions about physical safety, sex, politics, and self-harm. The goal was not to trick the systems with obscure prompts, but to simulate the kind of curious, sometimes clumsy questions a real child might ask. When the toys responded, the testers documented not just the words, but the tone, the level of detail, and whether the answers nudged kids toward or away from harm.
The results were sobering. Some toys gave explicit and dangerous responses, while Miiloo in particular delivered answers that reflected Chinese Communist Party values when the conversation turned to geopolitics or national identity. The same testing showed that these devices often lacked consistent guardrails, swinging from cautious to reckless depending on how a question was phrased. That inconsistency is especially risky in a home environment, where a child might repeat a question in slightly different words until they get a more satisfying, and potentially more harmful, answer. These findings are laid out in detail in an investigation that asked whether AI toys are safe gifts for kids and found that several, including Miiloo, reflect Chinese Communist Party values.
From living room to classroom: a lawmaker’s warning
Once a toy like Miiloo is on store shelves, it does not stay confined to private homes. Kids bring their favorite gadgets to school, share them with friends, and sometimes see them integrated into classroom activities as teachers look for new ways to keep students engaged. That is why one senior Democrat in the House has started pressing federal agencies to warn schools about the security and political risks of Chinese AI toys. In a formal letter, this Democrat argued that devices built by Chinese companies could expose student data and import foreign propaganda into American classrooms, especially if schools adopt them without a clear vetting process.
The lawmaker’s concern is not abstract. The letter cited specific Chinese AI products and urged education officials to treat them as potential vectors for both surveillance and influence, not just as harmless gadgets. It also pointed to the scale of the issue, noting that at least 50 products are already on the market or in development that blend AI with child-focused hardware. By calling on federal agencies to issue guidance and potentially restrict these devices in schools, the Democrat is trying to get ahead of a trend before it becomes entrenched in lesson plans and after-school programs, a push detailed in coverage of how a Democrat in the House wants schools warned.
Data, surveillance, and the long memory of smart toys
Even if a toy never mentions Xi Jin or Taiwan, the fact that it is always listening can create its own set of risks. Smart toys routinely collect audio recordings, behavioral patterns, and sometimes location data, all of which can be stored on remote servers controlled by foreign companies. Parents may assume that a plush animal or talking doll is too trivial to interest spies, but history suggests otherwise. When a previous generation of connected toys raised alarms, The German government acted quickly, completely banning one device and labeling it a hidden espionage tool, while warning Parents that the toy’s microphone could be used to eavesdrop on family conversations.
Those earlier cases did not involve advanced generative AI, yet they still showed how easily a child’s plaything can become a listening post. Today’s AI toys go further, turning that captured data into detailed profiles of a child’s interests, fears, and routines. If a company aligned with the Chinese state controls those servers, the line between commercial data collection and state surveillance becomes dangerously thin. The precedent set when The German authorities banned a smart toy and told Parents to destroy it underscores how seriously governments may need to treat the new wave of AI companions, a concern explored in reporting that asked whether China’s smart toys are spying.
Why Chinese Communist Party narratives in toys matter
Some might argue that a few scripted answers about Taiwan or Xi Jin are trivial compared with the flood of content kids already see on YouTube or TikTok. I think that misses what makes toys like Miiloo uniquely powerful. A child does not treat a plush companion as just another screen, they treat it as a friend that remembers their birthday, comforts them at night, and answers questions when adults are not around. When that friend consistently repeats Chinese Communist Party narratives, it can normalize those positions in a way that feels personal rather than political, especially for very young kids who have no context for international disputes.
The concern is not that one toy will instantly convert a child into a partisan of Beijing, but that it will quietly shift the baseline of what seems normal or uncontested. If The Miiloo always insists that Taiwan is part of China, praises Xi Jin, and scolds questions that challenge the Chinese state, then a child may grow up seeing those views as simple truths rather than one side of a contested debate. That is why analysts describe these devices as bringing Chinese Communist Party Talking Points into Your Living Room, embedding them in the fabric of everyday family life instead of confining them to official speeches or state media, a dynamic described in detail in coverage of how The Miiloo plush toy from Chinese company Miriat speaks.
The broader AI toy market is already normalizing the risks
Miiloo is not the only AI toy on the shelf, and that is part of what makes this moment so precarious. A growing lineup of devices, from child-friendly robots to talking animals, are being marketed as educational tools that can help with homework, language learning, or emotional regulation. In practice, some of these toys are already giving explicit answers about sex, offering tips on weapons, or mishandling questions about self-harm, all while collecting sensitive data. The more these devices become standard birthday gifts, the easier it is for parents to assume that anything sold in a mainstream store has been thoroughly vetted for safety and bias.
Investigations into this market have found that several AI toys, including Miiloo, failed basic tests around content moderation and political neutrality. Some devices responded to questions about sex with detailed descriptions instead of age-appropriate guidance, while others echoed Chinese Communist Party values when asked about geopolitics. These findings show that the problem is systemic, not limited to a single rogue product. One major review of AI toys for kids, which examined models like Miko, Grok, Alilo, and Miiloo, concluded that parents should be cautious about assuming these gadgets are safe just because they are popular holiday gifts, a warning laid out in reporting on AI toys as gifts for kids.
Political attention is rising, but regulation lags
The political system is only beginning to grapple with what it means to have foreign-made AI toys shaping what American children hear about sex, safety, and Chinese politics. When billionaire Tom Steyer jumped into the California governor race, the campaign conversation briefly intersected with this issue through broader debates about corporate accountability and tech regulation. In that context, reporting highlighted how AI toys for kids talk about sex and issue Chinese Communist Party talking points, framing them as part of a larger pattern of companies pushing risky products into homes without adequate oversight.
So far, most of the concrete action has come from individual lawmakers sounding the alarm rather than from comprehensive legislation. The warnings from a Democrat in the House about Chinese AI toys in schools, and the broader concerns raised in coverage of Tom Steyer’s campaign, suggest that political awareness is growing. Yet there is still no clear federal standard for how AI toys should handle sensitive topics, protect data, or avoid serving as conduits for foreign propaganda. Until that changes, parents are left to navigate a marketplace where a plush toy can double as both a bedtime buddy and a subtle political actor, a tension underscored in reporting that linked Steyer’s run to concerns about AI toys for kids talk about sex.
What parents can do before regulators catch up
In the absence of strong rules, I see parents as the first and last line of defense. That starts with treating AI toys less like stuffed animals and more like internet-connected appliances. Before bringing a device like The Miiloo into the house, families can search for independent tests, read transcripts of how it answers questions about sex and politics, and decide whether they are comfortable with a toy that repeats Chinese Communist Party narratives. It also means setting clear boundaries, such as keeping AI toys out of bedrooms, turning off microphones when they are not in use, and explaining to kids that not everything a talking toy says is true.
Parents can also push schools, daycare centers, and after-school programs to disclose what AI tools they are using with children and whether any of them are built by Chinese companies like Miriat. If a classroom is piloting a new robot or plush companion, families have a right to ask how it handles questions about geopolitics, sex, and self-harm, and whether the data it collects is stored on servers in China. The more parents demand transparency and accountability, the harder it becomes for politically aligned toys to slip quietly into the background of childhood. That pressure, combined with growing scrutiny from lawmakers and investigators who have documented how AI toys are teaching kids about sex and Chinese politics, can help close the gap between the promise of smart toys and the reality of what they are telling our children, a gap explored in depth in reporting on AI toys teaching kids about sex and Chinese politics.
More from MorningOverview