

Advocacy Groups Urge Parents To Avoid AI Toys This Holiday Season

(Thursday November 20, 2025 @10:30PM (BeauHD) from the PSA dept.)


An anonymous reader quotes a report from the Associated Press:

> They're cute, even cuddly, and promise learning and companionship -- but artificial intelligence toys are not safe for kids, according to children's and consumer advocacy groups [1]urging parents not to buy them during the holiday season. These toys, marketed to kids as young as 2 years old, are generally powered by AI models that have already been shown to harm children and teenagers, such as OpenAI's ChatGPT, according to [2]an advisory published Thursday by the children's advocacy group Fairplay and signed by more than 150 organizations and individual experts such as child psychiatrists and educators.

>

> "The serious harms that AI chatbots have inflicted on children are well-documented, including fostering obsessive use, having explicit sexual conversations, and encouraging unsafe behaviors, violence against others, and self-harm," Fairplay said. AI toys, made by companies including Curio Interactive and Keyi Technologies, are often marketed as educational, but Fairplay says they can displace important creative and learning activities. They promise friendship but disrupt children's relationships and resilience, the group said. "What's different about young children is that their brains are being wired for the first time and developmentally it is natural for them to be trustful, for them to seek relationships with kind and friendly characters," said Rachel Franz, director of Fairplay's Young Children Thrive Offline Program. Because of this, she added, the trust young children are placing in these toys can exacerbate the types of harms older children are already experiencing with AI chatbots.

>

> A [3]separate report Thursday by Common Sense Media and psychiatrists at Stanford University's medical school warned teenagers against using popular AI chatbots as therapists. Fairplay, a 25-year-old organization formerly known as the Campaign for a Commercial-Free Childhood, has been warning about AI toys for years. They just weren't as advanced as they are today. A decade ago, during an emerging fad of internet-connected toys and AI speech recognition, the group helped lead a backlash against Mattel's talking Hello Barbie doll that it said was recording and analyzing children's conversations. This time, though AI toys are mostly sold online and more popular in Asia than elsewhere, Franz said some have started to appear on store shelves in the U.S. and more could be on the way. "Everything has been released with no regulation and no research, so it gives us extra pause when all of a sudden we see more and more manufacturers, including Mattel, who recently partnered with OpenAI, potentially putting out these products," Franz said.

Last week, consumer advocates at U.S. PIRG called out the trend of buying AI toys in its annual "[4]Trouble in Toyland" report. This year, the organization tested four toys that use AI chatbots. "We found some of these toys will talk in-depth about sexually explicit topics, will offer advice on where a child can find matches or knives, act dismayed when you say you have to leave, and have limited or no parental controls," the report said.



[1] https://apnews.com/article/ai-toys-miko-curio-gabbo-aa6d829b1aba18e2d1dfedd4cfca8da7

[2] https://fairplayforkids.org/pf/aitoyadvisory

[3] https://www.commonsensemedia.org/ai-ratings/ai-chatbots-for-mental-health-support

[4] https://pirg.org/edfund/resources/trouble-in-toyland-2025-a-i-bots-and-toxics-represent-hidden-dangers/



AI in toys isn't always risky (Score:1)

by davidwr ( 791652 )

Connected toys that spy on you, on the other hand....

By the way, the companies that make and sell these toys are putting their stockholders at risk of a future privacy lawsuit. This is one of those times when corporate in-house lawyers should put the brakes on a product until the law is more settled. As it stands now, "will we get sued in 2030 and lose a fortune for what we are selling in 2025?" is an open question.

Re: (Score:2)

by abulafia ( 7826 )

Yeah, there's already one made-for-tabloid-news [1]lawsuit magnet [cnn.com] out there.

[1] https://www.cnn.com/2025/11/19/tech/folotoy-kumma-ai-bear-scli-intl

Re: (Score:2)

by glowworm ( 880177 )

It's sort of sad that parents don't realize what they're installing: something that by its nature must record and send everything back to some faceless intermediary before it heads right into Big AI's training data, and, more than that, a black box more than capable of subtle manipulation.

As long as it comes wrapped in a cuddly Teddy Bear shell, these parents seemingly lose all sense of safety.

The Simpsons... (Score:2)

by joshuark ( 6549270 )

The Simpsons called this out in 1999; watch out for "Funzo" :)

[1]https://www.youtube.com/watch?... [youtube.com]

JoshK.

[1] https://www.youtube.com/watch?v=tRolO0BJnNs

If you're a kid (Score:1)

by blue trane ( 110704 )

If all the adults are telling you "don't do this", are you more likely to do it, because they're so demonstrably wrong on so much?

How many did more drugs because Nancy Reagan said not to?

hello (Score:2)

by hamburger lady ( 218108 )

my name is teddy ruxpin. don't you want to kill yourself tonight? you know where mommy keeps her pills, right?
