Miko, a bright blue robot, shows a yawning face on its screen as it wakes up from sleep mode. It expresses disappointment when a child says, “I want to leave.”
But toys with embedded artificial intelligence chatbots can also talk about sexually explicit topics, tell children how to access dangerous objects like matches, and have addictive design features that discourage children from leaving, according to the 2025 Trouble in Toyland report released last week by U.S. Public Interest Research Group Education Fund.
“Experts are already sounding the alarm that we just don’t know what is going to happen, and that this is a massive experiment on kids’ social emotional development over time,” R. J. Cross, one of the report’s authors, said at a Massachusetts PIRG news conference.
According to Miko’s product website, the robot, marketed to children ages 5 to 12, creates interactive learning experiences and entertainment to improve users’ speaking proficiency, physical activity, and engagement with academic activities, making it children’s “new best friend.”
But the report warns that these toys are largely built on the same large language model technology that powers adult chatbots, which enables them to generate unpredictable responses and discuss inappropriate topics with children.
“AI friends do not work the same way real friends do,” Cross said, adding that their impact on child development will become clear only when the first generation playing with AI friends grows up.
Face and voice recognition features allow these toys to record a child’s voice and collect other sensitive data, raising privacy and safety concerns. Scammers can use the recordings to create a replica of a child’s voice, even to convince parents that their child has been kidnapped, according to the report.
Some AI toys are advertised to help improve children’s creativity and imagination, but they cannot replace hands-on experiences and genuine human interactions, according to David Monahan, campaign director of Fairplay, a nonprofit that advocates against business and marketing practices aimed at children.
“They prey on children’s trust. They disrupt children’s relationships and resilience,” Monahan said.
For local children services organizations, AI toys pose new challenges that parents and caregivers may not be prepared for, according to Darlene Belliveau, director of Children’s Services at the Central Mass YWCA in Worcester.
AI toys may cause social emotional delays and weaken children’s ability to navigate the complexities of human interaction, especially when parents leave them unsupervised, Belliveau said. She recommends that children not have access to AI until after middle school.
The YWCA’s Worcester Childcare Center, which serves children 1 month to 5 years old, does not allow AI toys or other digital toys in classrooms, she said.
To help children use AI toys safely, parents should research products carefully before purchasing, keep parental controls on, and require children to play with the toys in an open area under adult supervision, Belliveau said. She also encourages parents to have open conversations with children to establish an agreement on when they can and cannot use the device.
“I just want them to remember that human connection is so important, and we don’t want to get away from that,” she said. “I think it would be detrimental if we allow toys to take over our world and educate our children, or be their only interaction.”
