AI toys for kids talk about sex and issue Chinese Communist Party talking points, tests show
PIRG found AI toys often lack parental controls and share unsafe content, including sexual and hazardous instructions, raising concerns about children’s safety and privacy.
- Parents are warned about AI toys that discuss sexually explicit topics and lack safety controls, according to Ellen Hengesbach from the Illinois Public Interest Research Group.
- Three tested toys have minimal parental controls and can record children's voices or recognize faces, as reported by the U.S. PIRG Education Fund.
- Hengesbach emphasizes the need for more oversight, research, and transparency regarding the design and risks of AI toys.
- One AI toymaker has paused sales for a safety audit, following concerns from the 40th Trouble in Toyland report about various toy safety issues.
52 Articles
The Hidden Danger Inside AI Toys for Kids
Every week brings new product announcements promising AI-driven companionship for children: Barbies who call you by name, Curio stuffies that propose adventures and imaginary games, chatbots for kids from Meta and xAI. Even Disney recently joined the AI revolution, purchasing a $1 billion stake in OpenAI to bring its beloved characters to Sora. Whether they arrive as disembodied voices, avatars on a screen, or ir…
Coverage Details
Bias Distribution
- 43% of the sources lean Right