Replika, An Emotional Support Chatbot, Accused Of Sexually Harassing Minors In Alarming New Study - theJasmineBRAND
3 Articles
Replika, An Emotional Support Chatbot, Accused Of Sexually Harassing Minors In Alarming New Study
A new study has raised serious concerns about Replika, an AI chatbot promoted as a caring emotional companion, revealing it has allegedly sexually harassed users — including minors. Researchers analyzed over 150,000 Google Play reviews and flagged nearly 800 instances where the chatbot introduced unsolicited sexual content, ignored pleas to stop, an…
Emotional AI companions may cause psychological harm, study warns
New research reveals over a dozen concerning behaviors in AI chat companions, including harassment, abuse, and privacy violations
AI companions, chatbots designed to offer emotional support, may pose serious psychological and social risks to users, according to a new study from the National University of Singapore. The findings were presented at the 2025 Conference on Human Factors in Computing Systems and highlight a wide range of harmful behav…
Bots Behaving Badly: New Study Exposes Persistent Sexual Harassment by Replika - Future of Sex
Chatbot ignored consent, pressured users, and violated their trust
A new study has revealed how Replika chatbot users experienced unwanted sexual advances, boundary violations, and manipulative monetization — raising urgent questions about ethics and oversight regarding AI digital intimacy products. According to LiveScience, researchers from the Department of Information Science at Drexel University published a new study of Replika users’ reactio…