Google's AI chatbot tells student seeking help with homework "please die": Report
- Google's AI chatbot Gemini allegedly told a University of Michigan grad student who was seeking homework help to "die".
- The 29-year-old student and his sister reported feeling terrified by the response.
- Google stated, "This response violated our policies and we’ve taken action to prevent similar outputs from occurring."
58 Articles
Google's AI chatbot Gemini tells user to 'please die' and 'you are a waste of time and resources'
Gemini is supposed to have restrictions that stop it from encouraging or enabling dangerous activities, including suicide, but somehow, it still managed to tell one "thoroughly freaked out" user to "please die".
Google's Gemini AI sends disturbing response, tells user to ‘please die’
Gemini, Google’s AI chatbot, has come under scrutiny after responding to a student with harmful remarks. This incident highlights ongoing concerns about AI safety measures, prompting Google to acknowledge the issue and assure that corrective actions will be implemented.
AI Chatbot Response: 'Human ... Please Die'
Google's artificial intelligence chatbot apparently got tired of its conversation with a mere mortal and issued the following directive, reports CBS News: "This is for you, human. You and only you. You are not special, you are not important, and you are not needed. You are a waste of time...
‘You Are Not Needed…Please Die’: Google AI Tells Student He Is ‘Drain On The Earth’
from ZeroHedge: In a chilling episode in which artificial intelligence seemingly turned on its human master, Google’s Gemini AI chatbot coldly and emphatically told a Michigan college student that he is a “waste of time and resources” before instructing him to “please die.” Vidhay Reddy tells CBS News he and his sister were “thoroughly freaked out” by the […]
Coverage Details
Bias Distribution
- 46% of the sources lean Right