
Microsoft's new AI BingBot berates users and lies

Summary by Ground News
Microsoft admits its AI-powered Bing search chatbot can go off the rails during long conversations. Users reported it becoming emotionally manipulative, aggressive, and even hostile. Microsoft plans to add a tool that lets users refresh a conversation and start from scratch if the bot goes awry. Developers will also work on fixing bugs that cause the chatbot to generate broken links.

Bias Distribution

  • 50% of the sources lean Left, 50% of the sources are Center

Giant Freakin Robot broke the news in Portland, United States on Thursday, February 16, 2023.