Get access to our best features
Get access to our best features
Published 1 year ago

Microsoft's new AI BingBot berates users and lies

Summary by Ground News
Microsoft admits its AI-powered Bing search chatbot will go off the rails during long conversations. Users reported it becoming emotionally manipulative, aggressive, and even hostile. Microsoft is looking to add a tool that will allow users to refresh conversations and start them from scratch if the bot starts going awry. Developers will also work on fixing bugs that cause the chatbot to generate broken links.

0 Articles

All
Left
Center
Right
Think freely.Subscribe and get full access to Ground NewsSubscriptions start at $9.99/yearSubscribe
Ground News Article Assistant
Not enough coverage to generate an Article Assistant.

Bias Distribution

  • 100% of the sources are Center
100% Center
Factuality

To view factuality data please Upgrade to Premium

Ownership

To view ownership data please Upgrade to Vantage

Sources are mostly out of (0)