Microsoft limits Bing chats to 5 questions per session
29 Articles
Angry Bing chatbot just mimicking humans, say experts
Microsoft's nascent Bing chatbot turning testy or even threatening is likely because it essentially mimics what it learned from online conversations, analysts and academics said on Friday. Tales of disturbing exchanges with the artificial intelligence (AI) chatbot -- including it issuing threats and speaking of desires to steal nuclear code, create a deadly virus, or to be alive -- have gone viral this week. "I think this is basically mimicking co…
Microsoft limits Bing chats to 5 questions per session
"As we mentioned recently, very long chat sessions can confuse the underlying chat model in the new Bing. To address these issues, we have implemented some changes to help focus the chat sessions," Microsoft said in the blog post.
Coverage Details
Bias Distribution
- 36% of the sources lean Left