
AI assistants make widespread errors about the news, new research shows

  • A study by the European Broadcasting Union found that 45 percent of AI assistants' answers about news events contained at least one significant issue.
  • Gemini performed the worst, with significant issues in 76 percent of its responses.
  • Many AI assistants confused news with parody, raising concerns about trust and accuracy.
  • The report indicates that AI assistants misrepresent news, emphasizing the importance of reliable information for democratic participation.
Insights by Ground AI

43 Articles

Lean Left

AI tools distort the news. 22 media organizations tested ChatGPT and its peers, and one assistant failed dramatically. In the end, humans are still needed.

An international study published by the European Broadcasting Union (EBU) shows that artificial intelligence (AI) assistants, which act as primary sources of information for millions of people every day, routinely misrepresent news content, regardless of the platform tested, the language used, or the territory, LRT reports.

Vilnius, Lithuania
Center

According to a recent investigation, AI chatbots often misrepresent news content.

Germany

Bias Distribution

  • 44% of the sources lean Left, 43% of the sources are Center


NPR broke the news in Washington, United States on Tuesday, October 21, 2025.