
OpenAI's new o3 and o4-mini models hallucinate more than the company's predecessor reasoning models.

Summary by diarioestrategia.cl
OpenAI's new o3 and o4-mini reasoning models hallucinate more often than the company's previous reasoning models, with nearly twice the hallucination rate recorded for the o1 model, according to internal results from the PersonQA evaluation.
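The PersonQA figure is essentially the share of answers that contain fabricated claims about a person. As a rough illustration only (this is not OpenAI's evaluation harness; the item format and grading shown here are assumptions), a hallucination rate on a PersonQA-style benchmark could be computed like this:

```python
# Hypothetical sketch of a PersonQA-style hallucination-rate calculation.
# NOT OpenAI's evaluation code; the data structure and grading are assumed.
from dataclasses import dataclass

@dataclass
class GradedAnswer:
    question: str
    model_answer: str
    contains_hallucination: bool  # judged against known facts about the person

def hallucination_rate(graded: list[GradedAnswer]) -> float:
    """Fraction of answers containing at least one fabricated claim."""
    if not graded:
        return 0.0
    return sum(a.contains_hallucination for a in graded) / len(graded)

# Toy data matching the reported trend (newer model hallucinating roughly
# twice as often); the numbers themselves are illustrative, not measured.
o1_like = [GradedAnswer("q", "a", i % 6 == 0) for i in range(60)]  # 1 in 6 flagged
o3_like = [GradedAnswer("q", "a", i % 3 == 0) for i in range(60)]  # 1 in 3 flagged
print(f"o1-like rate: {hallucination_rate(o1_like):.0%}")
print(f"o3-like rate: {hallucination_rate(o3_like):.0%}")
```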
Disclaimer: This story is only covered by news sources that have yet to be evaluated by the independent media monitoring agencies we use to assess the quality and reliability of news outlets on our platform.


unsafe.sh broke the news on Monday, April 21, 2025.