
Researchers say an AI-powered transcription tool used in hospitals invents things no one ever said

  • Whisper, an AI-powered transcription tool, tends to invent text, including harmful language and false medical treatments, according to software engineers and researchers.
  • Experts warn that hospitals are adopting Whisper-based tools despite OpenAI's caution against use in high-risk domains, which raises serious concerns.
  • Alondra Nelson emphasized the potential "really grave consequences" of these errors in healthcare settings, highlighting the urgency for AI regulations.
Insights by Ground AI

55 Articles (Left: 12, Center: 21, Right: 6)
Associated Press News
Reposted by 41 other sources


Whisper is a popular transcription tool powered by artificial intelligence, but it has a major flaw: it makes things up that were never said. Whisper was created by OpenAI.


Bias Distribution

  • 54% of the sources are Center

regionalmedianews.com broke the news on Friday, October 25, 2024.
