Researchers say an AI-powered transcription tool used in hospitals invents things no one ever said
- Researchers found that Whisper, an AI-powered transcription tool, is prone to fabricating text, raising concerns about its use across industries, particularly in high-stakes settings such as health care.
- Medical centers are rapidly adopting Whisper-based tools to transcribe doctor-patient consultations, despite OpenAI's warnings against using the tool in high-risk domains.
- Alondra Nelson stated that inaccuracies in Whisper transcriptions could have "really grave consequences" in hospital settings.
Insights by Ground AI
Coverage Details
- Total news sources: 55
- Leaning Left: 12
- Center: 21
- Leaning Right: 6
- Bias distribution: 31% Left, 54% Center, 15% Right