The news: OpenAI’s Whisper transcription tool consistently hallucinates text in medical and healthcare applications, researchers found.
Made-up chunks of text, and even entire sentences, have been found in up to eight out of 10 Whisper transcriptions, per the Associated Press.
More than 45,000 clinicians and over 85 healthcare systems around the world use a Whisper-based AI copilot tool called Nabla for appointment and note transcriptions.