Researchers and developers are raising concerns over OpenAI’s Whisper transcription tool due to unexpected hallucination issues, according to a report by the Associated Press.
Although Whisper is a speech-to-text tool that should faithfully reproduce the audio it is given, it has reportedly generated transcripts containing inaccurate or even entirely invented content.
One University of Michigan researcher observed hallucinations in 80% of Whisper transcriptions of public meetings. In a broader study, a machine learning engineer found hallucinations in over half of 100 hours of Whisper transcripts. Additionally, a developer who created 26,000 transcriptions reported that nearly all contained some form of hallucination.
An OpenAI spokesperson acknowledged the feedback, saying the company is actively working to improve the tool’s accuracy, particularly by reducing hallucinations. OpenAI’s usage policies also discourage deploying Whisper in “high-stakes decision-making contexts.”