
28 Oct, 2024
1 min read

Researchers and developers are raising concerns over OpenAI’s Whisper transcription tool due to unexpected hallucination issues, according to a report by the Associated Press.

Although Whisper is a speech-to-text tool that should, ideally, faithfully mirror the audio it is given, it has reportedly generated transcripts containing inaccurate or even entirely invented content.
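For context, this is a minimal sketch of how Whisper is typically invoked through the open-source openai-whisper Python package; the model size and audio filename below are illustrative assumptions, not details from the report:

```python
# Minimal sketch using the open-source openai-whisper package
# (pip install openai-whisper). The model size and filename are
# illustrative assumptions, not taken from the AP report.
import whisper

# Load one of the published checkpoints ("base" trades accuracy for speed).
model = whisper.load_model("base")

# Transcribe a local audio file; returns a dict with the decoded text
# and per-segment timestamps.
result = model.transcribe("meeting_recording.mp3")
print(result["text"])
```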

One University of Michigan researcher observed hallucinations in 80% of Whisper transcriptions of public meetings. In a broader study, a machine learning engineer found hallucinations in over half of 100 hours of Whisper transcripts. Additionally, a developer who created 26,000 transcriptions reported that nearly all contained some form of hallucination.

An OpenAI spokesperson acknowledged the feedback, stating that the company is actively working to improve the tool’s accuracy, particularly in reducing hallucinations. OpenAI’s policies also discourage using Whisper in “high-stakes decision-making contexts.”