AI Hallucinations in Court Documents Are a Growing Problem, and Lawyers Are Responsible for Many of Them

  • Damien Charlotin created a public database in early 2024 that tracks 120 court cases worldwide involving AI-generated hallucinations in legal documents.
  • The rise in hallucinations is linked to the increased availability of large language models and to growing awareness of AI's risks in legal text and citations.
  • In 2023 and 2024, both lawyers and pro se litigants made AI errors; in recent months, lawyers have been responsible for a growing share of the cases and sanctions.
  • Courts have imposed fines exceeding $10,000 in several cases, including a $31,000 sanction on lawyers at K&L Gates and Ellis George for filings that relied in part on fabricated cases.
  • Charlotin noted that courts have imposed mild penalties so far, but emphasized that the legal field's reliance on textual patterns makes it prone to AI hallucinations, placing the responsibility on parties to verify citations.
Insights by Ground AI

11 Articles

Bias Distribution

  • 50% of the sources lean Left, 50% of the sources lean Right

nexusnewsfeed.com broke the news on Sunday, May 25, 2025.