Judge initially fooled by fake AI citations, nearly put them in a ruling
- Judge Michael Wilner discovered forged legal citations generated by AI in briefs submitted by plaintiff's lawyers in a California civil lawsuit in May 2025.
- The citations were fabricated because the lawyers used AI tools such as Google Gemini and Westlaw CoCounsel to generate research without verifying the output or disclosing the AI use.
- Wilner found multiple false citations persisted even after lawyers submitted a revised brief, and he issued an Order to Show Cause demanding detailed explanations and sworn statements.
- Wilner imposed $26,100 in sanctions on the law firms Ellis George and K&L Gates to reimburse defense costs and stated that no competent attorney should outsource research to AI without verification.
- The case highlights risks of AI-generated legal misinformation, prompting concerns about the integrity of court filings and the need for careful oversight of AI use in legal research.
13 Articles
Lawyers Used AI to Make a Legal Brief—and Got Everything Wrong
By now, some AI-generated nonsense sneaking into legal briefs isn’t shocking. But that doesn’t mean the fallout is or should be any less severe. In a case out of California, reported by The Verge, U.S. Magistrate Judge Michael Wilner slammed two law firms for filing legal documents riddled with fictional cases and quotes, all dreamed up by generative AI. By trying to cut corners, the two firms will now have to pay $31,000 in sanctions. The offen…
How an AI almost ruined a court ruling in the U.S.
Artificial intelligence (AI) is advancing at a rapid pace in many areas, and the judicial system is no stranger to this transformation. But what consequences can the incorporation of such powerful and still imperfect technology have on decisions that directly affect people's rights and responsibilities? A recent case against the State Farm insurance company in Los Angeles, California, USA, puts the issue under the magnifying glass. Judge Michael …
AI Hallucination Case Stemming from Use of a Paralegal's AI-Based Research
I blogged yesterday about AI hallucinations in court filings by prominent law firms, as well as a nonexistent source cited in an expert's declaration (the expert works for leading AI company Anthropic, though at this point it's not yet clear whether the error stemmed from an AI hallucination or from something else). But I thought I'd blog a bit more in the coming days about AI hallucinations in court filings, just to show how pervasive the probl…
Coverage Details
Bias Distribution
- 40% of the sources lean Right