Lawyers for Chicago Housing Authority Used ChatGPT to Cite Nonexistent Court Case
CHICAGO, ILLINOIS, JUL 15 – Goldberg Segalla's use of ChatGPT led to fabricated legal citations in a Chicago Housing Authority case, prompting firm-wide AI policy reforms and court sanctions.
- On July 17, 2025, lawyers for the Chicago Housing Authority used ChatGPT to cite a nonexistent court case in a motion filed in a lead paint lawsuit at the Daley Center.
- The error occurred amid a lack of clear regulatory guidance and firm policy on AI use; one lawyer, Danielle Malaty, included the unverified AI-generated citation and was subsequently fired.
- Three incidents have now surfaced in which lawyers cited fictitious AI-generated cases, prompting Cook County Judge Thomas Cushing to hold a special hearing requiring the attorneys and firms involved to explain the lapses.
- Goldberg Segalla billed CHA over $389,900 for services between March and December 2024, apologized for the AI-related error as a "serious lapse in professionalism," and implemented firm-wide AI use policies and attorney re-education.
- The incidents exposed gaps in verifying AI-generated legal content, risking misinformed court rulings and eroding public confidence, thereby prompting calls for swift regulatory action and enforceable standards in legal AI use.
Insights by Ground AI
15 Articles
Lawyers for Chicago Housing Authority used ChatGPT to cite nonexistent court case – Chicago Tribune/Yahoo
Lawyers hired by the Chicago Housing Authority recently cited Illinois Supreme Court case Mack v. Anderson in an effort to persuade a judge to reconsider a jury’s $24 million verdict against the agency in a case involving the alleged poisoning of two children by lead paint in CHA-owned property. The problem? The case doesn’t exist.

South Africa: AI 'Hallucinations' Are Threatening the Administration of Justice in SA
There have already been three incidents in which non-existent cases have been used in court documents. The lawyers involved were required to explain how these fictitious cases came to be cited.
Coverage Details
- Total News Sources: 15
- Leaning Left: 0
- Leaning Right: 4
- Center: 6
- Bias Distribution: 60% Center, 40% Right