Photo by Glenn Carstens-Peters / Unsplash

It seems like every day brings another news story about a lawyer caught unwittingly submitting a court filing that cites nonexistent cases hallucinated by AI. The problem persists despite courts’ standing orders on the use of AI, formal opinions and continuing legal education (CLE) courses on the ethical use of AI in law practice, and revelations that AI-powered legal research tools are more fallible than they purport to be.

Who are the attorneys submitting AI-tainted briefs? A recent 404 Media article about lawyers’ use of AI drew my attention to a database of AI Hallucination Cases compiled and maintained by Damien Charlotin, a French lawyer and scholar. Charlotin classifies each incident by the type of inaccuracy involved: fabricated cases, false quotes from or misrepresentations of real cases, or outdated citations to cases that have been overturned. Besides helping the public understand how lawyers are getting tripped up by AI, Charlotin’s database also offers a better view of who is getting tripped up.

Using the database, I analyzed 114 cases from U.S. courts in which, according to opposing counsel and/or the court’s own investigation, an attorney’s filing included inaccuracies that were suspected or shown to have been caused by the use of AI. I find that the vast majority of the law firms involved – 90% – are either solo practices or small firms. What’s more, in 56% of the cases the AI hallucinations were attributed to plaintiff’s counsel, compared with 31% to defense counsel. And while most cases in the sample did not specify the AI tool used, of those that did, fully half involved some version of ChatGPT.

Methodology

I based my analysis on cases I downloaded as a .csv file from Charlotin’s database on October 9, 2025. The time period covers court orders issued from June 2023 (the month of the landmark order in Mata v. Avianca) through October 7, 2025. [Note: October 9 was a Thursday; by the follow...
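To make the kind of tally described above concrete, here is a minimal sketch of how the breakdowns could be computed from the downloaded .csv with pandas. The column names ("Jurisdiction", "Party", "AI Tool") and the file name are assumptions for illustration; the actual export from Charlotin's database may label its fields differently.

```python
# Sketch only: column names and file name are assumed, not taken from the actual export.
import pandas as pd

df = pd.read_csv("ai_hallucination_cases.csv")

# Restrict to U.S. courts (assumed jurisdiction label).
us = df[df["Jurisdiction"].str.contains("United States", case=False, na=False)]

# Share of filings attributed to plaintiff vs. defense counsel, as percentages.
party_share = us["Party"].value_counts(normalize=True).mul(100).round(1)
print(party_share)

# Share of named AI tools, ignoring cases where no tool was specified.
tool_share = us["AI Tool"].dropna().value_counts(normalize=True).mul(100).round(1)
print(tool_share)
```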