
'AI Slop' in Court Filings: Lawyers Keep Citing Fake AI-Hallucinated Cases (indianexpress.com)

(Sunday November 09, 2025 @04:43PM (EditorDavid) from the disorder-in-the-court dept.)


"According to court filings and interviews with lawyers and scholars, the legal profession in recent months has increasingly become a hotbed for AI blunders," [1]reports the New York Times :

> Earlier this year, a lawyer filed a motion in a Texas bankruptcy court that cited a 1985 case called Brasher v. Stewart. Only the case doesn't exist. Artificial intelligence had concocted that citation, along with 31 others. A judge blasted the lawyer in an opinion, referring him to the state bar's disciplinary committee and mandating six hours of A.I. training.

>

> That filing was spotted by Robert Freund, a Los Angeles-based lawyer, who fed it to an online database that tracks legal A.I. misuse globally. Mr. Freund is part of a growing network of lawyers who track down A.I. abuses committed by their peers, collecting the most egregious examples and posting them online. The group hopes that by tracking down the A.I. slop, it can help draw attention to the problem and put an end to it... [C]ourts are starting to map out punishments of small fines and other discipline. The problem, though, keeps getting worse. That's why Damien Charlotin, a lawyer and researcher in France, started an online database in April to track it.

>

> Initially he found three or four examples a month. Now he often receives that many in a day. Many lawyers... have helped him document 509 cases so far. They use legal tools like LexisNexis for notifications on keywords like "artificial intelligence," "fabricated cases" and "nonexistent cases." Some of the filings include fake quotes from real cases, or cite real cases that are irrelevant to their arguments. The legal vigilantes uncover them by finding judges' opinions scolding lawyers...

>

> Court-ordered penalties "are not having a deterrent effect," said Freund, who has publicly flagged more than four dozen examples this year. "The proof is that it continues to happen."



[1] https://indianexpress.com/article/technology/tech-news-technology/vigilante-lawyers-expose-the-rising-tide-of-ai-slop-in-court-filings-10352972/



Make it stop quickly (Score:5, Interesting)

by RitchCraft ( 6454710 )

If a lawyer introduces hallucinated slop into a case that lawyer loses their license for at least one year. The lawyer must then retake the bar exam to regain their license. That'll make them think twice about using AI slop.

Re: Make it stop quickly (Score:4, Interesting)

by blue trane ( 110704 )

When I was convicted of marijuana possession before the state legalized it, how come the prosecutors got away with writing the wrong amount of marijuana I was found with, which the judge caught but then simply corrected in court and moved on with the sentencing? Why shouldn't those prosecutors have been punished for hallucinating slop?

Re: (Score:3)

by TheMiddleRoad ( 1153113 )

A mistake is different from a glaring lack of professional conduct.

Re: Make it stop quickly (Score:2)

by blue trane ( 110704 )

So just because my prosecutors didn't use AI, you forgive them for a hallucination that you would disbar them for if AI had made it? And how do you know how many other times they made this mistake?

Re: Make it stop quickly (Score:4, Interesting)

by drinkypoo ( 153816 )

> A mistake is different from a glaring lack of professional conduct.

Using non-local AI in any way in court filings which are supposed to be confidential until filed is a glaring lack of professional conduct right up front. Allowing AI hallucinations to get into your court paperwork even once is the same. They should lose their license for one year the first time, five years the second time, and permanently the third.

Re: (Score:2)

by gurps_npc ( 621217 )

Technically Judges cannot do that directly. Instead the procedure is:

1) Report to Bar

2) Have a Hearing by the Bar

3) The bar can decide to take their license for X amount of time.

But I do agree that is what the Judges should be doing.

The judges can, however, hold anyone in contempt of court for any reason at any time. You do not even have to be in court. You can appeal it, but as long as the Judge was somewhere near reasonable, you will not succeed.

If however the Judge does something like hold the Umpire at his k

And this will go on and on. Until? (Score:4, Insightful)

by oldgraybeard ( 2939809 )

The law firms are hit with huge (firm destroying) penalties and lawyers are disbarred for the lies under oath since they submitted the briefs to the court.

Re: (Score:2)

by Krishnoid ( 984597 )

No need for all that. Either "Judgement is for the other side" or "Case dismissed." Clears the docket, and slows down these kinds of submissions until they're at least doublechecked.

Re: (Score:2)

by bill_mcgonigle ( 4333 ) *

> No need for all that. Either "Judgement is for the other side" or "Case dismissed." Clears the docket, and slows down these kinds of submissions until they're at least doublechecked.

Interesting. I think you've changed my mind about this.

Economic incentives are probably the way to go.

I suspect they've always done this (Score:3)

by rsilvergun ( 571051 )

But the courts are paying more attention now because it's ai and it's bad press.

But I would bet money that if you did an exhaustive analysis of court filings you would find plenty of made-up citations that nobody ever looked into.

Then again lawyers are famous for their honesty and decency so maybe I shouldn't make assumptions.

Re: (Score:3)

by anoncoward69 ( 6496862 )

The lawyers have probably never read these, now or in the past. They had their paralegals create them; they've probably just replaced paralegals with AI.

Re: (Score:2)

by gweihir ( 88907 )

Which now bites them in the ass. This is a classical case of greedy dumb assholes trying to do things "cheaper than possible".

Re: (Score:2)

by evanh ( 627108 )

A distinct shift in who messed up. The AI can't be blamed.

Re: (Score:2)

by oldgraybeard ( 2939809 )

"famous for their honesty and decency" And keep in mind, most judges were lawyers!

And what does that say about the trust we should have in today's justice system and government in general? Lots of lawyers in government!

Lazy People == AI Slop (Score:2)

by kid_wonder ( 21480 )

This means you

This is not difficult (Score:1)

by CEC-P ( 10248912 )

These lazy, entitled rich-boy lawyers, who never want to do actual work and just want to overbill their clients, simply have to learn to add "And only cite cases that actually, factually happened, and cite your source so I can verify it" to every single prompt.

Debugging LLM (Score:2)

by registrations_suck ( 1075251 )

This gets into the issue of how to debug an LLM.

Since this keeps happening, it seems like when a model is trained for legal cases, it should be done in such a way that the underlying source material is tied to the case.

For example, if the model reads about Johnson v. Smith, 1972 in "Legal Shit today, issue 175", then it should be possible to query the LLM and ask it what the source material is for that case, and get back "Legal Shit Today, issue 175". This feature would at least start to allow you know how
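The provenance idea described above could be sketched as a simple index that records, at ingestion time, which source document each case name came from. This is a minimal illustration, not how any real legal AI product works; the case names and publication here are the hypothetical ones from the comment.

```python
# Minimal sketch of a citation provenance index: every case name the
# system ingests is recorded along with the document it came from, so
# a cited case can later be traced back to its source (or flagged as
# untraceable). All case names and sources are made up for illustration.

class ProvenanceIndex:
    def __init__(self):
        self._sources = {}  # case name -> set of source documents

    def ingest(self, case_name, source_doc):
        self._sources.setdefault(case_name, set()).add(source_doc)

    def sources_for(self, case_name):
        # Returns the documents that mention the case, or None if the
        # case was never seen -- a strong hint it was hallucinated.
        return self._sources.get(case_name)

index = ProvenanceIndex()
index.ingest("Johnson v. Smith (1972)", "Legal Shit Today, issue 175")

print(index.sources_for("Johnson v. Smith (1972)"))   # the source set
print(index.sources_for("Brasher v. Stewart (1985)")) # None: never ingested
```

A generative model has no such lookup table internally, which is exactly why provenance would have to be bolted on as a separate retrieval layer rather than queried out of the model's weights.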

Re: (Score:2)

by drinkypoo ( 153816 )

> Manually research the sources, verify each case cited

Clearly this is not even being done by an automated tool, let alone a human. An LLM which is given access to a database of actual cases could reasonably be successful at checking whether the cases cited even exist, which isn't being checked now!
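Even without an LLM, the existence check described above could be automated: extract citation-shaped strings from a filing and look each one up in a database of real cases. The regex and the tiny in-memory case list below are illustrative assumptions; a real tool would query a citator service.

```python
import re

# Hypothetical stand-in for a real case-law database lookup.
KNOWN_CASES = {"Smith v. Jones", "Doe v. Roe"}

# Crude pattern for "Party v. Party" style citations (illustrative only).
CITATION_RE = re.compile(r"\b([A-Z][A-Za-z]+ v\. [A-Z][A-Za-z]+)\b")

def find_suspect_citations(filing_text):
    # Return cited case names that do not appear in the database.
    cited = set(CITATION_RE.findall(filing_text))
    return sorted(cited - KNOWN_CASES)

filing = "As held in Smith v. Jones and Brasher v. Stewart, ..."
print(find_suspect_citations(filing))  # ['Brasher v. Stewart']
```

A check this simple only catches cases that don't exist at all; fake quotes from real cases, or real-but-irrelevant citations, still need a human (or a much smarter tool) to catch.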

Re: (Score:2)

by registrations_suck ( 1075251 )

Well, that's kind of my point. Other than that, I was proposing a human do it, rather than having an AI do it.

Re: (Score:2)

by gweihir ( 88907 )

Ah, no. LLMs cannot do fact-checking. They hallucinate even with full access to verified facts.

Re: (Score:2)

by gweihir ( 88907 )

You cannot "debug" an LLM. Not possible. That is one of the reasons why LLM use requires a high level of on-target expertise and a lot of care.

Idiotic statement (Score:2)

by RossCWilliams ( 5513152 )

> Court-ordered penalties "are not having a deterrent effect," said Freund, who has publicly flagged more than four dozen examples this year. "The proof is that it continues to happen."

That is an idiotic statement. People still speed and drive drunk despite penalties. It could be applied to any penalty for any crime. People still rob banks so penalties putting people in jail for bank robbery "are not having a deterrent effect?"

The flaw here is the same as with any crime statistic: you only know about the reported problems that someone noticed and reported. Maybe what's needed is an AI legal review program that identifies fake citations. And yes it will make errors, miss some and report o

Re: (Score:3)

by registrations_suck ( 1075251 )

It just means the penalties are insufficient.

If the penalty for speeding were life imprisonment, you would see a huge, but not total, reduction in speeding.

Assuming the law were enforced. If you don't enforce the law, the penalty is only theoretical.

Re: (Score:2)

by gweihir ( 88907 )

> It just means the penalties are insufficient.

That is the no-clue cave-man answer. All research shows that increased penalties have no positive effect, but make the problem worse.

Obligatory (Score:2)

by organgtool ( 966989 )

I'd like to refer you to the case of Finders vs. Keepers.

It Never Ceases to Amaze Me (Score:2)

by organgtool ( 966989 )

It never ceases to amaze me that we have a seemingly-magical tool that can do many hours of research in just minutes and the person using the tool can't be bothered to take a couple of minutes to fact-check the info the seemingly-magical tool shat out. And the cherry on top is going to court and confidently presenting that unchecked info in front of a judge while being on public record. We're taking laziness to levels never fathomed before.

Re: (Score:2)

by gweihir ( 88907 )

The thing is that LLMs cannot fact-check. Apparently the users of LLMs are, in this case, too lazy or too dumb (or both) to do it themselves.

There are indications that for many things, LLMs actually decrease efficiency and using them is a costly mistake. This ("better search") is supposedly one of the few areas where they save time. But fact-checking LLM results actually requires more skill and competence than generating the results manually, and it MUST be done for results of reasonable quality. Apparently, t

Disbar them (Score:2)

by Sebby ( 238625 )

> [C]ourts are starting to map out punishments of small fines and other discipline [....] Court-ordered penalties "are not having a deterrent effect," said Freund, who has publicly flagged more than four dozen examples this year. "The proof is that it continues to happen."

Disbar them then - they're not doing their fucking job properly, so why should they get called "lawyers/attorneys" then?

And if they do want to be a lawyer/attorney again, then I guess they'll have to re-enroll, study, and take the bar exam again. Without using any A.I.

Imagine if a plumber relied on AI to fix your massive water leak. Would you seriously pay for that?!?

"Science" has the same problem, thank you RFKjr... (Score:2)

by acroyear ( 5882 )

RFKjr's administration has been using AI to generate justifications for policies that are all hitting exactly the same problems:

* AI is inventing studies that never existed

* AI is using quotes from real studies that aren't in the studies

* AI is generating summaries of studies that are the opposite of what the study itself actually concluded

and he's referencing these AI-generated summaries in congressional hearings.

Re: (Score:2)

by gweihir ( 88907 )

The funny thing is that all of that would be very easy to verify. But LLMs are completely unable to verify. Such a great tool...

Prison time (Score:3)

by gweihir ( 88907 )

At the rates these fuckers charge, this is completely unacceptable and should be regarded as fraud.
