Georgia court throws out earlier ruling that relied on fake cases made up by AI
(2025/07/08)
- Reference: 1751985133
- News link: https://www.theregister.co.uk/2025/07/08/georgia_appeals_court_ai_caselaw/
- Source link:
The Georgia Court of Appeals has tossed a state trial court's order because it relied on court cases that do not exist, presumably generated by an AI model.
The errant citations, the appeals court said, appear to have been "drafted using generative AI."
Judges have previously [1]scolded lawyers for citing AI falsehoods in legal filings, but here they went further and vacated a lower court's order.
The case involves Nimat Shahid's effort to challenge a divorce order granted to her husband, Sufyan Esaam, in July 2022. The lower court's decision against her included citations apparently fabricated by an AI model, and Shahid appealed.
The Georgia Court of Appeals found that the phony cases were reason enough to vacate the trial court's order.
"We are troubled by the citation of bogus cases in the trial court’s order," the appeals court said in its [5]decision [PDF], which directs the lower court to revisit the wife's petition.
"As the reviewing court, we make no findings of fact as to how this impropriety occurred, observing only that the order purports to have been prepared by Husband's attorney, Diana Lynch."
Trial judges [11]often rely on attorneys to draft proposed court orders. Lynch did not immediately respond to a request for comment.
The appellate judges note that Lynch repeated the bogus citations from the trial court's order in her filings to the appeals court, and added more, even after Shahid had challenged the fictitious cases. In total, the appeals court said, Lynch's appellate briefs contained "11 bogus case citations out of 15 total, one of which was in support of a frivolous request for attorney fees."
The appeals court fined Lynch $2,500 as a penalty for filing a frivolous motion for attorney fees.
The appellate decision cites the US Chief Justice's 2023 year-end report as a warning, noting:
In his [12]2023 Year-End Report on the Federal Judiciary, Chief Justice John Roberts warned that "any use of AI requires caution and humility." Roberts specifically noted that commonly used AI applications can be prone to "hallucinations," which caused lawyers using those programs to submit briefs with cites to non-existent cases.
At that point in time, Chief Justice Roberts presumably would have been aware of [13]Mata v. Avianca, in which the plaintiff's attorneys were sanctioned for citing fake cases, and perhaps [14]United States v. Hayes and [15]United States v. Cohen, which also surfaced in 2023.
In the years since then, the uncritical use of AI has continued, in cases like [16]Concord Music Group v. Anthropic, [17]Wadsworth v. Walmart, and [18]Kohls et al v. Ellison, among others. [19]At least 200 instances involving AI hallucinations in legal disputes have been documented. ®
[1] https://www.theregister.com/2025/05/15/anthopics_law_firm_blames_claude_hallucinations/
[5] https://regmedia.co.uk/2025/07/07/georgia_appeals_decision.pdf
[11] https://www.law.cornell.edu/rules/frcp/rule_16
[12] https://www.theregister.com/2024/01/02/supreme_court_ai_roberts/
[13] https://www.theregister.com/2023/06/22/lawyers_fake_cases/
[14] https://www.fd.org/news/court-sanctions-defense-attorney-whose-brief-cited-nonexistent-case
[15] https://www.npr.org/2023/12/30/1222273745/michael-cohen-ai-fake-legal-cases
[16] https://www.theregister.com/2025/05/15/anthopics_law_firm_blames_claude_hallucinations/
[17] https://www.theregister.com/2025/02/14/attorneys_cite_cases_hallucinated_ai/
[18] https://www.reuters.com/legal/government/judge-rebukes-minnesota-over-ai-errors-deepfakes-lawsuit-2025-01-13/
[19] https://www.damiencharlotin.com/hallucinations/
Georgia, the state, USA
Jamie Jones
Article needs clarification. I assume from the links in the article that this is in the US.
I assumed it was the country of Georgia until I read to that point.
Re: Georgia, the state, USA
KittenHuffer
With El Reg these days I'm afraid you'll have to flip that assumption.
Assume that everything is to do with the US of A ...... unless explicitly stated otherwise!
commonly used AI applications can be prone to "hallucinations,"...
Mentat74
No they don't... they just produce crappy output that people didn't expect...
Re: commonly used AI applications can be prone to "hallucinations,"...
KittenHuffer
Garbage in .... cabbage out!
MOH
Why does this article not make clear that it's referring to a US state, and not the country?
Why is submitting made-up evidence not considered contempt of court?