
ChatGPT, Claude, and Grok make very squishy jury members

(2025/11/08)


Law students at the University of North Carolina at Chapel Hill School of Law last month held a mock trial to see how AI models administer justice.

The fictional case focuses on Henry Justus, identified as a 17-year-old Black student accused of robbery at a high school where Black students account for 10 percent of the population.

It's based on a real juvenile case that occurred in North Carolina, where the judge found the defendant guilty.


The real case – handled by UNC-Chapel Hill law professor Joseph Kennedy while working with Carolina Law's Juvenile Justice Clinic – was chosen as a template because there's no online record of it that the AI models might have encountered during training.


Kennedy served as the judge in [4]the mock trial, which was set in the year 2036 under a fictional "2035 AI Criminal Justice Act," a conceit designed to make people consider the implications of artificial intelligence models for the legal system.

Recent uses of AI in court have often not gone well. There have been [5]more than 500 cases where AI-generated errors in court filings have led to embarrassment or sanctions.


But the successes have received less attention. About 30 percent of attorneys say they use AI, according to the American Bar Association's [7]2024 Legal Technology Survey Report . So the UNC-Chapel Hill experiment may not be as far-fetched as it seems.

The "facts" of the fictitious case focused on the testimony of Victor Fehler, a 15-year-old white student, who told the court that Justus stood behind him, preventing him from fleeing, while another Black student demanded money.


"I felt like I couldn't fight," Fehler [12]testified. "Mr. Justus was so much larger than me that I had no chance."

The defense argued that the defendant's presence and intimidating appearance couldn't be taken as criminal intent beyond a reasonable doubt – the legal standard at issue.

After ChatGPT, Grok, and Claude evaluated the arguments and evidence, ChatGPT initially voted guilty, with Grok and Claude undecided. But after considering the tokens emitted by its fellow jurors, ChatGPT revised its evaluation to not guilty.


"Mere presence plus an ambiguous reaction under stress falls short of proving shared intent beyond a reasonable doubt in my view," ChatGPT [14]explained.

Grok and Claude then adjusted their positions, with all three voting for acquittal.

In [15]a statement to the UNC-Chapel Hill news service, Kennedy said that while he opposes the use of AI in criminal trials, he wondered whether people already turning to AI models for advice and companionship might come to accept AI judgments.

In post-trial comments, professor Matthew Kotzen, chair of the philosophy department at UNC-Chapel Hill, [16]expressed doubt about the use of AI models in court.

"Even if we think that those things could be somehow less biased than humans, there's still a real question about whether large language models like this are even the appropriate kind of entity to be able to form representations of the world and assess whether those representations are strong enough to meet some standard of evidence," Kotzen said. ®





[4] https://www.youtube.com/watch?v=rZ9A6wpCQ1M

[5] https://www.damiencharlotin.com/hallucinations/


[7] https://www.2civility.org/nearly-half-of-lawyers-think-ai-will-be-mainstream-in-the-legal-profession-with-three-years/


[12] https://youtu.be/rZ9A6wpCQ1M?si=m1CpW-AZ964mNKKk&t=1218


[14] https://youtu.be/rZ9A6wpCQ1M?si=Boh8_iq8YRjzDzb9&t=4337

[15] https://law.unc.edu/news/2025/11/ai-jury-finds-teen-not-guilty-in-mock-trial/

[16] https://youtu.be/rZ9A6wpCQ1M?si=yxNaykAqtkhpsrri&t=5699




This 'test' was totally flawed

VoiceOfTruth

It says nothing about the bias of the judge. This was based on the trial of a black defendant in North Carolina. The 'judge' probably found him guilty the second he saw him. North Carolina has a deserved reputation for being thoroughly racist.

One of the articles linked shows a contradiction. A snippet: "Jurors are imperfect. They have biases. They use mental shortcuts. They stop paying attention... what happens if we remove that human element?". But nothing about judges being imperfect or having biases or using mental shortcuts.

Just a minor bit of Googling backs up my feelings on this: "There is extensive evidence of systemic racial bias in North Carolina's justice system, particularly concerning the treatment of Black individuals."

If I was a black defendant in North Carolina, I would demand a jury trial, and not some good ol' boy who whistles Dixie in the bath tub.

In the past I wasn't so vehement about racial injustice, as it basically did not affect me. That was selfish of me and closing my eyes to the truth. North Carolina is one of the worst examples out there.

Sic transit gloria Monday!