
What Happened When Alaska's Court System Tried Answering Questions with an AI Chatbot? (nbcnews.com)

(Saturday January 03, 2026 @09:34PM (EditorDavid) from the yes-or-Nome dept.)


An AI chatbot to answer probate questions from Alaska residents "was supposed to be a three-month project," Aubrie Souza, a consultant with the National Center for State Courts, [1]told NBC News. "We are now at well over a year and three months, but that's all because of the due diligence that was required to get it right."

> "With a project like this, we need to be 100% accurate, and that's really difficult with this technology," said Stacey Marz, the administrative director of the Alaska Court System and one of the Alaska Virtual Assistant (AVA) project's leaders... While many local government agencies are experimenting with AI tools for use cases ranging from helping residents [2]apply for a driver's license to [3]speeding up municipal employees' ability to process housing benefits , a [4]recent Deloitte report found that less than 6% of local government practitioners were prioritizing AI as a tool to deliver services. The AVA experience demonstrates the barriers government agencies face in attempting to leverage AI for increased efficiency or better service, including concerns about reliability and trustworthiness in high-stakes contexts, along with questions about the role of human oversight given fast-changing AI systems. These limitations clash with today's [5]rampant AI hype and could help explain larger discrepancies between [6]booming AI investment and [7]limited AI adoption .

The chatbot was developed with Tom Martin, a lawyer/law professor who designs legal AI tools, according to the article. But the project "had to contend with the serious issue of hallucinations, or instances in which AI systems confidently share false or exaggerated information."

> "We had trouble with hallucinations, regardless of the model, where the chatbot was not supposed to actually use anything outside of its knowledge base," Souza told NBC News. "For example, when we asked it, 'Where do I get legal help?' it would tell you, 'There's a law school in Alaska, and so look at the alumni network.' But there is no law school in Alaska." Martin has worked extensively to ensure the chatbot only references the relevant areas of the Alaska Court System's probate documents rather than conducting wider web searches.

The article concludes that "what was meant to be a quick, AI-powered leap forward in increasing access to justice has spiraled into a protracted, yearlong journey plagued by false starts and false answers." But the chatbot is now finally scheduled to be launched in late January. "It was just so very labor-intensive to do this," Marz said, despite "all the buzz about generative AI, and everybody saying this is going to revolutionize self-help and democratize access to the courts.

"It's quite a big challenge to actually pull that off."



[1] https://www.nbcnews.com/tech/tech-news/alaskas-court-system-built-ai-chatbot-didnt-go-smoothly-rcna235985

[2] https://chat.nyc.gov/

[3] https://bloombergcities.jhu.edu/news/3-ways-ai-can-help-cities-add-human-touch-service-delivery

[4] https://www.deloitte.com/us/en/insights/industry/government-public-sector-services/use-of-ai-in-government.html

[5] https://www.technologyreview.com/2025/12/15/1129174/the-great-ai-hype-correction-of-2025/

[6] https://www.nytimes.com/2025/12/31/business/stock-market-2025-artificial-intelligence-bubble.html

[7] https://www.economist.com/finance-and-economics/2025/11/26/investors-expect-ai-use-to-soar-thats-not-happening



Ok (Score:3)

by liqu1d ( 4349325 )

So when it fucks up (and it will!) who's at fault? Does the person relying on said information eat the consequences or the provider of the information?

Re: (Score:1)

by olsmeister ( 1488789 )

Who would be at fault if an employee of the State of Alaska gave them bad information?

Re: (Score:2)

by Joe_Dragon ( 2206452 )

good luck trying to call the State of Alaska to the stand.

Re: (Score:2)

by gweihir ( 88907 )

I expect we will see a court decide that. Generally, when legal questions are answered in a context that looks trustworthy to a layperson, it will be on the information provider, even with disclaimers in place, unless basically every statement is prefixed by IANAL.

Re: (Score:2)

by oldgraybeard ( 2939809 )

It is government! Who is responsible? "No One!"

Buckle up buttercup (Score:1)

by jobslave ( 6255040 )

But AI will never be 100% correct.

Re: (Score:2)

by narcc ( 412956 )

I know that. You know that. Even the AI grifters know that; they just pretend otherwise.

Re: (Score:2)

by Brain-Fu ( 1274756 )

You are correct. The people quoted in this summary seem to believe that hallucinations are a solvable problem for AI, and that they solved it; it just took a lot longer than expected.

Nope, they didn't solve it. Their AI will still hallucinate, and can still be jailbroken too. Their optimism suggests a severe lack of due diligence, despite the extended period they have worked on this. I am imagining some egos are involved that simply cannot admit to failure, especially given what they have spent, but tha

Does Trump hallucinate? (Score:1)

by blue trane ( 110704 )

Is the Justice Department 100% accurate? When the highest levels of government fail your accuracy tests, is it cherry-picking to seize on AI mistakes?

Re: (Score:2)

by narcc ( 412956 )

Human mistakes are of an entirely different nature and quality than AI 'mistakes'. A human won't accidentally make up facts, cases, or sources. A human won't write summaries of things that don't exist. A human won't accidentally directly contradict a source while citing it. A human is also actually capable of identifying and correcting mistakes, unlike an LLM. Stop with this absurd nonsense that it's okay for LLMs to "make mistakes" because humans also "make mistakes." These things are not the same and

I predict this will be short-lived (Score:2)

by gweihir ( 88907 )

One thing LLM-type chatbots do not do is reliable information supply. There will be hallucinations, misstatements, lies by omission, and eventually they will have to switch it off again, permanently.

Re: (Score:2)

by kmoser ( 1469707 )

Or leave it running forever, and just add a disclaimer that information provided by the chatbot is not intended to be accurate, use at your own risk, caveat emptor. Problem solved!

Re: (Score:2)

by gweihir ( 88907 )

Good luck with that. If they have to prefix every statement with "IANAL", it might not see much use.

Re: I predict this will be short-lived (Score:1)

by blue trane ( 110704 )

Have you considered owning your projections? How many cases where LLM answers are good do you leave out?

Re: (Score:2)

by gweihir ( 88907 )

You are confused as to what the requirements for information provision are. As soon as there is a significant level of false statements, it becomes totally useless in most scenarios. You are just too clueless to understand that.

Re: (Score:2)

by narcc ( 412956 )

Let's say we live in a fantasy land where LLMs are magically 95% accurate. Would you trust a car that only worked 95% of the time? What about brakes that only stopped your car 95% of the time?

What about legal advice? Would you hire a lawyer that would make up silly nonsense 5% of the time?
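(And 95% per answer compounds: the chance of getting through just ten questions without a single bad one is 0.95^10, roughly 60%, meaning about two users in five would be misled at least once.)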

Sorry, kid. LLMs just aren't the science fiction fantasy that you want them to be. Your AI girlfriend does not and can not love you. You're not going to have a robot slave. Whatever nonsense it is that you're hoping

avoid probate in the first place (Score:2)

by oumuamua ( 6173784 )

You want to avoid probate to save thousands of dollars in lawyer costs and months of delays for your children to get their inheritance.

Have beneficiaries listed on IRAs, life insurance, or anything else that allows listing beneficiaries.

Sell your real estate, or transfer title (while keeping a life estate), or use a Transfer-on-Death (TOD) deed for your children soon after you retire.

Sell your cars, camper, etc., or transfer title, or use a Transfer-on-Death (TOD) designation for your children soon after you retire.

Add Payable-on-Death (POD) designations to your bank accounts.
