The AI Therapist Can See You Now (npr.org)
- Reference: 0176986377
- News link: https://science.slashdot.org/story/25/04/09/155247/the-ai-therapist-can-see-you-now
- Source link: https://www.npr.org/sections/shots-health-news/2025/04/07/nx-s1-5351312/artificial-intelligence-mental-health-therapy
> The [2]recent study, published in the New England Journal of Medicine, shows results from the first randomized clinical trial for AI therapy. Researchers from Dartmouth College built the bot as a way of taking a new approach to a longstanding problem: The U.S. continues to grapple with an acute shortage of mental health providers. "I think one of the things that doesn't scale well is humans," says Nick Jacobson, a clinical psychologist who was part of this research team. For every 340 people in the U.S., there is just one mental health clinician, according to some estimates.
>
> While many AI bots already on the market claim to offer mental health care, some have dubious results or have even led people to self-harm. More than five years ago, Jacobson and his colleagues began training their AI bot in clinical best practices. The project, says Jacobson, involved much trial and error before it led to quality outcomes. "The effects that we see strongly mirror what you would see in the best evidence-based trials of psychotherapy," says Jacobson. He says these results were comparable to "studies with folks given a gold standard dose of the best treatment we have available."
[1] https://www.npr.org/sections/shots-health-news/2025/04/07/nx-s1-5351312/artificial-intelligence-mental-health-therapy
[2] https://ai.nejm.org/doi/full/10.1056/AIoa2400802
doesn't pass the sniff test (Score:2)
'He says these results were comparable to "studies with folks given a gold standard dose of the best treatment we have available."'
To be clear, this isn't an endorsement of AI, it's a condemnation of "the best treatment we have available".
So it appears that this "research" suggests that empathy is not required for effective "mental health therapy". Imagine having a sociopath as your therapist, one who claims to provide care just as good as any other. Sure thing.
"The U.S. continues to grapple with an acute
A shortage (Score:2)
> The U.S. continues to grapple with an acute shortage of mental health providers.
And part of the reason for this shortage of providers is the shortage of medical insurance companies in the U.S. that will cover treatment by those providers. Can't get paid? Then why be a provider?
Maybe insurance companies will cover AI providers because the costs of the therapy will be low and predictable. At least at first. That might change when the lawsuits begin because the therapy goes wrong.
Much of practical therapy ... (Score:2)
... is crackpot science. Not all of it, and perhaps not the majority, but given that the quality of therapy depends heavily on the match and vibe between patient and therapist, and that many practicing therapists aren't that good at their job, I totally believe that an AI can be a better option for many.
Especially given that, by simple heuristics, a therapy AI can gain experience from talking to hundreds of millions of people, something that is totally beyond anything a human can do.
No, AI Therapist could be Bad, deadly (Score:2)
As someone who co-facilitated a peer support group for 10 years, I believe an AI therapist would not be able to take in all the subtle messages that a person communicates without words. One of the greatest instruments, or detriments, a therapist has is their own visceral reactions. People can more easily lie on a paper test.
No and No (Score:4, Informative)
AI doesn't have feelings and doesn't understand what it's like to be human. If you don't have that, talking to an AI therapist is not any better than reading a book. Don't even think for 2 seconds that you can replace a therapist with a computer.
Re: (Score:3)
It depends on the quality of the therapist. If all they are doing is listening and making "go on" noises, perhaps an AI or even a simple chatbot would do. On the other hand, how long will it be before the AI starts suggesting suicide as a solution (which has already happened, after all)?
Re: (Score:2)
Suggesting suicide might be considered a good outcome if you design your metric to do so. It seems quite clear that this is what has occurred.
Therapy is subjective and depends heavily on the participation and receptiveness of the patient. It seems that measurement is both difficult AND not the real goal of the study. It's just part of the money grab by the tech bros: is there any job that can't be done by a machine that VCs can own for a few billion more dollars?
Re: (Score:2)
The actual advice in evidence-based therapy is pretty straightforward. The patient doesn't want to have to tell the therapist that they just stuck to their usual habits, so they try their best to comply. Only the feelings of the patient are relevant in this.
The huge success of Character.AI proves that people do feel attachment to these bots, that's all that's needed.