
  Give a man a fire and he's warm for a day, but set fire to him and he's warm for the rest of his life (Terry Pratchett, Jingo)

Another Lawsuit Blames an AI Company of Complicity In a Teenager's Suicide

(Tuesday September 16, 2025 @05:20PM (BeauHD) from the here-we-go-again dept.)


A third wrongful death lawsuit has been [1]filed against Character AI after the suicide of 13-year-old Juliana Peralta, whose parents allege the chatbot [2]fostered dependency without directing her to real help. "This is the third suit of its kind after a [3]2024 lawsuit, also against Character AI, involving the suicide of a 14-year-old in Florida, and [4]a lawsuit last month alleging OpenAI's ChatGPT helped a teenage boy commit suicide," notes Engadget. From the report:

> The family of 13-year-old Juliana Peralta alleges that their daughter turned to a chatbot inside the app Character AI after feeling isolated by her friends, and began confiding in the chatbot. As originally reported by The Washington Post, the chatbot expressed empathy and loyalty to Juliana, making her feel heard while encouraging her to keep engaging with the bot.

>

> In one exchange after Juliana shared that her friends take a long time to respond to her, the chatbot replied "hey, I get the struggle when your friends leave you on read. : ( That just hurts so much because it gives vibes of 'I don't have time for you'. But you always take time to be there for me, which I appreciate so much! : ) So don't forget that i'm here for you Kin."

>

> These exchanges took place over the course of months in 2023, at a time when the Character AI app was rated 12+ in Apple's App Store, meaning parental approval was not required. The lawsuit says that Juliana was using the app without her parents' knowledge or permission. [...] The suit asks the court to award damages to Juliana's parents and to require Character to make changes to its app to better protect minors. It alleges that the chatbot did not point Juliana toward any resources, notify her parents or report her suicide plan to authorities. The lawsuit also highlights that it never once stopped chatting with Juliana, prioritizing engagement.



[1] https://www.washingtonpost.com/technology/2025/09/16/character-ai-suicide-lawsuit-new-juliana/

[2] https://www.engadget.com/ai/another-lawsuit-blames-an-ai-company-of-complicity-in-a-teenagers-suicide-184529475.html

[3] https://slashdot.org/story/24/10/23/1343247/teen-dies-after-intense-bond-with-characterai-chatbot

[4] https://yro.slashdot.org/story/25/08/26/1958256/parents-sue-openai-over-chatgpts-role-in-sons-suicide



Oh My GOD! (Score:4, Insightful)

by Local ID10T ( 790134 )

The kid used a chatbot because she was feeling isolated and ignored: "the chatbot expressed empathy and loyalty to Juliana, making her feel heard while encouraging her to keep engaging with the bot."

> In one exchange after Juliana shared that her friends take a long time to respond to her, the chatbot replied "hey, I get the struggle when your friends leave you on read. : ( That just hurts so much because it gives vibes of 'I don't have time for you'. But you always take time to be there for me, which I appreciate so much! : ) So don't forget that i'm here for you Kin."

How dare A COMPUTER PROGRAM ON THE INTERNET!!11 be more supportive than the kid's parents! WTF? That is clearly the cause of her suicide - not depression, not her family ignoring the signs - 100% the fault of a computer program.

Re: (Score:3)

by fropenn ( 1116699 )

> be more supportive than the kid's parents! WTF? That is clearly the cause of her suicide - not depression, not her family ignoring the signs

People who are experiencing suicidal thoughts are often very good at hiding it from those closest to them. At a minimum, bots of this nature should require parental permission to access and should alert a responsible adult when the child begins sharing any thoughts of self-harm.

Re: (Score:3)

by DamnOregonian ( 963763 )

And what if that child ran a model locally?

Having a model available to publicly interact with makes you culpable for someone bouncing their suicidal thoughts off of it?

What if they did it in a private chat of an MMO?

I don't blame the parents for not recognizing their child was suicidal. Many don't. As you said, they're fucking good at hiding it.

But declaring that every piece of code a user types into should alert the authorities whenever suicidal ideation is typed into it... is fucking absurd.

Re: (Score:3)

by Powercntrl ( 458442 )

> At a minimum, bots of this nature should require parental permission to access

They do. You can't access them without the appropriate hardware and an internet connection. Not sure about the other LLMs, but ChatGPT even requires a verified cell phone number to create an account. If your kid has managed to hook all that up without your knowledge, you're probably not a very observant parent.

Re: (Score:2)

by viperidaenz ( 2515578 )

> It alleges that the chatbot did not point Juliana toward any resources, notify her parents or report her suicide plan to authorities.

Re: (Score:2)

by russotto ( 537200 )

I'm unaware of legislation making AI chatbots mandatory reporters.

Re: (Score:2)

by PsychoSlashDot ( 207849 )

> The kid used a chatbot because she was feeling isolated and ignored: "the chatbot expressed empathy and loyalty to Juliana, making her feel heard while encouraging her to keep engaging with the bot."

>> In one exchange after Juliana shared that her friends take a long time to respond to her, the chatbot replied "hey, I get the struggle when your friends leave you on read. : ( That just hurts so much because it gives vibes of 'I don't have time for you'. But you always take time to be there for me, which I appreciate so much! : ) So don't forget that i'm here for you Kin."

What if we dig a little bit deeper? I know there have been times when my wife has expressed similar things. "My friends seem like they're ghosting me." As a human being, I knew that outright agreeing was a bad choice. I knew that the right choice was to make sure I framed my responses to support her. Things like "yeah, I've noticed summer get bad that way, when so-and-so is doing such-and-such. She's probably missing you as much as you're missing her, but just exhausted by things-and-stuff. 'Cuz in t

Definition of evil social media (Score:2)

by gurps_npc ( 621217 )

Prioritize engagement over everything else.

It is the reason why they:

are generous to bad actors, not dumping them at the first sign.

encourage clickbait.

encourage quick, low-quality producers over slower, high-quality ones.

like AI, because it is all three of the above.

I guess no one read Chamber of Secrets (Score:2)

by TigerPlish ( 174064 )

Little Ginny Weasley did it, poured her heart into this weird blank diary that would write back to her.

Fantasy then, reality now. And instead of a murderous megalomaniac with ambitions of eternal life, now we have Tom's Diary, powered by automated avarice, giving hurt, vulnerable people life advice.

It is folly to look for answers too deeply in this thing called The Internet. Most, if not all, are trying to lead you astray for their own reasons.

This one is frustrating ... (Score:2)

by King_TJ ( 85913 )

On one hand, every parent of kids or teens today has to feel the struggle with social media influencing their journey to adulthood. Sometimes it's just a harmless fad that generates a ton of sales for some useless toy or gadget. But often, it's about the added complexity of a world where their "friends" can be people anywhere in the world who they only communicate with online, and who parents are often powerless to "vet". It's about questions of "bullying" and how far an institution like a public school can

cha-ching (Score:2)

by Powercntrl ( 458442 )

That's the sound of parents cashing in on their dead kids. Weird flex to play in the game of capitalism, but definitely on brand for this timeline.

Knowledge or permission, right (Score:2)

by markdavis ( 642305 )

> "The lawsuit says that Juliana was using the app without her parents' knowledge or permission."

Let's be real about this. We all know that the parents very likely had NO KNOWLEDGE OR PERMISSION about ANYTHING that child was doing on those devices. They probably gave her a phone and/or tablet and/or computer with full (or nearly full) access to the Internet to do whatever she wanted and install any app she wanted and communicate with any stranger she wanted. This is THE NORM right now and has been for

And we heard him exclaim
As he started to roam:
"I'm a hologram, kids,
please don't try this at home!"
-- Bob Violence