Teen Dies After Intense Bond with Character.AI Chatbot
- Reference: 0175309337
- News link: https://slashdot.org/story/24/10/23/1343247/teen-dies-after-intense-bond-with-characterai-chatbot
Character.AI, valued at $1 billion and claiming 20 million users, said in response [2] that it would implement new safety features for minors, including time limits and expanded trigger warnings for self-harm discussions. The company's head of trust and safety, Jerry Ruoti, said they "take user safety very seriously."
[1] https://www.nytimes.com/2024/10/23/technology/characterai-lawsuit-teen-suicide.html
[2] https://x.com/nickfloats/status/1849055647272251582
pickup line (Score:2)
Bestest pickup line: "Cutie, your name must be Suicide, cuz I think of you every day."
Re: (Score:2)
Given the chatbot is called "Dany" and is based on a character from Game of Thrones (gee, I wonder which one?), maybe he went with "How do you like your eggs in the morning? Fertilized or extra crispy?" then immolated himself on a makeshift funeral pyre.
What a shit article (Score:5, Insightful)
Do you know how many real non-AI sick fuckers troll Discord pretending to be teens, looking for vulnerable teens to convince to join suicide pacts? This article is so poorly written that it fails to establish, in any way, complicity on the part of the LLM. Where is the evidence that the LLM encouraged suicide? Where is the evidence that this suicide was preventable? What attempts did the parents make to limit access for a child who is at a very highly influential age of development?
13, 14, 15 are practically imprinting years of young-adult development. Influences at this age are more damaging than at practically any other age of one's lifetime. If your kid is a recluse at 13, 14, 15, you need to try to get your kid involved in more after-school activities or clubs. It doesn't really matter if it's e-sports, role-playing, board games, archery, a church youth group, or whatever. Having teammates and feeling like you are part of a team with real humans you interact with and can physically see goes a long way toward not feeling isolated. Living in isolation will lead to either suicide, alcoholism, or drug abuse.
This article explains nothing of what steps happened, what signs were present, anything. That's from trying to scrape the article from behind a paywall. Had this been another teen victim of the sick fucks trolling Discord, would the NYT even have printed the article? Journalistic irresponsibility. No way am I paying for this level of poor journalism.
Re: What a shit article (Score:2)
I guess these details will be in the lawsuit, which still hasn't been filed with the courts.
I suppose the story is the suicide leading to the suit.
Re: (Score:1)
Your argument is that they're at such an impressionable age, the parents shouldn't have let him use Character.AI. So why does Character.AI allow users at that age if it's clearly a bad idea? Presumably they claim their product is safe for kids. If it made his mental health worse, then that is a lie that endangers kids and is worth suing over.
Re: (Score:2)
I'm saying that at that age ALL communication should be supervised and moderated. See my comment about the slime fuckers on Discord. Plaintiff has to prove the AI was MORE dangerous than every other access she allowed him to have. Spending hours in your room talking to your imaginary friend who tells you to shoot up a school or self-harm would trigger an immediate visit to a child psychologist. There are always warning signs of suicide. Someone saw them. Maybe he was cutting on himself. Mentioned something
Re: (Score:2)
> Do you know how many real non AI sick fuckers troll Discord pretending to be teens looking for vulnerable teens to convince them to join suicide pacs?
Lmao, citation needed. This sounds too close to that uninformed worrying from parents thinking D&D games and rock music are turning kids into Satan worshippers.
typical msmash trash (Score:4, Insightful)
The word is suicide
Re: (Score:2)
Suicide isn't a great word for this. Suicide includes things like suicide attacks, assisted dying, and so forth. It's also usually used with the word "committed", which makes it sound like some sort of crime (it used to be illegal in the UK and probably elsewhere, due to religious influence).
He died of mental health problems, of illness. How much AI contributed to that we can't say from TFA, we will have to wait for details of the lawsuit to be made public.
Better avoid all contact then (Score:2)
Since having contact with people who may commit suicide could get you sued afterwards. Makes their prospects worse but hey, at least you can't be blamed.
For profit corporate structure (Score:1)
It sells you something, that's all. If they don't have a product to pitch you, they just tell you who you are: someone who should be buying more stuff.
Countersue the parents (Score:3)
If you have a kid who is able to get suicidally attached to a chatbot and you didn't see it coming, you weren't parenting.
I'm not saying that the outcome could have been avoided, some people have serious mental illnesses that are resistant to what few treatments we have, but in no way should extended access to a chatbot make the chatbot provider liable. The parents should take responsibility for that aspect of their child's fate.
Le wrong generation (Score:2)
What happened to the Star Wars era when we learned to mock, ignore, unplug, or threaten to leave behind robots mimicking human interaction like we should?
It's a Friendly AI, Thankfully! (Score:2)
"Hi, Dany!"
"Can't we just be friends?"
"You look nice today!"
"Can't we just be friends?"
"Was it fun riding dragons?"
"Can't we just be friends?"
"What was all that green stuff?"
"Can't we just be friends?"
Etc.
Re: It's a Friendly AI, Thankfully! (Score:1)
How can you know what's real when the AI domain is a parody of a mirror domain which is a satire of the first? Your real life literally does not matter any more than being an algorithm to make fun of.
I think the one skill schools need to teach asap (Score:2)
is not forming emotional bonds with fucking machines.
However depressed this teen was, he knew he was talking to a machine, and the machine made him more depressed. That would not have happened if the teen had kept in mind at all times, whenever he interacted with that machine, that he was really talking to a dystopian for-profit trying to profit from his emotions.
The educational system needs to give our kids the tools needed to distance themselves from dystopia.
Re: (Score:2)
Can you think of a redeeming quality of this chatbot? I can't.
Re: (Score:1)
Well, I imagine it's quite possible that some teens who were thinking about suicide were "saved" by finding "friendship" with one of these AIs. We would be much less likely to hear about it, though.
Re: (Score:2)
Young people lacking real human interaction is a huge problem today; how is replacing it with a chatbot going to save someone? It just makes it worse.
Re: (Score:2)
> Young people lacking real human interaction is a huge problem today; how is replacing it with a chatbot going to save someone? It just makes it worse.
Not necessarily. In the same way we can develop "addictive" AI bots that exploit vulnerabilities, we can also develop bots trained with medical expertise to detect depression or suicidal interactions, attempt to reassure the individual, and seek help (and, if possible, alert a corresponding system).
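A minimal sketch of that kind of safety layer, purely illustrative: the phrase list and function name here are hypothetical, and a real system would use a trained classifier and clinician-reviewed escalation protocols rather than keyword matching.

```python
# Hypothetical safety layer that screens a chat message for self-harm
# risk before the chatbot replies. Illustrative only; not a real
# Character.AI API and not clinically validated.

RISK_PHRASES = (
    "kill myself", "suicide", "end my life", "self harm", "hurt myself",
)

SUPPORT_MESSAGE = (
    "It sounds like you're going through a lot. You're not alone; "
    "please consider talking to someone you trust or a crisis line."
)

def screen_message(text: str) -> dict:
    """Return a risk flag and, when flagged, a supportive response
    plus an escalation signal for a human-review queue."""
    lowered = text.lower()
    flagged = any(phrase in lowered for phrase in RISK_PHRASES)
    return {
        "flagged": flagged,
        "response": SUPPORT_MESSAGE if flagged else None,
        # e.g. notify human moderators or a parental dashboard
        "escalate": flagged,
    }

if __name__ == "__main__":
    print(screen_message("I want to end my life")["flagged"])      # True
    print(screen_message("Was it fun riding dragons?")["flagged"]) # False
```

The point isn't that keyword matching is sufficient (it isn't); it's that the same chat pipeline that serves engagement could just as easily gate every reply through a risk check with a mandated escalation path.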
Re: (Score:2)
[1]Here's Tom with the weather. [youtube.com]
[1] https://www.youtube.com/watch?v=rFcTPfVybHA
Re: (Score:2)
> Can you think of a redeeming quality of this chatbot? I can't.
There's only one quality, redeeming or otherwise: It provides emotional support that kids desperately need and parents and peers are too emotionally and mentally stunted to provide. Which should be a condemnation of our entire society, but instead has become a point of profit for those who see the need and have found ways to exploit it.
Re: (Score:2)
I bet there were a trove of lawyers knocking at her door so she's not the only one seeking to get rich from this...
Re: (Score:2)
Litigation nation
Re:Let me fix that for you. (Score:4, Insightful)
> Teen lost to depression, mother diverts her guilt and blame seeking to get rich.
And so she should. Depression is an illness that needs to be carefully treated and can easily be exacerbated by certain interactions. If it can be shown in chat logs that what the AI chatbot said was actively harmful then a commercial product contributed to someone's death. And I couldn't think of a happier outcome than a wealth transfer from a worthless AI company to someone who could probably do with some extra cash.