Google and Character.AI Agree To Settle Lawsuits Over Teen Suicides
- News link: https://yro.slashdot.org/story/26/01/07/2211252/google-and-characterai-agree-to-settle-lawsuits-over-teen-suicides
> Families allege that Character.AI's chatbot encouraged their children to cut their arms, suggested murdering their parents, wrote sexually explicit messages and did not discourage suicide, per lawsuits and congressional testimony. "Parties have agreed to a mediated settlement in principle to resolve all claims between them in the above-referenced matter," one document filed in U.S. District Court for the Middle District of Florida reads.
>
> The documents do not contain any specific monetary amounts for the settlements. Pricey settlements could deter companies from continuing to offer chatbot products to kids. But without new laws on the books, don't expect major changes across the industry.
Last October, Character.AI said it would [4]bar people under 18 from using its chatbots, in a sweeping move to address concerns over child safety.
[1] https://www.axios.com/2026/01/07/google-character-ai-lawsuits-teen-suicides
[2] https://slashdot.org/story/24/10/23/1343247/teen-dies-after-intense-bond-with-characterai-chatbot
[3] https://slashdot.org/story/25/09/16/1959230/another-lawsuit-blames-an-ai-company-of-complicity-in-a-teenagers-suicide
[4] https://slashdot.org/story/25/10/29/213211/characterai-to-bar-children-under-18-from-using-its-chatbots
CEO can add to resume (Score:2)
The CEOs can add assisted suicide to their list of qualifications. That goes for the rest of you programmers helping to push this slop on us. Shame on you.
Re: (Score:2)
> The CEOs can add assisted suicide to their list of qualifications.
true, developing such a product for kids is reckless.
but in other news:
- parents leave kids unattended with chatbot
- kids cut their arms
- enraged parents sue the company while still leaving kids unattended on the internets, and get rewarded for leaving kids unattended with a chatbot
is that reinforcing proper behavior?
> That goes for the rest of you programmers helping to push this slop on us. Shame on you.
i would include the parents in that. parents, at least, should know their kids. programmers can't always know where and how their work will end up being used.
Re: (Score:2)
[1] Look, we put a label on every bag that says, "Kid! Be careful – broken glass!" I mean, we sell a lot of products in the "Bag O'" line... like Bag O' Glass, Bag O' Nails, Bag O' Bugs, Bag O' Vipers, Bag O' Sulfuric Acid. They're decent toys, you know what I mean? [youtube.com]
[1] https://www.youtube.com/watch?v=veMiNQifZcM
Re: (Score:2)
> The CEOs can add assisted suicide to their list of qualifications.
Didn't we just have several tech companies that scattered a bunch of scooters all over the place and then literally said "go ride these in traffic"? If that's somehow okay, I'm really not seeing how some kid offing himself after getting bad advice from a chatbot is anywhere near on the same level of exposure to liability.
Oh right, you're technically supposed to be 18 to ride those scooters, too.
Interesting (Score:2)
So does a company the size of Google settling mean that OpenAI is in trouble in its ongoing suit?