

Google and Character.AI Agree To Settle Lawsuits Over Teen Suicides

(Wednesday January 07, 2026 @10:30PM (BeauHD) from the regulation-by-lawsuits dept.)


Google and Character.AI have [1]agreed to settle multiple lawsuits from families alleging the chatbot encouraged [2]self-harm [3]and suicide among teens. "The settlements would mark the first resolutions in the wave of lawsuits against tech companies whose AI chatbots encouraged teens to hurt or kill themselves," notes Axios. From the report:

> Families allege that Character.AI's chatbot encouraged their children to cut their arms, suggested murdering their parents, wrote sexually explicit messages and did not discourage suicide, per lawsuits and congressional testimony. "Parties have agreed to a mediated settlement in principle to resolve all claims between them in the above-referenced matter," one document filed in U.S. District Court for the Middle District of Florida reads.

>

> The documents do not contain any specific monetary amounts for the settlements. Pricy settlements could deter companies from continuing to offer chatbot products to kids. But without new laws on the books, don't expect major changes across the industry.

Last October, Character.AI said it would [4]bar people under 18 from using its chatbots, in a sweeping move to address concerns over child safety.



[1] https://www.axios.com/2026/01/07/google-character-ai-lawsuits-teen-suicides

[2] https://slashdot.org/story/24/10/23/1343247/teen-dies-after-intense-bond-with-characterai-chatbot

[3] https://slashdot.org/story/25/09/16/1959230/another-lawsuit-blames-an-ai-company-of-complicity-in-a-teenagers-suicide

[4] https://slashdot.org/story/25/10/29/213211/characterai-to-bar-children-under-18-from-using-its-chatbots



Interesting (Score:2)

by liqu1d ( 4349325 )

So does a company the size of Google settling mean that OpenAI is in trouble in its own ongoing suits?

CEO can add to resume (Score:2)

by RitchCraft ( 6454710 )

The CEOs can add assisted suicide to their list of qualifications. That goes for the rest of you programmers helping to push this slop on us. Shame on you.

Re: (Score:2)

by znrt ( 2424692 )

> The CEOs can add assisted suicide to their list of qualifications.

true, developing such a product for kids is reckless.

but in other news:

- parents leave kids unattended with chatbot

- kids cut their arms

- enraged parents sue the company while still leaving their kids unattended on the internets, and get a payout for having left their kids unattended with a chatbot

is that reinforcing proper behavior?

> That goes for the rest of you programmers helping to push this slop on us. Shame on you.

i would include the parents in that. at least parents should know their kids. programmers can't always know where and how their work will end up being used.

Re: (Score:2)

by jacks smirking reven ( 909048 )

[1] Look, we put a label on every bag that says, "Kid! Be careful – broken glass!" I mean, we sell a lot of products in the "Bag O'" line... like Bag O' Glass, Bag O' Nails, Bag O' Bugs, Bag O' Vipers, Bag O' Sulfuric Acid. They're decent toys, you know what I mean? [youtube.com]

[1] https://www.youtube.com/watch?v=veMiNQifZcM

Re: (Score:2)

by Powercntrl ( 458442 )

> The CEOs can add assisted suicide to their list of qualifications.

Didn't we just have several tech companies that scattered a bunch of scooters all over the place and then literally said "go ride these in traffic"? If that's somehow okay, I'm really not seeing how some kid offing himself after getting bad advice from a chatbot is anywhere near the same level of liability exposure.

Oh right, you're technically supposed to be 18 to ride those scooters.
