News: 0181102178


Meta Loses Trial After Arguing Child Exploitation Was 'Inevitable' (arstechnica.com)

(Wednesday March 25, 2026 @06:00PM (BeauHD) from the not-a-great-defense dept.)


Meta lost a child safety trial in New Mexico after a court found that its platforms [1]failed to adequately protect children from exploitation and misled parents about app safety. According to Ars Technica, the jury on Tuesday "deliberated for only one day before agreeing that Meta should pay $375 million in civil damages..." While the jury declined to impose the maximum penalty New Mexico sought, which could have cost the company $2.2 billion, Meta may still face additional financial penalties and could be forced to make changes to its apps. From the report:

> The trial followed a 2023 lawsuit filed by New Mexico Attorney General Raul Torrez after The Guardian [2]published a two-year investigation exposing child sex trafficking markets on Facebook and Instagram. Torrez's office then conducted an undercover investigation codenamed "Operation MetaPhile," in which officers posed as children on Facebook, Instagram, and WhatsApp. The jury heard that these fake profiles were "simply inundated with images and targeted solicitations" from child abusers, Torrez [3]told CNBC in 2024. Ultimately, three men were arrested amid the sting for attempting to use Meta's social networks to prey on children. At trial, Mark Zuckerberg and Instagram chief Adam Mosseri testified that "harms to children, such as sexual exploitation and detriments to mental health, were inevitable on the company's platforms due to their vast user bases," The Guardian reported. Internal messages and documents, as well as testimony from child safety experts within and outside the company, showed that Meta repeatedly ignored warnings and failed to fix platforms to protect kids, New Mexico's AG successfully argued.

>

> Perhaps most troubling to the jury, law enforcement and the National Center for Missing and Exploited Children also testified that Meta's reporting of crimes against children on its apps -- including child sexual abuse materials (CSAM) -- was "deficient," The Guardian reported. Rather than make it easy to trace harms on its platforms, the jury learned from frustrated cops that Meta "generated high volumes of 'junk' reports by overly relying on AI to moderate its platforms." This made its reporting "useless" and "meant crimes could not be investigated," The Guardian reported.

>

> Celebrating the win as a "historic victory," Torrez [4]told CNBC that families had previously paid the price for "Meta's choice to put profits over kids' safety." "Meta executives knew their products harmed children, disregarded warnings from their own employees, and lied to the public about what they knew," Torrez said. "Today the jury joined families, educators, and child safety experts in saying enough is enough."

Meta said the company plans to appeal the verdict. "We respectfully disagree with the verdict and will appeal," Meta's spokesperson said. "We work hard to keep people safe on our platforms and are clear about the challenges of identifying and removing bad actors or harmful content. We will continue to defend ourselves vigorously, and we remain confident in our record of protecting teens online."



[1] https://arstechnica.com/tech-policy/2026/03/meta-loses-trial-after-arguing-child-exploitation-was-inevitable-on-its-apps/

[2] https://www.theguardian.com/news/2023/apr/27/how-facebook-and-instagram-became-marketplaces-for-child-sex-trafficking

[3] https://www.cnbc.com/2024/01/31/meta-chief-mark-zuckerberg-criticized-over-child-sex-targeting-on-site.html

[4] https://www.cnbc.com/2026/03/24/jury-reaches-verdict-in-meta-child-safety-trial-in-new-mexico.html



Exploitation of children is inevitable??? (Score:5, Funny)

by Locke2005 ( 849178 )

Darn, why didn't Epstein's lawyers use that excuse?

Re: (Score:1, Redundant)

by alvinrod ( 889928 )

There's a difference between Facebook, which didn't do a good job of policing its platform, and Epstein, who committed the acts himself. Consider that the bits were transmitted by some ISP, yet you would think it absurd to punish the ISP, just as it would be stupid to put Chevy on trial because some bank robbers used a Camaro as a getaway vehicle. If you tried to charge the ISP, it would also argue that some illegal activity is inevitable. It's impossible to prevent all crime, but the law is that Face

Re: (Score:2)

by dowhileor ( 7796472 )

I am now more encouraged by the fact that a service or product is not inherently protected just because the self-labeled "industrialists" providing it say in their defense that people are "too stupid" to reject it.

Re: (Score:3)

by Calydor ( 739835 )

The difference is that if they really did their best to prevent it and someone slipped through the cracks anyway they could honestly say that they did their best.

They didn't do their best, though. They just didn't care. Or worse, they deemed it not worth the expense to even try to protect kids. That's different.

Re: (Score:2)

by boskone ( 234014 )

or even worse... it's a feature, not a bug

Re: (Score:2)

by jd ( 1658 )

It is legitimate for any service that constitutes a "common carrier" to be free of consequences for what it carries. But Meta do not claim to be a "common carrier", and that changes the nature of the playing field substantially. As soon as a service can inspect messages and moderate, it is no longer eligible to claim that it is not responsible for what it carries.

Your counter-argument holds some merit, but runs into two problems.

First, society deems any service that monitors to be liable. That may well be u

Re: Exploitation of children is inevitable??? (Score:2)

by ToasterMonkey ( 467067 )

Uhhh... welcome to 1996?

[1]https://www.law.cornell.edu/us... [cornell.edu]

> (c) Protection for "Good Samaritan" blocking and screening of offensive material

> (1) Treatment of publisher or speaker

> No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

> (2) Civil liability

> No provider or user of an interactive computer service shall be held liable on account of -

> (A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or

[1] https://www.law.cornell.edu/uscode/text/47/230#

Re: (Score:2)

by dowhileor ( 7796472 )

> Darn, why didn't Epstein's lawyers use that excuse?

He did. They long argued that exploitation was inevitable, even constitutional, and that a person's influence in this civilization was chained to how willing you were to exploit Epstein-like situations. That much was obvious, and I'm just going by the following: the names dropped as holding the baton tended to be of a certain financial security, while the people receiving said baton were below that ceiling.

Re: (Score:2)

by dowhileor ( 7796472 )

and completely off topic.

but does this website have the right to monitor your browsing and remove threads they do not want you to see?

just asking?

Re: (Score:2)

by dowhileor ( 7796472 )

> and completely off topic.

> but does this website have the right to monitor your browsing and remove threads they do not want you to see?

> just asking?

and again not the topic of the article.

but can they deny others the ability to moderate one comment as opposed to another?

just asking again.

Re: (Score:2)

by stabiesoft ( 733417 )

Not sure if you are asking can /. remove a comment or can /. adjust which comments a specific user sees. As to the first, yes /. can remove a comment. The flag symbol is there to report a comment which someone considers a problem, and /. can remove it.

Re: (Score:2)

by dowhileor ( 7796472 )

by "others" I mean "anyone"'s ability to moderate, and I guess to see, to clarify the references in both posts. and I did not mention that any flagging action by the viewer or viewers was initiated to cause this, assuming no action was requested. but thank you for your reply.

Re: (Score:2)

by stabiesoft ( 733417 )

If you get mod points, then yes someone with those points could downvote a post. And in so doing, depending on how you have set your filter for what you see, the post could disappear. I usually set mine at 0, so I see most stuff, but stuff that gets mod'ed to -1, I won't see. Many set the thresholds higher.

\o/ (Score:1)

by easyTree ( 1042254 )

Wow, if that's the best they can think of to surface to others (on trial no less), one can only wonder what lurks in the deep dark depths of their secret intentions.

Re: (Score:1)

by easyTree ( 1042254 )

I forgot to say that clearly the answer is encryption back-doors for law enforcement. Oh wait.

Re: (Score:2)

by alvinrod ( 889928 )

What were their alternatives? They weren't doing anything about the problem, so they had nothing else to point to. Their lawyers can't outright lie and claim Facebook did things to try to stop the problem when it didn't, so this was the one excuse that was presented. It's no different than a murder trial where it's clear that the defendant is guilty, but the defense presents an absurd theory that no one buys, because they have to have some alternative explanation. If Facebook had done more, then their lawyers wou

Meta? (Score:5, Informative)

by Locke2005 ( 849178 )

The same company that demanded a copy of my driver's license, and then changed my account name to because their support monkey was too stupid to realize that Oregon driver's licenses print the last name first? I can't believe THAT company would shirk its responsibility to protect children! My daughter, like all her peers, originally created her Facebook account by simply lying about her age, after Facebook decided it would comply with COPPA by simply barring anyone admitting to being under age 13 from creating an account.

Re: (Score:2)

by Moryath ( 553296 )

The same company that banned people for linking to the British Holocaust Museum (HMD.Org.UK) claiming the links were violations of the policy on "dangerous individuals and organizations." Zuckerberg is a pedophilic little-boy-raping sick-trash Nazi.

I've seen this one before (Score:1)

by Iamthecheese ( 1264298 )

This is the episode where mainstream press pushes "This company is evil therefore we need to trample the constitution". Then the partisan bickering starts, the clowns pile out of the car in DC, and the tightrope walker nearly falls. By the end of the episode the henchman has hidden the bomb in the bill and the villain is sneaking away with the dosh.

Re: I've seen this one before (Score:2)

by retchdog ( 1319261 )

"First they came for the amoral megacorporations run by lizard people!"

Re: (Score:2)

by serviscope_minor ( 664417 )

First they came for the amoral megacorporations run by lizard people!

Are you claiming that Zuck is a person (even if of the lizard variety)? I do not buy it.

Re: (Score:2)

by Locke2005 ( 849178 )

In real life, conflicts like this are resolved in favor of whichever party donates the most to political campaigns. How much has Zuckerberg/Meta donated? How much have the victims donated?

Re: (Score:2)

by ArchieBunker ( 132337 )

What constitutional rights of Meta were violated?

Re: (Score:2)

by Locke2005 ( 849178 )

The child victims are discriminating against lizard people!

Re: (Score:2)

by dowhileor ( 7796472 )

> What constitutional rights of Meta were violated?

I would like to know also?

Re: I've seen this one before (Score:1)

by retchdog ( 1319261 )

The right to unaccountable profit.

Accountability is tantamount to socialism. A few potential child penetrations is a small price to pay for FREEDOM OF CHOICE. (besides they were probably crisis actors planted by Soros.)

Re: (Score:1)

by Iamthecheese ( 1264298 )

If your children do not grow up free, they will not grow up safe.

Re: (Score:2)

by ArchieBunker ( 132337 )

You sound like a poorly written perl script.

Re: (Score:1)

by Iamthecheese ( 1264298 )

You sound like a naive, teamism-consumed person.

family (Score:1)

by noshellswill ( 598066 )

So Zuck's a perv. An incautious prodigy of pervosity. And his company attracts other pervs ... keeping growth in the family, so to speak. Sure, oopses happen, but the $375M wrist-slap fools nobody. I mean, pervosity is as natural as META, so why apply a real heart-stopping penalty?

Re: (Score:2)

by sarren1901 ( 5415506 )

You know, 18 is a teenager AND an adult who could sign up for eHarmony.

priorities (Score:2)

by toxonix ( 1793960 )

"harms to children, such as sexual exploitation and detriments to mental health, were inevitable on the company's platforms due to their vast user bases"

Zuck and his lawyer are admitting that the very nature of their business is detrimental to society. What is the upside? They make hundreds of billions of dollars while firing half of their staff because their agentic AI tool is supposed to be good enough to replace some 40k employees. It seems like working in any corporation there's always a big part of the

AI moderation... what are the alternatives? (Score:2)

by dgatwood ( 11270 )

> Rather than make it easy to trace harms on its platforms, the jury learned from frustrated cops that Meta "generated high volumes of 'junk' reports by overly relying on AI to moderate its platforms." This made its reporting "useless" and "meant crimes could not be investigated," The Guardian reported.

What, exactly, do they think the alternatives are?

Facebook has over 3 billion users. If they output an average of twenty artifacts (posts, replies, direct messages, or images/videos) per day, that's 60 billion outputs. If 1% of those are videos that are an average of three minutes long, that's 1.8 billion minutes of video, and if the other 99% take thirty seconds to moderate, that's another 29.7 billion minutes, for a total of 31.5 billion minutes per day to moderate.

That's 65.6 million workdays of conten
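The back-of-the-envelope arithmetic above can be checked in a few lines. The figures (3 billion users, twenty artifacts per user per day, 1% video at three minutes, thirty seconds per non-video item, eight-hour workdays) are the comment's assumptions, not real Meta numbers:

```python
# Rough estimate of human moderation workload for a Facebook-scale
# platform, using the assumed figures from the comment above.

users = 3_000_000_000           # assumed active users
artifacts_per_user = 20         # posts, replies, DMs, images/videos per day

outputs = users * artifacts_per_user               # 60 billion items/day

video_share = 0.01              # 1% of items are videos
video_minutes = outputs * video_share * 3          # 3 min avg per video
other_minutes = outputs * (1 - video_share) * 0.5  # 30 s per other item

total_minutes = video_minutes + other_minutes      # ~31.5 billion min/day
workdays = total_minutes / (8 * 60)                # 8-hour workdays

print(f"{outputs:,} items/day")
print(f"{total_minutes / 1e9:.1f} billion moderation minutes/day")
print(f"{workdays / 1e6:.1f} million workdays of review per day")
```

Even at these conservative rates, fully human review would require tens of millions of full-time moderator-days every day, which is the core of the commenter's point about why AI ends up in the loop.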
