News: 0178617748


AI Industry Horrified To Face Largest Copyright Class Action Ever Certified (arstechnica.com)

(Friday August 08, 2025 @11:30PM (BeauHD) from the financial-ruin dept.)


An anonymous reader quotes a report from Ars Technica:

> AI industry groups are urging an appeals court to block what they say is the [1]largest copyright class action ever certified. They've warned that a single lawsuit raised by three authors over Anthropic's AI training now threatens to "financially ruin" the entire AI industry if [2]up to 7 million claimants end up joining the litigation and forcing a settlement. Last week, Anthropic [3]petitioned (PDF) to appeal the class certification, urging the court to weigh questions that the district court judge, William Alsup, seemingly did not. Alsup allegedly failed to conduct a "rigorous analysis" of the potential class and instead based his judgment on his "50 years" of experience, Anthropic said.

>

> If the appeals court denies the petition, Anthropic argued, the emerging company may be doomed. As Anthropic argued, it now "faces hundreds of billions of dollars in potential damages liability at trial in four months" based on a class certification rushed at "warp speed" that involves "up to seven million potential claimants, whose works span a century of publishing history," each possibly triggering a $150,000 fine. Confronted with such extreme potential damages, Anthropic may lose its rights to raise valid defenses of its AI training, deciding it would be more prudent to settle, the company argued. And that could set an alarming precedent, considering all the other lawsuits generative AI (GenAI) companies face over training on copyrighted materials, Anthropic argued. "One district court's errors should not be allowed to decide the fate of a transformational GenAI company like Anthropic or so heavily influence the future of the GenAI industry generally," Anthropic wrote. "This Court can and should intervene now."

>

> In a court [4]filing Thursday, the Consumer Technology Association and the Computer and Communications Industry Association backed Anthropic, warning the appeals court that "the district court's erroneous class certification" would threaten "immense harm not only to a single AI company, but to the entire fledgling AI industry and to America's global technological competitiveness." According to the groups, allowing copyright class actions in AI training cases will result in a future where copyright questions remain unresolved and the risk of "emboldened" claimants forcing enormous settlements will chill investments in AI. "Such potential liability in this case exerts incredibly coercive settlement pressure for Anthropic," industry groups argued, concluding that "as generative AI begins to shape the trajectory of the global economy, the technology industry cannot withstand such devastating litigation. The United States currently may be the global leader in AI development, but that could change if litigation stymies investment by imposing excessive damages on AI companies."



[1] https://arstechnica.com/tech-policy/2025/08/ai-industry-horrified-to-face-largest-copyright-class-action-ever-certified/

[2] https://yro.slashdot.org/story/25/07/17/1548245/judge-allows-nationwide-class-action-against-anthropic-over-alleged-piracy-of-7-million-books-for-ai-training

[3] https://storage.courtlistener.com/recap/gov.uscourts.ca9.8d0d59a1-7a6e-f011-a2d9-001dd80ea460/gov.uscourts.ca9.8d0d59a1-7a6e-f011-a2d9-001dd80ea460.1.0.pdf

[4] https://ccianet.org/library/amicus-brief-of-technet-ccia-et-al-in-bartz-v-anthopic-9th-cir/



Re: (Score:2)

by quonset ( 4839537 )

> Is Slashdot letting Donald Trump write the headlines again?

That is the literal headline from Ars Technica.

Maybe? (Score:5, Insightful)

by fropenn ( 1116699 )

> They've warned that a single lawsuit raised by three authors over Anthropic's AI training now threatens to "financially ruin" the entire AI industry if up to 7 million claimants end up joining the litigation and forcing a settlement

Maybe don't steal stuff from so many people if you can't handle the consequences?

Re: (Score:1)

by quonset ( 4839537 )

>> They've warned that a single lawsuit raised by three authors over Anthropic's AI training now threatens to "financially ruin" the entire AI industry if up to 7 million claimants end up joining the litigation and forcing a settlement

> Maybe don't steal stuff from so many people if you can't handle the consequences?

Like people who steal movies, software, videos, and games, right?

Re:Maybe? (Score:5, Insightful)

by nonsenseponsense ( 10297685 )

Oddly enough people have been punished and gone to jail for stealing movies, software, videos, and games. The laws haven't changed. Are you suggesting that we should make an exception now that it's companies or those with deep pockets committing the theft?

Re: (Score:2)

by arglebargle_xiv ( 2212710 )

Exactly. Can anyone see my world's smallest violin? I'd like to play something for Anthropic but can't seem to find it anywhere.

Re: (Score:2)

by h33t l4x0r ( 4107715 )

I'm pretty sure that's bullshit, nobody ever went to jail for streaming a pirated movie. For distribution yes.

Re: (Score:2)

by dinfinity ( 2300094 )

It's crazy how Slashdot's commentariat has pivoted to spouting and defending the bullshit the RIAA and MPAA have always been spewing and which old Slashdot would have shat on liberally.

Re: (Score:2)

by ebunga ( 95613 )

Well, everyone was a poor high school student 25 years ago and couldn't afford crap and we were all losers that nobody wanted to hang around anyways.

Re: Maybe? (Score:2)

by blue trane ( 110704 )

Just how hard did you sell out, my good man?

Re:Maybe? (Score:5, Insightful)

by jacks smirking reven ( 909048 )

Leave it to the tech sector to end up creating something we can all hate worse than the RIAA and MPAA.

Impressive when you think of it that way.

Re: (Score:2)

by troff ( 529250 )

Which raises the question: where the hell ARE the RIAA and MPAA in all of this? Why aren't they getting in on this?

Re: (Score:3)

by Sebby ( 238625 )

> Which raises the question: where the hell ARE the RIAA and MPAA in all of this? Why aren't they getting in on this?

They're there - it's just a different strategy:

Before they needed to make an example of people publicly to stem piracy of "their" content - many fronts (individuals) to deal with to make that happen.

With AI companies, they can just use their size & leverage once others have successfully argued that content is being scraped "illegitimately", and force some perpetual periodic/usage licensing or percentage fee, like they've done for small businesses that play radio or songs in their establishments.

Re: (Score:3)

by SeaFox ( 739806 )

They don't want the AI industry squashed by a huge judgement (or for AI to not be able to train on previous creative works).

They still have a dream of a future where their corporate leadership can sit in a chair and make new product simply by describing the music or movie to a computer with voice recognition, never having to deal with (let alone pay) any other human being in the process to make it.

Re: (Score:2)

by 0123456 ( 636235 )

I don't think Slashdot was ever in favor of big business stealing other people's movies, books or games and selling them as their own.

Re: Maybe? (Score:2)

by blue trane ( 110704 )

Why can't AI jailbreakers break the subscription payment part of AI?

Re:Maybe? (Score:4, Insightful)

by StormReaver ( 59959 )

> ...how Slashdot's commentariat...

The two are completely different. People downloading movies and songs for themselves and friends, while morally gray, is vastly different from AI companies taking people's works and reselling them. And don't even try to sell me the nonsense that LLMs are "learning" like a human does. That is complete and utter bullshit. LLMs and other so-called AI programs are retrieval, processing, and storage engines. The term "AI" is just a marketing term for the gullible and the ignorant.

Re: Maybe? (Score:2)

by blue trane ( 110704 )

How smart was Buddy Bolden, who refused to record because others would just steal his licks?

Re: (Score:2)

by troff ( 529250 )

And this is why you're stupid. You're not getting the fact that the Slashdot Commentariat want some consistency in the way this stupid world is inconsistently running.

Re: (Score:3)

by Petersko ( 564140 )

Right. Those things are wrong too. People like to do the mental gymnastics required to convince themselves otherwise, but it's horseshit.

If it's okay to take those things without consequence it should be equally fine to ignore the GPL and do whatever the hell you like with the software.

Re: Maybe? (Score:2)

by blue trane ( 110704 )

Anyone else remember the Bugroff license?

[1]https://www.lonsteins.com/post... [lonsteins.com]

"The âoeNo problem Bugroffâ license is as followsâ¦

The answer to any and every question relating to the copyright, patents, legal issues of Bugroff licensed software isâ¦.

Sure, No problem. Donâ(TM)t worry, be happy. Now bugger off."

[1] https://www.lonsteins.com/posts/bugroff-license/

Personal use versus commercialized service (Score:4, Insightful)

by ebunga ( 95613 )

It's one thing to download a movie without paying for it. It's not good if you share it with others. It's absolutely wrong on all levels when you turn it into a commercial service. It's a few hundred orders of magnitude worse when you're doing this at the scale of the generative AI companies.

Re: (Score:2)

by registrations_suck ( 1075251 )

> It's one thing to download a movie without paying for it. It's not good if you share it with others.

To download something requires someone else sharing it with others. So why is the first one "one thing" and the second one somehow any different?

> It's absolutely wrong on all levels when you turn it into a commercial service.

Why is taking something that isn't yours any better than taking something that isn't yours and selling it to someone else?

Re: (Score:2)

by ffkom ( 3519199 )

>> Maybe don't steal stuff from so many people if you can't handle the consequences?

> Like people who steal movies, software, videos, and games, right?

Those people that you mean, did they sell the stuff they downloaded to other people for lots of money, like today's "AI"-industry?

Re: (Score:2)

by Mspangler ( 770054 )

If you sell copies or make money off of the copyrighted works then you can expect to be hammered flat.

Are the AI companies making money from inappropriate use of copyrighted works? Then hammer them.

If the AI is fed only material from expired copyrights and non-copyrightable facts and government publications then they are in the clear though their prose might be in the style of Victorian times.

Re: (Score:1)

by Blackjetta ( 807781 )

> Like people who steal movies, software, videos, and games, right?

And those people get sued too. The record labels were ruthless in the MP3 era just for starters.

Comparing apples to oranges (Score:2)

by Sebby ( 238625 )

> Like people who steal movies, software, videos, and games, right?

I've [1]already explained [slashdot.org] how the above argument is absolutely invalid/irrelevant:

> Comparing apples to oranges.

> Multi-billion$ AI companies scrape content, then repeatedly sell access to services that use that content at scale without compensation to the creators, without whose content those companies would have nothing to offer in the first place.

> Quite different than some individual "stealing" a song for their own use (sure there's some level of deprivation of funding to the creator, but they're not making mone

[1] https://slashdot.org/comments.pl?sid=23732992&cid=65489354

Re: Maybe? (Score:2)

by djp2204 ( 713741 )

But technology is about the free exchange of other people's ideas!

I wanna make out with my MonroeBot ;)

Re: (Score:2)

by Dr. Tom ( 23206 )

keyword "if"

Re: (Score:1)

by badboy_tw2002 ( 524611 )

You wouldn't steal a pizza! Why would you steal a movie!?!

Re: (Score:1)

by easyTree ( 1042254 )

> Maybe don't steal stuff from so many people if you can't handle the consequences?

Techbros want all the money so techbros gotta steal all the stuff

Re: (Score:2)

by ahoffer0 ( 1372847 )

Many people rely on OpenAI and Anthropic. For example, I use their products to write a lot of bash scripts. If those companies go offline, where will people turn? Will everyone flock to Chinese AI offerings? It is going to be a lot harder to shut those down for IP reasons.

Re: (Score:1)

by 0123456 ( 636235 )

They'll just need to have a contract with the people whose content they use which allows them to legally use it.

Doesn't Microsoft make people on GitHub agree to allow them to use all their code to train AIs, for example?

Re: (Score:3)

by troff ( 529250 )

> For example, I use their products to write a lot of bash scripts. If those companies go offline, where will people turn?

How about pick up a fucking book and learn how to use your fucking computer like you weren't a decerebrate child? If you can't write bash scripts, then go back to your fucking Windows box.

LMFAO! (Score:3, Funny)

by Shakes Fist ( 10502847 )

I laughed so loud! These tech bros creaming in billions and now might have to face the consequences of producing a hallucinogenic program used to threaten millions of workers? Screw them.

Re: (Score:3)

by Ksevio ( 865461 )

If it's a hallucinogenic program then what's the copyright infringement?

Re: (Score:2)

by drinkypoo ( 153816 )

> If it's a hallucinogenic program then what's the copyright infringement?

The training set is itself presumably typically infringing whether the output is or not.

I personally don't think the output is infringing, but there is a plausible legal argument to be made that it is if the output is similar enough to copyrighted input.

But we're rich (Score:4, Insightful)

by medusa-v2 ( 3669719 )

Once upon a time, things didn't work out for Napster. Personally I felt that copyright rules had been skewed too far against the general public, but in that time period the general pattern was that if you couldn't run a business without breaking the law, then you'd just go out of business.

I guess these days I still feel like the rules are still skewed too far against the general public. The big difference now is this expectation that not only do the extremely wealthy rewrite the rules in their favor, they also take it as a given that if, somehow, they encounter a rule that doesn't let them do whatever they want, the rule must not have been intended to apply to them in the first place, so why should they even have to go to the trouble of getting it rewritten before they ignore it?

Neural Imaging (Score:1)

by kurt_cordial ( 6208254 )

Generations 1-4 effectively reproduced organic vision, matrix, etc. to the tune of non-marketing cultists protesting it was promised in the 1960's. Generation 5 has distinguished deeper cognizance and stable self-aware video (a la Futurism or Cubism). If the Judge lasts longer than the IRS chief or crypto billionaire Scaramucci, he will not wind up dead... or wwwhatevs.com

This line says it all (Score:5, Insightful)

by Rinnon ( 1474161 )

> "One district court's errors should not be allowed to decide the fate of a transformational GenAI company like Anthropic or so heavily influence the future of the GenAI industry generally,"

Translation: "We're so big and important that we're above the law."

No love (Score:3)

by RitchCraft ( 6454710 )

Sorry, there is no love lost here. These companies have been pushing snake oil for long enough. Time to rid the yard of snakes.

Re: (Score:2)

by Dr. Tom ( 23206 )

Fun fact, in Japan they call it Toad Oil, "gama no abura"

It's exactly the same thing, except toads instead of snakes

Not as horrified as the artists. (Score:2)

by gurps_npc ( 621217 )

The artists are far more horrified by how AI is destroying their business.

Voiceovers, commercial artwork, etc. have all become 'cheap' because they no longer need to pay artists.

The artists find the total elimination of their work to be devastating.

Then share in the spoils (Score:4, Insightful)

by sziring ( 2245650 )

If they can't pay upfront, they need to make good with some other method: shares (public or private), or something of value like perpetual revenue from MAX accounts, etc. It is beyond crazy how they argue it's fair when it isn't. If I trained my AI using their system they would slap me with a lawsuit so fast... but hey, might makes right.

They'll come up with a compromise (Score:3)

by wakeboarder ( 2695839 )

The US gov isn't going to let AI die; they'll hash out a compromise with the copyright holders. The stupid thing is an individual can't break copyright; they'll lose their whole net worth over copyright infringement. A corporation can commit copyright infringement and get away with it (or maybe not). If they go after Anthropic, OpenAI and Meta are next.

Re: (Score:2)

by troff ( 529250 )

> The US gov isn't going to let AI die

Clearly the US gov needs to die right along with AI.

Consequences (Score:2)

by rizznay ( 1696094 )

Well if it isn't the consequences of my own actions. All I have to say about this is, good, fuck 'em. If your business can't survive without stealing from others, your business doesn't need to exist and should be illegal.

Re: (Score:2)

by sarren1901 ( 5415506 )

It's copyright infringement, it's not theft. That's the motto around here, right?

Good (Score:3)

by dskoll ( 99328 )

About time the AI industry is held accountable for its theft.

And no, I don't condone stealing software, movies or videogames either. But these AI companies are stealing on a far more massive scale than someone who copies the odd movie or game.

Re: (Score:2)

by sarren1901 ( 5415506 )

I think the difference between just copying something to use for your own personal usage is nothing like what these AI companies are doing. Now if I was distributing these things I copied and charging money for it, that would be akin to what these AI companies are doing.

Has a point (Score:1, Flamebait)

by colonslash ( 544210 )

Look, $150K per work × 7 million potential claimants = over $1 trillion in damages. That's not justice, that's basically a kill switch for the entire industry.

Here's the thing everyone's missing: AI training is closer to how compression algorithms work than traditional copying. When you train GPT on a book, it's not storing the book - it's extracting statistical patterns about how words relate to each other. The actual text gets compressed into a few bytes of weight adjustments across billions of parameters.
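For scale, that trillion-dollar figure follows directly from the statutory ceiling. A minimal back-of-the-envelope sketch, assuming the $150,000-per-work maximum and the 7 million potential works cited in the summary (both numbers come from the article, not from any independent damages model):

# Back-of-the-envelope worst-case exposure, using the figures quoted in the summary.
# Both inputs are assumptions drawn from the article, not a legal damages calculation.
statutory_max_per_work = 150_000      # maximum statutory award per infringed work cited above
potential_works = 7_000_000           # "up to seven million potential claimants"

worst_case = statutory_max_per_work * potential_works
print(f"Worst-case exposure: ${worst_case:,}")  # Worst-case exposure: $1,050,000,000,000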

Re: (Score:3, Insightful)

by Zurk ( 37028 )

cry me a river. where was all this hand-wringing nonsense when piratebay was shut down repeatedly? how about the DMCA, which US industry was happy to back. oh and the mickey mouse perpetual licensing joke. now it's come back to bite the tech industry in the ass. fuck em. let them ALL go out of business. let the investors cry. then maybe we will get some sanity back in copyright law.

Re: (Score:1, Interesting)

by iAmWaySmarterThanYou ( 10095012 )

AI is not transformative at all.

Changing the storage method does not change the thing stored. That's not at all what copyright law means by transformative.

Transformative might be something like taking someone's song and making a parody of it like Weird Al Yankovic. Taking someone's song and storing it in a different format is a worthless defense in copyright law.

There have been many examples of people able to get full or partial text extractions from copyrighted works with the right query. That means the

Re: (Score:2)

by sphealey ( 2855 )

> "That's not justice, that's basically a kill switch for the entire industry."

Stealing, e.g., book authors' work, mashing it up using an "LLM" algorithm, and selling it in a deliberate attempt to undercut the original, content-creating authors would seem like a 'kill switch' for human creativity and livelihoods, no? Ursula K. Le Guin was 101% right when she fought Google and what was supposed to be her own advocate, the SFWA, on this 20 years ago, and everything she predicted about the destruction of au

Re: (Score:2)

by Pinky's Brain ( 1158667 )

They pirated everything under the sun to create the training set.

After they got some money they started book scanning to at least have some chance at a fair use defence, but even now their training set is likely mostly pirate booty. Any transformation after the fact is beside the point.

Re: (Score:1)

by colonslash ( 544210 )

Transforming works is fair use. Other cases have been brought against the AI companies, and they've all failed; the works weren't similar enough.

Re: (Score:2)

by Pinky's Brain ( 1158667 )

There's only two real judgements on fair use and one is weak as shit.

Anthropic lost on the piracy part of their case, which can kill them regardless of whether using scanned books for training is deemed fair use at the Supreme Court.

Re: (Score:2)

by msauve ( 701917 )

> When you train GPT on a book, it's not storing the book - it's extracting statistical patterns about how words relate to each other. The actual text gets compressed into a few bytes of weight adjustments across billions of parameters.

Oh, bullshit. It's been shown that AI models [1]can and will regurgitate obviously infringing output [ieee.org]. Show us that compression which will reduce one of the examples to "just a few bytes."

[1] https://spectrum.ieee.org/midjourney-copyright

Re: (Score:1)

by 0123456 ( 636235 )

"Look, $150K per work × 7 million potential claimants = over $1 trillion in damages. That's not justice, that's basically a kill switch for the entire industry."

Maybe an industry based on stealing everyone else's stuff and pretending it's theirs because 'muh training' should be killed?

Particularly given the reports of AI companies ignoring robots.txt to steal web content after being denied permission to access it.

Losing the Battle and the War (Score:3)

by lazycam ( 1007621 )

This is how the U.S. cedes its lead to China or any other country willing to make the investment.

Re: (Score:1)

by WolfgangVL ( 3494585 )

Its lead in what, exactly? Unemployment? AI slop generation? Homeless people?

If these tech companies get what they want we'll all be wage slave prisoners shuffling around in the dirt of their techno-utopia city-states while they invent creative new ways to entertain the idle rich and casually live out their lives in extreme wealth and opulence on the backs of the rest of us.

Fuck em..

Re: (Score:2)

by lazycam ( 1007621 )

This is a good point. While horse saddle companies suffered from the introduction of the car, that's not an excuse to limit the diffusion or further development of that technology. The same was said about computers and their ability to reproduce and copy copyrighted work -- not forgetting the same technology could be used to create said work. Other countries have solved this problem in the past using excise taxes (like the ones that were levied on hard drives).

Re: (Score:1)

by Defraggle ( 70799 )

It's just the law. No big deal.

Re: (Score:2)

by lazycam ( 1007621 )

That's not an excuse to restrict knowledge or technological progress. Do you seriously believe other countries' AI startups will not do the same, and that we won't even have a seat at the table? I agree, respect for private property is a critical component of our modern system of copyright. However, as some other users have pointed out, this feels like an opportunity to review modern guidelines on fair use and the corporate structures/entities that are allowed to form around these AI technologies. Just my two cents.

Re: (Score:3)

by Dixie_Flatline ( 5077 )

If they can't figure out a way to pay people for their work, why should they be allowed to have it? They're just impoverishing people that actually spent time and energy and their own money to make things, and they're saying they should get it 'because'.

Is your problem with people doing it on an individual basis that the crime is too small, or?

They didn't even pay for the value of a single book before slurping it up. They scrape videos and obviously never watch any ads or would even need any of the products

Re: (Score:2)

by lazycam ( 1007621 )

I agree with you. But the technology is going to be developed and trained. Would you feel more comfortable if OpenAI or any other company were brought under government control -- this way the benefits of the technology can proportionally benefit creators and citizens? Or simply tell companies that they must pay for any and all copyrighted work? There would be no AI for companies that follow the rules. Others who are willing to steal to train their models will benefit and win. I guess what I am saying is tha

Re: (Score:2)

by Defraggle ( 70799 )

So the sacred copyright is secondary to national security huh?

I guess it wasn't so sacred after all...

Re: (Score:2)

by registrations_suck ( 1075251 )

Why should native Americans receive any better treatment than any of the various other conquered peoples throughout some 5000 years of human history?

\o/ (Score:1)

by easyTree ( 1042254 )

...but not so horrified that it acted as a disincentive to committing the world's largest copyright infringement.

Welcome to the finding out phase. (Score:1)

by Defraggle ( 70799 )

They fucked around.... Now they find out.

Being Big and Rich Doesn't Make You Right (Score:2)

by biggaijin ( 126513 )

Having lots of money and claiming you are important for the future of the universe doesn't give you the right to steal property from the people who own it.

Re: (Score:2)

by sarren1901 ( 5415506 )

It's copyright infringement. It's not theft. Different crimes here.

Our courts side with whoever has the most money (Score:2)

by rsilvergun ( 571051 )

And while copyright holders have cash, AI is worth trillions based on the possibility that it could replace most if not all white collar workers.

At best this will hit the Supreme Court, where they will declare that it's fair use. The decades of precedent around storing the contents of copyrighted material won't matter.

There is no way something as valuable as AI is going to get derailed by a little thing like the rights of the copyright holders. This is happening and we all need to get comfortable wit

copyright did it to single moms (Score:4, Insightful)

by awwshit ( 6214476 )

Single moms that shared a few songs paid dearly. Watch tech bros that stole everything walk.

If copyrights were reasonable we would be having a different conversation.

Good! (Score:2)

by StormReaver ( 59959 )

Those companies SHOULD be bankrupted, and the CEOs put in prison for a few years. They are thieves of the highest order, the likes of which the world has never seen before.

Really? The whole sector will die? (Score:2)

by gkelley ( 9990154 )

How many multi-billionaires are there in the AI/AI adjacent field? Maybe they could cough up some of those billions to pay the artists they stole from. If they can hand out bonuses of hundreds of thousands to almost a million, then they can certainly afford to give back to those that made it possible for them to do what they do.

Pay up or shut up (Score:2)

by Wokan ( 14062 )

These AI companies are throwing fortunes at employees working on their function. Start throwing that money at the people creating the body of knowledge.

I hope it does ruin them (Score:2)

by thePsychologist ( 1062886 )

I'd be a very happy person if all AI companies were ruined to the point of bankruptcy and their CEOs sent to prison for life.

Entire AI industry facing financial ruin? (Score:2)

by Mirnotoriety ( 10462951 )

What do you expect when your entire business model is built on other people's unattributed work?

Re: (Score:2)

by h33t l4x0r ( 4107715 )

That's not even close to the entire business model. AI can get by just fine without Harry Potter or Taylor Swift's catalog. It would still have all that social media and codebase data that was obtained legitimately.
