

FSF Threatens Anthropic Over Infringed Copyright: Share Your LLMs Freely (fsf.org)

(Monday March 16, 2026 @03:34AM (EditorDavid) from the free-as-in-freedom dept.)


In 2024 Anthropic was [1]sued over claims it infringed copyrights when training LLMs.

But as they [2]try to settle, they may have a problem. The Free Software Foundation [3]announced Friday that Anthropic's training data apparently even included the book "[4]Free as in Freedom: Richard Stallman's Crusade for Free Software" — for which the Free Software Foundation holds a copyright.

> It was published by O'Reilly and by the FSF under the GNU Free Documentation License ([5]GNU FDL). This is a free license allowing use of the work for any purpose without payment.

>

> Obviously, the right thing to do is protect computing freedom: share complete training inputs with every user of the LLM, together with the complete model, training configuration settings, and the accompanying software source code. Therefore, we urge Anthropic and other LLM developers that train models using huge datasets downloaded from the Internet to provide these LLMs to their users in freedom.

>

> We are a small organization with limited resources and we have to pick our battles, but if the FSF were to participate in a lawsuit such as Bartz v. Anthropic and find our copyright and license violated, we would certainly request user freedom as compensation.

"The FSF doesn't usually sue for copyright infringement," reads [6]the headline on the FSF's announcement, "but when we do, we settle for freedom."



[1] https://yro.slashdot.org/story/24/08/20/1524250/authors-sue-anthropic-for-copyright-infringement-over-ai-training

[2] https://www.anthropiccopyrightsettlement.com/

[3] https://www.fsf.org/blogs/licensing/2026-anthropic-settlement

[4] https://static.fsf.org/nosvn/faif-2.0.pdf

[5] https://www.gnu.org/licenses/fdl-1.3.html

[6] https://www.fsf.org/blogs/licensing/2026-anthropic-settlement



Re:Ducks (Score:4, Informative)

by thecombatwombat ( 571826 )

That is not a quote from Stallman.

That is from a statement by Krzysztof Siewicz, and I would assume it's just an odd turn of phrase from someone who mostly speaks something other than English.

RTFA is alive and well.

Re:Ducks (Score:5, Insightful)

by sg_oneill ( 159032 )

Presumably it means they are demanding the models be released under a free license.

Here's the thing with RMS. He's always tended to be the most "extreme" of the free/open source advocates, but he's had a history of being right as well. A lot of those "extreme" predictions have ended up being dead on the money.

The only place I think the FSF ever really fucked up was with the AGPL license, which has basically been used as a sort of shareware license by server software devs. But with the gobsmacking amount of contributions the FSF has made to software, you can forgive maybe that one screw-up.

Re:Ducks (Score:5, Insightful)

by Tom ( 822 )

That is the problem. "Right to read" was visionary and will really soon be reality.

Given how much capitalism insists on copyright and prosecution when it comes to THEIR works, and how they get custom-made laws like the DMCA passed just to protect their rights... well, let's just say that if the big AI models weren't from the corporate sector but had been created by nerds on GitHub, the copyright police would already have broken down our doors to arrest us all for copyright infringement.

So please, please, pretty please, let them have a dose of their own medicine. Heck, let the courts classify LLMs as "software" and find just one instance of the training data containing GPL3 content. Whoopsie, all your code belongs to us.

Re: (Score:2)

by DamnOregonian ( 963763 )

I imagine you can infer, no?

Presumably, the ask/demand is to release the weights of the model, and possibly its training regime so that it can be replicated.

Frankly, it IS kind of a weird ask.

So the lawsuit is about book piracy. It's not that Anthropic used copyrighted data to train its models; it's that it pirated books (downloaded them to a computer without a license).

If they had been legal copies of the books, what Anthropic did with them would have been legal (under current jurisprudence, it's fair use).


"Use of the work for any purpose without pa (Score:4, Insightful)

by khb ( 266593 )

I think the relevant language actually is "This License is a kind of 'copyleft', which means that derivative works of the document must themselves be free in the same sense. It complements the GNU General Public License, which is a copyleft license designed for free software" because the LLM is a derived work, thus arguably must be free "in the same sense".

If it really were as permissive as described, there'd be no basis to make the demands described.

Re: (Score:2)

by 93 Escort Wagon ( 326346 )

> This License is a kind of "copyleft"

As opposed to all of the LLMs, which use more of a "copytheft" license.

Re: Go Anthropic! (Score:2)

by madbrain ( 11432 )

The lawsuit wasn't brought by Stallman.

Good luck with that (Score:2)

by diffract ( 7165501 )

If proprietary work is stolen willy-nilly to train LLMs, what chance does a free foundation have against these AI giants?

And just like that . . . (Score:2)

by quonset ( 4839537 )

Copyright is good.

The book is over fifteen years old. How much longer should it be protected? At least that's the argument we hear on here all the time.
