Nick Clegg Says Asking Artists For Use Permission Would 'Kill' the AI Industry
- Reference: 0177787423
- News link: https://tech.slashdot.org/story/25/05/26/2026200/nick-clegg-says-asking-artists-for-use-permission-would-kill-the-ai-industry
- Source link:
> Speaking at an event promoting his new book, Clegg said the creative community should have the right to opt out of having their work used to train AI models. But he claimed it wasn't feasible to ask for consent before ingesting their work.
>
> "I think the creative community wants to go a step further," Clegg said according to The Times. "Quite a lot of voices say, 'You can only train on my content, [if you] first ask.' And I have to say that strikes me as somewhat implausible because these systems train on vast amounts of data."
>
> "I just don't know how you go around, asking everyone first. I just don't see how that would work," Clegg said. "And by the way if you did it in Britain and no one else did it, you would basically kill the AI industry in this country overnight."
[1] https://www.theverge.com/news/674366/nick-clegg-uk-ai-artists-policy-letter
Is this bad? (Score:5, Insightful)
"you would basically kill the AI industry in this country overnight."
He says that as if it were a bad thing.
Re: Is this bad? (Score:2)
Right? I mean, no other industry gets a pass for this behavior
Re: (Score:2)
Killing the AI industry in its current form _worldwide_ would be a good thing. A big reset and rethink, kind of like reining in the nuclear arms race. Killing the AI industry in the UK only, while it remains a free-for-all elsewhere, is economic hara-kiri. If the UK bans it but some other country doesn't, then companies will simply set up shop in that other country, do all the creative-output mining where it's legal, and then sell whatever they can to whomever they can. They make money, the UK doesn't.
Translation (Score:2)
"We're the new hotness that everyone wants to throw huge stacks of cash at and use as an excuse for / against anything at all, but we don't want to spend any of those billions of dollars fairly compensating the creator of the works we want to create derivative works of for their copyright, which is explicitly under the control of the copyright holder."
Or, the TL;DR version: "waaaahh we don't want to pay licensing for their work, but expect everyone to pay us to license the derived product of their work"
And
Re: (Score:2)
Year by year, the "right to a profitable business model" (as long as you have a big enough pile of wealth to begin with) marches onward.
When your business's "one neat trick" is *ignore the law* (because money), your business should not be permitted to exist.
Re: Is this bad? (Score:2)
Thank you very much
Re: (Score:2)
> "you would basically kill the AI industry in this country overnight."
> He says that as if it were a bad thing.
The genie ain't going back in the bottle.
meanwhile... (Score:2)
My kids are training on their content for free at school.
We have all the regulation we need in terms of copyright law. If AI plagiarizes, and I definitely believe that can happen, then sue everyone involved.
Re: (Score:2)
Impractical. There are too many bots stealing too much stuff.
I like the opt in idea. It would be hell to regulate given that no one believes in following the law until they get caught, but it is a start.
Re: (Score:2)
"stealing" is an interesting word choice. No one is "stealing" anything, it's being offered up for free and no one else is being deprived of the thing.
The only possible theft is in business, People will not seek the artist to create original works, they'll ask the AI. If they're happy with the substandard work the AI will do, and we've all seen what AI creates for art and code at this point, then there's no real loss here.
Re: meanwhile... (Score:2)
It's "copyright" for fuck's sake. Also it's means "it is".
Re: (Score:3)
> It's "copyright" for fuck's sake. Also it's means "it is".
I have this strange urge to write a post about page tables and spell it "copy on right" and watch the chaos. But yes, this annoys me, too.
Re: meanwhile... (Score:1)
How interesting is it that AI doesn't make those errors?
Re: (Score:2)
That's where you're wrong dude.
The AI is ingesting whole works, and in many cases, can spit out entire pieces of those works without citing anything.
Like the best thing AI can do going forward is have a special tag for the author's copyright and a special tag for the source, and then always cite the copyright holders and sources behind every response to a prompt. For artwork this has to go one step further and actually embed it in the metadata.
Re: (Score:2)
Shame they can't memorise textbooks in a millisecond or so and then access it through memory.
Or were you suggesting you pirate all your kids textbooks?
Re: (Score:2)
They are asking for a law to make discovery easier; it's less a special protection and more a high-level legal intervention in response to a high-level disregard of the law.
Re: (Score:2)
The trouble is that AI plagiarising is kind of like money laundering. You can't take the output of an AI and work out what went in, in general, any more than you can take the output of a hash function and determine the input that produced it. Reaching the bar necessary to win a lawsuit is probably impractical, and the AI companies and their expensive teams of lawyers know this.
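(To make the hash analogy concrete, here's a minimal sketch using only Python's standard library; the strings are placeholders, not real training data. The digest tells you nothing about the input, so the only practical check is to already hold a candidate input and compare in the forward direction, which is roughly the evidentiary problem described above.)

```python
import hashlib

def digest(text: str) -> str:
    """SHA-256 hex digest of a string."""
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

# Two inputs differing by one character give unrelated digests, and nothing
# in a digest lets you work backwards to reconstruct the input.
print(digest("a copyrighted paragraph"))
print(digest("a copyrighted paragraph!"))

# The only practical check runs forward: you must already hold a candidate
# input and compare its digest against the one you observed.
observed = digest("a copyrighted paragraph")
print(observed == digest("a copyrighted paragraph"))  # True  -- candidate matches
print(observed == digest("some other paragraph"))     # False -- learn nothing about the input
```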
Re: (Score:2)
Depending on the legal system, you don't have to prove it, you just make it plausible enough for a judge to force the AI companies to provide you with the evidence.
NYT got OpenAI's training set. If they win at the Supreme Court, every other registered copyright owner will have an easy job doing likewise for every major AI company in the US. It might be harder in the UK, dunno.
Re: (Score:2)
Derivative works are also controlled through copyright. Are you arguing that an AI trained on a specific copyrighted material isn't a derivative work?
Re: (Score:2)
> My kids are training on their content for free at school.
You have a very odd definition of "free" going on here. Because apparently you think that textbooks are delivered to the classroom by the magical textbook fairy.
Here's a quick hint:
1. citizens pay taxes.
2. some of those tax revenues go to public schools.
3. public schools BUY the textbooks with tax revenue, from vendors that specifically resell textbooks to school districts with the intent to teach students from them.
Feel free to swap out "textbooks" with "library books" if you like. They are still licensed.
Re: meanwhile... (Score:2)
Orrr ... are your kids providing valuable training data for free, while you are probably thinking, "Gee, this is great, look what we're getting for free"?
Admission of guilt. (Score:5, Insightful)
If your industry is unable to survive by following the law then isn't that the same as admitting the basis of your industry is violating the law?
Re: (Score:2)
Not once you understand the Golden Rule: he who has the gold, rules.
Re: Admission of guilt. (Score:2)
Yep. What he's saying is the very definition of corporate theft.
Familiar argument (Score:3)
That's the same argument Napster used.
If we can't get all our inputs for free, it would kill our ability to charge for similar stuff based on those inputs.
If we can't take everything we want without permission whenever we want it, it would kill our plan to replace human creativity with cheap imitations.
Re: (Score:2)
Shutting down Napster didn't kill music streaming, either. The industry went back to the drawing board, came up with a better business model, and now rules the entertainment world with it.
Re: (Score:2)
There's your answer, then. Start feeding AI copyrighted music, which will piss off the RIAA. That should bring the issue to *some* sort of resolution really quickly.
Re: (Score:2)
> That's the same argument Napster used.
> If we can't get all our inputs for free, it would kill our ability to charge for similar stuff based on those inputs.
Napster A. didn't charge, and B. didn't provide "similar stuff"; it provided identical stuff (ignoring compression artifacts). So no, that's not the argument Napster used.
Ah, the Lib Dems (Score:1)
Posh, privately schooled, out-of-touch ex-politician who literally torpedoed his political party for a chance at power.
Nothing to see here, move along. Are we that starved for content that we have to give so much oxygen to idiots and their uninformed opinions?
I t
I fail.... (Score:2)
...to see the problem.
Alternate example (Score:3)
> "Quite a lot of voices say, 'You can only train on my content, [if you] first ask.' And I have to say that strikes me as somewhat implausible because these systems train on vast amounts of data."
Let's rephrase this to see if it makes sense.
> "Quite a lot of voices say, 'You can only deport people if you first give them due process.' And I have to say that strikes me as somewhat implausible because there are a LOT of people we want to deport."
Noting that Trump said just that. From [1]Trump wants to bypass immigration courts. Experts warn it's a 'slippery slope.' [npr.org]:
> "I hope we get cooperation from the courts, because we have thousands of people that are ready to go out and you can't have a trial for all of these people," Trump told reporters in the [2]Oval Office [youtube.com] last week [mid April].
Bottom line, there are things you're supposed to do even if they're inconvenient.
(The article goes into more detail, noting that, while the Constitution does not make any distinction between citizens and non-citizens for the application of the protections of due process and judicial review, due process is considered a "spectrum of rights," with different people being allotted different levels of protection; the fewest rights are offered to more recent arrivals to the U.S. without legal status. Lawful permanent residents and naturalized citizens traditionally have the most protections.)
[1] https://www.npr.org/2025/04/29/g-s1-63187/trump-courts-immigration-judges-due-process
[2] https://www.youtube.com/watch?v=0c6pm6393C4
Re: (Score:2)
>> "Quite a lot of voices say, 'You can only train on my content, [if you] first ask.' And I have to say that strikes me as somewhat implausible because these systems train on vast amounts of data."
> Let's rephrase this to see if it makes sense.
>> "Quite a lot of voices say, 'You can only deport people if you first give them due process.' And I have to say that strikes me as somewhat implausible because there are a LOT of people we want to deport."
> Noting that Trump said just that. From [1]Trump wants to bypass immigration courts. Experts warn it's a 'slippery slope.' [npr.org]:
>> "I hope we get cooperation from the courts, because we have thousands of people that are ready to go out and you can't have a trial for all of these people," Trump told reporters in the [2]Oval Office [youtube.com] last week [mid April].
> Bottom line, there are things you're supposed to do even if they're inconvenient.
It's also a crock of s**t. When the web came into existence, folks came up with a robots.txt file that gave permission to spider the site and, to a limited extent, serve snippets of the content. There's nothing preventing an ai_robots.txt file with a standard format that dictates whether the content can be used for AI training, whether it can be used for training of commercially available AIs or just fully open source AIs, etc. (a rough sketch of how a crawler could honour such a file is below). It won't kill the industry, unless the assumption is that nearly everyone will opt out.
[1] https://www.npr.org/2025/04/29/g-s1-63187/trump-courts-immigration-judges-due-process
[2] https://www.youtube.com/watch?v=0c6pm6393C4
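(A minimal sketch of the idea above, assuming the hypothetical ai_robots.txt simply reuses the robots.txt directive syntax. Python's standard-library robotparser handles today's robots.txt; the "ai_robots.txt" file name, its format, and the "ExampleAIBot" agent name are assumptions for illustration, not an existing standard.)

```python
from urllib import robotparser

def may_crawl(base_url: str, page_url: str, agent: str = "ExampleAIBot") -> bool:
    """Today's mechanism: consult the site's ordinary robots.txt."""
    rp = robotparser.RobotFileParser()
    rp.set_url(base_url.rstrip("/") + "/robots.txt")
    rp.read()                               # fetch and parse the policy file
    return rp.can_fetch(agent, page_url)

def may_train(base_url: str, agent: str = "ExampleAIBot") -> bool:
    """Hypothetical ai_robots.txt reusing the same directives:
    'Disallow: /' for this agent would mean 'do not train on this site'."""
    rp = robotparser.RobotFileParser()
    rp.set_url(base_url.rstrip("/") + "/ai_robots.txt")
    try:
        rp.read()
    except OSError:
        return True                         # site unreachable: no stated policy
    return rp.can_fetch(agent, "/")

if __name__ == "__main__":
    site = "https://example.com"
    if may_crawl(site, site + "/some-article") and may_train(site):
        print("Scraping and training both permitted by the site's stated policy.")
```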
Umm, the point? (Score:2)
> Asking Artists For Use Permission Would 'Kill' the AI Industry
First, a lot of hyperbole there (like AI couldn't possibly do anything else?!?), but, when you think of it, what's the alternative? Not asking artists, and killing off that industry (which, btw, if it didn't exist, these AI companies wouldn't have anything for their models to train on)?
OK, so in that case (Score:2)
Make it mandatory, but force the AI industry to pay for every work they use. Photos and pictures could cost a million quid per photo/picture. And written material a thousand pounds per word.
Oh, and add an extra 10% on top of that to be redistributed as a UBI to the entire country.
If you're going to replace good quality work with slop, you need to compensate the people who'd have otherwise been able to get quality content. And if you're going to automate people's jobs out of existence and demand their labor
Lawsuits? (Score:1)
I'm curious where we stand with regard to any planned or existing lawsuits against big tech for rights violations on artistic content. What are the courts saying about all this?
I say (Score:1)
"Ask before taking your work for our own profit? One simply does not have the time."
Imagine one day (Score:2)
Seeing a computer on a street corner with a sign that says "starving AI"
Business plan (Score:2)
The business plan therefore is to grab content for free, digest it, then provide the resulting service for a fee?
Or pay for the input content and charge more/a lot for the service?
In other words, the business plan doesn't hold water so the only option is to lobby government and public opinion.
Sad.
You say that as though it's a bad thing (Score:2)
> ...claimed a push for artist consent would "basically kill" the AI industry.
That strikes me as a feature, not a bug.
Furthermore, who do you think you are, that you imagine the Silly Valley cunts you just finished hanging with have some kind of moral right to do whatever the hell they want with artists' works? Un-fucking-believable.
Can we break out the torches and pitchforks yet?
What the Eyes see and the Ears hear... (Score:1)
The mind believes. Buckle up, everyone. It's gonna get super weird in the next few years. Video generation is getting VERY good, and the lines of distinction between AI and human-made content will soon fall away. It's gonna be an interesting year, folks ;-D
Damn Straight (Score:1)
EVERYONE should have a voice in how their image and likeness is used. Facebook is running roughshod, ALLOWING AI to expand on its platform unabashed. If I were a celebrity I would be pissed and would sue them to pull any likeness of myself off ALL platforms. There's a distinct difference when a person draws something versus when AI does: the person has creative control; AI doesn't, and shouldn't get any protections whatsoever. The person who uploads it also shouldn't get any profit from it, as THEY did NOTHING creative.
Don't kill the dead baby industry (Score:2)
Obviously the dead baby industry is going to suffer if I am not allowed to go around murdering babies without permission. You'll kill the dead baby industry if you continue down this path of enforcing laws against murdering babies.
Nutshell (Score:3)
Obviously AI in its current incarnation is incapable of existing with consideration for artists/authors rights.
Re: (Score:2)
The rights to control commercial reproduction of their works, primarily. Continental Europe mostly recognizes "moral rights" of authors and other creators that go beyond what US copyright law recognizes; I'm not sure where the UK falls on that topic.
Re: (Score:2)
Copyright was never intended to prevent how IP is being used to train models. It was intended to prevent someone from outright distributing a work without authorization, or claiming substantial portions of someone else's work as their own. It was never intended to protect ideas (a story about a special school for mutants/wizards/aliens/zombies/etc.), concepts, or styles. It is only supposed to apply to the specific instance of creative work.
We just had a story yesterday about a Star Wars-inspired game th
Re: (Score:3)
Of course. AI is not creative. It has to learn off existing material, be that text, voice, music, people's faces, paintings, drawings, television shows, anime, etc.
Like here's the thing. I would be OK with the AI scraping existing publicly-reachable information, if it only scraped it once and retained the credit. The problem is that it does neither of those. It does not respect the bandwidth websites pay for (a problem web crawlers/spiders have had from the start) and it does not respect the credit.
Re: (Score:2)
> Music is the worst though because at present AI does not sing; either it can "choir" or it requires someone else to sing (eg the original artist) and squeeze it through an AI-autotune into another singer's voice style, but it's still very clearly the original singer.
Man, you're behind the times a bit here. I've been playing around with Suno quite a bit and their latest model is so good it's creepy.
[1]Hear it for yourself [youtu.be], that's one of the songs I made with the free trial of their paid model.
Now here's the thing, do I fancy myself an actual artist because I'd collaborated with ChatGPT to turn my ideas into some lyrics, then had Suno make it into a song? Not really, because being a "real" artist is about having the correct industry connections. Then you can get away wit
[1] https://youtu.be/IUQPmyvl9aE
Re: (Score:2)
Not that AI shouldn't be asking for permission, but human thought requires the same inputs. That said, buying the book, or a similar act by a human, covers the license. Perhaps AI needs to do something equivalent.