Stop dragging feet on AI nudification ban, UK government told

(2026/01/14)


The Science, Innovation and Technology Committee has criticized the UK government's handling of AI nudification tools, saying it is taking too long to ban apps, and that expedited legislation does not encompass multi-purpose platforms used to create nude images.

Grok, the AI chatbot owned and run by Elon Musk's xAI, [1]caused controversy at the start of this year when users prompted it to create images of naked or barely dressed people – mostly women, some underage – from real photos. Over a 24-hour period between January 5 and 6, it generated 6,700 sexualized images every hour.

Regulators in the UK subsequently launched a probe and the government came under pressure to penalize X, formerly Twitter. Comms watchdog Ofcom, which polices the Online Safety Act (OSA), is now [2]formally investigating the social media platform, which was bought by xAI in March 2025. Grok's nudity capabilities are still switched on for paying users.

Today's comments from Dame Chi Onwurah, chair of the Science, Innovation and Technology Committee, are in response to a letter from technology minister Liz Kendall, who tried to assure her that the government is tackling the issue.

Kendall said in her [6]letter to Dame Onwurah, dated January 12 but made public this morning: "xAI's action to restrict this ability to paying users is a further insult to victims, effectively monetizing this horrific crime."

She said the OSA was built to deal with this situation and "intimate image abuse has been designated a 'priority offence,'" adding: "Ofcom has the mandate it needs to hold services to account for horrific illegal content on their sites and they have the government's unequivocal backing to use the full force of the powers that Parliament has granted them, up to and including, if they deem necessary, the power to apply to the courts to block services from being accessed in the UK if they refuse to comply with UK law."

The government is also banning nudification tools, and Kendall said: "We will bring forward this legislation as a priority, making amendments to the Crime and Policing Bill going through Parliament now."

[8]Ofcom officially investigating X as Grok's nudify button stays switched on

[9]Grok told to cover up as UK weighs action over AI 'undressing'

[10]UK regulators swarm X after Grok generated nudes from photos

[11]Users prompt Elon Musk's Grok AI chatbot to remove clothes in photos then 'apologize' for it

Dame Onwurah responded, saying "significant questions remain" about the approach being taken, and asking why it has taken "so long" to introduce the nudification ban "when reports of these disturbing Grok deepfakes appeared in August 2025."

"It's also unclear whether this ban – which appears to be limited to apps that have the sole function of generating nude images – will cover multi-purpose tools like Grok."

Kendall also said if there are "gaps" in the OSA, the government will address them. To this, Dame Onwurah said: "This comes months after rejecting the committee's recommendations to explicitly regulate generative AI and put greater responsibility on platforms like X and Grok. I urge the government to adopt our recommendations and embed core principles – such as responsibility and transparency – into the online safety regime. These are essential principles to build a strong regulatory framework that protects users online."

The Register has asked X to respond. Instead of a poop emoji, which is what X used to send journalists contacting it for comment, these days the automated message states "Legacy Media Lies." We're hoping it sends us a specific statement on the matters above. ®



[1] https://www.theregister.com/2026/01/03/elon_musk_grok_scandal_underwear_strippers_gross/

[2] https://www.theregister.com/2026/01/12/xai_grok_uk_regulation/

[6] https://committees.parliament.uk/publications/51067/documents/283193/default/

[8] https://www.theregister.com/2026/01/12/xai_grok_uk_regulation/

[9] https://www.theregister.com/2026/01/09/grok_image_generation_uk/

[10] https://www.theregister.com/2026/01/08/uk_regulators_swarm_x_after/

[11] https://www.theregister.com/2026/01/03/elon_musk_grok_scandal_underwear_strippers_gross/



Knickers in a twist?

Long John Silver

I doubt that either of the women mentioned in this article face the prospect of being depicted in their 'undies'.

Re: Knickers in a twist?

LionelB

Strangely, despite it being highly unlikely that it's a prospect I'll ever face personally, I happen to be quite cross about people being executed for protesting against totalitarian regimes.

Wtf are you trying to say?

Re: Knickers in a twist?

doublelayer

I think that post was intended as an insult on their appearance. If so, it says a lot more about Long John Silver's personality than anything else.

Re: Knickers in a twist?

LionelB

An ugly mind, indeed.

Re: Knickers in a twist?

NapTime ForTruth

Much like:

Automatic weapons are just a tool.

It doesn't do anything you couldn't do before, just makes it a whole lot easier. (I believe the current version of the US in particular has form here, as do several South American nations.)

Similarly:

Nuclear weapons are just a tool.

It doesn't do anything you couldn't do before, just makes it a whole lot easier.

The "making it easier" part is really the main part. The difference between local individual action and action at worldwide scale is the difference between argument and total war.

Re: Knickers in a twist?

JulieM

It doesn't just make things you could do before easier. It also enables things you could not do before.

If some sicko wants to look at a publicly-available picture of me (which will definitely include clothing) and imagine what I look like undressed while he's having a Thomas Tank somewhere in private, well, that's obviously extremely distasteful; but there's nothing I can do about it, because all the bad stuff is going on entirely within his own imagination -- and the inside of a person's head is the most private space there is.

The moment the aforementioned sicko does anything that exposes what belongs in his imagination to the world of reality, though, a line is crossed.

Re: Knickers in a twist?

desht

Criticise them on their policies and performance (there's plenty of room for that) and not their physical appearance, you misogynistic dickhead.

skswales

Rushed amendments to legislation never led to poor & unexpected outcomes, did they?

Anonymous Coward

It's been interesting to see how the message has slowly evolved since this story came to light as media outlets try to outdo each other. It's like a game of Chinese Whispers.

It started out as "Grok lets users make pictures of women in bikinis without their consent", then it became "sexualised images of women", then it became "sexualised images of women and girls". Then it became "naked images of women and children". Then it finally evolved to "Elon Musk allows paedofiles to make sexual images of naked children for profit, we must ban X and Grok immediately, won't somebody think of the children!!!!!!"

Regardless of that, it's a tool which clearly doesn't have any valid use case. If somebody consents to being photographed naked then they could make real photos, so the only purpose for the tool would be to make naked photos of people who don't consent. Which is clearly wrong.

And why would anybody even bother? It's not like there is a shortage of porn on the internet. You can easily get more naked photos than will fit on your hard disk. It's almost impossible to search the internet without seeing naked people popping up everywhere. Actually, that would be a much better tool, an AI browser extension that puts clothes on pictures of naked people.

AVR

People do bother though. Whether it's just wanting free (now with a minor cost the user might have been paying anyway) pics made to order or an actual desire to victimise others, it's not going to go away because you personally don't see a use case.

& news outfits don't necessarily know everything immediately - it's less Chinese whispers/telephone game and more reacting to activists yelling at them repeatedly.

Dinanziame

And why would anybody even bother? It's not like there is a shortage of porn on the internet.

You cannot be that naive. First, there is never enough porn; second, people create sexualized images of women they personally know, whether for their own gratification or for harassing said women. Check e.g. [1]this year-old story about South Korea for the problems this causes.

[1] https://www.bbc.com/news/articles/cpdlpj9zn9go

Elongated Muskrat

It started out as "Grok lets users make pictures of women in bikinis without their consent", then it became "sexualised images of women", then it became "sexualised images of women and girls". Then it became "naked images of women and children". Then it finally evolved to "Elon Musk allows paedofiles to make sexual images of naked children for profit, we must ban X and Grok immediately, won't somebody think of the children!!!!!!"

Has it occurred to you that, since "Grok" is an "AI" that includes an image generator, which appears to have either no guardrails or limited ones which are easily overcome, all of these things might actually be true? Well, all of them, except how you spelt the word "paedophile".

JulieM

That's literally how the gutter press work. Whatever will get attention has always been afforded a higher priority than what's strictly accurate.

There used to be a time, though, when stories had to be at least vaguely plausible.

Now they know no-one is going to be able to test how true the story is, *and* they have plausible deniability that they fabricated it out of whole cloth.

I'm so tired ...

tony72

... of faux moral outrage.

Re: I'm so tired ...

nobody who matters

Me too.

However, this is a case where I think the outcry is fully justified.

Re: I'm so tired ...

tony72

And you've seen the material causing this hubbub, have you? I admit I haven't, because there's no set of keywords that I'd want to enter in my browser in order to see them, so I may be talking out of my arse. But from descriptions I've read, this is mostly hyperbole and exaggeration, and it's only getting such a big deal made out of it because Elon.

Re: I'm so tired ...

LionelB

> But from descriptions I've read, …

Oh, come off it! You'll have seen screeds of descriptions (and descriptions which are consistent across media) of the kind of material discussed here.

Perhaps you need to relax your prejudices, expand your media bubble a little, or just grow some gumption about how to parse news.

Re: I'm so tired ...

Elongated Muskrat

And you've seen the material causing this hubbub, have you?

Have you? I ask only because actively seeking it out is a criminal act in itself. Personally, I have no direct way of telling what is or is not posted on "X", since I'd already left the platform when Musk bought it, because I was paying attention and had already noticed what a repellent human being he is, and how he immediately started to enshittify it, as he enshittifies everything he touches (whilst having a long history of claiming credit for creations that are not his).

If your reaction to "people are posting child porn" on whatever platform is to go to have a look and check it out for yourself, perhaps you should stop using a computer, before the courts order that you have to stop using a computer.

Re: I'm so tired ...

tony72

Clearly you didn't get as far as the second sentence of my post, rendering your reply largely nonsensical.

Re: I'm so tired ...

Elongated Muskrat

I did, in fact, get to your third sentence, which, by your own admission in the second sentence, is hyperbole and speculation. Your first sentence is encouraging others to either commit a criminal act in order to disprove your speculation, or accept what you said at face value, and it was grade-A bollocks, so I won't be doing that.

Re: I'm so tired ...

LionelB

On what grounds do you consider this outrage to be "faux"?

Re: I'm so tired ...

tony72

It probably wouldn't even be a thing if Elon's name wasn't associated with it. Why is all the focus on the platform, and not the people creating the content? The Taylor Swift deepfake furore happened before Grok was in the game, because guess what, anybody can download Easy Diffusion or whatever and produce far more worrisome content than what Grok is allegedly producing, and I doubt that the genie can be put back in the bottle - guardrails on hosted platforms are containment theatre, nothing more. So why obsess about whatever mildly spicy content Grok can create, except that it's given the Elon-haters something new to rant about?

Re: I'm so tired ...

Dan 55

You're right, it absolutely wouldn't be a thing. If it were any other CSAM website, the entire domain would resolve to /dev/null about a day after it was found, and 99.9999999999% of the population wouldn't be any the wiser. Unfortunately it's UK government policy to simp for US billionaires. The government would really rather the whole thing went away, but it won't. They should have just banned it as soon as it was known about, like any other similar site; then it would be Musk who had to do something about X and Grok to get it back online, not the government.

CSAM = mild spicy content? Whatever you say.

Irongut

Why have X & Grok not been banned in the UK already? There is no need to pass a new law, just use existing laws against obscenity and CSAM and block the entire site.

This could have been done on Jan 7th. How many illegal images has Grok created in the 7 days that UKgov has been dragging its feet?

Why no ban?

Steve Davies 3

Could it be that Elongated Muskrat has dirt on those in power?

Those old 'mob' tricks worked for a reason and SKUM is powerful enough to make a lot of people's lives a wreck without blinking.

He, like Trump, simply does not care. Their whole modus operandi is Me, more money for Me, and to hell with anyone else.

Ask yourself why Trump is sending out begging emails... The emails are designed to drain your bank accounts.

Re: Why no ban?

Dan 55

Ask yourself why Trump is sending out begging emails... The emails are designed to drain your bank accounts.

He's a firm believer in trickle up economics.

But but

Mr Dogshit

FREE SPEECH! We have to have FREE SPEECH!

Re: But but

Elongated Muskrat

Frozen peaches, get your frozen peaches here! No need to freeze your own peaches!

0laf

Are they just trying to avoid upsetting the orange toddler?

His big tech funders seem to call him up pretty quickly when anyone threatens to take a penny or two out of their pockets.

Would he back up Ol' Musky? I've no idea if those two have tiffs in public just for shits 'n' giggles, to pump stocks somehow or if it's real.

Just ban all of them

Anonymous Coward

and watch the mental health of the nation get better.

As for Twatter... it is the worst of them all, and if you use it, ask yourself why you are continuing to line the pockets of the world's richest man...

Re: Twatter

TimMaher

I’m now calling it Titter.

Re: Twatter

Ken Shabby

Titter ye not, please yourselves

Re: Twatter

Elongated Muskrat

Or "witter" as that accurately describes what most users do on there.

I'm shocked I tell ya shocked /s

Anonymous Coward

> Grok .. caused controversy .. when users prompted it to create images of naked or barely dressed people – mostly women, some underage – from real photos.

Elon's pants

MashedPotato

On 11th January 2025, The Guardian website published an image taken from X/Twitter of Elon Musk in a bikini.

It has not yet been taken down, despite the obvious

Re: Elon's pants

Pete Sdev

That image was generated and posted by Mr Musk himself.

So consent for the image and its publication is implicit.

In strong contrast to the images the article is referring to.

Sloth77

“…and asking why it has taken "so long" to introduce the nudification ban "when reports of these disturbing Grok deepfakes appeared in August 2025."”

Quite simple - it hadn’t hit the mainstream media back then.

Anonymous Coward

Puritanism is back with a vengeance!

Anonymous Coward

You prefer sexual abuse then…?

LionelB

I'm not sure that decrying the facilitation of the creation and public dissemination of sexualised images of real people without their consent, including children (by definition without consent), actually qualifies as "puritanism".

Like Trump, Musk is just testing the water

Winkypop

Pushing known limits, getting reactions, forging new, if not forbidden, trails.

These guys are absolutely full of shit and hate.

Shut the fucker down, stop X in its tracks.

Do it.
