Ireland joins regulator smackdown after X's Grok AI accused of undressing people
- Reference: 1771326516
- News link: https://www.theregister.co.uk/2026/02/17/ireland_dpc_x_grok_probe/
The DPC confirmed today it is launching the probe under section 110 of the Data Protection Act 2018.
Officials said the inquiry will focus on Grok AI and the nude or nearly nude images users prompted it to create from photographs of people, which could be viewed by others on the platform.
X's safety team said in [2]January that it had blocked its Grok tool from making these edits: "We have implemented technological measures to prevent the Grok account from allowing the editing of images of real people in revealing clothing..."
In a statement, the Irish DPC said: "The inquiry concerns the apparent creation, and publication on the X platform, of potentially harmful, non-consensual intimate and/or sexualized images, containing or otherwise involving the processing of personal data of EU/EEA data subjects, including children, using generative artificial intelligence functionality associated with the Grok large language model via the Grok account within the X platform."
The DPC will determine whether X violated various aspects of the GDPR, including Articles 5, 6, 25, and 35, which cover the principles and lawfulness of data processing, data protection by design and by default, and data protection impact assessment requirements.
"The DPC has been engaging with [X] since media reports first emerged a number of weeks ago concerning the alleged ability of X users to prompt the Grok account on X to generate sexualized images of real people, including children," said deputy commissioner Graham Doyle.
"As the lead supervisory authority for [X] across the EU/EEA, the DPC has commenced a large-scale inquiry which will examine [X] compliance with some of their fundamental obligations under the GDPR in relation to the matters at hand."
A little late to the party, the DPC joins the likes of the [6]European Commission, the UK's [7]ICO and [8]Ofcom, Australia, Canada, India, Indonesia, and Malaysia in opening cases against the platform.
The [10]French have also had a broad investigation running since January, the scope of which continues to widen as more issues emerge.
[11]EU looking into Elon Musk's X after Grok produces deepfake sex images
[12]Ofcom officially investigating X as Grok's nudify button stays switched on
[13]UK to properly probe xAI to test if its revolting robo-smut generator broke the law
[14]X marks the raid: French cops swoop on Musk's Paris ops
X's lawyers are going to be busy, not just because of the number of open investigations but also because of the variety of laws under which the company will have to defend itself.
Although both investigations fall under EU law, the DPC and the European Commission are probing the same activity within the confines of the GDPR and the Digital Services Act, respectively.
Likewise, the UK's ICO and Ofcom regulate from different angles. The ICO will probe it from a data protection perspective, while Ofcom, the communications watchdog, is responsible for policing the Online Safety Act.
X is accused of allowing users to prompt its Grok AI chatbot to digitally undress images of real people without their consent. Investigations will also determine whether these cases included images of children.
The company responded by complying with initial requests from the regulators, and revoked Grok's image-generation capabilities for free X users, reserving it only for paid subscribers to the platform. It subsequently [15]widened the restriction to all users. ®
[2] https://x.com/Safety/status/2011573102485127562
[6] https://www.theregister.com/2026/01/26/ec_open_new_investigation_into/
[7] https://www.theregister.com/2026/02/04/uk_spain_social_media_regulation/
[8] https://www.theregister.com/2026/01/12/xai_grok_uk_regulation/
[10] https://www.theregister.com/2026/02/03/french_police_raid_x/
[11] https://www.theregister.com/2026/01/26/ec_open_new_investigation_into/
[12] https://www.theregister.com/2026/01/12/xai_grok_uk_regulation/
[13] https://www.theregister.com/2026/02/04/uk_spain_social_media_regulation/
[14] https://www.theregister.com/2026/02/03/french_police_raid_x/
[15] https://www.theregister.com/2026/01/15/ofcom_grok_probe/
Re: Am I the only one ...
There are shitheads who set out to make that specifically, but in the case of Grok it's basically misuse of image touch-up capabilities. "Can you remove this ugly brooch from my dress in this picture" is the same task as "can you remove my dress in this picture".
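To illustrate the point with the open-source diffusers inpainting pipeline (a stand-in, since Grok's internals aren't public): the model, the mask, and the call are identical for both requests; only the prompt and mask change. The filenames here are hypothetical.

    from diffusers import StableDiffusionInpaintPipeline
    from PIL import Image
    import torch

    # Stand-in for whatever inpainting model Grok uses internally (not public);
    # this is the open-source diffusers API, shown purely for illustration.
    pipe = StableDiffusionInpaintPipeline.from_pretrained(
        "runwayml/stable-diffusion-inpainting", torch_dtype=torch.float16
    ).to("cuda")

    photo = Image.open("photo.png")  # hypothetical input image
    mask = Image.open("mask.png")    # white pixels mark the region to repaint

    # Removing a brooch is mechanically the same operation as any other edit:
    # mask a region, describe what should replace it. Only the prompt differs.
    result = pipe(prompt="plain fabric, no brooch", image=photo, mask_image=mask).images[0]
    result.save("edited.png")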
Re: Am I the only one ...
I think the technology has been around for a while for women to see how they would look wearing different dresses and so on; it isn't a huge technological leap to "no dress".
Re: Am I the only one ...
I don't know about others here, but I just don't get the point of AI photo editing. Perhaps it's because, when I take a picture of something, I want to record what I can see is actually there and not some machine-edited version of it. Who is going round taking pictures and then thinking, "I want to show this to someone, or post it on my 'socials', to show what I've been up to, but rather than showing what I've really been doing, I want to lie instead"?
I'm left thinking of the TV advert that is doing the rounds (I think it's for a Google phone) which asks the viewer whether they think they should allow their phone to reveal anything it "knows" about a person to their friends. I would have thought the answer to this, from most people, would be a resounding "Hell, no." I'll be the arbiter of what I tell my friends, thank you very much, not an "AI" on my phone which probably relies on sending all my personal data for processing to a data centre in the least regulated part of the US.
As for using "AI" to "touch up" photographs to completely remove people's clothes: since an "AI" is just a statistical model based entirely on its training data, it stands to reason that it wouldn't have this capability unless such images were included in its training data. If it can be used to generate pornographic images of children, then some very pointed questions should be asked about how and why it is able to do this, and whether the owners of the model should be investigated more closely. Who thought that feeding images of naked people into image generation software that isn't specifically intended to generate other images of naked people was a good idea?
Re: Am I the only one ...
Vanity
Re: Am I the only one ...
I think the CSAM ingestion accusation is a bit of a stretch. Most diffusion models can generate fairly photographic images of clearly impossible things by bodging together known concepts (e.g. a pink elephant in a rice paddy, or a Ferrari drawn in the style of Da Vinci).
As for the why of editing photos, the reasons are too broad to list, but the phenomenon of people touching up photos is hardly unique to diffusion models and is as old as photography (have a look at why Photoshop tools like dodge and burn have those icons).
Re: Am I the only one ...
> I think the CSAM ingestion accusation is a bit of a stretch. Most diffusion models can generate fairly photographic images of clearly impossible things by bodging together known concepts (e.g. A pink elephant in a rice paddy or a Ferrari drawn in the style of Da Vinci).
To do these things, the model would have to have been given training data that allowed it to identify an elephant and determine what colour pink is, plus source material of rice paddies and Ferraris, as well as at least a representative sample of the corpus of work by Da Vinci.
Similarly, to render a naked person, it needs to have had at least some source material representing what people look like under their clothes, or it's not going to know what genitalia look like.
I've not seen the supposed output of Grok (I've never even used Grok, since I won't touch Elmo's stuff on principle), so I don't know how realistic or convincing the output is, but the reporting would suggest that it is accurate enough to cause concerns, which implies that it has been fed accurate training data.
Once again, a statistical model cannot produce output that does not meet the patterns of its input. In simple terms, it can either provide an "average" of the data it is provided, interpolate between two data points, or extrapolate from a series. If it hasn't been trained on images of naked people, then this would be a case of extrapolation, and I simply do not believe that an unthinking piece of computer software would correctly extrapolate these images. The conclusion, then, is that it is either "averaging" from a large number of images of naked people, or interpolating between, say, the image of one person clothed, and another naked. You cannot interpolate between two points without knowing both points. This is, of course, a simplification, but the thing here is that "AI" cannot create anything new, and anyone who tells you it can is swindling you.
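A toy sketch of that interpolation point, with made-up three-number vectors standing in for learned representations (a gross simplification, as I said, but it shows the output never leaves the span of the inputs):

    import numpy as np

    # Illustrative stand-ins for learned representations; the names and
    # numbers are invented for this example, not taken from any real model.
    point_a = np.array([0.2, 0.9, 0.4])
    point_b = np.array([0.8, 0.1, 0.5])

    def interpolate(a, b, t):
        """Linear interpolation: every result lies on the segment between a and b."""
        return (1 - t) * a + t * b

    print(interpolate(point_a, point_b, 0.5))  # [0.5 0.5 0.45] -- a blend, nothing new

    # An "average" over many samples tells the same story: it stays inside
    # the convex hull of the points it was given.
    samples = np.stack([point_a, point_b, interpolate(point_a, point_b, 0.25)])
    print(samples.mean(axis=0))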
Re: Am I the only one ...
Sorry if I was unclear; I meant that I believe a diffusion model would be capable of making simulated, fake CSAM based on inputs of just nude adults and clothed children, though I'm really not interested in testing this hypothesis beyond pointing to pink elephants.
a cynical attempt to shutdown the last truly open forum on the Internet
This could be interpreted as a cynical attempt to shut down the last truly open forum on the Internet that is not under the control of the state security apparatus.
Re: a cynical attempt to shutdown the last truly open forum on the Internet
Wah! Free speech! Wah! Must have free speech!
You are Muskrat and I claim my five pounds
Re: a cynical attempt to shutdown the last truly open forum on the Internet
Dear AC. Just a heads-up that your post could be interpreted as a cynical attempt to promote the creation and dissemination of pornographic images of children.
Not that I'd interpret it that way myself.
Oh, no, definitely not.
Re: a cynical attempt to shutdown the last truly open forum on the Internet
It could be. But first one would have to get past the pretzel logic claiming that X is "the last truly open forum on the Internet", which is so insanely laughable as to make one question the poster's connection to reality.
Re: a cynical attempt to shutdown the last truly open forum on the Internet
I'm starting to think that about 90% of the AC posts here are the same person. There certainly do seem to be a lot of very deranged posts made behind the shroud of anonymity, which, again, is especially cowardly given that a user's handle hardly reveals their true identity anyway.
Re: a cynical attempt to shutdown the last truly open forum on the Internet
> the last bastion of Musk love on the Internet
FTFY
Re: a cynical attempt to shutdown the last truly open forum on the Internet
Cool story, bro.
It seems a stretch, verging on an outright lie, to call an AI hallucination of a naked person "undressing someone".
I'm going to suggest that if the undressee were a child, and the images/videos freely and publicly available, their parents might beg to differ with you (and that's just the most extreme end of the issue).
FFS, I cannot see even the remotest justification for the facility to create and disseminate such images being readily available to all and sundry; "freedom of expression" does not even come close. Surely even the most zealous Musk fanboi would concede that freedom of expression is not unconstrained (unless you're a very literal anarchist). I cannot, for example, express my freedom to arbitrarily kill someone I don't like, because, y'know, it's against the law. If laws are currently inadequate to protect people against a novel form of obvious and clear-cut abuse, and provide no redress against such abuse, then the law needs to catch up.
Not just the parents, the po-po might like to have a quiet word with you down at the station, whilst seizing all of your computing devices for forensic examination.
AI depicting an image of a dressed person as an undressed person is actually a defamatory lie. The arrogance of X in making that tool available as part of the platform means Section 230 won't protect it from civil lawsuits for damages, or possibly even criminal charges.
Am I the only one ...
... who is both disgusted by anyone even thinking that 'nudifying' software was 'a good idea', and troubled that at the end of an article about an, at best, highly dubious use of AI there is a link to:
"Sponsored: Unlocking the hidden power of unstructured data with AI"