

Apple Removes Nonconsensual AI Nude Apps From App Store (404media.co)

(Friday April 26, 2024 @11:20AM (msmash) from the cat-and-mouse dept.)


404 Media:

> Apple has removed a number of AI image generation apps from the App Store after 404 Media found these apps advertised the [1]ability to create nonconsensual nude images, a sign that app store operators are starting to take more action against these types of apps.

>

> Overall, Apple removed three apps from the App Store, but only after we provided the company with links to the specific apps and their related ads, indicating the company was not able to find the apps that violated its policy itself.

>

> Apple's action comes after we reported on Monday that Instagram [2]advertises nonconsensual AI nude apps. By browsing Meta's Ad Library, which archives ads on its platform, when they ran, on what platforms, and who paid for them, we were able to find ads for five different apps, each with dozens of ads. Two of the ads were for web-based services, and three were for apps on the Apple App Store. Meta deleted the ads when we flagged them. Apple did not initially respond to a request for comment on that story, but reached out to me after it was published asking for more information. On Tuesday, Apple told us it removed the three apps on its App Store.



[1] https://www.404media.co/apple-removes-nonconsensual-ai-nude-apps-following-404-media-investigation/

[2] https://www.404media.co/instagram-advertises-nonconsensual-ai-nude-apps/



wat (Score:3)

by drinkypoo ( 153816 )

> Apple has removed a number of AI image generation apps from the App Store after 404 Media found these apps advertised the ability to create nonconsensual nude images

You literally cannot prevent that in an app which can make consensual nude images. Therefore the word nonconsensual is being used in order to trigger people into having a specific opinion. A better description is "an app which can be used to create fake nude images" since it can't literally show you what someone would look like unclothed.

Re: (Score:2)

by jacks smirking reven ( 909048 )

Yeah, this is definitely an "I'll know it when I see it" type of judgement, but to be fair, if most of their customer base were asked "how would you feel about someone using an AI app to perfectly map your face onto a nude body and distribute the images, where most people won't know the difference or see them labelled as fake in any way?", they would have a negative reaction to that.

If I was Tim Apple I would probably make the same call.

Re: (Score:2)

by bugs2squash ( 1132591 )

How will I generate my dating app profile picture now though ?

Re: (Score:2)

by jacks smirking reven ( 909048 )

Go old school, hire a real airbrush artist.

Re: (Score:2)

by drinkypoo ( 153816 )

I'm not against them removing the apps, I'm against the headline trying to make me feel a certain way. Present the facts, I'll decide how I feel about them.

Re: (Score:2)

by cayenne8 ( 626475 )

> Yeah, this is definitely an "I'll know it when I see it" type of judgement, but to be fair, if most of their customer base were asked "how would you feel about someone using an AI app to perfectly map your face onto a nude body and distribute the images, where most people won't know the difference or see them labelled as fake in any way?", they would have a negative reaction to that.

Well....if the resultant images made me much less fat, and a bit more ripped....I dunno...maybe?

;)

jk

Re: (Score:2)

by jacks smirking reven ( 909048 )

Hey, consent means you reserve the right to make yourself as much of a gigachad as you want. That's your right as an American, damnit.

Inked (Score:3)

by bugs2squash ( 1132591 )

Tattoo parlors will become the new bastions of bodily security

And who are you? (Score:2)

by Valgrus Thunderaxe ( 8769977 )

> Overall, Apple removed three apps from the App Store, but only after we provided the company with links to the specific apps and their related ads, indicating the company was not able to find the apps that violated its policy itself.

I'm glad you're here doing all this good, preventing "nonsensical nudity".

Re: (Score:2)

by smooth wombat ( 796938 )

Nonconsensual. Meaning, someone is taking one person's body, slapping a different person's face on it, and passing it off as that person.

Coincidentally, it's almost always men doing it to women.

Re: (Score:2)

by Valgrus Thunderaxe ( 8769977 )

> Nonconsensual.

Yes, this was changed very quickly.

Pictures (Score:2)

by Archangel Michael ( 180766 )

"Pictures. Or it didn't happen!"

AI has ruined the joke.

Images (Score:1)

by necro81 ( 917438 )

Pics or it didn't happen!

Oh, wait. No, I don't want to see that, because that shit's nasty and harmful. The existence of such images demonstrates the widespread existence of a tool that can be used to nasty, spiteful, disturbing ends against me, my family, people I know and care about, and the vast overwhelming majority of people that do not want that technology used on them.

I understand the technology cannot be unmade; Pandora's box is already wide open. But that also doesn't mean Apple (or whoever...
