White House Gets Voluntary Commitments From AI Companies To Curb Deepfake Porn (engadget.com)
- Reference: 0174984549
- News link: https://yro.slashdot.org/story/24/09/12/2031226/white-house-gets-voluntary-commitments-from-ai-companies-to-curb-deepfake-porn
- Source link: https://www.engadget.com/ai/white-house-gets-voluntary-commitments-from-ai-companies-to-curb-deepfake-porn-191536233.html
> The participating businesses have laid out the steps they are taking to prevent their platforms from being used to generate non-consensual intimate images (NCII) of adults and child sexual abuse material (CSAM). Specifically, Adobe, Anthropic, Cohere, Common Crawl, Microsoft and OpenAI said they'll be: "responsibly sourcing their datasets and safeguarding them from image-based sexual abuse."
>
> All of the aforementioned except Common Crawl also agreed they'd be: "incorporating feedback loops and iterative stress-testing strategies in their development processes, to guard against AI models outputting image-based sexual abuse" and "removing nude images from AI training datasets" when appropriate. [...] The notable absences from today's White House release are Apple, Amazon, Google and Meta.
Gentleman's agreements (Score:5, Insightful)
These sorts of agreements have worked out so well for the public in recent history. I'm sure there's no way they would renege on such a binding commitment, once the government commits to not legislating.
Re: (Score:2)
There's a reason strip clubs are called gentleman's clubs.
Re: (Score:2)
> These sorts of agreements have worked out so well for the public in recent history. I'm sure there's no way they would renege on such a binding commitment, once the government commits to not legislating.
Those commitments don't work when there's a strong motivation to break them, but that's not the case here. This is more a case of "we commit not to do something incredibly controversial for virtually zero benefit."
Even Grok isn't interested in pron, [1]despite letting virtually everything else through [acs.org.au].
I'm not sure why Apple, Amazon, Google and Meta weren't on the list, though I'm guessing it has more to do with internal red tape than with wanting to make deepfake porn.
[1] https://ia.acs.org.au/article/2024/xai-grok-goes-wild-with-deepfakes-and-nudity.html
We can't get them to stop training on our data... (Score:2)
What makes anybody think they'll expend too much effort to prevent what could well be the killer app for the AI industry?
Are we sure it's not actually Photoshop? (Score:2)
Granted, I'll be the first to admit I've never tried to find an AI porn generator, but it certainly doesn't seem to be as easy as running a Google search. I've already run headfirst into the "responsible AI guidelines" on one of the major ones while trying to make some silly cartoon artwork for AI-sung songs, so I doubt any of the mainstream AI tools will let you make porn. Heck, they don't even let you make political cartoons of well-known celebs without complaining that you're being too naughty.
By comparis
I promise I won't come in your mouth (Score:2)
I always tell the truth and I never lie. Honest.
anyone else skeptical? (Score:2)
I don't think I will trust any of this not to be abused: something claimed to be fake turns out to be true, and something claimed to be true turns out to be fake. Interestingly, it all goes one way, like how the media always gets it wrong in one direction.
Except that by doing so they will fuck it up (Score:1)
The last time someone (Stability AI) tried a little too hard to make its new model (Stable Diffusion 3) as SFW as possible, we got unfixable anatomical monstrosities; you couldn't even prompt for a human lying on grass without getting a nightmarish mess. That wasn't even nerfing, it was straight-up downgrading something that used to work just fine into something broken and unusable. As a result, they fell into irrelevance (as they should have) and got superseded by Flux (from Black Forest Labs). The true point here
Sexual abuse material? (Score:2)
So now I'm abusing someone if I so much as have my own computer make a fake with their face on it? This has gone several steps too far. How to reacquire liberty?
Re:Sexual abuse material? (Score:4, Insightful)
Too many people are profiting from these claims. Hence you would have to reform a major part of law enforcement, politics, and some of the IT security industry.
As to whether that is doable, refer to the "War on drugs".
Re: (Score:1)
> So now I'm abusing someone if I so much as have my own computer make a fake with their face on it?
Yes. You do not have permission to use their likeness. Not hard to understand.
> This has gone several steps too far. How to reacquire liberty?
Not a problem. We'll just make a deepfake of your wife being railed by a pig. Oh wait. You don't have a wife.
Re: (Score:2)
> Not a problem. We'll just make a deepfake of your wife being railed by a pig. Oh wait. You don't have a wife.
Whatever you are into, if you get enjoyment out of it, then it's up to you.
If you never distributed it, how would I know? How would it affect me or my wife in any way? Right now, anyone who sees someone can imagine them doing any sexual act they want.
If it gets distributed, as long as it's clearly labeled as a deepfake, I also don't care. It would probably be lost in the sea of porn anyway, so I probably wouldn't know either.
If it is portrayed as real, then that is fraud and character assassination; however, that is
Re: (Score:2)
> Yes. You do not have permission to use their likeness. Not hard to understand.
Actually, it is hard to understand since it's currently legal to use someone's likeness.
If Taylor Swift walks down the street, I can legally photograph her.
I can't sell the photo for commercial use in advertising or product endorsement, but I can sell it to a newspaper for editorial use, or post it to my Facebook page.
It is also legal for me to edit the photo, rearrange the pixels, or use AI to create an image that looks exactly like her. I can't use it for advertising, but I can post it publicly.
So what, e
Re: (Score:2)
Your freedom to swing your arms ends at my nose.
Re: (Score:2)
Which means distributing a malicious deepfake should be a crime, which actually is probably already covered under numerous other laws. Actually creating it for personal use if nobody ever finds out? Banning that is getting creepily close to thoughtcrime laws.