Google Photos Will Soon Show You If an Image Was Edited With AI
- Reference: 0175319267
- News link: https://tech.slashdot.org/story/24/10/24/2122224/google-photos-will-soon-show-you-if-an-image-was-edited-with-ai
> "Photos edited with tools like Magic Editor, Magic Eraser and Zoom Enhance already include metadata based on technical standards from the International Press Telecommunications Council (IPTC) to indicate that they've been edited using generative AI," John Fisher, engineering director of Google Photos, wrote in a [2]blog post. "Now we're taking it a step further, making this information visible alongside information like the file name, location and backup status in the Photos app."
>
> The "AI info" section will be found in the image details view of Google Photos both on the web and in the app. These labels won't be limited strictly to generative AI, either. Google says it'll also specify when a "photo" contains elements from several different images -- such as when people use the Pixel's Best Take and Add Me features. [...] "This work is not done, and we'll continue gathering feedback and evaluating additional solutions to add more transparency around AI edits," Fisher wrote.
[1] https://www.theverge.com/2024/10/24/24278663/google-photos-generative-ai-label-reimagine-best-take
[2] https://blog.google/products/photos/ai-editing-transparency/
So there's a market for high-quality man-made fakes (Score:2)
Traditional tools and good manual image doctoring leave no trace. Whoever is willing to pay the price will always be able to create convincing fake material.
The danger is that if people get used to assuming no warning from Google means an image is legit, the hand-made fake stuff will become that much more credible.
Photo but not Metadata Edited (Score:2)
So actually it will not show you whether an image was edited with AI. What it will show you is whether an image was edited with a standard AI tool that records the edit in the metadata, and whether that metadata was not subsequently edited. If you use some custom or homebrew AI tool, or edit the metadata afterwards, it will have no clue.
Of course the real fun will start when someone edits the metadata of a non-AI-edited image so that Google reports it as AI-enhanced. If anything, this feature is just going to confuse people.
How detailed is the info? (Score:2)
It needs to give details about what exactly was done. A label that just says AI was used doesn't tell us whether the creator hit the "easy button" for basic picture enhancement or significantly altered the content of the image. Even if the label were embedded in a way a malicious actor couldn't remove (unlikely), they could just lie and explain away the AI usage by claiming it was for basic picture tasks, if no detailed info is given.
So this is like my ComfyUI output then. (Score:2)
Anyone who wants to take the output of my ComfyUI sessions can easily replicate the workflow, because it's embedded in the metadata. If I pull an image into an external editor, this gets broken and disappears. But I actually want to distribute with that metadata, because if people want to replicate the look, I want to make that as simple as possible, on behalf of the people making the tools. If this kind of metadata starts being used as an AI "tell", that's fine. My generations are subject to the same flaws and foibles as everyone else's.
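For context on what the commenter means: ComfyUI saves the workflow as JSON in PNG text metadata, which any PNG chunk walker can read back out. Here's a minimal stdlib-only sketch; the `workflow` keyword matches ComfyUI's convention, but the tiny hand-built PNG and its payload are illustrative assumptions.

```python
import struct
import zlib

PNG_SIG = b"\x89PNG\r\n\x1a\n"

def chunk(ctype: bytes, data: bytes) -> bytes:
    """Assemble one PNG chunk: length, type, data, CRC."""
    return (struct.pack(">I", len(data)) + ctype + data
            + struct.pack(">I", zlib.crc32(ctype + data)))

def text_chunks(png: bytes) -> dict[str, str]:
    """Return {keyword: value} for every tEXt chunk in a PNG."""
    assert png.startswith(PNG_SIG), "not a PNG"
    out, pos = {}, len(PNG_SIG)
    while pos < len(png):
        (length,) = struct.unpack(">I", png[pos:pos + 4])
        ctype = png[pos + 4:pos + 8]
        data = png[pos + 8:pos + 8 + length]
        if ctype == b"tEXt":
            key, _, value = data.partition(b"\x00")
            out[key.decode("latin-1")] = value.decode("latin-1")
        pos += 12 + length  # 4 length + 4 type + data + 4 CRC
    return out

# Minimal PNG carrying a ComfyUI-style workflow entry (toy payload).
png = (
    PNG_SIG
    + chunk(b"IHDR", struct.pack(">IIBBBBB", 1, 1, 8, 0, 0, 0, 0))
    + chunk(b"tEXt", b'workflow\x00{"nodes": []}')
    + chunk(b"IEND", b"")
)

print(text_chunks(png))  # {'workflow': '{"nodes": []}'}
```

Re-encoding the image through an external editor drops these chunks, which is the fragility the comment describes.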
Not good enough. (Score:2)
AI metadata will be easily stripped from an image file.
Better if the metadata were also steganographically embedded in the image, or signed together with a hash of the picture, or... SOMETHING, so the AI marking can't be removed without destroying an indicator of tampering.