this post was submitted on 27 Jul 2023
Technology

top 9 comments
[–] [email protected] 25 points 11 months ago (2 children)

All well and good till someone takes a screenshot.

[–] [email protected] 9 points 11 months ago (2 children)

It might be resistant to screenshots - unless I missed it, the article didn't clarify whether the obfuscation process is applied to the image on a per-pixel basis, or within the file format itself...

If it were that easy to bypass, it would be a pretty futile mechanism IMO; one would just need to convert the image to strip out the obfuscation 🫠 or take a screenshot, as you said

[–] [email protected] 18 points 11 months ago (1 children)

Sounds like it makes tiny adversarial changes to the image data to trick the model. But it also sounds specific to each algorithm: while you might trick Stable Diffusion, another model like Midjourney would be unaffected.

And either way, I'd bet mere JPEG compression would be enough to destroy those tiny changes.
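A toy sketch of why tiny per-pixel perturbations are so fragile (this is not the paper's actual method; the ±1 random perturbation and the 3×3 box blur standing in for a re-encode/screenshot pipeline are illustrative assumptions):

```python
import random

# Toy "protection": add a tiny +/-1 perturbation to each pixel of a
# random 32x32 grayscale image, then see how much of that perturbation
# survives a simple 3x3 box blur (a crude stand-in for the smoothing
# that lossy re-encoding, rescaling, or a smudge tool applies).
random.seed(0)
W = H = 32
clean = [[random.randint(0, 255) for _ in range(W)] for _ in range(H)]
perturbed = [[min(255, max(0, p + random.choice((-1, 1))))
              for p in row] for row in clean]

def box_blur(img):
    """Average each pixel with its 3x3 neighborhood (clamped at edges)."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [img[yy][xx]
                    for yy in range(max(0, y - 1), min(h, y + 2))
                    for xx in range(max(0, x - 1), min(w, x + 2))]
            out[y][x] = round(sum(vals) / len(vals))
    return out

def mean_abs_diff(a, b):
    n = sum(len(row) for row in a)
    return sum(abs(p - q)
               for ra, rb in zip(a, b)
               for p, q in zip(ra, rb)) / n

before = mean_abs_diff(clean, perturbed)                      # close to 1.0 by construction
after = mean_abs_diff(box_blur(clean), box_blur(perturbed))   # much smaller: blur averages the noise away
print(f"perturbation before blur: {before:.3f}, after blur: {after:.3f}")
```

Averaging nine independent ±1 deltas shrinks their typical magnitude by roughly a factor of three, which is the same reason low-pass steps like JPEG's quantization of high-frequency coefficients tend to wash out this kind of high-frequency, low-amplitude signal.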

[–] [email protected] 3 points 11 months ago

A couple of minutes in Photoshop with a smudge or burn tool would also negate the effect.

[–] [email protected] 4 points 11 months ago (1 children)

These things never work in the real world. We’ve seen this over and over. It’s snake oil. Latent-space mappings may survive compression, but they don’t transfer across encoders.

[–] [email protected] 2 points 11 months ago

It's about as plausible as scanning a random marking on a human bone and somehow getting a virus installed on your PC.

[–] [email protected] 2 points 11 months ago

Yeah, it might work in the original format under some conditions, but it won't survive a screenshot or saving to another format.

[–] [email protected] 3 points 11 months ago

Once again the time comes to manually Photoshop oneself into handshakes with celebrities.

[–] [email protected] 1 points 11 months ago

The linked white paper's title is very pragmatic-sounding: "Raising the Cost of Malicious AI-Powered Image Editing". I'd like to read it more deeply later to see what mechanisms are actually deployed. I've considered some form of attestation embedded both in the image data and in the file format, linked with a cryptographic signature. You know, for important things like politics, diplomacy, and celebrity endorsements. /s