this post was submitted on 23 Jun 2024
Technology
[–] [email protected] 25 points 1 week ago (15 children)

Odd that there is no mention of the parents contacting the police and working through them to get the images taken down. Technically and legally, the photos would be considered child porn. Since it's over the Internet, it would bring federal charges, even though there may be state charges as well. Something was handled wrong if all the kid is getting is probation.

[–] [email protected] 21 points 1 week ago* (last edited 1 week ago) (10 children)

> photos

They aren't photos. They're photorealistic drawings done by computer algorithms. This might seem like a tiny quibble to many, but as far as I can tell it is the crux of the entire issue.

There isn't any actual private information about the girls being disclosed. The algorithms, for example, do not and could not know about and produce an unseen birthmark, mole, tattoo, piercing, etc. A photograph would have that information. What is being shown is an approximation of what similar-looking girls in the training set look like, with the girls' faces stitched on top. That is categorically different from something like revenge porn, which is purely private information specific to the individual.

I'm sure it doesn't feel all that different to the girls in the pics, or to the boys looking at them for that matter. There is some degree of harm here without question. But we must tread lightly, because there is real danger in categorizing algorithmic guesswork as reliable, which many authoritarian types are desperate to do.

https://www.wired.com/story/parabon-nanolabs-dna-face-models-police-facial-recognition/

This is the other side of the same coin. We cannot start treating the output of neural networks as facts. These are error-prone black boxes, and that fact must be driven hard into the consciousness of every living person.

For some, I'm sure purely unrelated reason, I feel like reading Philip K. Dick again...

[–] [email protected] 1 points 5 days ago

Whether or not you consider them photos, the DOJ considers them child porn, and you will still go to jail.
