279 points (92.9% liked)
submitted on 23 Jun 2024 (5 days ago) by [email protected] to c/[email protected]
[-] [email protected] 31 points 3 days ago* (last edited 3 days ago)

I apologize for the inappropriate behavior and bans by @[email protected] in this thread. I've removed them as a mod here, banned them, and unbanned the people who they inappropriately banned.

Note: if they get unbanned in the near future, it's because of our consensus procedure, which requires us admins to take a vote.

[-] [email protected] 6 points 3 days ago

Appreciate it.

[-] [email protected] 6 points 3 days ago

Thank you

They have another account on lemmygrad: https://lemmygrad.ml/u/TheAnonymouseJoker

Is anyone around who knows the admins there, so they can take a look too?

[-] [email protected] 43 points 4 days ago* (last edited 1 day ago)
[-] [email protected] 113 points 5 days ago

His record should be expunged when he turns 18 because it was a crime he committed as a child. I understand their frustration, but they're asking to jail a child over some photoshopped images.

Making a deepfake is definitely not a heavy crime that deserves jail time or a permanent mark unless the person doing it was an adult.

[-] [email protected] 40 points 5 days ago* (last edited 5 days ago)

My personal belief still is that the prohibitive approach is futile and ultimately more harmful than the alternative: embrace the technology, promote it and create deepfakes of everyone.

Soon the taboo will be gone, the appeal as well, and everyone will have plausible deniability too, because if there are dozens of fake nudes of any given person then who is to say which are real, and why does it even matter at that point?

This would be a great opportunity to advance our societal values and morals beyond prudish notions, but instead we double down on them.

Edit: just to clarify, I do not at all want to endorse creating nudity of minors here. I'm just pointing out that the girl in the article wouldn't have to humiliate herself trying to do damage control in the above scenario, because it would be entirely unimportant.

[-] [email protected] 63 points 5 days ago

While I think removing the stigma associated with having deepfakes made of you is important, I don't think that desensitization through exposure is the way to go about it. That will cause a lot of damage leading up to the point you're trying to reach.

[-] [email protected] 23 points 5 days ago

This sounds like a cool idea because it is a novel approach, and it appeals to my general heuristic of the inevitability of technology and freedom. However, I don't think it's actually a good idea. People are entitled to privacy, on this I hope we agree -- and I believe this is because of something more fundamental: people are entitled to dignity. If you think we'll reach a point in this lifetime where deepfakes are too commonplace to be a threat to someone's dignity, I just don't agree.

Not saying the solution is to ban the technology though.

[-] [email protected] 16 points 5 days ago

When you put photos of yourself on the internet, you should expect anyone to find them and do whatever they want with them. If you aren't expecting that, then you aren't educated enough about how the internet works, and that's what we should be working on. Social media is really bad for privacy, and many people are not aware of it.

Now if someone took a picture of you and then edited it without your consent, that is a different action and a much more serious offense.

Either way, deepfakes are just an evolution of something that already existed before and isn't going away anytime soon.

[-] [email protected] 19 points 5 days ago

It's also worth noting that too many people put out way too much imagery of themselves online. People have got to start expecting that anything they put out in public becomes, effectively, public domain.

[-] [email protected] 19 points 5 days ago

You don't turn 18 and magically discover your actions have consequences.

"Not a heavy crime"? I'll introduce you to Sarah, Marie and Olivia. You can tell them it was just a joke. You can tell them the comments they've received as a result are just jokes. The catcalling, mentions that their nipples look awesome, that their pussies look nice, etc are just jokes. All 3 of them are changing schools, 2 are failing their years. Because of something someone else did to them. And you claim it's not that bad? The fuck is wrong with you?

[-] [email protected] 34 points 5 days ago

Kids are kids until 18 because people mature at different rates. At 18 it is safe to assume most have matured enough. This kid could be 18 mentally, but he could also be 13 mentally.

Why are you resorting to emotional manipulation to justify punishing this one kid as if he were an adult?

Here, let me show you what you just did. Let me introduce you to Steve. His life was ruined because he made a deepfake of a girl he liked and sent it to a friend, but he shouldn't have trusted that friend, because the deepfake ended up on every phone in class. Steve got a 3-year sentence, forcing him to drop out early, and due to his permanent record he would forever be grouped with rapists and could never find a job. He killed himself at 21. And you claim it's not that bad? The fuck is wrong with you?

[-] [email protected] 18 points 4 days ago* (last edited 4 days ago)

Perhaps at least a small portion of the blame for what these girls are going through should be laid upon the society which obstinately teaches that a woman's worth as a person is so inextricably tied to her willingness and ability to maintain the privacy of her areolas and vulva that the mere appearance of having failed in the endeavour is treated as a valid reason to disregard her humanity.

[-] [email protected] 60 points 5 days ago

“This kid who is not getting any kind of real consequence other than a little bit of probation, and then when he’s 18, his record will be expunged, and he’ll go on with life, and no one will ever really know what happened,” McAdams told CNN.

“If [this law] had been in place at that point, those pictures would have been taken down within 48 hours, and he could be looking at three years in jail...so he would get a punishment for what he actually did,” McAdams told CNN.

There's a reason kids are tried as kids and their records are expunged when they become adults. Undoing that will just ruin lives without lessening occurrences.

“It’s still so scary as these images are off Snapchat, but that does not mean that they are not on students’ phones, and every day I’ve had to live with the fear of these photos getting brought up resurfacing,” Berry said. “By this bill getting passed, I will no longer have to live in fear knowing that whoever does bring these images up will be punished.”

This week, Republican Senator Ted Cruz, Democratic Senator Amy Klobuchar and several colleagues co-sponsored a bill that would require social media companies to take down deep-fake pornography within two days of getting a report.

“[The bill] puts a legal obligation on the big tech companies to take it down, to remove the images when the victim or the victim's family asks for it,” Cruz said. “Elliston's Mom went to Snapchat over and over and over again, and Snapchat just said, ‘Go jump in a lake.’ They just ignored them for eight months.”

BS

It's been possible for decades for people to share embarrassing pictures of you, real or fake, on the internet. Deep fake technology is only really necessary for video.

Real or fake pornography involving unwilling participants (revenge porn) is already illegal and already taken down, and because the girl is underage it's extra illegal.

Besides the legal aspect, the content described in the article, which may be an exaggeration of the actual content, is clearly in violation of Snapchat's rules and would have been taken down:

  • We prohibit any activity that involves sexual exploitation or abuse of a minor, including sharing child sexual exploitation or abuse imagery, grooming, or sexual extortion (sextortion), or the sexualization of children. We report all identified instances of child sexual exploitation to authorities, including attempts to engage in such conduct. Never post, save, send, forward, distribute, or ask for nude or sexually explicit content involving anyone under the age of 18 (this includes sending or saving such images of yourself).
  • We prohibit promoting, distributing, or sharing pornographic content, as well as commercial activities that relate to pornography or sexual interactions (whether online or offline).
  • We prohibit bullying or harassment of any kind. This extends to all forms of sexual harassment, including sending unwanted sexually explicit, suggestive, or nude images to other users. If someone blocks you, you may not contact them from another Snapchat account.
[-] [email protected] 15 points 4 days ago* (last edited 3 days ago)

Wary of the bill. It seems like every bill involving stuff like this is either designed to erode privacy or to enable regulatory capture.

Edit: spelling

[-] [email protected] 4 points 3 days ago

Introducing the AI Transparency Act, which requires every generative prompt to be registered in a government database.

[-] [email protected] 5 points 3 days ago

And that's what I loathe about the idiots who are for this stuff. Yes, I want to curb this stuff, but for fuck's sake there are ways to do it that aren't "give big government every scrap of data on you".

There are ways to prove I'm over 18 without needing to register my ID with a porn company, and ways to regulate CSAM without having to read private messages. But we have the center of a Venn diagram of idiot and control freak in Congress, and they'll happily remove all of our rights over some fear of the boogeyman.
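To make that concrete, here's a minimal sketch of the kind of privacy-preserving age attestation meant here, in Python with the `cryptography` library. Everything in it (the issuer, the token format, the flow) is an illustrative assumption, not any real system; a production scheme would also need blind signatures or anonymous credentials so the issuer can't link tokens to the sites that redeem them.

```python
# Sketch: an issuer that already verified your age (a bank, a state ID
# provider) signs a bare "over 18" token carrying no identity. The site
# verifies the issuer's signature and learns a property, not a person.
import os
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# --- Issuer side ---
issuer_key = Ed25519PrivateKey.generate()
issuer_pub = issuer_key.public_key()   # published so sites can verify tokens

token = b"over18:" + os.urandom(16)    # random nonce; no name, no ID number
signature = issuer_key.sign(token)

# --- Site side ---
issuer_pub.verify(signature, token)    # raises InvalidSignature if forged
print("age attestation accepted; no identity disclosed")
```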

[-] [email protected] 25 points 5 days ago

Odd that there is no mention of the parents contacting the police and working through them to get the images down. Technically and legally, the photos would be considered child porn. Since it's over the internet, it would bring federal charges, even though there may also be state charges. Something was handled wrong if all the kid is getting is probation.

[-] [email protected] 21 points 4 days ago* (last edited 4 days ago)

photos

They aren't photos. They're photorealistic drawings done by computer algorithms. This might seem like a tiny quibble to many, but as far as I can tell it is the crux of the entire issue.

There isn't any actual private information about the girls being disclosed. The algorithms, for example, do not and could not know about and reproduce an unseen birthmark, mole, tattoo, piercing, etc. A photograph would have that information. What is being shown is an approximation of what similar-looking girls in the training set look like, with the girls' faces stitched on top. That is categorically different from something like revenge porn, which is purely private information specific to the individual.

I'm sure it doesn't feel all that different to the girls in the pics, or to the boys looking at it for that matter. There is some degree of harm here without question. But we must tread lightly because there is real danger in categorizing algorithmic guesswork as reliable which many authoritarian types are desperate to do.

https://www.wired.com/story/parabon-nanolabs-dna-face-models-police-facial-recognition/

This is the other side of the same coin. We cannot start treating the output of neural networks as facts. These are error-prone black boxes, and that fact must be driven hard into the consciousness of every living person.

For some, I'm sure purely unrelated, reason, I feel like reading Philip K. Dick again...

[-] [email protected] 23 points 5 days ago

Shitty companies are selling AI editing tools explicitly for this purpose. Their ads are all over Instagram. They've been found in supposedly regulated app stores. Yet, I've never seen anyone report on this trash industry.

There is no stopping the existence of these tools when running on local hardware, but it shouldn't be this easy for teenagers. Somehow these companies manage to make money while real sex workers find themselves shoved into platforms like OnlyFans because no credit card company will process their payments. That's the wrong way around!

[-] [email protected] 11 points 4 days ago

Why do I get the gut feeling that this is going to be an utter clusterfuck of a mess?

Hopefully I'm wrong.

[-] [email protected] 16 points 4 days ago

Is it CSAM if it was produced by AI?

[-] [email protected] 28 points 4 days ago* (last edited 4 days ago)

It should be considered illegal if it was used to harm/sexually abuse a child, which in this case it was.

Whether it should be classed as CSAM or as something separate, I tend to think probably something separate, along the lines of a revenge porn law. That still allows distinguishing between this and, say, a girl whose uncle groomed and sexually abused her while filming it: while this is awful, it can be (and often seems to be) the product of foolish youth, rather than the offender and everyone involved being very sick, dangerous, and violently offending adult pedophiles victimizing children.

Consider the following:

  1. An underage girl takes a picture of her own genitals. This unfortunately falls under the unhelpful and harmful term "child porn," and she can be charged and registered as a sex offender, but it's not CSAM and shouldn't be considered illegal material or a crime (though it is, because the West has a vile fixation on puritanism, which hurts survivors of childhood sexual trauma as well as adults).

  2. An underage girl takes a picture of her genitals and sends it to her boyfriend. Again, this shouldn't be CSAM (though it may unfortunately be charged similarly): she consented, and we can assume there wasn't any unreasonable level of coercion. What it is bound by, unfortunately, are certain notions of puritanism that are very American.

  3. Following from 2, the boyfriend shares it with other boys. Now it's potentially CSAM, or at the least revenge porn of a child, since she didn't consent and it could be used to harm her, but punishment has to be moderated by the fact that the offender is likely a child himself and not fully able to comprehend his actions.

  4. An underage boy cuts out a photo of an underage girl he likes, only her face and head, glues it atop a picture of a naked porn actress, maybe a petite one, and uses it for his own purposes in private. Not something I think should be classed as CSAM.

  5. An underage boy uses AI to do the same as above, but more believably. Again, I think it's kind of creepy, but if he keeps it to himself and doesn't show anyone or spread it around, it's just youthful weirdness, though he really shouldn't have easy access to those tools.

  6. An underage boy uses AI to do the same as 4-5, but this time he spreads it around, defaming the girl. She and her friends find out, people say mean things about her, and she has to go to school with a bunch of people who are looking at and pleasuring themselves to fake but realistic images of her against her consent, which is violating and makes one feel unsafe. Worse, she's probably bullied over it: mean comments, being called the s-word, etc.

Kids are weird and do dumb things. Unfortunately, boys especially in our culture have a propensity to do things that hurt girls, far more than the inverse, to the point that it's not even really worth talking about girls being creepy or sexually abusive towards peer-aged boys in adolescence and young adulthood. To address this, though, you need to address patriarchy and misogyny on a cultural level: teach boys empathy and respect for girls and women, and frankly do away with all this abusive pornography that's so prevalent and popular, which encourages and perpetuates abusive actions and mentalities towards women and girls. This will never happen in the US, however, because it's structurally opposed to being able to do such a thing. It also couldn't hurt to peel back the stigma and shame around sexuality and nudity in the US, which stems from its reactionary Christian culture, but again I don't think that will ever happen in the US as it exists, not this century anyway.

Obviously I'm not getting into adults here, as that doesn't need to be discussed; it's wrong, plain and simple.

Bottom line, I think, is that companies need to be strongly compelled to quickly remove revenge-porn-type material, which this definitely is, regardless of the age of the victim (though children can't deal with this kind of thing as well as adults, so the risk of suicide or other self-harm is much higher, and it should be treated as higher priority). It's abusive and unacceptable, and companies should fear the credit card companies coming down on them hard and destroying them if they don't aggressively remove it, ban it, and report those sharing it. It should be driven off the clear web once reported; there should be an image-hash dataset like that used for CSAM (but separate) for such things, and major services should use it to stop the spread.
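For a sense of how such a hash dataset works: services compute a compact perceptual fingerprint of each reported image, and new uploads are compared against the shared list so that near-duplicates (recompressed, resized, lightly cropped) still match. Below is a minimal sketch of the idea in Python using the Pillow and imagehash libraries; the database contents and distance threshold are illustrative assumptions, and real systems like PhotoDNA use far more robust, non-public algorithms.

```python
# Sketch of perceptual-hash matching, the mechanism behind shared
# image-hash datasets used for takedown (illustrative only).
from PIL import Image
import imagehash

# Hypothetical database of hashes of previously reported images.
reported_hashes = {
    imagehash.hex_to_hash("fcf8f0e0c0808000"),  # example 64-bit pHash entry
}

MAX_DISTANCE = 5  # Hamming-distance threshold; lower = fewer false positives

def is_known_reported(path: str) -> bool:
    """True if the image is a near-duplicate of a reported one."""
    h = imagehash.phash(Image.open(path))  # 64-bit perceptual hash
    return any(h - known <= MAX_DISTANCE for known in reported_hashes)

if is_known_reported("upload.jpg"):
    print("block upload and flag for review")
```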

[-] [email protected] 31 points 4 days ago

In this case, yes. Content that is visually indistinguishable from a photo is considered CSAM. We don't need any new laws about AI to get these assholes; revenge porn laws and federal CSAM statutes will do.

[-] [email protected] 13 points 4 days ago* (last edited 4 days ago)

I believe that in the US, for all intents and purposes, it is, especially if it was sourced from a minor, because you don't really have an argument against that one.
