[-] [email protected] 1 points 22 hours ago* (last edited 22 hours ago)

yeah idk why they said that, it's objectively wrong.

[-] [email protected] 1 points 22 hours ago

Deepfakes do not contain any recorded information about the subject unless that subject is also in the training set.

this is explicitly untrue; they literally do. You are just factually wrong about this. While the subject may not be in the training data, how do you think it manages to replace the face of someone in one picture with the face of someone else in some other video?

Do you think it just magically guesses? No, it literally uses real pictures of someone. In fact, back in the day with GANimation and early deepfake software, you literally had to train these AIs on pictures of the person you wanted to faceswap. Remember all those singing deepfakes that were super popular a couple of years ago? Yep, those were literally trained on real pictures.
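For context on why that training step matters: the classic open-source deepfake tools used a shared encoder with one decoder per identity, so the target's face had to be in the training data for their decoder to exist at all. Here is a toy, untrained numpy sketch of that shared-encoder/two-decoder shape (all dimensions and weights here are made-up illustrations, not any real tool's architecture):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions: flattened 8x8 "face" crops, 16-dim latent code.
FACE_DIM, LATENT_DIM = 64, 16

# One shared encoder, plus one decoder per identity. In real faceswap
# pipelines these are deep conv nets trained on photos of each person;
# random linear maps stand in for them here.
W_enc = rng.normal(scale=0.1, size=(LATENT_DIM, FACE_DIM))
W_dec_a = rng.normal(scale=0.1, size=(FACE_DIM, LATENT_DIM))
W_dec_b = rng.normal(scale=0.1, size=(FACE_DIM, LATENT_DIM))

def encode(face):
    # Shared encoder: compresses any face to a pose/expression code.
    return np.tanh(W_enc @ face)

def swap_face(face_of_a):
    # Encode a real frame of person A, then reconstruct it through
    # person B's decoder: A's pose rendered with B's learned appearance.
    return W_dec_b @ encode(face_of_a)

frame_of_a = rng.normal(size=FACE_DIM)  # stands in for a real photo
swapped = swap_face(frame_of_a)
print(swapped.shape)  # (64,)
```

The point of the sketch is the data flow: `W_dec_b` only produces person B's face because it was fit to real pictures of person B, which is exactly the dependence on recorded images being argued above.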

Regardless, you are still ignoring my point. My question here was: how do we consider AI content to be "not a photo," yet consider photos manipulated numerous times, through numerous different processes, which are quite literally not the original photo, to be a literal "photo"? To rephrase it more simply for you and other readers: "why is AI-generated content not considered a photo, when a heavily altered photo that only vaguely resembles its original in most aspects is?"

You seem to have missed the point of my question entirely, and simply said something wrong instead.

Yes it is semantics

no, it's not; this is a Ship of Theseus premise. Semantics is how we contextualize and conceptualize things into word form. The problem is not semantics (words are just used to convey the problem at hand); the problem is a philosophical conundrum that has existed for thousands of years.

in fact, if we're going by semantics here, "photograph" is rather broad: it literally defines itself as "something in the likeness of," albeit taken by the method of photography. We could arguably drop that latter part and simply use it to refer to something that is a likeness of something else. And we see this in contextual usage of the word: a "photographic" copy often describes something similar enough to something else that, in terms of a photograph, they appear to be the same thing.

Think about scanning a paper document: that would be a photographic copy of a physical item. While it is literally taken via photography, in a contextual and semantic sense it just means the digital copy is photographically equivalent to the physical copy.

[-] [email protected] 1 points 22 hours ago

oh shit, my bad.

[-] [email protected] 1 points 22 hours ago

yeah, but we're also talking about something that quite literally never happened; it was all manufactured, and i say that without wanting to downplay the effects of it.

This is probably the best time ever to start being an e-slut, because you can just say it was deepfaked and people don't exactly have a reason to disagree with you.

Also, while trauma is permanent, i would remind you that every experience you have throughout your life is also permanent; it cannot be changed, undone, or revoked. You simply have to live with it. The only thing that changes your experiences and the memories around them is how you handle them internally.

I would probably be more compassionate if we were literally talking about revenge porn, or whatever the correct term would be here; i'm not sure, i don't exactly fuck people on the regular, so i'm not really qualified here lmao.

But like i said, this is just AI-generated. Everyone knows about AI now; how many people do you think are going to hear that and go "yeah, that makes sense"? Probably most of them. Highschoolers might be a bit more unreasonable, but nothing changes the fact that the images simply aren't real. You just have to do your best to dissociate yourself from the alternate reality where they are, because they quite literally are not.

some people would consider it traumatic, others wouldn't. I wouldn't give a shit either way; i might even further the rumors because i think it would be funny. It's all a matter of perspective.

[-] [email protected] 6 points 1 day ago

Not when you ruin someone else’s life.

we are literally talking about an image that was made out of thin air; the description of "ruining someone's life" is fucking absurd considering the very real alternative in this case.

[-] [email protected] 1 points 1 day ago

I don't think maturity is an explicit, binary thing. I would be ok with the presumption that the age of 18 provides a generally expected range of maturity between individuals; it's when you start to develop your world view and really pick up on the smaller things in life and how they work together to make a functional system.

I think the idea of putting a "line" on it is wrong; i think it's better to describe it as "this is generally what you expect from this subset."

[-] [email protected] 2 points 1 day ago

People that want everyone to be OK with nudity and in most cases diddling kiddo’s. Same arguments, almost verbatim, have been used in the map-sphere.

you say this like they're saying that children have to be naked in order to legally be outside. The point they were making is that the primary reason half of what you said is a significant concern is explicitly our current social climate and its values. While not fully relevant, they still made a point, and considering how bad your argumentative rhetoric is, i'd say it's a fair shot at something you said, given you didn't have much else to offer beyond accusing someone of being a pedophile.

[-] [email protected] 3 points 1 day ago

We have been able to see faces since forever and people are still mocked for having faces that don’t fit the popular norms. Your argument is flawed.

i have vitiligo on my face, have yet to be mocked for it. People only ask about it respectfully.

People still have the right to privacy.

actually, no, you don't. Very few places have legal protections for privacy, online or physical; if you go outside in most states in the US, footage of you is feeding some crime-stopping AI dataset somewhere.

[-] [email protected] -2 points 1 day ago

All 3 of them are changing schools, 2 are failing their years. Because of something someone else did to them. And you claim it’s not that bad? The fuck is wrong with you?

and by the time they're 18 and moving on to college or whatever, they're probably busy not fucking worrying about whatever happened in high school, because at the end of the day you have two options here:

1. be a miserable fuck.
2. try to be the least miserable fuck you can, and do something productive.

Generally people pick the second option.

And besides, at the end of the day, it's literally not real; none of this exists. It's better than having your actual nudes leaked. Should we execute children who spread nudes of other children now? That's a far WORSE crime, because now that shit is just out there, it's almost definitely on the internet, AND IT'S REAL.

Seems to me like you're unintentionally nullifying the consequences of actual, real CSAM here.

Is my comment a little silly and excessive? Yes, that was my point. It's satire.

[-] [email protected] 2 points 1 day ago

It’s not as easy as they want you to believe it is. I’m pretty sure most of the “promotional material” has been photoshopped or cherry picked at best

absolutely, all of the marketing material out there is digitally manipulated by a human to some degree. And if it isn't, then honestly i don't know what you're using AI image generation for lmao.

[-] [email protected] 6 points 1 day ago

They aren’t photos. They’re photorealistic drawings done by computer algorithms. This might seem like a tiny quibble to many, but as far as I can tell it is the crux of the entire issue.

most phone cameras alter the original image with AI processing now; it's really common, they apply all kinds of weird corrections to make it look better. Plus, if it's on social media, there's probably a filter somewhere in there too. At what point does this become the Ship of Theseus?

my point here is that if we're arguing that AI images are, semantically, not photos, then most photos on the internet, including photos of people, would also arguably not be photos to some degree.

[-] [email protected] 2 points 1 day ago

it would be material of and or containing child sexual abuse in it.

