Yes, finding out that your peers have been sharing deepfake pornography of you is absolutely fine and a normal thing for young girls to go through in school. No girls have ever killed themselves because of this exact sort of thing, surely. This definitely will not add in any way to the way women and girls are made to feel entirely, disgustingly dehumanized by every man or boy in their lives: groups of men and boys reducing them and their bodies to vivid sexual fantasies that they can quickly turn into photorealistic images.
If the person in the image is underage, then it should be classified as child pornography. If the woman whose photo is being used hasn't consented to this, then it should be classified as sexual exploitation.
Women and girls have faced degrees of this kind of sexual exploitation by men and boys since the latter half of the 20th century. But this is a severe escalation in that behavior. It should be illegal to do this and it should be prosecuted when and where it is found to occur.
It's bullying with a sexual element. The fact that it uses AI or deepfakes is secondary, just as it was secondary when it was photoshop, just as it was secondary when it was cutting out photos. It's always about using it to bully someone.
This is different because it's easier. It's not really different because it (can be) more realistic, because it was never about being realistic, otherwise blatantly unrealistic images wouldn't have been used to do it. Indeed, the fact that it can be realistic will help blunt the impact of the leaking of real nudes.
It's sexually objectifying the bodies of girls and turning them into shared sexual fantasies that their male peers are engaging in. It is ABSOLUTELY different because it is more realistic. We are talking about entire deepfake pornography production and distribution groups IN THEIR OWN SCHOOLS. Teenage boys cutting out pictures and photoshopping them was nowhere near as common as this is fast becoming, and it was NOT the same as seeing a naked body algorithmically derived to appear as realistic as possible.
Can you stop trying to find a silver lining in the sexual exploitation of teenage girls? You clearly don't understand the kind of long-term psychological harm that is caused by being exploited in this way. It was also exploitative and also fucked up when it was in photoshop; this is many orders of magnitude more sophisticated and accessible.
You're also wrong that this is about bullying. It's an introduction to girls being tools for male sexual gratification. It's LITERALLY commodifying teenage girls as sexual experiences and then sharing them in groups together. It's criminal. The consent of the individual has been entirely erased. Dehumanization in its most direct form. It should be against the law and it should be prosecuted very seriously wherever it is found to occur.
Can you please use words according to their meaning?
Also, I'll have to be blunt: every human has their own sexuality, with their own level of "drive", so to speak, and their own dreams.
And it's absolutely normal to dream of other people. Including sexually. Including those who don't like you. Not only men do that, too. There are no thought crimes.
So when you talk about it being easier or harder, you are not making any argument at all.
However. As I said elsewhere, the actions that really harm people should be classified legally and addressed - like sharing such stuff. But not as making child pornography, because it's not, and not as sexual exploitation, because it's not.
It's just that your few posts I've seen in this thread seem to say that certain kinds of thought should be illegal, and that's absolute bullshit. And laws shouldn't be made based on such emotions.
I don’t know where you’re getting this “thought crime” stuff. They’re talking about boys distributing deepfake nudes of their classmates. They’re not talking about individuals fantasizing in the privacy of their own homes. You have to read all of the words in the sentences, my friend.
If a boy fantasises sexually about a girl, is that harmful to her? If he tells his friends about it? No, this is not harmful - these actions do not affect her in any way. What affects the girl is how the boys might then treat her differently than they would someone they don't find sexually attractive.
The solution, in both cases, has to be to address the harmful behaviour. The only arguments for criminalising deepfakes themselves are also arguments for criminalising sexual fantasies. That is why people are talking about thought crime: once you criminalise things that are harmless on their own, but which might down the line lead to directly harmful behaviour, there is no other distinction.
Both of these, for example, apply just as readily to discussing a shared sexual fantasy about someone who didn't agree to it.
No distinction, that is, other than this is new and icky. I don't want government policy to be dictated by fear of the new and by what people find icky, though. I do lots of stuff people find icky.
No, an image that is shared and distributed is not the same as a fantasy in someone's head. That is deranged. Should CSAM also be legal because making it illegal is like criminalizing the fantasies of pedophiles? Absolutely insane logical framework you have there.
This isn't fantasy. It is content. It is media. It is material. It is produced without the consent of the girls and women being sexualized, and it commodifies their existence, literally transforming the idea of them into sexual media consumed for the gratification of boys and men.
It is genuinely incredible to me that you could be so unempathetic, so impassive, so detached from the real world and the consequences of this, that you could even make this comparison. You have seemingly no idea what you're talking about if you believe that pornography is the same thing as mental fantasies.
And even in the case of mental fantasies, are those all good? Is it really a good thing that boys see the mere existence of the girls around them as inherently some kind of sexual availability?
When someone makes child porn they put a child in a sexual situation - which is something that we have amassed a pile of evidence is extremely harmful to the child.
For all you have said - "without the consent" - "being sexualised" - "commodifies their existence" - you haven't told us what the harm is. If you think those things are in and of themselves harmful then I need to know more about what you mean because:
I am not unempathetic, but I attribute the blame for what makes me feel bad about the situation to the fact that girls are being made to feel bad and ashamed, not to the fact that a particular technology is now being used in one step of that.
I am just genuinely speechless that you seemingly do not understand how sickening and invasive it is for your peers to create and share sexual content of you without your consent. Yes, it's extremely harmful. It's not a matter of feeling ashamed; it's a matter of literally feeling like your value to the world is dictated by your role in the sexualities of heterosexual boys and men. It is feeling like your own body doesn't belong to you but can be freely claimed by others. It is losing trust in all your male friends and peers, because it feels like, without you knowing, they've already decided that you're a sexual experience for them.
We do know the harm of this kind of sexualization. Women and girls have been talking about it for generations. This isn't new, just a new streamlined way to spread it. It should be illegal. It should be against the law to turn someone's images into AI-generated pornography. It should also be illegal to share those images with others.
Why is it these things? Why does someone doing something with something which is not your body make it feel like your body doesn't belong to you? Why does it not instead make it feel like images of your body don't belong to you? Several of these things could equally be used to describe the situation when someone is fantasised about without their knowledge - why is that different? In Germany there's a legal concept called "right to one's own image" but there isn't in many other countries, and besides, what you're describing goes beyond this.
My thinking behind these questions is that I cannot see anything inherent, anything necessary, about the creation of fake sexual images of someone which leads to these harms; instead, there is an aspect of our society which very explicitly punishes and shames people - women far more so than men - for being in this situation, and without that, we would be having a very different conversation.
Starting from the position that the harm is in the creation of the images is like starting from the position that the harm of rape is in "defiling" the person raped. Rape isn't wrong because it makes you worthless to society - society is wrong for devaluing rape victims. Society is wrong for devaluing and shaming those who have fake images made of them.
Can you be more explicit about what it's the same as?
The sexualization of women and girls is pervasive across literally every level of western culture. What do you think the purpose is of the victim's head and face being in the image? Do you believe that it plays an incidental and unrelated role? Do you believe that finding out there is an entire group of people who you thought were your friends but who are in actuality taking pictures of your head and masturbating to the idea of you performing sex acts for them, using algorithmically derived likenesses of your naked body, has no psychological consequences for you whatsoever? I'm just talking about it and it makes me want to throw up. It is a fucking nightmare. This is not normal. This is not creating a healthy relationship with sexuality, and it is enforcing a view of women and their bodies as existing for the gratification of men.
You continuously attempt to extrapolate some very bizarre metaphors about this that are not at all applicable. This scenario is horrifying. Teenage girls should not be subject to scenarios like this. It is sexual exploitation. It is dehumanization. It promotes misogynistic views of women. This is NOT a matter of sexual liberation. You're essentially saying that men and boys can't be expected to treat girls and women as actual people and instead must be allowed to turn their friends and peers into fetishized media content they can share amongst each other. That's fucking disgusting. The longer you talk, the more you start to sound like an incel. I'm not saying you are one, but this is the kind of behavior that they defend.
Do you think the consequences of finding out are significantly different than finding out they're doing it in their imagination? If so, why?
And, just to be clear, by this you mean the stuff with pictures, not talking or thinking about them? Because, again, the words "media content" just don't seem to be key to any harm being done.
Your approach is consistently to say "this is harmful, this is disgusting", but not to say why. Likewise you say that the "metaphors are not at all applicable" but you don't say at all what the important difference is between "people who you thought were your friends but are in actuality taking pictures of your head and masturbating to the idea of you performing sex acts for them using algorithmically derived likenesses of your naked body" and "people who you thought were your friends but are in actuality imagining your head and masturbating to the idea of you performing sex acts for them using imagined likenesses of your naked body". Both acts are sexualisation, both are done without consent, and both could cause poor treatment by the people doing it.
I see two possibilities - either you see this as so obviously and fundamentally wrong that you don't have a way of describing why, or you know that the two scenarios are fundamentally similar but know that the idea of thought-crime is unsustainable.
Finally it's necessary to address the gendered way you're talking about this. While obviously there is a huge discrepancy in male perpetrators and female victims of sexual abuse and crimes, it makes it sound like you think this is only a problem because, or when, it affects women and girls. You should probably think about that, because for years we've been making deserved progress at making things gender-neutral and I doubt you'd accept this kind of thing in other areas.
There is an institution in society specifically designed to strip women of their autonomy, reduce them down to their sexual appeal to men, and proliferate the notion of their inherent submission to men. This simply does not exist the other way around. This will not be a major problem for boys; teenage girls are not creating fucking AI porn rings with pictures of boys from their classes. That isn't happening. Will someone do it? Almost certainly. Is it a systemic issue? No. Men's bodies are not attacked institutionally in this way.
And you're still trying to equate imagination with physical, tangible media. And to be clear, if several of my friends said they were collectively beating off to the idea of me naked, I would be horrified and disgusted. The overwhelming majority of people would. Again, they've taken you, an actual person they know and are friends with, and have turned you into a sexual goal to be attained. It is invasive, exploitative, and above all else dehumanizing. Yeah, if even one of my friends told me he jerked off to the thought of me naked, I would never see him the same way again and would stop being friends with him. If I was a teenager it would probably fuck me up pretty bad to know that someone who I thought was my friend just saw me as a collection of sexual body parts with a face attached. If I found out that a whole group of boys, some of whom I might not even know, were sharing AI-generated porn with my face, it would be severely psychologically traumatizing and would probably shake my trust in men and boys for the rest of my life. This isn't a fucking game. You're acting like this is normal. It's NOT FUCKING NORMAL. Photoshopping the face of a girl in your class onto a nude body and sharing it with a group of boys is NOT NORMAL. That is severely disturbed behavior. That shows a complete malfunction in your empathy. It does if that's your imagination too. And finding out that somebody has done that is absolutely repulsive.
And no, I find it perfectly sustainable. We have no means by which to detect pedophiles by their thoughts. But pedophilic thoughts are still wrong and are not something we tolerate people expressing. Creating CSAM is still illegal, whether or not the child is aware such content is being created of them. They can't consent to that, as they are children. This is the same. No, we can't fucking read people's thoughts and punish them for them. Having thoughts like that is absolutely a sign of some obsessive tendencies and an already forming devaluation of women and girls and a reduction of them to their bodies, but the correct thing is for them to receive counseling and proper education about sex and relationships. Creating, sharing, and distributing AI-generated porn of someone is so fundamentally different from that, I have to think you have a fundamental misunderstanding about what an image is. This isn't a fucking thought. These boys and men can do whatever they want with this pornography they've made of you, can send it to whoever they want, and share it as far and wide as they want. They have literally created porn of you without your consent. And for teenage girls this is a whole other level of fucked up. This is being used to produce CSAM. They cannot consent to this. It is a provable act of violation of women and girls. This should be illegal and should be treated extremely seriously when teenage boys are found to have done it.
You all say you're feminists until someone comes after your fucked-up sexualities and your porn addictions. Always the same.
Are you OK with sexually explicit photos of children taken without their knowledge? They’re not being actively put in a sexual situation if you’re snapping photos with a hidden camera in a locker room, for example. You ok with that?
The harm is:
No, but the harm certainly is not the same as CSAM and it should not be treated the same.
As far as I know there is no good evidence that this is the case, and it's a big controversy in the topic of fake child porn, i.e. whether it leads to more child abuse (encouraging paedophiles), less (gives them a safe outlet), or no change.
If someone fantasises about me without my consent I do not give a shit, and I don't think there's any justification for it. I would give a shit if it affected me somehow (this is your first bullet point, but for a different situation, to be clear) but that's different.
Hm. I wasn’t expecting the pro-child porn argument. All I can say is that’s absolutely legally and morally CSAM, and you’re fuckin nasty. Oof. Not really gonna bother with the rest because, well, yikes.
Hey, it's OK to say you just don't have any counter-argument instead of making blatantly false characterisations.
Historically, the respectability of a woman depended on her sexuality. In many conservative cultures and communities, that is still true. Spreading the message that deepfakes are some particular horrible form of harassment reinforces that view.
If having your head put on the body of a nude model is a terrible crime, then what does that say about the nude model? What does it say about women who simply happen to develop a larger bosom or lips? What does it say about sex before marriage?
The implicit message here is simply harmful to girls and women.
That doesn't mean that we should tolerate harassment. But it needs to be understood that we can do no more to stop this kind of harassment than we can do to stop any other kind.
This is just apologia for the sexual commodification and exploitation of girls and women. There literally is no girl being sexually liberated here; she has literally had the choice taken from her. Sexual liberation does NOT mean "boys and men can turn all women into personal masturbation aids". This ENFORCES patriarchy and the subjugation of women. It literally teaches girls that their bodies do not belong to them, that it's totally understandable for boys to strip them of humanity itself and turn them into sex dolls.
The most deepfaked women are certainly actresses or musicians; attractive people that appear on screens and are known by much of the population.
In some countries, they do not allow people to appear on-screen exactly because of that. Or at least, that's one justification. If the honor or humanity of a woman depends on sexual feelings that she might or might not arouse in men, then women cannot be free. And men probably can't be free either.
At no point have I claimed that anyone is being liberated here. I do not know what will happen. I'm just pointing out how your message is harmful.
Spoken like someone who hasn't been around women.
You mean like a nerd who reads too much?
Sexual attraction doesn't necessarily involve dehumanization. Unlike most other kinds of interest in a human being, it doesn't require interest in their personality, but these are logically not the same.
In general you are using emotional arguments for things that work not through emotion, but through literal interpretation. That's like using metric calculations for a system that expects imperial. Utterly useless.
No, it's not. It's literally a photorealistic drawing based on a photo (and a dataset to make the generative model). No children have been abused to produce it. Laws work literally.
No, because the woman is not being literally sexually exploited. Her photo being used without consent is, I think, the subject of some laws. There are no new fundamental legal entities involved.
I think I agree. But it's neither child pornography nor sexual exploitation and can't be equated to them.
There are already existing laws for such actions, similar to using a photo of the victim and a pornographic photo, paper, scissors, pencils and glue. Or, if you think the situation is radically different, there should be new punishable crimes introduced.
Otherwise it's like punishing everyone caught driving while drunk for non-premeditated murder. One is not the other.
Hey so, at least in the US, drawings can absolutely be considered CSAM
Well, US laws are all bullshit anyway, so makes sense
Normally yeah, but why would you want to draw sexual pictures of children?
Suppose I'm a teenager attracted to people my age. Or suppose I'm medically a pedophile, which is not a crime, and then I would need that.
In any case, for legal and moral purposes "why would you want" should be answered only with "not your concern, go eat shit and die".
I feel like you didn't read my comment thoroughly enough. I said it can constitute CSAM. There is a surprising amount of leeway for teenagers, of course.
But no, I'm not gonna let you get away that easily. I want to know why you think it's morally okay for an adult to draw sexually explicit images of children. Please, tell me how that's okay?
Because morally it's not your fucking concern what others are doing in the supposed privacy of their personal spaces.
It seems to be a very obvious thing that your nose doesn't belong there and you shouldn't stick it there.
I don't need any getting away from you, you're nothing.
No. That's not a good enough excuse to potentially be abusing children.
I can't think of a single good reason to draw those kinds of things. Like at all. Please, give me a single good reason.
It's good enough for the person whose opinion counts; yours doesn't. And there's no such potential.
Too bad.
To reinforce that your opinion doesn't count is in itself a good reason. The best of them all, really.
Okay, so you have no reason. Which is because having sexually explicit images of children, drawn or otherwise, is gross and weird and disturbing. And the fact that you are continually doubling down shows me that you likely need your hard drives and notebooks checked.
Please don't respond again unless you are telling me what country you are from so I can report you to the appropriate authorities.
People don't need reasons to do things that are gross or disturbing or whatever to you in their own space.
Thankfully that's not your concern, and would get you in jail if you tried to do that yourself. Also I'm too lazy for my porn habits to be secret enough, LOL.
I don't think you understand. You're the fiend here. The kind of obnoxious shit that thinks it's in their right to watch after others' morality.
I wonder: if I tried to report you and someone actually followed through (unlikely, of course, without anything specific to report), hypothetically, what instances of stalking and privacy violations would they find?
You really seem the kind.
Thank you. Focusing on the harm to the victims is the right way to understand this issue. Too many people in here are hunting for a semantic loophole.