Honestly I think we need to understand that this is no different to sticking a photo of someone's head on a porn magazine photo. It's not real. It's just less janky.
I would categorise it as sexual harassment, not abuse. Still serious, but a different level.
A school setting generally means it involves underage individuals, which makes any such content CSAM. So in effect, the "AI" companies are generating a ton of CSAM and nobody is doing anything about it.
Do deepfake explicit images created from a non-explicit image actually qualify as CSAM?
I don't know personally. The admins of the fediverse likely do, considering it's something they've had to deal with from the start. So, they can likely answer much better than I might be able to.
I would consider that as qualifying, because it's targeted harassment in a sexually explicit manner. All the girl would have to do is claim it's her.
Source: I'm a father of teenage daughters. I would pursue the individual(s) who started it and make them regret their choices.
Disagree. Not CSAM when no abuse has taken place.
That's my point.
I think generating and sharing sexually explicit images of a person without their consent is abuse.
That's distinct from generating an image that looks like CSAM without the involvement of any real child. While I find that disturbing, I'm morally uncomfortable criminalizing an act that has no victim.
Harassment sure, but not abuse.
If someone put a camera in the girls’ locker room and distributed photos from that, would you consider it CSAM? No contact would have taken place, and the kids would be unaware they were being photographed. Is it still abuse?
If so, how is the psychological effect of a convincing deepfake any different?
If someone puts a camera in a locker room, that means that someone entered a space where you would usually feel safe. It implies the potential of a physical threat.
It also means that someone observed you when you were doing "secret" things. One may feel vulnerable in such situations. Even a seasoned nude model might be embarrassed to be seen while changing, maybe in a dishevelled state.
I would think it is very different. Unless you're only thinking about the psychological effect on the viewer.
Taking secret nude pictures of someone is quite a bit different than... not taking nude pictures of them.
It's not CSAM to put a picture of someone's face on an adult model and show it to your friend. It's certainly sexual harassment, but it isn't CSAM.
There's a thing that was happening in the past. Not sure it's still happening, due to the lack of news about it. It was something called "glamour modeling," I think, or an extension of it.
Basically, official/legal photography studios took pictures of child models in swimsuits and revealing clothing, at times in suggestive positions, and sold them to interested parties.
Nothing untoward directly happened to the children. They weren't physically abused. They were treated as regular fashion models. And yet, it's still CSAM. Why? Because of the intention behind making those pictures.
The intention to exploit.
Except, you know, the harassment and abuse of said deepfaked individual. Which is sexual in nature. Sexual harassment and abuse of a child using materials generated based on the child's identity.
Maybe we could have a name for it. Something like Child-based sexual harassment and abuse material... CSHAM, or maybe just CSAM, you know, to remember it more easily.
Yes, finding out that your peers have been sharing deepfake pornography of you is absolutely fine and a normal thing for young girls to go through in school. No girls have ever killed themselves because of this exact sort of thing, surely. This definitely will not add in any way to the way women and girls are made to feel entirely disgustingly dehumanized by every man or boy in their lives. Groups of men and boys reducing them and their bodies down to vivid sexual fantasies that they can quickly generate photorealistic images of.
If the person in the image is underage, then it should be classified as child pornography. If the woman whose photo is being used hasn't consented to this, then it should be classified as sexual exploitation.
Women and girls have faced degrees of this kind of sexual exploitation by men and boys since the latter half of the 20th century. But this is a severe escalation in that behavior. It should be illegal to do this and it should be prosecuted when and where it is found to occur.
It's bullying with a sexual element. The fact that it uses AI or deepfakes is secondary, just as it was secondary when it was Photoshop, just as it was secondary when it was cutting out photos. It's always about using it to bully someone.
This is different because it's easier. It's not really different because it can be more realistic; it was never about being realistic, otherwise blatantly unrealistic images wouldn't have been used to do it. Indeed, the fact that it can be realistic will help blunt the impact of the leaking of real nudes.
Historically, the respectability of a woman depended on her sexuality. In many conservative cultures and communities, that is still true. Spreading the message that deepfakes are some particular horrible form of harassment reinforces that view.
If having your head put on the body of a nude model is a terrible crime, then what does that say about the nude model? What does it say about women who simply happen to develop a larger bosom or lips? What does it say about sex before marriage?
The implicit message here is simply harmful to girls and women.
That doesn't mean that we should tolerate harassment. But it needs to be understood that we can do no more to stop this kind of harassment than we can do to stop any other kind.
This is just apologia for the sexual commodification and exploitation of girls and women. There literally is no girl being sexually liberated here, she has literally had the choice taken from her. Sexual liberation does NOT mean "boys and men can turn all women into personal masturbation aids". This ENFORCES patriarchy and subjugation of women. It literally teaches girls that their bodies do not belong to them, that it's totally understandable for boys to strip them of humanity itself and turn them into sex dolls.
The most deepfaked women are certainly actresses or musicians; attractive people that appear on screens and are known by much of the population.
In some countries, they do not allow people to appear on-screen exactly because of that. Or at least, that's one justification. If the honor or humanity of a woman depends on sexual feelings that she might or might not arouse in men, then women cannot be free. And men probably can't be free either.
At no point have I claimed that anyone is being liberated here. I do not know what will happen. I'm just pointing out how your message is harmful.
Sexual attraction doesn't necessarily involve dehumanization. Unlike most other kinds of interest in a human being, it doesn't require interest in their personality, but these are logically not the same.
In general you are using emotional arguments for things that work not through emotion, but through literal interpretation. That's like using metric calculations for a system that expects imperial. Utterly useless.
No, it's not. It's literally a photorealistic drawing based on a photo (and a dataset to make the generative model). No children have been abused to produce it. Laws work literally.
No, because the woman is not being literally sexually exploited. Her photo being used without consent is, I think, the subject of some laws. There are no new fundamental legal entities involved.
I think I agree. But it's neither child pornography nor sexual exploitation and can't be equated to them.
There are already existing laws for such actions, similar to using a photo of the victim and a pornographic photo, paper, scissors, pencils and glue. Or, if you think the situation is radically different, there should be new punishable crimes introduced.
Otherwise it's like punishing everyone caught driving while drunk for non-premeditated murder. One is not the other.
Hey so, at least in the US, drawings can absolutely be considered CSAM.
Well, US laws are all bullshit anyway, so makes sense
Normally yeah, but why would you want to draw sexual pictures of children?
Suppose I'm a teenager attracted to people my age. Or suppose I'm medically a pedophile, which is not a crime, and then I would need that.
In any case, for legal and moral purposes "why would you want" should be answered only with "not your concern, go eat shit and die".
I feel like you didn't read my comment thoroughly enough. I said it can constitute CSAM. There is a surprising amount of leeway for teenagers, of course.
But no, I'm not gonna let you get away that easily. I want to know why you think it's morally okay for an adult to draw sexually explicit images of children. Please, tell me how that's okay?
Because morally it's not your fucking concern what others are doing in the supposed privacy of their personal spaces.
It seems to be a very obvious thing: your nose doesn't belong there and you shouldn't stick it there.
I don't need to get away from you; you're nothing.
Thank you. Focusing on the harm to the victims is the right way to understand this issue. Too many people in here are hunting for a semantic loophole.
I hope it might lead to a situation where dirty pics/vids are no longer a problem for the people in them, since they could be deepfakes. There were cases where a surfacing dirty pic was used for blackmail, ruined someone's career or got them kicked out of some committee, but since it could be a fabrication now, I hope this will be a thing of the past soon.
That could be a socially healthy place to end up at. I don't see it anytime soon though. Just look at the other response I got.
Sure. That might end up being a socially healthy place for adults to end up.
But it will never work that way for young teens. Their brains aren't done baking yet. They don't have the emotional maturity to understand that enough to be "okay with it because it's just a fake".
That's why we protect kids rather than just telling them "hey it's okay...it's only a fake."
Anyone with half a brain will certainly claim as much. Even if people don't fully believe it, it will blunt the most serious of social consequences.
I'm not even going to begin describing all the ways that what you just said is fucked up.
I'll just point out that online deepfake technology is FAR more accessible to the average 13-year-old to use on their peers than "porno mags" were in our day.
You want to compare taking your 13-year-old classmate's photo off of Facebook, running it through an AI, and in five seconds creating photo-realistic adult content featuring them, to getting your dad's skin-mag from under his mattress when he's not home, cutting your classmate's face out of a yearbook, taping it on, then sneaking THAT into the computer lab at school so that you can photocopy it and pass it around in home room, and then putting the skin-mag BACK under the mattress before your dad finds out?
Is that right... is THAT what you're trying to say? Are those the two things that you're trying to say are equivalent?
Yes, we all know it's fucked up. The point is that we don't need a new class of laws just because it's harassment and bullying ✨with AI✨.
Furthermore, we generally assume malicious intent, but I wouldn't be surprised if teenagers were using the app to 'get' big boobs etc. We've all seen those shopped pictures with the deformed backgrounds 😁