this post was submitted on 02 Jul 2025
328 points (97.7% liked)


Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.

[–] lath@lemmy.world 39 points 1 day ago (2 children)

Schools generally mean underage individuals are involved, which makes any such content CSAM. So in effect, the "AI" companies are generating a ton of CSAM and nobody is doing anything about it.

[–] LostXOR@fedia.io 17 points 1 day ago (3 children)

Do deepfake explicit images created from a non-explicit image actually qualify as CSAM?

[–] lath@lemmy.world 6 points 1 day ago

I don't know personally. The admins of the fediverse likely do, considering it's something they've had to deal with from the start. So, they can likely answer much better than I might be able to.

[–] lka1988@lemmy.dbzer0.com 2 points 22 hours ago* (last edited 22 hours ago)

I would consider that as qualifying, because it's targeted harassment in a sexually explicit manner. All the girl would have to do is claim it's her.

Source: I'm a father of teenage daughters. I would pursue the individual(s) who started it and make them regret their choices.

[–] surewhynotlem@lemmy.world -2 points 1 day ago

Drawing a sexy cartoon that looks like an adult, with a caption that says "I'm 12", counts. So yeah, probably.

[–] wewbull@feddit.uk 4 points 1 day ago (4 children)

Disagree. Not CSAM when no abuse has taken place.

That's my point.

[–] Zak@lemmy.world 15 points 1 day ago (1 children)

I think generating and sharing sexually explicit images of a person without their consent is abuse.

That's distinct from generating an image that looks like CSAM without the involvement of any real child. While I find that disturbing, I'm morally uncomfortable criminalizing an act that has no victim.

[–] kemsat@lemmy.world 4 points 1 day ago

Harassment sure, but not abuse.

[–] atomicorange@lemmy.world 6 points 1 day ago (2 children)

If someone put a camera in the girls’ locker room and distributed photos from that, would you consider it CSAM? No contact would have taken place so the kids would be unaware when they were photographed, is it still abuse?

If so, how is the psychological effect of a convincing deepfake any different?

[–] General_Effort@lemmy.world 6 points 23 hours ago

If someone puts a camera in a locker room, that means that someone entered a space where you would usually feel safe. It implies the potential of a physical threat.

It also means that someone observed you when you were doing "secret" things. One may feel vulnerable in such situations. Even a seasoned nude model might be embarrassed to be seen while changing, maybe in a dishevelled state.

I would think it is very different. Unless you're only thinking about the psychological effect on the viewer.

[–] BombOmOm@lemmy.world 5 points 1 day ago* (last edited 1 day ago) (1 children)

Taking secret nude pictures of someone is quite a bit different than... not taking nude pictures of them.

It's not CSAM to put a picture of someone's face on an adult model and show it to your friend. It's certainly sexual harassment, but it isn't CSAM.

[–] atomicorange@lemmy.world 0 points 1 day ago (1 children)

How is it different for the victim? What if they can’t tell if it’s a deepfake or a real photo of them?

[–] BombOmOm@lemmy.world 4 points 1 day ago* (last edited 1 day ago) (1 children)

It's absolutely sexual harassment.

But, to your question: you can't just say something has underage nudity when the nudity is of an adult model. It's not CSAM.

[–] atomicorange@lemmy.world 6 points 1 day ago

Yes, it’s sexual abuse of a child, the same way taking surreptitious locker room photos would be. There’s nothing magical about a photograph of real skin vs a fake. The impact to the victim is the same. The impact to the viewer of the image is the same. Arguing over the semantic definition of “abuse” is getting people tangled up here. If we used the older term, “child porn” people wouldn’t be so hesitant to call this what it is.

[–] lath@lemmy.world 5 points 1 day ago

There's a thing that was happening in the past. I'm not sure it's still happening, due to the lack of news about it. It was something called "glamour modeling," I think, or an extension of it.

Basically, official/legal photography studios took pictures of child models in swimsuits and revealing clothing, at times in suggestive positions and sold them to interested parties.

Nothing untoward directly happened to the children. They weren't physically abused. They were treated as regular fashion models. And yet, it's still CSAM. Why? Because of the intention behind making those pictures.

The intention to exploit.

[–] lka1988@lemmy.dbzer0.com 4 points 22 hours ago* (last edited 22 hours ago)

Except, you know, the harassment and abuse of said deepfaked individual. Which is sexual in nature. Sexual harassment and abuse of a child using materials generated based on the child's identity.

Maybe we could have a name for it. Something like Child-based sexual harassment and abuse material... CSHAM, or maybe just CSAM, you know, to remember it more easily.