
[–] Allero@lemmy.today 9 points 1 week ago (21 children)

That's exactly how they work. According to many articles I've seen in the past, one of the most common models used for this purpose is Stable Diffusion. As far as we know, this model was never fed any CSAM, but it seems to be good enough for people to get off - which is exactly what matters.

[–] DoPeopleLookHere@sh.itjust.works -2 points 1 week ago (20 children)

How can it be trained to produce something without human input?

To verify its models are indeed correct, some human has to sit and view the output.

Will that be you?

[–] TheRealKuni@midwest.social 5 points 1 week ago (18 children)

"How can it be trained to produce something without human input?"

It wasn’t trained to produce every specific image it produces. That would make it pointless. It “learns” concepts and then applies them.

No one trained AI on material of Donald Trump sucking on feet, but it can still generate it.

[–] DoPeopleLookHere@sh.itjust.works -2 points 1 week ago (2 children)

It was able to produce that because enough images of both feet and Donald Trump exist.

How would it know what young genitals look like?

[–] JuxtaposedJaguar@lemmy.ml 3 points 1 week ago (1 children)

You could probably make some semi-realistic drawings and feed those in, and then re-train the model with those same images over and over until the model is biased to use the child-like properties of the drawings but the realism of the adult pictures. You could also feed the most CP-looking images from a partially trained model as the training data of another model, which over time would make the outputs approach the desired result.

[–] DoPeopleLookHere@sh.itjust.works -1 points 1 week ago (1 children)

But to know if it's accurate, someone has to view and compare....

[–] JuxtaposedJaguar@lemmy.ml 4 points 1 week ago (1 children)

It doesn't matter if it's accurate or not as long as pedos can get off to it, so just keep going until they can. According to our definition of what a pedophile is, though, it would likely be accurate.

[–] DoPeopleLookHere@sh.itjust.works 0 points 1 week ago (1 children)

But if it's not accurate, will pedos jerk off to it?

[–] JuxtaposedJaguar@lemmy.ml 2 points 1 week ago (1 children)

Probably not, but that's irrelevant. The point is that no one needs to harm a child to find out if the output is sufficiently arousing.

[–] DoPeopleLookHere@sh.itjust.works 0 points 1 week ago (1 children)

But how does it get more authentic without actual input of what's accurate?

It's not enough to tell an AI that something's wrong. You also have to tell it what was right.

[–] JuxtaposedJaguar@lemmy.ml 2 points 1 week ago

It doesn't need to get more authentic, it just needs to get more arousing, and we have a perfectly ethical way to measure that. You tell the AI it was "right" if the pedos you show it to get aroused.

[–] JuxtaposedJaguar@lemmy.ml 3 points 1 week ago (1 children)

If you train a model on 1,000,000 images of dogs and 1,000,000 images of cats, your output isn't going to be a 50/50 split of purely dogs and purely cats, it's going to be (on average) somewhere between a cat and a dog. At no point did you have to feed in pictures of dog-cat hybrids to end up with that model.

[–] DoPeopleLookHere@sh.itjust.works -1 points 1 week ago (1 children)

Yes, but you start with the basics of a cat and a dog. So you start with adult genitals and...

[–] JuxtaposedJaguar@lemmy.ml 3 points 1 week ago (1 children)

Non-pornographic pictures of children and/or human-made pornographic drawings of children.

[–] DoPeopleLookHere@sh.itjust.works -1 points 1 week ago (1 children)
[–] JuxtaposedJaguar@lemmy.ml 2 points 1 week ago (1 children)

"Okay" in what sense? If you mean morally, then I think that's pretty clear cut. If you mean legally, then that's just a technicality.

[–] DoPeopleLookHere@sh.itjust.works 0 points 1 week ago (1 children)

Totally ethical, thousands of photos of drawings of children in sexual contexts.

Legality is just a technicality.

Okay there, bud.

[–] JuxtaposedJaguar@lemmy.ml 2 points 1 week ago (1 children)

Why would "thousands of photos of drawings of children in sexual contexts" be unethical?

[–] DoPeopleLookHere@sh.itjust.works 0 points 1 week ago (1 children)

Because they're barely legal in certain places?

[–] JuxtaposedJaguar@lemmy.ml 3 points 1 week ago (1 children)

Plenty of moral things are illegal or barely legal in certain places. For example, homosexual adults having consensual sex with each other in their own home. I assume you don't think that's unethical or immoral?

I'm not saying legality equals ethics.

I'm saying there's no practical way to assemble that much material without exploitation at some level.
