Allero

[–] Allero@lemmy.today 1 points 1 week ago (4 children)

Some form of digital signatures for allowed services?

Sure, it will limit the choice of where to legally generate content, but it should work.

[–] Allero@lemmy.today 1 points 1 week ago

Thanks for context!

[–] Allero@lemmy.today 2 points 1 week ago

The thing is, banning is also a consequential action.

And based on what we know about similar behaviors, having an outlet is likely to be good.

Here, the EU takes an approach of "banning just in case" while also ignoring the potential implications of such bans.

[–] Allero@lemmy.today 1 points 1 week ago (2 children)

Aha, I see. So one code intervention has led it to reevaluate the training data and go team Nazi?

[–] Allero@lemmy.today 3 points 1 week ago

Eh, I knew something was fishy - otherwise it would be such a great option for a compression algorithm!
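(The reason it can't work as compression is a simple counting argument: a Library of Babel "address" has to single out one page among all possible pages, so on average it must be at least as long as the page text itself. A minimal sketch of the pigeonhole argument, with the bit-length `n` chosen arbitrarily for illustration:)

```python
# Pigeonhole sketch: there are 2**n distinct n-bit strings, but only
# 2**n - 1 strings strictly shorter than n bits (lengths 0 through n-1).
# So no lossless scheme can shorten *every* input - an address that
# uniquely names a page can't be systematically shorter than the page.
n = 8
inputs = 2 ** n                                   # all n-bit strings
shorter_outputs = sum(2 ** k for k in range(n))   # all strictly shorter strings
assert shorter_outputs < inputs                   # 255 < 256: at least one collision
print(inputs, shorter_outputs)
```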

No luck again :D

Thanks for elaborating!

[–] Allero@lemmy.today 21 points 1 week ago* (last edited 1 week ago) (11 children)

"Bizarre phenomenon"

"Cannot fully explain it"

Seriously? Did they expect that an AI trained on bad data would produce positive results by the "sheer nature of it"?

Garbage in, garbage out. If you train AI to be a psychopathic Nazi, it will be a psychopathic Nazi.

[–] Allero@lemmy.today 2 points 1 week ago (2 children)

Honestly, I was not able to retrieve information using those coordinates (hexagon number, wall, shelf, volume, page). Gonna play around with it more - maybe I'm missing something.

[–] Allero@lemmy.today 19 points 1 week ago

As an advocate for the online and offline safety of children, I did read into the research. None of the research I've found confirms, with any sort of evidence, that AI-generated CSAM material increases the risk of other illicit behavior. We need more evidence, and I do recommend exercising caution with such statements, but for the time being we can rely on studies of other forms of illegal behavior and the effects of their decriminalization, which paint a fairly positive picture. Generally, people tend to opt for what is legal and more readily accessible - and we can make AI CSAM exactly that.

For now, people are criminalized for a zero-evidence-it's-even-bad crime, while I tend to look quite positively on what it could bring to the table instead.

Also, pedophiles are not human trash, and this line of thinking is also harmful, making more of them hide and never get adequate help from a therapist, increasing their chances of offending. Which, well, harms children.

They are regular people who, involuntarily, have their sexuality warped in a way that includes children. They never chose it, they cannot do anything about the attraction itself, and can only figure out what to do with it going forward. You could be one, I could be one. What matters is the decisions they make based on their sexuality. The correct way is celibacy and refusal of any source of direct harm towards children, including the consumption of real CSAM. This might be hard on many, and to aid them, we can provide fictional materials so they can let off some steam. Otherwise, many are likely to turn to real CSAM as a source of satisfaction, or even to actually abusing children IRL.

[–] Allero@lemmy.today 4 points 1 week ago

As with much of modern AI - it's able to train without much human intervention.

My point is, even if the results are not perfectly accurate and don't closely resemble a child's body, they work. They are widely used - in fact, so widely that Europol made a giant issue out of it. People get off to whatever it manages to produce, and that's what matters.

I do not care how accurate it is, because I'm not the one consuming this content. I care about how effective it is at curbing worse desires in pedophiles, because I care about the safety of children.

[–] Allero@lemmy.today 9 points 1 week ago (21 children)

That's exactly how they work. According to many articles I've seen in the past, one of the most common models used for this purpose is Stable Diffusion. For all we know, this model was never fed any CSAM material, but it seems to be good enough for people to get off to - which is exactly what matters.
