this post was submitted on 26 Aug 2025
320 points (98.2% liked)

Technology

74459 readers
3073 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related news or articles.
  3. Be excellent to each other!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below, this includes using AI responses and summaries. To ask if your bot can be added please contact a mod.
  9. Check for duplicates before posting, duplicates may be removed
  10. Accounts 7 days and younger will have their posts automatically removed.

Approved Bots


founded 2 years ago
MODERATORS
you are viewing a single comment's thread
[–] killeronthecorner@lemmy.world 9 points 15 hours ago (1 children)

Because they profited from his torture and subsequent death?

To your point though, they aren't responsible in the moral sense that you're implying. However, they committed a crime when they platformed, promoted and profited from it.

[–] AwesomeLowlander@sh.itjust.works 6 points 15 hours ago (4 children)

Do we REALLY want to have platforms deciding what content is and isn't acceptable for us, though? How is this different from the current controversy involving payment processors and their removal of content they find objectionable?

[–] hobwell@sh.itjust.works 12 points 15 hours ago (1 children)

I’d argue the main difference is that it involves a crime.

I’m not completely sure that torture itself constitutes a crime (though I’d be surprised if it didn’t), but manslaughter/murder is. Aside from a few exceptions like medically assisted death, killing someone is a major crime. Presumably, we don’t want to promote people profiting from extreme suffering and death.

I also think there is a time and place for censorship (ex CSAM).

“Objectionable” is a subjective term, but “illegal“ is not.

[–] AwesomeLowlander@sh.itjust.works 7 points 14 hours ago (1 children)

There are two different parties under discussion here: the other streamers and the platform.

Regarding the streamers, I agree there might be room for a manslaughter charge. IANAL, much less in French law. Personally though, I don't see how it differs substantially from any other high risk group activity. If you're free-climbing (or maybe some other activity that involves more chance and less skill), and you're doing it voluntarily, knowing the risks, is it really fair to blame the survivors if somebody dies?

Regarding the platform, up until the point where a death actually occurred, what could they have reasonably done that would not have constituted some form of censorship? At that point, aren't we back to the censorship discussion of how much power platforms should have over the content we have access to?

[–] hobwell@sh.itjust.works 5 points 10 hours ago

I can kind of see what you are trying to say, but I don’t really agree with your conclusion.

I’d make the distinction that free climbing, while dangerous, is a recreational activity. I can reasonably conceive of people watching that for entertainment. There also isn’t anything morally questionable about it.

On the face of it, I don’t think you could reasonably argue that torture is a pastime.

All of that aside, torture is against international law. It is illegal in all circumstances.

From the United Nations Convention against Torture:

“No exceptional circumstances whatsoever may be invoked as a justification for torture.”

For that reason, I would say the platform did have an obligation to de-platform it.

Arguably, the police should probably have put a stop to it as well.

[–] SculptusPoe@lemmy.world 10 points 15 hours ago* (last edited 15 hours ago) (1 children)

People never think of that. They always clamor for censorship, always thinking censorship will go their way and censor the things they don't like. Since the police already went to where this guy was and determined that he was there of his own volition, I don't think the streaming services should have any other responsibility. The streaming services should always err on the side of not censoring anything.

[–] michaelmrose@lemmy.world 1 points 13 hours ago (1 children)

Why should they err on that side again?

[–] AwesomeLowlander@sh.itjust.works 4 points 13 hours ago (1 children)

Because we don't want them censoring anything they find objectionable. Like porn. Or abortion material. Or LGBT stuff. Need I go on?

[–] michaelmrose@lemmy.world 3 points 12 hours ago (1 children)

Almost every website has a TOS and censors some stuff against said terms. You act like it's not already normal to have standards for conduct.

[–] AwesomeLowlander@sh.itjust.works 3 points 12 hours ago

some stuff against said terms

Like Mastercard and their ban on all purchases of items that could reflect negatively on their brand. Like porn.

[–] killeronthecorner@lemmy.world 4 points 14 hours ago (1 children)

They aren't deciding, they're being held to laws that they didn't create nor necessarily agree with.

I'd assume that, given the option, they'd like this kind of thing to be legal so they can continue making money from it legitimately.

[–] AwesomeLowlander@sh.itjust.works 7 points 14 hours ago (1 children)

What? I think you've misread something.

The argument against them, as I understand it, is that they should not have allowed the streaming to happen. As this was pre-death, that would have required them to make a decision about what content they allowed that most people would consider censorship.

[–] killeronthecorner@lemmy.world 7 points 14 hours ago (1 children)

Yes, that is the law. You are required not to broadcast death and to create circumstances in which the likelihood of this is minimised.

That's not calling for censorship, because it still permits consensual harm that doesn't carry a high risk of death.

As I said earlier, your point stands: it is not for these platforms to act as moral compasses for viewers of consensual but provocative content.

However, that's irrelevant to the law which wants to avoid incentivising people dying / being killed on broadcast streams for a profit.

I think this is borne out by the fact that there will be less of a burden of blame on the service provider if this proves not to be the case.

[–] AwesomeLowlander@sh.itjust.works 5 points 14 hours ago (1 children)

I follow what you're saying. In that case, what about extreme sports that carry a statistically significant chance of fatalities? Granted that they're usually not televised, but that's probably because they're usually done out of passion. From a legal perspective, there's not much to differentiate them.

[–] killeronthecorner@lemmy.world 6 points 13 hours ago (1 children)

In those cases broadcasters take one of two roads:

  1. Don't broadcast it - many extreme sports are simply not broadcast by many, many broadcasters.

  2. Properly mitigate the risk to an acceptable level - this is done frequently for sports and other media. This is the reason you can watch Jackass and Dirty Sanchez even though the risk of death for many stunts is non-zero.

Once the death occurs though, they can only rely on their demonstration of #2 here to offset legal culpability. They are also then generally bound to remove the material and not re-air (in this case, Kick did make the content available again for whatever reason)

It seems like the road the defense will take in this particular case is to prove the death (illegal to air if preventable) was not caused by the preceding consensual torture (legal to air, seemingly).

[–] AwesomeLowlander@sh.itjust.works 5 points 12 hours ago

Thanks, that's the sort of info I was hoping to get, and food for thought. I do wonder how Jackass-genre shows would work with streaming platforms where it's obviously impractical to vet all of them. Do they just become illegal, then? Probably something that will get hammered out at some point.

[–] michaelmrose@lemmy.world 3 points 13 hours ago (1 children)

Yes, to some degree I obviously want some editorial control. For instance, I don't want people posting snuff films or child porn, and I want sites that wouldn't remove such content themselves to be taken down.

[–] AwesomeLowlander@sh.itjust.works 4 points 13 hours ago (1 children)

Those are directly and obviously illegal material, and therefore a no-brainer. This was a very different case: up until the death actually happened, nothing illegal was going on, and there was no reason to think otherwise.

[–] _cryptagion@anarchist.nexus 3 points 11 hours ago (1 children)

it's 2025 and we got people arguing people who torture someone to death on a livestream shouldn't be deplatformed.

fucking wild, man, just fucking wild.

[–] AwesomeLowlander@sh.itjust.works 1 points 11 hours ago (1 children)

Way to skip any nuance of the discussion at hand.

[–] _cryptagion@anarchist.nexus 2 points 11 hours ago (1 children)

it's a torture stream. there's no fucking nuance. you sound like the people who are on the fence about gaza.

either you think people should be able to consent to being tortured on stream to death so a corporation can profit, or you don't. there's not a middle ground. I don't give a shit whether the people doing the torturing feel like they're being censored. they're sadistic filth.

[–] AwesomeLowlander@sh.itjust.works 1 points 10 hours ago (1 children)

Yeah... Maybe read the fucking discussion points before chiming in. I love how you think I'm on the side of the fucking corporation.