this post was submitted on 30 Nov 2025
979 points (99.3% liked)

People Twitter

8612 readers
1293 users here now

People tweeting stuff. We allow tweets from anyone.

RULES:

  1. Mark NSFW content.
  2. No doxxing people.
  3. Must be a pic of the tweet or similar. No direct links to the tweet.
  4. No bullying or international politics.
  5. Be excellent to each other.
  6. Provide an archived link to the tweet (or similar) being shown if it's a major figure or a politician. Archive.is is the best way.

founded 2 years ago
[–] scintilla@crust.piefed.social -2 points 22 hours ago (7 children)

I'm not sure I like these comments. Under the presumption that an uploaded consciousness is even possible, that's still a sapient individual. I don't really think you can possibly do something to deserve eternal torment like some of these comments seem so gleeful about.

[–] bitcrafter@programming.dev 9 points 22 hours ago

I actually think that this thread is extremely useful for demonstrating the downside of obtaining immortality by uploading your consciousness: you risk placing your digital soul into the hands of a wrathful and malicious deity for all eternity.

(Though honestly, I think that uploading is overrated anyway given that I will still be dead, so how does the continued existence of a copy of me help anything?)

[–] Iheartcheese@lemmy.world 5 points 21 hours ago (1 children)

Simping for fictional tech bros. God I hate this planet.

[–] bitcrafter@programming.dev 6 points 20 hours ago (1 children)

You hate this planet because someone expressed discomfort at the idea of tormenting an uploaded consciousness for eternity? That's a weird reason, but to each their own I suppose.

[–] Iheartcheese@lemmy.world -2 points 18 hours ago (1 children)

WON'T SOMEBODY THINK OF THE FICTIONAL TECH BROS

[–] scintilla@crust.piefed.social 1 points 16 hours ago (1 children)

Me: Torture bad

You: why are you simping

Go to therapy.

[–] Iheartcheese@lemmy.world 1 points 13 hours ago (1 children)

If this kind of fictional torture is this bad to you I hope you never find out about slasher movies.

[–] Cruel@programming.dev 2 points 2 hours ago* (last edited 2 hours ago)

Them: "If I could, I'd torture and rape my enemies."

Me: "That's fucked up..."

You: "It's only hypothetical, it's a fictitious scenario!"

You're still advocating for torture. Why does the hypothetical matter?

[–] pelespirit@sh.itjust.works 4 points 22 hours ago

Some of them are dark, but you can't do this anyway. The men they're talking about need to think about their future a little bit and come back down to earth. Admittedly, I like the teenage girl playing The Sims with their character the most.

[–] AnarchistArtificer@slrpnk.net 3 points 17 hours ago

I share your consternation, but I also find it cathartic to imagine testing how long it would take the simulated consciousness of Elon Musk to develop a sense of compassion for workers when forced to work in a Tesla factory (and indeed, if that would ever happen). I imagine I'd be far less comfortable thinking this way if it wasn't a concept squarely in the domain of science fiction. I think many of the most gleeful commenters are people who are presuming that simulating a consciousness in this way isn't possible.

[–] MadMadBunny@lemmy.ca 2 points 17 hours ago* (last edited 17 hours ago) (1 children)

Do bear in mind that we’re talking about a select few persons that didn’t hesitate at all to do far, far worse to thousands of human beings.

Just google "usaid cuts impact", which was deliberately caused by these tech bros without a shred of remorse.

And the scenario in this post is a fictional one, unlike the untold destruction caused by these cruel cuts.

[–] scintilla@crust.piefed.social 2 points 16 hours ago (2 children)

I don't like tech bros. If I could snap my fingers tomorrow and kill all billionaires, I would do it. It's just bad to want to watch someone suffer, and wanting them to suffer forever is inherently immoral. No human can do so much bad that they deserve eternal torment, because humans can only live for a finite period of time and create a finite amount of harm.

[–] okwhateverdude@lemmy.world 2 points 7 hours ago

because humans can only live for a finite period of time and create a finite amount of harm.

For now.

[–] Cruel@programming.dev 1 points 2 hours ago

And killing all billionaires is not going to cause immense pain for millions of people?

[–] EnsignWashout@startrek.website 1 points 19 hours ago (1 children)

I don't really think you can possibly do something to deserve eternal torment like some of these comments seem so gleeful about.

It wouldn't be eternal. Most kids don't have that kind of attention span.

Plus, these consciousnesses might be carefully protected the way major corporations have protected my SSN. So maybe the billionaires have nothing to worry about...

[–] scintilla@crust.piefed.social 1 points 16 hours ago

Read the other comments; there are multiple people bringing up eternal simulated torture. I never said anything about the post itself. I specifically called out the comments being weird.

[–] Mika@piefed.ca 1 points 18 hours ago (1 children)

Consciousness is a process, not just data. If they copy the weights of your brain neurons, they would be able to create new life that thinks like you, but is NOT you. Although that new life would be very confused and would honestly believe that it is you.

Thus, it's neither eternal life nor eternal torment. Maybe torment of newly created AIs, but we haven't gotten to "AIs have rights" yet.

[–] scintilla@crust.piefed.social 1 points 16 hours ago (1 children)

I have a super weird view of selfhood, I think. I know that the uploaded existence wouldn't be me, but I also don't care, if that makes sense. If you've ever watched Pantheon: I would totally upload myself, even if it killed me, so that the uploaded me could exist.

[–] Mika@piefed.ca 1 points 11 hours ago

So do it? Honestly, the biggest issue is privacy; someone could analyse your dark fantasies in detail.

And yeah, I think we will be able to simulate a human brain at some point. What's more interesting is whether we would be able to continue the thought process outside the biological box. Could I escape into a machine? It doesn't guarantee a huge longevity boost, as machines bug out and break from time to time, but still.