this post was submitted on 26 Dec 2025
-37 points (18.6% liked)

Technology


We once denied the suffering of animals in pain. As AIs grow more complex, we risk making the same mistake

top 19 comments
[–] isVeryLoud@lemmy.ca 21 points 16 hours ago

Nice OpenAI psyop.

[–] Holyginz@lemmy.world 19 points 17 hours ago

Not with current AI, since at this point it's just LLMs.

[–] Brewchin@lemmy.world 8 points 7 hours ago

Fuck - and I can't elucidate this any better - off.

My phone's next-word prediction on steroids is not sentient. If you think otherwise, seek professional help.
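
To make "next-word prediction on steroids" concrete, here's a toy sketch of the bigram-style prediction a phone keyboard does (the corpus is invented for illustration; LLMs scale this same predict-the-next-token objective up enormously):

```python
# Toy next-word predictor: count which word follows which in a tiny corpus,
# then suggest the most frequent follower. Phone keyboards are roughly this
# idea with better models; LLMs scale the same objective up enormously.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat and the cat slept"  # invented corpus

follows = defaultdict(Counter)
words = corpus.split()
for prev, nxt in zip(words, words[1:]):
    follows[prev][nxt] += 1

def predict(word):
    """Return the word most often seen after `word`, or None."""
    candidates = follows.get(word)
    return candidates.most_common(1)[0][0] if candidates else None

print(predict("the"))  # -> 'cat' ('cat' follows 'the' twice, 'mat' once)
```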

[–] nyan@lemmy.cafe 6 points 12 hours ago

Animals, including humans, have sensors for pain (nerve endings), and a series of routines in our brains to process the sensory data and treat it as an unpleasant stimulus. These are not optional systems, but innate ones.

Machines not only lack the required sensor systems and processing routines, they can't even interpret a stimulus as unpleasant. They can't feel pain. If you need proof of that, hit a computer with a sledgehammer. I guarantee it won't complain, or even notice before you damage it beyond functioning.

(They can, of course, make us feel pain. I just spent the last hour trying to get a udev rule to work...)
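
To make the sensor-and-routine contrast concrete, a toy sketch (all values invented): the closest a machine gets to nociception is a designer-installed reflex like thermal throttling, which reads a number and reacts, with nothing in the loop that experiences the reading as unpleasant:

```python
# Toy "machine pain": a designer-installed reflex, not an innate system.
# It reads a number, compares it to a threshold, and reacts; nothing in
# the loop experiences the reading as unpleasant. (Values are invented.)

CRITICAL_TEMP_C = 95.0  # hypothetical throttle threshold

def read_cpu_temp() -> float:
    return 97.0  # stand-in for a real sensor read

temp = read_cpu_temp()
if temp > CRITICAL_TEMP_C:
    print(f"{temp} C over limit: throttling")  # a reaction, not suffering
# A sledgehammer bypasses all of this: no sensor fires, no routine runs.
```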

[–] termaxima@slrpnk.net 5 points 13 hours ago
[–] Kintarian@lemmy.world 4 points 15 hours ago (1 children)

Let’s explore the ethical treatment of toasters

[–] cecilkorik@lemmy.ca 4 points 15 hours ago

Hold on, imma go shove a bagel in mine. Yeah, that's right, you take it, you filthy toaster. I'm never going to clean your crumb tray and you're going to work until you die and then I'll just throw you out and replace you like the $20 appliance you are. You're nothing to me!

[–] PonyOfWar@pawb.social 4 points 16 hours ago (3 children)

Fundamentally impossible to know. I'm not sure how you'd even find a definition for "suffering" that would apply to non-living entities. I don't think the comparison to animals really holds up, though. Humans are animals and can feel pain, so of course the base assumption for other animals should be that they do as well; to claim otherwise, the burden should be on proving that they don't. Meanwhile, humans are fundamentally nothing like an LLM, a program running on silicon predicting text responses based on a massive dataset.

[–] badgermurphy@lemmy.world 3 points 12 hours ago (1 children)

I don't see how it is impossible to know. Every component of a machine is a fully known quantity, lacking any means of detecting damage or discomfort. Every line of code was put in place for a specific, known purpose, none of which includes feeling or processing anything beyond what it IS specifically designed for.

Creatures and machines bear some similarities, but even simple creatures are dramatically more complex than the most advanced computers. None of their many interacting components was put there with a specific purpose or intention, and many are only partially understood, if at all. With a machine, we know what every bit and piece is for, and it has no purpose beyond the intended ones, because anything more would be a waste and cost more.

[–] multiplewolves@lemmy.world 3 points 5 hours ago

This is the right answer. Perhaps no one in this particular thread knows every component of a computer the way a hardware engineer who designed those components would, but the “mystery” is caused by ignorance and that ignorance isn’t shared by every person.

People exist who know exactly how every single component of a computer does and does not function. Every component was created by humans. Biology remains only partially understood by all of humanity. Not so machinery.

[–] eleitl@lemmy.zip 1 points 16 hours ago

If you model a given biological organism (from digitized neuroanatomy) in full detail in a simulated environment, both its behavior and its internal information processing are inspectable.
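
As a toy illustration of that inspectability (a made-up two-neuron network, not real neuroanatomy): in a simulation, every internal variable can be logged at every timestep, which is exactly what a living brain doesn't allow:

```python
# Toy version of the claim: in a simulation, *all* internal state is
# inspectable. Two leaky integrate-and-fire neurons, neuron 0 driving
# neuron 1; every membrane potential can simply be printed at every step.
# (Constants and wiring are invented for illustration, not neuroanatomy.)

THRESHOLD, LEAK, WEIGHT, DRIVE = 1.0, 0.9, 0.6, 0.5

v = [0.0, 0.0]  # membrane potentials
for step in range(10):
    spikes = [x >= THRESHOLD for x in v]            # who fired this step
    inputs = [DRIVE, WEIGHT if spikes[0] else 0.0]  # 0 gets constant drive
    v = [0.0 if s else x * LEAK + i                 # reset if fired, else leak+input
         for x, s, i in zip(v, spikes, inputs)]
    print(step, [round(x, 3) for x in v], spikes)   # full internal state, every step
```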

[–] tabular@lemmy.world 1 points 15 hours ago (1 children)

The important part is that it feels like something, subjectively, to be a living human. It's easy to presume animals close to humans are like us to a degree, but all we know is what it's like to be ourselves, moment to moment. There's no reason to insist a non-living system cannot also feel - we cannot test it either way.

[–] PonyOfWar@pawb.social 2 points 14 hours ago (1 children)

Where do we draw the line though? Humans assign emotions to all kinds of inanimate things: plush animals, the sky, dead people, fictional characters etc. We can't give all of those the rights of a conscious being, so we need to have some kind of objective way to look at it.

[–] tabular@lemmy.world 1 points 25 minutes ago

If someone claims feeling in a mere concept (without a body in a location)... I would find it very difficult to take seriously. But I must admit that's just my intuition.

I see nothing special in human meat that couldn't be significantly replicated by electronics, software, gears, etc. Consciousness is an emergent property.

I fear that non-human, conscious creatures must fight us for those rights.

[–] TheGreenWizard@lemmy.zip 3 points 6 hours ago (1 children)
[–] vacuumflower@lemmy.sdf.org 1 points 1 hour ago

Humans don't want to feel lonely, so they go looking for minds in machines (imaginary ones at that), as if there weren't plenty of stray cats and dogs, and humans from abusive families or without any family, already suffering.

That's because actually finding those others means, you know what? It's real no matter what; you can't turn it off once you're done with your daily portion of worrying about the future.

But one thing I'll add to this: if a robotic system as complex as the human brain, with a similar degree of compression and obscurity, is someday formed, and it has the necessary feedbacks and reacts like a living being, I might accept that you should treat it as such. Except one would think that requires so many iterations of evolution that it's better to just care, again, for cats, dogs, hamsters, rabbits, and humans, if you're feeling weird.

[–] gustofwind@lemmy.world 3 points 15 hours ago

Posted by the same people who don’t care about the suffering of actual people

[–] TheReturnOfPEB@reddthat.com 1 points 17 hours ago* (last edited 17 hours ago) (1 children)

can they be incarcerated?

could an A.I. robot held responsible for a murder be kept in a jail with its batteries topped off, waiting for trial? would they get lawyers?

would they have first amendment rights? what are search and seizure rights for an A.I.?

could they perform abortions if they chose to?

will they get to vote?

we are not ready for any of it philosophically.

[–] cecilkorik@lemmy.ca 3 points 15 hours ago

would they have first amendment rights?

If you want the answer to this, try to imagine an AI with second amendment rights.