this post was submitted on 26 May 2025
-86 points (14.8% liked)

Technology


Do you think AI is, or could become, conscious?

I think AI might one day emulate consciousness to a high level of accuracy, but that wouldn't mean it would actually be conscious.

This article mentions a Google engineer who "argued that AI chatbots could feel things and potentially suffer". But surely, in order to "feel things", you would need a nervous system, right? When you feel pain from touching something very hot, it's your nerves that send those pain signals to your brain... right?

[–] Opinionhaver@feddit.uk 0 points 1 week ago (2 children)

First, one needs to define consciousness. What I mean by it is that it feels like something to be a given system from a subjective perspective - that there are qualia to experience.

So what I hear you asking is whether it's conceivable that it could feel like something to be an AI system. Personally, I don't see why not - unless consciousness is substrate-dependent, meaning there's something inherently special about biological "wetware," i.e. brains, that can't be replicated in silicon. I don't think that's the case, since both are made of matter. I highly doubt there's consciousness in our current systems, but at some point there very likely will be - though we'll probably start treating these systems as conscious beings before they actually become so.

As for the idea of “emulated consciousness,” that doesn’t make much sense to me. Emulated consciousness is real consciousness. It’s kind of like bravery - you can’t fake it. Acting brave despite being scared is bravery.

[–] amelia@feddit.org 1 points 1 week ago

You're getting downvoted, but I absolutely agree. I don't understand why "AI algorithms are just math, therefore they can't have consciousness" seems to be the predominant view, even among people interested in the topic. I haven't heard a single convincing argument for why "math" is fundamentally different from what human brains do. Sure, current AI is far less complex and lacks a continuous stream of perceptual input. But that's something a "proper" humanoid robot would need to have anyway, and processing power will keep increasing as well.

[–] technocrit@lemmy.dbzer0.com 0 points 1 week ago* (last edited 1 week ago)

I don’t think that’s the case, since both are made of matter.

lmao. How about an anti-matter "AI"? Dark matter? Any other options for physical materials?