This was bound to happen. Neural networks are inherently analog processes, and simulating them digitally is massively expensive in hardware and power.
The digital domain is good for exact computation; analog is better suited to the approximate computation that neural networks require.
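To make "approximate computation" concrete, here's a toy NumPy sketch (the multiplicative noise model is invented, not any real device's) showing that injecting analog-style error into every multiply-accumulate barely moves the output of a small random network:

```python
import numpy as np

rng = np.random.default_rng(0)

def analog_matmul(x, w, noise=0.01):
    """Dot product with multiplicative noise on every product term,
    a crude stand-in for analog device error (hypothetical model)."""
    products = x[:, None] * w                  # shape (in, out)
    products *= 1 + noise * rng.standard_normal(products.shape)
    return products.sum(axis=0)

# Tiny two-layer network with random weights.
w1 = rng.standard_normal((64, 32)) / 8
w2 = rng.standard_normal((32, 10)) / 6
x = rng.standard_normal(64)

def forward(matmul):
    h = np.maximum(matmul(x, w1), 0)           # ReLU
    return matmul(h, w2)

exact = forward(lambda a, b: a @ b)            # exact digital MAC
noisy = forward(analog_matmul)                 # noisy "analog" MAC
print("max relative error:", np.max(np.abs(noisy - exact) / (np.abs(exact) + 1e-9)))
print("same argmax (predicted class):", exact.argmax() == noisy.argmax())
```

The point is just that the network's decision usually survives percent-level arithmetic error, which exact digital logic pays a lot of transistors and energy to avoid.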
You might benefit from watching Hinton's lecture; much of it details the technical reasons why digital is much better than analog for intelligent systems.
BTW, that is the opposite of what he set out to prove: he says the facts forced him to change his mind.
https://m.youtube.com/watch?v=IkdziSLYzHw
Thank you for the link; it was very interesting.
Even though analogue neural networks have the drawback that you can't copy the neuron weights off the device (at least with current technology, though that may evolve), they could still find use cases in lower-powered edge devices. The copying problem is sketched below.
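Roughly why copying fails, assuming a simple multiplicative device-variation model (the class and numbers here are purely illustrative): each chip realizes the nominal weights with its own fixed fabrication error, so shipping the same weight file to a second chip doesn't reproduce the first chip's behavior.

```python
import numpy as np

rng = np.random.default_rng(1)

class AnalogDevice:
    """Hypothetical model: each chip stores nominal weights, but its
    conductances carry a fixed, chip-specific fabrication error."""
    def __init__(self, nominal_w, variation=0.05):
        self.effective_w = nominal_w * (1 + variation * rng.standard_normal(nominal_w.shape))

    def infer(self, x):
        return x @ self.effective_w

nominal = rng.standard_normal((16, 4))
chip_a = AnalogDevice(nominal)
chip_b = AnalogDevice(nominal)   # "copying" only transfers nominal weights

x = rng.standard_normal(16)
print(chip_a.infer(x))
print(chip_b.infer(x))           # differs: chip B has its own variations
```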
I think we'll probably end up with hybrid designs, using digital logic for most of the system and analog circuits only for the heavy calculations.
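Here's my guess at what that split could look like, as a sketch: digital logic handles quantization (DAC/ADC) and the activation function, while a simulated analog crossbar does only the matrix-vector multiply. The bit widths and noise level are made-up assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def analog_crossbar(x_analog, w, noise=0.02):
    """Simulated analog matrix-vector multiply (the 'calculations')."""
    y = x_analog @ w
    return y * (1 + noise * rng.standard_normal(y.shape))

def hybrid_layer(x_digital, w, scale):
    """Digital parts: quantize the input (8-bit DAC), run the analog
    MAC, then digitize the result (coarse ADC) and apply ReLU digitally."""
    x_q = np.clip(np.round(x_digital / scale), -128, 127)   # 8-bit DAC
    y_analog = analog_crossbar(x_q * scale, w)
    y_q = np.round(y_analog * 256) / 256                    # coarse ADC
    return np.maximum(y_q, 0)                               # digital ReLU

w = rng.standard_normal((32, 8)) / 6
x = rng.standard_normal(32)
print(hybrid_layer(x, w, scale=0.05))
```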
For low-power neural nets, look up "spiking neural networks".
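For anyone curious, the basic unit of those is the leaky integrate-and-fire neuron; here's a textbook version (parameters arbitrary, Euler integration). The energy argument is that downstream work only happens on the sparse spike events, not on every value every cycle.

```python
import numpy as np

def lif_neuron(input_current, dt=1.0, tau=20.0, v_thresh=1.0, v_reset=0.0):
    """Leaky integrate-and-fire neuron. The membrane potential leaks
    toward 0, integrates input, and emits a spike when it crosses
    v_thresh; spikes are the only events downstream work reacts to."""
    v = 0.0
    spikes = []
    for i in input_current:
        v += dt / tau * (-v + i)      # leak + input integration
        if v >= v_thresh:
            spikes.append(1)
            v = v_reset               # reset after spike
        else:
            spikes.append(0)
    return spikes

# Constant drive above threshold produces a regular spike train.
current = np.full(100, 1.5)
train = lif_neuron(current)
print("spike count:", sum(train))
```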