this post was submitted on 27 May 2025
1951 points (99.5% liked)

Programmer Humor

you are viewing a single comment's thread
[–] borari@lemmy.dbzer0.com 1 points 2 days ago (1 children)

It’s true that one is based on continuous floats and the other is dynamic peaks

Can you please explain what you’re trying to say here?

[–] CanadaPlus@lemmy.sdf.org 1 points 18 hours ago (1 children)

Both have neurons with synapses linking them to other neurons. In the artificial case, a synapse's activation can be any floating-point number, and each neuron's outgoing activation is calculated from its incoming activations all at once (there's no notion of time, so it's not dynamic). Biological neurons are binary: they either fire or they don't. During a firing cycle they ramp up to a peak potential and then drop back down in a predictable fashion. But the process is dynamic; they can peak at any time, and downstream neurons can begin to fire "early".
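
As a rough illustration of that difference, here's a small Python sketch (nothing from the thread, just a toy with made-up parameter values). The first function is a standard artificial neuron: a weighted sum of its inputs pushed through a sigmoid, computed all at once with no time involved. The class below it is a heavily simplified leaky integrate-and-fire model, a common textbook stand-in for the "dynamic peaks" behaviour: its potential builds up over time steps and its output is all-or-nothing. Real biological neurons are far messier than this.

```python
import math

# Artificial neuron: no notion of time. The output is a continuous float
# computed from all inputs at once (weighted sum + activation function).
def artificial_neuron(inputs, weights, bias):
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-z))  # sigmoid activation

# Toy "biological-style" neuron: a leaky integrate-and-fire model.
# Its membrane potential evolves over time steps, and its output is
# all-or-nothing: it either spikes (1) or stays silent (0).
class LeakyIntegrateAndFire:
    def __init__(self, threshold=1.0, leak=0.9):
        self.potential = 0.0
        self.threshold = threshold
        self.leak = leak

    def step(self, input_current):
        # The potential decays a little each step, then integrates the input.
        self.potential = self.potential * self.leak + input_current
        if self.potential >= self.threshold:
            self.potential = 0.0  # reset after firing
            return 1              # spike
        return 0                  # no spike this step

# The artificial neuron answers immediately with a float...
print(artificial_neuron([0.5, 0.2], [0.8, -0.4], 0.1))

# ...while the spiking neuron has to be driven through time, and the
# timing of its spikes depends on how quickly the potential peaks.
lif = LeakyIntegrateAndFire()
print([lif.step(0.4) for _ in range(10)])
```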

They do seem to be equivalent in some way, although AFAIK it's unclear exactly how at this point, and the exact activation function of each brain neuron is still a bit mysterious.

[–] borari@lemmy.dbzer0.com 1 points 15 hours ago

Ok, thanks for that clarification. I guess I’m a bit confused as to why a comparison is being drawn between neurons in a neural network and neurons in a biological brain though.

In a neural network, a neuron receives inputs, applies a mathematical function to them, and returns an output, right?

Like you said, we have no understanding of what exactly a neuron in the brain is actually doing when it fires, and that's not even considering the chemical component of the brain.

I understand why the terminology was reused when experts were designing an architecture meant to mimic the structure of the brain. Unfortunately, now that those networks are part of the zeitgeist thanks to the explosion of LLMs and stuff, I feel like that reuse of terminology makes it harder for laypeople to understand what a neural network is and what it is not.