this post was submitted on 17 Mar 2025
Programmer Humor
An actual compsci professor would know real CPUs don't run arbitrary recursion, right? Nobody could possibly be that siloed.
Could someone expand a little on this statement, or point me toward an applicable resource? How do "real" (modern?) CPUs prevent unwanted recursion? As in, not the compiler or the OS, but the CPU itself? I've been searching for a while now but I haven't found anything that clears this up for me.
They don't prevent it, they just don't implement it. A real (physical) CPU is a fixed electrical circuit; it can't duplicate part of itself the way the ideal mathematical notion of recursion would require. If you want a simple way to visualise this, ask how you would implement recursion (as opposed to just loops) on a Turing machine with tapes.
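One way to see the point: any recursive definition can be rewritten as a loop driving an explicit, fixed-size stack, which is roughly the shape a fixed circuit (or a Turing machine with an extra tape) can actually execute. A minimal C sketch (the function names and the stack size are my own, purely for illustration):

```c
#include <assert.h>

/* The "mathematical" recursive definition. */
static unsigned long fact_recursive(unsigned n) {
    return n <= 1 ? 1UL : n * fact_recursive(n - 1);
}

/* The same computation as a loop plus an explicit stack of pending
   multipliers. Note the fixed capacity: depth is limited up front,
   just like a hardware call stack. */
static unsigned long fact_explicit_stack(unsigned n) {
    unsigned stack[64];
    int top = 0;
    while (n > 1)
        stack[top++] = n--;      /* "push" each pending frame */
    unsigned long acc = 1;
    while (top > 0)
        acc *= stack[--top];     /* "pop" and finish each frame */
    return acc;
}
```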
Different CPUs might have strategies to approximate certain kinds of recursion, but at some point my own knowledge ends, and there have been many different designs. Tail recursion in particular is usually just turned back into a loop by the compiler, and typical modern architectures implement a call stack at the hardware level, which allows limited-depth recursion but breaks like in OP if you try to go too deep.
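For the tail-recursion case, here's a sketch of what that compiler transformation amounts to (my own example, not from the thread): when the recursive call is the very last thing the function does, no state has to survive it, so the frame can simply be reused.

```c
#include <assert.h>

/* Tail call: the recursive call is the last thing the function does,
   so nothing from this frame is needed after it returns... */
static unsigned long sum_tail(unsigned n, unsigned long acc) {
    if (n == 0) return acc;
    return sum_tail(n - 1, acc + n);   /* in tail position */
}

/* ...which is why an optimizing compiler can emit the equivalent
   loop instead of growing the stack: */
static unsigned long sum_loop(unsigned n) {
    unsigned long acc = 0;
    while (n > 0) { acc += n; n--; }
    return acc;
}
```

Note this optimization is permitted but not guaranteed in C, which is part of why deep recursion there is unreliable.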
Yes, in my experience this is what the term "recursion" means in a programming context; it doesn't usually refer to a mathematical ideal. That was what tripped me up.
The basic definition would be something like: a function that is used in its own definition. It's pretty easy to find examples that aren't tail-recursive specifically, like mergesort, and examples among those that would overflow a hardware stack, like in OP. And that's without looking at (mildly) exotic examples like the Ackermann function.
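For reference, the Ackermann function mentioned above is short to write down but genuinely hard to de-recursify: for m > 0 the outer call is not in tail position (its argument is itself a recursive call), and its depth grows explosively. A sketch in C:

```c
#include <assert.h>

/* Ackermann function: the classic example of recursion that is not
   primitive-recursive and cannot be reduced to a simple loop. The
   nested call ackermann(m, n - 1) sits inside another call, so this
   is not tail recursion; depth (and the stack) blows up fast. */
static unsigned long ackermann(unsigned long m, unsigned long n) {
    if (m == 0) return n + 1;
    if (n == 0) return ackermann(m - 1, 1);
    return ackermann(m - 1, ackermann(m, n - 1));
}
```

Even tiny inputs like ackermann(4, 2) are already far beyond any realistic call stack.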
Basically, the "Please leave recursion to math and keep it out of (in particular C) software" in OP means: don't define functions in terms of themselves. It's pretty and it can work, but not reliably.