I've had to interact with too many people who say this with a straight face.
I think it depends a lot on a person's individual knowledge. If you keep studying far enough away from your main area of expertise, there'll still be some point where you stop and have to blindly accept that something "just works", but it will no longer feel like that's what your main field is based upon.
Imagine a chef. You can be an OK chef just by memorizing facts and getting a "feel" for how recipes work. Many chefs study chemistry to better understand how various cooking/baking processes work. A few might even get into the physics underlying the chemical reactions just to satisfy curiosity. But you don't have to keep going all the way down to subatomic particles before you lose the feeling that cooking is based on mysterious unknowns.
For my personal interest, I've learned about compilers, machine code, microcode and CPU design, down to transistor-based logic. Most of this isn't directly applicable to modern programming, and my knowledge still ends at a certain point, but programming itself no longer feels like it's built on a mystery.
I don't recommend that every programmer go to this extreme, but we don't have to feel that our work is based on "magic smoke" if we really don't want to.
ADDED: If anyone's curious, I highly recommend Ben Eater's YouTube videos about "Building an 8-bit breadboard computer!" It's a playlist/course that covers pretty much everything, starting from an overview of oscillators and logic gates and ending with a simple but functional computer, including a CPU core built out of discrete components. He uses a lot of ICs, but he usually explains what circuits they contain, in isolation, before he adds them to the CPU. He does a great job of covering the important points and tying them together well.
So this is a list of responses given by AI when you correct it? My guess was "Things you will never hear from a client when you politely point out a logical inconsistency, an incorrect assumption, or a wild over/underestimation in their project plan." 'Cause in my experience the response you will get, 99% of the time, is "That won't happen."
ASM doesn't care about your variable types, because it doesn't care about your variables. What's a variable, anyway? There is only address space.
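To make that concrete, here's a minimal C sketch (my own made-up example, not from the thread): the same four bytes at one address, read first as an int and then as a float. At the machine level there's no type attached to those bytes, just an address and whatever you choose to do with it.

    /* Assumes int and float are both 4 bytes, as on most common platforms. */
    #include <stdio.h>
    #include <string.h>

    int main(void) {
        int   i = 0x40490FDB;   /* IEEE-754 bit pattern of (roughly) pi */
        float f;

        /* memcpy is the well-defined way to reinterpret raw bytes in C. */
        memcpy(&f, &i, sizeof f);

        printf("the bytes at %p read as int: %d, as float: %f\n",
               (void *)&i, i, f);
        return 0;
    }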
Only Winsocks.
I don't think that even C++ is that bad. Like a lot of shows and music acts, I think it's more the toxic fan base than the thing itself that really sucks. I've had the same feeling with a certain kind of JavaScript programmer.
*Edit for clarity: I'm not saying that the entire C++ community is toxic, just a vocal segment of it, in line with the other examples I gave.
The added difficulty with this in programming is that it can be much harder simply to ignore them, because you may be forced to work with them, or stuck needing to learn something from them (shudder).
Tail recursion in particular is usually just turned back into a loop by the compiler, and typical modern architectures implement a call stack at the hardware level, which allows limited-depth recursion but breaks, as in the OP, if you try to go too deep.
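A quick sketch of both points, assuming a compiler like GCC or Clang (the function names are mine, purely for illustration): a tail-recursive sum and the loop an optimizer effectively rewrites it into. Built with -O2 the tail call becomes a jump and the depth no longer matters; with optimization off, a deep enough call chain overflows the call stack, which is exactly the failure mode above.

    #include <stdio.h>

    /* Tail-recursive: the recursive call is the last thing the function
       does, so no work is pending when the call happens. */
    static unsigned long long sum_rec(unsigned long long n,
                                      unsigned long long acc) {
        if (n == 0)
            return acc;
        return sum_rec(n - 1, acc + n);   /* tail call */
    }

    /* The loop an optimizing compiler effectively turns it into. */
    static unsigned long long sum_loop(unsigned long long n) {
        unsigned long long acc = 0;
        while (n != 0) {
            acc += n;
            n -= 1;
        }
        return acc;
    }

    int main(void) {
        /* At this depth, an unoptimized build will likely blow the stack;
           with -O2, the tail call compiles to a jump and it runs fine. */
        printf("%llu %llu\n", sum_rec(1000000, 0), sum_loop(1000000));
        return 0;
    }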
Yes, in my experience this is what the term "recursion" means in a programming context; it doesn't usually refer to a mathematical ideal. That was what tripped me up.
Could someone expand a little on this statement, or point me toward an applicable resource? How do "real" (modern?) CPUs prevent unwanted recursion? As in, not the compiler or the OS, but the CPU itself? I've been searching for a while now but I haven't found anything that clears this up for me.
Despite this, I still bet that they post "nvm fixed it" an hour or two later.
Putting aside the fact that the majority of commercial games of the time were written in assembly (or other low-level languages) just as a matter of course, I strongly suspect that the game was written in assembly for execution speed, not to save cassette space. Regular audio cassettes easily held enough data to fill an average 8-bit home computer's memory many times over, whether that data was machine code or tokenized BASIC.
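A rough back-of-envelope, with very hedged numbers since loader speeds varied a lot from machine to machine:

    ~1200 bits/s for a fairly quick cassette loader, ~10 bits per byte
        -> ~120 bytes/s
    one 30-minute cassette side = 1800 seconds
        -> ~120 bytes/s x 1800 s = ~216 KB per side
    versus 16-64 KB of RAM in a typical 8-bit home computer

Even a slow ~300-baud loader (~30 bytes/s, so ~54 KB per side) could hold a full memory image of many of those machines on a single side.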