FP & OOP both have their use cases. Generally, I think people use OOP for stateful programming, and FP for stateless programming. Of course, OOP is excessive in a lot of cases, and so is FP.
OOP is more useful as an abstraction than as a programming paradigm. Real, human, non-computer "programming" is object-oriented, so people find it a natural way of organizing things. It makes more sense to say "for each dog in dogs, dog.bark()" than "map(bark, dogs)".
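To make that concrete, here's a minimal sketch in Python (the Dog class and the names are made up for illustration):

```python
class Dog:
    def __init__(self, name):
        self.name = name

    def bark(self):
        print(f"{self.name} says woof")

dogs = [Dog("Rex"), Dog("Fido")]

# OOP style: the verb hangs off the object, which reads like how we'd say it out loud.
for dog in dogs:
    dog.bark()

# FP style: the same thing, expressed as mapping a function over the collection.
list(map(Dog.bark, dogs))
```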
A good use case for OOP is machine learning. Despite the industry's best efforts to use functional programming for it, object-oriented just makes more sense. You want a set of parameters unique to each function applied to the input, which lets you call each function without passing the parameters every single time: you write "function(input)" instead of "function(input, parameters)". Then, if you are using a clever library, it keeps references to the parameters inside each function object and updates them during the optimization step. This hides how the parameters influence the result, but machine learning is a black box anyway.
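Here's a rough sketch of what I mean, in plain Python rather than any particular library (LinearLayer and sgd_step are names I made up for the example):

```python
import random

# The "function" is an object that owns its parameters, so callers write
# layer(x) rather than layer(x, weights, bias).
class LinearLayer:
    def __init__(self, n_inputs):
        self.weights = [random.gauss(0, 0.1) for _ in range(n_inputs)]
        self.bias = 0.0

    def __call__(self, inputs):
        return sum(w * x for w, x in zip(self.weights, inputs)) + self.bias

layer = LinearLayer(3)
y = layer([1.0, 2.0, 3.0])   # no parameters at the call site

# The optimizer only needs a reference to the layer object; it updates the
# parameters behind the scenes while every call site stays the same.
def sgd_step(layer, grads, lr=0.01):
    layer.weights = [w - lr * g for w, g in zip(layer.weights, grads)]

sgd_step(layer, grads=[0.5, -0.2, 0.1])
```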
In my limited use of FP, I've found it useful for manipulating basic data structures in bulk. If I need to normalize a large number of arrays, it's easy to go "map(normalize, arrays)" and call it a day. The FP-specific functions such as scan and reduce are incredibly useful, since OOP typically requires you to set up a loop and manually keep track of the intermediate results. I will admit, though, that my only real use of FP is Python list comprehensions and APL, so take whatever I say about FP with a grain of salt.
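For example, here's roughly what that looks like in Python, with a made-up normalize that rescales a list to sum to 1 (accumulate is the closest thing Python has to scan):

```python
from functools import reduce
from itertools import accumulate

def normalize(xs):
    total = sum(xs)
    return [x / total for x in xs]

arrays = [[1, 2, 3], [10, 20, 30], [5, 5]]

normalized = list(map(normalize, arrays))                        # apply to every array in bulk
grand_total = reduce(lambda acc, xs: acc + sum(xs), arrays, 0)   # fold everything into one value
running = list(accumulate(sum(xs) for xs in arrays))             # scan: keeps the intermediate totals

# The loop version of the scan, for comparison: you track the running total yourself.
running_loop = []
total = 0
for xs in arrays:
    total += sum(xs)
    running_loop.append(total)
```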
What's important to note is that there has been a big shift in the goals and techniques of education. This most famously occurred with "Common Core" math in the US. It was a push to teach math in a more intuitive way, one that directly corresponds with what children already know. You can physically add things together by putting more of them together and then counting them, so they try to teach addition with that analogy in mind.
Prior to Common Core math, there was "new math," which anyone under 80 years old assumes has always been the standard. New math was a push to teach math in a more understandable way, one that gradually introduced new concepts to ensure children understood how math works. This was satirized by Tom Lehrer in his song "New Math." If you look up the song, you'll see that new math was mostly implemented by teaching students how base-10 positional notation works, and then using that understanding to present addition and subtraction as logical algorithms.
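To make the "logical algorithm" point concrete, here's a toy sketch (my own illustration, not anything from the curriculum or the song) of digit-by-digit addition in base 10:

```python
def add_base10(a, b):
    digits_a = [int(d) for d in reversed(str(a))]   # least significant digit first
    digits_b = [int(d) for d in reversed(str(b))]
    result, carry = [], 0
    for i in range(max(len(digits_a), len(digits_b))):
        da = digits_a[i] if i < len(digits_a) else 0
        db = digits_b[i] if i < len(digits_b) else 0
        total = da + db + carry
        result.append(total % 10)   # the digit that stays in this column
        carry = total // 10         # the digit that gets carried to the next column
    if carry:
        result.append(carry)
    return int("".join(str(d) for d in reversed(result)))

assert add_base10(478, 356) == 834
```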
Prior to new math, the focus of math education was much more about getting the right answer than about the skills needed to solve problems with math. This allowed for a greater breadth of education, since topics could be covered quickly, but each topic was understood only in a shallow way.