Does advanced math "power" more rudimentary math?
I came across this quote by Eric Weinstein: "when things got supposedly more advanced, they actually got simpler because mathematicians started revealing what was powering all the things that you previously learn." I was wondering if anyone has come to the same conclusion and, if so, whether they would care to give an example of such a revelation.
This is true whenever you learn a "powerful tool" in high school maths. The two main examples that come to mind are trigonometry and calculus.
In trigonometry, we had to memorise a large number of trig identities. We would learn that $\tan$ is basically "defined by" $\tan(x) = \sin(x)/\cos(x)$, but apart from this it was all just memorising the double angle formulae and so on. Early on in university maths, I learned that $e^{iz} = \cos(z) + i\sin(z)$, and all of the trig identities become obvious. But even then, I did not really understand what $e^{iz} = \cos(z) + i\sin(z)$ meant. Eventually, I studied complex analysis, and the notion of complex exponentiation made sense. At this point, everything that flowed on from it became clearer. (As an added bonus, understanding the proof behind the residue theorem meant I was also finally able to see why various integrals I memorised for my physics courses worked.)
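To make the "become obvious" claim concrete, here is a quick sketch (just one way of seeing it, taking $z$ real so that real and imaginary parts can be compared): squaring Euler's formula gives
$$e^{2iz} = \left(e^{iz}\right)^2 = (\cos(z) + i\sin(z))^2 = \cos^2(z) - \sin^2(z) + 2i\sin(z)\cos(z),$$
while writing $e^{2iz} = \cos(2z) + i\sin(2z)$ directly and comparing real and imaginary parts yields the double angle formulae
$$\cos(2z) = \cos^2(z) - \sin^2(z), \qquad \sin(2z) = 2\sin(z)\cos(z).$$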
A second related concept was calculus. When I first "learned" that $\int \frac{\mathrm{d}x}{x} = \ln(x)$, I had no idea why. Most of integration was just memorisation in high school. In my first maths course at university, I was finally properly taught what functions are and what some of their properties are, including inverse functions. Then, using the chain rule and the fact that $x\mapsto \ln(x)$ is the inverse of $x\mapsto e^x$, the lecturer showed us a simple reason why the derivative of $\ln(x)$ is $1/x$ (assuming it is in fact differentiable).
$$1 = x' = [e^{\ln(x)}]' = e^{\ln(x)}\cdot \ln'(x) = x\ln'(x)$$
Thus $\ln'(x) = 1/x$, and so an antiderivative of $1/x$ is $\ln(x)$. Seeing the proofs of the fundamental theorem of calculus was fantastic as well. In high school I asked why integration was antidifferentiation and was told "of course it is", but it was never obvious at all. They seemed like entirely different concepts, so I couldn't understand why everyone was so comfortable with how closely they are tied together. Studying real analysis really helped me understand why calculus works.
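As a rough illustration of one direction of that connection (only a sketch, assuming $f$ is continuous and glossing over the limiting argument that real analysis makes rigorous): if $F(x) = \int_a^x f(t)\,\mathrm{d}t$, then for small $h$
$$\frac{F(x+h) - F(x)}{h} = \frac{1}{h}\int_x^{x+h} f(t)\,\mathrm{d}t \approx f(x),$$
since the right-hand integral is essentially the area of a thin strip of width $h$ and height about $f(x)$. Letting $h \to 0$ gives $F'(x) = f(x)$, i.e. integrating $f$ and then differentiating recovers $f$.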
As for the maths I've learned at university, all of my courses were well designed so they didn't use any tools that weren't proven in the course/earlier courses. I can't give any higher-level examples for your question.
The view that advanced mathematics is "what is really going on" in elementary mathematics is generally a fallacy in my opinion.
It is easy when learning (or even when teaching) to fall into the trap of thinking that, for example, trigonometric identities hold because of complex numbers. But this, I believe, largely amounts to confusing cause and effect.
It would be truer to what is really going on to say that we care about complex analysis because it happens to simplify (among plenty of other things) reasoning about trigonometry.
There is an infinity of possible "advanced" structures that we could reason about mathematically -- but the ones that get any effort spent on them are the ones that can (or are hoped to) lead to answers to questions that we can already ask without the advanced theory. Often they do this by unifying and generalizing concepts we already have.
An advanced theory earns its way by answering questions that can be asked in more elementary terms but are too hard to be answered by elementary techniques. Along the way, as a side effect, it is common for a lot of questions that can be answered with elementary tools to become simpler and easier to answer using the advanced theory -- and as a matter of research direction this is often used as a touchstone for whether we're getting closer to beating the currently-too-hard problems.
And, of course, once we have the advanced theory, that gives us a chance to ask even harder questions that we couldn't even have thought of before, which it will become the task of the next advance to solve.
But saying that it is the advanced theory that "powers" or "generates" the original elementary phenomena is putting the cart before the horse.
Terry Tao once talked about a concept I had thought of too: symmetrization. An example: one man saw a circle, another saw a rectangle; the two were looking at a cylinder. Whenever you have two dissimilar structures, what you like to do as a mathematician is find a "bigger picture". This is not to say that everything is compatible; quite the opposite: it means knowing exactly which constraints and which liberties there are in the problem we are trying to understand, so not everything is possible. So then, here comes an important point.
"Advanced math makes math simpler, not more difficult" which is like killing bugs like the irrationality of $\sqrt{2}$ with atomic bombs like Fermats Last Theorem means nothings because simply the latter knowledge goes first. Theres a clear ordering of the value of knowledge created by its logical interdependences.
I wouldn't say advanced math "powers" basic math; it's the opposite (remember the previous paragraph). But it's definitely great when you can hold the bigger picture and also retrieve the details in your mind, which makes computation easier. It would be some sort of putting certain results on the same plateau, and so on.
[It's important to note here that knowledge is stable, and so the brain remembers better when there are logical connections between facts; this way computation, or power usage, becomes more efficient. Seeing the bigger picture (a collection of principles) and not only sparse results does exactly that.]