Has Abstract Algebra ever been of service to Analysis?
I’m not saying that it ought to be. I was just wondering whether it has. What I have in mind is that it would have been of material help in proving, say, the Hahn-Banach Theorem, or some such. If it has, what is the most important/impressive instance of this?
Solution 1:
Speaking as someone who is basically an algebraist, I think of algebra as using structure to help understand or simplify a mathematical situation. The common algebraic objects (groups, rings, Lie algebras, etc.) reflect common structures that appear in many different contexts.
Now analysis often seems to have a certain slipperiness that makes it hard to pin down precise structures that meaningfully persist across different problems and contexts, and hence it seems to have been somewhat resistant to the methods of algebra in general. (This is an outsider's impression, and shouldn't be taken too seriously; but it does honestly reflect how things look from where I stand.)
On the other hand, there do seem to be places where algebra can sneak in and play a role. One is mentioned by Qiaochu: Wiener proved that if $f$ is a nowhere-vanishing periodic function with absolutely convergent Fourier series, then $1/f$ again has an absolutely convergent Fourier series. Wiener's original proof was by hard, hands-on harmonic analysis, but a conceptually simpler proof (in a much more general setting) was supplied by Gelfand (I believe) using the theory of Banach algebras, which hinges on algebraic concepts such as maximal ideals and radicals. Wiener proved his result as a step along the way to proving his general Tauberian theorem, and this theorem again admits a conceptually simpler proof, and a generalization, via Banach algebra methods.
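To indicate how the algebra does the work (this is the standard textbook version of the Banach-algebra argument, offered as a sketch rather than a claim about Gelfand's original paper): let $A(\mathbb{T})$ be the Wiener algebra of functions $f(t) = \sum_n a_n e^{int}$ with norm $\|f\| = \sum_n |a_n| < \infty$. Then:

1. $A(\mathbb{T})$ is a commutative unital Banach algebra under pointwise multiplication.
2. Every nonzero multiplicative linear functional $\varphi : A(\mathbb{T}) \to \mathbb{C}$ is evaluation at a point: $\varphi(f) = f(t_0)$ for some $t_0 \in \mathbb{T}$.
3. Hence $f$ lies outside every maximal ideal precisely when $f(t) \neq 0$ for all $t$.
4. By Gelfand theory, an element of a commutative unital Banach algebra lying outside every maximal ideal is invertible, so $1/f \in A(\mathbb{T})$, i.e. $1/f$ has an absolutely convergent Fourier series.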
Another, more recent, example is the work of Green and Tao on asymptotics for the Hardy--Littlewood problem of solving linear equations in primes. Here they introduced algebraic ideas related to nilpotent Lie groups, which play a key role in understanding and analyzing the complexity and solubility of such equations.
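To give a taste of the algebraic objects involved (a standard illustrative example, not something claimed in this exact form in the work cited above): a $2$-step nilsequence has the shape
$$ n \mapsto F(g^n x\Gamma), $$
where $G$ is a $2$-step nilpotent Lie group (the model case being the Heisenberg group of upper-triangular unipotent $3 \times 3$ real matrices), $\Gamma$ is a discrete cocompact subgroup (the integer points), $g \in G$, $x\Gamma \in G/\Gamma$, and $F : G/\Gamma \to \mathbb{C}$ is continuous. Such sequences generalize quadratic phases like $e(\alpha n^2)$, and correlations of the primes with them are what controls counts such as the number of four-term arithmetic progressions of primes up to $N$.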
Solution 2:
Yes. Differential equations are, intuitively speaking, probably about as far from abstract algebra as you can get within analysis, yet algebra still manages to rear its ugly head there :)
One example is the entire subject of D-modules (see also Sato's algebraic analysis, which Akhil mentioned in his comment).
But my favourite example of a somewhat surprising application of algebra to analysis is that of lacunas for hyperbolic differential operators. Given a constant-coefficient hyperbolic partial differential operator $P(D)$ on $\mathbb{R}^n$, it has associated to it a fundamental solution $E$, which solves the equation $P(D)E = \delta$, where $\delta$ is the Dirac distribution. Using the fundamental solution we can write the solution to the inhomogeneous initial value problem $P(D)u = f$, $(u, \partial_t u)|_{t=0} = (u_0,u_1)$ in integral form. A Petrowsky lacuna of $P(D)$ is a region in which the fundamental solution $E$ vanishes.
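To make the integral form concrete (a schematic version, suppressing the contribution of the initial data): solving forward in time one has
$$ u(t,x) = (E * f)(t,x) = \int E(t-s,\, x-y)\, f(s,y)\, ds\, dy, $$
so the value of $u$ at $(t,x)$ only "sees" the sources at those points $(s,y)$ for which $(t-s, x-y)$ lies in the support of $E$. Wherever $E$ vanishes, the corresponding sources are invisible, which is exactly what makes lacunas interesting.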
Now, by hyperbolicity, the fundamental solution is supported within a bi-cone with vertex at the origin, a condition known as finite speed of propagation. So that forms a trivial region in which $E$ vanishes. But beyond that the situation can be complicated. Just consider the fundamental solution of the linear wave equation. In an odd number of spatial dimensions (at least three), the fundamental solution is supported precisely on the set $\{t^2 = r^2\}$, so the region $\{t^2 > r^2\}$ forms a lacuna for $P(D)$. On the other hand, in an even number of spatial dimensions, the fundamental solution is supported on the whole set $\{ t^2 \geq r^2\}$: the equations look almost exactly the same, but the difference between even and odd dimensions is huge!
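For concreteness (these are the standard formulas, with the wave operator normalized as $\partial_t^2 - \Delta$ and $r = |x|$; they are not part of the answer above): the forward fundamental solution in three spatial dimensions is
$$ E_3(t,x) = \frac{\delta(t-r)}{4\pi r}, $$
supported exactly on the light cone $\{t = r\}$, so its interior $\{t > r\}$ is a lacuna; in two spatial dimensions it is
$$ E_2(t,x) = \frac{H(t-r)}{2\pi\sqrt{t^2 - r^2}}, $$
with $H$ the Heaviside function, which is nonzero throughout $\{t > r\}$, so no such lacuna exists.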
Petrowsky wrote a paper in 1945 giving precise "topological" conditions for the existence and characterisation of lacunas. Later on, Atiyah, Bott, and Gårding wrote two papers revisiting this problem, in which Petrowsky's theorems are proved within an algebro-geometric framework. One can read more about this in Atiyah's Séminaire Bourbaki notes.
Solution 3:
I am not sure how to respond to this question. Functional analysis is a big part of analysis, and functional analysis is largely concerned with topological vector spaces; do vector spaces count as "abstract algebra"?
Does Fourier analysis count as "service"? It and its more general companions surely count as the most important intersection of algebra and analysis both in pure and applied mathematics.
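To spell out where the algebra sits (a standard formulation, offered as an illustration rather than something from the answer above): for a locally compact abelian group $G$ with Haar measure, the Fourier transform of $f \in L^1(G)$ is taken against the characters of $G$, i.e. the continuous homomorphisms $\chi : G \to \mathbb{T}$, which themselves form the dual group $\widehat{G}$:
$$ \widehat{f}(\chi) = \int_G f(x)\, \overline{\chi(x)}\, dx. $$
Taking $G = \mathbb{T}$, $\mathbb{Z}$, or $\mathbb{R}$ recovers Fourier series and the classical Fourier transform, and it is precisely the group structure that makes the theory uniform across these cases.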
How about the theory of Banach algebras? They are the natural setting for spectral theory, which is surely an important analytic topic, and they can also be used to prove an important lemma of Wiener, to study quantum mechanics, and to do all sorts of other things.
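One concrete point of contact (standard Banach-algebra facts, added here for illustration): in a unital Banach algebra $A$, the spectrum of an element is defined purely algebraically, while its size is computed analytically:
$$ \sigma(a) = \{\lambda \in \mathbb{C} : a - \lambda 1 \text{ is not invertible in } A\}, \qquad \sup_{\lambda \in \sigma(a)} |\lambda| = \lim_{n\to\infty} \|a^n\|^{1/n}. $$
The left-hand side is ring theory (invertibility, maximal ideals); the right-hand side is analysis (norms and limits); spectral theory lives on exactly this interface.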