Applications of Abstract Algebra to elementary mathematics

I'm an undergraduate student in mathematics, currently taking a course in abstract algebra. The course is interesting, but I have grown very curious about the usefulness of algebra.

I am NOT asking about "applications of algebra to real-life". I am asking about how algebra can be used to solve math problems. Unfortunately, Googling "applications of algebra" is not all that helpful.

Right now I can only recall seeing two instances of "useful" applications of algebra -- a proof of Fermat's Little Theorem, and determining whether a polynomial is solvable in radicals by looking at its Galois group.

What interests me about both problems is that they are of interest to someone who has not necessarily encountered abstract algebra yet (e.g. what is the remainder when you divide $k^p$ by $p$? Can you write the roots of some polynomial explicitly using only the integers and the specified functions?).

At least from the way my course is currently progressing, it feels as though such applications are few and far between. We are currently making observations about permutations (e.g. if $p$ and $q$ are permutations, then $pq$ and $qp$ have "similar forms"), which is interesting, but I fail to see how algebra has helped us make any interesting deduction; all the interesting results about permutations so far (such as the one just mentioned) were obtained without any algebraic result.

Only when we ask a question phrased in algebraic terminology is algebra required (e.g. show that $A_n$ is a normal subgroup of $S_n$). If algebra were only used to answer questions about algebra, there would be no real need to study it, right?

What are some other "elementary" applications of algebra? What are some other interesting results I would be able to understand after an introductory course?

I have a suspicion that finding answers to these questions would better my understanding of algebra, but I have had difficulty in finding many good answers.


One of the most important results you learn in a first course on abstract algebra is Burnside's lemma, which has many applications in combinatorics and number theory. Some time ago I wrote a series of blog posts leading up to a powerful corollary of Burnside's lemma called the Polya enumeration theorem (with several applications along the way, so look in those posts for them), which can be used to count many things; it was originally used to count chemical compounds.
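For a flavor of the kind of counting Burnside's lemma enables, here is a minimal sketch (my own example, not from the posts): counting necklaces, i.e. colorings of $n$ beads with $k$ colors up to rotation. The rotation by $i$ positions fixes exactly $k^{\gcd(n,i)}$ colorings, so the orbit count is the average of these:

```python
from math import gcd

def necklaces(n, k):
    # Burnside: number of orbits = average number of colorings
    # fixed by each group element; rotation by i positions fixes
    # exactly k**gcd(n, i) colorings
    return sum(k ** gcd(n, i) for i in range(n)) // n

print(necklaces(6, 2))  # 14 binary necklaces with 6 beads
```

The integer division is safe because Burnside's lemma guarantees the sum is divisible by the group order $n$.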

The Polya enumeration theorem in turn can be used to prove a powerful result in combinatorics called the exponential formula, which gives you an enormous amount of information about permutations. For example, using the exponential formula you can prove results like

The number of fixed points of a random permutation of $n$ elements is asymptotically Poisson distributed with parameter $\lambda = 1$ as $n \to \infty$

relatively easily.
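One can at least check that statement numerically. The number of permutations of $n$ elements with exactly $k$ fixed points is $\binom{n}{k} D_{n-k}$, where $D_m$ is the number of derangements of $m$ elements; a short Python sketch (helper names are mine) compares the exact probabilities with the Poisson mass $e^{-1}/k!$:

```python
from math import comb, exp, factorial

def derangements(m):
    # recurrence D_0 = 1, D_1 = 0, D_m = (m - 1) * (D_{m-1} + D_{m-2})
    if m == 0:
        return 1
    d_prev, d = 1, 0  # D_0, D_1
    for i in range(2, m + 1):
        d_prev, d = d, (i - 1) * (d + d_prev)
    return d

def prob_fixed_points(n, k):
    # exact probability that a uniform random permutation of n
    # elements has exactly k fixed points: C(n, k) * D_{n-k} / n!
    return comb(n, k) * derangements(n - k) / factorial(n)

# already very close to the Poisson(1) mass e^{-1}/0! for modest n
print(prob_fixed_points(10, 0), exp(-1))
```

Even at $n = 10$ the exact probability of zero fixed points agrees with $e^{-1}$ to about seven decimal places.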


Disclaimer: This is a long post, and I am not originally a mathematician. I just wanted to offer my viewpoint as someone who is not only in the process of learning the subject, but also has prior experience applying it.

What you might be able to appreciate from an introductory course
Personally, I think the Rubik's Cube is a good object for putting your group theory knowledge to use.
Its legal positions form a group of permutations, generated by the six face turns, and it is fun!

In fact, it probably fits what you mentioned in the first part:

of interest to someone who has not necessarily encountered abstract algebra yet

So if you like puzzles, consider it as a learning tool!
Try to solve it using the theory you learned in class.
If the typical cube is too mundane for you, there are also exotic variants. :)
There are even algebra courses built around it. (I cannot remember where.)

And while staying within group theory, a lot of applications come from combinatorics.
There is a good reason for this: by Cayley's theorem, every group is isomorphic to a subgroup of a permutation group, and permutations appear very frequently in combinatorics.
For example, it can be shown that the symmetric group on $\lbrace 1,\dots,n\rbrace$ is generated by just two elements, so all $n!$ permutations can be enumerated using only those two generators.
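To see this concretely, one can take the transposition $(0\ 1)$ and the $n$-cycle $(0\ 1\ \cdots\ n-1)$ and close them under composition by brute force; a small sketch (names are mine):

```python
from math import factorial

def closure(gens, n):
    # BFS closure of the generators under composition;
    # a permutation is a tuple p mapping i -> p[i]
    identity = tuple(range(n))
    seen, frontier = {identity}, [identity]
    while frontier:
        nxt = []
        for p in frontier:
            for g in gens:
                q = tuple(p[g[i]] for i in range(n))  # compose p after g
                if q not in seen:
                    seen.add(q)
                    nxt.append(q)
        frontier = nxt
    return seen

n = 5
swap = (1, 0, 2, 3, 4)                        # the transposition (0 1)
cycle = tuple((i + 1) % n for i in range(n))  # the n-cycle (0 1 ... n-1)
print(len(closure([swap, cycle], n)) == factorial(n))  # True
```

Closure under multiplication suffices here because a finite set of permutations closed under composition is automatically a group.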

Going one step further in this direction, some areas of graph theory are closely related to group theory too. For example, the graph isomorphism problem is known to be a non-abelian hidden subgroup problem, which is very difficult in general!
But if the graphs involved are permutation graphs, then it can be solved efficiently.

If we consider an abelian hidden subgroup problem instead, this includes two very important classes of problems: integer factorization and the discrete logarithm.
(Although, to be precise, the study of these problems goes beyond group theory.)

While we are talking about number theory, there are some nice algorithms that you can understand with just group theory.
Many important results are derived from Fermat's little theorem alone:

- Miller–Rabin primality test: a probabilistic test for primality (I believe it is still among the most efficient)
- Pollard's $p-1$ algorithm: an algorithm to find small prime factors
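Pollard's $p-1$ algorithm, for instance, is only a few lines once you see the group-theoretic idea: if $p - 1$ divides $B!$ for some prime factor $p$ of $n$, then $2^{B!} \equiv 1 \pmod p$ by Fermat's little theorem, so $\gcd(2^{B!} - 1, n)$ is divisible by $p$. A minimal sketch (the function name and the choice of base 2 are illustrative):

```python
from math import gcd

def pollard_p_minus_1(n, bound):
    # compute a = 2^(bound!) mod n, one exponent factor at a time
    a = 2
    for j in range(2, bound + 1):
        a = pow(a, j, n)
    d = gcd(a - 1, n)
    return d if 1 < d < n else None  # None: this bound found nothing

# 1313 = 13 * 101; here 13 - 1 = 12 divides 5! but 101 - 1 = 100 does not,
# so the gcd isolates the factor 13
print(pollard_p_minus_1(1313, 5))  # 13
```

The method fails (returns `None`) when no prime factor $p$ of $n$ has $p - 1$ smooth enough, or when all of them do at once, in which case the gcd is $n$ itself.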

If your introductory course covers rings, more specifically polynomial rings, then factorization of polynomials is a great area to look at.
Note: a polynomial has roots if and only if it has linear factors, and there are some nice results stating when this happens.
An example: if $\gcd(f(x), x^p-x)\neq 1$ in $\mathbb{F}_p[x]$, then $f$ has a root in $\mathbb{F}_p$.
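This criterion is easy to try in code: since $x^p - x = \prod_{a \in \mathbb{F}_p}(x - a)$, the gcd collects exactly the linear factors of $f$ over $\mathbb{F}_p$. A small sketch with polynomials represented as coefficient lists, lowest degree first (helper names are mine):

```python
def trim(f, p):
    # reduce coefficients mod p and drop leading zeros
    f = [c % p for c in f]
    while len(f) > 1 and f[-1] == 0:
        f.pop()
    return f

def poly_rem(f, g, p):
    # remainder of f divided by g over F_p
    f, g = trim(f, p), trim(g, p)
    inv = pow(g[-1], -1, p)  # inverse of the leading coefficient of g
    while len(f) >= len(g) and f != [0]:
        c = f[-1] * inv % p
        k = len(f) - len(g)
        for i, gc in enumerate(g):
            f[i + k] = (f[i + k] - c * gc) % p
        f = trim(f, p)
    return f

def poly_gcd(f, g, p):
    # Euclid's algorithm in F_p[x]
    f, g = trim(f, p), trim(g, p)
    while g != [0]:
        f, g = g, poly_rem(f, g, p)
    return f

x_p_minus_x = [0, -1, 0, 0, 0, 1]           # x^5 - x over F_5
print(poly_gcd([1, 0, 1], x_p_minus_x, 5))  # [1, 0, 1]: x^2+1 has both roots in F_5
print(poly_gcd([3, 0, 1], x_p_minus_x, 5))  # degree 0: x^2-2 has no root in F_5
```

Indeed $x^2 + 1 \equiv (x-2)(x-3) \bmod 5$, while $2$ is not a square mod $5$, so the two gcds come out as expected.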

I don't think I have enough experience to explain this better; it is the best I can offer at the moment. I hope it will be useful!

P.S. Algebra is also an important gateway to many other areas: algebraic geometry, algebraic topology, algebraic combinatorics, etc.
Perhaps if you also look a bit in that direction, you may find more things that interest you.


Rationalizing denominators that are not quadratic is a nice application of linear algebra over fields. For example, if asked to rewrite $$ \frac{1}{1-5\sqrt[3]{2}} $$ with a rational denominator, the best way to approach the problem is to work in the field ${\mathbf Q}(\sqrt[3]{2})$ with ${\mathbf Q}$-basis $1,\sqrt[3]{2},\sqrt[3]{4}$.
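One way to carry out the computation: with $t = \sqrt[3]{2}$, the identity $(a+bt)(a^2-abt+b^2t^2) = a^3 + 2b^3$ gives $\frac{1}{1-5\sqrt[3]{2}} = -\frac{1+5\sqrt[3]{2}+25\sqrt[3]{4}}{249}$. This can be verified with exact arithmetic on coefficient triples over the given basis; a quick sketch:

```python
from fractions import Fraction as F

def mul(u, v):
    # multiply a + b*t + c*t^2 and d + e*t + f*t^2, where t = 2^(1/3),
    # reducing with t^3 = 2
    a, b, c = u
    d, e, f = v
    return (a*d + 2*(b*f + c*e),
            a*e + b*d + 2*c*f,
            a*f + b*e + c*d)

u = (F(1), F(-5), F(0))                        # 1 - 5*2^(1/3)
inv = tuple(x / -249 for x in (F(1), F(5), F(25)))
print(mul(u, inv))  # (Fraction(1, 1), Fraction(0, 1), Fraction(0, 1))
```

The product comes out to the triple representing $1$, confirming the inverse.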


What is the period of the Fibonacci sequence $F_n$ modulo a prime $p$?

This is the Pisano period. It is difficult to say much about the exact period, but one can write down a number which is guaranteed to be divisible by the period using some facts about finite fields in a manner analogous to Fermat's little theorem, together with quadratic reciprocity. The key result is that Binet's formula

$$F_n = \frac{\phi^n - \varphi^n}{\phi - \varphi}$$

continues to hold $\bmod p$ in a suitable sense for all $p \neq 5$. Here $\phi, \varphi$ are the two roots of the characteristic polynomial $t^2 - t - 1$.

Proposition: If $p \neq 5$, then $F_n$ has period dividing $p - 1$ if $p \equiv 1, 4 \bmod 5$; otherwise, $F_n$ has period dividing $2(p + 1)$. If $p = 5$, then $F_n$ has period $20$.

Proof. By quadratic reciprocity, $p \equiv 1, 4 \bmod 5$ if and only if $t^2 - t - 1$ factors over $\mathbb{F}_p$. In that case its roots $\phi, \varphi$ lie in $\mathbb{F}_p$, so by Fermat's little theorem they have multiplicative order dividing $p-1$, and by Binet's formula $F_n$ has period dividing $p - 1$. If $t^2 - t - 1$ does not factor over $\mathbb{F}_p$, then its roots lie in $\mathbb{F}_{p^2}$, and the Frobenius map interchanges them; that is, $\phi^p \equiv \varphi \bmod p$ and vice versa. Since $\phi \varphi = -1$, it follows that $\phi^{p+1} \equiv \phi \varphi \equiv -1 \bmod p$ (and likewise for $\varphi$), and we conclude that

$$F_p \equiv -1 \bmod p$$ $$F_{p+1} \equiv 0 \bmod p$$ $$F_{p+2} \equiv -1 \bmod p$$ $$F_{p+3} \equiv -1 \bmod p$$

and by induction $F_{p+1+k} \equiv - F_k \bmod p$, hence $F_{2(p+1)+k} \equiv F_k \bmod p$ as desired. The case $p = 5$ is left as an exercise. $\Box$

This proposition describes a pattern which is straightforward to verify by hand, but which without some knowledge of abstract algebra and number theory is very difficult to explain.
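The proposition is also easy to check numerically; a short sketch (the function name is mine) computes the Pisano period directly and verifies the stated divisibility for a few primes:

```python
def pisano(p):
    # period of the Fibonacci sequence mod p: steps until the
    # state (F_k, F_{k+1}) returns to the initial state (0, 1)
    a, b, k = 0, 1, 0
    while True:
        a, b = b, (a + b) % p
        k += 1
        if (a, b) == (0, 1):
            return k

for p in [3, 7, 11, 13, 19, 29, 31]:
    bound = p - 1 if p % 5 in (1, 4) else 2 * (p + 1)
    assert bound % pisano(p) == 0

print(pisano(5))  # 20
```

For example $\pi(11) = 10$ divides $11 - 1$, while $\pi(7) = 16 = 2(7+1)$ attains the bound in the inert case.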


Lagrange's Theorem has some applications to elementary number theory:

1) Wilson's Theorem says that for a prime $p$ we have $(p-1)! \equiv -1 \pmod p$. Proof: By Lagrange's Theorem applied to $\mathbb{F}_p^*$, every $a \in \mathbb{F}_p^*$ satisfies $a^{p-1} = 1$, and a polynomial of degree $p-1$ over a field has at most $p-1$ roots, so $X^{p-1}-1 = \prod_{a \in \mathbb{F}_p^*} (X+a)$ in $\mathbb{F}_p[X]$ (note $a \mapsto -a$ permutes $\mathbb{F}_p^*$). Now let $X \mapsto 0$.
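Both directions of Wilson's theorem (the congruence holds exactly when $n$ is prime) are easy to confirm numerically; a tiny sketch:

```python
from math import factorial

def wilson_holds(n):
    # does (n-1)! ≡ -1 (mod n)?
    return factorial(n - 1) % n == n - 1

print([n for n in range(2, 20) if wilson_holds(n)])  # [2, 3, 5, 7, 11, 13, 17, 19]
```

The converse holds because for composite $n > 4$, all the factors of $n$ already appear in $(n-1)!$, so $(n-1)! \equiv 0 \pmod n$.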

2) The integrality of binomial coefficients relies on the fact that for $n,m \in \mathbb{N}$, the product $n!\, m!$ divides $(n+m)!$. Here is a proof: There is a canonical monomorphism

$Sym(\{1,\dotsc,n\}) \times Sym(\{n+1,\dotsc,n+m\}) \hookrightarrow Sym(\{1,\dotsc,n+m\})$.

Now apply Lagrange's Theorem.
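The divisibility is also easy to confirm numerically (no substitute for the proof, of course):

```python
from math import factorial

# |Sym({1..n}) x Sym({n+1..n+m})| divides |Sym({1..n+m})| by Lagrange,
# i.e. n! * m! divides (n+m)!
print(all(factorial(n + m) % (factorial(n) * factorial(m)) == 0
          for n in range(10) for m in range(10)))  # True
```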