what are the different applications of group theory in CS? [closed]

What are some applications of abstract algebra in computer science that an undergraduate could begin exploring after a first course?

Gallian's text goes into Hamming distance, coding theory, etc., and I vaguely recall seeing discussions of abstract algebra in the theory of computation / automata theory, but what else? I'm not familiar with any applications beyond these.

Bonus: What are some textbooks / resources from which one could learn about these applications?


Algebra is incredibly useful in computer science. I'll preface my answer with my opinion: I view a good portion of computer science as a branch of mathematics. So my answer will be quite broad. Note that my opinion is one that not everyone shares.

Algebraic Combinatorics and Graph Theory: Recall Cayley's Theorem from group theory, which states that every group is isomorphic to a subgroup of some symmetric group. That is, every group can be viewed as a permutation group. This immediately motivates the study of algebraic combinatorics. Group actions are used to study symmetries, or automorphisms, of mathematical objects. Informally, a group action is a dynamical process on an object, which partitions its members into sets which we call orbits. The study of the structure and quantity of these orbits yields important combinatorial results. You have likely seen the dihedral group, which is the automorphism group of the cycle graph. In general, finding the full automorphism group of an object (such as a graph) is a non-trivial exercise.
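As a concrete (if naive) illustration of that last point, here is a small brute-force sketch in Python; it is my own addition, not from the text above. It recovers the automorphism group of the cycle graph $C_5$ by testing every vertex permutation, and the count comes out to $10 = |D_5|$, matching the dihedral group.

```python
# Brute-force sketch: compute Aut(C_5) by checking every vertex permutation.
from itertools import permutations

n = 5
edges = {frozenset((i, (i + 1) % n)) for i in range(n)}  # edge set of C_5

automorphisms = []
for perm in permutations(range(n)):
    # perm is an automorphism iff it sends every edge to an edge
    if all(frozenset((perm[u], perm[v])) in edges for u, v in edges):
        automorphisms.append(perm)

print(len(automorphisms))  # 10 = |D_5|: 5 rotations and 5 reflections
```

This is exponential in the number of vertices, which is exactly why the general problem is non-trivial.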

There are several applications that come to mind. The first is Polya Enumeration Theory. Given a mathematical object (such as a graph), we color the vertices (this is not necessarily a proper coloring, in the sense that two adjacent vertices may receive the same color). Then given some subgroup $H$ of automorphisms (such as rotations of a necklace), how many distinct colorings exist? That is, two colorings are equivalent if they belong to the same orbit under the action of $H$. Polya enumeration theory studies these types of problems. If your combinatorics is weak (and even if it isn't), Loehr's Bijective Combinatorics text is an excellent resource.
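To make the orbit-counting idea concrete, here is a hedged Python sketch (my own example, not from the text): it counts 2-colorings of a 6-bead necklace up to rotation, once by explicitly grouping colorings into orbits and once by Burnside's lemma, the counting step underlying Polya theory. Both give 14.

```python
# Count necklace colorings up to rotation: brute-force orbits vs. Burnside.
from itertools import product
from math import gcd

n, k = 6, 2
colorings = list(product(range(k), repeat=n))

def rotate(c, i):
    return c[i:] + c[:i]

# Brute force: group colorings into orbits under the rotation group C_n.
seen, orbits = set(), 0
for c in colorings:
    if c not in seen:
        orbits += 1
        seen.update(rotate(c, i) for i in range(n))

# Burnside's lemma: average, over the group, of the number of fixed colorings.
# A rotation by i fixes exactly k^gcd(i, n) colorings.
burnside = sum(k ** gcd(i, n) for i in range(n)) // n

print(orbits, burnside)  # both print 14
```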

In algebraic graph theory, it is common to examine the Cayley graph of a group. Given a group $G$ and a subset $S \subseteq G$, the Cayley graph $X(G, S)$ has vertex set $G$, and two elements $g, h$ are connected by an edge if $hg^{-1} \in S$. Intuitively, the Cayley graph tells us whether an element of $S$ takes us from $g$ to $h$. We may examine a related graph of transpositions from $\text{Sym}(n)$ and leverage algorithms from graph theory to ascertain whether the transpositions generate $\text{Sym}(n)$. I cover some of this in my lecture notes for Theory of Computation (http://people.math.sc.edu/mlevet/Lecture_Notes.pdf). Godsil and Royle's text on Algebraic Graph Theory is a more comprehensive resource.
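Here is a minimal sketch of that last idea (my own code, assuming the standard fact that a set of transpositions generates $\text{Sym}(n)$ exactly when the graph on $\{0, \dots, n-1\}$ whose edges are those transpositions is connected). Connectivity is checked with a BFS.

```python
# Do the given transpositions generate Sym(n)?  Equivalent to asking whether
# the "transposition graph" on n vertices is connected.
from collections import deque

def transpositions_generate_sym(n, S):
    adj = {v: [] for v in range(n)}
    for a, b in S:
        adj[a].append(b)
        adj[b].append(a)
    seen, queue = {0}, deque([0])
    while queue:                      # BFS from vertex 0
        v = queue.popleft()
        for w in adj[v]:
            if w not in seen:
                seen.add(w)
                queue.append(w)
    return len(seen) == n             # connected iff every vertex is reached

print(transpositions_generate_sym(4, [(0, 1), (1, 2), (2, 3)]))  # True
print(transpositions_generate_sym(4, [(0, 1), (2, 3)]))          # False
```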

A third thought that comes to mind is the study of graph homomorphisms. In particular, a proper coloring of a graph is a graph homomorphism: if the graph $X$ has chromatic number $c$, then there exists a graph homomorphism $\phi : X \to K_{c}$. So there is a direct connection to complexity theory here, as graph coloring is NP-Hard. Constraint satisfaction problems are generalizations of this, where we study homomorphisms of relational structures of finite rank. Constraint satisfaction problems arise frequently in artificial intelligence, and algebraic techniques (granted, not senior-level algebra techniques) arise in the study of CSPs. The Dichotomy Conjecture states that every CSP is either in $P$ or NP-Complete. Dmitriy Zhuk recently presented his results at a two-week conference at Vanderbilt in September, settling the conjecture in the affirmative. While he has not yet put forth a paper, the folks there were relatively convinced of his results (including the logician at my school, who was a Tarski student).
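A hedged illustration of the coloring/homomorphism correspondence (again my own code, not from the answer): a proper $c$-coloring of $X$ is exactly a homomorphism $X \to K_c$, since adjacent vertices must receive distinct, hence adjacent-in-$K_c$, colors. The brute-force search below makes the correspondence explicit.

```python
# Does X admit a homomorphism into the complete graph K_c?
# Equivalently: is X properly c-colorable?
from itertools import product

def has_hom_to_Kc(vertices, edges, c):
    for coloring in product(range(c), repeat=len(vertices)):
        color = dict(zip(vertices, coloring))
        # a homomorphism must send each edge to an edge of K_c,
        # i.e. endpoints must get distinct colors
        if all(color[u] != color[v] for u, v in edges):
            return True
    return False

# C_5 is an odd cycle, so it needs 3 colors:
V = list(range(5))
E = [(i, (i + 1) % 5) for i in range(5)]
print(has_hom_to_Kc(V, E, 2))  # False
print(has_hom_to_Kc(V, E, 3))  # True
```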

Automata Theory: The set of regular languages forms a Kleene algebra (an idempotent semiring equipped with a star operation) under set union (addition) and concatenation (multiplication). The notion of a derivative also comes up; see the Brzozowski derivative. We can leverage this to design a linear-algebraic algorithm (the Brzozowski algebraic method) to convert an FSM to a regular expression. (I also cover this in my notes.)
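To give a feel for the derivative itself (this sketch is my own and shows derivative-based matching, not the FSM-to-regex conversion): a word $w$ is in $L(r)$ iff the derivative of $r$ with respect to $w$ is "nullable", i.e. accepts the empty word. Regexes are encoded here as plain tuples.

```python
# Brzozowski derivatives.  Regexes are tuples:
# ('empty',), ('eps',), ('char', c), ('alt', r, s), ('cat', r, s), ('star', r)

def nullable(r):
    tag = r[0]
    if tag in ('eps', 'star'):
        return True
    if tag in ('empty', 'char'):
        return False
    if tag == 'alt':
        return nullable(r[1]) or nullable(r[2])
    return nullable(r[1]) and nullable(r[2])        # 'cat'

def deriv(r, a):
    tag = r[0]
    if tag in ('empty', 'eps'):
        return ('empty',)
    if tag == 'char':
        return ('eps',) if r[1] == a else ('empty',)
    if tag == 'alt':
        return ('alt', deriv(r[1], a), deriv(r[2], a))
    if tag == 'cat':
        d = ('cat', deriv(r[1], a), r[2])
        return ('alt', d, deriv(r[2], a)) if nullable(r[1]) else d
    return ('cat', deriv(r[1], a), r)               # 'star'

def matches(r, word):
    for a in word:
        r = deriv(r, a)
    return nullable(r)

r = ('star', ('cat', ('char', 'a'), ('char', 'b')))  # (ab)*
print(matches(r, 'abab'))  # True
print(matches(r, 'aba'))   # False
```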

Complexity Theory: Babai's recent result on the Graph Isomorphism problem is (or should be) the poster child for algebra in CS. He uses a lot of techniques, such as group actions and representation theory, to study problems like Graph Isomorphism and Group Isomorphism. Regarding the Group Isomorphism problem: in its most general form (for finitely presented groups) it is undecidable. For finite groups given by their multiplication tables, the best known bound is roughly $n^{\log n}$, which is easy to obtain simply by showing that every finite group of order $n$ has a generating set with at most $\log_{2}(n)$ generators. For abelian groups, Group Isomorphism has an $O(n)$ time solution. For solvable groups, the $n^{\log n}$ bound has been relatively untouched. Babai and Qiao presented a polynomial time isomorphism test for groups with abelian Sylow towers. To the best of my knowledge (which is limited), this is the largest class of solvable groups to date with a polynomial time isomorphism test.
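Here is a sketch of the easy lemma just mentioned (my own illustration, certainly not Babai's techniques): adjoining any element outside the current subgroup at least doubles its size, so at most $\log_{2}(n)$ generators are ever needed. The group is assumed to be given by its multiplication table.

```python
# Greedily build a generating set of size <= log2(n) for a finite group
# given by its multiplication table mult[a][b], with identity element 0.

def closure(gens, mult, identity=0):
    """Subgroup generated by gens, computed by closing under multiplication
    (in a finite group this also closes under inverses)."""
    sub = {identity}
    frontier = set(gens)
    while frontier:
        sub |= frontier
        frontier = {mult[a][b] for a in sub for b in sub} - sub
    return sub

def small_generating_set(mult):
    n = len(mult)
    gens, sub = [], {0}
    for g in range(n):
        if g not in sub:
            gens.append(g)
            sub = closure(gens, mult)   # |sub| at least doubles each time
    return gens

# Example: the cyclic group Z_8 (written additively, mod 8).
mult = [[(a + b) % 8 for b in range(8)] for a in range(8)]
print(small_generating_set(mult))  # [1]; at most log2(8) = 3 generators needed
```

The $n^{\log n}$ bound then follows by brute-forcing all ways to map such a generating set into the target group.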

A lot of the techniques Babai uses are quite heavy handed. If these types of problems catch your interest, I might spend some time on representation theory. I think a solid handle on group theory (comfort with the Sylow Theorems) and abstract linear algebra, along with motivation, is sufficient to pick up a book in the area. There are several books on Representation Theory from a combinatorial perspective. Sagan's text on the Symmetric Group is one text that comes to mind. Babai also teaches classes on Algorithms in Finite Groups, which you can find on his home page. Computational Group Theory is the key phrase, if you decide to look into this area more.

Cryptology, Coding Theory, and Computational Number Theory: These are the obvious ones, and have been covered quite well.


Here is a link to a seminar course given at KTH, Stockholm, in 2014, the topic of the semester being "Algebraic Gems in TCS". There are several links to other, similar courses at the bottom of the page, as well as links and references to some relevant literature a bit higher up:

http://www.csc.kth.se/utbildning/kth/kurser/DD2442/semteo14/

As regards automata theory and algebra, there are interesting connections between automata and what are known in algebra as semigroups (the difference between a semigroup and a group being that a semigroup need not have an identity element or inverse elements), and one of the main links between the subjects is the syntactic monoid (see https://en.wikipedia.org/wiki/Syntactic_monoid for more information; a small sketch follows after the link below). You can also try looking for resources covering algebraic automata theory, apparently also known as Krohn-Rhodes theory:

https://en.wikipedia.org/wiki/Krohn%E2%80%93Rhodes_theory
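As promised, here is a small sketch (my own, under the standard assumption that for a minimal DFA the transition monoid coincides with the syntactic monoid of its language): each input letter induces a map on the states, and closing these maps under composition yields the monoid.

```python
# Transition monoid of a DFA.  Example: words over {a, b} with an even
# number of a's; states are (0, 1), and each letter acts as a map on states.

delta = {'a': (1, 0),   # letter 'a' flips the state
         'b': (0, 1)}   # letter 'b' fixes the state

def compose(f, g):
    """Apply f, then g (maps on states encoded as tuples)."""
    return tuple(g[f[q]] for q in range(len(f)))

identity = (0, 1)
monoid = {identity}
frontier = {identity}
while frontier:
    # extend each known map by one more letter until nothing new appears
    new = {compose(f, delta[a]) for f in frontier for a in delta} - monoid
    monoid |= new
    frontier = new

print(sorted(monoid))  # [(0, 1), (1, 0)] -- the two-element group Z_2
```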

Depending on one's viewpoint, one may consider calling category theory a branch of algebra, and categories have found quite a lot of applications, especially in connection with functional programming. First courses in algebra typically don't cover categories, but if you'd like some reference to catch up, Awodey's book is quite pleasant to read and presents some connections to computer science. You could probably google "Category theory for computer science" as well, and end up finding really good notes for free :)

And as Chas Brown states, cryptography is another part of CS where algebra has found applications. You will probably find a lot of group and field theory applied there.

And then there is the entire subject of coding theory, but judging from your post you already knew about this.

And then I guess there is at least some more that I can't think of right now…but hopefully this should keep you busy :)