Is there a theory that combines category theory/abstract algebra and computational complexity?

Category theory and abstract algebra deal with the way functions can be combined with other functions. Complexity theory deals with how hard a function is to compute. It's weird to me that I haven't seen anyone merge these fields of study, since they seem like such natural pairs. Has anyone done this before?


As a motivating example, let's take a look at semigroups. Semigroups have an associative binary operation. It's well known that operations in semigroups can be parallelized thanks to associativity. For example, if there are a billion numbers we want added together, we could break the problem into 10 problems of adding 100 million numbers each, then add the results.
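As a quick sketch (in Python, though the point is language-agnostic): associativity is exactly what guarantees the chunked computation agrees with the direct one, so the partial sums could safely be handed off to separate workers.

```python
def chunked_sum(xs, chunk_size):
    """Sum xs by summing fixed-size chunks, then summing the partial
    results. Associativity of + guarantees this equals sum(xs), so the
    chunk sums could be computed in parallel."""
    partials = [sum(xs[i:i + chunk_size])
                for i in range(0, len(xs), chunk_size)]
    return sum(partials)

nums = list(range(1_000_000))
assert chunked_sum(nums, 100_000) == sum(nums)  # same answer either way
```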

But parallelizing the addition operator makes sense only because it can be computed in constant time and space. What if this weren't the case? For example, lists and the append operation form a common semigroup in functional programming, but the append operator takes time and space $O(n)$.
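To make this concrete, here is a sketch under an assumed cost model in which appending two lists copies both operands (as Python's `+` does). Concatenating $n$ singleton lists by a left-to-right fold does $\Theta(n^2)$ total copying, while a balanced fold does only $\Theta(n \log n)$, so for this semigroup the shape of the evaluation tree matters, not merely that one exists:

```python
def left_fold_cost(lengths):
    """Total copies when concatenating left-to-right: ((l1+l2)+l3)+..."""
    total, acc = 0, lengths[0]
    for n in lengths[1:]:
        total += acc + n   # appending copies both operands in this model
        acc += n
    return total

def balanced_cost(lengths):
    """Total copies when concatenating as a balanced binary tree."""
    if len(lengths) <= 1:
        return 0
    mid = len(lengths) // 2
    return (balanced_cost(lengths[:mid]) + balanced_cost(lengths[mid:])
            + sum(lengths))

ones = [1] * 1024                 # 1024 singleton lists
print(left_fold_cost(ones))       # 524799  (quadratic in n)
print(balanced_cost(ones))        # 10240   (n log n)
```

Note the cost model is an assumption; in a language where append only copies its left operand (e.g. Haskell's `++`), the constants change but the same asymmetry appears.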

It seems like there should be a convenient way to describe the differences between these two semigroups. In general, it seems like algebras that took into account the complexity of their operations would be very useful to computer scientists like myself.


It seems that the idea of growth of groups is extremely relevant here. In the example in the question the operation may be viewed as concatenation of words, and so is very similar to the standard way in which we view the free group on $m$ generators $F(\mathbf{x}_m)$. The operation $V\circ W$ takes $O(|V|+|W|)$-time* to evaluate in $F(\mathbf{x}_m)$.
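As a sketch of that evaluation (the representation here is my own choice: a reduced word as a tuple of (generator, exponent $\pm 1$) letters), multiplication in $F(\mathbf{x}_m)$ is concatenation followed by cancellation of adjacent inverse pairs, and a single pass gives the $O(|V|+|W|)$ bound:

```python
def fg_mul(v, w):
    """Multiply two reduced words in the free group F(x_1, ..., x_m).
    Words are tuples of (generator, +1/-1) letters. Concatenate and
    cancel adjacent inverse pairs; one pass takes O(|v| + |w|) steps."""
    out = list(v)
    for letter in w:
        if out and out[-1][0] == letter[0] and out[-1][1] == -letter[1]:
            out.pop()          # x followed by x^{-1} cancels
        else:
            out.append(letter)
    return tuple(out)

a, A, b, B = ("a", 1), ("a", -1), ("b", 1), ("b", -1)
assert fg_mul((a, b), (B, A)) == ()       # (ab)(b^{-1}a^{-1}) = empty word
assert fg_mul((a,), (a,)) == (a, a)       # a * a = a^2
```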

Every finitely-generated group can be constructed by taking such a (finite rank) free group $F(\mathbf{x}_m)$ and adding relators. These relators correspond to "shortcuts" in our computation. For example, consider the free group $F(a)$ and add the relator $a^3=\epsilon$ (here, $\epsilon$ denotes the empty word) to obtain the group with presentation $\langle a\mid a^3\rangle$. (This group is actually cyclic of order $3$.) Now, I could evaluate $a^{100}\circ a^{34}$ to get $a^{134}$, but why would I? As I am in my group I know that $a^{100}=a$ and $a^{34}=a$, so $a^{100}\circ a^{34}=a\circ a=a^2$.

The idea of growth of groups is to make sense of these "shortcuts" computationally. Roughly, the "growth" of a group measures how many group elements there are at distance at most $n$ from the identity element. Crucially, the growth rate (up to a natural equivalence) does not depend on the choice of generating set. Please see the above Wikipedia link for more details. However, I will leave you with some examples of groups and their growth rates (taken and amended from Wikipedia, of course!):

  • A free group with a finite rank $k > 1$ has an exponential growth rate.
  • A finite group has constant growth – polynomial growth of order $0$.
  • A group has polynomial growth if and only if it has a nilpotent subgroup of finite index. This is a major result due to Gromov (Gromov's theorem on groups of polynomial growth; see Wikipedia). It was one of the first results to link the analytic behaviour of a group with its algebraic structure.
  • If $M$ is a closed negatively curved Riemannian manifold then its fundamental group $\pi_{1}(M)$ has exponential growth rate. Milnor proved this using the fact that $\pi_{1}(M)$ with the word metric is quasi-isometric to the universal cover of $M$.
  • $\mathbb{Z}^d$ has a polynomial growth rate of order $d$.
  • The discrete Heisenberg group $H^3$ has a polynomial growth rate of order $4$. This fact is a special case of the general theorem of Bass and Guivarc'h that is discussed in the article on Gromov's theorem on groups of polynomial growth.
  • The lamplighter group has an exponential growth.
  • The existence of groups with intermediate growth, i.e. subexponential but not polynomial, was open for many years. The question was asked by Milnor in 1968 and was finally answered in the affirmative by Grigorchuk in 1984 (see Grigorchuk's group). This is still an area of active research.
  • The class of triangle groups includes infinitely many groups of constant growth, three groups of quadratic growth, and infinitely many groups of exponential growth.
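As a brute-force sanity check on a couple of these growth rates (my own illustration): with the standard generators $e_1,\dots,e_d$, the word metric on $\mathbb{Z}^d$ is the $\ell_1$ metric, so ball sizes can be counted directly.

```python
from itertools import product

def ball_size(d, n):
    """Number of elements of Z^d at word-metric distance <= n from the
    identity, using the standard generators (word metric = L1 metric)."""
    return sum(1 for p in product(range(-n, n + 1), repeat=d)
               if sum(map(abs, p)) <= n)

# Z^2 grows polynomially of order 2: |B(n)| = 2n^2 + 2n + 1.
for n in range(1, 8):
    assert ball_size(2, n) == 2 * n * n + 2 * n + 1
```

The same count for $d=3$ grows cubically, matching the $\mathbb{Z}^d$ entry above; of course this enumeration is exponential in $d$ and is only meant as a check, not an efficient computation.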

*This is an easily-computed upper bound. You can probably do something clever to reduce this though.