Spectrum of adjacency matrix of complete graph

Fooling around in MATLAB, I did an eigenvalue decomposition of the adjacency matrix of $K_5$.

 A = [0     1     1     1     1;
      1     0     1     1     1;
      1     1     0     1     1;
      1     1     1     0     1;
      1     1     1     1     0];

The eigenvalues are $4, -1, -1, -1, -1$. It now seems obvious that the largest eigenvalue for the adjacency matrix for $K_n$ would be $n-1$. What I don't understand is why $-1$ is an eigenvalue of multiplicity $n-1$. Is there any special meaning to this?
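The same computation can be reproduced outside MATLAB; here is a minimal sketch in Python, assuming NumPy is available:

```python
import numpy as np

# Adjacency matrix of K_5: all ones off the diagonal.
n = 5
A = np.ones((n, n)) - np.eye(n)

# eigvalsh is appropriate since A is symmetric.
eigenvalues = np.sort(np.linalg.eigvalsh(A))
print(eigenvalues)  # -1 with multiplicity 4, and 4
```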


Solution 1:

It is possible to give a lower bound on the multiplicities of the eigenvalues of the adjacency or Laplacian matrix of a graph $G$ using representation theory.

Namely, the vector space of functions from the vertices of $G$ to $\mathbb{C}$ is a representation of the automorphism group of $G$, and both the adjacency and Laplacian matrices commute with this group action. It follows (by Schur's lemma) that each irreducible subrepresentation lies in an eigenspace of the adjacency and Laplacian matrices, so the dimensions of the irreducible subrepresentations give lower bounds on the multiplicities of the eigenvalues.

In this particular case, the symmetry group of $K_n$ is the full symmetric group $S_n$, and the corresponding representation decomposes into the trivial representation (which corresponds to the eigenvalue $n-1$) and an $(n-1)$-dimensional irreducible representation (the standard representation), so there can be only one other eigenvalue, and it must have multiplicity $n-1$. Moreover, the sum of the eigenvalues is $0$ because the trace of the adjacency matrix is zero, so the remaining eigenvalue must be $-1$.

This is all just a very fancy way of saying that

Any permutation of an eigenvector of $K_n$ is also an eigenvector with the same eigenvalue. If the eigenvector doesn't have all of its entries equal, then its permutations span a vector space of dimension $n-1$, so the remaining eigenvalue has multiplicity $n-1$.
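One can check this numerically: the sketch below (assuming NumPy) permutes the entries of a non-constant eigenvector of $K_5$ in every possible way and computes the dimension of the span.

```python
import itertools
import numpy as np

# A non-constant eigenvector of K_5 (entries sum to zero, eigenvalue -1).
v = np.array([1.0, -1.0, 0.0, 0.0, 0.0])

# Stack every permutation of v's entries as rows of a matrix.
perms = np.array([p for p in itertools.permutations(v)])

# The span of all permutations has dimension n - 1 = 4.
print(np.linalg.matrix_rank(perms))  # 4
```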

Which is in turn just a very fancy way of saying that

Any vector $v$ whose entries sum to zero is an eigenvector of the adjacency matrix $A_n$ of $K_n$: since $A_n + I$ is the all-ones matrix, $(A_n + I)v = 0$ for any such $v$, hence $A_n v = -v$.
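A quick numerical sanity check of this identity (a sketch in NumPy; the vector `v` below is an arbitrary choice with zero entry sum):

```python
import numpy as np

n = 5
A = np.ones((n, n)) - np.eye(n)  # adjacency matrix of K_5

v = np.array([3.0, -1.0, -1.0, -1.0, 0.0])  # entries sum to zero

# (A + I)v = 0, equivalently A v = -v.
print(np.allclose((A + np.eye(n)) @ v, 0))  # True
print(np.allclose(A @ v, -v))               # True
```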

That is a straightforward explanation but it misses the bigger lesson, which is that

Highly symmetric graphs have eigenvalues of high multiplicity.


Another argument proceeds using the observation that, on the one hand, $\text{tr}(A^k)$ counts the number of walks of length $k$ which begin and end at the same vertex and, on the other hand, $$\text{tr}(A^k) = \sum_{i=1}^n \lambda_i^k$$

where $\lambda_i$ are the eigenvalues. This sequence uniquely determines the eigenvalues (exercise), so to prove the desired result it suffices to prove that $$\text{tr}(A^k) = (n-1)^k + (n-1)(-1)^k.$$

Now, the total number of walks of length $k-1$ in $K_n$ is just $n(n-1)^{k-1}$, where the first $n$ counts the number of possible starting vertices and each factor of $n-1$ counts the possible next vertices. If such a walk ends at the original vertex (so is itself a closed walk of length $k-1$), it cannot be extended to a closed walk of length $k$; otherwise, it can be uniquely extended as such. Thus it follows that $$n(n-1)^{k-1} = \text{tr}(A^{k-1}) + \text{tr}(A^k)$$

and then the desired result follows by induction from the initial condition $\text{tr}(A^0) = n$.
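The closed form for $\text{tr}(A^k)$ can be checked directly for small $n$ and $k$; a sketch in NumPy:

```python
import numpy as np

n = 5
A = np.ones((n, n)) - np.eye(n)  # adjacency matrix of K_5

# tr(A^k) should equal (n-1)^k + (n-1)(-1)^k for all k >= 0.
for k in range(8):
    trace = np.trace(np.linalg.matrix_power(A, k))
    formula = (n - 1) ** k + (n - 1) * (-1) ** k
    assert np.isclose(trace, formula)
```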

Solution 2:

Note that $B = A + I$ is the $n \times n$ matrix of all $1$'s, which is $e e^T$ where $e$ is the column vector of all $1$'s. Note that $B e = n e$, so $e$ is an eigenvector of $B$ with eigenvalue $n$ (and thus an eigenvector of $A$ with eigenvalue $n-1$), while every vector orthogonal to $e$ lies in $\ker B$, so it is an eigenvector of $B$ with eigenvalue $0$ and of $A$ with eigenvalue $-1$.
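The rank-one structure $B = e e^T$ makes this easy to verify numerically (a sketch, assuming NumPy; the vector `w` is an arbitrary choice orthogonal to `e`):

```python
import numpy as np

n = 5
e = np.ones((n, 1))
B = e @ e.T        # all-ones matrix
A = B - np.eye(n)  # adjacency matrix of K_n

# e is an eigenvector of B with eigenvalue n, hence of A with eigenvalue n - 1.
print(np.allclose(B @ e, n * e))        # True
print(np.allclose(A @ e, (n - 1) * e))  # True

# Any vector orthogonal to e lies in ker B, so A w = -w.
w = np.array([[1.0], [-1.0], [0.0], [0.0], [0.0]])
print(np.allclose(B @ w, 0))   # True
print(np.allclose(A @ w, -w))  # True
```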

Solution 3:

This is too long for a comment. It isn't a complete picture, but here is a group of interesting ideas surrounding this question.

A connected graph of diameter $d$ has at least $d + 1$ distinct eigenvalues. Thus, the only connected graph that can possibly have only 2 distinct eigenvalues is the graph of diameter 1, the complete graph!
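As an illustration of the diameter bound, the path graph $P_4$ has diameter $3$ and therefore at least $4$ distinct eigenvalues (in fact exactly $4$). A sketch in NumPy:

```python
import numpy as np

# Adjacency matrix of the path graph P_4 (diameter 3).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

eigs = np.linalg.eigvalsh(A)
distinct = np.unique(np.round(eigs, 8))
print(len(distinct))  # 4, matching diameter + 1
```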

Strongly regular graphs form a class that always realizes this bound, i.e., they have exactly $d+1$ distinct eigenvalues (and no more). A complete graph is strongly regular.

All $k$-regular graphs have $k$ as an eigenvalue with multiplicity the same as the number of connected components in the graph. Thus, a connected $k$-regular graph has $k$ as an eigenvalue of multiplicity 1. And, for any other eigenvalue, $\lambda$, we have $|\lambda| \leq k$.
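The multiplicity statement can be checked on a disconnected regular graph; for example, two disjoint triangles form a $2$-regular graph with two components, so the eigenvalue $2$ should appear twice. A sketch in NumPy:

```python
import numpy as np

# Adjacency matrix of one triangle (K_3, which is 2-regular).
T = np.ones((3, 3)) - np.eye(3)

# Two disjoint triangles: a 2-regular graph with two connected components.
A = np.zeros((6, 6))
A[:3, :3] = T
A[3:, 3:] = T

eigs = np.linalg.eigvalsh(A)
# Eigenvalue k = 2 appears with multiplicity 2 (one per component),
# and every eigenvalue satisfies |lambda| <= 2.
print(np.sum(np.isclose(eigs, 2)))       # 2
print(np.max(np.abs(eigs)) <= 2 + 1e-9)  # True
```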

The first few sections of "Algebraic Graph Theory" by Biggs cover many of these ideas.