Applying Graph Theory to Linear Algebra (not the other way around)

I know about applications of Linear Algebra to Graph Theory, but I find them boring. What interests me is whether one can draw graph-like pictures of linear functions to understand them better.

Do you know of any results like that?

I have one particular question I would like to know the answer to:

Let $f : V \rightarrow V$ be a linear function and $b_1,...,b_n \in V$ a basis of $V$. Also for every $v \in V$ define $v_1,...,v_n$ so that $v_1 b_1 + ... + v_n b_n = v$. Finally let $G = (B,E)$ be the graph with $B = \{b_1,...,b_n\}$ and $E = \{ (b_i, b_j) \text{ with weight } f(b_i)_j \mid i,j \in \{1,...,n\} \}$. In words: draw a circle for every basis element and connect them so that you can see how $f$ maps the basis elements to each other.
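This construction can be sketched in a few lines of Python (the function name and the column convention for $f$'s matrix are my own): if $M$ is the matrix of $f$ in the basis, then the $j$-th coordinate of $f(b_i)$ is $M_{j,i}$, and every nonzero coordinate becomes a weighted edge.

```python
def basis_graph(M):
    """Return the edge set {(i, j): weight} with weight = f(b_i)_j, 1-indexed.

    M is the matrix of f in the basis b_1, ..., b_n, stored so that the
    j-th coordinate of f(b_i) is M[j][i] (i.e. column i holds f(b_i)).
    """
    n = len(M)
    return {(i + 1, j + 1): M[j][i]
            for i in range(n) for j in range(n)
            if M[j][i] != 0}

# Example: f maps b_1 -> 2*b_2 and b_2 -> 3*b_1, giving a 2-cycle.
M = [[0, 3],
     [2, 0]]
print(basis_graph(M))  # {(1, 2): 2, (2, 1): 3}
```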

Now delete all weights that are zero and assume the remaining weights are positive. Can we say something like: there is a cycle in $G$ if and only if $f$ has an eigenvector? To me that sounds like the Perron–Frobenius theorem.

I'm also wondering if one could prove the existence of the Jordan normal form using graphs like this. (Generalized eigenvectors would then maybe correspond to cycles connected by a tree.)

In general I feel like there should be a graph-theoretic perspective on the (basic) concepts I've seen in linear algebra. What do you think?


To build off of littleO's answer, the applications of graph theory to applied numerical linear algebra are incredibly extensive and I figured I'd add a bit more.

Associated to every $n\times n$ matrix $A$ is a graph $G$ whose vertices are $\{1,2,\ldots,n\}$ and for which $(i,j)$ is a directed edge iff $A_{ij} \ne 0$. As littleO mentioned, if $G$ is chordal, then there exists an elimination ordering such that $A$'s Cholesky factorization can be computed with no fill-in.

Even if $G$ is not chordal, understanding the graph structure of $G$ can help find much better elimination orders. Finding the best elimination order for a general graph $G$ is NP-hard. However, for certain classes of graphs, much can be said about their optimal elimination orderings based on graph-theoretic arguments. For instance, for planar graphs, Gaussian elimination on an $n\times n$ matrix requires on the order of $n^{3/2}$ operations at best (see, for instance, here and here). This involves a clever combinatorial graph-theoretic argument. Similar results hold for "higher dimensional" graphs, although this becomes more subtle.
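The fill-in produced by a given elimination order can be computed purely on the graph via the classic "elimination game": eliminating a vertex turns its remaining neighbors into a clique, and every edge added this way is fill. A minimal sketch (function name mine):

```python
def fill_in(adj, order):
    """Count fill edges created by symbolic elimination in the given order.

    adj: {vertex: set of neighbors} of an undirected graph; order: a list
    of all vertices.  Eliminating v makes its uneliminated neighbors a clique.
    """
    adj = {v: set(nbrs) for v, nbrs in adj.items()}  # work on a copy
    fill = 0
    eliminated = set()
    for v in order:
        nbrs = [u for u in adj[v] if u not in eliminated]
        for x in range(len(nbrs)):
            for y in range(x + 1, len(nbrs)):
                a, b = nbrs[x], nbrs[y]
                if b not in adj[a]:      # this edge is new: fill-in
                    adj[a].add(b)
                    adj[b].add(a)
                    fill += 1
        eliminated.add(v)
    return fill

# A 4-cycle is not chordal: any elimination order creates at least one
# fill edge (here, eliminating vertex 1 adds the chord 2-4).
cycle = {1: {2, 4}, 2: {1, 3}, 3: {2, 4}, 4: {1, 3}}
print(fill_in(cycle, [1, 2, 3, 4]))  # 1
```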

Let me rattle off a few more. Perfect matchings, bipartite graphs, and strongly connected components all play a big role in doing elimination intelligently for nonsymmetric matrices. (These slides are a nice place to start.) There are weighted bipartite matching algorithms for preconditioning. The very active area of Laplacian solvers uses graph-theoretic techniques to solve special linear systems extremely fast. There's also a very interesting area of research where graph-theoretic algorithms are modeled as matrix problems over certain semirings. (This may be more of an application of linear algebra to graph theory, but it's cool to me nonetheless.) The upshot is that graph-theoretic ideas are all over the field of numerical linear algebra, as many matrices that emerge in practice are very sparse and thus have interesting graph-theoretic structure that is essential for developing fast algorithms.


The idea of a chordal graph is useful in numerical linear algebra. If an invertible matrix has a chordal sparsity pattern, then it has a Cholesky factorization with no fill-in (so that sparsity is not lost -- the Cholesky factors are just as sparse as the original matrix).
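As a small sanity check (using NumPy; the matrix is my own example), a tridiagonal positive-definite matrix has a path as its graph, which is chordal, and in the natural ordering its Cholesky factor picks up no fill-in:

```python
import numpy as np

# A tridiagonal SPD matrix; its graph is a path, which is chordal, and the
# natural ordering yields a Cholesky factor with the same lower sparsity.
A = np.array([[ 2., -1.,  0.,  0.],
              [-1.,  2., -1.,  0.],
              [ 0., -1.,  2., -1.],
              [ 0.,  0., -1.,  2.]])

L = np.linalg.cholesky(A)  # lower-triangular factor with A = L @ L.T
# No fill-in: L is nonzero only on the diagonal and first subdiagonal.
print(np.count_nonzero(np.tril(L, -2)))  # 0
```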


Let $A\in\mathbb{R}_{\geq 0}^{n\times n}$ be a square matrix with non-negative entries. Then we can test the nilpotency of $A$ as follows:

We define $B\in\{0,1\}^{n\times n}$ by $B_{i,j}=1$ if $A_{i,j}>0$ and $B_{i,j}=0$ otherwise.
Now we interpret $B$ as the adjacency matrix of a directed graph. If the graph has no cycles, then the matrix is nilpotent.
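A minimal sketch of this test (function names mine): build the 0/1 pattern $B$ and look for a directed cycle with a depth-first search.

```python
def has_cycle(B):
    """Detect a directed cycle in the graph with 0/1 adjacency matrix B."""
    n = len(B)
    WHITE, GRAY, BLACK = 0, 1, 2     # unvisited / on DFS stack / finished
    color = [WHITE] * n

    def visit(u):
        color[u] = GRAY
        for v in range(n):
            if B[u][v]:
                # A GRAY successor means we closed a cycle.
                if color[v] == GRAY or (color[v] == WHITE and visit(v)):
                    return True
        color[u] = BLACK
        return False

    return any(color[u] == WHITE and visit(u) for u in range(n))

def is_nilpotent(A):
    """Nilpotency test for a non-negative matrix A via its pattern graph."""
    B = [[1 if x > 0 else 0 for x in row] for row in A]
    return not has_cycle(B)

A = [[0, 5, 0],
     [0, 0, 2],
     [0, 0, 0]]          # strictly upper triangular: acyclic, nilpotent
print(is_nilpotent(A))   # True
```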

Why does it work?
Let $G$ be the directed graph with adjacency matrix $B$, so that $B_{i,j}=1$ iff there is an edge from $i$ to $j$ in $G$.
Then $(B^k)_{i,j}$ counts the number of walks from $i$ to $j$ in $G$ of length exactly $k$.

A walk of length $n$ in a graph on $n$ vertices must visit some vertex twice and hence contains a cycle. So if $G$ has no cycles, there are no walks of length $n$ at all, and $B^n = 0$: a graph without cycles has a nilpotent adjacency matrix.

This result also holds if we give each edge in $G$ a weight in $\mathbb R$, though now $(B^k)_{i,j}$ is the sum, over all walks from $i$ to $j$ of length $k$, of the products of the edge weights along each walk.
If all weights are in $\mathbb R_+$, this sum can only be zero when there is no such walk, so the same argument goes through.
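The weighted statement can be checked directly on a small example (plain Python, names mine):

```python
# (B^k)[i][j] sums, over all length-k walks i -> j, the product of the
# edge weights along the walk; an acyclic graph on n vertices has no
# walks of length n, so B^n = 0.

def matmul(X, Y):
    """Product of two square matrices given as nested lists."""
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

B = [[0.0, 2.0, 0.0],
     [0.0, 0.0, 3.0],
     [0.0, 0.0, 0.0]]    # edges 1->2 (weight 2) and 2->3 (weight 3)

B2 = matmul(B, B)
print(B2[0][2])          # 6.0: the single walk 1->2->3 has weight 2*3
B3 = matmul(B2, B)
print(all(x == 0 for row in B3 for x in row))  # True: B^3 = 0
```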