Strictly diagonally dominant matrices are non-singular

I am trying to find a good proof of the invertibility of strictly diagonally dominant matrices (defined by $|m_{ii}|>\sum_{j\ne i}|m_{ij}|$). There is a proof of this in this paper, but I'm wondering whether there is a better proof, for instance one using determinants, to show that the matrix is non-singular.


Solution 1:

The proof in the PDF (Theorem 1.1) is very elementary. The crux of the argument is that if $M$ is strictly diagonally dominant and singular, then there exists a vector $u \neq 0$ with $$Mu = 0.$$

Since $u \neq 0$, it has some entry $u_i \neq 0$ of largest magnitude, i.e. $|u_i| \geq |u_j|$ for all $j$. Then

\begin{align*} \sum_j m_{ij} u_j &= 0\\ m_{ii} u_i &= -\sum_{j\neq i} m_{ij}u_j\\ m_{ii} &= -\sum_{j\neq i} \frac{u_j}{u_i}m_{ij}\\ |m_{ii}| &\leq \sum_{j\neq i} \left|\frac{u_j}{u_i}\right||m_{ij}|\\ |m_{ii}| &\leq \sum_{j\neq i} |m_{ij}|, \end{align*} where the last step uses $|u_j| \leq |u_i|$ for all $j$. This contradicts strict diagonal dominance.

I'm skeptical you will find a significantly more elementary proof. Incidentally, though, the Gershgorin circle theorem (also described in your PDF) is very beautiful and gives geometric intuition for why no eigenvalue can be zero.
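Not a substitute for the proof, but if you want to see both statements in action numerically, here is a small sketch (using NumPy; the helper name `random_sdd` is mine, purely for illustration). It builds a random strictly diagonally dominant matrix, checks that every Gershgorin disc stays away from the origin, and checks that the matrix is numerically non-singular.

```python
import numpy as np

def random_sdd(n, rng):
    """Random strictly diagonally dominant matrix: keep random off-diagonal
    entries and make each diagonal entry exceed its off-diagonal row sum."""
    M = rng.standard_normal((n, n))
    off_sums = np.sum(np.abs(M), axis=1) - np.abs(np.diag(M))
    np.fill_diagonal(M, off_sums + 1.0)   # |m_ii| > sum_{j != i} |m_ij|
    return M

rng = np.random.default_rng(0)
M = random_sdd(6, rng)

# Gershgorin: every eigenvalue lies in a disc centered at m_ii with radius
# sum_{j != i} |m_ij|; strict dominance keeps 0 outside every such disc.
radii = np.sum(np.abs(M), axis=1) - np.abs(np.diag(M))
print(np.all(np.abs(np.diag(M)) > radii))    # True: no disc contains 0
print(np.min(np.abs(np.linalg.eigvals(M))))  # smallest |eigenvalue| is positive
```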

Solution 2:

I would prove it a bit tangentially, not because it will be simpler, but because it gives an excuse to show an application. I would take an iterative method, like Jacobi's, show that it converges in this case, and show that it converges to a unique solution. This, incidentally, implies the matrix is non-singular.

How does it work exactly?

For the system $Ax=b$, Jacobi's method consists in writing $A=D+R$, where $D$ is the diagonal part of $A$ and $R$ has zeros on the diagonal. Note that $D$ is invertible, since strict diagonal dominance forces every diagonal entry of $A$ to be nonzero. Then you define the recurrence

$$x_{n+1}=D^{-1}(b-Rx_{n}).$$
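In code the recurrence is only a few lines. A minimal sketch, assuming NumPy and a dense matrix (the function name `jacobi` is mine, not a library routine):

```python
import numpy as np

def jacobi(A, b, x0=None, iters=100):
    """Jacobi iteration x_{n+1} = D^{-1}(b - R x_n) for the splitting A = D + R."""
    d = np.diag(A)                     # diagonal of A; D = diag(d)
    R = A - np.diag(d)                 # off-diagonal part, zero diagonal
    x = np.zeros_like(b, dtype=float) if x0 is None else x0.astype(float)
    for _ in range(iters):
        x = (b - R @ x) / d            # dividing entrywise by d applies D^{-1}
    return x
```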

Now we can show that it converges.

Unrolling the recurrence gives $x_n=(-D^{-1}R)^n x_0+\sum_{k=0}^{n-1}(-D^{-1}R)^k D^{-1}b$, so for $m>n$ we have

\begin{align} ||x_m-x_n||&=\left|\left|\sum_{k=n}^{m-1}(-D^{-1}R)^{k}D^{-1}b+\left((-D^{-1}R)^{m}-(-D^{-1}R)^{n}\right)x_0\right|\right|\\ &\leq\sum_{k=n}^{m-1}||D^{-1}R||^k\,||D^{-1}b||+\left(||D^{-1}R||^m+||D^{-1}R||^n\right)||x_0||. \end{align}

For the norm $||\cdot||:=||\cdot||_{\infty}$, the induced matrix norm equals the maximum over the rows of the sum of the absolute values of the entries in that row. The $i$-th row sum of $D^{-1}R$ is $\sum_{j\neq i}|a_{ij}|/|a_{ii}|$, which is strictly less than $1$ by strict diagonal dominance, so $||D^{-1}R||<1$. For this reason the right-hand side above can be made as small as you want by taking $n,m$ large, so the sequence $(x_n)$ is Cauchy and therefore converges.
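To make the bound concrete, you can compute the row sums of $D^{-1}R$ directly. A small sketch with one particular strictly diagonally dominant matrix (chosen only for illustration; any other choice behaves the same way):

```python
import numpy as np

A = np.array([[ 4.0, -1.0,  1.0],
              [ 1.0,  5.0,  2.0],
              [-1.0,  2.0,  6.0]])   # strictly diagonally dominant

D = np.diag(np.diag(A))
R = A - D
T = np.linalg.solve(D, R)            # T = D^{-1} R

# The induced infinity norm is the maximum absolute row sum; each row sum of T
# is (sum_{j != i} |a_ij|) / |a_ii| < 1 by strict diagonal dominance.
print(np.max(np.sum(np.abs(T), axis=1)))   # 0.6 for this matrix, below 1
```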

It is clear, too, that the limit of $x_n$ is itself a solution: passing to the limit in $x_{n+1}=D^{-1}(b-Rx_n)$ gives $x^*=D^{-1}(b-Rx^*)$, i.e. $Ax^*=b$. Moreover, $x_n$ converges to every solution of the system: repeat the argument above with a solution $x$ in place of $x_m$, using $Ax=b$, i.e. $x=D^{-1}(b-Rx)$, to get $||x-x_n||\leq||D^{-1}R||^n\,||x-x_0||\to 0$. Since a convergent sequence has only one limit, the system $Ax=b$ has exactly one solution for every $b$, and therefore the matrix is non-singular.
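Putting the pieces together, here is a quick self-contained check (same illustrative matrix as above) that the Jacobi iterates approach the unique solution found by a direct solver:

```python
import numpy as np

A = np.array([[ 4.0, -1.0,  1.0],
              [ 1.0,  5.0,  2.0],
              [-1.0,  2.0,  6.0]])   # strictly diagonally dominant
b = np.array([1.0, 2.0, 3.0])

d = np.diag(A)
R = A - np.diag(d)
x = np.zeros(3)
for _ in range(60):                  # x_{n+1} = D^{-1}(b - R x_n)
    x = (b - R @ x) / d

x_star = np.linalg.solve(A, b)       # reference solution
print(np.max(np.abs(x - x_star)))    # essentially zero: the iterates converge to it
```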