Is there a fast way to prove a tridiagonal matrix is positive definite?

I'm trying to prove that

$$A=\begin{pmatrix} 4 & 2 & 0 & 0 & 0 \\ 2 & 5 & 2 & 0 & 0 \\ 0 & 2 & 5 & 2 & 0 \\ 0 & 0 & 2 & 5 & 2 \\ 0 & 0 & 0 & 2 & 5 \\ \end{pmatrix}$$

admits a Cholesky decomposition.

$A$ is symmetric, so it admits a Cholesky decomposition iff it is positive definite. The only methods I know for checking this are:

  1. $X^t A X > 0, \quad \forall X \in \mathbb{K}^n \setminus \{0\}$.
  2. If $\lambda$ is an eigenvalue of $A$, then $\lambda>0.$

I have failed to prove it using 1, and using 2 is taking me a lot of time. Is there any easier way to do this, given that $A$ is tridiagonal?


Notice that $A$ can be written as a sum of five matrices. $$A = \left[\begin{smallmatrix} 2 & 0 & 0 & 0 & 0\\ 0 & 1 & 0 & 0 & 0\\ 0 & 0 & 1 & 0 & 0\\ 0 & 0 & 0 & 1 & 0\\ 0 & 0 & 0 & 0 & 3 \end{smallmatrix}\right] + \left[\begin{smallmatrix} 2 & 2 & 0 & 0 & 0\\ 2 & 2 & 0 & 0 & 0\\ 0 & 0 & 0 & 0 & 0\\ 0 & 0 & 0 & 0 & 0\\ 0 & 0 & 0 & 0 & 0 \end{smallmatrix}\right] + \left[\begin{smallmatrix} 0 & 0 & 0 & 0 & 0\\ 0 & 2 & 2 & 0 & 0\\ 0 & 2 & 2 & 0 & 0\\ 0 & 0 & 0 & 0 & 0\\ 0 & 0 & 0 & 0 & 0 \end{smallmatrix}\right] + \left[\begin{smallmatrix} 0 & 0 & 0 & 0 & 0\\ 0 & 0 & 0 & 0 & 0\\ 0 & 0 & 2 & 2 & 0\\ 0 & 0 & 2 & 2 & 0\\ 0 & 0 & 0 & 0 & 0 \end{smallmatrix}\right] + \left[\begin{smallmatrix} 0 & 0 & 0 & 0 & 0\\ 0 & 0 & 0 & 0 & 0\\ 0 & 0 & 0 & 0 & 0\\ 0 & 0 & 0 & 2 & 2\\ 0 & 0 & 0 & 2 & 2 \end{smallmatrix}\right] $$ The first matrix is diagonal with positive diagonal entries, so it is positive definite. Each of the remaining four matrices is $2vv^T$ for a vector $v$ with two consecutive entries equal to $1$ and zeros elsewhere, so they are positive semi-definite. Being the sum of a positive definite matrix and positive semi-definite matrices, $A$ is itself positive definite.
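In case a numerical double-check is welcome, here is a minimal sketch of this decomposition, assuming `numpy` (the variable names are my own):

```python
import numpy as np

A = np.array([[4, 2, 0, 0, 0],
              [2, 5, 2, 0, 0],
              [0, 2, 5, 2, 0],
              [0, 0, 2, 5, 2],
              [0, 0, 0, 2, 5]], dtype=float)

# Positive definite diagonal part.
D = np.diag([2.0, 1.0, 1.0, 1.0, 3.0])

# The four PSD blocks: each is 2 * v v^T with v the 0/1 indicator of
# two consecutive coordinates, hence rank one and positive semi-definite.
S = np.zeros((5, 5))
for k in range(4):
    v = np.zeros(5)
    v[k] = v[k + 1] = 1.0
    S += 2.0 * np.outer(v, v)

assert np.allclose(D + S, A)
print(np.linalg.eigvalsh(A))  # all five eigenvalues are strictly positive
```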


From Gershgorin’s disc theorem (more commonly called the Gershgorin circle theorem) and the fact that the matrix is symmetric, hence has real eigenvalues, you get that the eigenvalues lie in the interval $[1,9]$ and are therefore positive: each disc is centered at a diagonal entry $a_{ii}$ with radius $R_i = \sum_{j \neq i} \lvert a_{ij} \rvert$, and the smallest left endpoint over all rows is $5 - 4 = 1$.
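A small sketch of the disc computation, assuming `numpy`:

```python
import numpy as np

A = np.array([[4, 2, 0, 0, 0],
              [2, 5, 2, 0, 0],
              [0, 2, 5, 2, 0],
              [0, 0, 2, 5, 2],
              [0, 0, 0, 2, 5]], dtype=float)

# Row i contributes the disc [a_ii - R_i, a_ii + R_i], where R_i is the
# sum of the absolute off-diagonal entries in that row.
radii = np.abs(A).sum(axis=1) - np.abs(np.diag(A))
print(np.diag(A) - radii)  # left endpoints:  [2. 1. 1. 1. 3.]
print(np.diag(A) + radii)  # right endpoints: [6. 9. 9. 9. 7.]
```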


This particular matrix is symmetric diagonally dominant (SDD), meaning that in each row the sum of the absolute values of the off-diagonal entries does not exceed the absolute value of the diagonal entry, i.e. $$ \sum_{\substack{j\in[1,n] \\ j \neq i}} \lvert a_{ij} \rvert \leq \lvert a_{ii} \rvert \quad \text{for every } i.$$ Since the diagonal entries are positive, $A$ is positive semidefinite; it is in fact positive definite because the inequality above is strict in every row. To see this, split $A = A' + \varepsilon I$ with $\varepsilon > 0$ chosen so that $A'$ is still SDD (here $\varepsilon = 1$ works), and therefore PSD. Then for any $x \neq 0$, $$x^T A x = x^T A' x + \varepsilon \lVert x \rVert^2 > 0,$$ since $\varepsilon I$ is positive definite.
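The dominance check is easy to script as well; a plain-Python sketch with the matrix hard-coded for illustration:

```python
A = [[4, 2, 0, 0, 0],
     [2, 5, 2, 0, 0],
     [0, 2, 5, 2, 0],
     [0, 0, 2, 5, 2],
     [0, 0, 0, 2, 5]]

for i, row in enumerate(A):
    off = sum(abs(x) for j, x in enumerate(row) if j != i)
    # Positive diagonal and *strict* dominance in every row.
    assert row[i] > 0 and off < row[i]
```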


Another approach specific to symmetric tridiagonal matrices is to use the following recurrence. Let $a_i$ be the entries of the main diagonal, and let $b_i$ be the entries of the off-diagonal. Then $$ \begin{align*} d_{-1} &= 0 \\ d_0 &= 1 \\ d_n &= a_n d_{n-1} - b_{n-1}^2 d_{n-2} \end{align*} $$ gives $d_n$ as the determinant of the $n \times n$ leading principal minor. In your case, this is: $$\begin{align*} d_1 &= 4\cdot 1 - 0^2 \cdot 0 = 4 \\ d_2 &= 5\cdot 4 - 2^2 \cdot 1 = 16 \\ d_3 &= 5\cdot 16 - 2^2 \cdot 4 = 64 \\ d_4 &= 5\cdot 64 - 2^2 \cdot 16 = 256 \\ d_5 &= 5\cdot 256 - 2^2 \cdot 64 = 1024 \\ \end{align*}$$ Since they're all positive, Sylvester's criterion shows that the matrix is positive definite.
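For reference, a minimal Python sketch of this linear-time recurrence (the function name `leading_minors` is mine, not from any library):

```python
def leading_minors(a, b):
    """Determinants of the leading principal minors of the symmetric
    tridiagonal matrix with main diagonal a and off-diagonal b, in O(n)."""
    d_prev, d = 0.0, 1.0  # d_{-1} and d_0
    out = []
    for i in range(len(a)):
        d_prev, d = d, a[i] * d - (b[i - 1] ** 2 if i > 0 else 0.0) * d_prev
        out.append(d)
    return out

print(leading_minors([4, 5, 5, 5, 5], [2, 2, 2, 2]))
# [4.0, 16.0, 64.0, 256.0, 1024.0]
```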

Since you asked for a "fast" way, I'll note that this recurrence takes linear time to execute on a computer. A way that needs even fewer operations on paper is to divide each determinant by the previous one, which turns the recurrence into $s_n = a_n - b^2_{n-1} / s_{n-1}$ with $s_0 = 1$ and $b_0 = 0$ (so $s_1 = a_1$ and $s_n = d_n / d_{n-1}$). If every $s_n$ is positive, then each $d_n = s_n d_{n-1}$ is positive by induction, and Sylvester's criterion applies as before; for your matrix, $s_n = 4$ for every $n$.
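A sketch of this quotient version under the same assumptions (hypothetical helper name `tridiag_pivots`); these $s_n$ are exactly the pivots of the $LDL^T$ factorization, which is one square root per pivot away from the Cholesky factorization the question asks about:

```python
def tridiag_pivots(a, b):
    """Pivots s_n = a_n - b_{n-1}^2 / s_{n-1} of the symmetric tridiagonal
    matrix with main diagonal a and off-diagonal b; all positive iff
    the matrix is positive definite."""
    s = []
    for i in range(len(a)):
        s.append(a[i] - (b[i - 1] ** 2 / s[-1] if i > 0 else 0.0))
        if s[-1] <= 0:
            return s  # not positive definite; stop early
    return s

print(tridiag_pivots([4, 5, 5, 5, 5], [2, 2, 2, 2]))
# [4.0, 4.0, 4.0, 4.0, 4.0]
```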