How do I find the upper triangular form of a given 3 by 3 matrix?
We are asked to find an invertible matrix $P$ and an upper triangular matrix $U$ such that:
$P^{-1}\begin{pmatrix} 3 & -1 & 1 \\ 2 & 0 & 0 \\ -1 & 1 & 3 \end{pmatrix}P=U$
I'm a bit stuck.
I found the characteristic polynomial of the matrix; it is $(-x+2)^3$, so the only eigenvalue is $2$.
The problem comes with the eigenvectors: I could only find one, $\begin{pmatrix} 1 \\ 1\\ 0 \end{pmatrix}$. How do I go on from here? The zero vector is not an eigenvector, and even if it were,
the matrix $\begin{pmatrix} 1 & 0 & 0\\ 1 & 0 & 0\\ 0 & 0 & 0 \end{pmatrix}$ is singular, so it can't be the $P$ I'm looking for.
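For reference, a quick sympy sketch (assuming sympy is available) that reproduces both of these findings:

```python
# Sanity check: characteristic polynomial and eigenspace of A for eigenvalue 2.
from sympy import Matrix, symbols

x = symbols('x')
A = Matrix([[3, -1, 1],
            [2,  0, 0],
            [-1, 1, 3]])

print(A.charpoly(x).as_expr().factor())     # (x - 2)**3, i.e. (2 - x)^3 up to sign: only eigenvalue is 2
print((A - 2 * Matrix.eye(3)).nullspace())  # a single basis vector: Matrix([[1], [1], [0]])
```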
If you don't know about the Jordan normal form (JNF), here's another process which generalizes easily. What I'm doing is simply following the constructive proof of the Schur decomposition (link provided below). Every now and then I'll choose somewhat arbitrary vectors that satisfy certain properties. There are infinitely many choices for these vectors, so you should pick your own while mimicking my answer. The $U$ and $P$ you end up with will probably be different.
Let $A=\begin{bmatrix} 3 & -1 & 1 \\ 2 & 0 & 0 \\ -1 & 1 & 3 \end{bmatrix}$.
You got only one eigenvector, namely $v_1:=\begin{bmatrix} 1 & 1 & 0\end{bmatrix}^T$.
Consider $P_1:=\begin{bmatrix} v_1 \mid v_2 \mid v_3\end{bmatrix}$, given by its columns. You want an invertible $P$, so just let $v_2, v_3$ be such that $P_1$ is invertible. An easy choice is $v_2:=\begin{bmatrix} 1 & -1 & 0\end{bmatrix}^T$ and $v_3:=\begin{bmatrix} 0 & 0 & 1\end{bmatrix}^T$: it's easy to see that $P_1$ is invertible because its columns are nonzero and pairwise orthogonal.
This yields $P_1^{-1}AP_1=\begin{bmatrix} 2 & 3 & 1/2 \\ 0 & 1 & 1/2 \\ 0 & -2 & 3 \end{bmatrix}$, which is not yet upper triangular. Let $B:=P_1^{-1}AP_1$.
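If you'd like to double-check this step (or your own choice of $v_2, v_3$) without inverting $P_1$ by hand, here is a minimal sympy sketch, assuming sympy is available:

```python
# Checking the first similarity step B = P1^(-1) * A * P1 with exact arithmetic.
from sympy import Matrix

A  = Matrix([[3, -1, 1], [2, 0, 0], [-1, 1, 3]])
P1 = Matrix([[1, 1, 0], [1, -1, 0], [0, 0, 1]])   # columns: v1, v2, v3

B = P1.inv() * A * P1
print(B)   # Matrix([[2, 3, 1/2], [0, 1, 1/2], [0, -2, 3]])
```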
Suppose for a moment that there exist an invertible matrix $P_2$ and an upper triangular matrix $T$ such that $P_2^{-1}BP_2=T$. This would yield $B=P_2TP_2^{-1}$ and $P_2TP_2^{-1}=P_1^{-1}AP_1$, thus giving $T=(P_2^{-1}P_1^{-1})A(P_1P_2)$.
So let's (try to) triangularize $B$.
Repeating the process wouldn't help, so let's instead try to triangularize the lower-right $2\times 2$ block of $B$, namely $\color{grey}{B_1:=}\begin{bmatrix} 1 & 1/2 \\ -2 & 3\end{bmatrix}$. (Why? See the Schur decomposition theorem's proof by induction here (page 12).)
It's easy to check that $\left(2, \begin{bmatrix} 1\\ 2\end{bmatrix}\right)$ is an eigenpair of $B_1$, and that every eigenvector of $B_1$ is a multiple of this one.
Define $P_{B_1}:=\begin{bmatrix}1 & -2\\2 & 1\end{bmatrix}$. The second column was chosen just to make $P_{B_1}$ invertible. There are, of course, other possibilities.
Then $P_{B_1}^{-1}B_1P_{B_1}=\begin{bmatrix}2 & 5/2\\ 0 & 2 \end{bmatrix}$.
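Again, a small sympy check of this $2\times 2$ step, if you want one (`B1` and `PB1` are just the matrices above):

```python
# The 2x2 step: Rational keeps the entry 1/2 exact.
from sympy import Matrix, Rational

B1  = Matrix([[1, Rational(1, 2)], [-2, 3]])
PB1 = Matrix([[1, -2], [2, 1]])

print(PB1.inv() * B1 * PB1)   # Matrix([[2, 5/2], [0, 2]])
```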
Now it's possible to construct the aforementioned $P_2$. Let $P_2=\begin{bmatrix} 1 & 0 & 0\\ 0 & 1 & -2\\ 0 & 2 & 1\end{bmatrix}$.
Block multiplication assures $P_2$ does the job.
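Spelled out: in block form $P_2=\begin{bmatrix}1 & 0\\ 0 & P_{B_1}\end{bmatrix}$, and writing $b^T:=\begin{bmatrix}3 & 1/2\end{bmatrix}$ for the part of the first row of $B$ to the right of the $(1,1)$ entry,
$$P_2^{-1}BP_2=\begin{bmatrix}1 & 0\\ 0 & P_{B_1}^{-1}\end{bmatrix}\begin{bmatrix}2 & b^T\\ 0 & B_1\end{bmatrix}\begin{bmatrix}1 & 0\\ 0 & P_{B_1}\end{bmatrix}=\begin{bmatrix}2 & b^TP_{B_1}\\ 0 & P_{B_1}^{-1}B_1P_{B_1}\end{bmatrix},$$
so the first column stays $\begin{bmatrix}2 & 0 & 0\end{bmatrix}^T$ and the lower-right block is the upper triangular matrix just computed.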
Indeed $P_2^{-1}P_1^{-1}AP_1P_2=P_2^{-1}BP_2=\begin{bmatrix} 2 & 4 & -11/2\\ 0 & 2 & 5/2\\ 0 & 0 & 2\end{bmatrix}$.
So just let $U:=\begin{bmatrix} 2 & 4 & -11/2\\ 0 & 2 & 5/2\\ 0 & 0 & 2\end{bmatrix}$ and $P:=P_1P_2\color{grey}{=\begin{bmatrix} 1 & 1 & -2\\ 1 & -1 & 2\\ 0 & 2 & 1\end{bmatrix}}$.
Let's confirm it works. First find $P^{-1}=\begin{bmatrix}1/2 & 1/2 & 0\\ 1/10 & -1/10 & 2/5\\ -1/5& 1/5 & 1/5 \end{bmatrix}$.
Then $$\begin{align} P^{-1}AP&=\begin{bmatrix}1/2 & 1/2 & 0\\ 1/10 & -1/10 & 2/5\\ -1/5& 1/5 & 1/5 \end{bmatrix}\begin{bmatrix} 3 & -1 & 1 \\ 2 & 0 & 0 \\ -1 & 1 & 3 \end{bmatrix}\begin{bmatrix} 1 & 1 & -2\\ 1 & -1 & 2\\ 0 & 2 & 1\end{bmatrix}\\ &=\begin{bmatrix} 5/2 & -1/2 & 1/2\\-3/10 & 3/10 & 13/10\\ -2/5 & 2/5 & 2/5\end{bmatrix}\begin{bmatrix} 1 & 1 & -2\\ 1 & -1 & 2\\ 0 & 2 & 1\end{bmatrix}\\ &=\begin{bmatrix} 2 & 4 & -11/2\\ 0 & 2 & 5/2\\ 0 & 0 & 2\end{bmatrix}.\end{align}$$
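Finally, here is a short sympy sketch that reproduces the whole chain. If you made different choices of $v_2$, $v_3$, or of the second column of $P_{B_1}$, your $P$ and $U$ will differ, but $U$ should still come out upper triangular:

```python
# End-to-end check that P = P1 * P2 triangularizes A.
from sympy import Matrix

A  = Matrix([[3, -1, 1], [2, 0, 0], [-1, 1, 3]])
P1 = Matrix([[1, 1, 0], [1, -1, 0], [0, 0, 1]])
P2 = Matrix([[1, 0, 0], [0, 1, -2], [0, 2, 1]])

P = P1 * P2
U = P.inv() * A * P
print(P)            # Matrix([[1, 1, -2], [1, -1, 2], [0, 2, 1]])
print(U)            # Matrix([[2, 4, -11/2], [0, 2, 5/2], [0, 0, 2]])
print(U.is_upper)   # True
```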