Why Aren't "Similar" Matrices Actually the Same?
In linear algebra, a matrix $B$ is said to be "similar" to $A$ if $B=C^{-1}AC$ for some invertible matrix $C$; that is, $B$ is $A$ multiplied on the right by a matrix $C$ and on the left by its inverse $C^{-1}$.
In ordinary algebra, if I take a number $x$ and multiply it by $\frac{1}{2}$ and then by $2$, the two factors cancel and I get back $x$ itself, not merely something "similar" to $x$. Wouldn't the same thing happen in linear algebra? What am I missing?
Solution 1:
Matrix multiplication is not commutative in general: it corresponds to composition of functions (linear maps), and function composition is clearly not commutative. So $C^{-1}AC$ cannot be rearranged into $C^{-1}CA = A$, and the cancellation that works for numbers fails.
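To see the failure concretely, here is a small worked example (the matrices are chosen arbitrarily for illustration). With
$$A=\begin{pmatrix}1&1\\0&1\end{pmatrix},\qquad C=\begin{pmatrix}1&0\\1&1\end{pmatrix},\qquad C^{-1}=\begin{pmatrix}1&0\\-1&1\end{pmatrix},$$
we get
$$C^{-1}AC=\begin{pmatrix}1&0\\-1&1\end{pmatrix}\begin{pmatrix}1&1\\0&1\end{pmatrix}\begin{pmatrix}1&0\\1&1\end{pmatrix}=\begin{pmatrix}2&1\\-1&0\end{pmatrix}\neq A,$$
precisely because $AC\neq CA$. If $A$ and $C$ did commute, the cancellation would go through: $C^{-1}AC=C^{-1}CA=A$.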
Solution 2:
Two matrices are similar if they represent the same linear transformation, just written with respect to different bases. For example, if I take a linear transformation $T: \mathbb{R}^3\rightarrow \mathbb{R}^3$ and write down the images of, say, $(1, 0, 0)$, $(0, 1, 0)$ and $(0, 0, 1)$, then I will get a matrix, $M_1$. However, if I instead use another basis, say $(1, 2, 3)$, $(0, 1, 0)$ and $(0, 0, 1)$, I will get a different matrix, $M_2$.
Crucially, $M_1$ and $M_2$ are similar: one can be obtained from the other by conjugation, i.e. $M_2 = P^{-1}M_1P$ for some invertible matrix $P$. So, interestingly, they represent the same linear transformation, yet they are different matrices.
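Here is a minimal $2\times 2$ sketch of the same phenomenon (the transformation and bases below are chosen just for illustration, not the ones from the paragraph above). Let $T:\mathbb{R}^2\to\mathbb{R}^2$ be the reflection $T(x,y)=(y,x)$. In the standard basis $e_1=(1,0)$, $e_2=(0,1)$ it is represented by
$$M_1=\begin{pmatrix}0&1\\1&0\end{pmatrix},$$
while in the basis $b_1=(1,1)$, $b_2=(1,-1)$ we have $T(b_1)=b_1$ and $T(b_2)=-b_2$, so
$$M_2=\begin{pmatrix}1&0\\0&-1\end{pmatrix}.$$
The conjugating matrix is the change-of-basis matrix $P$ whose columns are $b_1$ and $b_2$:
$$P=\begin{pmatrix}1&1\\1&-1\end{pmatrix},\qquad P^{-1}M_1P=\begin{pmatrix}1&0\\0&-1\end{pmatrix}=M_2.$$
Same transformation, two different (but similar) matrices.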
As for why $C^{-1}AC$ does not simply collapse back to $A$: matrix multiplication is not commutative in general. Take any two $2\times 2$ matrices $A$ and $B$ at random and, chances are, $AB\neq BA$.
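For instance (a pair picked purely to exhibit the failure):
$$A=\begin{pmatrix}0&1\\0&0\end{pmatrix},\quad B=\begin{pmatrix}0&0\\1&0\end{pmatrix},\qquad AB=\begin{pmatrix}1&0\\0&0\end{pmatrix}\neq\begin{pmatrix}0&0\\0&1\end{pmatrix}=BA.$$
Whenever $A$ and $C$ fail to commute in this way, the number-style cancellation $C^{-1}AC=A$ is exactly what breaks down.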