Is every invertible matrix a change of basis matrix?

In the course I am taking, we treat a change of basis matrix as the matrix of the identity operator from one basis S to another basis, say B.

Our instructor introduced the following theorem:

If $A$ and $B$ are similar, i.e., $S^{-1}AS = B$ for an invertible matrix $S$, then they have the same characteristic polynomial. In particular, they have the same eigenvalues, $\det(A) = \det(B)$, and $\operatorname{tr}(A) = \operatorname{tr}(B)$.
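(For a quick numerical sanity check of this theorem — a sketch added here, not part of the course material — one can compare the invariants of a random matrix and a similarity transform of it:

```python
# Sanity check: B = S^{-1} A S should share eigenvalues, det, and trace with A.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
S = rng.standard_normal((3, 3))            # almost surely invertible
B = np.linalg.solve(S, A @ S)              # B = S^{-1} A S

print(np.allclose(np.sort_complex(np.linalg.eigvals(A)),
                  np.sort_complex(np.linalg.eigvals(B))))  # True
print(np.isclose(np.linalg.det(A), np.linalg.det(B)))      # True
print(np.isclose(np.trace(A), np.trace(B)))                # True
```

)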

Hence the question: is every invertible matrix a change of basis matrix? Every change of basis matrix is certainly invertible; is the converse true?


Solution 1:

Yes. If $S$ is an invertible matrix, then its columns form the new basis: for the standard basis $e_1,\dots,e_n$, the vector $Se_k$ is the $k$th column of $S$.

Say $b_k:=Se_k$. Then for a linear transformation $A$, the value $ASe_k$ is the image of $b_k$ under $A$, and for a vector $v$ (given in standard coordinates), $S^{-1}v$ gives its coordinates in the basis $(b_1,\dots,b_n)$: if $v=\lambda_1 b_1+\dots+\lambda_n b_n$, then, as the columns of $S$ are just the $b_k$'s, this equation becomes $v=S\cdot\pmatrix{\lambda_1\\ \vdots\\ \lambda_n}$, so $\pmatrix{\lambda_1\\ \vdots\\ \lambda_n}=S^{-1}v$.
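A small numerical illustration of this (a sketch with made-up numbers, added here):

```python
# The columns of an invertible S are the new basis, and S^{-1} v recovers
# the coordinates of v in that basis.
import numpy as np

S = np.array([[1.0, 1.0],
              [0.0, 2.0]])        # invertible; columns are b_1, b_2
lam = np.array([3.0, -1.0])       # chosen coordinates lambda_1, lambda_2
v = S @ lam                       # v = lambda_1 * b_1 + lambda_2 * b_2

print(np.linalg.solve(S, v))     # [ 3. -1.]  -- the coordinates, recovered
```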

Solution 2:

Yes, it is. Any invertible matrix $\mathbf{A}$ is the change of basis matrix from the basis formed by the columns of $\mathbf{A}$ (which is a basis because $\mathbf{A}$ is invertible) to the canonical basis.
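As a concrete illustration (a made-up example, not from the original answer): take
$$\mathbf{A}=\begin{pmatrix}2 & 1\\ 1 & 1\end{pmatrix},\qquad b_1=\begin{pmatrix}2\\1\end{pmatrix},\quad b_2=\begin{pmatrix}1\\1\end{pmatrix}.$$
A vector with coordinates $(\lambda_1,\lambda_2)$ in the basis $(b_1,b_2)$ has canonical coordinates $\mathbf{A}\begin{pmatrix}\lambda_1\\ \lambda_2\end{pmatrix}=\lambda_1 b_1+\lambda_2 b_2$, which is exactly the statement that $\mathbf{A}$ changes coordinates from its column basis to the canonical basis.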

Solution 3:

Yes. In fact, the following is true: if $P$ is an invertible $n\times n$ matrix with coefficients in some field $\mathbb{K}$ and $\beta=(b_1,\dots,b_n)$ is a basis for an $n$-dimensional vector space $V$ over $\mathbb{K}$, then

1) There is a basis $\alpha=(a_1,\dots,a_n)$ for $V$ such that $P$ is the change of basis matrix from $\beta$ to $\alpha$.

2) There is a basis $\alpha'=(a_1',\dots,a_n')$ for $V$ such that $P$ is the change of basis matrix from $\alpha'$ to $\beta$.

Let's prove this, assuming that $V=\mathbb{K}^n$ (the results carry over to the general case via the isomorphism between $V$ and $\mathbb{K}^n$).

Ad 1)
Assuming $V=\mathbb{K}^n$, let $B$ be the (invertible) matrix with columns $b_1,\dots,b_n$. As $P$ is also invertible, we can define a matrix $A$ by $A=BP^{-1}$. As $A$ is invertible, its columns $a_1,\dots,a_n$ constitute a basis $\alpha$ for $V$. We claim that this is the required basis, i.e. that $P$ is the change of basis matrix from $\beta$ to $\alpha$. To show this, we must show that the columns of $P$ are the basis vectors $b_j$ expressed in the basis $\alpha$, i.e. that $b_j=\sum_{i=1}^{n} P_{ij}a_i$ for $j=1,\dots,n$. This is indeed the case, as

$$\sum_{i=1}^{n} P_{ij}a_i=A \begin{bmatrix}P_{1j}\\ \vdots \\P_{nj}\end{bmatrix}=BP^{-1}\begin{bmatrix}P_{1j}\\ \vdots \\P_{nj}\end{bmatrix}=B e_j=b_j,$$ where $e_j$ is the $j$th element of the standard basis for $\mathbb{K}^n$.
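One can also verify this construction numerically (a sketch with random data, added here; in matrix form, the claim $b_j=\sum_{i} P_{ij}a_i$ for all $j$ reads $B=AP$):

```python
# Given invertible B (columns = basis beta) and invertible P, set A = B P^{-1};
# then B == A P, i.e. b_j = sum_i P_ij a_i for every column j.
import numpy as np

rng = np.random.default_rng(1)
n = 4
B = rng.standard_normal((n, n))   # columns are the basis beta (a.s. invertible)
P = rng.standard_normal((n, n))   # the given invertible matrix
A = B @ np.linalg.inv(P)          # columns are the constructed basis alpha

print(np.allclose(A @ P, B))      # True
```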

Ad 2)
For $j=1,\dots,n$, let $a_j'=\sum_{i=1}^{n} P_{ij}b_i$. As $P$ is invertible, $\alpha'=(a_1',\dots,a_n')$ constitutes a basis, and by construction the $j$th column of $P$ holds the coordinates of $a_j'$ in the basis $\beta$, so $P$ is the change of basis matrix from $\alpha'$ to $\beta$.
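And a matching check for 2) (again a sketch with random data; here $a_j'=\sum_i P_{ij}b_i$ reads $A'=BP$, and the coordinates of $a_j'$ in $\beta$ are $B^{-1}a_j'=Pe_j$):

```python
# Define A' = B P, so a'_j = sum_i P_ij b_i; then the coordinates of a'_j in
# the basis beta, B^{-1} a'_j, form exactly the j'th column of P.
import numpy as np

rng = np.random.default_rng(2)
n = 4
B = rng.standard_normal((n, n))   # columns are the basis beta
P = rng.standard_normal((n, n))   # the given invertible matrix
A_prime = B @ P                   # columns are a'_1, ..., a'_n

print(np.allclose(np.linalg.solve(B, A_prime), P))   # True
```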