The given question:

Let $k$ be a field and $n \in \mathbb{N}$. Show that the centre of $\operatorname{GL}(n, k)$ is $\lbrace\lambda I\mid \lambda \in k^*\rbrace$.

I have spent a while trying to prove this and have succeeded if $k \subseteq \mathbb{R}$, so I imagine there is a nicer way to go about it. I have seen people suggest taking matrices in which every entry except one is zero when proving similar results, but such matrices are not in $\operatorname{GL}(n,k)$, as they are not invertible.

What I have done.

Let $B$ be in the centre; then $B$ commutes with everything in $\operatorname{GL}(n,k)$. So for each pair $i \neq j$, take the permutation matrix $P \neq I$ which swaps rows $i$ and $j$. From these you can deduce that $B^T = B$ and that $B_{ii} = B_{jj}$ for all $i, j$. We can then use the fact that $B$ is symmetric to find an orthogonal diagonalisation of $B$ by the spectral theorem.
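(To see the permutation step concretely in the $2 \times 2$ case: with $P = \begin{pmatrix}0&1\\1&0\end{pmatrix}$, the equation $PB = BP$ reads
$$\begin{pmatrix}B_{21}&B_{22}\\B_{11}&B_{12}\end{pmatrix}=\begin{pmatrix}B_{12}&B_{11}\\B_{22}&B_{21}\end{pmatrix},$$
which gives exactly $B_{12}=B_{21}$ and $B_{11}=B_{22}$.)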

The spectral theorem then gives $QBQ^T = D$, and since $B$ is central and $Q$ is invertible, $QBQ^T = BQQ^T = B$, so $B = D$ is diagonal. As $B_{ii} = B_{jj}$ for all $i, j$, we can say $B = \lambda I$ for some scalar $\lambda$.

But the spectral theorem requires $B$ to be a real matrix.

How do I prove this for a general field $k$?


Let $A$ belong to the center of $GL(n,k)$.

Fix $1\leq i\neq j\leq n$.

Denote by $E_{i,j}$ the matrix with $1$ as its $(i,j)$ coefficient, and $0$ elsewhere.

Then $I_n+E_{i,j}$ belongs to $GL(n,k)$ (as requested below, a proof of this is that, like every transvection matrix, $I_n+E_{i,j}$ has determinant $1$).

So $$ (I_n+E_{i,j})A=A(I_n+E_{i,j})\quad \Leftrightarrow\quad E_{i,j}A=AE_{i,j}. $$ Now compute the appropriate coefficients in the latter.
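One way to carry out that computation, writing $a_{pq}$ for the $(p,q)$ entry of $A$: entry-wise,
$$(E_{i,j}A)_{pq}=\delta_{pi}\,a_{jq},\qquad (AE_{i,j})_{pq}=a_{pi}\,\delta_{qj},$$
so $E_{i,j}A$ has row $j$ of $A$ placed in row $i$ and zeros elsewhere, while $AE_{i,j}$ has column $i$ of $A$ placed in column $j$ and zeros elsewhere. Comparing the $(i,j)$ entries gives $a_{jj}=a_{ii}$; comparing the $(i,q)$ entries for $q\neq j$ gives $a_{jq}=0$; and comparing the $(p,j)$ entries for $p\neq i$ gives $a_{pi}=0$. Letting $i\neq j$ run over all pairs, $A$ is diagonal with all diagonal entries equal, i.e. $A=\lambda I_n$, and $\lambda\in k^*$ since $A$ is invertible.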


Here's another method which almost entirely avoids working with matrices.

Let $V$ be a finite-dimensional vector space. We will show that the center of $\text{Aut}(V)$ consists precisely of the maps of the form $(x\mapsto \lambda x)$ for some nonzero scalar $\lambda$.

If $V$ is $0$- or $1$-dimensional, this is trivial. So assume that $\dim V\geq 2$ from now on. The first important observation is that if the operator $A$ commutes with the operator $B$, then $A$ preserves the eigenspaces of $B$. For suppose that $v$ is an eigenvector of $B$ with eigenvalue $\lambda$. Then $$B(Av)=ABv=A(\lambda v)=\lambda Av$$ Thus $A$ maps the eigenspace corresponding to $\lambda$ into itself.

The next lemma we will use is this:

If every non-zero vector of $V$ is an eigenvector of the operator $A$, then $A=(x\mapsto \lambda x)$ for some scalar $\lambda$.

This, I leave for you to prove. We will show that if $A$ is an element of $\text{Aut}(V)$ which commutes with all operators, then every nonzero vector of $V$ is an eigenvector of $A$, which will then imply that $A$ corresponds to scalar multiplication.
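(If you want a hint for the lemma, here is one possible sketch: suppose every nonzero vector of $V$ is an eigenvector of $A$, say $Au=\lambda_u u$ for each nonzero $u$. If $u$ and $v$ are linearly independent, then
$$\lambda_{u+v}\,u+\lambda_{u+v}\,v=A(u+v)=\lambda_u u+\lambda_v v,$$
so $\lambda_u=\lambda_{u+v}=\lambda_v$ by independence; if $v$ is a nonzero multiple of $u$, then $\lambda_v=\lambda_u$ trivially. Hence all the $\lambda_u$ are one scalar $\lambda$, and $A=(x\mapsto\lambda x)$.)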

How will we go about doing this? It would be nice if, for each nonzero vector $v_1\in V$, there were an invertible operator $B$ whose only eigenspace is $\text{span}(v_1)$. This is indeed the case. First extend $\{v_1\}$ to an ordered basis $\{v_1, v_2, \ldots,v_n\}$. Then let $B$ be the linear extension of the map which sends $v_1$ to itself and sends $v_i$ to $v_i+v_{i-1}$ for all other $i$ (with respect to this basis, this operator looks like a single Jordan block with eigenvalue $1$). One can show that $\text{span}(v_1)$ is the only eigenspace of $B$ (with eigenvalue $1$). Since, by assumption, $A$ commutes with $B$, it preserves this eigenspace, so $A$ sends $v_1$ to a scalar multiple of $v_1$. Thus every nonzero vector of $V$ is an eigenvector of $A$; hence $A$ must be scalar multiplication by some (nonzero) scalar.
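For instance, here is a small sanity check in the case $n=3$: with respect to the basis $\{v_1,v_2,v_3\}$, the matrix of $B$ is
$$\begin{pmatrix}1&1&0\\0&1&1\\0&0&1\end{pmatrix},$$
which is upper triangular with $1$s on the diagonal, so $1$ is its only eigenvalue over any field; and $(B-I)w=0$ forces the $v_2$- and $v_3$-coordinates of $w$ to vanish, so the $1$-eigenspace is exactly $\text{span}(v_1)$.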

Now, note that $GL_n(K)$ is isomorphic to the automorphism group of an $n$-dimensional vector space $V$ over $K$, and every such isomorphism corresponds to a choice of ordered basis for $V$. Any such isomorphism carries the center of $\text{Aut}(V)$ to the center of $GL_n(K)$. But under any choice of basis for $V$, the matrix representation of the operator $(x\mapsto\lambda x)$ is the diagonal matrix with $\lambda$s on the diagonal, i.e. $\lambda I$.


Let $A$ be a matrix in the center of $\operatorname{GL}(n,k)$. Then it commutes with all invertible matrices. So look at invertible matrices $B$ with an easy structure (for example, having only entries 0 or 1, and as few 1's as possible) and translate the equation $AB = BA$ into conditions on the entries of $A$.
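For instance, here is a sketch of the $2\times 2$ case using the transvection $B=I_2+E_{1,2}=\begin{pmatrix}1&1\\0&1\end{pmatrix}$ (with $E_{i,j}$ as above). Writing $A=\begin{pmatrix}a&b\\c&d\end{pmatrix}$,
$$AB=\begin{pmatrix}a&a+b\\c&c+d\end{pmatrix},\qquad BA=\begin{pmatrix}a+c&b+d\\c&d\end{pmatrix},$$
so $AB=BA$ forces $c=0$ and $a=d$; commuting with $I_2+E_{2,1}$ similarly forces $b=0$. Hence $A=aI_2$ with $a\in k^*$.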