Can a matrix with entries in $\mathbb{R}$ have a minimal polynomial with coefficients in $\mathbb{C}$?
Like the title says, can a matrix with entries in $\mathbb{R}$ have a minimal polynomial with coefficients in $\mathbb{C}$?
I have a feeling the answer is yes, because the characteristic polynomial can have roots in $\mathbb{C}$, and so I don't see any reason why not. I was merely asking in case there is some sneaky proof that this cannot happen.
Let $A$ be a matrix with real entries, and let $P(x)$ be a minimal polynomial of $A$ with leading coefficient $1$. By a minimal polynomial $P(x)$ we mean a polynomial $P(x)$ of minimal degree, with coefficients in $\mathbb{C}$, such that $P(A)=0$. That is not quite the standard definition, but it is what you are considering. Note that whatever field we are working in, we can always force leading coefficient $1$.
Let $\overline{P}(x)$ be obtained from $P(x)$ by taking complex conjugates of its coefficients. It is easy to verify that $\overline{P}(A)=\overline{P(A)}=0$: the evaluation of $P(A)$ involves only addition and multiplication, both of which commute with complex conjugation, and the entries of $A$ are real.
If $\overline{P}\ne P$, let $Q(x)=P(x)-\overline{P}(x)$. Then $Q$ is a non-zero polynomial of degree less than the degree of $P$ (the leading terms, both equal to $x^{\deg P}$, cancel) such that $Q(A)=0$. This contradicts the minimality of $P$, so $\overline{P}=P$; that is, the coefficients of $P$ are real.
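The conjugation identity used here can be checked numerically. The following is my own sketch with NumPy (the matrix and the polynomial coefficients are arbitrary choices, and `poly_eval` is a hypothetical helper, not a library function): for a real matrix $A$, evaluating the conjugated polynomial at $A$ gives the conjugate of $P(A)$, so $P(A)=0$ forces $\overline{P}(A)=0$.

```python
import numpy as np

def poly_eval(coeffs, A):
    """Evaluate sum coeffs[k] * A^k, with coeffs[0] the constant term."""
    n = A.shape[0]
    result = np.zeros((n, n), dtype=complex)
    power = np.eye(n, dtype=complex)
    for c in coeffs:
        result += c * power
        power = power @ A
    return result

A = np.array([[0.0, -1.0], [1.0, 0.0]])   # a real matrix
coeffs = [2 - 3j, 0.5j, 1 + 1j]           # arbitrary complex coefficients

# Conjugating the coefficients and evaluating at the real matrix A
# is the same as conjugating the evaluation entrywise.
lhs = poly_eval([c.conjugate() for c in coeffs], A)
rhs = poly_eval(coeffs, A).conjugate()
assert np.allclose(lhs, rhs)
```

In particular, if $P(A)$ is the zero matrix, so is its entrywise conjugate $\overline{P}(A)$.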
Remark: If, for example, $x^2-1$ is a minimal polynomial of $A$, then $ix^2-i$, in the sense in which the term is being used here, is also a minimal polynomial of $A$. That is why we normalized to leading coefficient $1$.
Neither the characteristic polynomial nor the minimal polynomial can have coefficients outside the field $K$ over which a matrix $A$ (or linear operator) is defined. For the characteristic polynomial this is obvious, since it is the determinant of a matrix of polynomials in $K[X]$ and therefore itself lies in $K[X]$. As for the minimal polynomial, it is by definition the lowest-degree monic polynomial in $K[X]$ that annihilates the matrix, so it has coefficients in $K$ by definition. But even if one considers the matrix as one with entries in a larger field, the minimal polynomial does not change, and so its coefficients will remain inside any (sub-)field that contains the entries of $A$. In particular, if $A$ has rational entries, you can be sure that the minimal polynomial of $A$ has rational coefficients as well; so does the characteristic polynomial.
For the minimal polynomial the argument is slightly different than for the characteristic polynomial. One can find the minimal polynomial by considering, in the vector space of square matrices, the sequence of powers of $A$; the minimal polynomial expresses the first relation of linear dependence as matrices are added to that sequence (i.e., if the first linear dependence is between $A^d$ and lower powers of $A$, then the minimal polynomial is monic of degree $d$, and its non-leading coefficients describe the expression of $A^d$ in terms of lower powers in an obvious way). Now neither whether a system of linear equations has a solution over a field $K$, nor the actual solution in case it is unique (as it is here for the expression of $A^d$), changes when we replace $K$ by a larger field. In particular the minimal polynomial over $\mathbb R$ of a real matrix is also its minimal polynomial over $\mathbb C$, and this polynomial has real coefficients.
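This power-sequence description translates directly into a computation. Here is my own sketch in Python with SymPy (the function name `minimal_polynomial` is mine, not a library API): it vectorises $\mathrm{Id}, A, A^2, \dots$, finds the first linear dependence by solving an exact linear system over the field of the entries, and returns the corresponding monic polynomial.

```python
import sympy as sp

x = sp.symbols('x')

def minimal_polynomial(A, var=x):
    """Monic polynomial given by the first linear dependence among
    Id, A, A^2, ..., solved exactly over the field of the entries."""
    n = A.rows
    cols = [sp.eye(n).reshape(n * n, 1)]   # vectorised powers Id, ..., A^{d-1}
    power = sp.eye(n)
    while True:
        power = power * A                  # next power A^d, d = len(cols)
        b = power.reshape(n * n, 1)
        M = sp.Matrix.hstack(*cols)
        syms = sp.symbols(f'c0:{len(cols)}')
        sol = sp.linsolve((M, b), *syms)   # solve M*c = b exactly
        if sol != sp.S.EmptySet:
            c = next(iter(sol))            # unique: the columns are independent
            d = len(cols)
            return sp.expand(var**d - sum(c[i] * var**i for i in range(d)))
        cols.append(b)

A = sp.Matrix([[0, -1], [1, 0]])           # rotation by 90 degrees
print(minimal_polynomial(A))               # x**2 + 1
```

Because the arithmetic is exact rational arithmetic, this computes the minimal polynomial over $\mathbb{Q}$; by the argument above, the same polynomial is the minimal polynomial over $\mathbb{R}$ or $\mathbb{C}$.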
One more thing that does not change when extending a field is the divisibility relation between polynomials in $K[X]$: if $Q$ divides $P$ then this means the remainder in Euclidean division of $P$ by $Q$ is $0$, and this does not change when one extends $K$. The importance of this remark in relation to the previous ones is that it shows that the minimal polynomial over $K$ divides the characteristic polynomial if and only if the minimal polynomial over a larger field divides the characteristic polynomial there (neither changes, nor does the divisibility relation). Therefore in proving the Cayley-Hamilton theorem (over fields), which asserts that this divisibility does indeed always hold, one may assume without loss of generality that $K$ is as large as one likes, in particular that it is algebraically closed, like $K=\mathbb{C}$. Some easy proofs of the Cayley-Hamilton theorem do indeed exploit this simplification. Of course if one states the Cayley-Hamilton theorem as the vanishing of an evaluation of the characteristic polynomial in the matrix itself, then the independence of the field is also immediately obvious.
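As a concrete sanity check (my own sketch with SymPy; the example matrix and its minimal polynomial are my choices, known from its Jordan block structure), one can evaluate the characteristic polynomial at the matrix itself by Horner's scheme, confirming the Cayley-Hamilton vanishing, and verify the divisibility by Euclidean division in $\mathbb{Q}[x]$:

```python
import sympy as sp

x = sp.symbols('x')

# A 2x2 Jordan block and a 1x1 block, both for eigenvalue 2:
# characteristic polynomial (x-2)^3, minimal polynomial (x-2)^2.
A = sp.Matrix([[2, 1, 0], [0, 2, 0], [0, 0, 2]])
p = A.charpoly(x).as_expr()                  # det(x*Id - A), monic in Q[x]

# Cayley-Hamilton: evaluate p at A by Horner's scheme.
result = sp.zeros(3)
for c in sp.Poly(p, x).all_coeffs():         # leading coefficient first
    result = result * A + c * sp.eye(3)
assert result == sp.zeros(3)

# The minimal polynomial divides the characteristic polynomial.
mu = sp.expand((x - 2)**2)                   # (A - 2*Id)^2 = 0, A - 2*Id != 0
assert (A - 2 * sp.eye(3))**2 == sp.zeros(3)
q, r = sp.div(p, mu, x)                      # Euclidean division in Q[x]
assert r == 0
```

The same checks give the same results over any larger field, since neither the two polynomials nor the remainder changes.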
André's proof works only for field extensions of degree $2$. Mine works for every field extension.
If $\mathbb{K}$ is a field, we denote by $M_n(\mathbb{K})$ the $\mathbb{K}$-algebra of $n \times n$ square matrices over $\mathbb{K}$. If $A \in M_n(\mathbb{K})$, we call the minimal polynomial of $A$ over $\mathbb{K}$ the unique monic polynomial $\mu_{A,\mathbb{K}} \in \mathbb{K}[t]$ that generates the ideal $\{ q(t) \in \mathbb{K}[t] \mid q(A) = 0 \}$ of the ring $\mathbb{K}[t]$.
Lemma 1. If $A \in M_n(\mathbb{K})$, then $\deg \mu_{A, \mathbb{K}} = d$ if and only if the following two conditions are satisfied:
- the matrices $\mathrm{Id}, A, A^2, \dots, A^{d-1}$ are linearly independent over $\mathbb{K}$;
- the matrices $\mathrm{Id}, A, A^2, \dots, A^{d-1}, A^d$ are linearly dependent over $\mathbb{K}$.
Lemma 2. Let $\mathbb{F} \supseteq \mathbb{K}$ be a field extension and let $v_1, \dots, v_r \in \mathbb{K}^N \subseteq \mathbb{F}^N$ be vectors. Then they are linearly dependent over $\mathbb{K}$ if and only if they are linearly dependent over $\mathbb{F}$.
Try to prove Lemma 1 and Lemma 2 on your own.
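Lemma 2 can at least be illustrated computationally. In this sketch of mine, SymPy computes the exact rank over $\mathbb{Q}$ and NumPy the numerical rank over $\mathbb{C}$; they agree because row reduction never leaves the base field.

```python
import numpy as np
import sympy as sp

# Three rational vectors (the columns) with v3 = v1 + v2,
# so the rank is 2 over any field containing Q.
M = sp.Matrix([[1, 4, 5],
               [2, 5, 7],
               [3, 6, 9]])

rank_Q = M.rank()                            # exact rank over Q
rank_C = np.linalg.matrix_rank(
    np.array(M.tolist(), dtype=float).astype(complex))  # numerical rank over C
assert rank_Q == rank_C == 2
```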
Proposition. Let $\mathbb{F} \supseteq \mathbb{K}$ be a field extension and let $A \in M_n(\mathbb{K})$. Then $\mu_{A,\mathbb{K}} = \mu_{A,\mathbb{F}}$.
Proof. It is clear that $\mu_{A,\mathbb{F}}$ divides $\mu_{A,\mathbb{K}}$, since $\mu_{A,\mathbb{K}} \in \mathbb{F}[t]$ annihilates $A$. As both are monic, it suffices to show that they have the same degree. By Lemma 1 the degree is determined by which of the matrices $\mathrm{Id}, A, A^2, \dots$ are linearly dependent, and by Lemma 2 (viewing $n \times n$ matrices as vectors in $\mathbb{K}^{n^2} \subseteq \mathbb{F}^{n^2}$) this does not depend on whether we work over $\mathbb{K}$ or $\mathbb{F}$. $\square$