How to prove that algebraic numbers form a field? [duplicate]

I'd like to know how to prove algebraic numbers form a field, i.e.,

if $a,b$ are algebraic numbers, prove that $ab$ and $a+b$ are also algebraic by finding specific polynomials that have $ab$ and $a+b$ as roots.

I know there is a way to do this using the Kronecker product, but I'm not sure exactly how to do it.

Edit: This question differs from the suggested duplicate in that it asks for unconventional proofs that the algebraic numbers form a field, such as one using the Kronecker product. As a result, the answers here are very different from those to the suggested duplicate, so this question deserves to remain open.


Here's an example of "my" approach (not really mine: Dave Boyd told me he heard this from Olga Taussky in the 1960s, and according to Bill Dubuque it's "already in Dedekind") with $a = \sqrt{2}$ and $b = \sqrt[3]{3}$. $a$ has minimal polynomial $p_a(X) = X^2 - 2$ and $b$ has minimal polynomial $p_b(X) = X^3 - 3$. The companion matrices of these polynomials are $$ A = \pmatrix{0 & 2\cr 1 & 0\cr}, B = \pmatrix{0 & 0 & 3\cr 1 & 0 & 0\cr 0 & 1 & 0\cr}$$ The point about the companion matrix of a monic polynomial is that its characteristic polynomial is that polynomial, and its nonzero entries are $1$'s on the subdiagonal together with, in the last column, the negatives of the non-leading coefficients of the polynomial.

Now if $C$ and $D$ are matrices (say $m \times n$ and $p \times q$), then $C \otimes D$ is an $mp \times nq$ matrix whose entries are the products of the entries of $C$ and the entries of $D$: we can consider it as consisting of an $m \times n$ array of $p \times q$ blocks, where block $(i,j)$ is $c_{ij} D$. Similarly for column vectors $u$ and $v$ with $n$ and $q$ entries we take $u \otimes v$ as the column vector with $nq$ entries consisting of $n$ blocks of size $q$, where the $j$'th block is $u_j v$.

We then have $(C \otimes D)(u \otimes v) = (Cu) \otimes (Dv)$. In particular, if $u$ is an eigenvector of $C$ for eigenvalue $\lambda$ and $v$ is an eigenvector of $D$ for eigenvalue $\mu$, then $u \otimes v$ is an eigenvector of $C \otimes D$ for eigenvalue $\lambda \mu$.

For $ab$ we take the matrix $$A \otimes B = \pmatrix{0 & 0 & 0 & 0 & 0 & 6\cr 0 & 0 & 0 & 2 & 0 & 0\cr 0 & 0 & 0 & 0 & 2 & 0\cr 0 & 0 & 3 & 0 & 0 & 0\cr 1 & 0 & 0 & 0 & 0 & 0\cr 0 & 1 & 0 & 0 & 0 & 0\cr}$$ which has $a b$ as an eigenvalue, with eigenvector $u \otimes v$ where $u$ is an eigenvector of $A$ for eigenvalue $a$ and $v$ is an eigenvector of $B$ for eigenvalue $b$.

For $a + b$ we take the matrix $$A \otimes I_3 + I_2 \otimes B = \left( \begin {array}{cccccc} 0&0&0&2&0&0\\ 0&0&0&0 &2&0\\0&0&0&0&0&2\\ 1&0&0&0&0&0 \\ 0&1&0&0&0&0\\ 0&0&1&0&0&0 \end {array} \right) + \left( \begin {array}{cccccc} 0&0&3&0&0&0\\ 1&0&0&0 &0&0\\ 0&1&0&0&0&0\\ 0&0&0&0&0&3 \\ 0&0&0&1&0&0\\ 0&0&0&0&1&0 \end {array} \right) = \pmatrix{0 & 0 & 3 & 2 & 0 & 0\cr 1 & 0 & 0 & 0 & 2 & 0\cr 0 & 1 & 0 & 0 & 0 & 2\cr 1 & 0 & 0 & 0 & 0 & 3\cr 0 & 1 & 0 & 1 & 0 & 0\cr 0 & 0 & 1 & 0 & 1 & 0\cr } $$ which has $a + b$ as an eigenvalue, for the same eigenvector $u \otimes v$.

Since $a+b$ and $ab$ are eigenvalues of matrices with rational (in fact integer) entries, they are roots of the characteristic polynomials of those matrices, and those characteristic polynomials are monic polynomials with rational (in fact integer) coefficients, therefore $a+b$ and $ab$ are algebraic numbers (in fact algebraic integers).
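The whole construction is easy to check by machine. Here is a short sketch in Python using sympy (an assumption of this sketch, not part of the answer), building the same companion matrices and Kronecker combinations as above and confirming that $ab$ and $a+b$ are roots of the resulting characteristic polynomials:

```python
import sympy as sp

X = sp.symbols('X')

# Companion matrices of p_a(X) = X^2 - 2 and p_b(X) = X^3 - 3, as above.
A = sp.Matrix([[0, 2], [1, 0]])
B = sp.Matrix([[0, 0, 3], [1, 0, 0], [0, 1, 0]])

# Kronecker constructions: A (x) B has eigenvalue a*b,
# and A (x) I_3 + I_2 (x) B has eigenvalue a + b.
prod_matrix = sp.kronecker_product(A, B)
sum_matrix = (sp.kronecker_product(A, sp.eye(3))
              + sp.kronecker_product(sp.eye(2), B))

char_prod = prod_matrix.charpoly(X).as_expr()  # monic, integer coefficients
char_sum = sum_matrix.charpoly(X).as_expr()

# Sanity check: a*b and a+b really are roots.
a, b = sp.sqrt(2), sp.root(3, 3)
assert sp.expand(char_prod.subs(X, a * b)) == 0
assert sp.expand(char_sum.subs(X, a + b)) == 0
```

Here the product matrix has characteristic polynomial $X^6 - 72$, and the sum matrix gives the sextic $X^6-6X^4-6X^3+12X^2-36X+1$, the same polynomial that appears in the norm-based answer below.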


Here’s one way of doing it for $a+b$; I’ll let you worry about the other case.

You seem to be talking about complex numbers that are algebraic over $\Bbb Q$. Let $f(X)\in\Bbb Q[X]$ be the minimal polynomial for $b$. Then $f(X-a)\in\Bbb Q(a)[X]$ is a polynomial vanishing at $a+b$. Now consider $\Bbb Q(a)[X]$ as a $\Bbb Q[X]$-algebra. It’s free, with the same basis that $\Bbb Q(a)$ has over $\Bbb Q$, namely $\{1,a,a^2,\cdots,a^{n-1}\}$, where $n$ is the degree of the minimal polynomial for $a$ over $\Bbb Q$. Now you can take the norm from the ring $\Bbb Q(a)[X]$ down to $\Bbb Q[X]$, and apply this map to the element $f(X-a)$ of the upper ring. This is a polynomial in $X$ over $\Bbb Q$, and it vanishes at $a+b$, though nothing guarantees that it will be $\Bbb Q$-irreducible.

Let me show you a nontrivial example. Let $a=\sqrt2$ and $b=\sqrt[3]3$. Then the minimal polynomial of $b$ is $f(X)=X^3-3$, and $f(X-a)=X^3-3aX^2+3a^2X-a^3-3=X^3-3\sqrt2X^2+6X-2\sqrt2-3$. To get its norm, you can multiply it by its conjugate (what you get by replacing $\sqrt2$ by $-\sqrt2$) to get $$ f(X-a)f(X+a)=X^6-6X^4-6X^3+12X^2-36X+1\,. $$ Of course in this case, our sextic polynomial has to be $\Bbb Q$-irreducible.
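This conjugate-product shortcut is a one-liner to verify with sympy (assumed available; not part of the original answer):

```python
import sympy as sp

X = sp.symbols('X')
a = sp.sqrt(2)
f = lambda t: t**3 - 3  # minimal polynomial of b = 3^(1/3)

# Norm of f(X - a): multiply by the conjugate f(X + a) to kill sqrt(2).
sextic = sp.expand(f(X - a) * f(X + a))
print(sextic)  # X**6 - 6*X**4 - 6*X**3 + 12*X**2 - 36*X + 1

# a + b is a root:
assert sp.expand(sextic.subs(X, a + sp.root(3, 3))) == 0
```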

EDIT AND EXPANSION: You have asked me in a comment to show an explicit calculation of a norm. Here’s the general situation: let $R$ be a ring, and $A$ an $R$-algebra that is free as an $R$-module, say of rank $n$. Then the regular representation exhibits $A$ as a subring of the ring of $n$-by-$n$ matrices over $R$. Namely, if $v\in A$, then you get the associated matrix by considering multiplication-by-$v$ as an $R$-linear transformation from $A$ to itself. With a choice of basis, you get a matrix out of this linear transformation. And the norm of $v$ is exactly the determinant of the linear transformation (and of the matrix).

Let me illustrate this by interchanging the roles of $a$ and $b$ in the above example: this requires calculating a determinant.

Now let $a=\sqrt[3]3$, $b=\sqrt2$, so that $f(X)$, the minimal polynomial for $b$, is $X^2-2$. Now form $v=f(X-a)=X^2-2aX+a^2-2$. I want the matrix of this, with respect to the basis $\{1,a,a^2\}$ for $\Bbb Q(a)[X]$ over $\Bbb Q[X]$. I need to write $v\cdot1$, $v\cdot a$, and $v\cdot a^2$ as $\Bbb Q[X]$-linear combinations of the basis elements $1$, $a$, and $a^2$. Using $a^3=3$ and $a^4=3a$, we have: \begin{align} v\cdot1&=(X^2-2)(1)&+(-2X)(a)&+(1)(a^2)\\ v\cdot a&=(3)(1)&+(X^2-2)(a)&+(-2X)(a^2)\\ v\cdot a^2&=(-6X)(1)&+(3)(a)&+(X^2-2)(a^2)\,. \end{align} That is, the matrix is \begin{pmatrix} X^2-2&3&-6X\\ -2X&X^2-2&3\\ 1&-2X&X^2-2 \end{pmatrix} and its determinant is exactly the polynomial I wrote before, where I used a shortcut for calculating a different norm.
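The determinant claim at the end is again easy to confirm mechanically; a minimal sympy sketch (sympy assumed), using the multiplication-by-$v$ matrix exactly as written above:

```python
import sympy as sp

X = sp.symbols('X')

# Matrix of multiplication by v = X^2 - 2aX + a^2 - 2 on Q(a)[X],
# with a = 3^(1/3), in the basis {1, a, a^2};
# columns are the coordinates of v*1, v*a, v*a^2.
M = sp.Matrix([
    [X**2 - 2,        3,     -6*X],
    [-2*X,     X**2 - 2,        3],
    [1,            -2*X, X**2 - 2],
])

norm = sp.expand(M.det())
print(norm)  # X**6 - 6*X**4 - 6*X**3 + 12*X**2 - 36*X + 1
```

The determinant reproduces the sextic obtained earlier by the conjugate-product shortcut.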


I'm late to the party, but none of these answers has mentioned polynomial resultants. To me this is "the natural" approach.

Note that if $a,b \in \bar{F}$ have minimal polynomials $A(x),B(x) \in F[x]$, then $$P(x) := \prod_{a',b' \in \bar{F}: A(a')=B(b')=0}(x - a'b')$$ and $$S(x) := \prod_{a',b' \in \bar{F}: A(a')=B(b')=0}(x - a'-b')$$ are polynomials over $F$ (this is implied by Galois theory) which have $ab$ and $a+b$ as roots, respectively. This proves the algebraics form a field.

In terms of resultants, these polynomials are explicitly given by $$P(x) = \operatorname{Res}_y(A(y), \operatorname{Res}_z(B(z), x - yz))$$ and $$S(x) = \operatorname{Res}_y(A(y), \operatorname{Res}_z(B(z), x - y-z)).$$ I say explicit because the resultant is an explicit polynomial expression in the coefficients of its arguments, defined by some matrix determinant; see the wikipedia article for more information. In particular, the definition of resultant implies that $P$ and $S$ are defined over $F$. This gives an alternative proof which avoids Galois theory.
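For the running example $a=\sqrt2$, $b=\sqrt[3]3$, the nested resultants can be computed with sympy (assumed available for this sketch):

```python
import sympy as sp

x, y, z = sp.symbols('x y z')

A = y**2 - 2  # minimal polynomial of a = sqrt(2)
B = z**3 - 3  # minimal polynomial of b = 3^(1/3)

# Eliminate z, then y, exactly as in the nested-resultant formulas.
P = sp.resultant(A, sp.resultant(B, x - y*z, z), y)    # has a*b as a root
S = sp.resultant(A, sp.resultant(B, x - y - z, z), y)  # has a+b as a root

a, b = sp.sqrt(2), sp.root(3, 3)
assert sp.expand(P.subs(x, a * b)) == 0
assert sp.expand(S.subs(x, a + b)) == 0
```

Here $P(x)=x^6-72$ and $S(x)=x^6-6x^4-6x^3+12x^2-36x+1$, matching the companion-matrix computation in the first answer.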


Here is yet another proof of this well-known fact. This will be a rather elementary proof based on linear algebra.

First a little Lemma

Suppose $V\subset\mathbb{C}$ is a nonzero finite-dimensional vector space over $\mathbb{Q}$. If $a\in\mathbb{C}$ satisfies $$ aV \subset V,$$ then $a$ is algebraic.

Let $\{e_1,\dots,e_n\}$ be a basis for $V$. Then there is an $n\times n$ matrix $A=(\alpha_{ij})$ with entries in $\mathbb{Q}$ such that $$ \begin{align} ae_1&= \alpha_{11}e_1 +\ldots +\alpha_{1n}e_n\\ &\;\;\vdots\\ ae_n&= \alpha_{n1}e_1 +\ldots + \alpha_{nn}e_n \end{align} $$ In other words, the nonzero column vector $e=(e_1,\dots,e_n)^T\in\mathbb{C}^n$ satisfies $Ae=ae$, so $a$ is an eigenvalue of $A$, i.e., a solution to the equation $$ p(z) = \operatorname{det}(A-zI) =0.$$ Since $p$ is a rational polynomial (i.e., $p$ has rational coefficients), $a$ is algebraic.


With this result at hand, we now prove that the set of algebraic numbers is a subfield of $\mathbb{C}$.

Suppose $a$ and $b$ are algebraic. Let $m$ and $n$ be the smallest positive integers for which rational polynomials $p$ and $q$ of degrees $m$ and $n$ exist such that $p(a)=0=q(b)$. Let $V$ be the linear space over $\mathbb{Q}$ spanned by $\{a^jb^k: 0\leq j < m,\ 0\leq k< n\}$.

It is easy to check that

$$ aV\subset V,\qquad bV\subset V $$

Consequently, $(a+b)V\subset V$, and $(ab)V\subset V$. From our little Lemma, we conclude that $ a+b$ as well as $ab$ are algebraic.
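To see the lemma in action, here is a small sympy sketch (sympy assumed) for $a=\sqrt2$, $b=\sqrt[3]3$, so $m=2$, $n=3$: we write down the matrices of multiplication by $a$ and by $b$ on the $6$-dimensional space $V$ spanned by $\{a^jb^k\}$ and read off the characteristic polynomials.

```python
import sympy as sp

m, n = 2, 3  # degrees of a = sqrt(2) and b = 3^(1/3)

def mult_matrix(pow_a, pow_b):
    """Matrix of multiplication by a^pow_a * b^pow_b on the basis
    {a^j b^k : 0 <= j < 2, 0 <= k < 3}, using a^2 = 2 and b^3 = 3;
    basis element a^j b^k gets index 3*j + k."""
    M = sp.zeros(m * n, m * n)
    for j in range(m):
        for k in range(n):
            coeff = sp.Integer(1)
            jj, kk = j + pow_a, k + pow_b
            if jj >= m:
                jj -= m
                coeff *= 2  # reduce a^2 -> 2
            if kk >= n:
                kk -= n
                coeff *= 3  # reduce b^3 -> 3
            M[3 * jj + kk, 3 * j + k] = coeff
    return M

Ma, Mb = mult_matrix(1, 0), mult_matrix(0, 1)

X = sp.symbols('X')
p_sum = (Ma + Mb).charpoly(X).as_expr()   # annihilates a + b
p_prod = (Ma * Mb).charpoly(X).as_expr()  # annihilates a * b

assert sp.expand(p_prod.subs(X, sp.sqrt(2) * sp.root(3, 3))) == 0
assert sp.expand(p_sum.subs(X, sp.sqrt(2) + sp.root(3, 3))) == 0
```

Both characteristic polynomials are monic with rational entries, exactly as the lemma requires, and they agree with the polynomials produced by the companion-matrix and resultant answers above.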

To conclude, notice that if $c\neq0$ solves the polynomial equation $p(x)=0$, then $-c$ solves $r(x)=p(-x)=0$ and $1/c$ solves $q(x)=x^np(1/x)=0$, with $n$ the degree of $p$. Clearly, if $p$ is a rational polynomial, so are $r$ and $q$.
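The negation and inversion step is equally mechanical; a quick sympy check (sympy assumed) with $p(x)=x^3-3$ and $c=\sqrt[3]3$:

```python
import sympy as sp

x = sp.symbols('x')
p = x**3 - 3  # c = 3^(1/3) is a root
c = sp.root(3, 3)

r = sp.expand(p.subs(x, -x))            # r(x) = p(-x), has -c as a root
q = sp.expand(x**3 * p.subs(x, 1 / x))  # q(x) = x^3 * p(1/x), has 1/c as a root

assert sp.expand(r.subs(x, -c)) == 0
assert sp.expand(q.subs(x, 1 / c)) == 0
```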
