Show determinant of matrix is non-zero

Since $a,\ b$ and $c$ are rational, we may clear denominators in $$a^3 + 2b^3 -6abc +4c^3 = 0.$$ The equation is homogeneous of degree $3$, so we may also cancel common factors. If a non-trivial solution exists, we may therefore assume without loss of generality that $a,\ b$ and $c$ are integers with $\gcd(a,\ b,\ c)=1$.

Reducing modulo $2$, we find that $a\equiv 0\pmod 2$. Let $a=2\alpha$. Making the substitution and cancelling the common factor of $2$, we arrive at $$4\alpha^3 +b^3 - 6\alpha bc + 2c^3 = 0.$$ Reducing mod $2$ again gives $b\equiv 0\pmod2$, so let $b=2\beta$ to obtain $$2\alpha^3 + 4\beta^3 - 6\alpha\beta c + c^3 = 0.$$ Reducing modulo $2$ one last time gives $c\equiv 0\pmod 2$, which contradicts $\gcd(a,\ b,\ c)=1$. Therefore the equation has no non-trivial integer (hence no non-trivial rational) solutions, and it follows that the determinant is non-zero whenever $a,\ b$ and $c$ are not all zero.
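The descent argument above is the proof; as a quick sanity check (not a proof), one can also confirm by brute force that there is no non-trivial solution in a small box of integers. The bound `N = 20` below is an arbitrary illustrative choice.

```python
# Search small integer triples for non-trivial solutions of
# a^3 + 2b^3 - 6abc + 4c^3 = 0; the descent shows none exist.
N = 20  # arbitrary search bound, for illustration only
solutions = [
    (a, b, c)
    for a in range(-N, N + 1)
    for b in range(-N, N + 1)
    for c in range(-N, N + 1)
    if a**3 + 2 * b**3 - 6 * a * b * c + 4 * c**3 == 0 and (a, b, c) != (0, 0, 0)
]
print(solutions)  # -> []
```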

To show the linear independence of $\left\{1,\ \sqrt[3]{2},\ \left(\sqrt[3]{2}\right)^2\right\}$ over $\mathbb{Q}$, suppose to the contrary that there is a non-trivial rational linear combination $$r_0 + r_1\sqrt[3]{2} + r_2\left(\sqrt[3]{2}\right)^2 = 0.$$ Clearing denominators yields a non-trivial integral linear combination equal to $0$. That is, there is a non-zero integer polynomial $p(x)$ of degree at most $2$ with $\sqrt[3]{2}$ as a root. But the minimal polynomial of $\sqrt[3]{2}$ over $\mathbb{Q}$ is $x^3 - 2$, which has degree $3$, so no non-zero polynomial of smaller degree can vanish at $\sqrt[3]{2}$. This is a contradiction.


Edited in accordance with comment from Marc van Leeuwen:

Suppose $$a+b\root3\of2+c(\root3\of2)^2=0$$ with $a,b,c$ rational. Then $\root3\of2$ is a root of the polynomial $$f(x)=a+bx+cx^2$$ Now, $\root3\of2$ is also a root of $$g(x)=x^3-2$$ So $\root3\of2$ is a root of the gcd of $f$ and $g$. But $g$ is irreducible over the rationals, and $f$ has smaller degree than $g$, so either $f$ is identically zero or the gcd is a nonzero constant. It isn't a nonzero constant, since it has to vanish at $\root3\of2$, so $f$ is the zero polynomial, so $$a=b=c=0$$ so $$\{\,1,\root3\of2,(\root3\of2)^2\,\}$$ is a linearly independent set over the rationals.
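One can watch the gcd computation happen with exact rational arithmetic. The sketch below runs the Euclidean algorithm on a sample non-zero $f$ (here $f(x)=1+x+x^2$, an arbitrary illustrative choice) against $g(x)=x^3-2$; the helper names `polydiv` and `polygcd` are ad hoc, not from any library.

```python
from fractions import Fraction

def polydiv(num, den):
    """Remainder of polynomial division over Q.
    Polynomials are coefficient lists, lowest degree first."""
    num = num[:]
    while len(num) >= len(den) and any(num):
        q = num[-1] / den[-1]  # cancel the leading term of num
        shift = len(num) - len(den)
        for i, d in enumerate(den):
            num[shift + i] -= q * d
        while num and num[-1] == 0:
            num.pop()
    return num

def polygcd(f, g):
    """Euclidean algorithm; returns a gcd of f and g (up to a constant)."""
    while g:
        f, g = g, polydiv(f, g)
    return f

F = Fraction
g = [F(-2), F(0), F(0), F(1)]  # g(x) = x^3 - 2
f = [F(1), F(1), F(1)]         # f(x) = 1 + x + x^2 (sample non-zero f)
d = polygcd(g, f)
print(d)  # -> [Fraction(-1, 1)]: a nonzero constant, so gcd(f, g) = 1 up to scaling
```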


I'm not sure that's the best way to prove that $\mathbb{Q}[\sqrt[3]{2}] = \{a+b\sqrt[3]{2}+c(\sqrt[3]{2})^2\;|\;a,b,c\in\mathbb{Q}\}$ is a field.

I would argue instead that if $a+b\sqrt[3]{2}+c(\sqrt[3]{2})^2 \ne 0$, then the polynomial $f(x) = a+bx+c x^2$ is coprime to $x^{3} - 2$, so there are polynomials $u(x), v(x)$ such that $$ 1 = f(x) u(x) + (x^{3} - 2) v(x), $$ so that evaluating for $x = \sqrt[3]{2}$, $$ 1 = f(\sqrt[3]{2}) u(\sqrt[3]{2}) = (a+b\sqrt[3]{2}+c(\sqrt[3]{2})^2) \cdot u(\sqrt[3]{2}), $$ and $u(\sqrt[3]{2}) \in \mathbb{Q}[\sqrt[3]{2}]$ is the required inverse.
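Here is one worked instance of that Bézout identity, for the illustrative choice $f(x) = 1 + x$ (i.e. $a=b=1$, $c=0$). Polynomial division gives $x^3 - 2 = (x+1)(x^2-x+1) - 3$, so $$1 = f(x)\cdot\tfrac{x^2-x+1}{3} + (x^3-2)\cdot\left(-\tfrac13\right),$$ i.e. $u(x) = (x^2-x+1)/3$. A quick floating-point check that $u(\sqrt[3]{2})$ really inverts $f(\sqrt[3]{2})$:

```python
# Verify numerically that u(r) = (r^2 - r + 1)/3 is the inverse of
# f(r) = 1 + r, where r is the real cube root of 2.
r = 2 ** (1 / 3)
f_at_r = 1 + r
u_at_r = (r ** 2 - r + 1) / 3
print(f_at_r * u_at_r)  # -> 1.0 (up to floating-point rounding)
```

Exactly: $(1+r)(r^2-r+1) = r^3 + 1 = 3$, so the product is $1$ on the nose.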


Let $r=\root3\of2$. The other answerers have shown that $1,r,r^2$ are linearly independent over $\mathbb{Q}$. Eu Yu's answer has also nicely shown that the determinant in question is nonzero. I don't have a better answer than his. However, since this question is, after all, one about determinants, I can't resist the temptation to solve it in (the guise of) a matrix-theoretic way.

Let $\omega$ be a primitive cube root of unity. Then your matrix is $\mathbb{C}$-similar to $$ A=\begin{pmatrix} a &r^2\omega ^2c &r\omega b\\ r\omega b &a &r^2\omega ^2c\\ r^2\omega ^2c &r\omega b &a \end{pmatrix}. $$ This is a circulant matrix, so its eigenvalues are (see Wikipedia): $$ \begin{cases} \lambda_1 = a+r^2\omega^2c+r\omega b &= a+r^2\omega^2c+r\omega b,\\ \lambda_2 = a+r^2\omega^2c\,\omega +r\omega b\,\omega^2 &= a+r^2c+rb,\\ \lambda_3 = a+r^2\omega^2 c\,\omega^2 +r\omega b\,\omega &= a+r^2\omega c+r\omega^2 b. \end{cases} $$ Note that $\lambda_2\neq0$ because $1,r,r^2$ are linearly independent over $\mathbb{Q}$. Hence if both $\lambda_1$ and $\lambda_3$ have nonzero imaginary parts, $\det A=\lambda_1\lambda_2\lambda_3\neq0$. If instead one of them is real, inspecting its imaginary part gives $b=rc$; since $b$ and $c$ are rational and $r$ is irrational, this forces $b=c=0$. As $a,\ b,\ c$ are not all zero, $a\neq0$ and $\det A=a^3$ is still nonzero.
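As a numerical sanity check of the eigenvalue computation, the product $\lambda_1\lambda_2\lambda_3$ should equal the determinant $a^3+2b^3+4c^3-6abc$ of the original matrix. The sample triple below is an arbitrary choice.

```python
import cmath

a, b, c = 3, 1, 2                  # sample values; any rationals work
r = 2 ** (1 / 3)                   # real cube root of 2
w = cmath.exp(2j * cmath.pi / 3)   # primitive cube root of unity

# The three circulant eigenvalues from the cases display above.
lam1 = a + r**2 * w**2 * c + r * w * b
lam2 = a + r**2 * c + r * b
lam3 = a + r**2 * w * c + r * w**2 * b

det = a**3 + 2 * b**3 + 4 * c**3 - 6 * a * b * c
print(abs(lam1 * lam2 * lam3 - det))  # close to zero
```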