How can I justify this without computing the determinant?
I need to justify that the following equation is true:
$$ \begin{vmatrix} a_1+b_1x & a_1x+b_1 & c_1 \\ a_2+b_2x & a_2x+b_2 & c_2 \\ a_3+b_3x & a_3x+b_3 & c_3 \\ \end{vmatrix} = (1-x^2)\cdot\begin{vmatrix} a_1 & b_1 & c_1 \\ a_2 & b_2 & c_2 \\ a_3 & b_3 & c_3 \\ \end{vmatrix} $$
I tried splitting the determinant of the first matrix into a sum of two determinants, so that the first would have no $b$'s and the second no $a$'s.
Then I'd multiply the second column of the first matrix and the first column of the second by $\frac 1x$, so I'd have $x^2$ times the sum of the determinants of the two matrices.
I could then subtract column 1 from column 2 in both matrices, which gives a column of zeros in each, hence both determinants are zero, and multiplying by $x^2$ would still give zero, so I didn't prove anything. What did I do wrong?
Solution 1:
For another solution, note that $$ \underbrace{\begin{bmatrix} a_1+b_1x & a_1x+b_1 & c_1 \\ a_2+b_2x & a_2x+b_2 & c_2 \\ a_3+b_3x & a_3x+b_3 & c_3 \\ \end{bmatrix}}_{A} = \underbrace{\begin{bmatrix} a_1 & b_1 & c_1 \\ a_2 & b_2 & c_2 \\ a_3 & b_3 & c_3 \\ \end{bmatrix}}_{B} \underbrace{\begin{bmatrix} 1 & x & 0 \\ x & 1 & 0 \\ 0 & 0 & 1 \\ \end{bmatrix}}_{C} $$ and therefore $\det(A) = \det(BC) = \det(B)\det(C)$. From there, it's enough to check that $$ \det(C) = \begin{vmatrix} 1 & x & 0 \\ x & 1 & 0 \\ 0 & 0 & 1 \\ \end{vmatrix} = \begin{vmatrix}1 & x \\ x & 1\end{vmatrix} = 1 \cdot 1 - x \cdot x = 1-x^2. $$
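If it helps to double-check, here is a small sympy sketch (variable names are mine) confirming both the factorization $A = BC$ and the resulting identity:

```python
import sympy as sp

a1, a2, a3, b1, b2, b3, c1, c2, c3, x = sp.symbols('a1 a2 a3 b1 b2 b3 c1 c2 c3 x')

# B holds the plain columns, C is the mixing matrix from the factorization
B = sp.Matrix([[a1, b1, c1], [a2, b2, c2], [a3, b3, c3]])
C = sp.Matrix([[1, x, 0], [x, 1, 0], [0, 0, 1]])
A = sp.Matrix([[a1 + b1*x, a1*x + b1, c1],
               [a2 + b2*x, a2*x + b2, c2],
               [a3 + b3*x, a3*x + b3, c3]])

# A really is the product B·C, entry by entry
assert A - B*C == sp.zeros(3, 3)
# and det(A) = det(B)·det(C) = (1 - x^2)·det(B)
assert sp.expand(A.det() - (1 - x**2)*B.det()) == 0
```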
Solution 2:
\begin{align} &\phantom {=}\,\ \begin{vmatrix} a_1+b_1x & a_1x+b_1 & c_1 \\ a_2+b_2x & a_2x+b_2 & c_2 \\ a_3+b_3x & a_3x+b_3 & c_3 \end{vmatrix} \\ &= \begin{vmatrix} a_1 & a_1x+b_1 & c_1 \\ a_2 & a_2x+b_2 & c_2 \\ a_3 & a_3x+b_3 & c_3 \end{vmatrix} + \begin{vmatrix} b_1x & a_1x+b_1 & c_1 \\ b_2x & a_2x+b_2 & c_2 \\ b_3x & a_3x+b_3 & c_3 \end{vmatrix} \\&= \begin{vmatrix} a_1 & b_1 & c_1 \\ a_2 & b_2 & c_2 \\ a_3 & b_3 & c_3 \end{vmatrix} + x \begin{vmatrix} b_1 & a_1x & c_1 \\ b_2 & a_2x & c_2 \\ b_3 & a_3x & c_3 \end{vmatrix} \\&= \begin{vmatrix} a_1 & b_1 & c_1 \\ a_2 & b_2 & c_2 \\ a_3 & b_3 & c_3 \end{vmatrix} + x^2 \begin{vmatrix} b_1 & a_1 & c_1 \\ b_2 & a_2 & c_2 \\ b_3 & a_3 & c_3 \end{vmatrix} \\&= 1\cdot \begin{vmatrix} a_1 & b_1 & c_1 \\ a_2 & b_2 & c_2 \\ a_3 & b_3 & c_3 \end{vmatrix} + (-1) x^2 \begin{vmatrix} a_1 & b_1 & c_1 \\ a_2 & b_2 & c_2 \\ a_3 & b_3 & c_3 \end{vmatrix} \\&= (1-x^2)\cdot\begin{vmatrix} a_1 & b_1 & c_1 \\ a_2 & b_2 & c_2 \\ a_3 & b_3 & c_3 \\ \end{vmatrix}. \end{align}
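The first step above splits along the first column only, which is exactly where the question's attempt went astray (a determinant is linear in each column separately, but $\det$ is not additive over sums of whole matrices). A short sympy sketch (names are mine) verifying that column-wise split:

```python
import sympy as sp

a1, a2, a3, b1, b2, b3, c1, c2, c3, x = sp.symbols('a1 a2 a3 b1 b2 b3 c1 c2 c3 x')

# Full determinant and the two pieces from splitting the FIRST column only,
# leaving the other columns untouched in both pieces
full = sp.Matrix([[a1 + b1*x, a1*x + b1, c1],
                  [a2 + b2*x, a2*x + b2, c2],
                  [a3 + b3*x, a3*x + b3, c3]]).det()
piece_a = sp.Matrix([[a1, a1*x + b1, c1],
                     [a2, a2*x + b2, c2],
                     [a3, a3*x + b3, c3]]).det()
piece_b = sp.Matrix([[b1*x, a1*x + b1, c1],
                     [b2*x, a2*x + b2, c2],
                     [b3*x, a3*x + b3, c3]]).det()

# Linearity in one column: the determinant splits exactly
assert sp.expand(full - piece_a - piece_b) == 0
```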
Solution 3:
The determinant is a polynomial $D(x)$ of degree at most $2$ in $x$, whose coefficients depend on the $a_i$, $b_i$ and $c_i$.
It has the two roots $1$ and $-1$, since the determinant is obviously zero in these cases: at $x=1$ the first two columns are identical, and at $x=-1$ the first column is the negative of the second.
Therefore $$ D(x) = \lambda (1-x^2),$$
where $\lambda$ depends on the $a_i$, $b_i$ and $c_i$.
Finally, the multiplicative constant is obtained by setting $x=0$:
$$D(0) =\lambda = \begin{vmatrix} a_1 & b_1 & c_1 \\ a_2 & b_2 & c_2 \\ a_3 & b_3 & c_3 \\ \end{vmatrix}$$
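A small sympy sketch (variable names mine) confirming that $D(x)$ is quadratic in $x$, vanishes at $x=\pm1$, and that $D(0)$ recovers $\lambda$:

```python
import sympy as sp

a1, a2, a3, b1, b2, b3, c1, c2, c3, x = sp.symbols('a1 a2 a3 b1 b2 b3 c1 c2 c3 x')

D = sp.Matrix([[a1 + b1*x, a1*x + b1, c1],
               [a2 + b2*x, a2*x + b2, c2],
               [a3 + b3*x, a3*x + b3, c3]]).det()
lam = sp.Matrix([[a1, b1, c1], [a2, b2, c2], [a3, b3, c3]]).det()

# D is a polynomial of degree 2 in x
assert sp.degree(sp.expand(D), x) == 2
# x = 1 and x = -1 are roots...
assert sp.expand(D.subs(x, 1)) == 0
assert sp.expand(D.subs(x, -1)) == 0
# ...and D(0) gives the multiplier lambda = det of the plain matrix
assert sp.expand(D.subs(x, 0) - lam) == 0
```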
Solution 4:
A sneaky solution...
The left-hand side is a polynomial in $x$ of degree at most $2$ with zeroes at $x=\pm1$. Hence it has the form $C(1+x)(1-x) = C(1-x^2)$ for some $C$ not depending on $x$. Setting $x=0$ we get $C=\begin{vmatrix} a_1 & b_1 & c_1 \\ a_2 & b_2 & c_2 \\ a_3 & b_3 & c_3 \\ \end{vmatrix}$ as required.
To see that $x=\pm1$ are zeroes, observe that for $x=1$ the first and second columns are equal, hence the columns are linearly dependent and the determinant is zero. For $x=-1$ the first column is the negative of the second, so again the columns are linearly dependent and the determinant is zero.
Solution 5:
An alternative way of looking at the accepted answer, justifying its steps in more detail (to answer the questions in its comments), is to treat the columns as vectors (which I now notice @Semiclassical did in their comment): $$\mathbf{a} = \begin{bmatrix} a_1 \\ a_2 \\ a_3 \end{bmatrix}$$ and similarly for $\mathbf{b}$ and $\mathbf{c}$. The determinant is "multilinear": for any vectors $\mathbf{v}, \mathbf{w}, \mathbf{z}$ and any scalar $x$, $$\mathrm{det}\bigl( (\mathbf{v}+\mathbf{w}), \mathbf{z}, \mathbf{c}\bigr) = \mathrm{det}\bigl( \mathbf{v}, \mathbf{z}, \mathbf{c}\bigr) +\mathrm{det}\bigl( \mathbf{w}, \mathbf{z}, \mathbf{c}\bigr) \\ \mathrm{det}\bigl(\mathbf{v}, (\mathbf{w}+\mathbf{z}), \mathbf{c}\bigr) = \mathrm{det}\bigl( \mathbf{v}, \mathbf{w}, \mathbf{c}\bigr) +\mathrm{det}\bigl( \mathbf{v}, \mathbf{z}, \mathbf{c}\bigr) \\ \mathrm{det}\bigl( x\,\mathbf{v}, \mathbf{z}, \mathbf{c}\bigr) = x\,\mathrm{det}\bigl( \mathbf{v}, \mathbf{z}, \mathbf{c}\bigr)\\ \mathrm{det}\bigl( \mathbf{v}, x\,\mathbf{z}, \mathbf{c}\bigr) = x\,\mathrm{det}\bigl( \mathbf{v}, \mathbf{z}, \mathbf{c}\bigr) $$ It is also alternating, $$\mathrm{det} \bigl( \mathbf{v}, \mathbf{v}, \mathbf{c} \bigr) = 0,$$ which together with multilinearity (expand $\mathrm{det}\bigl((\mathbf{v}+\mathbf{w}), (\mathbf{v}+\mathbf{w}), \mathbf{c}\bigr) = 0$) gives antisymmetry: $$\mathrm{det} \bigl( \mathbf{v}, \mathbf{w}, \mathbf{c} \bigr) = -\mathrm{det} \bigl( \mathbf{w}, \mathbf{v}, \mathbf{c} \bigr)$$ This suffices to work out the equality directly (expanding the first column on the first line, then the second column of each resulting term on the second line): $$\require{cancel}\mathrm{det}\bigl( (\mathbf{a}+x\,\mathbf{b}), (x\,\mathbf{a}+\mathbf{b}), \mathbf{c}\bigr) \\ = \mathrm{det}\bigl( \mathbf{a}, (x\,\mathbf{a}+\mathbf{b}), \mathbf{c}\bigr) + x\,\mathrm{det}\bigl( \mathbf{b}, (x\,\mathbf{a}+\mathbf{b}), \mathbf{c}\bigr)\\ = x\,\cancel{\mathrm{det}\bigl( \mathbf{a}, \mathbf{a}, \mathbf{c}\bigr)} +\mathrm{det}\bigl( \mathbf{a}, \mathbf{b}, \mathbf{c}\bigr) + x\,\Bigl(x\,\mathrm{det}\bigl( \mathbf{b}, \mathbf{a}, \mathbf{c}\bigr)+\cancel{\mathrm{det}\bigl( \mathbf{b}, \mathbf{b}, \mathbf{c}\bigr)}\Bigr)\\ = \mathrm{det}\bigl( \mathbf{a}, \mathbf{b}, \mathbf{c}\bigr) + x^2\,\mathrm{det}\bigl( \mathbf{b}, \mathbf{a}, \mathbf{c}\bigr)\\ = \mathrm{det}\bigl( \mathbf{a}, \mathbf{b}, \mathbf{c}\bigr) - x^2\,\mathrm{det}\bigl( \mathbf{a}, \mathbf{b}, \mathbf{c}\bigr) = (1- x^2)\,\mathrm{det}\bigl( \mathbf{a}, \mathbf{b}, \mathbf{c}\bigr)$$
The advantage of this notation/approach, I feel, is that you can see it extends to higher dimensions (adding columns $\mathbf{d}$, $\mathbf{e}$, ...) without onerous notation. Also, you could replace each "$\mathrm{det}(\dots)$" above by "$f(\dots)$", so the argument holds for any alternating multilinear function $f$.
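Since the argument uses nothing beyond multilinearity and alternation, a quick numeric spot-check of the identity (numpy, random vectors and an $x$ of my choosing) may also be reassuring:

```python
import numpy as np

rng = np.random.default_rng(0)
a, b, c = rng.standard_normal((3, 3))  # three random vectors in R^3
x = 0.7

# Columns of the left-hand matrix are a + x·b, x·a + b, c
lhs = np.linalg.det(np.column_stack([a + x*b, x*a + b, c]))
rhs = (1 - x**2) * np.linalg.det(np.column_stack([a, b, c]))

assert np.isclose(lhs, rhs)  # identity holds up to float round-off
```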
Compared to the highest-scoring answer: that answer is more elegant, but I suspect it was found by working backwards from the solution (though it, too, works equally well in every dimension), and it requires a bit more facility with matrices.