Proving that two systems of linear equations are equivalent if they have the same solutions

Solution 1:

We have to prove the following: Given any solution set $L\subset{\mathbb R}^2$, any two homogeneous systems $$\Sigma: \qquad a_i x+ b_i y=0 \qquad (1\leq i\leq n)$$ having the solution set $L$ can be transformed into each other by means of row operations.

The solution set $L$ can be one of the following:

(i) $\ \{0\}$,

(ii) a one-dimensional subspace $\langle r\rangle$ with $r=(p,q)\ne 0$,

(iii) all of ${\mathbb R}^2$.

Ad (i): If $0$ is the only solution of $\Sigma$ then not all row vectors $c_i=(a_i,b_i)$ can be multiples of one and the same vector $c\ne0$. So there are two equations $a_1 x+b_1 y=0$, $a_2 x+ b_2 y=0$ in $\Sigma$ with linearly independent row vectors $(a_i, b_i)$, and by means of row operations one can transform these into $\Sigma_0: \ x=0, y=0$. Further row operations (subtracting suitable multiples of $x=0$ and $y=0$) reduce all remaining equations of $\Sigma$ to $0=0$. We conclude that in this case all systems $\Sigma$ are equivalent to $\Sigma_0$.
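To make case (i) concrete, here is a small Python sketch using exact rational arithmetic. The rows $(1,2)$ and $(3,4)$ are a hypothetical example of two linearly independent row vectors, not part of the proof; the row operations carry them to $x=0,\ y=0$.

```python
from fractions import Fraction

# Two equations with linearly independent row vectors (a_i, b_i):
#   1x + 2y = 0
#   3x + 4y = 0   (hypothetical example rows)
rows = [[Fraction(1), Fraction(2)],
        [Fraction(3), Fraction(4)]]

# R2 <- R2 - 3*R1 eliminates x from the second equation.
rows[1] = [rows[1][j] - 3 * rows[0][j] for j in range(2)]
# rows[1] is now [0, -2]; scale it: R2 <- (-1/2)*R2, giving y = 0.
rows[1] = [Fraction(-1, 2) * v for v in rows[1]]
# R1 <- R1 - 2*R2 eliminates y from the first equation, giving x = 0.
rows[0] = [rows[0][j] - 2 * rows[1][j] for j in range(2)]

# rows now equals [[1, 0], [0, 1]], i.e. the system x = 0, y = 0.
```

The same three kinds of operations (add a multiple of a row, scale a row) suffice for any pair of independent rows, since the $2\times 2$ coefficient matrix is invertible.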

Ad (ii): The system $\Sigma$ has to contain at least one equation with $c_i=(a_i,b_i)\ne 0$. We claim that every equation with $c_i\ne 0$ is individually equivalent to $\Sigma_1: \ q x -p y=0$, so in this case any given $\Sigma$ is equivalent to $\Sigma_1$. To prove the claim we may assume $a_i\ne 0$ (the case $b_i\ne 0$ is symmetric). Now $r\in L$ implies $a_i p+ b_i q=0$. If $q=0$, this would force $p=0$ and hence $r=0$; so $q\ne 0$. This gives $b_i=-a_i p/q$, and multiplying the equation $a_i x+ b_i y=0$ by $q/a_i$ yields $\Sigma_1$.
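A quick numerical check of this claim, with a hypothetical line $L$ spanned by $r=(p,q)=(2,3)$ and a hypothetical coefficient $a_i=5$ (neither value comes from the proof):

```python
from fractions import Fraction

# Hypothetical example: L is spanned by r = (p, q) = (2, 3).
p, q = Fraction(2), Fraction(3)

# An equation a*x + b*y = 0 that vanishes on r satisfies a*p + b*q = 0,
# so with a != 0 we get b = -a*p/q.  Take a = 5 for instance:
a = Fraction(5)
b = -a * p / q
assert a * p + b * q == 0  # r is indeed a solution of this equation

# Multiplying the equation by q/a turns the row (a, b) into (q, -p),
# i.e. the normal form  q*x - p*y = 0  of the claim.
scaled = (a * q / a, b * q / a)
# scaled == (3, -2) == (q, -p)
```

Any other admissible choice of $a$ produces the same normal form, which is the point of the claim.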

Ad (iii): This case is trivial. All rows of $\Sigma$ are $0$.

Solution 2:

Let there be two systems. If each equation of the second system is a linear combination of the equations of the first system, then every solution of the first system is also a solution of the second system; and if each equation of the first system is a linear combination of the equations of the second system, then every solution of the second system is also a solution of the first system.

Now, let us consider the following two homogeneous systems in the two unknowns $x_1, x_2$, where by hypothesis both systems have the same solutions: $A_{11}x_1+A_{12}x_2=0,\ \ldots,\ A_{n1}x_1+A_{n2}x_2=0$ and $B_{11}x_1+B_{12}x_2=0,\ \ldots,\ B_{n1}x_1+B_{n2}x_2=0$.

Choose scalars $C_1, C_2, \ldots, C_n$. Multiplying the $k^{th}$ equation of the first system by $C_k$ and adding the results, collecting the coefficient of each variable, gives $(C_1A_{11}+\cdots+C_nA_{n1})x_1+(C_1A_{12}+\cdots+C_nA_{n2})x_2=0$.

Since both systems have the same solutions, each equation of the second system arises in this way: for every $j$ there exist scalars $C_1, \ldots, C_n$ (depending on $j$) such that

$C_1A_{11}+\cdots+C_nA_{n1}=B_{j1}$ and $C_1A_{12}+\cdots+C_nA_{n2}=B_{j2}$,

which proves that every equation of the second system is a linear combination of the equations of the first system. In the same way we can show that every equation of the first system is a linear combination of the equations of the second system, and thus conclude that the two systems are equivalent.
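The scalars $C_k$ can be computed explicitly in small cases. Here is a sketch with a hypothetical pair of systems sharing the solution set $\{0\}$: the first has rows $(1,1)$ and $(1,-1)$, and we express the row $(2,0)$ of a second system as a combination of them (all values invented for illustration).

```python
from fractions import Fraction

# Hypothetical first system, whose only solution is (0, 0):
A = [[Fraction(1), Fraction(1)],    # x1 + x2 = 0
     [Fraction(1), Fraction(-1)]]   # x1 - x2 = 0
B_row = [Fraction(2), Fraction(0)]  # 2*x1 = 0, a row of a second system

# Solve C1*A[0] + C2*A[1] = B_row for (C1, C2) by Cramer's rule,
# possible here because the two A-rows are linearly independent.
det = A[0][0] * A[1][1] - A[1][0] * A[0][1]
C1 = (B_row[0] * A[1][1] - A[1][0] * B_row[1]) / det
C2 = (A[0][0] * B_row[1] - A[0][1] * B_row[0]) / det

combo = [C1 * A[0][j] + C2 * A[1][j] for j in range(2)]
# C1 == C2 == 1, and combo == B_row == [2, 0]: the B-row is indeed
# a linear combination of the A-rows.
```

For larger systems one would solve the analogous linear system for $C_1,\ldots,C_n$ row by row.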

Solution 3:

If equivalence means "that each equation in one system is a linear combination of equations in the other", then the proof is almost immediate by Gauss-Jordan elimination. This extends Gaussian elimination by performing elementary row operations to bring the system first to upper triangular and then to diagonal form, from which the solution can be read off. One can also work backwards: start from the solution of the system, written as a diagonal matrix, and apply a different sequence of elementary row operations to arrive at a different but equivalent system. The systems are equivalent because each equation (row of the matrix) of one can, by construction, be traced back and written as a linear combination of the equations of the other.
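The backwards construction can be sketched in a few lines of Python. Starting from the diagonal system $x=0,\ y=0$, two invented elementary row operations manufacture an equivalent-looking system; the particular multipliers $2$ and $3$ are an arbitrary choice for illustration.

```python
from fractions import Fraction

# Start from the "solved" diagonal system  x = 0, y = 0 ...
rows = [[Fraction(1), Fraction(0)],
        [Fraction(0), Fraction(1)]]

# ... and run elementary row operations backwards to build an
# equivalent system (the operations themselves are a hypothetical choice):
rows[0] = [rows[0][j] + 2 * rows[1][j] for j in range(2)]  # R1 <- R1 + 2*R2
rows[1] = [rows[1][j] + 3 * rows[0][j] for j in range(2)]  # R2 <- R2 + 3*R1

# rows now equals [[1, 2], [3, 7]]: each row is, by construction, a
# linear combination of the original rows, so the solution set is
# unchanged (still just {0}, as the determinant 1*7 - 2*3 = 1 is nonzero).
```

Reversing the two operations (subtract instead of add, in the opposite order) recovers the diagonal system, which is exactly the invertibility of elementary row operations that the argument relies on.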