Solving systems of linear equations when the matrix of coefficients is singular

When attempting to solve a problem of the form $AX=B$, where $A$ is the matrix of coefficients, $X$ contains the variables and $B$ is the right-hand side, it turned out that $A$ was singular. As a result I could not premultiply both sides by $A^{-1}$. In such a case, how would I solve for the variables without having to use augmented matrices?

Here is the question:

$$\begin{pmatrix} 1&2&1&1&1 \\ 2&1&2&1&1 \\ 1&2&3&1&1 \\ 2&2&1&1&3 \\3&3&5&2&2 \end{pmatrix}\begin{pmatrix} u \\ v \\ w \\ x \\ y \end{pmatrix}=\begin{pmatrix} 6.3 \\ 6.7 \\ 7.7 \\ 9.8 \\10.9 \end{pmatrix}$$

Since $A$ is singular I cannot do $X=A^{-1}B$, which is the method I would typically use to solve for the unknowns. In later parts of the book I am studying they allude to a method one can use to solve for $X$, but do not elaborate further. Does anyone have an idea of how to solve this problem (or any problem in general where $A$ is singular) without the use of augmented matrices, or is that the only way?


Solution 1:

Even when the system of equations is singular, you can find a least-squares solution by solving the normal equations $A^{T}Ax=A^{T}b$.
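As a sketch of this approach in Python with NumPy (rather than forming the normal equations by hand, `np.linalg.lstsq` handles the rank-deficient case directly and returns a least-squares solution):

```python
import numpy as np

# Coefficient matrix and right-hand side from the question.
A = np.array([[1, 2, 1, 1, 1],
              [2, 1, 2, 1, 1],
              [1, 2, 3, 1, 1],
              [2, 2, 1, 1, 3],
              [3, 3, 5, 2, 2]], dtype=float)
b = np.array([6.3, 6.7, 7.7, 9.8, 10.9])

# lstsq minimizes ||Ax - b||^2; for a singular A it returns the
# minimum-norm least-squares solution, which satisfies the
# normal equations A^T A x = A^T b.
x, residuals, rank, sv = np.linalg.lstsq(A, b, rcond=None)
print("rank of A:", rank)               # less than 5, since A is singular
print("least-squares solution:", x)
```

The reported rank being below 5 confirms that $A$ is singular, and the returned $x$ is the smallest-norm vector among all least-squares minimizers.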

Solution 2:

Per my comment, Gaussian Elimination is perhaps the easiest thing to do.

However:

You don't have to perform row reduction as given by the Gaussian Algorithm.

Any two systems that have row equivalent augmented matrices have the same solution set.

There is a lot of cancellation that can occur in your matrix by using row operations other than those used in Gaussian elimination; try taking advantage of that.

If you're lucky, you can obtain the solution, if any exists, through ad hoc methods by examining the "sparse" row-equivalent form of the augmented matrix.

Note here that the augmented matrix for your system is row equivalent to (adding multiples of row 1 to the others): $$\begin{pmatrix} 1&2&1&1&1&\ \ 6.3 \\ 1&-1&1&0&0&\ \ .4 \\ 0&0&2&0&0&\ \ 1.4 \\ 1&0&0&0&2&\ \ 3.5\\1&-1&3&0&0&\ \ -1.7 \end{pmatrix} $$

The above is row equivalent to (working with rows 2 and 5):

$$\begin{pmatrix} 1&2&1&1&1&\ \ 6.3 \\ 1&-1&1&0&0&\ \ .4 \\ 0&0&2&0&0&\ \ 1.4 \\ 1&0&0&0&2&\ \ 3.5\\0&0&2&0&0&\ \ -2.1 \end{pmatrix} $$

Rows 3 and 5 now read $2w = 1.4$ and $2w = -2.1$, a contradiction; so there is no solution.

Solution 3:

Use Singular Value Decomposition to obtain a low-rank approximation of the coefficient matrix. Example (using MATLAB for simplicity; one can easily obtain this solution by hand!):

A = [1 1; 2 2];
b = [5; 5];                            % column vector, so A*x = b is well-formed
[U, S, V] = svd(A);
A_approx = U(:,1)*S(1,1)*V(:,1)';      % rank-1 approximation of A

Now one can use A_approx instead of A to obtain a solution in the least-squares sense, as @Paul mentioned above.
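The same idea as a Python/NumPy sketch: build the rank-1 approximation from the SVD, then apply its pseudoinverse to get the minimum-norm least-squares solution. (Here $A$ already has rank 1, so the approximation reproduces $A$ exactly.)

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [2.0, 2.0]])
b = np.array([5.0, 5.0])

# Rank-1 approximation of A from its SVD: keep only the
# largest singular value and its singular vectors.
U, s, Vt = np.linalg.svd(A)
A_approx = s[0] * np.outer(U[:, 0], Vt[0, :])

# Minimum-norm least-squares solution via the pseudoinverse.
x = np.linalg.pinv(A_approx) @ b
print(x)  # approximately [1.5, 1.5]
```

One can check this by hand: the normal equations reduce to $x_1 + x_2 = 3$, and the minimum-norm point on that line is $(1.5, 1.5)$.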