Linear Regression Computation as $y = ax$
It's not difficult to derive these equations; I'll start with the general line $y = ax + b$ and then specialise to $y = ax$.
We have a set of points $(x_i, y_i)$ and we want to fit them with a function $y_{\text{predicted},i} = ax_i + b$. The coefficients $a$ and $b$ are chosen so that the error
$$ E = \sum_i (y_{\text{predicted},i}-y_i)^2 = \sum_i (ax_i +b-y_i)^2 $$
is minimised. Note that $E$ is a function of $a$ and $b$ only, $E=E(a,b)$. Expanding the square, we get
$$ \begin{split} E &= \sum_i (ax_i +b-y_i)^2 \\ &= \sum_i \left[a^2 x_i ^2 + 2abx_i + b^2 - 2ax_i y_i - 2by_i + y_i^2 \right]\\ &=a^2 \sum x_i ^2 + 2ab\sum x_i + nb^2 - 2a\sum x_i y_i - 2b \sum y_i + \sum y_i^2 \end{split} $$
where $n$ is the number of points and all sums run over $i$. To minimise $E$, we set both partial derivatives to zero:
$$ \frac{\partial E}{\partial a} = 2a \sum x_i ^2 + 2b\sum x_i- 2\sum x_i y_i =0 $$
$$ \frac{\partial E}{\partial b} = 2a\sum x_i + 2nb - 2 \sum y_i =0 $$
This gives the pair of linear equations
$$ \left\{ \begin{array}{lll} a \sum x_i ^2 &+ b\sum x_i &= \sum x_i y_i \\ a\sum x_i &+ nb &= \sum y_i \\ \end{array} \right. \qquad \text{or}\qquad \left[ \begin{array}{cc} \sum x_i ^2 & \sum x_i \\ \sum x_i & n \end{array} \right] \left[ \begin{array}{c} a \\b \end{array} \right] = \left[ \begin{array}{c} \sum x_i y_i \\ \sum y_i \end{array} \right] $$
Taking the matrix inverse (easy for a $2\times 2$ matrix) recovers the equations you quoted at the beginning; explicitly,
$$ a = \frac{n\sum x_i y_i - \sum x_i \sum y_i}{n\sum x_i^2 - \left(\sum x_i\right)^2}, \qquad b = \frac{\sum x_i^2 \sum y_i - \sum x_i \sum x_i y_i}{n\sum x_i^2 - \left(\sum x_i\right)^2}. $$
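If you want to check the result numerically, here is a minimal Python sketch (the function name `fit_line` is my own, just for illustration) that evaluates the closed-form solution of the $2\times 2$ system above:

```python
def fit_line(xs, ys):
    """Least-squares fit of y = a*x + b via the 2x2 normal equations."""
    n = len(xs)
    sx  = sum(xs)                               # sum of x_i
    sy  = sum(ys)                               # sum of y_i
    sxx = sum(x * x for x in xs)                # sum of x_i^2
    sxy = sum(x * y for x, y in zip(xs, ys))    # sum of x_i * y_i
    det = n * sxx - sx * sx                     # determinant of the 2x2 matrix
    a = (n * sxy - sx * sy) / det               # first row of the inverse times the RHS
    b = (sxx * sy - sx * sxy) / det             # second row
    return a, b

# Points lying exactly on y = 2x + 1 should recover a = 2, b = 1.
print(fit_line([0, 1, 2, 3], [1, 3, 5, 7]))    # (2.0, 1.0)
```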
Now to your actual question: we simply set $b=0$ in the expression for $E$, which gives
$$ E= a^2 \sum x_i ^2 - 2a\sum x_i y_i + \sum y_i^2 $$
Since $E$ now depends on $a$ alone, we set its derivative to zero:
$$ \frac{\partial E}{\partial a} = 2a \sum x_i ^2 - 2\sum x_i y_i =0 $$
which yields the simple expression for $a$:
$$ a =\frac{ \sum x_i y_i }{ \sum x_i ^2 } $$
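And the no-intercept case in the same style (again, `fit_line_through_origin` is just an illustrative name):

```python
def fit_line_through_origin(xs, ys):
    """Least-squares slope for y = a*x (no intercept): a = sum(x*y) / sum(x^2)."""
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

# Points on y = 2x recover a = 2 exactly.
print(fit_line_through_origin([1, 2, 3], [2, 4, 6]))    # 2.0
```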