Usefulness of Variance
Solution 1:
The variance is easier to deal with in intermediate computations, because it doesn't have a square root. For example, if $X$ and $Y$ are independent, then $Var(X+Y) = Var(X) + Var(Y)$, which is a simpler formula than $SD(X+Y) = \sqrt{SD(X)^2 + SD(Y)^2}$. Basically, if you want to work in terms of the standard deviation all the time, then you end up doing a lot of squaring and square-rooting.
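To see the additivity concretely, here is a small Python sketch (the particular distributions below are just made-up illustrative choices) that builds the exact distribution of $X+Y$ for two independent discrete random variables and checks that its variance equals $Var(X)+Var(Y)$:

```python
from itertools import product

# Two small independent distributions, given as {value: probability} dicts.
# The particular numbers are arbitrary illustrative choices.
px = {0: 0.2, 1: 0.5, 2: 0.3}
py = {0: 0.6, 3: 0.4}

def mean(dist):
    return sum(v * p for v, p in dist.items())

def var(dist):
    m = mean(dist)
    return sum(p * (v - m) ** 2 for v, p in dist.items())

# Distribution of X + Y under independence: P(X=x, Y=y) = P(X=x) * P(Y=y).
pz = {}
for (x, p), (y, q) in product(px.items(), py.items()):
    pz[x + y] = pz.get(x + y, 0.0) + p * q

print(var(px) + var(py))  # Var(X) + Var(Y)
print(var(pz))            # Var(X + Y): the same value, up to floating-point rounding
```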
Your claimed formula for $E(X^2)$ is almost true -- you're thinking of $X^2$ as a new random variable unrelated to $X$. To be strictly correct you'd need $E(X^2) = \sum_{i=0}^{n^2} i P(X^2 = i)$, since if $X$ can take values from $0$ to $n$ then $X^2$ can take values as large as $n^2$. But then you're faced with the problem of getting $P(X^2 = i)$. In practice one uses picakhu's formula $E(X^2) = \sum_{i=0}^n i^2 P(X=i)$.
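If you want to convince yourself numerically, here is a quick Python sketch (the distribution on $\{0,1,2,3\}$ is an arbitrary illustrative choice) comparing picakhu's formula with the corrected version that first works out $P(X^2 = i)$ and then takes an ordinary expectation:

```python
# Distribution of X on {0, ..., n}; the probabilities are arbitrary
# illustrative values that sum to 1.
px = {0: 0.1, 1: 0.2, 2: 0.4, 3: 0.3}

# picakhu's formula: E[X^2] = sum_i i^2 * P(X = i)
e_x2_direct = sum(i ** 2 * p for i, p in px.items())

# The corrected version of the original formula: first build the
# distribution of the new random variable X^2, then take its plain mean.
p_x2 = {}
for i, p in px.items():
    p_x2[i ** 2] = p_x2.get(i ** 2, 0.0) + p
e_x2_via_x_squared = sum(j * q for j, q in p_x2.items())

print(e_x2_direct, e_x2_via_x_squared)  # both give the same number (4.5 here)
```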
Solution 2:
Variance and standard deviation are the same, in the sense that if you know one you know the other. The importance of variance is that if $X,Y$ are independent then $$V[X+Y] = V[X]+V[Y],$$ i.e. variance is additive. If you substituted the standard deviation instead, you'd get a more complicated formula.
You might ask why people do not use the median instead of the expectation, or one of $E[|X-E[X]|]$, $E[|X-M[X]|]$ (where $M[X]$ denotes the median) instead of the standard deviation. The reason is that the expectation and variance enjoy many nice properties, like additivity, and are much easier to work with analytically.
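To make the contrast concrete, here is a small Python sketch (two independent fair coins, chosen purely for illustration) showing that the variance is additive under independence while the mean absolute deviation $E[|X-E[X]|]$ is not:

```python
from itertools import product

px = {0: 0.5, 1: 0.5}   # a fair coin
py = {0: 0.5, 1: 0.5}   # an independent fair coin

def mean(dist):
    return sum(v * p for v, p in dist.items())

def var(dist):
    m = mean(dist)
    return sum(p * (v - m) ** 2 for v, p in dist.items())

def mad(dist):
    # mean absolute deviation about the mean, E[|X - E[X]|]
    m = mean(dist)
    return sum(p * abs(v - m) for v, p in dist.items())

# Distribution of X + Y under independence.
pz = {}
for (x, p), (y, q) in product(px.items(), py.items()):
    pz[x + y] = pz.get(x + y, 0.0) + p * q

print(var(pz), var(px) + var(py))   # 0.5 0.5  -- variance is additive
print(mad(pz), mad(px) + mad(py))   # 0.5 1.0  -- mean absolute deviation is not
```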
Your formula for $E[X^2]$ is valid (provided the range of summation covers all possible values of $X^2$, which run from $0$ to $n^2$ in your case), though a more useful one is $$E[X^2] = \sum_i \Pr[X=i]\, i^2.$$