Why square a constant when determining variance of a random variable?
Solution 1:
You have that
$$\text{Var}(aX) = E[(aX)^2]-(E[aX])^2 = E[a^2 X^2]-(aE[X])^2 $$
$$=a^2 E[ X^2]-a^2(E[X])^2 $$ $$= a^2( E[X^2]-(E[X])^2 ) = a^2 \text{Var}(X)$$
Edit: or this one may be more basic (depending on your definition of variance)
$$\text{Var}(aX) = E[(aX-E[aX])^2 ] = E[(aX-aE[X])^2 ] $$
$$=E[a^2(X-E[X])^2 ] $$ $$= a^2E[(X-E[X])^2 ] = a^2 \text{Var}(X)$$
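As a quick numerical sanity check of the identity (a sketch, not part of the original derivation; the sample size and seed are arbitrary), you can estimate both sides from simulated draws of $X$ and confirm they agree:

```python
import random

random.seed(0)
a = 4
xs = [random.gauss(0, 1) for _ in range(100_000)]  # simulated samples of X

def var(data):
    """Population variance: average squared deviation from the mean."""
    m = sum(data) / len(data)
    return sum((v - m) ** 2 for v in data) / len(data)

lhs = var([a * x for x in xs])  # Var(aX), estimated from the scaled samples
rhs = a ** 2 * var(xs)          # a^2 Var(X), estimated from the raw samples
print(lhs, rhs)                 # the two estimates agree up to floating point
```

Because both estimates use the same underlying samples, they match essentially exactly, which mirrors the algebra above: scaling every draw by $a$ scales every deviation by $a$, hence every squared deviation by $a^2$.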
Solution 2:
Tryss's answer is correct. But you seem to need a more elementary illustration. Here it is, at least for the variance of sample data. (Your question is really about the variance of a random variable, but the point is the same.)
Take the two numbers $1$ and $3$. The mean of this set of data is $2$. The variance is the average squared deviation from the mean. The deviations from the mean are $-1$ and $1$, so the squared deviations are $1$ and $1$, so the average squared deviation is $1$. Hence the variance of this set of data is $1$.
Now look what happens when we multiply the dataset by $4$. Our two numbers become $4$ and $12$. The mean is now $8$. (This illustrates that when you multiply by a constant, the mean gets multiplied by that constant.) The deviations from the mean are $-4$ and $4$ (the deviations also get multiplied by the constant). Therefore the squared deviations are $16$ and $16$, so the average squared deviation is $16$. Hence the variance of this new set of data is $16$.
Moral: when we multiplied our data by 4, the variance got multiplied by 16. This is totally unsurprising, because the variance is the average squared deviation. When you multiply your data by a constant, the deviations also get multiplied by that constant, so the squared deviations get multiplied by the square of that constant.
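The worked example above can be checked directly in code. This is a small sketch using the same two-point dataset, with `variance` implemented as the average squared deviation described in the answer:

```python
def variance(data):
    """Average squared deviation from the mean (population variance)."""
    mean = sum(data) / len(data)
    return sum((x - mean) ** 2 for x in data) / len(data)

data = [1, 3]
scaled = [4 * x for x in data]  # multiply every data point by 4 -> [4, 12]

print(variance(data))    # 1.0
print(variance(scaled))  # 16.0, i.e. the variance got multiplied by 4^2
```

Multiplying the data by $4$ multiplies each deviation by $4$, so each squared deviation, and hence their average, is multiplied by $16$.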