Number of solutions of $x^2_1+\dots+x^2_n=0,$ $x_i\in \Bbb{F}_q.$
Let $\mathbb F$ be a field with $q$ elements and $\operatorname{char}(\mathbb F)\neq2$. I want to know the number of solutions of this equation:
$$x^2_1+x^2_2+\dots+x^2_n=0 \text{ where } x_i\in \mathbb F.$$
Any suggestions?
Let $A_n$ be an abbreviation for $x_1^2+x_2^2+\cdots+x_n^2$. Let $a_n$ be the number of solutions of $A_n=0$, let $b_n$ be the number of solutions of $A_n=-1$, and let $c_n$ be the number of solutions of $A_n=c$, where $-c$ is some fixed nonsquare in the field. Find a system of recurrences involving $a_n$, $b_n$, and $c_n$, and solve.
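To see the pattern before guessing the recurrences, it can help to tabulate $a_n$, $b_n$, $c_n$ by brute force for a small prime field; the sketch below does that in Python, with $p=7$ and the nonsquare $-c=3$ as purely illustrative choices.

```python
# Brute-force tabulation of a_n, b_n, c_n for a small prime p (illustrative only):
# a_n = #{A_n = 0}, b_n = #{A_n = -1}, c_n = #{A_n = c} where -c is a nonsquare.
from itertools import product

p = 7
c = (-3) % p                          # -c = 3 is a nonsquare mod 7
for n in range(1, 5):
    counts = {0: 0, p - 1: 0, c: 0}   # targets 0, -1, c
    for xs in product(range(p), repeat=n):
        s = sum(x * x for x in xs) % p
        if s in counts:
            counts[s] += 1
    print(n, counts[0], counts[p - 1], counts[c])   # n, a_n, b_n, c_n
```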
Alternatively, use exponential sums. This is easiest to do when $q$ is prime. $$\sum_{x_1}\sum_{x_2}\cdots\sum_{x_n}\sum_te^{2\pi it(x_1^2+x_2^2+\cdots+x_n^2)/p}$$ gives $p$ times the number of solutions, where the sums are over all $x_1,x_2,\dots,x_n,t$ in the field. Bring the sum on $t$ to the front and separate out the term with $t=0$ to get $$p^n+\sum_{t=1}^{p-1}\sum_{x_1}\sum_{x_2}\cdots\sum_{x_n}e^{2\pi it(x_1^2+x_2^2+\cdots+x_n^2)/p}=p^n+\sum_{t=1}^{p-1}\left(\sum_xe^{2\pi itx^2/p}\right)^n.$$ The inner sum is a quadratic Gauss sum whose value is recorded in many texts and on many websites.
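As a quick numerical sanity check of this identity (not of the Gauss-sum evaluation itself), one can compare both sides for a small prime; in the sketch below, $p=5$ and $n=3$ are just illustrative choices.

```python
# Count solutions of x_1^2+...+x_n^2 = 0 over F_p two ways: via the
# exponential-sum identity above, and by brute force.
import cmath
from itertools import product

def count_via_exponential_sums(p, n):
    total = complex(p**n)
    for t in range(1, p):
        inner = sum(cmath.exp(2j * cmath.pi * t * x * x / p) for x in range(p))
        total += inner**n
    return round(total.real / p)

def count_brute_force(p, n):
    return sum(1 for xs in product(range(p), repeat=n)
               if sum(x * x for x in xs) % p == 0)

print(count_via_exponential_sums(5, 3), count_brute_force(5, 3))  # both give 25
```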
Similar methods apply when $q=p^r$, $r\gt1$, but I disremember the trick.
Note the problem $x_1^2+x_2^2+\cdots+x_n^2=1$ (over the field of $p$ elements) is done in detail in Ireland and Rosen, A Classical Introduction to Modern Number Theory, Chapter 8, Section 6, pages 101-102. $x_1^2+x_2^2+\cdots+x_n^2=0$ is left as Exercise 19 in that chapter. The tools for extending the results to finite fields more generally are developed in Chapter 10.
A partial answer that works when $q\equiv1\pmod4$. This is a minor simplification of the use of recurrences described in the first sentence of Gerry Myerson's answer (which I somehow missed, as my gaze was drawn to the character sums :/). For the case $q\equiv-1\pmod4$ I recommend doing it his way.
When $q\equiv1\pmod4$ we know that the field contains a primitive fourth root of unity, call it $i$. Let $A(n)$ be the number of solutions of the equation in $n$ variables. We know (see also the comments by Gerry Myerson) that $A(1)=1$ and $A(2)=2q-1$.
We can write $$x_1^2+x_2^2=(x_1+ix_2)(x_1-ix_2)=z_1z_2,$$ where $z_1=x_1+ix_2$ and $z_2=x_1-ix_2$. The mapping $(x_1,x_2)\mapsto(z_1,z_2)$ is clearly a bijection, so $A(n)$ is equal to the number of solutions of $$ z_1z_2=-(x_3^2+x_4^2+\cdots+x_n^2). $$ Here the r.h.s. is equal to zero for exactly $A(n-2)$ vectors $(x_3,x_4,\ldots,x_n)$, so that gives us $(2q-1)A(n-2)$ solutions. OTOH, if the r.h.s. is non-zero, we can let $z_2$ be any non-zero element ($(q-1)$ choices), and can then uniquely solve for $z_1$. This gives us $(q-1)(q^{n-2}-A(n-2))$ solutions. Altogether we get a recurrence relation $$ A(n)=(2q-1)A(n-2)+(q-1)\left(q^{n-2}-A(n-2)\right)=(q-1)q^{n-2}+qA(n-2). $$
It is easy to prove by induction on $n$ that this has the solution $$ A(2k+1)=q^{2k} $$ for odd values of $n=2k+1$, and $$ A(2k)=q^{2k-1}+q^k-q^{k-1} $$ for even values of $n=2k$.
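Here is a small brute-force check of these closed forms (a sketch; $q=5$ is just a sample prime with $q\equiv1\pmod4$, and using integers mod $q$ assumes $q$ is prime).

```python
# Verify A(2k+1) = q^(2k) and A(2k) = q^(2k-1) + q^k - q^(k-1) for q = 5.
from itertools import product

def brute(q, n):
    return sum(1 for xs in product(range(q), repeat=n)
               if sum(x * x for x in xs) % q == 0)

q = 5
for n in range(1, 6):
    k = n // 2
    if n % 2 == 1:
        formula = q**(n - 1)                        # A(2k+1) = q^(2k)
    else:
        formula = q**(n - 1) + q**k - q**(k - 1)    # A(2k)
    assert brute(q, n) == formula, (n, brute(q, n), formula)
print("closed forms agree with brute force for q = 5, n = 1..5")
```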
I'll try to answer this in a self-contained manner - I will repeat ideas from Paul's and Gerry's answers, but in more detail. I'll restrict myself to $q = p$ (a prime), but the same arguments are valid in the general case.
We'll generalize, because the problem calls for it: you want to count solutions to $\sum_{i=1}^{n} x_i^2 = 0$, so why not understand the whole distribution of $f(x_1,\dots,x_n) = \sum_{i=1}^{n} x_i^2$, i.e. how many times each value of $\mathbb{F}_{p}$ appears in the multiset $\{ f(\vec{x}) \mid \vec{x} \in \mathbb{F}_{p}^{n} \}$?
This is a convolution problem: let $v$ be the vector counting solutions to $x^2 = i$, i.e. its $i$'th coordinate is the number of square roots of $i$ in $\mathbb{F}_{p}$. Convolving $n$ copies of $v$ (cyclically, over $\mathbb{Z}/p$) gives the distribution of our function $f$, and we need the $0$'th coordinate of $\underbrace{v * \cdots * v}_\textrm{n times}$.
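For concreteness, here is the convolution picture spelled out as code (a minimal sketch, assuming $q=p$ prime so that integers mod $p$ model the field; the function names are mine).

```python
# Distribution of x_1^2 + ... + x_n^2 over F_p via cyclic convolution.
def squares_vector(p):
    # v[i] = number of square roots of i in F_p
    v = [0] * p
    for x in range(p):
        v[x * x % p] += 1
    return v

def cyclic_convolve(u, w):
    p = len(u)
    out = [0] * p
    for a in range(p):
        for b in range(p):
            out[(a + b) % p] += u[a] * w[b]
    return out

def distribution(p, n):
    dist = [1] + [0] * (p - 1)      # the empty sum is always 0
    v = squares_vector(p)
    for _ in range(n):
        dist = cyclic_convolve(dist, v)
    return dist                     # dist[0] is the count of solutions

print(distribution(7, 3)[0])        # 49 = 7^2 for p = 7, n = 3
```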
You can approach this via the Fourier transform (this is where Gauss sums come in), since the Fourier transform turns convolution into pointwise multiplication, which is easier, but we can avoid that.
The key step is noting that $v*v$ is almost constant - it has a certain value on 0, and another value on all $\mathbb{F}_{p}^{\times}$. Why is that?
If $p \equiv 1 \pmod 4$, there is a square root of $-1$ in $\mathbb{F}_{p}$; call it $i$. Then the equation $x^2 + y^2 = j$ is equivalent to $(x+iy)(x-iy) = j$. We can make the substitution $x+iy=t$, $x-iy=s$ (invertible since $\operatorname{char}(\mathbb{F}_p) \neq 2$), and the equation becomes $st = j$. If $j=0$ there are $2p-1$ solutions; otherwise there are $p-1$ solutions, namely $(s,js^{-1})$ for $s \neq 0$.
If $p \equiv 3 \pmod 4$, there is no square root of $-1$ in our field, but we can construct a quadratic extension of our field, which will contain it: $i \in \mathbb{F}_{p}[x]/(x^2+1) \cong \mathbb{F}_{p^2}$ (as there is a unique quadratic extension of any finite field). The equation $x^2 + y^2 = j$ becomes $$x^2 + y^2 = (x+iy)(x-iy) = (x+iy)^{p+1} = j,$$ because $x-iy$ is the conjugate of $x+iy$, and the Frobenius automorphism $t \mapsto t^p$ takes elements to their conjugates (since it is $\mathbb{F}_p$-linear, you can just verify $1^p = 1$, $i^p = -i$). So we can write the equation as $t^{p+1} = j$ where $j \in \mathbb{F}_{p}$, $t \in \mathbb{F}_{p^2}$. Note that $\mathbb{F}_{p} = (\mathbb{F}_{p^2})^{p+1}$: one inclusion follows since for $t \neq 0$, $(t^{p+1})^{p-1} = t^{p^2 - 1} = 1$, so $t^{p+1} \in \mathbb{F}_{p}$; the other holds because $\mathbb{F}_p^{\times}$ is the unique subgroup of order $p-1$ of the cyclic group $(\mathbb{F}_{p^2})^{\times}$, hence is the image of the $(p+1)$-st power map. Since $(\mathbb{F}_{p^2})^{\times}$ is cyclic, say generated by $g$, for $j \neq 0$ we can write $t = g^x$ and $j = g^{(p+1)a}$, and the equation reduces to $$(p+1)x \equiv (p+1)a \pmod{p^2-1},$$ which has $p+1$ solutions in $\mathbb{Z}/(p^2-1)\mathbb{Z}$. For $j=0$ there is clearly one solution, $t=0$, i.e. $x=y=0$.
So $v*v$ is nice - as a function of $i$, it depends only on whether $i =0$ or not.
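A tiny brute-force check of the two cases, for one sample prime from each class ($p=5$ and $p=7$ are arbitrary choices):

```python
# counts[j] = number of pairs (x, y) in F_p^2 with x^2 + y^2 = j
def two_square_counts(p):
    counts = [0] * p
    for x in range(p):
        for y in range(p):
            counts[(x * x + y * y) % p] += 1
    return counts

print(two_square_counts(5))   # [9, 4, 4, 4, 4]       : 2p-1 at 0, p-1 elsewhere
print(two_square_counts(7))   # [1, 8, 8, 8, 8, 8, 8] : 1 at 0,    p+1 elsewhere
```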
Now, a nice exercise: let $u = (a,\underbrace{b, \cdots , b}_\textrm{p-1 times})$ and $w = (c,\underbrace{d, \cdots , d}_\textrm{p-1 times})$. Then $u*w = (e,\underbrace{f, \cdots , f}_\textrm{p-1 times})$, where $e-f = (a-b)(c-d)$ and $e + (p-1)f = (a + (p-1)b)(c + (p-1)d)$. If $a=c$ and $b=d$, we get $e-f=(a-b)^2$ and $e+(p-1)f = (a+(p-1)b)^2$. This is where the Fourier aspect is hidden - convolution of vectors multiplies the frequencies.
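If you prefer to see the exercise checked numerically rather than by hand, here is a throwaway verification (all the specific numbers are arbitrary):

```python
# Check e - f = (a-b)(c-d) and e + (p-1)f = (a+(p-1)b)(c+(p-1)d)
# for one arbitrary choice of p, a, b, c, d.
p, a, b, c, d = 7, 3, 5, 2, 11
u = [a] + [b] * (p - 1)
w = [c] + [d] * (p - 1)
conv = [sum(u[i] * w[(k - i) % p] for i in range(p)) for k in range(p)]
e, f = conv[0], conv[1]
assert all(x == f for x in conv[1:])     # constant away from the 0'th coordinate
assert e - f == (a - b) * (c - d)
assert e + (p - 1) * f == (a + (p - 1) * b) * (c + (p - 1) * d)
print("exercise verified:", e, f)
```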
We're really close.
If $n$ is even, then $\underbrace{v * \cdots * v}_\textrm{n times} = \underbrace{(v * v) * \cdots * (v*v)}_\textrm{n/2 times}$, and this is easy by the exercise: applying it repeatedly to the $n/2$ copies of $v*v$, our final vector will be of the form $(a,\underbrace{b, \cdots , b}_\textrm{p-1 times})$, where $a-b = \begin{cases} \ ((2p-1) - (p-1))^{n/2} = p^{n/2} &\text{if } p \equiv 1 \pmod{4}\\ ((1) - (p+1))^{n/2} = (-p)^{n/2} & \text{if } p\equiv 3 \pmod{4} \end{cases} $ (recall that the first entry of each vector is its value at $0$) and $a+(p-1)b = (p^2)^{n/2}=p^n$. Solving these two equations gives $a$, which is the number you're seeking.
If $n$ is odd, we first compute $\underbrace{v * \cdots * v}_\textrm{n-1 times}$ as in the even case. Then we need to convolve $(a,\underbrace{b, \cdots , b}_\textrm{p-1 times})$ with $v$, which turns out to be a relatively easy problem: the $i$'th coordinate will be $(a-b)v_i + pb$ (do it yourself, using $\sum_i v_i = p$). For $i=0$ we have $v_0 = 1$, so the answer is $a-b+pb = a+(p-1)b$, which by the second relation above is just $p^{n-1}$.
So there are four cases, depending on whether $\sqrt{-1}$ lies in $\mathbb{F}_{p}$ (i.e. whether $p \equiv 1 \pmod 4$) and on the parity of $n$.
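Solving the two equations explicitly in each case gives closed forms; the consolidation below is my own algebra (so worth double-checking), cross-checked against brute force for a few small primes. In particular, for odd $n$ the count comes out to $p^{n-1}$ in both classes.

```python
# Closed forms for #{x_1^2+...+x_n^2 = 0 over F_p}, p an odd prime, obtained by
# solving a - b = (+-p)^(n/2), a + (p-1)b = p^n in the even case.
from itertools import product

def closed_form(p, n):
    if n % 2 == 1:
        return p**(n - 1)
    k = n // 2
    sign = 1 if p % 4 == 1 else (-1)**k
    return p**(n - 1) + sign * (p - 1) * p**(k - 1)

def brute(p, n):
    return sum(1 for xs in product(range(p), repeat=n)
               if sum(x * x for x in xs) % p == 0)

for p in (3, 5, 7, 13):
    for n in range(1, 5):
        assert closed_form(p, n) == brute(p, n), (p, n)
print("closed forms agree with brute force")
```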
Another way to get a recursion is by first noting that, whether or not $\sqrt{-1}$ lies in the ground field $\mathbb F_q$, every element of $\mathbb F_q$ is representable as $x^2+y^2$: when there is a $\sqrt{-1}$, this factors, and, when there is no $\sqrt{-1}$, this is the norm from the quadratic extension $\mathbb F_{q^2}$, which is surjective.
To hit $0$, there is a unique solution of $x^2+y^2=0$ without $\sqrt{-1}$, while there are $2q-1$ solutions with $\sqrt{-1}$. For non-zero values $z$, there are $q-1$ solutions of $x^2+y^2=z$ with $\sqrt{-1}$, and $q+1$ without.
Thus, there are two families of cases, depending on the presence or absence of $\sqrt{-1}$, and incorporating whether one is hitting $0$ or not. That is, writing the equation as $x_1^2+x_2^2=-(x_3^2+\ldots+x_n^2)$, treat the cases that the right-hand side is $0$, or not, etc.
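A minimal sketch of this recursion in code (assuming $q$ is an odd prime power for the recursion itself, and $q$ prime for the brute-force cross-check, since integers mod $q$ are used to model the field):

```python
# Count solutions of x_1^2+...+x_n^2 = 0 over F_q by adding two variables
# at a time, using the two-square counts listed above.
from itertools import product

def count_zero_sums(q, n):
    # n0 = #{x^2+y^2 = 0}, nz = #{x^2+y^2 = z} for any fixed nonzero z
    n0, nz = (2 * q - 1, q - 1) if q % 4 == 1 else (1, q + 1)
    # z0 = tuples so far whose squares sum to 0; total = q^(number of variables)
    z0, total, m = (1, q, 1) if n % 2 == 1 else (1, 1, 0)
    while m < n:
        z0, total, m = n0 * z0 + nz * (total - z0), total * q * q, m + 2
    return z0

q, n = 7, 4
brute = sum(1 for xs in product(range(q), repeat=n)
            if sum(x * x for x in xs) % q == 0)
print(count_zero_sums(q, n), brute)   # both give 385
```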