Drunkard's walk on the $n^{th}$ roots of unity.

Fix an integer $n \geq 2$. Suppose we start at the origin in the complex plane, and on each step we choose an $n$-th root of unity uniformly at random and move $1$ unit in that direction. Let $X_N$ be the distance from the origin after the $N$-th step. How well can we bound $E(X_N)$ from above?

In my attempt to calculate this, I found the bound $\sqrt{N}$, but I have a feeling this could be wrong, because the problem I am applying this to has $\sqrt{N\log N}$ instead (my reasoning being that the bound in that problem is likely optimal). What I did was apply Cauchy–Schwarz to get a sum with the norm squared, and then try some manipulations from there, relying on the fact that the expectation of the sum of the vectors (not of the distance) is zero by symmetry.
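As a quick numerical sanity check (a Monte Carlo sketch assuming numpy; the function name `mean_distance` and the parameter choices are mine, not part of the problem):

```python
import numpy as np

def mean_distance(n, N, trials=10000, rng=None):
    """Monte Carlo estimate of E[X_N] for the walk on the n-th roots of unity."""
    rng = np.random.default_rng(rng)
    # Each step is a uniformly chosen n-th root of unity.
    angles = 2 * np.pi * rng.integers(0, n, size=(trials, N)) / n
    steps = np.exp(1j * angles)
    return np.abs(steps.sum(axis=1)).mean()

# The ratio E[X_N] / sqrt(N) should sit strictly below 1
# (near sqrt(pi/4) ~ 0.886 for n >= 3).
print(mean_distance(3, 400, rng=0) / np.sqrt(400))
```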


I get the same $\sqrt{N}$ bound.

Let the $k$-th step be $z_k$ (an $n$-th root of unity), so that the position at time $N$ is $z_1 + \cdots + z_N$. The square of the distance from the origin at time $N$ is $$ X_N^2 = \left( \sum_{k=1}^{N} \overline{z_k} \right) \cdot \left( \sum_{j=1}^{N} z_j \right) = \sum_{k=1}^{N} | z_{k} |^2 + \sum_{k \ne j} \overline{z_k} \cdot z_j = N + \sum_{k \ne j} \overline{z_k} \cdot z_j. $$ Since each summand $\overline {z_k} \cdot z_j$ (for $k \ne j$) vanishes in expectation, we get $\mathbf E[X_N^2] = N$. Finally, by Jensen's inequality, $$\mathbf E X_N \leqslant \sqrt{ \mathbf E[X_N^2] } = \sqrt{N} .$$
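The identity $\mathbf E[X_N^2] = N$ can even be checked exactly, by enumerating all $n^N$ equally likely walks for small parameters (a quick sketch in Python; the values $n = 4$, $N = 6$ are arbitrary choices of mine):

```python
import cmath
import itertools

n, N = 4, 6
roots = [cmath.exp(2j * cmath.pi * k / n) for k in range(n)]
# Average |z_1 + ... + z_N|^2 over all n**N equally likely walks.
total = sum(abs(sum(walk)) ** 2 for walk in itertools.product(roots, repeat=N))
print(total / n ** N)  # identity predicts N = 6 (up to floating-point error)
```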

This shows that $\mathbf E X_N = O(\sqrt{N})$, but the constant is most likely not tight, since the final step (bounding the mean by the root-mean-square) loses a constant factor for any finite $n$.


Srivatsan has given a very fine answer, with a simple, elegant analysis. With a little more work, we can sharpen the result.

Claim: For $n \geq 3$, $\mathbb E X_N \sim \sqrt{\frac{\pi}{4} N}$ .

We can analyze this by means of the central limit theorem and the continuous mapping theorem. Below is just a sketch. We have restricted ourselves to the case $n \geq 3$ since the case $n = 2$ corresponds to the standard simple random walk, which has slightly different behavior (cf. Henning's comment). Intuitively, since for $n \geq 3$ the random walk can wiggle in an extra dimension, we should anticipate that its expected norm grows with a slightly larger constant.

Proof (sketch):

Let $Z_i = (R_i,I_i)$, $i=1,2,\ldots$, be an iid (uniform) sample of the $n$-th roots of unity, where $R_i$ denotes the "real" component and $I_i$ the "imaginary" component of the $i$th element of the sample. Then, it is a simple exercise to verify that $\mathbb E R_i = \mathbb E I_i = 0$, and, also, $\mathbb E R_i I_i = 0$. Furthermore, $$ \mathrm{Var}(R_i) = \mathrm{Var}(I_i) = 1/2 \>, $$ for every $n \geq 3$, using simple properties of the roots of unity.
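For completeness, here is the variance computation hinted at above (added for convenience, not in the original answer): writing $R_i = \cos(2\pi K/n)$ with $K$ uniform on $\{0, \ldots, n-1\}$, and using $\mathbb E R_i = 0$, $$ \mathrm{Var}(R_i) = \frac{1}{n} \sum_{k=0}^{n-1} \cos^2\!\left( \frac{2\pi k}{n} \right) = \frac{1}{n} \sum_{k=0}^{n-1} \frac{1 + \cos(4\pi k/n)}{2} = \frac{1}{2} \>, $$ since $\sum_{k=0}^{n-1} e^{4\pi i k/n} = 0$ for $n \geq 3$ (a geometric sum over a nontrivial root of unity). The computation for $I_i$ is identical with $\sin$ in place of $\cos$.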

Hence, by the multivariate central limit theorem, we have that $$ \sqrt{2N} (\bar{R}_N, \bar{I}_N) \xrightarrow{d} \mathcal N(0, \mathbf I_2) \> , $$ where $\bar{R}_N = N^{-1} \sum_{i=1}^N R_i$ and likewise for $\bar{I}_N$. Here $\mathbf I_2$ denotes the $2 \times 2$ identity matrix.

An application of the continuous mapping theorem using $g(x,y) = x^2 + y^2$ yields $$ 2 N (\bar{R}_N^2 + \bar{I}_N^2) = \frac{2}{N} X_N^2 = g( \sqrt{2N} \bar{R}_N, \sqrt{2N} \bar{I}_N ) \,\xrightarrow{d}\, \chi_2^2 \> . $$

That is, the rescaled squared norm has a limit distribution which is chi-squared with two degrees of freedom.
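This limit is easy to probe numerically (a Monte Carlo sketch assuming numpy; the choices of $n$, $N$, and the number of trials are mine): since $\chi_2^2$ is exponential with mean $2$, we can compare the empirical probability $P(\tfrac{2}{N} X_N^2 \leq 2)$ against $1 - e^{-1} \approx 0.632$.

```python
import numpy as np

rng = np.random.default_rng(1)
n, N, trials = 6, 200, 20000
angles = 2 * np.pi * rng.integers(0, n, size=(trials, N)) / n
# Rescaled squared norm (2/N) * X_N^2 for each simulated walk.
t = 2 / N * np.abs(np.exp(1j * angles).sum(axis=1)) ** 2
# chi^2_2 is Exp(mean 2), so P(T <= 2) = 1 - exp(-1) ~ 0.632.
print((t <= 2).mean())
```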

The square root of a $\chi_2^2$-distributed random variable has a Rayleigh distribution, whose mean is $\sqrt{\pi/2}$.
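(For the record, a standard computation, included here for convenience: if $T \sim \chi_2^2$, then $Y = \sqrt{T}$ has density $f_Y(y) = y e^{-y^2/2}$ for $y > 0$, so $$ \mathbb E Y = \int_0^\infty y^2 e^{-y^2/2} \, dy = \frac{1}{2} \sqrt{2\pi} = \sqrt{\pi/2} \> , $$ by the Gaussian moment $\int_{-\infty}^\infty y^2 e^{-y^2/2} \, dy = \sqrt{2\pi}$.)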

Hence, by a second application of the continuous mapping theorem, $\sqrt{\frac{2}{N}} X_N$ converges in distribution to a standard Rayleigh.

This strongly suggests (but does not prove) that $\mathbb E X_N \sim \sqrt{\frac{\pi}{4} N}$.

To finish the proof, note that $\mathbb E \frac{2}{N} X_N^2 = 2$ for all $N$. By the Skorokhod representation theorem, there exists a sequence of random variables $\{Y_N\}$ such that $Y_N \stackrel{d}{=} \sqrt{\frac{2}{N}} X_N$ and $Y_N \to Y_\infty$ almost surely, where $Y_\infty$ is a standard Rayleigh. Since the second moments $\mathbb E Y_N^2 = 2$ are uniformly bounded, the family $\{Y_N\}$ is uniformly integrable and hence converges in $L_1$. So, $$ |\mathbb E Y_N - \mathbb E Y_\infty| \leq \mathbb E |Y_N - Y_\infty| \to 0 \> . $$
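(The uniform integrability can be made explicit with a one-line truncation bound, added for completeness: for any $M > 0$, $$ \sup_N \, \mathbb E\big[ Y_N \mathbf 1\{Y_N > M\} \big] \leq \sup_N \frac{\mathbb E[Y_N^2]}{M} = \frac{2}{M} \xrightarrow[M \to \infty]{} 0 \> , $$ using $Y_N \leq Y_N^2 / M$ on the event $\{Y_N > M\}$.)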

Hence $\mathbb E Y_N = \mathbb E \sqrt{\frac{2}{N}} X_N \to \mathbb E Y_\infty = \sqrt{\pi/2}$, and therefore $\mathbb E X_N \sim \sqrt{\frac{\pi}{4} N}$, as desired.