Limit distribution of infinite sum of Bernoulli random variables

Solution 1:

An approach using moment generating functions (MGFs) is possible.

Define $Y_k = X_k/2^k$, $k = 1, 2, \ldots$. Then $$M_{Y_k}(t) = \operatorname{E}[e^{tY_k}] = e^0 \Pr[Y_k = 0] + e^{t/2^k} \Pr[Y_k = 2^{-k}] = \frac{e^{t/2^k} + 1}{2}.$$ Defining $$S_n = \sum_{k=1}^n Y_k,$$ and using the independence of the $Y_k$, the MGF of $S_n$ is $$\begin{align*} M_{S_n}(t) &= \operatorname{E}\left[\prod_{k=1}^n e^{tY_k}\right] = \prod_{k=1}^n M_{Y_k}(t) \\ &= 2^{-n} \prod_{k=1}^n (e^{t/2^k} + 1) \\ &= \frac{2^{-n}}{e^{t/2^n} - 1} (e^{t/2^n} - 1)(e^{t/2^n} + 1) \prod_{k=1}^{n-1} (e^{t/2^k} + 1) \\ &= \frac{2^{-n}}{e^{t/2^n} - 1} (e^{t/2^{n-1}} - 1) \prod_{k=1}^{n-1} (e^{t/2^k} + 1) \\ &= \frac{e^t - 1}{2^n(e^{t/2^n} - 1)}, \end{align*}$$ where the last line follows by repeating the telescoping step for $k = n-1, n-2, \ldots, 1$. Since $2^n(e^{t/2^n} - 1) \to t$ as $n \to \infty$, we get $$M_{S_\infty}(t) = \frac{e^t-1}{t},$$ which is the MGF of a $\operatorname{Uniform}(0,1)$ distribution.
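As a numerical sanity check (not part of the argument), here is a short Python sketch that compares the product form of $M_{S_n}(t)$ with the telescoped closed form and with the $\operatorname{Uniform}(0,1)$ MGF $(e^t-1)/t$ for a few values of $t$ and increasing $n$; the function names are my own.

```python
# Numerical check of the MGF identity and its limit (a sketch, not a proof).
import math

def mgf_product(t, n):
    """Product form: 2^{-n} * prod_{k=1}^{n} (e^{t/2^k} + 1)."""
    prod = 1.0
    for k in range(1, n + 1):
        prod *= (math.exp(t / 2**k) + 1.0)
    return prod / 2**n

def mgf_closed(t, n):
    """Telescoped form: (e^t - 1) / (2^n (e^{t/2^n} - 1))."""
    return (math.exp(t) - 1.0) / (2**n * (math.exp(t / 2**n) - 1.0))

def mgf_uniform(t):
    """MGF of Uniform(0,1): (e^t - 1)/t for t != 0."""
    return (math.exp(t) - 1.0) / t

for t in (0.5, 1.0, 2.0):
    for n in (5, 10, 20):
        print(f"t={t}, n={n}: product={mgf_product(t, n):.6f}, "
              f"closed={mgf_closed(t, n):.6f}, uniform={mgf_uniform(t):.6f}")
```

The two finite-$n$ expressions agree exactly, and both approach $(e^t-1)/t$ as $n$ grows.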

Solution 2:

Let $$S_n = \sum_{k=1}^n 2^{-k}X_k.$$ Since $$0\leqslant S_n\leqslant \sum_{k=1}^n 2^{-k} = 1 - 2^{-n} \leqslant 1$$ almost surely for all $n$, and $S_n$ is nondecreasing in $n$, the limit $\lim_{n\to\infty}S_n$ exists almost surely and satisfies the same bounds. By dominated (bounded) convergence, $$ \begin{align*} \mathbb E\left[\lim_{n\to\infty}S_n\right] &= \lim_{n\to\infty}\mathbb E[S_n]\\ &= \lim_{n\to\infty}\mathbb E\left[ \sum_{k=1}^n 2^{-k}X_k\right]\\ &= \lim_{n\to\infty}\sum_{k=1}^n 2^{-k}\mathbb E[X_k]\\ &= \lim_{n\to\infty}\sum_{k=1}^n 2^{-k-1}\\ &= \frac12\sum_{k=1}^\infty 2^{-k}\\ &= \frac12. \end{align*} $$

However, although each $S_n$ is discrete, the limiting distribution is continuous. For each $n$, $S_n$ is uniformly distributed over the $2^n$ points $$E_n=\left\{\sum_{k\in A}2^{-k} : A\subset\{1,2,\ldots,n\}\right\}. $$ Given $n$ and some $\omega\in E_n$, for every $m\geqslant n$ the value $\omega$ corresponds to exactly one of the $2^m$ equally likely outcomes of $(X_1,\ldots,X_m)$, so $$\mathbb P(S_m = \omega)=2^{-m}.$$ Hence $$\lim_{m\to\infty} \mathbb P(S_m = \omega)=0,$$ so no single point carries positive mass in the limit. @gmath suggests in this answer that the limiting distribution is actually uniform on the interval $(0,1)$.
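As a quick Monte Carlo sanity check (a sketch, assuming the $X_k$ are i.i.d. $\operatorname{Bernoulli}(1/2)$ as in the question; the truncation level and helper names are my own), one can simulate truncated sums $S_n$ and compare the sample mean with $1/2$ and the empirical CDF with that of $\operatorname{Uniform}(0,1)$:

```python
# Monte Carlo sketch: simulate S_n for a large fixed n and compare with Uniform(0,1).
import random

def sample_S(n):
    """One draw of S_n = sum_{k=1}^n 2^{-k} X_k with X_k ~ Bernoulli(1/2)."""
    return sum(random.getrandbits(1) / 2**k for k in range(1, n + 1))

random.seed(0)
n, num_samples = 30, 100_000
samples = [sample_S(n) for _ in range(num_samples)]

print("sample mean:", sum(samples) / num_samples)   # should be close to 1/2

# Empirical CDF at a few points vs. the Uniform(0,1) CDF F(x) = x.
for x in (0.1, 0.25, 0.5, 0.75, 0.9):
    ecdf = sum(s <= x for s in samples) / num_samples
    print(f"P(S_n <= {x}) ~ {ecdf:.3f}  (uniform: {x})")
```

The empirical CDF values should sit close to $x$ at each test point, consistent with the uniform limit suggested above.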