Convergence of the series $\sum\limits_{k=1}^\infty\frac{1}{X_1+\dots+X_k}$ with $(X_k)$ i.i.d. non-integrable
Pick a sequence $X_1$, $X_2$, $\dots$ of i.i.d. random variables taking values in the positive integers, with $\mathbb{P}(X_i=n)=\frac{1}{n}-\frac{1}{n+1}=\frac{1}{n(n+1)}$ for every positive integer $n$.
Q: Does the sum $\frac{1}{X_1}+\frac{1}{X_1+X_2}+\frac{1}{X_1+X_2+X_3}+\dots=\sum\limits_{k=1}^\infty\frac{1}{X_1+\dots+X_k}$ converge a.s.? In other words, is it finite a.s.?
Some remarks:
1. Some slick computations using the generating function of $X_i$ show that the expected value of the sum is $+\infty$, so it gives us no information.
2. For any fixed $i$, eventually every denominator $X_1+\dots+X_k$ becomes much larger than $X_i$. So whether the sum converges does not depend on $(X_1,\dots,X_i)$; in other words, convergence is a tail event, and by Kolmogorov's 0-1 law the sum either converges a.s. or diverges a.s.
3. Notice that $\mathbb{E}[X_i]=+\infty$, so by the strong law of large numbers (applied to suitable truncations of the $X_i$'s) we have $\frac{X_1+\dots+X_k}{k}\to +\infty$ a.s. So, if we rewrite our series as $\sum_{k=1}^\infty\frac{1}{k}\left(\frac{X_1+\dots+X_k}{k}\right)^{-1}$, we can conclude that it grows slower than the harmonic series.
4. Easy estimates show that the problem is equivalent to the convergence of $\sum\limits_{k=1}^\infty \frac{1}{t_1^{-1}+\dots+t_k^{-1}}$, where $(t_i)_{i\ge 1}$ is a sequence of independent uniform random variables on $(0,1)$.
5. In terms of the $t_i$'s, we can rewrite the sum as $\sum\limits_{k=1}^\infty\frac{1}{k} HM(t_1,\dots,t_k)$, where $HM$ denotes the harmonic mean. The inequality $HM\le GM$ ($GM$ being the geometric mean) does not help, since by the strong law of large numbers we get $\sum\limits_{k=1}^\infty\frac{1}{k} GM(t_1,\dots,t_k)\sim\sum\limits_{k=1}^\infty\frac{\alpha}{k}=+\infty$, where $\alpha:=\exp\left(\int_0^1 \log t\,dt\right)=\frac{1}{e}$.
I do not know the origin of the problem, but it is interesting as it seems to require sharp estimates to solve it. Any idea is appreciated.
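For what it's worth, here is a quick Monte Carlo sketch (my own illustration, not part of the problem; it only uses the inverse-CDF representation $X\overset{d}{=}\lfloor 1/U\rfloor$ with $U$ uniform on $(0,1]$, which reproduces the tail $P(X\geqslant n)=\frac1n$):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10**7

# X = floor(1/U) for U uniform on (0,1] has P(X >= n) = P(U <= 1/n) = 1/n,
# i.e. exactly the distribution in the question.
u = 1.0 - rng.random(n)          # uniform on (0, 1], avoids division by zero
x = np.floor(1.0 / u)
s = np.cumsum(x)                 # S_k = X_1 + ... + X_k
partial = np.cumsum(1.0 / s)     # partial sums of sum_k 1/S_k

for m in 10 ** np.arange(2, 8):
    print(f"n = {m:>8}:  sum_(k<=n) 1/S_k = {partial[m - 1]:.3f}")
```

The partial sums keep creeping upward, but so slowly that the experiment is inconclusive on its own.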
The series $\sum\limits_{n=1}^\infty\frac1{S_n}$ diverges almost surely, where $S_n=X_1+\cdots+X_n$ for some i.i.d. random variables $X$ and $(X_n)$ such that $P(X\geqslant k)=\frac1k$ for every positive integer $k$.
The same result applies to every i.i.d. sequence $(X_n)$ of positive random variables distributed like $X$ such that $x\cdot P(X\geqslant x)\leqslant c$ for some finite $c$ and every $x$ large enough.
A proof of this uses the degenerate convergence criterion (part 1.), which provides the convergence in probability of a suitably scaled version of the partial sums $S_n$ (part 2.), and a lemma which allows one to deduce the almost sure divergence of the series from this convergence in probability (part 3.). All this suggests a natural conjecture (part 4.). It turns out that parts 1. and 2. can be replaced by a self-contained argument, explained in addendum A. Finally, in addendum B we prove the much simpler fact that the expected value of the sum diverges.
1. A version of the degenerate convergence criterion (see Michel Loève, Probability Theory, section 23.5, page 329 (1963), or Theorem 1.2 in Allan Gut's 2003 article An extension of Feller's weak law of large numbers, where Loève's book is quoted) is as follows.
Degenerate convergence criterion: Let $S_n=X_1+\cdots+X_n$, where $X$ and $(X_n)$ are i.i.d., and let $(b_n)$ be a positive sequence increasing to infinity. Introduce $$X^{(n)}=|X|\cdot\mathbf 1_{|X|\leqslant b_n},\qquad\mu_n=E(X^{(n)}),$$ and the conditions:
- (i) For every positive $\varepsilon$, $n\cdot P(|X|\geqslant\varepsilon b_n)\to0$
- (ii) $n\cdot\mathrm{var}(X^{(n)})\ll b_n^2$
- (iii) $n\cdot \mu_n\ll b_n$
Then, conditions (i) and (ii) imply that $b_n^{-1}(S_n-n\cdot\mu_n)\to0$ in probability, hence conditions (i), (ii) and (iii) imply that $b_n^{-1}S_n\to0$ in probability.
2. In the present case, conditions (i) and (ii) both mean that $b_n\gg n$, and $\mu_n\sim\log b_n$ hence the choice $b_n=n\log n$ shows that $S_n/(n\log n)\to1$ in probability. Note that every choice $b_n\gg n\log n$ would show that $S_n/b_n\to0$ in probability, for example $S_n/(n\log n\log\log n)\to0$ in probability.
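Spelling these two claims out (nothing beyond elementary computations with the given distribution): since $P(X=k)=\frac1{k(k+1)}$, $$\mu_n=\sum_{1\leqslant k\leqslant b_n}\frac{k}{k(k+1)}=\sum_{1\leqslant k\leqslant b_n}\frac1{k+1}\sim\log b_n,\qquad \mathrm{var}(X^{(n)})\leqslant E\left[(X^{(n)})^2\right]=\sum_{1\leqslant k\leqslant b_n}\frac{k}{k+1}\leqslant b_n,$$ hence (i) follows from $nP(X\geqslant\varepsilon b_n)\leqslant\frac n{\varepsilon b_n}$ and (ii) from $n\,\mathrm{var}(X^{(n)})/b_n^2\leqslant n/b_n$, both of which hold exactly when $b_n\gg n$, while $b_n=n\log n$ gives $n\mu_n/b_n\to1$.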
3. Our second tool is a general lemma which we state independently.
Lemma: Let $(Y_n)$ be any sequence of nonnegative random variables and $(c_n)$ a positive sequence such that $P(Y_n\leqslant c_n)\to0$ and $\sum\limits_nc_n$ diverges. Then $\sum\limits_nY_n$ diverges almost surely.
Proof: For every event $A$ such that $P(A)\ne0$ there exists some finite $n_A$ such that $P(Y_n\leqslant c_n)\leqslant\frac12P(A)$ for every $n\geqslant n_A$. In particular, $P(Y_n\geqslant c_n,A)\geqslant P(A)-P(Y_n\leqslant c_n)\geqslant\frac12P(A)$, hence $E(Y_n\mathbf 1_A)\geqslant c_nP(Y_n\geqslant c_n,A)\geqslant\frac12c_nP(A)$ for every $n\geqslant n_A$. Summing over $n$ and using the divergence of $\sum\limits_nc_n$ yields $E(Y\mathbf 1_A)=+\infty$, where $Y=\sum\limits_nY_n$. This holds for every $A$ such that $P(A)\ne0$, hence $Y=+\infty$ almost surely, QED.
The lemma above with $Y_n=1/S_n$ and $c_n=1/(2n\log n)$ shows that $\sum\limits_{n=1}^\infty\frac1{S_n}$ diverges almost surely: indeed $P(Y_n\leqslant c_n)=P(S_n\geqslant 2n\log n)\to0$ by part 2., and $\sum\limits_n\frac1{n\log n}$ diverges.
4. Roughly speaking, $S_n$ is of order $n\log n$. A natural conjecture is that $$\frac1{\log\log n}\sum_{k=1}^n\frac1{S_k}$$ converges in distribution.
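As a purely numerical aside (my addition, not part of the original argument), one can probe this normalization by simulation, with the same sampler as in the question:

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps = 10**6, 20
vals = np.empty(reps)

for r in range(reps):
    u = 1.0 - rng.random(n)            # uniform on (0, 1]
    s = np.cumsum(np.floor(1.0 / u))   # S_1, ..., S_n
    # normalized partial sum (1/loglog n) * sum_(k<=n) 1/S_k
    vals[r] = np.sum(1.0 / s) / np.log(np.log(n))

print(f"n = {n}: mean = {vals.mean():.3f}, sd = {vals.std():.3f}")
```

Since $\log\log n<3$ for every feasible $n$, such experiments can at best be suggestive.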
A. To make this answer fully self-contained, here is a simple proof that $\frac{S_n}{n\log n}\to 1$ in probability.
First of all, note that $P(X_1>\alpha)<\frac{1}{\alpha}$ for every real $\alpha>0$. Now fix some positive integer $n$ and define the truncations $X'_k:=X_k \mathbf{1}_{\{X_k\leqslant n\log n\}}$. Let also $S'_n:=X'_1+\dots+X'_n$ and $\mu_n:=E[X'_1]$. Since $\mu_n=\sum\limits_{1\leqslant k\leqslant n\log n}\frac{1}{k+1}\sim\log(n\log n)\sim\log n$, we have $n\mu_n\sim n\log n$, so it suffices to show that $\frac{S_n}{n\mu_n}\to 1$ in probability. Notice that $$P\left(\left|\frac{S_n}{n\mu_n}-1\right|>\epsilon\right)\leqslant P(S_n\neq S'_n)+P\left(\left|\frac{S'_n}{n\mu_n}-1\right|>\epsilon\right) \leqslant nP(X_1>n\log n)+P\left(\left|\frac{S'_n}{n\mu_n}-1\right|>\epsilon\right).$$ The first term is easy to bound: $nP(X_1>n\log n)<n\cdot\frac{1}{n\log n}\to 0$ as $n\to\infty$.
The second one can be bounded using Chebyshev's inequality: $$P\left(\left|\frac{S'_n}{n\mu_n}-1\right|>\epsilon\right)\leqslant \frac{1}{\epsilon^2}\text{Var}\left(\frac{S'_n}{n\mu_n}\right)=\frac{\text{Var}(X'_1)}{\epsilon^2 n\mu_n^2} \leqslant\frac{n\log n}{\epsilon^2 n\mu_n^2}\sim\frac{n\log n}{\epsilon^2 n\log^2 n}\to 0,$$ where we used the obvious inequality $\text{Var}(X'_1)\leqslant E[X_1'^2]=\sum\limits_{1\leqslant k\leqslant n\log n}\frac{k^2}{k(k+1)}\leqslant n\log n$.
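(Another numerical aside of mine, not part of the proof.) The concentration of $S_n/(n\log n)$ around $1$ can be eyeballed directly:

```python
import numpy as np

rng = np.random.default_rng(2)
n, reps = 10**6, 200
ratios = np.empty(reps)

for r in range(reps):
    u = 1.0 - rng.random(n)                       # uniform on (0, 1]
    # one independent realization of S_n / (n log n)
    ratios[r] = np.floor(1.0 / u).sum() / (n * np.log(n))

for eps in (0.5, 0.2, 0.1):
    frac = np.mean(np.abs(ratios - 1) > eps)
    print(f"P(|S_n/(n log n) - 1| > {eps}) ~= {frac:.2f}")
```

The convergence is only logarithmic in $n$, so the small-$\epsilon$ probabilities decay very slowly.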
B. The divergence of the expectation is much easier to prove. For simplicity, assume that $X = 1/U$, where $U$ is uniformly distributed on $[0,1]$ (by remark 4. in the question, this is equivalent to the original problem). Then $$ E\left[\frac{1}{X_1+\dots+X_n}\right] = E\left[\int_0^\infty e^{-t(X_1+\dots+X_n)}dt \right] = \int_0^\infty \left(E[e^{-tX}]\right)^n dt. $$ Now $$ 1-E[e^{-tX}] = \int_0^1 (1-e^{-t/u})\, du = t\int_t^\infty (1-e^{-z}) z^{-2}\, dz \sim -t\log t,\qquad t\to 0+. $$ This already implies that the expectation is infinite, since summing the geometric series gives $$ E\left[\sum_{n=1}^\infty\frac{1}{X_1+\dots+X_n}\right] = \int_0^\infty \frac{E[e^{-tX}]\,dt}{1-E[e^{-tX}]} $$ and the integrand behaves like $\frac{-1}{t\log t}$ as $t\to 0+$, which is not integrable near $0$.
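For completeness, the stated asymptotics of the middle integral can be checked by an integration by parts: $$\int_t^\infty (1-e^{-z})z^{-2}\,dz=\frac{1-e^{-t}}{t}+\int_t^\infty\frac{e^{-z}}{z}\,dz\sim\log\frac1t,\qquad t\to0+,$$ since the first term tends to $1$ and the exponential integral satisfies $\int_t^\infty e^{-z}z^{-1}\,dz=\log\frac1t-\gamma+o(1)$.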
This also gives the asymptotics of a single summand: as $n\to\infty$, $$ E\left[\frac{1}{X_1+\dots+X_n}\right] = \int_0^\infty \left(E[e^{-tX}]\right)^n dt \sim \int_0^\epsilon \left(E[e^{-tX}]\right)^n dt \sim \int_0^\epsilon e^{n t\log t}\, dt \sim \int_0^{\delta} e^{-nu}\frac{du}{- \log u}\sim \frac{1}{n\log n}, $$ where the third step is the change of variables $u=-t\log t$. This agrees with the convergence in probability $S_n/(n\log n)\to 1$.
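The last step, spelled out (my addition): substituting $u=v/n$ and applying dominated convergence, $$\int_0^{\delta} e^{-nu}\frac{du}{-\log u}=\frac1n\int_0^{n\delta} \frac{e^{-v}\,dv}{\log n-\log v}\sim\frac1{n\log n}\int_0^\infty e^{-v}\,dv=\frac1{n\log n}.$$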
The series diverges a.s. since, by the degenerate convergence criterion (Theorem 1.2 in the article by Gut cited in the answer above): $$T_n^{-1}:=\frac {X_1+\dots+X_n} {n\log n}\overset{p}\to 1$$
Indeed, let $P_n=\sum_{k=2}^n 1/S_k=\sum_{k=2}^n \frac {T_k}{k\log k}\uparrow L$ (the $k=1$ term is dropped only because $\log 1=0$; it does not affect divergence) and let $A$ be any event of positive measure. Then, exactly as in the lemma of the answer above (using $P(T_k\geqslant\tfrac12)\to1$ and the divergence of $\sum_k\frac1{k\log k}$), $$E(L1_A)=\lim_{n\to\infty} E(P_n1_A)=\infty$$ and $L=\infty$ a.s. follows.
The same argument shows that the series diverges a.s. for every i.i.d. sequence $(X_i)$ such that $xP(X_i\ge x)=O(\log\log x)$, provided we compare $S_n$ with $n\log n\log\log n$ instead.
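A sketch of the verification (my addition; assume $xP(X\geqslant x)\leqslant C\log\log x$ for all large $x$ and set $b_n=n\log n\log\log n$): $$nP(X\geqslant\varepsilon b_n)\leqslant\frac{Cn\log\log(\varepsilon b_n)}{\varepsilon b_n}\to0,\qquad n\,\mathrm{var}(X^{(n)})\leqslant 2n\int_0^{b_n}xP(X\geqslant x)\,dx\lesssim nb_n\log\log b_n\ll b_n^2,$$ while $n\mu_n\lesssim n\log b_n\log\log b_n=O(b_n)$. So $S_n/b_n$ stays bounded in probability, and the lemma from the answer above (part 3.), with $c_n\asymp 1/b_n$ and the divergence of $\sum_n\frac1{n\log n\log\log n}$, yields the a.s. divergence.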
But we can do more with the original sequence. Combine $T_k\overset{p}\to 1$ with $E(T_k)\sim 1$ (as shown by @zhoraster in the community wiki answer) and remember that $T_k-1>-1$, so that $E|T_k-1|=E(T_k-1)+2E\left[(1-T_k)^+\right]\to 0$ (the first term by $E(T_k)\sim 1$, the second by bounded convergence since $0\leqslant(1-T_k)^+\leqslant 1$ and $T_k\overset{p}\to 1$), to immediately get
$$E\left(\frac{\sum_{k=2}^n\frac{|T_k-1|}{k\log k}}{\log\log n}\right)\to 0$$ which, combined with $\frac1{\log\log n}\sum\limits_{k=2}^n\frac{1}{k\log k}\to 1$, yields
$$\frac {\sum_{k=2}^n1/S_k}{\log\log n}\overset{L^1}\longrightarrow 1.$$ It remains to find a scaling that produces a non-degenerate limit law (if one exists).