Using the first and second Borel-Cantelli Lemmas to find necessary and sufficient conditions for convergence in probability ($98\%$ solved)

I am working on an exercise with five parts, and I've solved most of them, but I still have some small but nontrivial confusions. Parts (b)-(e) coincide with Durrett 1.6.15 (or 2.3.15) and were posted here: Necessary and sufficient conditions for convergence almost surely and in probability.

However, that post was never fully answered, so I will post my solutions to all five parts here. Note that the exercise I am working on differs slightly in parts (c) and (d).

Let $(X_{k})$ be i.i.d. random variables taking values in $\overline{\mathbb{R}}$, and let $M_{n}:=\max_{1\leq k\leq n}X_{k}$.

(a) Show that $P( |X_{n}|>n\ \text{i.o.})=0$ if and only if $E|X_{1}|<\infty$.

(b) Show that $X_{n}/n\longrightarrow 0$ a.s. if and only if $E|X_{1}|<\infty$.

(c) Show that $M_{n}/n\longrightarrow 0$ a.s. if and only if $EX_{1}^{+}<\infty$ and $P(X_{1}>-\infty)>0$.

(d) Show that $M_{n}/n\longrightarrow_{p} 0$ if and only if $nP(X_{1}>n)\longrightarrow 0$ and $P(X_{1}>-\infty)>0$.

(e) Show that $X_{n}/n\longrightarrow_{p} 0$ if and only if $P(|X_{1}|<\infty)=1$.

I have three questions in total:

(1) In parts (c) and (d), why is the condition $P(X_{1}>-\infty)>0$ also necessary and sufficient? As you can see from my proofs below, I never used this condition. How can I incorporate it into my proofs?

(2) Is my proof of part (a) correct?

(3) My proof of the $(\Leftarrow)$ direction of part (d) follows a solution I found, but I don't really understand why the inequality used there is true.

Now, here are my proofs:

Proof of (a):

$(\Rightarrow).$ We argue by contraposition: suppose $E|X_{1}|=\infty$. Then we have $$\infty=E|X_{1}|=\int_{0}^{\infty}P(|X_{1}|>x)\,dx\leq\sum_{n=0}^{\infty}P(|X_{1}|>n),$$ and since the $X_{k}$'s are i.i.d., Borel-Cantelli II yields $$P( |X_{n}|>n\ \text{i.o.})=1.$$
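
(To spell out the integral-sum comparison: $x\mapsto P(|X_{1}|>x)$ is nonincreasing, so $$\int_{0}^{\infty}P(|X_{1}|>x)\,dx=\sum_{n=0}^{\infty}\int_{n}^{n+1}P(|X_{1}|>x)\,dx\leq\sum_{n=0}^{\infty}P(|X_{1}|>n),$$ and the divergent series here equals $\sum_{n}P(|X_{n}|>n)$ by identical distribution, which, together with independence, is what Borel-Cantelli II needs.)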

$(\Leftarrow).$ Let $\epsilon>0$. Then we have $$E|X_{1}|<\infty\iff \sum_{n=1}^{\infty}P(|X_{n}|\geq\epsilon n)=\sum_{n=1}^{\infty}P(|X_{1}|\geq\epsilon n)<\infty,$$ so by Borel-Cantelli I, we have $$P\Big(|X_{n}|\geq\epsilon n\ \text{i.o.}\Big)=0\quad\text{for every }\epsilon>0.$$
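
(To justify the "$\iff$": since $u\mapsto P(|X_{1}|\geq\epsilon u)$ is nonincreasing, $$\sum_{n=1}^{\infty}P(|X_{1}|\geq\epsilon n)\leq\int_{0}^{\infty}P(|X_{1}|\geq\epsilon u)\,du=\frac{1}{\epsilon}E|X_{1}|\leq\sum_{n=0}^{\infty}P(|X_{1}|\geq\epsilon n),$$ so for each fixed $\epsilon>0$ the series converges if and only if $E|X_{1}|<\infty$.)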

Setting $\epsilon=1$ concludes our proof.

Proof of (b):

Let $\epsilon>0$. Since the $X_{i}$'s are i.i.d., applying Borel-Cantelli I and II yields \begin{align*} X_{n}/n\longrightarrow 0\ \text{a.s.}&\iff P\Big(|X_{n}|\geq\epsilon n\ \text{i.o.}\Big)=0\\ &\iff \sum_{n=1}^{\infty}P(|X_{n}|\geq\epsilon n)=\sum_{n=1}^{\infty}P(|X_{1}|\geq\epsilon n)<\infty\\ &\iff E|X_{1}|<\infty. \end{align*}
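
(Strictly speaking, the first equivalence should be read with the quantifier over $\epsilon$ made explicit: $$\{X_{n}/n\not\longrightarrow 0\}=\bigcup_{\epsilon\in\mathbb{Q}\cap(0,\infty)}\Big\{|X_{n}|\geq\epsilon n\ \text{i.o.}\Big\},$$ a countable union, so $X_{n}/n\longrightarrow 0$ a.s. if and only if each event on the right is null.)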

Proof of (c):

We first need a lemma.

Lemma 1. Let $(a_{n})$ and $(b_{n})$ be sequences of real numbers such that $b_{n}>0$ and $b_{n}\nearrow\infty$. Set $a_{n}^{+}:=a_{n}\vee 0$ and $M_{n}:=\max_{1\leq m\leq n}a_{m}$. Then $$\limsup_{n\rightarrow\infty}\dfrac{M_{n}}{b_{n}}=\limsup_{n\rightarrow\infty}\dfrac{a_{n}^{+}}{b_{n}}.$$

Proof of Lemma 1:

Denote the LHS by $\alpha$ and the RHS by $\beta$. Then $\beta\geq 0$, and also $\alpha\geq 0$, since $M_{n}\geq a_{1}$ gives $M_{n}/b_{n}\geq a_{1}/b_{n}\longrightarrow 0$.

Firstly, we show $\alpha\leq\beta$; this is trivial if $\beta=\infty$, so assume $\beta<\infty$ and let $\epsilon>0$. Then there is an $N$ such that $a_{n}\leq a_{n}^{+}\leq(\beta+\epsilon)b_{n}$ for all $n\geq N$. Since $b_{m}\leq b_{n}$ for $m\leq n$, this gives $M_{n}\leq\max\big(M_{N},(\beta+\epsilon)b_{n}\big)$ for $n\geq N$; as $M_{N}/b_{n}\longrightarrow 0$, it follows that $\alpha\leq\beta+\epsilon$, and hence $\alpha\leq\beta$. In particular, if $\beta=0$ then $\alpha=0$ immediately and the two sides are equal.

Conversely, by the first part we may now assume $\beta>0$. Then $a_{n}>0$ for infinitely many $n$, so $M_{n}>0$ for $n$ large; for such $n$ we have $a_{n}^{+}=a_{n}\vee 0\leq M_{n}$, and thus $\beta\leq \alpha$. (QED)

Now, by Lemma 1 and part (b) applied to $(X_{n}^{+})$, we have $$M_{n}/n\longrightarrow 0\ \text{a.s.}\iff X_{n}^{+}/n\longrightarrow 0\ \text{a.s.}\iff EX_{1}^{+}<\infty,$$ which concludes the proof.

Proof of (d):

Let $\epsilon>0$.

Firstly, note that $P(M_{n}/n\leq-\epsilon)\longrightarrow 0$ (see the computation below), and thus let us only consider $P(M_{n}/n\geq \epsilon)=1-P(M_{n}<n\epsilon).$
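
(To spell this step out: by independence, $$P(M_{n}/n\leq-\epsilon)=P(X_{1}\leq-\epsilon n)^{n},$$ and $P(X_{1}\leq-\epsilon n)\downarrow P(X_{1}=-\infty)<1$ since $P(X_{1}>-\infty)>0$, so the $n$-th power tends to $0$; this is where that condition enters in (d).)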

$(\Rightarrow)$. Recall that $P(M_{n}<n\epsilon)=P(X_{1}<n\epsilon)^{n}=(1-P(X_{1}\geq n\epsilon))^{n}.$ Then, we have \begin{align*} M_{n}/n\longrightarrow_{p}0&\iff \forall\epsilon>0, P(M_{n}<n\epsilon)\longrightarrow 1\\ &\iff \forall\epsilon>0, \log(P(M_{n}<n\epsilon))=n\log(1-P(X_{1}\geq n\epsilon))\longrightarrow 0\\ &\iff \forall\epsilon>0, nP(X_{1}\geq n\epsilon)\longrightarrow 0\\ &\implies nP(X_{1}\geq n)\longrightarrow 0, \text{by letting}\ \epsilon=1. \end{align*}
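
(The third equivalence uses the elementary bounds, valid for $p\in[0,1)$, $$p\leq-\log(1-p)\leq\frac{p}{1-p}:$$ writing $p_{n}:=P(X_{1}\geq n\epsilon)$, either condition forces $p_{n}\longrightarrow 0$, and then the sandwich $np_{n}\leq -n\log(1-p_{n})\leq np_{n}/(1-p_{n})$ shows that $np_{n}$ and $n\log(1-p_{n})$ tend to $0$ together.)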

$(\Leftarrow)$. $P(M_{n}\geq \epsilon n)\leq nP(X_{n}\geq \epsilon n)=nP(X_{1}\geq \epsilon n)\longrightarrow 0.$
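
As a purely numerical sanity check of the criterion in (d), here is a minimal NumPy sketch; the standard Cauchy and standard normal examples (and all parameters) are my own choices, not part of the exercise. For the Cauchy distribution $nP(X_{1}>n)\longrightarrow 1/\pi\neq 0$, so $P(M_{n}/n\geq\epsilon)$ should stay bounded away from $0$, while for the normal distribution $nP(X_{1}>n)\longrightarrow 0$:

```python
# Monte Carlo check of (d): M_n/n ->_p 0 iff n*P(X_1 > n) -> 0 (real-valued case).
# Cauchy tails: P(X_1 > n) ~ 1/(pi*n), so n*P(X_1 > n) -> 1/pi != 0.
# Normal tails decay faster than any polynomial, so n*P(X_1 > n) -> 0.
import numpy as np

rng = np.random.default_rng(0)
n, reps, eps = 5_000, 1_000, 0.5

for name, sampler in [("cauchy", rng.standard_cauchy),
                      ("normal", rng.standard_normal)]:
    X = sampler(size=(reps, n))   # reps independent copies of (X_1, ..., X_n)
    Mn = X.max(axis=1)            # M_n for each copy
    print(f"{name}: estimated P(M_n/n >= {eps}) = {np.mean(Mn / n >= eps):.3f}")
```

If I computed the Fréchet limit correctly, the Cauchy frequency should settle near $1-e^{-2/\pi}\approx 0.47$, while the normal one should be essentially $0$.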

Proof of (e):

$$X_{n}/n\longrightarrow_{p}0\iff \forall\epsilon>0, P(|X_{n}|\geq\epsilon n)\longrightarrow 0\iff P(|X_{1}|=\infty)=0.$$
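
(The last equivalence holds because, for each fixed $\epsilon>0$, the events $\{|X_{1}|\geq\epsilon n\}$ decrease to $\{|X_{1}|=\infty\}$ as $n\rightarrow\infty$, so by continuity from above $$P(|X_{n}|\geq\epsilon n)=P(|X_{1}|\geq\epsilon n)\downarrow P(|X_{1}|=\infty).)$$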

I am sorry for the long post; it is hard for me to explain my questions without posting everything, and I'd like to keep a record of this exercise.

Thank you!


The exercise you are working on assumes that the variables can be $\pm\infty$. As a result, you need the lemma to work for $a_{n}\in\overline{\Bbb R}=\Bbb R\cup\{\pm\infty\}$. Under those extended hypotheses, the lemma is false if all the $a_{n}$ are $-\infty$: then $a_{n}^{+}=0$ for every $n$, so the RHS is $0$, while $M_{n}=-\infty$ and the LHS is $-\infty$. So you need to add the hypothesis that at least one of the $a_{n}$ is $>-\infty$. You then have to verify this hypothesis when you apply the lemma, and that is exactly where you use $P(X_{1}>-\infty)>0$ (saying that the variables are not a.s. equal to $-\infty$ is a rephrasing of this condition).

(a) seems OK; for the second part, it works because of the inequality $\sum_{n=1}^{\infty}P(|X_{1}|\geq\epsilon n)\leq \frac{1}{\epsilon}\int_{0}^{\infty}P(|X_{1}|>x)\,dx<\infty$.

What other inequality do you not understand? Is it $P(M_{n}\geq\epsilon n)\leq nP(X_{n}\geq\epsilon n)$? This is the union bound. More precisely, $$P(M_{n}\geq\epsilon n)=P(\exists i\in[1,n],\ X_{i}\geq\epsilon n)=P\Big(\bigcup_{i=1}^{n}\{X_{i}\geq\epsilon n\}\Big)\leq\sum_{i=1}^{n}P(X_{i}\geq\epsilon n)=\sum_{i=1}^{n}P(X_{n}\geq\epsilon n)=nP(X_{n}\geq\epsilon n).$$