Law of large numbers for partial sums of order statistics

Let $X$ be a random variable with some CDF $F(x)=P[X\leq x]$ for all $x \in \mathbb{R}$.

Claim 1: Either there is an $M \in \mathbb{R}$ such that $P[X>M]=0$ and $P[X=M]>0$, or there is an increasing sequence of real numbers $\{M_i\}_{i=1}^{\infty}$ such that $P[X>M_i]>0$ for all $i \in \{1, 2, 3, ...\}$ and $\lim_{i\rightarrow\infty} P[X>M_i]=0$.

Proof: Define $$ M = \sup\{t \in \mathbb{R}: P[X>t]>0\}$$ If $M=\infty$ then $P[X>i]>0$ for all positive integers $i$ (for each $i$ there is a $t>i$ with $P[X>t]>0$, and $P[X>\cdot]$ is nonincreasing), and we already know $\lim_{i\rightarrow\infty} P[X>i]=0$ from basic properties of a CDF, so we can define $M_i=i$ and we are done.

If $M$ is finite then we know $P[X>t]=0$ for all $t>M$. Since $\{X>M+1/n\}\nearrow \{X>M\}$, by continuity of probability we have $0=P[X>M+1/n]\nearrow P[X>M]$, and so $P[X>M]=0$. If $P[X=M]>0$ then we are done.

The remaining case is when $M$ is finite and $P[X=M]=0$. We already know $P[X>M]=0$. By definition of $M$ we have $P[X>M-1/i]>0$ for all $i \in \{1, 2, 3, ...\}$. Also $$\{M-1/i<X\leq M\} \searrow \{X=M\}$$ Thus by continuity of probability $$ P[M-1/i<X\leq M]\searrow P[X=M]=0$$ But $$P[X>M-1/i] = P[M-1/i<X\leq M] + P[X>M] = P[M-1/i<X\leq M]$$ and so $P[X>M-1/i]\rightarrow 0$. So we define $M_i = M-1/i$ and we are done. $\Box$
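As a quick numerical illustration (not part of the proof), here is a Python sketch contrasting the two alternatives of Claim 1 on three standard distributions; the distributions chosen are my own examples:

```python
import math

# Claim 1 dichotomy, illustrated on three standard distributions.

# (a) Unbounded support: X ~ Exponential(1), so P[X > t] = exp(-t) > 0
#     for every t.  Here M = infinity and M_i = i works.
tails_exp = [math.exp(-i) for i in range(1, 6)]

# (b) Bounded support with an atom at the top: P[X = 1] = 0.3 and
#     P[X > 1] = 0.  The first alternative of Claim 1 holds with M = 1.
p_eq_M, p_gt_M = 0.3, 0.0

# (c) Bounded support, no atom: X ~ Uniform[0, 1], so M = 1, P[X = 1] = 0,
#     and M_i = 1 - 1/i gives P[X > M_i] = 1/i > 0 tending to 0.
tails_unif = [1.0 / i for i in range(1, 6)]

assert all(t > 0 for t in tails_exp)    # P[X > M_i] > 0 for all i
assert tails_exp[-1] < tails_exp[0]     # and the tail decreases toward 0
assert p_eq_M > 0 and p_gt_M == 0      # first alternative: atom at the top
assert all(t > 0 for t in tails_unif)  # second alternative, M finite
```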

Claim 2: Assume $\{X_i\}_{i=1}^{\infty}$ are i.i.d. nonnegative random variables with CDF $F(x)$ and with finite mean $E[X]$. Then for any $\epsilon>0$ there is a $\delta>0$ such that
$$ P\left[\frac{1}{n}\sum_{i \in I_n(\delta)} X_i \geq E[X] - \epsilon\right]\rightarrow 1$$ where $I_n(\delta)$ is the set of indices of the $\lfloor(1-\delta)n\rfloor$ smallest values among $\{X_1, ..., X_n\}$ (ties broken arbitrarily).

Proof: Fix $\epsilon>0$. Let $X=X_1$.

Case 1: Suppose there is an increasing sequence $\{M_k\}_{k=1}^{\infty}$ with $P[X>M_k]>0$ for all $k$ and $P[X>M_k]\rightarrow 0$. Since the events $\{X>M_k\}$ decrease to an event of probability zero and $E[X]<\infty$, dominated convergence gives $E[X1\{X\leq M_k\}]\rightarrow E[X]$. So we can fix a sufficiently large value $M$ (where $M=M_k$ for some $k$) such that $P[X>M]>0$ and $E[X1\{X\leq M\}]\geq E[X]-\epsilon/2$. Define $\delta = P[X>M]/2$. By the strong law of large numbers, the fraction of $\{X_1, ..., X_n\}$ that are larger than $M$ converges with prob 1 to $P[X>M]=2\delta$. So with prob 1, for all sufficiently large $n$, more than $\delta n + 1$ of the values exceed $M$, while the trim removes only $n-\lfloor(1-\delta)n\rfloor\leq \delta n+1$ of the largest values. Hence every value that is at most $M$ eventually survives the trim, and $$ \sum_{i \in I_n(\delta)} X_i \geq \sum_{i=1}^n X_i1\{X_i\leq M\}$$ Dividing by $n$, $$ \frac{1}{n}\sum_{i \in I_n(\delta)} X_i \geq \frac{1}{n}\sum_{i=1}^n X_i1\{X_i\leq M\}$$ The right-hand side converges with prob 1 to $E[X1\{X\leq M\}]\geq E[X]-\epsilon/2 > E[X]-\epsilon$, so with prob 1 the left-hand side is eventually at least $E[X]-\epsilon$, and hence $P\left[\frac{1}{n}\sum_{i \in I_n(\delta)} X_i \geq E[X]-\epsilon\right]\rightarrow 1$.
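A Monte Carlo sketch of Case 1, using $X \sim$ Exponential(1) so that $E[X]=1$; the sample size and constants below are my own illustrative choices, not part of the proof:

```python
import math
import random

# Monte Carlo check of the Case 1 trimmed-sum bound for X ~ Exponential(1).
random.seed(0)
n = 200_000
eps = 0.1
samples = [random.expovariate(1.0) for _ in range(n)]   # E[X] = 1

# For Exponential(1), E[X 1{X <= M}] = 1 - (M + 1) * exp(-M), so M = 6
# satisfies E[X 1{X <= M}] >= E[X] - eps/2, and P[X > M] = exp(-6) > 0.
M = 6.0
delta = math.exp(-M) / 2        # delta = P[X > M] / 2, as in the proof

# Keep the floor((1 - delta) * n) smallest values: the set I_n(delta).
samples.sort()
kept = samples[: math.floor((1 - delta) * n)]
trimmed_mean = sum(kept) / n    # (1/n) * sum over I_n(delta)

assert trimmed_mean >= 1.0 - eps   # the event in Claim 2 holds here
```

With $n$ this large, the trimmed mean lands close to $E[X]=1$, comfortably above $E[X]-\epsilon$.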

Case 2: There is an $M$ such that $P[X=M]>0$ and $P[X>M]=0$, so $X\leq M$ with prob 1.

If $M=0$ then $X=0$ with prob 1 (recall $X\geq 0$) and the claim is trivial, so assume $M>0$. Define $\delta = \min\{P[X=M]/2, \ \epsilon/(2M)\}>0$. By the strong law of large numbers, the fraction of $\{X_1, ..., X_n\}$ equal to $M$ converges with prob 1 to $P[X=M]\geq 2\delta$, so with prob 1, for all sufficiently large $n$, more than $\delta n + 1$ of the values equal $M$. Since $M$ is the largest possible value and the trim removes only $n-\lfloor(1-\delta)n\rfloor\leq \delta n+1$ of the largest values, the removed values are eventually all equal to $M$, and $$ \sum_{i \in I_n(\delta)} X_i = \sum_{i=1}^n X_i - (n-\lfloor(1-\delta)n\rfloor)M \geq \sum_{i=1}^n X_i - (\delta n+1)M$$ Dividing by $n$, $$ \frac{1}{n}\sum_{i \in I_n(\delta)} X_i \geq \frac{1}{n}\sum_{i=1}^n X_i - \delta M - \frac{M}{n} \geq \frac{1}{n}\sum_{i=1}^n X_i - \frac{\epsilon}{2} - \frac{M}{n}$$ The right-hand side converges with prob 1 to $E[X]-\epsilon/2 > E[X]-\epsilon$, so with prob 1 the left-hand side is eventually at least $E[X]-\epsilon$, and the claim follows. $\Box$
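A Monte Carlo sketch of Case 2, using $X$ uniform on $\{0,1,2,3\}$ so that $M=3$, $P[X=M]=1/4$, $P[X>M]=0$, and $E[X]=1.5$; the constants are my own illustrative choices:

```python
import math
import random

# Monte Carlo check of the Case 2 trimmed-sum bound when the distribution
# has an atom at its maximum value M.
random.seed(1)
n = 100_000
eps = 0.1
M = 3
samples = [random.randrange(4) for _ in range(n)]   # uniform on {0,1,2,3}

# Choose delta <= P[X = M] / 2 (so the trimmed-off values are eventually
# all equal to M) and delta <= eps / (2 * M) (so trimming costs < eps/2).
delta = min(1 / 8, eps / (2 * M))

samples.sort()
cut = math.floor((1 - delta) * n)
kept, removed = samples[:cut], samples[cut:]
assert all(x == M for x in removed)   # every trimmed value is the atom M

trimmed_mean = sum(kept) / n          # (1/n) * sum over I_n(delta)
assert trimmed_mean >= 1.5 - eps      # the event in Claim 2 holds here
```

Because far more than a $\delta$ fraction of the samples equal $M$, the trim only removes copies of $M$, and the trimmed mean stays within $\epsilon$ of $E[X]$.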