Convergence in Probability

Solution 1:

The definition of convergence in probability is given below:

Let $\{X_n\}$ be a sequence of random variables on a probability space. Then we say that $\{X_n\}$ converges in probability to $\theta$ if, for every $\epsilon >0$, $$\Pr[|X_n-\theta| \geq \epsilon]\rightarrow 0 ~\text{as}~ n\rightarrow \infty,$$ which is equivalent to $$\Pr[|X_n-\theta|< \epsilon]\rightarrow 1 ~\text{as}~ n\rightarrow \infty.$$

For your problem, note that $$\begin{align}\Pr[|Y_{(n)}-\theta|< \epsilon] &=\Pr[\theta-\epsilon<Y_{(n)}<\theta+\epsilon]\\ &= F_{Y_{(n)}}(\theta+\epsilon)- F_{Y_{(n)}}(\theta-\epsilon)\\&= \begin{cases} 1-\left(\dfrac{\theta-\epsilon}{\theta}\right)^n, & 0< \epsilon <\theta, \\ 1-0, & \epsilon \geq\theta, \end{cases}\\& \rightarrow 1 ~\text{as}~ n \to \infty.\end{align}$$ Hence $Y_{(n)}$ converges to $\theta$ in probability.
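If a numerical sanity check is helpful, here is a minimal simulation sketch. It assumes $Y_{(n)}$ is the maximum of $n$ i.i.d. Uniform$(0,\theta)$ observations (the distribution whose CDF $\left(\frac{y}{\theta}\right)^n$ is used above), and compares the empirical value of $\Pr[|Y_{(n)}-\theta| \geq \epsilon]$ with the exact value $\left(\frac{\theta-\epsilon}{\theta}\right)^n$ as $n$ grows:

```python
# Simulation sketch, assuming Y_(n) is the maximum of n i.i.d. Uniform(0, theta)
# draws, i.e. the distribution with CDF (y/theta)^n used in Solution 1.
import numpy as np

rng = np.random.default_rng(0)
theta, eps, reps = 1.0, 0.05, 100_000

for n in (5, 20, 100, 500):
    # Simulate Y_(n) = max of n uniforms, `reps` times.
    y_max = rng.uniform(0.0, theta, size=(reps, n)).max(axis=1)
    # Empirical P(|Y_(n) - theta| >= eps) vs the exact value ((theta - eps)/theta)^n.
    empirical = np.mean(np.abs(y_max - theta) >= eps)
    exact = ((theta - eps) / theta) ** n
    print(f"n={n:4d}  empirical={empirical:.4f}  exact={exact:.4f}")
```

Both columns should shrink toward zero as $n$ increases, in line with the limit above.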

Solution 2:

If $E \left[ Y_{(n)} \right] \rightarrow \theta$, then since $Y_{(n)} \leq \theta$ (as the CDF calculation in Solution 1 shows) we have $E \left[ \left| Y_{(n)} - \theta \right| \right] = \theta - E \left[ Y_{(n)} \right] \rightarrow 0$, and you could use Markov's inequality to show that for $\varepsilon > 0$ $$\Pr \left[ \left| Y_{(n)} - \theta \right| \geq \varepsilon \right] \leq \frac{E \left[ \left| Y_{(n)} - \theta \right| \right]}{\varepsilon},$$ which goes to zero, proving convergence in probability.
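Under the same assumed Uniform$(0,\theta)$-maximum setup, $E[Y_{(n)}] = \frac{n\theta}{n+1}$, so $E[|Y_{(n)}-\theta|] = \frac{\theta}{n+1}$ and Markov's bound becomes $\frac{\theta}{(n+1)\varepsilon}$. A short sketch comparing this bound with the exact tail probability from Solution 1:

```python
# Numeric sketch under the assumed Uniform(0, theta) maximum setup, where
# E[Y_(n)] = n*theta/(n+1) and hence E[|Y_(n) - theta|] = theta/(n+1).
theta, eps = 1.0, 0.05

for n in (5, 20, 100, 500):
    markov_bound = (theta / (n + 1)) / eps      # E[|Y_(n) - theta|] / eps
    exact_prob = ((theta - eps) / theta) ** n   # exact tail probability from Solution 1
    print(f"n={n:4d}  Markov bound={markov_bound:.4f}  exact={exact_prob:.4f}")
```

The Markov bound decays only like $1/n$, much more slowly than the exact geometric rate, but that is still enough to establish convergence in probability.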