Consistency and asymptotic unbiasedness?

Solution 1:

Let $Y_n$ take the value $0$ with probability $(n-1)/n$ and take the value $n$ with probability $1/n$. Then $(Y_n)$ is a consistent sequence of estimators for zero but is not asymptotically unbiased: the expected value of $Y_n$ is $1$ for all $n$.
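As a quick sanity check, here is a short simulation of this counterexample (a sketch using NumPy; the sample sizes, seed, and choice $\epsilon=1/2$ are arbitrary):

```python
import numpy as np

def simulate_Yn(n, num_samples, rng):
    """Draw samples of Y_n: value n with probability 1/n, value 0 otherwise."""
    return np.where(rng.random(num_samples) < 1.0 / n, float(n), 0.0)

rng = np.random.default_rng(0)
for n in [10, 100, 1000]:
    y = simulate_Yn(n, 1_000_000, rng)
    # Consistency for the constant 0: P(|Y_n - 0| > 1/2) = 1/n -> 0.
    # But E[Y_n] = n * (1/n) = 1 exactly for every n, so the bias never shrinks.
    print(n, np.mean(np.abs(y) > 0.5), y.mean())
```

The empirical deviation probability shrinks like $1/n$ while the sample mean hovers around $1$ for every $n$.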

Conversely, consistency does imply asymptotic unbiasedness under a uniform bound on the second moments. Let $Y_1,Y_2,\dots$ be a consistent sequence of estimators for a random variable $X$: for every $\epsilon>0$, the probability of the event $|Y_n-X|>\epsilon$ tends to zero as $n\to\infty$. Assume in addition that $\mathbb E[|Y_n-X|^2]\leq C$ for all $n$ (this holds, for example, when the means and variances of the $Y_n$ and of $X$ are uniformly bounded, since $\mathbb E[|Y_n-X|^2]\leq 2\,\mathbb E[Y_n^2]+2\,\mathbb E[X^2]$). We want to show that the sequence is asymptotically unbiased: that $\mathbb E[Y_n-X]\to 0$ as $n\to\infty$. Splitting the expectation over the events $\{|Y_n-X|<\epsilon\}$ and $\{|Y_n-X|\geq\epsilon\}$ and applying the Cauchy–Schwarz inequality to the second piece gives $$\mathbb E[|Y_n-X|]\leq \epsilon+\sqrt{\mathbb E [|Y_n-X|^2]\,\mathbb{P}[|Y_n-X|\geq\epsilon]}\leq \epsilon+\sqrt{C\,\mathbb{P}[|Y_n-X|\geq\epsilon]}.$$ The second term tends to zero by consistency, so $\limsup_n \mathbb E[|Y_n-X|]\leq\epsilon$; since $\epsilon>0$ was arbitrary, $\mathbb E[|Y_n-X|]\to 0$.
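To see the bounded-second-moment case in action, consider a variant of the earlier counterexample: let $Y_n$ take the value $\sqrt n$ with probability $1/n$ and $0$ otherwise, so $\mathbb E[Y_n^2]=1$ for all $n$, and the target is again the constant $0$. A small exact computation (the choice $\epsilon=1/2$ is arbitrary) confirms that the bias $1/\sqrt n$ now vanishes and stays below the Cauchy–Schwarz bound:

```python
import math

def bias_and_bound(n, eps=0.5):
    """Exact quantities for Y_n = sqrt(n) w.p. 1/n, else 0 (target X = 0)."""
    bias = math.sqrt(n) / n                    # E[|Y_n|] = sqrt(n) * (1/n) = 1/sqrt(n)
    second_moment = 1.0                        # E[Y_n^2] = n * (1/n) = 1, uniform in n
    tail_prob = 1.0 / n                        # P(|Y_n| >= eps) when eps < sqrt(n)
    cs_bound = eps + math.sqrt(second_moment * tail_prob)
    return bias, cs_bound

for n in [10, 100, 10_000]:
    bias, bound = bias_and_bound(n)
    print(n, bias, bound)  # bias -> 0 and never exceeds eps + sqrt(C * tail_prob)
```

Unlike the unbounded example, here the uniform bound on $\mathbb E[Y_n^2]$ forces the bias to vanish, exactly as the argument above predicts.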

It is also useful to note the partial converse: by Chebyshev's inequality, if $\mathrm{Var}(Y_n-X)\to 0$, then asymptotic unbiasedness implies consistency.
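Spelling out that Chebyshev step: write $b_n=\mathbb E[Y_n-X]$ for the bias, and suppose $b_n\to 0$ and $\mathrm{Var}(Y_n-X)\to 0$. Fix $\epsilon>0$. For all $n$ large enough that $|b_n|<\epsilon/2$, the event $|Y_n-X|>\epsilon$ forces $|Y_n-X-b_n|>\epsilon/2$, so $$\mathbb{P}\bigl[|Y_n-X|>\epsilon\bigr]\leq \mathbb{P}\Bigl[|Y_n-X-b_n|>\tfrac{\epsilon}{2}\Bigr]\leq \frac{4\,\mathrm{Var}(Y_n-X)}{\epsilon^2}\to 0,$$ which is exactly consistency.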