Calculating the maximum-likelihood estimator of the exponential distribution and proving its consistency

The probability density function of the exponential distribution is defined as

$$ f(x;\lambda)=\begin{cases} \lambda e^{-\lambda x} &\text{if } x \geq 0 \\ 0 & \text{if } x<0 \end{cases} $$

Its likelihood function is

$$ \mathcal{L}(\lambda,x_1,\dots,x_n)=\prod_{i=1}^n f(x_i;\lambda)=\prod_{i=1}^n \lambda e^{-\lambda x_i}=\lambda^ne^{-\lambda\sum_{i=1}^nx_i} $$

To calculate the maximum likelihood estimator I solved the equation

$$ \frac{d\ln\left(\mathcal{L}(\lambda,x_1,\dots,x_n)\right)}{d\lambda}\overset{!}{=}0 $$

for $\lambda$.

$$ \begin{align} \frac{d\ln\left(\mathcal{L}(\lambda,x_1,\dots,x_n)\right)}{d\lambda} &= \frac{d\ln\left(\lambda^ne^{-\lambda\sum_{i=1}^nx_i}\right)}{d\lambda} \\ &= \frac{d\left(n\ln(\lambda)-\lambda\sum_{i=1}^n x_i\right)}{d\lambda} \\ &= \frac{n}{\lambda}-\sum_{i=1}^n x_i \end{align} $$

Setting this derivative to zero and solving for $\lambda$ gives the maximum-likelihood estimator $$\hat\lambda = \frac{n}{\sum\limits_{i=1}^n x_i}$$
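To confirm that this stationary point is indeed a maximum, note that the second derivative of the log-likelihood is strictly negative for every $\lambda>0$:

$$ \frac{d^2\ln\left(\mathcal{L}(\lambda,x_1,\dots,x_n)\right)}{d\lambda^2}=-\frac{n}{\lambda^2}<0 $$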

I hope this is correct so far.

Where I am more uncertain is the proof of consistency.

I understand that being consistent is in this case equivalent to converging in probability to $\lambda$. So I have a hunch that something like

$$ \lim_{n\to\infty}\mathbb{P}\left(\left|\hat\lambda-\lambda\right|>\varepsilon\right)=0 \quad\text{for every }\varepsilon>0 $$

will lead me to a solution.

Am I correct this far? If yes, how can I solve this? A hint would be great.


Update:

Using hints by users @Did and @cardinal, I will try to show consistency by proving that $\frac{1}{\Lambda_n}\to\frac{1}{\lambda}$ almost surely as $n\to\infty$, where

$$ \Lambda_n=\frac{n}{\sum\limits_{k=1}^nX_k} $$

Since $E(X_1)=\int\limits_0^\infty\lambda xe^{-\lambda x}\,dx=\frac{1}{\lambda}$ and the random variables $X_i$ for $i\ge1$ are independent and identically distributed, the strong law of large numbers implies that

$$ P\left(\lim_{n\to\infty}\frac{1}{\Lambda_n}=\frac{1}{\lambda}\right)=P\left(\lim_{n\to\infty}\frac1n\sum_{k=1}^nX_k=\frac{1}{\lambda}\right)=1 $$

holds, that is, $\frac{1}{\Lambda_n}\to\frac{1}{\lambda}$ almost surely. Since $t\mapsto\frac{1}{t}$ is continuous at $\frac{1}{\lambda}>0$, this yields $\Lambda_n\to\lambda$ almost surely, which in turn implies convergence in probability of $\Lambda_n$ to $\lambda$, and that is exactly consistency.
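As a quick numerical illustration of this convergence, here is a minimal sketch (it assumes NumPy is available; the true rate $\lambda=2$ and the sample sizes are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 2.0  # true rate parameter (arbitrary choice)

# For growing sample sizes, compute the MLE n / sum(x_i)
# and watch it concentrate around the true lambda.
for n in [10, 100, 1_000, 10_000, 100_000]:
    x = rng.exponential(scale=1 / lam, size=n)  # NumPy parametrizes by scale = 1/lambda
    lam_hat = n / x.sum()
    print(f"n = {n:>6}:  lambda_hat = {lam_hat:.4f}")
```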

Is this proof correct?


The computation of the MLE of $\lambda$ is correct.

The consistency is the fact that, if $(X_n)_{n\geqslant1}$ is an i.i.d. sequence of random variables with exponential distribution of parameter $\lambda$, then $\Lambda_n\to\lambda$ in probability, where $\Lambda_n$ denotes the random variable $$ \Lambda_n=\frac{n}{\sum\limits_{k=1}^nX_k}. $$ Thus, one is asked to prove that, for every positive $\varepsilon$, $\mathrm P(|\Lambda_n-\lambda|\geqslant\varepsilon)\to0$ when $n\to\infty$.

In the case at hand, it might be easier to prove the stronger statement that $\frac1{\Lambda_n}\to\frac1\lambda$ almost surely when $n\to\infty$. Hint: Law of large numbers.
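To illustrate the $\varepsilon$-definition of consistency above, here is a minimal Monte Carlo sketch (again assuming NumPy; the values of $\lambda$, $\varepsilon$ and the replication count are arbitrary choices). It estimates $\mathrm P(|\Lambda_n-\lambda|\geqslant\varepsilon)$ empirically and shows it shrinking as $n$ grows:

```python
import numpy as np

rng = np.random.default_rng(1)
lam, eps, reps = 2.0, 0.1, 1_000  # true rate, tolerance, replications (all arbitrary)

# Empirical estimate of P(|Lambda_n - lambda| >= eps) for growing n:
# consistency means this probability tends to 0.
for n in [10, 100, 1_000, 10_000]:
    x = rng.exponential(scale=1 / lam, size=(reps, n))
    lam_n = n / x.sum(axis=1)                    # one MLE per replication
    p_hat = np.mean(np.abs(lam_n - lam) >= eps)  # empirical exceedance probability
    print(f"n = {n:>5}:  P-hat = {p_hat:.3f}")
```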


For $\hat\lambda= \frac{n}{\sum_{i=1}^n x_i}$ to be a consistent estimator of $\lambda$, it suffices that, asymptotically, it is

  1. unbiased,
  2. and its variance goes to zero.

Using $E\left\{ x\right\}=\frac{1}{\lambda}$ and $E\left\{ x^2\right\}=\frac{2}{\lambda^2}$ and the fact that the $x_i$ are i.i.d., one has to verify

Condition 1: $\lim_{n\rightarrow \infty} E\{\hat\lambda - \lambda\}=0$

Condition 2: $\lim_{n\rightarrow \infty}E\left\{\left(\hat\lambda - E\{\hat\lambda\}\right)^2\right\}=0 $
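A sketch of how both conditions can be checked, using the standard fact (not derived here) that $S_n=\sum_{i=1}^n x_i$ follows a Gamma distribution with shape $n$ and rate $\lambda$, so that $E\{S_n^{-1}\}=\frac{\lambda}{n-1}$ and $E\{S_n^{-2}\}=\frac{\lambda^2}{(n-1)(n-2)}$ for $n>2$:

$$ E\{\hat\lambda\}=n\,E\{S_n^{-1}\}=\frac{n\lambda}{n-1}\xrightarrow[n\to\infty]{}\lambda $$

$$ E\left\{\left(\hat\lambda - E\{\hat\lambda\}\right)^2\right\}=n^2\left(\frac{\lambda^2}{(n-1)(n-2)}-\frac{\lambda^2}{(n-1)^2}\right)=\frac{n^2\lambda^2}{(n-1)^2(n-2)}\xrightarrow[n\to\infty]{}0 $$

so both conditions hold and $\hat\lambda$ is consistent.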