Is $\bar X$ a minimum variance unbiased estimator of $\theta$ in an exponential distribution?

Assuming the underlying population is exponential with mean $\theta$, i.e. with density $$f(x;\theta)=\frac{1}{\theta}e^{-x/\theta}\,\mathbb I(x),\quad\theta>0,$$

where $\mathbb I(x)=\begin{cases}1,&\text{if }x>0\\0,&\text{otherwise.}\end{cases}$

$(X_1,X_2,\ldots,X_n)$ is a random sample drawn from the above population.

Then, $\displaystyle\mathbb E_{\theta}(\bar X)=\mathbb E_{\theta}\left(\frac{1}{n}\sum_{i=1}^nX_i\right)=\frac{1}{n}\sum_{i=1}^n\mathbb E_{\theta}(X_i)=\frac{n\theta}{n}=\theta$ for all $\theta$.
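Here the step $\mathbb E_{\theta}(X_i)=\theta$ is just the mean of the exponential density, obtained by integrating by parts:

$$\mathbb E_{\theta}(X_i)=\int_0^{\infty}\frac{x}{\theta}\,e^{-x/\theta}\,dx=\Big[-xe^{-x/\theta}\Big]_0^{\infty}+\int_0^{\infty}e^{-x/\theta}\,dx=0+\theta=\theta.$$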

So, as expected, the sample mean $\bar X$ is unbiased for the population mean $\theta$.

Now, the joint density of $(X_1,X_2,\ldots,X_n)$ is \begin{align}f_{\theta}(\mathbf x)&=\prod_{i=1}^nf(x_i;\theta)\\&=\frac{1}{\theta^n}\exp\left(-\frac{1}{\theta}\sum_{i=1}^nx_i\right)\prod_{i=1}^n\mathbb I(x_i).\end{align} On the support (i.e. when every $x_i>0$), the log-density is $$\ln f_{\theta}(\mathbf x)=-n\ln \theta-\frac{1}{\theta}\sum_{i=1}^nx_i,$$ so the score function is \begin{align}\frac{\partial}{\partial\theta}\ln f_{\theta}(\mathbf x)&=-\frac{n}{\theta}+\frac{n\bar x}{\theta^2}\\&=\frac{n}{\theta^2}\left(\bar x-\theta\right).\end{align}

Thus we have expressed the score function in the form

$$\frac{\partial}{\partial\theta}\ln f_{\theta}(\mathbf x)=k(\theta)(T(\mathbf x)-\theta)\tag{1}$$

which is the equality condition in the Cramér-Rao inequality.
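To see the attainment explicitly: since $\operatorname{Var}_{\theta}(X_1)=\theta^2$, we have $\operatorname{Var}_{\theta}(\bar X)=\theta^2/n$, while the Fisher information of the sample is

$$I_n(\theta)=-\mathbb E_{\theta}\left[\frac{\partial^2}{\partial\theta^2}\ln f_{\theta}(\mathbf X)\right]=-\mathbb E_{\theta}\left[\frac{n}{\theta^2}-\frac{2n\bar X}{\theta^3}\right]=\frac{n}{\theta^2},$$

so the Cramér-Rao lower bound is $1/I_n(\theta)=\theta^2/n=\operatorname{Var}_{\theta}(\bar X)$.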

Hence we see that

  • $\bar X$ is an unbiased estimator of $\theta$.
  • $\bar X$ is the statistic $T(\mathbf X)$ satisfying the equality condition $(1)$ of the Cramér-Rao inequality; that is, the variance of $\bar X$ attains the Cramér-Rao lower bound for $\theta$, as computed above.

These two facts imply that $\bar X$ is the uniformly minimum variance unbiased estimator (UMVUE) of $\theta$.
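As an optional numerical sanity check (not a proof), here is a minimal simulation sketch in Python; the values of $\theta$, $n$ and the number of replications are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 2.0, 25, 200_000   # arbitrary illustrative choices

# Each row is one sample of size n from an exponential with mean theta
# (NumPy's `scale` parameter is the mean of the exponential).
samples = rng.exponential(scale=theta, size=(reps, n))
xbar = samples.mean(axis=1)          # one sample mean per replication

print("mean of xbar:", xbar.mean())  # approx theta = 2.0 (unbiasedness)
print("var of xbar:", xbar.var())    # approx theta^2/n = 0.16 (the CRLB)
```

The empirical mean and variance of $\bar X$ should match $\theta$ and $\theta^2/n$ up to Monte Carlo error.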


Here we have exploited a corollary of the Cramér-Rao inequality: for a family of distributions parametrised by $\theta$ satisfying the regularity conditions of the Cramér-Rao inequality, if a statistic $T$ is unbiased for $\theta$ and satisfies $(1)$, then $T$ must be the UMVUE of $\theta$. The same argument works for estimating a function $g(\theta)$, with $\theta$ replaced by $g(\theta)$ in $(1)$. Of course, this shortcut does not work in every problem, since the Cramér-Rao bound need not be attained by any unbiased estimator; in those cases, one has to use the theory of completeness and sufficiency, as mentioned in the other answer.

If you want to do this problem the usual way, you would prove that $\sum_{i=1}^n X_i$ is a complete sufficient statistic for the family of distributions, and that $\bar X$, being a function of this complete sufficient statistic, is an unbiased estimator of $\theta$; the Lehmann-Scheffé theorem then gives that $\bar X$ is the UMVUE of $\theta$. A sketch of that route follows.
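By the Fisher-Neyman factorization theorem, $T(\mathbf X)=\sum_{i=1}^nX_i$ is sufficient, since the joint density factors as

$$f_{\theta}(\mathbf x)=\underbrace{\frac{1}{\theta^n}\exp\left(-\frac{T(\mathbf x)}{\theta}\right)}_{g_{\theta}(T(\mathbf x))}\cdot\underbrace{\prod_{i=1}^n\mathbb I(x_i)}_{h(\mathbf x)}.$$

Moreover, this is a full-rank one-parameter exponential family (natural parameter $\eta=-1/\theta$ ranging over the open interval $(-\infty,0)$), so $T$ is also complete. Since $\bar X=T/n$ is a function of $T$ with $\mathbb E_{\theta}(\bar X)=\theta$, the Lehmann-Scheffé theorem applies.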