Verification of Sufficiency, Completeness, MLE, UMVUE, and Method of Moments for Estimators from a Certain Random Sample.

I was confronted with the following problem, and a little help would be welcome. This is what I have so far.

Problem

Let $X_1, X_2,...,X_n$ be a random sample from

$f(x;\theta)=e^{-(x-\theta)}I_{[\theta,\infty)}(x)$, for $-\infty<\theta<\infty$.

(a) Find a sufficient statistic.

(b) Find a maximum-likelihood estimator of $\theta$.

(c) Find a method-of-moments estimator of $\theta$.

(d) Is there a complete sufficient statistic? If so, find it.

(e) Find the UMVUE of $\theta$ if one exists.


Attempt at a solution

Let $I=\{1,2,...,n\}$

(a)

Let $Y=\min\{X_i:i \in I\}$.

The joint density factors as $\prod_{i \in I} f(x_i;\theta)=e^{n\theta}e^{-\sum_{i \in I}x_i}\,I_{[\theta,\infty)}(y)$, where $y=\min\{x_i:i \in I\}$, since all $x_i \geq \theta$ exactly when the minimum is. Taking $h(x)=e^{-\sum_{i \in I}x_i}$ and $g(y;\theta)=e^{n\theta}I_{[\theta,\infty)}(y)$ in the factorization theorem, $Y=\min\{X_i:i \in I\}$ must be a sufficient statistic.


(b)

By the factorization above, the likelihood equals $e^{n\theta-\sum_{i \in I} X_i}$ for $\theta \leq \min\{X_i:i \in I\}=Y$ (and is $0$ otherwise). Since this is strictly increasing in $\theta$, the likelihood is maximized at $\theta=Y$, so the MLE of $\theta$ must be $Y$, as defined above.
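As a numerical sanity check (just a sketch; the values of $\theta$, $n$ and the number of replications below are arbitrary choices), one can simulate shifted-exponential samples and confirm that the sample minimum never falls below $\theta$ and concentrates near it:

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 2.0, 50, 20000  # arbitrary illustrative values

# X_i = theta + E_i with E_i ~ Exp(1) has density e^{-(x - theta)} on [theta, inf)
samples = theta + rng.exponential(1.0, size=(reps, n))
mle = samples.min(axis=1)  # the MLE is the sample minimum

print((mle >= theta).all())  # the minimum can never fall below theta
print(mle.mean())            # slightly above theta (in fact E[Y] = theta + 1/n)
```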


I need help with (c)


(d)

Since this density is not in the exponential family (its support depends on $\theta$ through the indicator function), a different approach is needed.

Integrating the density $f(x;\theta)$ over $[\theta,x)$ gives the distribution function $F(x;\theta)=(1-e^{\theta-x})I_{[\theta,\infty)}(x)$. And since the event $[Y \geq y]$ is the intersection of the events $[X_i \geq y]$ for $i \in I$, we have $P[Y \geq y]=\left(e^{\theta-y}\right)^n$ for $y \geq \theta$, so the distribution function of $Y$ is $F_Y(y;\theta)=\left(1-e^{n(\theta-y)}\right)I_{[\theta,\infty)}(y)$ and $Y$ has density

$f_Y(y;\theta)=n\,e^{n(\theta-y)}I_{[\theta,\infty)}(y)$
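This distribution can be checked by a quick Monte Carlo sketch (the values of $\theta$, $n$ and the evaluation point below are arbitrary illustrative choices), comparing the empirical CDF of the sample minimum against $F_Y(y;\theta)=1-e^{-n(y-\theta)}$:

```python
import numpy as np

rng = np.random.default_rng(1)
theta, n, reps = 0.0, 10, 100_000  # arbitrary illustrative values

# sample minimum of n shifted-exponential observations, many times over
y = (theta + rng.exponential(1.0, size=(reps, n))).min(axis=1)

point = theta + 0.1
empirical = (y <= point).mean()             # empirical CDF of Y at `point`
exact = 1.0 - np.exp(-n * (point - theta))  # F_Y(y) = 1 - e^{-n(y - theta)}
print(empirical, exact)                     # the two should nearly agree
```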

Now, let $g$ be a measurable function such that $E_\theta[g(Y)]=0, \enspace \forall \theta \in \mathbb{R}$.

Observe that if, for all $\theta \in \mathbb{R}$,

$0=E_\theta[g(Y)]=n\int_{\theta}^{\infty}g(y)\,e^{n(\theta-y)}\,dy$,

then, since $ne^{n\theta} \neq 0$, also $\int_{\theta}^{\infty}g(y)\,e^{-ny}\,dy=0$ for all $\theta$; differentiating with respect to $\theta$ gives $g(\theta)\,e^{-n\theta}=0$ for (almost) every $\theta$.

Then $g(Y) \equiv 0$ almost everywhere.

Therefore, $Y=\min\{X_i:i \in I\}$ is complete and, by (a), sufficient as well.


(e)

In this step we should search for $T=t(Y)$, an unbiased estimator of $\theta$.

Using the density above, $E[Y]=\int_{\theta}^{\infty}y\,ne^{n(\theta-y)}\,dy=\theta+\frac{1}{n}$.

I am not sure how to finish: can the bias term $\frac{1}{n}$ be corrected so that the estimator's expectation "becomes" exactly $\theta$?


Note:

I am aware of the length of this exercise, and also that it may contain mistakes; corrections are welcome in that case.

Thanks in advance.


Solution 1:

Your solutions to parts a), b) and d) are correct.

For c), use the first moment based estimator. For the given distribution, $$\mathbb{E}(X_1)=\int_\theta^\infty xe^{\theta-x}dx=\int_\theta^\infty (x-\theta+\theta)e^{\theta-x}dx=\int_0^\infty (t+\theta)e^{-t}dt=\Gamma(2)+\theta=\theta+1,$$

where we use the substitution $t=x-\theta$, and the fact that $\Gamma(1)=\Gamma(2)=1.$ This means

$$\mathbb{E}(\bar{X})=\theta+1.$$

Naturally the method of moments estimator will be $\bar{X}-1.$
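As a quick sanity check (a sketch only; the values of $\theta$, $n$ and the replication count are arbitrary illustrative choices), simulation confirms that $\bar{X}-1$ is centered at $\theta$:

```python
import numpy as np

rng = np.random.default_rng(2)
theta, n, reps = 1.5, 40, 20000  # arbitrary illustrative values

samples = theta + rng.exponential(1.0, size=(reps, n))  # X_i - theta ~ Exp(1)
mom = samples.mean(axis=1) - 1.0                        # method-of-moments estimator

print(mom.mean())  # should be close to theta = 1.5
```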

e) We need $t(Y)$ to be unbiased for $\theta.$ It can be checked that $\mathbb{E}(Y)=\theta+\dfrac{1}{n}.$

We can then take $t(Y)=Y-\dfrac{1}{n}.$ This is of course unbiased and is a function of the complete sufficient statistic $Y,$ and hence must be the UMVUE.
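The same kind of simulation (again a sketch with arbitrary illustrative parameters) shows both that $Y-\frac{1}{n}$ is unbiased and that it beats the method-of-moments estimator: its variance is of order $1/n^2$, versus $1/n$ for $\bar{X}-1$:

```python
import numpy as np

rng = np.random.default_rng(3)
theta, n, reps = 1.5, 40, 20000  # arbitrary illustrative values

samples = theta + rng.exponential(1.0, size=(reps, n))
umvue = samples.min(axis=1) - 1.0 / n  # t(Y) = Y - 1/n
mom = samples.mean(axis=1) - 1.0       # method of moments, for comparison

print(umvue.mean())             # close to theta: unbiased
print(umvue.var() < mom.var())  # UMVUE variance ~1/n^2 vs MoM's ~1/n
```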

Solution 2:

Since $E(X_i-1)=\theta$, we have $$E\left[\frac{1}{n}\sum_{i=1}^n (X_i-1)\right]=\theta$$

So a method of moments estimator of $\theta$ is $$T_1(X_1,\ldots,X_n)=\frac{1}{n}\sum_{i=1}^n (X_i-1)$$

For UMVUE, you have to find an unbiased estimator which is a function of complete sufficient statistic. Verify that for $X_{(1)}=\min\limits_{1\le i\le n} X_i$,

$$E\left(X_{(1)}-\frac{1}{n}\right)=\theta$$

So UMVUE of $\theta$ is $$T_2(X_1,\ldots,X_n)=X_{(1)}-\frac{1}{n}$$

In both calculations it would help to notice that $X_i-\theta$ are i.i.d $\mathsf{Exp}(1)$ variables. In the second case, you can either derive the distribution of $X_{(1)}$ and then find its expectation, or you may directly say that $X_{(1)}-\theta$ is Exponential with mean $1/n$.
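That last observation can be checked numerically (a sketch; the parameter values below are arbitrary): the mean and variance of $X_{(1)}-\theta$ should match those of an Exponential with mean $1/n$, namely $1/n$ and $1/n^2$:

```python
import numpy as np

rng = np.random.default_rng(4)
theta, n, reps = 0.7, 25, 50_000  # arbitrary illustrative values

shifted_min = (theta + rng.exponential(1.0, size=(reps, n))).min(axis=1) - theta

# an Exponential with mean 1/n has variance 1/n^2
print(shifted_min.mean())  # approx 1/n   = 0.04
print(shifted_min.var())   # approx 1/n^2 = 0.0016
```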

For (a) and (b), answers are obviously correct but you might want to clarify your reasoning a bit.