Finding the maximum likelihood estimators for this shifted exponential PDF?

Solution 1:

The density of a single observation $x$ is $$f(x \mid \lambda, \theta) = \lambda e^{-\lambda(x-\theta)} \mathbb{1}(x \ge \theta).$$ The joint density of the entire sample $\boldsymbol x$ is therefore $$\begin{align*} f(\boldsymbol x \mid \lambda, \theta) &= \prod_{i=1}^n f(x_i \mid \lambda, \theta) \\ &= \lambda^n \exp\left(-\sum_{i=1}^n \lambda(x_i - \theta)\right) \mathbb{1}(x_{(1)} \ge \theta) \\ &= \lambda^n \exp\left(-\lambda n (\bar x - \theta)\right) \mathbb{1}(x_{(1)} \ge \theta), \end{align*}$$ where $\bar x$ is the sample mean and $x_{(1)} = \min_i x_i$ is the minimum order statistic. Hence the joint log-likelihood is $$\ell(\lambda, \theta \mid \boldsymbol x) = n \log \lambda - \lambda n (\bar x - \theta) + \log \mathbb{1}(x_{(1)} \ge \theta),$$ which equals $-\infty$ whenever $\theta > x_{(1)}$, since the indicator vanishes there.

We maximize over each parameter in turn. Because $\lambda > 0$, $\ell$ is an increasing function of $\theta$ on $\theta \le x_{(1)}$ for every fixed $\lambda$, and drops to $-\infty$ as soon as $\theta$ exceeds the sample minimum; hence $\ell$ is maximal with respect to $\theta$ when $\theta$ is made as large as possible without exceeding the minimum order statistic, i.e., $\hat \theta = x_{(1)}$.

For a fixed $\theta \le x_{(1)}$, $\ell$ is differentiable in $\lambda > 0$, so we compute the partial derivative $$\frac{\partial \ell}{\partial \lambda} = \frac{n}{\lambda} - n(\bar x - \theta),$$ whose only critical point is $$\lambda = \frac{1}{\bar x - \theta}.$$ Since $$\frac{\partial^2 \ell}{\partial \lambda^2} = -\frac{n}{\lambda^2} < 0,$$ $\ell$ is strictly concave in $\lambda$, so this critical point is a global maximum over $\lambda > 0$. Substituting $\hat \theta = x_{(1)}$, the joint maximum likelihood estimator is $$(\hat \lambda, \hat \theta) = \left((\bar x - x_{(1)})^{-1},\; x_{(1)}\right).$$

Note that when both $\lambda$ and $\theta$ are unknown, the MLE cannot contain any expressions involving $\lambda$ or $\theta$: an estimator must be a function of the sample and/or known constants alone.
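As a quick numerical sanity check of these closed-form estimators, here is a minimal sketch assuming NumPy, with true parameters $\lambda = 2$ and $\theta = 1$ chosen arbitrarily for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical true parameters, chosen only for this illustration.
lam_true, theta_true = 2.0, 1.0
n = 10_000

# Shifted exponential sample: theta plus an exponential draw with rate lambda.
x = theta_true + rng.exponential(scale=1 / lam_true, size=n)

# Closed-form MLEs derived above: theta_hat is the sample minimum,
# lambda_hat is the reciprocal of (sample mean - sample minimum).
theta_hat = x.min()
lam_hat = 1.0 / (x.mean() - theta_hat)

print(f"theta_hat  = {theta_hat:.4f} (true {theta_true})")
print(f"lambda_hat = {lam_hat:.4f} (true {lam_true})")
```

With $n = 10{,}000$ draws, $\hat\theta$ should sit just above $1$ and $\hat\lambda$ close to $2$. Note that $\hat\theta \ge \theta$ always, since every observation exceeds the true shift, so the estimator of $\theta$ is biased upward in finite samples.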