Derive an unbiased estimator for $\theta$ when $X_i\sim f(x\mid\theta)=\frac{2x}{\theta^2}\mathbb{1}_{(0,\theta)}(x)$

Solution 1:

Note that, in general, maximum likelihood estimators are not necessarily unbiased.

I'm not familiar with Lebesgue integration, but hopefully non-measure-theoretic tools will suffice here.

First of all, observe that $$\mathbb{E}[X_1]=\dfrac{2}{\theta^2}\int_{0}^{\theta}x^2\text{ d}x=\dfrac{2}{\theta^2}\cdot\dfrac{\theta^3}{3}=\dfrac{2\theta}{3}\text{.}$$ Thus, the estimator $$\hat{\theta}=\dfrac{3}{2n}\sum_{i=1}^{n}X_i$$ is unbiased for $\theta$, since $$\mathbb{E}[\hat{\theta}]=\dfrac{3}{2n}\sum_{i=1}^{n}\mathbb{E}[X_i]=\dfrac{3}{2n}\cdot \dfrac{2\theta}{3}\cdot n = \theta\text{.}$$
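As a quick sanity check (not part of the original answer), here is a short Monte Carlo simulation. It samples from $f$ by the inverse-CDF method, $X = \theta\sqrt{U}$ with $U \sim \mathrm{Uniform}(0,1)$, since $F(x) = x^2/\theta^2$ on $(0,\theta)$; the values of $\theta$, $n$, and the replication count are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 2.0, 50, 100_000  # hypothetical values for illustration

# Inverse-CDF sampling: F(x) = x^2 / theta^2 on (0, theta),
# so X = theta * sqrt(U) with U ~ Uniform(0, 1).
samples = theta * np.sqrt(rng.uniform(size=(reps, n)))

theta_hat = (3 / (2 * n)) * samples.sum(axis=1)
print(theta_hat.mean())  # close to theta = 2.0, as unbiasedness predicts
```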

Solution 2:

The likelihood function is\begin{align*} L(\theta; x_1, \ldots, x_n) &= \frac{2^n}{\theta^{2n}} \prod_{k = 1}^n x_k \prod_{k = 1}^n I_{(0, \theta)}(x_k)\\ &= \frac{2^n}{\theta^{2n}} \prod_{k = 1}^n x_k \, I_{(0, +\infty)}\left( \min_{1 \leqslant k \leqslant n} x_k \right) I_{(-\infty, \theta)}\left( \max_{1 \leqslant k \leqslant n} x_k \right). \end{align*} For fixed $x_1, \ldots, x_n$, $L(\theta; x_1, \ldots, x_n) = 0$ for $\theta < \max\limits_{1 \leqslant k \leqslant n} x_k$, and $L(\theta; x_1, \ldots, x_n)$ is decreasing in $\theta$ for $\theta > \max\limits_{1 \leqslant k \leqslant n} x_k$. Thus the MLE of $\theta$ is$$ \hat{\theta}(X_1, \ldots, X_n) = \max_{1 \leqslant k \leqslant n} X_k. $$
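Since the sample maximum can never exceed $\theta$, this MLE is biased downward, which the next paragraph quantifies. A short simulation (again my addition for illustration, with the same hypothetical parameters as above) makes the bias visible:

```python
import numpy as np

rng = np.random.default_rng(1)
theta, n, reps = 2.0, 50, 100_000  # same illustrative values as above

# Inverse-CDF sampler: X = theta * sqrt(U) since F(x) = x^2 / theta^2.
samples = theta * np.sqrt(rng.uniform(size=(reps, n)))

mle = samples.max(axis=1)  # the MLE is the sample maximum
print(mle.mean())  # systematically below theta = 2.0: the MLE is biased low
```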

Note that the CDF of each $X_i$ is $F(x; \theta) = x^2/\theta^2$ for $0 < x < \theta$, so the density function of $\max\limits_{1 \leqslant k \leqslant n} X_k$ is$$ f_n(x; \theta) = n (F(x; \theta))^{n - 1} f(x; \theta) = \frac{2n}{\theta^{2n}} x^{2n - 1} I_{(0, \theta)}(x), $$ thus$$ \mathbb{E}_\theta[\hat{\theta}] = \int_0^\theta x \cdot \frac{2n}{\theta^{2n}} x^{2n - 1} \,\mathrm{d}x = \frac{2n}{2n + 1} \theta. $$ Therefore, an unbiased estimator of $\theta$ is $\displaystyle \frac{2n + 1}{2n} \max\limits_{1 \leqslant k \leqslant n} X_k$.
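To close the loop, here is a hedged comparison of the two unbiased estimators (my addition, using the same illustrative parameters): both have mean close to $\theta$, but the corrected maximum has a far smaller variance, which is consistent with the maximum being a sufficient statistic for $\theta$ by the factorization above.

```python
import numpy as np

rng = np.random.default_rng(2)
theta, n, reps = 2.0, 50, 100_000  # illustrative values as before

samples = theta * np.sqrt(rng.uniform(size=(reps, n)))

est1 = (3 / (2 * n)) * samples.sum(axis=1)           # Solution 1: scaled sample mean
est2 = (2 * n + 1) / (2 * n) * samples.max(axis=1)   # Solution 2: corrected maximum

print(est1.mean(), est2.mean())  # both close to theta = 2.0 (unbiased)
print(est1.var(), est2.var())    # est2's variance is far smaller
```

In the runs I would expect from these parameters, the variance of the mean-based estimator is on the order of $\theta^2/(8n)$, while the corrected maximum concentrates much more tightly around $\theta$.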