Computing the UMVUE for Uniform$(0,\theta)$

We use the fact that, conditional on the maximum $X_{(n)}$, the remaining $n-1$ order statistics are distributed like the order statistics of $n-1$ i.i.d. $U(0,X_{(n)})$ variables. So, conditional on $X_{(n)}$, $$\bar X \stackrel{d}{=} \frac{X_{(n)} + \sum_{i=1}^{n-1}U_i}{n},$$ where the $U_i$ are i.i.d. $U(0,X_{(n)})$. Then we have $$ E(\bar X\mid X_{(n)}) = \frac{X_{(n)}+\frac{n-1}{2}X_{(n)}}{n} = \frac{(n+1)}{2n}X_{(n)}.$$
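(Not part of the argument, but a quick Monte Carlo sanity check of this identity: taking expectations on both sides of $E(\bar X\mid X_{(n)})=\frac{n+1}{2n}X_{(n)}$ gives $E[\bar X]=\frac{n+1}{2n}E[X_{(n)}]$, which we can verify numerically. The parameter values below are arbitrary choices for illustration.)

```python
import numpy as np

# Check: E[Xbar] should equal (n+1)/(2n) * E[X_(n)], both being theta/2.
rng = np.random.default_rng(0)
theta, n, reps = 3.0, 5, 200_000          # illustrative values, not canonical
samples = rng.uniform(0.0, theta, size=(reps, n))

xbar = samples.mean(axis=1)               # sample means
xmax = samples.max(axis=1)                # sample maxima

lhs = xbar.mean()                         # estimates E[Xbar] = theta/2
rhs = (n + 1) / (2 * n) * xmax.mean()     # estimates (n+1)/(2n) * E[X_(n)]
print(lhs, rhs)                           # both should be close to 1.5
```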

(We could have also derived this by the "what else could it possibly be?" method. It needs to be unbiased, a statistic, and a dimensionally sensible function of $X_{(n)}$... there's only one game in town here.)


Note that $X_{(n)}$ is a complete sufficient statistic for $\theta$. By the Lehmann–Scheffé theorem, the UMVUE of $\theta$ is the function of $X_{(n)}$ that is unbiased for $\theta$. Since $E[X_{(n)}]=\frac{n}{n+1}\theta$, the UMVUE must be $\left(\frac{n+1}{n}\right)X_{(n)}$, i.e. twice the conditional expectation computed above.
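(A numerical check of the unbiasedness claim, again with arbitrary illustrative parameters: the empirical mean of $\frac{n+1}{n}X_{(n)}$ should sit near $\theta$.)

```python
import numpy as np

# Check empirically that (n+1)/n * X_(n) is unbiased for theta.
rng = np.random.default_rng(1)
theta, n, reps = 2.0, 4, 200_000          # illustrative values
xmax = rng.uniform(0.0, theta, size=(reps, n)).max(axis=1)
umvue = (n + 1) / n * xmax
print(umvue.mean())                       # should be close to theta = 2.0
```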

By Lehmann–Scheffé, the UMVUE is equivalently given by $E\left[2X_1\mid X_{(n)}\right]$ or $E\left[2\overline X\mid X_{(n)}\right]$, since $2X_1$ and $2\overline X$ are both unbiased for $\theta$. As the UMVUE is unique whenever it exists, it must be that $$E\left[2X_1\mid X_{(n)}\right]=E\left[2\overline X\mid X_{(n)}\right]=\left(\frac{n+1}{n}\right)X_{(n)}$$

To find the conditional expectation somewhat intuitively, note that $X_1=X_{(n)}$ with probability $\frac1n$, since each observation is equally likely to be the maximum for i.i.d. continuous variables, and $X_1<X_{(n)}$ with probability $1-\frac1n$. Moreover, given $X_{(n)}=t$, the distribution of $X_1$ conditioned on $X_1<t$ is uniform on $(0,t)$.
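(Both facts are easy to probe by simulation, as a sketch with arbitrary parameters: the fraction of samples in which $X_1$ is the maximum should be near $\frac1n$, and on the remaining samples the ratio $X_1/X_{(n)}$ should behave like a $U(0,1)$ variable, so its mean should be near $\frac12$.)

```python
import numpy as np

# Check: P(X_1 = X_(n)) = 1/n, and X_1/X_(n) given X_1 < X_(n) has mean 1/2.
rng = np.random.default_rng(2)
theta, n, reps = 1.0, 5, 200_000          # illustrative values
x = rng.uniform(0.0, theta, size=(reps, n))
xmax = x.max(axis=1)

is_max = x[:, 0] == xmax                  # samples where X_1 achieves the max
print(is_max.mean())                      # should be close to 1/n = 0.2

ratio = x[~is_max, 0] / xmax[~is_max]     # X_1/X_(n) on samples with X_1 < max
print(ratio.mean())                       # should be close to 1/2
```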

As shown by @spaceisdarkgreen, it follows from the law of total expectation that

\begin{align} E\left[X_1\mid X_{(n)}=t\right]&=E\left[X_1\mid X_1=t\right]\cdot\frac1n+E\left[X_1\mid X_1<t\right]\cdot\left(1-\frac1n\right) \\&=\frac{t}{n}+\frac{t}{2}\left(1-\frac1n\right)=\left(\frac{n+1}{2n}\right)t \end{align}

Note that $E[X_1\mid X_1<t]=\frac{E[X_1\mathbf1_{X_1<t}]}{P(X_1<t)}=\frac t2$ can be directly verified for any $t\in(0,\theta)$.



An alternative method for finding the conditional expectation is using Basu's theorem.

Since $\frac{X_1}{X_{(n)}}=\frac{X_1/\theta}{X_{(n)}/\theta}$, its distribution is free of $\theta$ (an ancillary statistic). By Basu's theorem, $\frac{X_1}{X_{(n)}}$ is independent of the complete sufficient statistic $X_{(n)}$.
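(The independence conclusion can be probed numerically, as a rough sketch with arbitrary parameters: under independence, the empirical correlation between $\frac{X_1}{X_{(n)}}$ and $X_{(n)}$ should be near zero. Zero correlation does not prove independence, of course; this is only a sanity check.)

```python
import numpy as np

# Basu's theorem predicts X_1/X_(n) is independent of X_(n);
# a quick consequence to check is near-zero empirical correlation.
rng = np.random.default_rng(3)
theta, n, reps = 5.0, 6, 200_000          # illustrative values
x = rng.uniform(0.0, theta, size=(reps, n))
xmax = x.max(axis=1)
ratio = x[:, 0] / xmax

corr = np.corrcoef(ratio, xmax)[0, 1]
print(corr)                               # should be near 0
```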

Due to independence, $$E[X_1]=E\left[\frac{X_1}{X_{(n)}}\cdot X_{(n)}\right]=E\left[\frac{X_1}{X_{(n)}}\right]\cdot E[X_{(n)}]$$

Therefore,

\begin{align} E\left[X_1\mid X_{(n)}\right]&=E\left[\frac{X_1}{X_{(n)}}\cdot X_{(n)}\,\Big|\, X_{(n)}\right] \\&=X_{(n)}E\left[\frac{X_1}{X_{(n)}} \,\Big|\, X_{(n)}\right] \\&=X_{(n)}E\left[\frac{X_1}{X_{(n)}}\right] \\&=X_{(n)}\frac{E[X_1]}{E[X_{(n)}]} =X_{(n)}\cdot\frac{\theta/2}{n\theta/(n+1)}=\left(\frac{n+1}{2n}\right)X_{(n)}, \end{align}

in agreement with the earlier derivation.

Needless to say, the same calculation holds for $E\left[\overline X\mid X_{(n)}\right]$.