We can prove that the estimator $T$ defined below is the uniformly minimum variance unbiased estimator (UMVUE) of $\theta$, based on an i.i.d. sample $X_1,X_2,\ldots,X_n$ from the discrete uniform distribution over $\{1,2,\ldots,\theta\}$.

We first show that $X_{(n)}=\max(X_1,\ldots,X_n)$ is a complete sufficient statistic for $\theta$. Sufficiency is easy to prove using the factorization theorem; the argument for completeness is slightly more involved.
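Here is a brief sketch of completeness. The pmf of the maximum is $$P(X_{(n)}=y)=\frac{y^n-(y-1)^n}{\theta^n}\,,\quad y=1,2,\ldots,\theta\,.$$ If $E_\theta\left[g(X_{(n)})\right]=0$ for every $\theta\in\{1,2,\ldots\}$, then $$\sum_{y=1}^{\theta}g(y)\left(y^n-(y-1)^n\right)=0\quad\text{ for all }\theta\,.$$ Taking $\theta=1$ gives $g(1)=0$, and subtracting the equation for $\theta-1$ from the one for $\theta$ gives $g(\theta)\left(\theta^n-(\theta-1)^n\right)=0$, so $g(\theta)=0$ for every $\theta$ by induction. Hence $X_{(n)}$ is complete.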

Since $$E(X_1)=\frac{\theta+1}{2}\,,$$

an unbiased estimator of $\theta$ is $$h(X_1)=2X_1-1\,,$$ because $E\left(h(X_1)\right)=2\cdot\frac{\theta+1}{2}-1=\theta$.

By the Lehmann–Scheffé theorem, the UMVUE of $\theta$ is given by the conditional expectation $$E(h\mid X_{(n)})$$

Note that the conditional distribution of $X_1$ given $X_{(n)}$ is of the form

\begin{align} P(X_1=j\mid X_{(n)}=y)&=\begin{cases}\frac{y^{n-1}-(y-1)^{n-1}}{y^n-(y-1)^n}&,\text{ if }j=1,2,\ldots,y-1\\\\\frac{y^{n-1}}{y^n-(y-1)^n}&,\text{ if }j=y\\\\\quad0&,\text{ otherwise }\end{cases} \end{align}
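As a quick numerical sanity check, here is a minimal Python sketch (the helper name `cond_pmf` and the particular values of $n$, $y$, $\theta$ are illustrative choices, not part of the derivation) verifying that this pmf sums to $1$ and agrees with simulated frequencies. Note that the pmf does not involve $\theta$ at all, exactly as sufficiency of $X_{(n)}$ requires:

```python
import numpy as np

rng = np.random.default_rng(0)

def cond_pmf(j, y, n):
    """P(X_1 = j | X_(n) = y); note that theta does not appear."""
    denom = y**n - (y - 1)**n
    if 1 <= j <= y - 1:
        return (y**(n - 1) - (y - 1)**(n - 1)) / denom
    if j == y:
        return y**(n - 1) / denom
    return 0.0

n, y, theta = 5, 7, 10
print(sum(cond_pmf(j, y, n) for j in range(1, y + 1)))  # should print 1.0

# Empirical check: condition on X_(n) = y by keeping only those samples.
x = rng.integers(1, theta + 1, size=(200_000, n))
keep = x[x.max(axis=1) == y]
emp = np.bincount(keep[:, 0], minlength=y + 1)[1:] / len(keep)
print(np.round(emp, 4))                                       # simulated
print(np.round([cond_pmf(j, y, n) for j in range(1, y + 1)], 4))  # exact
```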

So, using $\sum_{j=1}^{y-1}j=\frac{y(y-1)}{2}$,

\begin{align} E(h\mid X_{(n)}=y)&=2E(X_1\mid X_{(n)}=y)-1 \\&=2\left[\frac{y^{n-1}-(y-1)^{n-1}}{y^n-(y-1)^n}\sum_{j=1}^{y-1}j+y\cdot\frac{y^{n-1}}{y^n-(y-1)^n}\right]-1 \\&=\frac{y(y-1)\left(y^{n-1}-(y-1)^{n-1}\right)+2y^n-\left(y^n-(y-1)^n\right)}{y^n-(y-1)^n} \\&=\frac{y^{n+1}-(y-1)^{n+1}}{y^n-(y-1)^n} \end{align}

Thus the UMVUE of $\theta$ is $$T=\frac{X_{(n)}^{n+1}-(X_{(n)}-1)^{n+1}}{X_{(n)}^n-(X_{(n)}-1)^n}$$

Naturally, $T$ is the most efficient estimator of $\theta$ within the unbiased class: no unbiased estimator has smaller variance at any value of $\theta$.
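If you want to see this numerically, here is a short Monte Carlo sketch (the values of `n`, `theta`, and the replication count are arbitrary choices) comparing $T$ with the naive unbiased estimator $2\bar X-1$:

```python
import numpy as np

rng = np.random.default_rng(1)
n, theta, reps = 5, 10, 200_000

x = rng.integers(1, theta + 1, size=(reps, n))
m = x.max(axis=1)

# The UMVUE derived above
T = (m**(n + 1) - (m - 1)**(n + 1)) / (m**n - (m - 1)**n)
# The naive unbiased competitor built from the sample mean
naive = 2 * x.mean(axis=1) - 1

print(T.mean(), naive.mean())  # both close to theta = 10
print(T.var(), naive.var())    # Var(T) is clearly the smaller of the two
```

Both sample means land near $\theta=10$, while $T$ shows the visibly smaller variance, as it must.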

Edit:

To derive the distribution of $X_1$ conditioned on $X_{(n)}$, we first find the joint pmf as follows:

For $j=1,2,\ldots,y-1$,

\begin{align} P(X_1=j,X_{(n)}=y)&=P\left(X_1=j,\max_{2\le i\le n} X_i=y\right) \\&=P(X_1=j)P\left(\max_{2\le i\le n} X_i=y\right) \\&=\frac{1}{\theta}\cdot\frac{y^{n-1}-(y-1)^{n-1}}{\theta^{n-1}}=\frac{y^{n-1}-(y-1)^{n-1}}{\theta^n} \end{align}

And for $j=y$,

\begin{align} P(X_1=j,X_{(n)}=y)&=P(X_1=y,X_{(n)}=y) \\&=P(X_1=y,X_2\le y,\ldots,X_n\le y) \\&=P(X_1=y)P(X_2\le y,\ldots,X_n\le y) \\&=\frac{1}{\theta}\cdot\left(\frac{y}{\theta}\right)^{n-1}=\frac{y^{n-1}}{\theta^n} \end{align}
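Finally, dividing each joint probability by $$P(X_{(n)}=y)=P(X_{(n)}\le y)-P(X_{(n)}\le y-1)=\frac{y^n-(y-1)^n}{\theta^n}$$ gives the conditional pmf displayed above; the factor $\theta^{-n}$ cancels, which is exactly what sufficiency of $X_{(n)}$ guarantees.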