Confusion in finding conditional expectation of indicator function (Lehmann–Scheffé)

I'm trying to find the UMVUE of $e^{-2\lambda}$ with
$X_1, X_2,\ldots,X_n \sim \operatorname{Poisson}(\lambda)$ being independent.

Since $T(X) := \sum_{i=1}^n X_i$ is a complete sufficient statistic for the Poisson family, and $W(X):= \mathbf{I}(X_1 + X_2 = 0)$ is an unbiased estimator of $e^{-2\lambda}$ (because $\mathbb{E}\,W(X) = \mathbb{P}(X_1 + X_2 = 0) = e^{-2\lambda}$), by Lehmann–Scheffé,
$$\tau(T)= \mathbb{E}(W(X)\mid T(X)) $$ is the UMVUE of $e^{-2\lambda}$.

I'm having trouble computing $\tau$, so I would like to check whether the following is correct.
First, to find the conditional expectation, I need the conditional PMF.
$$\mathbb{P}(W(X)=s\mid T(X)=t) = \frac{\mathbb{P}(W(X)=s \ \cap \ T(X) = t)}{\mathbb{P}(T(X) = t)} \\ = \frac{\mathbb{P}(\mathbf{I}(X_1 + X_2 = 0) = s \ \cap \ X_1 + X_2 + \cdots + X_n = t)}{\mathbb{P}(X_1 + X_2 + \cdots + X_n = t)}\\ = \frac{\mathbb{P}(\mathbf{I}(X_3 + X_4 + \cdots + X_n = t)=s)}{\mathbb{P}(X_1 + X_2 + \cdots + X_n = t)} $$ The denominator is just $\frac{e^{-n\lambda}(n\lambda)^t}{t!}$, since $\sum_{i=1}^n X_i \sim \operatorname{Poisson}(n\lambda)$.
When $s=1$, we have $$\mathbb{P}(\mathbf{I}(X_3+X_4+\cdots+X_n=t) =1) = \frac{e^{-(n-2)\lambda}((n-2)\lambda)^t}{t!} $$ So $$\mathbb{P}(\mathbf{I}(X_3+X_4+\cdots+X_n=t) =0) = 1 - \frac{e^{-(n-2)\lambda}((n-2)\lambda)^t}{t!} $$ After this, I'm unsure how to find the expectation. Is my approach correct?


Solution 1:

Just cancel the joint terms: $$ \varrho(X) = \mathbb{E}[W(X)\mid T=s]=\frac{\mathbb{P}(X_1 + X_2 = 0)\,\mathbb{P}\left(\sum_{i=3}^n X_i = s\right)}{\mathbb{P}\left(\sum_{i=1}^n X_i = s\right)} \\= \frac{e^{-2\lambda}\, e^{-(n-2)\lambda} (n-2)^s \lambda^s/s!}{ e^{-n\lambda} n^s \lambda^s/s! }. $$ After some cancellation and rearrangement, $$ \varrho(X) = \left(1 - \frac{2}{n}\right)^s = \left( \left( 1 - \frac{2}{n} \right)^n \right)^{\bar{X}_n}. $$ As a sanity check, the continuous mapping theorem and the WLLN give $(1-2/n)^n \to e^{-2}$ and $\bar{X}_n \xrightarrow{P} \lambda$, hence $\varrho(X) \xrightarrow{P} e^{-2\lambda}$.

And for the expectation, use the fact that $\sum_{i=1}^n X_i \sim \operatorname{Poisson}(n\lambda)$, hence $$ \mathbb{E}\,\varrho(X) = e^{-\lambda n}\sum_{s=0}^{\infty}\frac{\left( \lambda n\left( 1- 2/n\right)\right)^s}{s!}=e^{-\lambda n + \lambda n - 2\lambda} = e^{-2\lambda}. $$

Solution 2:

$\newcommand{\e}{\operatorname{E}}$ You need $\e(W \mid X_1+\cdots+X_n).$

To find that, first find $\e(W\mid X_1+\cdots+X_n = x)$ as a function of $x,$ as follows: \begin{align} & \e(W \mid X_1+\cdots+X_n = x) = \Pr(W=1\mid X_1+\cdots+X_n=x) \\[10pt] = {} & \frac{\Pr(W=1\ \&\ X_1+\cdots+X_n=x)}{\Pr(X_1+\cdots+X_n=x)} \\[10pt] = {} & \frac{\Pr(X_1=X_2=0)\Pr(X_3+\cdots+X_n=x)}{\Pr(X_1+\cdots+X_n=x)} \end{align} That last equality holds because the event $[W=1\ \&\ X_1+\cdots +X_n=x]$ is the same as the event $[X_1=X_2=0\ \&\ X_3+\cdots+X_n=x],$ and the event depending on $X_1,X_2$ is independent of the event depending on $X_3,\ldots,X_n.$
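This factorization can be spot-checked by simulation (a hedged sketch, not part of the answer; `poisson_sample` and `check_factorization` are names of my own, and the sampler uses Knuth's multiplication algorithm):

```python
import math
import random

def poisson_sample(lam, rng):
    # Knuth's algorithm: multiply uniforms until the product falls below e^{-lam};
    # the number of extra multiplications is Poisson(lam) distributed
    limit, k, p = math.exp(-lam), 0, rng.random()
    while p > limit:
        k += 1
        p *= rng.random()
    return k

def check_factorization(lam=0.5, n=6, x=2, trials=200_000, seed=42):
    """Monte Carlo estimate of Pr(W=1 and X_1+...+X_n = x) vs. the product form."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        xs = [poisson_sample(lam, rng) for _ in range(n)]
        if xs[0] == 0 and xs[1] == 0 and sum(xs) == x:
            hits += 1
    lhs = hits / trials  # empirical Pr(X_1 = X_2 = 0 and the total equals x)
    mu = (n - 2) * lam
    rhs = math.exp(-2 * lam) * math.exp(-mu) * mu ** x / math.factorial(x)
    return lhs, rhs
```

With the default arguments the empirical frequency and the product $\Pr(X_1=X_2=0)\Pr(X_3+\cdots+X_n=x)$ agree to within Monte Carlo error.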

Then just apply the probability mass function of the Poisson distribution and do a fair amount of simplification. That gets you a function of $x.$ That function evaluated at $X_1+\cdots+X_n$ will be the UMVUE.

Note that all occurrences of $\lambda$ must cancel out: the resulting estimator is a statistic, so it cannot depend on the unknown parameter. If that does not happen, then something is wrong.
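The cancellation can also be seen numerically: evaluating the pmf ratio at several values of $\lambda$ gives the same number, namely $(1-2/n)^x$ (a sketch of mine; `poisson_pmf` and `tau` are hypothetical names):

```python
import math

def poisson_pmf(k, mu):
    return math.exp(-mu) * mu ** k / math.factorial(k)

def tau(x, lam, n):
    # Conditional expectation E[W | X_1+...+X_n = x] via the pmf ratio above
    num = math.exp(-2 * lam) * poisson_pmf(x, (n - 2) * lam)
    den = poisson_pmf(x, n * lam)
    return num / den

# The lambda dependence cancels: the same value for every lambda
for lam in (0.1, 1.0, 3.5):
    print(round(tau(4, lam, 10), 12))   # prints 0.4096 = (1 - 2/10)**4 each time
```

This agrees with the closed form $\left(1 - \tfrac{2}{n}\right)^x$ obtained in Solution 1.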