I have stumbled upon a problem in the book *Statistical Inference*, second edition, by George Casella and Roger L. Berger, where I have not been able to follow one of the steps.

The problem says:

Problem 4.33

I was struggling to find the expected value using conditional expectation. I found a solution online for this problem. I understand most parts of it, but I don't see how we can say that $$E_N\left[\left[E\left(e^{t X_1}\mid N\right)\right]^N\right]=E_N\left[\left(\frac{\log\left(1-(1-p)e^t\right)}{\log p}\right)^{\!N}\right].$$

I understand that by the tower property we have $E_N\left[E(e^{t X_1}\mid N)\right]=E(e^{t X_1})$, but beyond this I don't really see how we can deal with the exponent and arrive at the equality that the solution uses.

Here's the solution:

Solution to problem 4.33


Solution 1:

For any sequence of independent and identically distributed random variables $X_1, X_2, \ldots$, we have $$\operatorname{E}[e^{t(X_1 + X_2 + \cdots + X_n)}] = \operatorname{E}[e^{tX_1} e^{tX_2} \cdots e^{tX_n}] \overset{\text{ind}}{=} \operatorname{E}[e^{tX_1}]\operatorname{E}[e^{tX_2}]\cdots \operatorname{E}[e^{tX_n}] \overset{\text{id}}{=} \operatorname{E}[e^{tX_1}]^n.$$

The first equality is just the algebraic expansion of the exponent. The second, with "ind" on top, follows because the $X_i$s are independent, so the expectation of the product equals the product of the expectations. The third, with "id" on top, follows because the $X_i$s are identically distributed, so each expectation $\operatorname{E}[e^{tX_k}]$, $k = 1, 2, \ldots, n$, has the same value.

Consequently, using the law of total expectation (aka "tower property"),

$$M_H(t) = \operatorname{E}[\operatorname{E}[e^{tH} \mid N]] = \operatorname{E}[M_X(t)^N],$$ where $M_X(t)$ is the MGF of a single $X$, which has the logarithmic series distribution. The inner step is exactly the display above, applied conditionally: since the $X_i$ are independent of $N$, given $N = n$ we have $\operatorname{E}[e^{tH} \mid N = n] = \operatorname{E}[e^{t(X_1 + \cdots + X_n)}] = \operatorname{E}[e^{tX_1}]^n = M_X(t)^n$, and this holds with $n$ replaced by the random $N$. Hence $$M_H(t) = \operatorname{E}\left[\left(\frac{\log (1-(1-p)e^t)}{\log p}\right)^{\!N}\right].$$ The rest follows as shown in the solution.

Alternatively, we can observe that the Poisson PGF is $$P_N(t) = \operatorname{E}[t^N] = \sum_{n=0}^\infty t^n e^{-\lambda} \frac{\lambda^n}{n!} = e^{\lambda(t-1)}.$$ Since $M_H(t) = \operatorname{E}[M_X(t)^N] = P_N(M_X(t))$, $$\begin{align} M_H(t) &= P_N\left(\frac{\log (1-(1-p)e^t)}{\log p}\right) \\ &= \exp \left( \lambda \frac{\log (1 - (1-p)e^t) - \log p}{\log p}\right) \\ &= \exp \left( \log \left(\frac{1 - (1-p)e^t}{p} \right)^{\!\lambda}\right) ^{1/\log p} \\ &= \left(\frac{1 - (1-p)e^t}{p} \right)^{\lambda/\log p}. \end{align}$$
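As an aside (if, as I recall, the point of the exercise is to identify the distribution of $H$): writing $r = -\lambda/\log p$, which is positive because $0 < p < 1$ makes $\log p < 0$, the final expression can be rearranged as $$M_H(t) = \left(\frac{1 - (1-p)e^t}{p} \right)^{\lambda/\log p} = \left(\frac{p}{1-(1-p)e^t}\right)^{\!r},$$ which is the MGF of a negative binomial distribution with success probability $p$ and (not necessarily integer) size parameter $r$. So the Poisson-stopped sum of logarithmic series variables is negative binomial.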