Variance of the random sum of a Poisson?
\begin{align} \operatorname{var}(Z) & = \operatorname{var}(\operatorname{E}(Z\mid N)) + \operatorname{E}(\operatorname{var}(Z\mid N)) & & (\text{This is the law of total variance.}) \\[10pt] & = \operatorname{var}(N\mu) + \operatorname{E}(N\sigma^2) \\[10pt] & = \mu^2 \operatorname{var}(N) + \sigma^2 \operatorname{E}(N) \\[10pt] & = \mu^2 \lambda + \sigma^2 \lambda = \lambda \operatorname{E}(X_1^2). \end{align}
Indeed, generally the $n$th cumulant of a compound Poisson distribution is the mean of the simple Poisson distribution times the $n$th raw moment of the distribution that gets compounded.
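As a quick Monte Carlo sanity check of both claims (the parameter values, the sampler, and all variable names below are my own illustrative choices, not part of the question): for $Z = X_1 + \cdots + X_N$ with $N \sim \mathrm{Poisson}(\lambda)$ and $X_i \sim \mathcal{N}(\mu, \sigma^2)$, we should see $\operatorname{var}(Z) \approx \lambda\,\mathbb{E}[X_1^2]$ and, per the cumulant fact, a third central moment (= third cumulant) $\approx \lambda\,\mathbb{E}[X_1^3]$.

```python
# Monte Carlo check of Var(Z) = lam*E[X^2] and kappa_3(Z) = lam*E[X^3]
# for a compound Poisson sum Z = X_1 + ... + X_N (N terms).
import math
import random

random.seed(0)
lam, mu, sigma = 2.0, 1.5, 0.5
n_trials = 200_000

def sample_poisson(lam):
    # Knuth's method: multiply uniforms until the product drops below e^(-lam).
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1

zs = []
for _ in range(n_trials):
    n = sample_poisson(lam)
    zs.append(sum(random.gauss(mu, sigma) for _ in range(n)))

mean = sum(zs) / n_trials
var_hat = sum((z - mean) ** 2 for z in zs) / n_trials   # sample variance
k3_hat = sum((z - mean) ** 3 for z in zs) / n_trials    # sample 3rd central moment

# Normal raw moments: E[X^2] = mu^2 + sigma^2, E[X^3] = mu^3 + 3*mu*sigma^2.
var_theory = lam * (mu**2 + sigma**2)          # here: 5.0
k3_theory = lam * (mu**3 + 3 * mu * sigma**2)  # here: 9.0
print(var_hat, var_theory)
print(k3_hat, k3_theory)
```

With these parameters the Monte Carlo standard errors are roughly $0.02$ for the variance and $0.1$ for the third moment, so both estimates land close to the theoretical values.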
PS: The above applies if the sum is from $1$ to $N$; I'll leave it as an exercise to figure out whether something needs to change if it's from $1$ to $N+1$.
Hint: rewrite $$ Z = \sum_{n=1}^\infty X_n\mathbb{1}_{\{N+1 \geq n\}} $$ and apply your "expectation of the sum is the sum of the expectations" idea.
Following the comment below, here is more detail.
Write $Y_n = \mathbb{1}_{\{N+1 \geq n\}}$, which is independent of $X_n$.
- Then for the expectation, you have $$\mathbb{E}[Z] = \sum_{n=1}^\infty \mathbb{E}[X_nY_n] = \sum_{n=1}^\infty \mathbb{E}[X_n] \mathbb{E}[Y_n]= \sum_{n=1}^\infty \mu\, \mathbb{P}\{N \geq n-1\} = \mu\sum_{n=0}^\infty \mathbb{P}\{N \geq n\} $$ using independence of $X_n$ and $Y_n$, which yields $$\mathbb{E}[Z] = \mu(\lambda+1)$$ since $\sum_{n=0}^\infty\mathbb{P}\{N \geq n\} = 1 + \sum_{n=1}^\infty\mathbb{P}\{N \geq n\} = 1 + \mathbb{E}[N] = \lambda + 1.$
- The variance will be a bit (a lot?) less straightforward, though, since the $Y_n$'s are not independent. But you have $\mathbb{E}[Z]^2$ already, so it only remains to compute $\mathbb{E}[Z^2]$. It's not very enjoyable, but you can do so by expanding the sum: $$\begin{align} \mathbb{E}[Z^2] &= \mathbb{E}\left[\sum_{n=1}^\infty\sum_{m=1}^\infty X_nY_nX_mY_m \right ] = \sum_{n=1}^\infty\sum_{m=1}^\infty \mathbb{E}\left[X_nY_nX_mY_m \right ] \\ &= \sum_{n=1}^\infty \mathbb{E}\left[X^2_n\right]\mathbb{E}\left[Y_n^2\right ] + 2\sum_{n=1}^\infty\sum_{m=n+1}^\infty \mathbb{E}\left[X_n\right]\mathbb{E}\left[X_m\right]\mathbb{E}\left[Y_n Y_m \right ] \\ &= \sum_{n=1}^\infty \mathbb{E}\left[X^2_n\right]\mathbb{E}\left[Y_n\right ] + 2\mu^2\sum_{n=1}^\infty\sum_{m=n+1}^\infty \mathbb{E}\left[Y_nY_m \right ] \end{align}$$ using that $Y_n^2=Y_n$ (it is a random variable taking only the values $0$ and $1$). Now, observe that for $m \geq n$, $Y_nY_m = Y_m$, so you get $$\begin{align} \mathbb{E}[Z^2] &= (\sigma^2+\mu^2)\sum_{n=1}^\infty \mathbb{E}\left[Y_n\right ] + 2\mu^2\sum_{n=1}^\infty\sum_{m=n+1}^\infty \mathbb{E}\left[Y_m \right ] \\ &= (\sigma^2+\mu^2)\sum_{n=1}^\infty \mathbb{E}\left[Y_n\right ] + 2\mu^2\sum_{m=1}^\infty (m-1)\mathbb{E}\left[Y_m \right ]\\ &= (\sigma^2+\mu^2)\sum_{n=0}^\infty \mathbb{P}\{N \geq n\} + 2\mu^2\sum_{m=0}^\infty m\mathbb{P}\{N \geq m\} \\ &= (\sigma^2+\mu^2)(\lambda+1) + 2\mu^2\sum_{m=0}^\infty m\mathbb{P}\{N \geq m\} \\ \end{align}$$ since $\sum_{n=0}^\infty \mathbb{P}\{N \geq n\} = 1 + \mathbb{E}[N] = \lambda + 1$, and you can continue by manipulating the last sum (it's quite tedious, but it works).
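For completeness, here is a sketch of that last manipulation. Swapping the order of summation,

$$\sum_{m=0}^\infty m\,\mathbb{P}\{N \geq m\} = \sum_{k=0}^\infty \mathbb{P}\{N = k\}\sum_{m=1}^{k} m = \frac{\mathbb{E}[N^2]+\mathbb{E}[N]}{2} = \lambda + \frac{\lambda^2}{2}$$

for $N \sim \mathrm{Poisson}(\lambda)$, since $\mathbb{E}[N^2] = \lambda + \lambda^2$. Hence $\mathbb{E}[Z^2] = (\sigma^2+\mu^2)(\lambda+1) + \mu^2(\lambda^2 + 2\lambda)$, and subtracting $\mathbb{E}[Z]^2 = \mu^2(\lambda+1)^2$ gives

$$\operatorname{var}(Z) = \sigma^2(\lambda+1) + \mu^2\lambda,$$

which agrees with applying the law of total variance directly to the sum of $N+1$ terms: $\mu^2\operatorname{var}(N+1) + \sigma^2\,\mathbb{E}[N+1] = \mu^2\lambda + \sigma^2(\lambda+1)$.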