If $X$ is a complete statistic, is a linear transformation of $X$ also complete?
I've searched online and it seems to be true, although I haven't found a proof of it.
I'm trying to apply it to this case:
In a particular example I have shown that $\sum_{i=1}^n X_i$ is a complete statistic for $\theta$. Now I want to argue that $\frac{1}{n}\sum_{i=1}^n X_i$ (the sample mean) is also complete, but I can't derive it from the following definition of completeness:
"$T$ is complete for $\theta$ if $E(g(T))=0$ implies $g(T)=0$ almost surely"
Solution 1:
If $T$ is a complete statistic, then $S=f(T)$, where $f$ is any measurable function, is complete. Indeed, take any measurable $g$. Then $$ \mathsf{E}_{\theta}g(S)=\mathsf{E}_{\theta}(g\circ f)(T)=0\text{ for each }\theta $$ implies, by the completeness of $T$, that $\mathsf{P}_{\theta}(g(S)=0)=\mathsf{P}_{\theta}((g\circ f)(T)=0)=1$ for every $\theta$.
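Applied to the question: take $f(t)=t/n$, which is a (linear, hence measurable) function of $T=\sum_{i=1}^n X_i$. The argument above then reads
$$
\mathsf{E}_{\theta}\,g\!\left(\tfrac{1}{n}\textstyle\sum_{i=1}^n X_i\right)
=\mathsf{E}_{\theta}\,(g\circ f)\!\left(\textstyle\sum_{i=1}^n X_i\right)=0\ \text{for all }\theta
\quad\Longrightarrow\quad
\mathsf{P}_{\theta}\!\left(g\!\left(\tfrac{1}{n}\textstyle\sum_{i=1}^n X_i\right)=0\right)=1\ \text{for all }\theta ,
$$
so the sample mean is complete whenever $\sum_{i=1}^n X_i$ is.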