Fisher Information for Geometric Distribution
Find the Cramér-Rao lower bound for unbiased estimators of $\theta$, and then give the approximate distribution of $\hat{\theta}$ as $n$ gets large. This is for a geometric($\theta$) distribution.
I am stuck on calculating the Fisher Information, which is given by $-nE_{\theta}\left(\dfrac{d^{2}}{d\theta^{2}}\log f(X\mid\theta)\right)$. So far, I have the second derivative of the log likelihood as $\dfrac{-n}{\theta^{2}}+\dfrac{\theta(n-\sum x_{i})}{(1-\theta)^{2}}$. I just need some help finding the expectation of this.
I think you miscalculated the log-likelihood:
$$L=\prod_{i=1}^{n}(1-\Theta)^{x_i-1}\Theta =(1-\Theta)^{\sum_{i=1}^{n}x_i-n}\cdot \Theta^n$$ Then you calculate $\ln L$: $$\ln L=\Big(\sum_{i=1}^{n}x_i-n\Big)\ln(1-\Theta)+n\ln\Theta$$ $$\frac{\partial \ln L}{\partial \Theta}=\frac{n}{\Theta}-\frac{\sum_{i=1}^{n}x_i-n}{1-\Theta}$$
The second derivative is the following: $$\frac{\partial^2\ln L}{\partial \Theta^2}=-\frac{n}{\Theta^2}-\frac{\sum_{i=1}^{n}x_i-n}{(1-\Theta)^2}$$
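If you want to double-check the differentiation, here is a minimal symbolic sketch (the symbol `S` stands in for $\sum_{i=1}^{n}x_i$, which is treated as a constant when differentiating with respect to $\Theta$):

```python
# Symbolic check of the first and second derivatives of the log-likelihood.
import sympy as sp

theta, S, n = sp.symbols('theta S n', positive=True)
logL = (S - n) * sp.log(1 - theta) + n * sp.log(theta)

d1 = sp.diff(logL, theta)       # expect: n/theta - (S - n)/(1 - theta)
d2 = sp.diff(logL, theta, 2)    # expect: -n/theta**2 - (S - n)/(1 - theta)**2

print(sp.simplify(d1))
print(sp.simplify(d2))
```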
For the Fisher information you need $$-E\bigg(\frac{\partial^2\ln L}{\partial \Theta^2}\bigg)$$
$$E\bigg(-\frac{n}{\Theta^2}-\frac{\sum_{i=1}^{n}x_i-n}{(1-\Theta)^2}\bigg)$$
In this expression the only random quantity is $\sum_{i=1}^{n}x_i$; all other terms are constants, and $E(\text{const})=\text{const}$, so you get the following:
$$E\bigg(-\frac{n}{\Theta^2}-\frac{\sum_{i=1}^{n}x_i-n}{(1-\Theta)^2}\bigg)=-\frac{n}{\Theta^2}-\frac{1}{(1-\Theta)^2}\bigg(E\Big(\sum_{i=1}^{n}x_i\Big)-n\bigg)$$
It is well known that if $x_i$ is geometric, then $E(x_i)=\frac{1}{\Theta}$.
By linearity of expectation, $E\big(\sum_{i=1}^{n}x_i\big)=n \cdot \frac{1}{\Theta}$.
So the expectation becomes $$-\frac{n}{\Theta^2}-\frac{1}{(1-\Theta)^2}\Big(\frac{n}{\Theta}-n\Big)=-\frac{n}{\Theta^2}-\frac{n}{\Theta(1-\Theta)},$$ since $\frac{n}{\Theta}-n=\frac{n(1-\Theta)}{\Theta}$. Finally you get the Fisher information:
$$F_{\Theta}=-E\bigg(-\frac{n}{\Theta^2}-\frac{\sum_{i=1}^{n}x_i-n}{(1-\Theta)^2}\bigg)=n\Big(\frac{1}{\Theta^2}+\frac{1}{\Theta(1-\Theta)}\Big)=\frac{n}{\Theta^2(1-\Theta)}$$
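As a quick numerical sanity check (a sketch only; the values $\Theta=0.3$ and $n=50$ are arbitrary choices, not from the question), one can estimate $E\big(-\frac{\partial^2\ln L}{\partial\Theta^2}\big)$ by simulation and compare it with $\frac{n}{\Theta^2(1-\Theta)}$:

```python
# Monte Carlo estimate of the expected negative second derivative of the
# log-likelihood, compared against the closed-form Fisher information.
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 0.3, 50, 20000

# numpy's geometric counts trials up to and including the first success (support 1, 2, ...)
samples = rng.geometric(theta, size=(reps, n))
S = samples.sum(axis=1)

observed_info = n / theta**2 + (S - n) / (1 - theta)**2   # -d^2 lnL / d theta^2
print(observed_info.mean())            # simulated E[-d^2 lnL / d theta^2]
print(n / (theta**2 * (1 - theta)))    # n / (theta^2 (1 - theta))
```

The Cramér-Rao lower bound asked for in the question is then the reciprocal of this, $\Theta^2(1-\Theta)/n$.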
By definition, the Fisher information $F(\theta)$ is the expectation
$$F(\theta)=\operatorname{E}_{\theta}\left[\left(\frac{\partial \ell(x,\theta)}{\partial \theta}\right)^2\right]=-\operatorname{E}_{\theta}\left[\frac{\partial^2 \ell(x,\theta)}{\partial \theta^2}\right],$$
where $\theta$ is a parameter to estimate and
$$\ell(x,\theta):=\log p(x,\theta), $$
denoting by $p(x,\theta)$ the probability distribution of the given random variable $X$.
The expectation $\operatorname{E}_{\theta}$ is taken with respect to $p(x,\theta)$. In other words,
$$F(\theta)=\int \left(\frac{\partial \ell(x,\theta)}{\partial \theta} \right)^2 p(x,\theta)\,dx$$
for a continuous random variable $X$ and similarly for discrete ones. Just use that
$$\operatorname{E}_{\theta}[f(X)]:=\sum_{k}f(k)p(k,\theta),$$ with $P_{\theta}(X=k):=p(k,\theta)$.
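For the geometric case this discrete formula can be applied directly; here is a small numerical sketch that truncates the infinite sum at an arbitrary cutoff $K$ and compares the result with the closed form $\frac{1}{\theta^2(1-\theta)}$ for a single observation:

```python
# Fisher information of one geometric observation via the discrete expectation
# E[(d log p / d theta)^2], with the sum truncated at k = K.
import numpy as np

theta, K = 0.3, 10_000
k = np.arange(1, K + 1)
p = (1 - theta) ** (k - 1) * theta               # P(X = k)
score = 1 / theta - (k - 1) / (1 - theta)        # d/dtheta log p(k, theta)

fisher = np.sum(score**2 * p)
print(fisher, 1 / (theta**2 * (1 - theta)))      # the two values should agree closely
```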