Expected Value and Variance of Monte Carlo Estimate of $\int_{0}^{1}e^{-x}dx$
Solution 1:
Let $X$ be a random variable uniformly distributed on the interval $[0,1]$. Its density function is $1$ on the interval and $0$ elsewhere. Let $Y=e^{-X}$. We first compute the mean and variance of the random variable $Y$.
We have $E(Y)=\int_0^1 e^{-x}\,dx=1-e^{-1}$. Call this $\mu$. It is the exact value of our integral. This is the whole point of the simulation: if we take a reasonably large sample, such as $1000$ points, the sample mean will probably be close to the true mean, which is the true value of the integral.
To find the variance of $Y$, we use the shortcut formula $\text{Var}(Y)=E(Y^2)-(E(Y))^2$. We have $E(Y^2)=E(e^{-2X})=\int_0^1 e^{-2x}\,dx=\frac{1}{2}(1-e^{-2})$. Now we can calculate the variance of $Y$, either as an exact expression or as a decimal approximation. Call it $\sigma^2$.
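The decimal approximations are quick to check. A minimal sketch in Python (not part of the original answer) evaluating $\mu$ and $\sigma^2$ from the formulas above:

```python
import math

# Exact mean and variance of Y = e^(-X), X ~ Uniform(0, 1)
mu = 1 - math.exp(-1)              # E(Y) = integral of e^(-x) from 0 to 1
ey2 = 0.5 * (1 - math.exp(-2))     # E(Y^2) = integral of e^(-2x) from 0 to 1
sigma2 = ey2 - mu ** 2             # shortcut formula Var(Y) = E(Y^2) - (E(Y))^2

print(mu)       # about 0.6321
print(sigma2)   # about 0.0328
```

So the integrand's values have mean $\mu\approx 0.6321$ and a fairly small variance $\sigma^2\approx 0.0328$.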
In your program, you have $1000$ random variables $Y_1,Y_2,\dots, Y_{1000}$, each with the same distribution as the above $Y$. We are studying the random variable $$\bar{Y}=\frac{Y_1+Y_2+\cdots +Y_{1000}}{1000}.$$ This random variable $\bar{Y}$ also has mean $\mu$. It can be shown (this is a general phenomenon) that the variance of $\bar{Y}$ is $\frac{\sigma^2}{1000}$.
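The general phenomenon is a consequence of independence: variances of independent random variables add, and a constant factor comes out squared. With $n=1000$,
$$\text{Var}(\bar{Y})=\text{Var}\!\left(\frac{1}{n}\sum_{i=1}^n Y_i\right)=\frac{1}{n^2}\sum_{i=1}^n \text{Var}(Y_i)=\frac{n\sigma^2}{n^2}=\frac{\sigma^2}{n}.$$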
We know $\sigma^2$, so we know the variance of $\bar{Y}$.
Note that $\bar{Y}$ is the result of a single run of your program, that is, $1000$ points.
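A single run can be sketched as follows (a Python illustration under the assumptions above, not the asker's actual program; the seed is fixed only for reproducibility):

```python
import math
import random

random.seed(0)  # fixed seed so the run is reproducible

N = 1000
# One run: average e^(-x) over N points drawn uniformly from [0, 1].
# This is one observed value of the random variable Y-bar.
ybar = sum(math.exp(-random.random()) for _ in range(N)) / N

mu = 1 - math.exp(-1)
print(ybar)  # should land close to mu, i.e. close to 0.6321
```

Each execution of the loop produces one realization of $\bar{Y}$; its typical distance from $\mu$ is on the order of $\sqrt{\sigma^2/1000}\approx 0.0057$.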
We can imagine repeating the simulation $1000$ times, that is, effectively, using $1$ million points.
The theory is much the same. The results of the simulations are random variables $\bar{Y}_1,\dots, \bar{Y}_{1000}$. If we average them, the result has variance equal to the variance of $\bar{Y}$ divided by $1000$, so it is $\frac{\sigma^2}{1000^2}$.
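To see both scaling laws empirically, one can repeat the run many times and compare the observed spread of the run averages with $\sigma^2/1000$. A hedged sketch (again Python, with an illustrative helper `one_run` that is not from the original answer):

```python
import math
import random

random.seed(1)  # fixed seed for reproducibility

def one_run(n=1000):
    """One simulation run: the average of e^(-x) over n uniform points."""
    return sum(math.exp(-random.random()) for _ in range(n)) / n

runs = [one_run() for _ in range(1000)]   # 1000 independent values of Y-bar
grand_mean = sum(runs) / len(runs)        # average of the run averages

# Sample variance of the run averages; should be near sigma^2 / 1000.
var_runs = sum((r - grand_mean) ** 2 for r in runs) / (len(runs) - 1)

mu = 1 - math.exp(-1)
sigma2 = 0.5 * (1 - math.exp(-2)) - mu ** 2

print(grand_mean)  # close to mu
print(var_runs)    # close to sigma2 / 1000, about 3.3e-5
```

The grand mean uses a million points in total, so its own variance is $\sigma^2/1000^2$, matching the claim above.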