Calculation of the n-th central moment of the normal distribution $\mathcal{N}(\mu,\sigma^2)$

Since integration is not my strong suit I need some feedback on this, please:

Let $Y$ be $\mathcal{N}(\mu,\sigma^2)$, the normal distribution with parameters $\mu$ and $\sigma^2$. I know $\mu$ is the expected value and $\sigma^2$ is the variance of $Y$.

I want to calculate the $n$-th central moment of $Y$.

The density function of $Y$ is $$f(x)=\frac{1}{\sigma\sqrt {2\pi}}e^{-\frac{1}{2}\left(\frac{x-\mu}{\sigma}\right)^2}$$

The $n$-th central moment of $Y$ is $$E[(Y-E(Y))^n]$$

The $n$-th moment of $Y$ is $$E(Y^n)=\psi^{(n)}(0)$$ where $\psi$ is the moment-generating function $$\psi(t)=E(e^{tY})$$

So I started calculating:

$$\begin{align} E[(Y-E(Y))^n]&=\int_\mathbb{R}\left(f(x)-\int_\mathbb{R}f(x)dx\right)^n\,dx \\ &=\int_\mathbb{R}\sum_{k=0}^n\left[\binom{n}{k}(f(x))^k\left(-\int_\mathbb{R}f(x)dx\right)^{n-k}\right]\,dx \\ &=\sum_{k=0}^n\binom{n}{k}\left(\int_\mathbb{R}\left[(f(x))^k\left(-\int_\mathbb{R}f(x)dx\right)^{n-k}\right]\,dx\right) \\ &=\sum_{k=0}^n\binom{n}{k}\left(\int_\mathbb{R}\left[(f(x))^k\left(-\mu\right)^{n-k}\right]\,dx\right) \\ &=\sum_{k=0}^n\binom{n}{k}\left((-\mu)^{n-k}\int_\mathbb{R}(f(x))^k\,dx\right) \\ &=\sum_{k=0}^n\binom{n}{k}\left((-\mu)^{n-k}E\left(Y^k\right)\right) \\ \end{align}$$

Am I on the right track or completely misguided? If I have made no mistakes so far, I would be glad to get some inspiration because I am stuck here. Thanks!


Solution 1:

The $n$-th central moment $\hat{m}_n = \mathbb{E}\left( \left(X-\mathbb{E}(X)\right)^n \right)$. Notice that for the normal distribution $\mathbb{E}(X) = \mu$, and that $Y = X-\mu$ also follows a normal distribution, with zero mean and the same variance $\sigma^2$ as $X$.

Therefore, finding the central moment of $X$ is equivalent to finding the raw moment of $Y$.

In other words, $$ \begin{eqnarray} \hat{m}_n &=& \mathbb{E}\left( \left(X-\mathbb{E}(X)\right)^n \right) = \mathbb{E}\left( \left(X-\mu\right)^n \right) = \int_{-\infty}^\infty \frac{1}{\sqrt{2\pi} \sigma} (x-\mu)^n \mathrm{e}^{-\frac{(x-\mu)^2}{2 \sigma^2}} \mathrm{d} x\\ & \stackrel{y=x-\mu}{=}& \int_{-\infty}^\infty \frac{1}{\sqrt{2\pi} \sigma} y^n \mathrm{e}^{-\frac{y^2}{2 \sigma^2}} \mathrm{d} y \stackrel{y = \sigma u}{=} \int_{-\infty}^\infty \frac{1}{\sqrt{2\pi} \sigma} \sigma^n u^n \mathrm{e}^{-\frac{u^2}{2}} \sigma \mathrm{d} u \\ &=& \sigma^n \int_{-\infty}^\infty \frac{1}{\sqrt{2\pi} } u^n \mathrm{e}^{-\frac{u^2}{2}} \mathrm{d} u \end{eqnarray} $$ The latter integral is zero for odd $n$ as it is the integral of an odd function over the real line. So consider $$ \begin{eqnarray} && \int_{-\infty}^\infty \frac{1}{\sqrt{2\pi} } u^{2n} \mathrm{e}^{-\frac{u^2}{2}} \mathrm{d} u = 2 \int_{0}^\infty \frac{1}{\sqrt{2\pi} } u^{2n} \mathrm{e}^{-\frac{u^2}{2}} \mathrm{d} u \\ && \stackrel{u=\sqrt{2 w}}{=} \frac{2}{\sqrt{2\pi}} \int_0^\infty (2 w)^n \mathrm{e}^{-w} \frac{\mathrm{d} w }{\sqrt{2 w}} = \frac{2^n}{\sqrt{\pi}} \int_0^\infty w^{n-1/2} \mathrm{e}^{-w} \mathrm{d} w = \frac{2^n}{\sqrt{\pi}} \Gamma\left(n+\frac{1}{2}\right) \end{eqnarray} $$ where $\Gamma(x)$ stands for Euler's Gamma function. Using the identity $\Gamma\left(n+\frac{1}{2}\right)=\frac{(2n)!}{4^n n!}\sqrt{\pi}$, the last expression equals $\frac{(2n)!}{2^n n!}=(2n-1)!!$, so we get $$ \hat{m}_{2n} = \sigma^{2n} (2n-1)!! \qquad\qquad \hat{m}_{2n+1} = 0 $$
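If you want to sanity-check the closed form numerically, here is a quick Python sketch (assuming SciPy is available; the values of $\mu$ and $\sigma$ are my own illustrative choice, not from the derivation above). It integrates $(x-\mu)^n f(x)$ directly and compares against $\sigma^{2n}(2n-1)!!$:

```python
from math import sqrt, pi, exp, inf
from scipy.integrate import quad

MU, SIGMA = 1.5, 2.0  # illustrative parameters, chosen arbitrarily

def pdf(x):
    # density of N(MU, SIGMA^2)
    return exp(-0.5 * ((x - MU) / SIGMA) ** 2) / (SIGMA * sqrt(2 * pi))

def central_moment(n):
    # E[(X - mu)^n] by direct numerical integration
    val, _ = quad(lambda x: (x - MU) ** n * pdf(x), -inf, inf)
    return val

def double_factorial(k):
    # (2n-1)!! = (2n-1)(2n-3)...3*1, with (-1)!! = 1 by convention
    out = 1
    while k > 1:
        out *= k
        k -= 2
    return out

for n in range(1, 4):
    closed_form = SIGMA ** (2 * n) * double_factorial(2 * n - 1)
    print(central_moment(2 * n), closed_form)  # even moments: should agree
    print(central_moment(2 * n - 1))           # odd moments: should be ~0
```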

Solution 2:

If $X\sim N(\mu,\sigma^2)$ then the $k$th central moment $E[(X-\mu)^k]$ is the same as the $k$th moment $E(Y^k)$ of $Y\sim N(0,\sigma^2)$.

For $Y\sim N(0,\sigma^2)$ the moment-generating function is$^\color{red}a$: $$E(e^{tY})=e^{t^2\sigma^2/2}.\tag1$$ One of the uses of the moment-generating function is, ahem, to generate moments. You can do this by expanding both sides of (1) as power series in $t$, and then matching coefficients. This is easily done for the normal distribution: Using $\displaystyle e^x=\sum\limits_{k=0}^\infty \frac {x^k}{k!}$, the LHS of (1) expands as $$ E(e^{tY})=E\left(\sum_{k=0}^\infty \frac{(tY)^k}{k!}\right)=\sum_{k=0}^\infty\frac{E(Y^k)}{k!}t^k\tag2 $$ while the RHS expands as $$ e^{t^2\sigma^2/2}=\sum_{k=0}^\infty \frac {(t^2\sigma^2/2)^k}{k!}=\sum_{k=0}^\infty\frac{\sigma^{2k}}{k!2^k}t^{2k}.\tag3 $$ By comparing coefficients of like powers of $t$ in (2) and (3), we see:

  • If $k$ is odd, then $E(Y^k)=0$.

  • If $k$ is even, say $k=2n$, then $\displaystyle\frac{E(Y^{2n})}{(2n)!}$, which is the coefficient of $t^{2n}$ in (2), equals the coefficient of $t^{2n}$ in (3), which is $\displaystyle\frac{\sigma^{2n}}{n!2^n}$. In other words: $$E(Y^{2n})=\frac{(2n)!}{n!2^n}\sigma^{2n}.\tag4 $$ By using $n!2^n=2(n)\cdot 2(n-1)\cdots2(1)=(2n)\cdot(2n-2)\cdots(2)$, we can rewrite (4) as: $$E(Y^{2n})=(2n-1)!!\,\sigma^{2n}.\tag5 $$
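The coefficient-matching result can be checked mechanically, e.g. with SymPy (a sketch of my own; it recovers $E(Y^k)$ by differentiating the MGF at $t=0$, which yields the same moments as the series expansion):

```python
import sympy as sp

t, sigma = sp.symbols('t sigma', positive=True)
mgf = sp.exp(t**2 * sigma**2 / 2)  # E(e^{tY}) for Y ~ N(0, sigma^2), eq. (1)

for k in range(1, 9):
    moment = sp.diff(mgf, t, k).subs(t, 0)  # E(Y^k) = psi^{(k)}(0)
    if k % 2:
        assert moment == 0  # odd moments vanish
    else:
        n = k // 2
        # (2n)!/(n! 2^n) * sigma^(2n) from (4), equal to (2n-1)!! sigma^(2n) in (5)
        expected = sp.factorial(2 * n) / (sp.factorial(n) * 2**n) * sigma**(2 * n)
        assert sp.simplify(moment - expected) == 0
print("moments match (4) and (5) for k = 1..8")
```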


$\color{red}a:$ If $Z$ has standard normal distribution then its moment generating function is

$$E(e^{tZ})=\int e^{tz}\frac1{\sqrt{2\pi}}e^{-\frac12z^2}\,dz=\int\frac1{\sqrt{2\pi}}e^{-\frac12(z^2-2tz)}dz=e^{t^2/2}\underbrace{ \int\frac1{\sqrt{2\pi}}e^{-\frac12(z-t)^2}dz }_{1}=e^{t^2/2}.$$

If $X\sim N(\mu,\sigma^2)$ then $X$ is distributed like $\mu+\sigma Z$ hence the moment generating function of $X$ is $$E(e^{tX})=E(e^{t(\mu +\sigma Z)})=e^{t\mu} E(e^{t\sigma Z}) = e^{t\mu+(t\sigma)^2/2}.$$
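As a final sanity check, the MGF of $X$ can be compared against a Monte Carlo estimate built from $X=\mu+\sigma Z$ (a NumPy sketch; the values of $\mu$, $\sigma$, and $t$ below are arbitrary illustrative choices):

```python
import numpy as np

mu, sigma, t = 0.5, 1.2, 0.7       # arbitrary illustrative values
rng = np.random.default_rng(0)

z = rng.standard_normal(1_000_000)
x = mu + sigma * z                 # X = mu + sigma*Z ~ N(mu, sigma^2)

empirical = np.exp(t * x).mean()   # Monte Carlo estimate of E(e^{tX})
closed_form = np.exp(t * mu + (t * sigma) ** 2 / 2)
print(empirical, closed_form)      # should agree to a couple of decimals
```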