Consider a random variable $X$ with cumulative distribution function $$ F(x)= [G(x)]^{\alpha} $$ where $G(x)$ is the baseline distribution function and $R_{G}(x)=1-G(x)$ is its survival function.

Substituting $R_{G}(x)$ into $F(x)$ and differentiating yields the density $f(x) = dF(x)/dx$ as

$$ f(x)= \alpha u'(x)e^{-u(x)}\left(1 - e^{-u(x)} \right)^{\alpha-1}$$

where $u(x)= -\ln R_G(x)$ and $u'(x)$ is the derivative $du(x) / dx$.
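
For reference, writing $G = 1 - R_G = 1 - e^{-u}$ gives $F = \left(1 - e^{-u}\right)^{\alpha}$, and the stated density is just the chain rule:

$$ f(x) = \frac{dF(x)}{dx} = \alpha \left(1 - e^{-u(x)}\right)^{\alpha-1} \frac{d}{dx}\left(1 - e^{-u(x)}\right) = \alpha\, u'(x)\, e^{-u(x)} \left(1 - e^{-u(x)}\right)^{\alpha-1} $$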

My question:

Show that the $r$th moment of $X$ following the above pdf is given by $$ E(X^{r})= r \sum_{j=1}^{\nu} c_{j} I_{j}(r) \\ c_{j}= (-1)^{j+1} \alpha (\alpha-1)\ldots (\alpha-j+1)\,/\,j! \qquad I_{j}(r) = \int_{0}^{\infty} x^{r-1} e^{-j u(x)} dx $$

where $\alpha > 0$ may be non-integer, and $\nu \in \{ \alpha, \infty \}$.
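
For a concrete illustration, take (say) an exponential baseline $G(x) = 1 - e^{-\lambda x}$ with rate $\lambda > 0$, so that $u(x) = \lambda x$ and $u'(x) = \lambda$. Then

$$ f(x) = \alpha\lambda\, e^{-\lambda x}\left(1 - e^{-\lambda x}\right)^{\alpha - 1}, \qquad I_j(r) = \int_0^{\infty} x^{r-1} e^{-j\lambda x}\, dx = \frac{\Gamma(r)}{(j\lambda)^{r}}, $$

and the claimed formula reduces to $E(X^{r}) = \Gamma(r+1) \sum_{j=1}^{\nu} c_j\, (j\lambda)^{-r}$, which is easy to check numerically.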



$\require{begingroup}\begingroup\renewcommand{\dd}[1]{\,\mathrm{d}#1}$Allow me to reiterate the setting (the question originally had confusing notation):

  1. $G$ is the baseline CDF for a non-negative random variable.
  2. $R_G \equiv 1 - G$ is the baseline survival function.
  3. Define $u \equiv -\log R_G$, or equivalently $R_G = e^{-u}$
  4. Consider a random variable $X$ whose CDF is $F = G^\alpha$, where $\alpha > 0 $ can be non-integer.
  5. When $\alpha \in \mathbb{N}$ is an integer, then $X \overset{d}{=}\max \{ W_1,~W_2,~\ldots~, W_{\alpha}\}$, namely, $X$ has the same distribution as the maximum of $\alpha$ iid variables with common CDF $G$ (see the simulation sketch right after this list). At any rate, $F$ is well-defined for any non-integer $\alpha > 0$.
  6. Taking $G = 1 - R_G = 1 - e^{-u}$ yields $F = (1 - e^{-u})^{\alpha}$, which upon differentiating yields the density $f$ as shown in the question post.
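
As a quick numerical sanity check of item 5 (a minimal sketch, assuming a standard exponential baseline and the arbitrary choice $\alpha = 3$), one can compare the empirical CDF of the maximum of $\alpha$ iid draws against $G^\alpha$:

    import numpy as np

    rng = np.random.default_rng(0)
    alpha, n = 3, 200_000            # integer alpha and sample size, both chosen arbitrarily

    # assumed baseline: standard exponential, G(x) = 1 - exp(-x)
    W = rng.exponential(size=(n, alpha))
    X = W.max(axis=1)                # X should then follow F = G^alpha

    x_grid = np.linspace(0.1, 8.0, 20)
    emp_cdf = (X[:, None] <= x_grid).mean(axis=0)   # empirical CDF of the maxima
    F_theory = (1.0 - np.exp(-x_grid)) ** alpha     # F = G^alpha
    print(np.abs(emp_cdf - F_theory).max())         # small discrepancy, on the order of 1e-3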

The $r$-th moment of $X$ by the simplest definition is

$$E[X^r] = \int_0^{\infty} x^r f(x) \dd{x}$$

Meanwhile, there's a well-known relation between the expectation and the integral of the survival function. In particular, see the second half of this answer and another answer from the statistics site for a visualization. In the current notation, it is

$$ E[X^r] = r \int_0^{\infty} x^{r-1} R_F(x) \dd{x} $$

where $R_F \equiv 1 - F$ is the survival function of $X$. Invoking item 6 above, we have

$$R_F(x) = 1 - \left(1 - e^{-u(x)}\right)^{\alpha} \label{eq_R_G_before_expansion} \tag*{Eq.(1)}$$
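
Here is a numerical sketch of this identity (again assuming a standard exponential baseline, $u(x) = x$, with the arbitrary values $\alpha = 2.5$ and $r = 2$); both sides can be evaluated with scipy.integrate.quad:

    import numpy as np
    from scipy.integrate import quad

    alpha, r = 2.5, 2                       # arbitrary illustrative parameters
    u = lambda x: x                         # assumed baseline: G(x) = 1 - exp(-x), so u(x) = x, u'(x) = 1
    f = lambda x: alpha * np.exp(-u(x)) * (1.0 - np.exp(-u(x))) ** (alpha - 1)
    R_F = lambda x: 1.0 - (1.0 - np.exp(-u(x))) ** alpha        # Eq.(1)

    lhs, _ = quad(lambda x: x**r * f(x), 0, np.inf)             # E[X^r] computed from the density
    rhs, _ = quad(lambda x: r * x**(r - 1) * R_F(x), 0, np.inf) # r * integral of x^(r-1) R_F(x)
    print(lhs, rhs)                         # the two values agree to quadrature accuracy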

Now apply the series expansion (generalized binomial theorem) to the quantity $1 - e^{-u(x)}$, whose magnitude is smaller than unity, raised to the $\alpha$-th power.

\begin{align} \left(1 - e^{-u(x)}\right)^{\alpha} &= \sum_{k = 0}^{\infty} { \alpha \choose k} \left(- e^{-u(x)} \right)^k \\ &= \sum_{k = 0}^{\infty} (-1)^k \frac{ \alpha (\alpha - 1) \cdots (\alpha - k + 1) }{k!} e^{-k\,u(x)} \\ &= 1 - \alpha e^{-u(x)} + \frac{ \alpha (\alpha - 1) }{ 2! } e^{-2u(x)} - \frac{ \alpha (\alpha - 1)(\alpha - 2) }{ 3! } e^{-3u(x)} + \cdots \\ &= 1 + \sum_{j = 1}^{\infty~~\text{or}~~\alpha} (-1)^j \frac{ \alpha (\alpha - 1) \cdots (\alpha - j + 1) }{ j! } e^{-j\,u(x)} \label{eq_expansion_for_R_G} \tag*{Eq.(2)} \end{align} Here ${\alpha \choose k} = \frac{\alpha(\alpha-1)\cdots(\alpha-k+1)}{k!}$ is the generalized binomial coefficient, which is well-defined for non-integer $\alpha$. The leading $1$ (the $k = 0$ term) is pulled out in anticipation of the next step, and the remaining terms are simply re-labelled with $j = k \ge 1$.

There are two cases for the upper summation limit:

  • when $\alpha$ is non-integer, the upper limit is $\infty$, since the series does not terminate;
  • when $\alpha \in \mathbb{N}$, the series is just the ordinary binomial expansion, which terminates at $j = \alpha$ with a total of $\alpha + 1$ terms (including the leading $1$).

One can succinctly write the summation as $\sum_{j = 1}^{\nu}$, where $\nu \in \{\alpha, \infty\}$, meaning $\nu$ is either $\alpha$ (integer case) or $\infty$ (non-integer case).
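
Both cases can be seen numerically (a small sketch; scipy.special.binom evaluates the generalized binomial coefficient for non-integer $\alpha$):

    import numpy as np
    from scipy.special import binom

    s = np.exp(-1.3)                        # plays the role of e^{-u(x)} at some fixed x, so 0 < s < 1
    for alpha in (2.5, 3):                  # a non-integer and an integer exponent
        exact = (1.0 - s) ** alpha
        partial = sum(binom(alpha, k) * (-s) ** k for k in range(60))
        print(alpha, exact, partial)        # the partial sums of Eq.(2) match the closed form

    print(binom(3, 4), binom(3, 5))         # both ~0: for integer alpha the coefficients vanish beyond k = alpha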

Putting \ref{eq_expansion_for_R_G} back into \ref{eq_R_G_before_expansion}, which in turn goes into the expectation integral, we have

$$R_F(x) = 1 - \left(1 - e^{-u(x)}\right)^{\alpha} = \sum_{j = 1}^{\nu} (-1)^{j+1} \frac{ \alpha (\alpha - 1) \cdots (\alpha - j + 1) }{ j! } e^{-j\,u(x)} \\ \implies \begin{aligned}[t] E[X^r] &= r \int_0^{\infty} x^{r-1} \sum_{j = 1}^{\nu} (-1)^{j+1} \frac{ \alpha (\alpha - 1) \cdots (\alpha - j + 1) }{ j! } e^{-j\,u(x)} \dd{x} \\ &= r \sum_{j = 1}^{\nu} (-1)^{j+1} \frac{ \alpha (\alpha - 1) \cdots (\alpha - j + 1) }{ j! } \int_0^{\infty} x^{r-1} e^{-j\,u(x)} \dd{x} \\ &= r \sum_{j = 1}^{\nu} c_j\, I_j(r) \end{aligned}$$ where the leading $1$ in \ref{eq_expansion_for_R_G} cancels the $1$ in \ref{eq_R_G_before_expansion} and the remaining minus sign turns $(-1)^j$ into $(-1)^{j+1}$. This is the desired expression.$\quad Q.E.D.$
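
Finally, a numerical sketch of the end result (under the same assumed exponential baseline $u(x) = \lambda x$, with the arbitrary values $\lambda = 1$, $\alpha = 2.5$, $r = 2$, and the infinite series truncated at $j = 200$):

    import numpy as np
    from scipy.integrate import quad
    from scipy.special import binom

    alpha, r, lam = 2.5, 2, 1.0                      # arbitrary illustrative parameters
    u = lambda x: lam * x                            # assumed exponential baseline
    f = lambda x: alpha * lam * np.exp(-u(x)) * (1.0 - np.exp(-u(x))) ** (alpha - 1)

    # Direct computation: E[X^r] = integral of x^r f(x) dx
    direct, _ = quad(lambda x: x**r * f(x), 0, np.inf)

    # Series formula: E[X^r] = r * sum_j c_j I_j(r), with c_j = (-1)^(j+1) * binom(alpha, j)
    # and I_j(r) = integral of x^(r-1) exp(-j u(x)) dx; truncated since alpha is non-integer
    series = 0.0
    for j in range(1, 201):
        c_j = (-1) ** (j + 1) * binom(alpha, j)
        I_j, _ = quad(lambda x, j=j: x ** (r - 1) * np.exp(-j * u(x)), 0, np.inf)
        series += c_j * I_j
    series *= r

    print(direct, series)                            # the two values agree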

In case you're wondering, the exchange of the integral and summation is justified: the summands with $j > \alpha$ all carry a common sign (so Tonelli's theorem handles that tail), and only finitely many terms remain outside it. Please see e.g. this for more details, or the relevant chapters in any textbook on real analysis. $\endgroup$