While reading about the average-case analysis of randomized quicksort, I came across the linearity of expectation applied to indicator random variables. I know what indicator random variables and expectation are, but what does linearity of expectation mean?


Solution 1:

Let $\xi_1,\xi_2:\Omega\to\mathbb R$ be two random variables on the same probability space $(\Omega,\mathscr F,\mathsf P)$ . The expectation of either is defined by $$ \mathsf E\xi_i:= \int_\Omega \xi_i(\omega)\mathsf P(\mathrm d\omega). $$ The linearity of the expectation means that for any constants $\alpha_1,\alpha_2\in\Bbb R$ it holds that $$ \mathsf E[\alpha_1\xi_1+\alpha_2\xi_2] = \alpha_1\mathsf E\xi_1+\alpha_2 \mathsf E\xi_2 $$ which follows directly from the linearity of the Lebesgue integral in the definition of the expectation. Hence, the functional $\mathsf E$ defined over the space of random variables on the probability space $(\Omega,\mathscr F,\mathsf P)$ is linear.
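To make this concrete, here is a small numeric sanity check (a sketch using NumPy; the distributions, the constants standing in for $\alpha_1,\alpha_2$, and the variable names are just illustrative choices, not part of the answer above). It estimates both sides of the identity by Monte Carlo for two *dependent* random variables, to stress that linearity requires no independence assumption.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Two *dependent* random variables: xi2 is built from xi1.
xi1 = rng.normal(loc=1.0, scale=2.0, size=n)
xi2 = 3.0 * xi1 + rng.normal(size=n)      # clearly not independent of xi1

a1, a2 = 2.5, -1.5                        # arbitrary constants alpha_1, alpha_2

lhs = np.mean(a1 * xi1 + a2 * xi2)            # estimate of E[a1*xi1 + a2*xi2]
rhs = a1 * np.mean(xi1) + a2 * np.mean(xi2)   # a1*E[xi1] + a2*E[xi2]

print(lhs, rhs)   # the two estimates agree up to Monte Carlo error
```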

As for products, if $\xi_1,\xi_2,\dots,\xi_n$ are random variables on the same probability space as above and they are mutually independent, then $$ \mathsf E\left[ \prod_{i=1}^n\xi_i\right] = \prod_{i=1}^n\mathsf E\xi_i. $$ Note that, unlike linearity, this identity does require independence.
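A minimal sketch of that contrast (again NumPy; the names and distributions are my own choices) compares $\mathsf E[XY]$ with $\mathsf E X\,\mathsf E Y$ for an independent pair and for a dependent pair:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# Independent pair: the product rule holds (up to sampling error).
x = rng.exponential(scale=2.0, size=n)    # E[x] = 2
y = rng.exponential(scale=3.0, size=n)    # E[y] = 3
print(np.mean(x * y), np.mean(x) * np.mean(y))   # both near 6

# Dependent pair: the product rule generally fails.
z = x                                            # z is trivially dependent on x
print(np.mean(x * z), np.mean(x) * np.mean(z))   # E[x^2] = 8 vs (E[x])^2 = 4
```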

Solution 2:

The expectation is a linear operator, meaning it satisfies the usual linearity property. For an operator $f$ and constants $a,b$, linearity means

$$f(ax_1+bx_2)=af(x_1)+bf(x_2)$$

As an example with the expectation,

$$E\left[\frac{1}{N}\sum_iX_i\right]=\frac{1}{N}\sum_iE[X_i].$$ If we further assume that $E[X_i]=\mu$ for all $i$, then we get

$$\frac{1}{N}\sum_iE[X_i]=\mu$$
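A quick simulation of this (NumPy, with a die-roll population of my own choosing) also highlights that the result does not need the $X_i$ to be independent: below, the $N$ values are drawn *without replacement*, so they are dependent, yet each $X_i$ still has mean $\mu$ and linearity gives $E[\frac1N\sum_iX_i]=\mu$.

```python
import numpy as np

rng = np.random.default_rng(2)

population = np.arange(1, 7)   # faces of a fair die, mu = 3.5
mu = population.mean()
N, trials = 4, 50_000

# Draw N faces without replacement, so the X_i are dependent,
# yet E[X_i] = mu for each i and linearity gives E[sample mean] = mu.
sample_means = [rng.choice(population, size=N, replace=False).mean()
                for _ in range(trials)]

print(mu, np.mean(sample_means))   # both close to 3.5
```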