Mean, Expected Value, or Expectation of a Constant?
According to Wikipedia, "the expected value of a constant is equal to the constant itself; i.e., if $c$ is a constant, then $\mathbb E[c] = c$." I am currently having a hard time picturing what this means. If I have a random variable $X$ that represents the number of heads in a single fair coin flip, then I can see how $\mathbb E(X)$ is $0.5$. But $\mathbb E(0.5)$ seems "meaningless." Can someone please explain the significance of the expected value of a constant?
Solution 1:
If it helps your intuition, think of it as a non-random random variable. Something like:
$$ p(y)=\begin{cases}1, &\quad \text{ if } y=c\\ 0, &\quad \text{otherwise} \end{cases} $$
So then the expected value is $$ E(Y)=\sum_y \Pr(Y=y)\cdot y=1\cdot c=c. $$
If it's a constant, it can't vary and there's no randomness. So it must be itself, because it cannot be anything else!
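A quick way to see this concretely: sampling from a degenerate distribution always returns the same value, so the sample mean is exactly the constant. This is an illustrative sketch (the names `draw` and `c` are just for the example):

```python
# A degenerate "random" variable: every draw returns the constant c.
c = 0.5

def draw():
    # P(Y = c) = 1, and P(Y = y) = 0 for every other y,
    # so there is nothing random about the outcome.
    return c

samples = [draw() for _ in range(10_000)]
empirical_mean = sum(samples) / len(samples)
print(empirical_mean)  # exactly c, regardless of the sample size
```

No matter how many samples you take, the empirical mean never deviates from $c$, which is the intuition behind $\mathbb E[c]=c$.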
Solution 2:
You are computing the expected value of a random variable $X$ whose outcome is always the same. Let us restrict attention to the discrete case, for simplicity. Formally, you want to compute the expected value of the random variable
$$X:(\Omega,P)\rightarrow \operatorname{Im}(X):=\{c\}$$
where $\omega\mapsto X(\omega):=c$ for all $\omega\in\Omega$, with $(\Omega, P)$ finite probability set and $c\in\mathbb R$. Then
$$P_X(X=c):=P(\{\omega\in\Omega~:X(\omega)=c\})=P(\Omega):=1$$
and
$$\mathbb E[X]:=\sum_{c_i\in\operatorname{Im}(X)}c_i\cdot P_X(X=c_i)=c\cdot P_X(X=c)=c\cdot 1=c.$$
The continuous case is similar, with technical differences.
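The discrete formula above can be sketched directly in code. This is a minimal illustration (the function name `expected_value` and the example PMFs are my own choices, not from the answer):

```python
def expected_value(pmf):
    """E[X] = sum of c_i * P(X = c_i) over the image of X.

    pmf: dict mapping each value c_i to its probability P(X = c_i).
    """
    return sum(value * prob for value, prob in pmf.items())

c = 0.5
constant_pmf = {c: 1.0}           # Im(X) = {c}: all probability mass on c
print(expected_value(constant_pmf))  # 0.5

fair_coin = {0: 0.5, 1: 0.5}      # number of heads in one fair flip
print(expected_value(fair_coin))     # 0.5
```

For the constant random variable the sum collapses to the single term $c \cdot 1 = c$, exactly as in the derivation above.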
Solution 3:
I think the key here is to view $X$ as what it is: a random variable.
When you write $\mathbb E(X)$ where $X$ is a constant $c$, you are treating $X$ as a degenerate random variable that takes the value $c$ with probability $1$. Its expected value is therefore $c \cdot 1 = c$.