Expected probability of a weighted coin
Suppose a weighted coin has probability $p$ of flipping heads, where $p$ was selected uniformly at random from $[0, 1]$. After seeing $a$ heads and $b$ tails, what is the expected value of $p$?
Let $p_{a,b}(x)$ be the posterior PDF of $p$ after observing $a$ heads and $b$ tails, where $x \in [0, 1]$. The prior is uniform, so $p_{0,0}(x) = 1$. Each observed head multiplies the density by a factor of $x$, and each observed tail by a factor of $1-x$ (followed by renormalization). Therefore $p_{a,b}(x) = x^a \cdot (1-x)^b / c$, where $c$ is a normalizing constant depending on $a$ and $b$, chosen so that $\int_{0}^{1} p_{a,b}(x)\, dx = 1$.
Integrating, $\int_{0}^{1} x^a \cdot (1-x)^b\, dx = \frac{a! \cdot b!}{(a+b+1)!}$, so $c = \frac{a! \cdot b!}{(a+b+1)!}$. The expected value is $E_{a,b} = \int_{0}^{1} x \cdot p_{a,b}(x)\, dx = \frac{1}{c}\int_{0}^{1} x^{a+1} \cdot (1-x)^b\, dx$, and re-using the previous integral with $a+1$ in place of $a$, this evaluates to
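As a sanity check, the value of $c$ can be verified numerically. Below is a small Python sketch (function names are my own) that compares a midpoint-rule approximation of $\int_0^1 x^a (1-x)^b\, dx$ against the factorial formula:

```python
from math import factorial

def beta_integral(a, b, n=200_000):
    # Midpoint-rule approximation of the integral of x^a * (1-x)^b over [0, 1].
    h = 1.0 / n
    return sum(((i + 0.5) * h) ** a * (1 - (i + 0.5) * h) ** b for i in range(n)) * h

def c_exact(a, b):
    # The claimed closed form: a! * b! / (a+b+1)!
    return factorial(a) * factorial(b) / factorial(a + b + 1)

print(beta_integral(3, 2), c_exact(3, 2))  # the two values should agree closely
```

With $a = b = 0$ the integral is $1$, matching the uniform prior's normalization.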
$$\frac{(a+b+1)!}{a! \cdot b!} \cdot {\frac{(a+1)! \cdot b!}{(a+b+2)!}} = \frac{a+1}{a+b+2}$$
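The factorial simplification can be confirmed exactly with rational arithmetic; here is a quick check (function name is my own) over a grid of small $a, b$:

```python
from fractions import Fraction
from math import factorial

def closed_form_check(a, b):
    # Exact computation of (a+b+1)!/(a! b!) * (a+1)! b!/(a+b+2)!
    lhs = Fraction(factorial(a + b + 1), factorial(a) * factorial(b)) \
        * Fraction(factorial(a + 1) * factorial(b), factorial(a + b + 2))
    return lhs == Fraction(a + 1, a + b + 2)

print(all(closed_form_check(a, b) for a in range(10) for b in range(10)))  # → True
```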
This makes sense, as $E_{a,a} = \frac{1}{2}$, $E_{a,b} \to 1$ as $a \to \infty$, and $E_{a,b} \to 0$ as $b \to \infty$. However, I wonder if there is a combinatorial reason for this that doesn't need integrals.
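The result can also be checked by simulation, with no integrals at all: draw $p$ uniformly, flip $a+b$ coins, and keep $p$ only when exactly $a$ heads come up. A rough Python sketch (function name, trial count, and seed are my own choices):

```python
import random

def simulated_posterior_mean(a, b, trials=300_000, seed=1):
    # Rejection sampling: condition on seeing exactly a heads in a+b flips,
    # then average the accepted values of p.
    rng = random.Random(seed)
    total, kept = 0.0, 0
    for _ in range(trials):
        p = rng.random()
        heads = sum(rng.random() < p for _ in range(a + b))
        if heads == a:
            total += p
            kept += 1
    return total / kept  # assumes at least one draw was accepted

# Compare against the closed form (a+1)/(a+b+2) = 4/7 for a=3, b=2.
print(simulated_posterior_mean(3, 2), (3 + 1) / (3 + 2 + 2))
```

Conditioning on the count of heads rather than a specific sequence changes the likelihood only by a constant binomial factor, so the posterior (and its mean) is the same.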