Does almost sure convergence imply convergence of the mean?
I asked a similar question here: Does convergence in probability imply convergence of the mean?, but now I wish to examine a stricter scenario: let $\{X_n\}_{n=1}^\infty$ be a sequence of random variables converging a.s. to a constant $c$. Must the sequence be uniformly integrable in order for $\lim_{n\to \infty} E[X_n] = c$ to hold?
And what about $\lim_{n\to \infty} E[Y h(X_n)] = E[Y]h(c)$ for some random variable $Y$ and a continuous function $h$? Under which regularity conditions does the last equality hold?
For the first question, try $P[X_n=0]=1-1/n^2$, $P[X_n=2^n]=1/n^2$. Since $\sum_n 1/n^2 < \infty$, the Borel–Cantelli lemma gives $X_n\to0$ almost surely, but $E[X_n]=2^n/n^2\to\infty$, so the means do not converge to $0$. (The second question is unclear as stated.)