Given an ergodic property that guarantees convergence of sample means to an expectation, how can I bound the Cesàro mean of the expectations of the terms?

I have a sequence of (not i.i.d.) random variables $\{E_{i}\}$ that converges in distribution (in fact, in total variation), as well as in $\mathcal{L}^{2}$, to $E_{\infty}$. For all nonnegative functions $f$ with $\mathbb{E}[f(E_{\infty})]<\infty$, it's established that \begin{align} \lim_{T\rightarrow\infty}\frac{1}{T}\sum_{n=1}^{T}f(E_{n}) \overset{\mathrm{a.s.}}{=}\mathbb{E}[f(E_{\infty})]. \end{align} I would like to bound (again for a nonnegative $f$) \begin{align} \limsup_{T\rightarrow\infty}\ \frac{1}{T}\sum_{n=1}^{T}\mathbb{E}[f(E_{n})]. \end{align} In my particular case, I've been unable to uniformly bound the random variables $f(E_{i})$, so I cannot use, for example, the reverse Fatou lemma to interchange the limit and expectation and conclude something like $\limsup_{T\rightarrow\infty}\mathbb{E}[\frac{1}{T}\sum_{n=1}^{T}f(E_{n})] \le \mathbb{E}[\limsup_{T\rightarrow\infty}\frac{1}{T}\sum_{n=1}^{T}f(E_{n})]$. Does anyone have advice on how to proceed, in particular, ideas for how to interchange the limit and expectation without some kind of uniform bound?
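To illustrate the failure mode I'm worried about, here is the classical toy example of a sequence that converges a.s. while its expectations do not follow (a quick NumPy sketch; the example is standard and not specific to my problem): $f_n = n\,1\{U < 1/n\}$ with $U$ uniform on $(0,1)$ satisfies $f_n \to 0$ a.s. yet $\mathbb{E}[f_n] = 1$ for every $n$.

```python
import numpy as np

rng = np.random.default_rng(0)
U = rng.random(1_000_000)           # U ~ Uniform(0, 1)

for n in [10, 1_000, 10_000]:
    f_n = n * (U < 1.0 / n)         # f_n = n * 1{U < 1/n}: f_n -> 0 a.s.
    print(n, f_n.mean())            # ...yet E[f_n] = 1 for every n
```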

The trouble is that I don't know much about the function $f$. One potentially relevant fact: I can prove that $\mathbb{E}[f(E_{n})]$ is finite for every $n$.


As in my comments, these are the best possible bounds: the lower bound $E[f(E_{\infty})]$ always holds, and no finite upper bound holds in general: $$ E[f(E_{\infty})] \leq \limsup_{T\rightarrow\infty}\frac{1}{T}\sum_{n=1}^TE[f(E_n)] \leq \infty$$


Lower bound: Suppose $f$ is a nonnegative function such that $$ \lim_{T\rightarrow\infty}\frac{1}{T}\sum_{n=1}^T f(E_n) = E[f(E_{\infty})] \quad \mbox{almost surely.}$$ We have \begin{align} \limsup_{T\rightarrow\infty}\frac{1}{T}\sum_{n=1}^TE[f(E_n)] &\geq \liminf_{T\rightarrow\infty}\frac{1}{T}\sum_{n=1}^TE[f(E_n)]\\ &\overset{(a)}{\geq} E\left[\liminf_{T\rightarrow\infty}\frac{1}{T}\sum_{n=1}^Tf(E_n) \right]\\ &= E[f(E_{\infty})] \end{align} where (a) holds by Fatou's lemma for sequences of nonnegative random variables. The lower bound is achieved by taking $E_n=E_{\infty}=0$ for all $n$.


Upper bound: Let $Y$ be geometric with success probability $1/4$, so that $P[Y>n]=(3/4)^n$, and define $$E_n=\left(1-\frac{1}{2^n}\right)1\{Y>n\} \quad \forall n \in \{1, 2, 3, \ldots\}$$
Then we surely have $0\leq E_n\leq 1$ for all $n \in \{1, 2, 3, \ldots\}$, and, since $Y$ is surely finite, we surely have $E_n=0$ for all sufficiently large $n$. It follows that $E_n\rightarrow 0$ surely (and in $\mathcal{L}^2$). Also, for any real-valued function $g$ we surely have $g(E_n)=g(0)$ for all sufficiently large $n$, and so $\frac{1}{T}\sum_{n=1}^T g(E_n)\rightarrow g(0)$ surely.
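For concreteness, here is a minimal NumPy sanity check of this construction (a sketch; the variable names are mine): it draws one sample path of $(E_n)$ and confirms that the path is zero from $n=Y$ onward, so the pathwise Cesàro averages settle.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200
Y = rng.geometric(0.25)              # Y ~ Geometric(1/4) on {1, 2, 3, ...}
n = np.arange(1, N + 1)
E = (1.0 - 0.5 ** n) * (Y > n)       # E_n = (1 - 2^{-n}) 1{Y > n}

print(Y, np.count_nonzero(E))        # the path is zero from n = Y onward
print(np.cumsum(E)[-1] / N)          # pathwise Cesàro average -> 0 as N grows
```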

Now define the continuous and nonnegative function $f:[0,1)\rightarrow \mathbb{R}$ by $$ f(x) = -1 + \frac{1}{1-x}$$ Since $f(0)=0$ we surely have $f(E_n)=0$ for all sufficiently large $n$. However \begin{align} E[f(E_n)] &= E\left[f\left(\left(1-\frac{1}{2^n}\right)1\{Y>n\}\right)\right] \\ &= E\left[f\left(1-\frac{1}{2^n}\right)\Big|Y>n\right](3/4)^n + E[f(0)|Y\leq n]P[Y\leq n] \\ &\overset{(b)}{=} (3/4)^nf\left(1-\frac{1}{2^n}\right)\\ &= (3/4)^n\left(-1 + 2^n\right) = (3/2)^n - (3/4)^n \end{align} where (b) uses $f(0)=0$. Hence $E[f(E_n)]$ is finite for all $n$ but grows geometrically, and so $\frac{1}{T}\sum_{n=1}^T E[f(E_n)]\rightarrow \infty$.
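And a quick numerical check of the two limits (again a sketch with hypothetical names): the closed form $(3/2)^n - (3/4)^n$ makes the divergence of the Cesàro mean of expectations explicit, while the typical pathwise time average of $f(E_n)$ stays small.

```python
import numpy as np

rng = np.random.default_rng(1)

def f(x):
    # f(x) = -1 + 1/(1 - x): nonnegative, continuous on [0, 1), f(0) = 0
    return -1.0 + 1.0 / (1.0 - x)

T = 30
n = np.arange(1, T + 1)

# Closed form: E[f(E_n)] = (3/4)^n (2^n - 1) = (3/2)^n - (3/4)^n
cesaro_of_expectations = np.cumsum(1.5 ** n - 0.75 ** n) / n
print(cesaro_of_expectations[-1])            # grows without bound as T increases

# Pathwise: f(E_n) = 0 once n >= Y, so typical time averages are small
paths = 10_000
Y = rng.geometric(0.25, size=paths)[:, None]   # Y ~ Geometric(1/4), one per path
E = (1.0 - 0.5 ** n) * (Y > n)                 # shape (paths, T)
print(np.median(f(E).sum(axis=1) / T))         # typical path average: near 0
```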