> …$L^2(\mathbb{R})$ sequence such that $\sum_{n=1}^{\infty}\int_{\mathbb{R}}f_n(x)g(x)\,d\mu(x)=0$
Pretty sure it's false - maybe you should check with the guys who wrote the exam.
It's going to be a counterexample in $L^2([0,1])$, with $g=1$.
Say $(I_n)_{n=1}^\infty$ is a sequence of pairwise disjoint intervals in $[0,1]$ with $|I_n|=2^{-n}$ (their total length is $1/2$, so they fit). Define $$F_1=\chi_{I_1}$$and $$F_n=2^{n-1}\chi_{I_n}-2^{n-2}\chi_{I_{n-1}}\quad(n>1).$$
Then $\sum F_n=0$ almost everywhere: the partial sums telescope, $\sum_{n=1}^N F_n=2^{N-1}\chi_{I_N}$, and any fixed $x$ lies in at most one $I_N$. But $\int F_1=1/2$ and $\int F_n=0$ for $n>1$, so $\sum\int F_n=1/2\ne 0$.
(Check: for $n>1$, $\int F_n=2^{n-1}|I_n|-2^{n-2}|I_{n-1}|=\frac12-\frac12=0$. This is just one of the standard examples where $\int\sum\ne\sum\int$.)
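The telescoping and the integral values can be double-checked numerically. A small sketch, with my own encoding of each $F_n$ as its coefficient vector on $\chi_{I_1},\dots,\chi_{I_N}$ (the helper names `F` and `integral` are assumptions, not anything from the problem):

```python
from fractions import Fraction

N = 12  # how many F_n to check

def F(n):
    """Coefficients of F_n on chi_{I_1}, ..., chi_{I_N} (hypothetical encoding)."""
    c = [Fraction(0)] * (N + 1)  # c[k] = coefficient on I_k; c[0] unused
    if n == 1:
        c[1] = Fraction(1)
    else:
        c[n] = Fraction(2) ** (n - 1)
        c[n - 1] = -Fraction(2) ** (n - 2)
    return c

def integral(c):
    """Integrate a step function with these coefficients, using |I_k| = 2^{-k}."""
    return sum(c[k] * Fraction(1, 2 ** k) for k in range(1, N + 1))

# Partial sums telescope: sum_{n<=N} F_n = 2^{N-1} chi_{I_N}.
total = [sum(F(n)[k] for n in range(1, N + 1)) for k in range(N + 1)]
assert total[N] == Fraction(2) ** (N - 1)
assert all(total[k] == 0 for k in range(1, N))

# int F_1 = 1/2 and int F_n = 0 for n > 1, so sum_n int F_n = 1/2.
assert integral(F(1)) == Fraction(1, 2)
assert all(integral(F(n)) == 0 for n in range(2, N + 1))
```

Exact rationals (`fractions.Fraction`) are used so the telescoping cancels identically rather than up to floating-point error.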
So that's a counterexample, except that $\sum||F_n||_2^2=\infty$ (indeed $||F_n||_2^2=2^{2(n-1)}|I_n|+2^{2(n-2)}|I_{n-1}|=2^{n-2}+2^{n-3}$ for $n>1$, which grows geometrically). We fix that:
Say $F\in L^2$ and $n\ge 1$. Let $f_j=F/n$ for $1\le j\le n$. Then $$\sum_{j=1}^n||f_j||_2^2=n\cdot\frac{||F||_2^2}{n^2}=\frac1n||F||_2^2.$$So: let $(f_j)$ be the sequence consisting of $n_1$ copies of $F_1/n_1$, followed by $n_2$ copies of $F_2/n_2$, etc. Each block of $n_k$ copies sums exactly to $F_k$, so $\sum f_j=0$ a.e. and $\sum\int f_j=\sum\int F_n=1/2$ are inherited from the sequence $(F_n)$, while $\sum_j||f_j||_2^2=\sum_k\frac1{n_k}||F_k||_2^2<\infty$ provided $n_k\to\infty$ fast enough.
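The bookkeeping for the repeated-copies trick can be sanity-checked the same way. The choice $n_k=4^k$ below is just one assumption that works, since $||F_k||_2^2$ grows like $2^k$ and any faster growth of $n_k$ gives convergence:

```python
from fractions import Fraction

def F_norm_sq(k):
    # ||F_1||_2^2 = |I_1| = 1/2; for k > 1,
    # ||F_k||_2^2 = 2^{2(k-1)} |I_k| + 2^{2(k-2)} |I_{k-1}| = 2^{k-2} + 2^{k-3}.
    if k == 1:
        return Fraction(1, 2)
    return Fraction(2) ** (k - 2) + Fraction(2) ** (k - 3)

# With n_k = 4^k, the k-th block of n_k copies of F_k / n_k contributes
# ||F_k||_2^2 / n_k to the sum of squared norms; these terms decay like
# (3/8) * 2^{-k}, so the series converges (its partial sums stay below 1).
norm_sum = sum(F_norm_sq(k) / Fraction(4) ** k for k in range(1, 200))
assert norm_sum < 1

# Meanwhile each block still sums exactly to F_k, so sum_j int f_j
# = sum_k int F_k = 1/2, while sum_j f_j = sum_k F_k = 0 a.e.
```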