Example of two dependent random variables that satisfy $E[f(X)f(Y)]=Ef(X)Ef(Y)$ for every $f$
Here is a counterexample. Let $V$ be the set $\lbrace 1,2,3 \rbrace$. Consider random variables $X$ and $Y$ with values in $V$, whose joint distribution is defined by the following matrix:
$$ P=\left( \begin{matrix} \frac{1}{10} & 0 & \frac{1}{5} \\ \frac{1}{5} & \frac{1}{10} & 0 \\ \frac{1}{30} & \frac{7}{30} & \frac{2}{15} \end{matrix} \right)= \left( \begin{matrix} \frac{3}{30} & 0 & \frac{6}{30} \\ \frac{6}{30} & \frac{3}{30} & 0 \\ \frac{1}{30} & \frac{7}{30} & \frac{4}{30} \end{matrix}\right) $$
Thus, for example, $P(X=1,Y=2)=0$ while $P(X=1)P(Y=2)=(\frac{1}{10} + \frac{1}{5})(\frac{1}{10} + \frac{7}{30}) >0$. So $X$ and $Y$ are not independent.
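As a sanity check, here is a small numerical sketch of these computations (assuming numpy; the rows of the matrix index the values of $X$ and the columns the values of $Y$, as above):

```python
import numpy as np

# Joint distribution of (X, Y): rows are X = 1, 2, 3, columns are Y = 1, 2, 3.
P = np.array([[1/10, 0,    1/5 ],
              [1/5,  1/10, 0   ],
              [1/30, 7/30, 2/15]])

pX = P.sum(axis=1)   # marginal of X: (3/10, 3/10, 2/5)
pY = P.sum(axis=0)   # marginal of Y: (1/3, 1/3, 1/3)

print(P[0, 1])                           # P(X=1, Y=2) = 0
print(pX[0] * pY[1])                     # P(X=1) P(Y=2) = 0.1 > 0
print(np.allclose(P, np.outer(pX, pY)))  # False: X and Y are dependent
```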
Let $f$ be an arbitrary function defined on $V$ (the point being that no restriction at all is placed on $f$); put $x=f(1)$, $y=f(2)$, $z=f(3)$. Then
$$ \begin{array}{rcl} {\mathbf E}(f(X)) &=& \frac{3(x+y)+4z}{10} \\ {\mathbf E}(f(Y)) &=& \frac{x+y+z}{3} \\ {\mathbf E}(f(X)){\mathbf E}(f(Y)) &=& \frac{3(x+y)^2+7(x+y)z+4z^2}{30} \\ {\mathbf E}(f(X)f(Y)) &=& \frac{3x^2+6xy+3y^2+7xz+7yz+4z^2}{30} \\ \end{array} $$
The last two expressions are equal, QED.
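The same identity can also be checked symbolically, for instance with sympy (a minimal sketch; $x$, $y$, $z$ stand for the values $f(1)$, $f(2)$, $f(3)$ of an arbitrary $f$):

```python
import sympy as sp

x, y, z = sp.symbols('x y z')
f = sp.Matrix([x, y, z])                                      # values f(1), f(2), f(3)
P = sp.Matrix([[3, 0, 6], [6, 3, 0], [1, 7, 4]]) * sp.Rational(1, 30)

E_fX = (P * sp.ones(3, 1)).dot(f)        # E[f(X)] = sum_i P(X=i) f(i)
E_fY = (P.T * sp.ones(3, 1)).dot(f)      # E[f(Y)] = sum_j P(Y=j) f(j)
E_fXfY = (f.T * P * f)[0]                # E[f(X)f(Y)] = sum_{i,j} f(i) P_ij f(j)

print(sp.expand(E_fXfY - E_fX * E_fY))   # prints 0, for all x, y, z
```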
Here is a continuous counterexample. It has discrete analogues, some of which are given at the end. We start with a general remark, which may help explain where such counterexamples come from. (For a short version of the answer, go to the picture below and ponder it; it really says everything.)
Assume that $(X,Y)$ has PDF $p$ and that there exists a measurable function $q$ such that, for every $(x,y)$,
$$ p(x,y)+p(y,x)=2q(x)q(y). $$
One can assume without loss of generality that $q$ is a PDF. Then, for every function $f$,
$$ E[f(X)f(Y)]=\iint f(x)f(y)p(x,y)\,\mathrm dx\,\mathrm dy=\iint f(x)f(y)p(y,x)\,\mathrm dx\,\mathrm dy, $$
where the second equality follows from exchanging the names of the variables, since $f(x)f(y)$ is symmetric. Averaging these two integrals and using the identity above,
$$ E[f(X)f(Y)]=\iint f(x)f(y)q(x)q(y)\,\mathrm dx\,\mathrm dy=\left(\int f(x)q(x)\,\mathrm dx\right)^2. $$
Thus, if, furthermore, $q$ is the common marginal density of $X$ and $Y$, that is, if
$$ q(x)=\int p(x,y)\,\mathrm dy=\int p(y,x)\,\mathrm dy, $$
then indeed
$$ E[f(X)f(Y)]=E[f(X)]E[f(Y)]. $$

Now, a specific counterexample: assume that $(X,Y)$ is uniformly distributed on the set
$$ D=\{(x,y)\in[0,1]^2\,;\,\{y-x\}\leqslant\tfrac12\}, $$
where $\{\ \}$ denotes the fractional part. In words, $D$ (the black part of the square $[0,1]^2$ in the picture below) is the union of the part of the square between the lines $y=x$ and $y=x+\frac12$ and the part of the square below the line $y=x-\frac12$.
[Figure: the square $[0,1]^2$, with the set $D$ shown in black and its complement in white.]
Then $(Y,X)$ is uniformly distributed on the image of $D$ under the symmetry $(x,y)\mapsto(y,x)$ (the white part of the square $[0,1]^2$ in the picture above), which happens to be the complement of $D$ in $[0,1]^2$ (up to some lines, which have zero Lebesgue measure). Since $D$ has area $\frac12$, the PDF of $(X,Y)$ is $p=2\cdot\mathbf 1_D$, hence $p(x,y)+p(y,x)=2$ almost everywhere on the square. Thus the first identity above holds with $q=\mathbf 1_{[0,1]}$, that is:
If $(X,Y)$ is uniformly distributed on $D$, then $X$ and $Y$ are both uniformly distributed on $[0,1]$, $X$ and $Y$ are not independent, and, for every function $f$, $E[f(X)f(Y)]=E[f(X)]E[f(Y)]$.
Note that $(X,Y)$ can be constructed as follows: let $U$ and $V$ be i.i.d. uniform on $[0,1]$; set $(X,Y)=(U,V)$ if $(U,V)$ is in $D$, and $(X,Y)=(V,U)$ otherwise.
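Here is a small Monte Carlo sketch of this construction (assuming numpy; the test function and the dependence event below are arbitrary choices made for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10**6

# Construction from the text: draw (U, V) i.i.d. uniform on [0, 1] and keep the
# pair if it lies in D = {(x, y) : frac(y - x) <= 1/2}, otherwise swap it.
U, V = rng.random(n), rng.random(n)
in_D = (V - U) % 1.0 <= 0.5
X = np.where(in_D, U, V)
Y = np.where(in_D, V, U)

# Dependence: {X < 1/4, Y > 3/4} is impossible on D, yet both marginals are uniform.
print(np.mean((X < 0.25) & (Y > 0.75)))   # 0.0, while P[X < 1/4] P[Y > 3/4] = 1/16

# Nevertheless E[f(X) f(Y)] agrees with E[f(X)] E[f(Y)] for a common f.
f = lambda t: np.cos(7 * t) + t ** 2       # an arbitrary test function
print(np.mean(f(X) * f(Y)), np.mean(f(X)) * np.mean(f(Y)))  # approximately equal
```

The two printed expectations agree up to Monte Carlo error; by the computation above, both are close to $\left(\int_0^1 f\right)^2$.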
An analogue of this in the discrete setting is to consider $(X,Y)$ with joint distribution on the set $\{a,b,c\}^2$ described by the matrix
$$ \frac19\begin{pmatrix}1&2&0\\0&1&2\\2&0&1\end{pmatrix}. $$
Then the random set $\{X,Y\}$ is distributed like $\{U,V\}$, where $U$ and $V$ are i.i.d. uniform on $\{a,b,c\}$. For example, considering $S=\{(a,b),(b,a)\}$, one sees that
$$ P[(X,Y)\in S]=\tfrac29=P[(U,V)\in S], $$
since $[(X,Y)\in S]=[(X,Y)=(a,b)]$, while
$$ P[(U,V)\in S]=P[(U,V)=(a,b)]+P[(U,V)=(b,a)]=\tfrac19+\tfrac19. $$
Since $f(X)f(Y)$ depends only on the unordered pair $\{X,Y\}$, and since $X$, $Y$, $U$ and $V$ all have the same (uniform) distribution, it follows that
$$ E[f(X)f(Y)]=E[f(U)f(V)]=E[f(U)]E[f(V)]=E[f(X)]E[f(Y)]. $$
This example can be extended to any sample space of odd size. A more general family of distributions on $\{a,b,c\}^2$ is given, for every $|t|\leqslant1$, by
$$ \frac19\begin{pmatrix}1&1+t&1-t\\1-t&1&1+t\\1+t&1-t&1\end{pmatrix}, $$
the matrix above being the case $t=1$.
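Finally, a short numerical sketch of this one-parameter family, with random test functions $f$ represented by their three values on $\{a,b,c\}$ (again assuming numpy):

```python
import numpy as np

rng = np.random.default_rng(1)

for t in np.linspace(-1, 1, 9):
    # Joint distribution of (X, Y) on {a, b, c}^2 for the parameter t.
    P = np.array([[1, 1 + t, 1 - t],
                  [1 - t, 1, 1 + t],
                  [1 + t, 1 - t, 1]]) / 9
    pX, pY = P.sum(axis=1), P.sum(axis=0)   # both marginals uniform: (1/3, 1/3, 1/3)
    for _ in range(5):
        f = rng.normal(size=3)              # values f(a), f(b), f(c)
        lhs = f @ P @ f                     # E[f(X) f(Y)]
        rhs = (pX @ f) * (pY @ f)           # E[f(X)] E[f(Y)]
        assert np.isclose(lhs, rhs)
print("E[f(X)f(Y)] = E[f(X)]E[f(Y)] for every t and every f tested")
```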