$f:\mathbb{R} \to \mathbb{R}$ such that $f(x+y^2+f(y)) = f(x-y^2-f(y))$ for all $x,y\in\mathbb{R}$
I will assume $f$ is continuous.
Suppose $f(y)+y^2$ is nonconstant. Since $f$ is continuous, so is $y\mapsto f(y)+y^2$, and by the intermediate value theorem its image $I:=\{f(y)+y^2:y\in\mathbb R\}$ contains some open interval $(a,b)$. For any $z\in (a,b)\subseteq I$ and any $x\in\mathbb R$, the functional equation gives $f(x+z)=f(x-z)$; replacing $x$ by $x+z$ shows that $2z$ is a period of $f$. Hence every real number in $(2a,2b)$ is a period of $f$. Since the difference of two periods is again a period, $f$ has arbitrarily small positive periods, and a continuous function with arbitrarily small periods is constant. Therefore $f$ must be constant.
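For concreteness, here is a short verification of that last step, writing $p$ and $q$ for two arbitrary periods chosen from $(2a,2b)$ (these letters are just names for this sketch):
$$
f\bigl(x+(p-q)\bigr)=f\bigl(x+(p-q)+q\bigr)=f(x+p)=f(x),
$$
so $p-q$ is again a period of $f$. Taking $p>q$ arbitrarily close to one another yields arbitrarily small positive periods, which for a continuous $f$ forces constancy.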