For a convex function, can I say that if the $L^2$ norm of the gradient $\nabla f(x)$ is smaller than the $L^2$ norm of the gradient $\nabla f(y)$, then $f(x) < f(y)$?

I have a question about convex functions. Suppose $f: \mathbb{R}^d \to \mathbb{R}$ is a convex function. Can I claim that $f(x) < f(y)$ if and only if $\lVert \nabla f(x) \rVert_2 < \lVert \nabla f(y) \rVert_2$? It seems obvious, but can someone provide a proof or hints?


It's not true, even in the $d = 1$ case. Consider the function $$f : \Bbb{R} \to \Bbb{R} : x \mapsto 2|x| + x.$$ Then $f$ is convex, as the sum of convex functions, and for $x \neq 0$, $$\nabla f(x) = \begin{cases} -1 & \text{if } x < 0 \\ 3 & \text{if } x > 0.\end{cases}$$

The function is also coercive, meaning that $f(x) \to \infty$ as $|x| \to \infty$; it is continuous, and attains its minimum $f(0) = 0$ at $0$. This means that we can find $x < 0$ making $f(x) = -x$ any positive value we like, and $y > 0$ making $f(y) = 3y$ any positive value we like. We always have $|\nabla f(x)| = 1 < 3 = |\nabla f(y)|$, but we can arrange $f(x) < f(y)$, or $f(x) > f(y)$, or $f(x) = f(y)$; our choice.
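For concreteness, here is a minimal numerical check of the counterexample; the specific pairs $(x, y)$ below are just illustrative choices:

```python
# Numerical check of the counterexample: in every pair below we have
# |f'(x)| = 1 < 3 = |f'(y)|, yet the ordering of f(x) and f(y) goes
# all three ways.

def f(t):
    return 2 * abs(t) + t  # f(t) = -t for t < 0, f(t) = 3t for t > 0

def grad(t):
    return -1.0 if t < 0 else 3.0  # f'(t) for t != 0

pairs = [(-1.0, 2.0),  # f(-1) = 1 < 6 = f(2)
         (-6.0, 1.0),  # f(-6) = 6 > 3 = f(1)
         (-3.0, 1.0)]  # f(-3) = 3 = 3 = f(1)

for x, y in pairs:
    print(f"|f'({x})| = {abs(grad(x))} < {abs(grad(y))} = |f'({y})|, "
          f"but f({x}) = {f(x)} vs f({y}) = {f(y)}")
```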

If you object to $f$ not being differentiable everywhere, you can smooth the function at $0$ while keeping it convex, for instance by replacing the kink on a small interval $[-\varepsilon, \varepsilon]$ with a quadratic arc matching the one-sided slopes. This won't change the problematic behaviour for large $x$ and $y$ (negative and positive).
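A minimal sketch of one such smoothing (the parameter $\varepsilon$ and the quadratic patch are one illustrative choice, not the only option): on $[-\varepsilon, \varepsilon]$, replace $f$ by the parabola $t^2/\varepsilon + t + \varepsilon$, which matches the value and slope of $f$ at $t = \pm\varepsilon$. The derivative then rises continuously from $-1$ to $3$, so the glued function is $C^1$ and convex, and it agrees with $f$ exactly outside $[-\varepsilon, \varepsilon]$:

```python
# C^1 convex smoothing of f(t) = 2|t| + t near 0 (eps is an illustrative
# parameter). On [-eps, eps] we use the quadratic t^2/eps + t + eps, which
# matches f's value and one-sided slopes at t = +/- eps; elsewhere f is
# unchanged, so the counterexample behaviour for large |t| is preserved.

def f_smooth(t, eps=0.5):
    if abs(t) <= eps:
        return t * t / eps + t + eps
    return 2 * abs(t) + t

def grad_smooth(t, eps=0.5):
    if abs(t) <= eps:
        return 2 * t / eps + 1  # rises continuously from -1 to 3
    return -1.0 if t < 0 else 3.0

# Sanity check: the smoothed function agrees with f at and beyond +/- eps.
for t in (-2.0, -0.5, 0.0, 0.5, 2.0):
    print(t, f_smooth(t), grad_smooth(t))
```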