Optimality condition of a convex function that is a.e. differentiable
If $f$ is convex and $f'(x) = 0$, then $x$ is a global minimiser of $f$, hence a. is true.
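For completeness, this follows from the first-order characterisation of convexity: a differentiable point of a convex function satisfies the tangent-line lower bound, so

$$f(y) \;\ge\; f(x) + f'(x)\,(y - x) \;=\; f(x) \quad \text{for all } y,$$

which is exactly the statement that $x$ is a global minimiser.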
In the following, let $x$ be any global minimiser.
For b., if $f'(a)>0$, say, then there is some nearby $a' < a$ such that $f(a') < f(a)$, and since $f(a') \ge f(x)$ we see that $f(a) > f(x)$. The case $f'(a) < 0$ is symmetric.
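Spelling out the first step: the difference quotient converges to $f'(a) > 0$,

$$\lim_{a' \to a} \frac{f(a') - f(a)}{a' - a} \;=\; f'(a) \;>\; 0,$$

so for $a' < a$ close enough to $a$ the quotient is positive while the denominator $a' - a$ is negative, forcing $f(a') < f(a)$.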
However, Daniel's example above shows that there may be (at most) one point $a \in I$ at which $f(a) = f(x)$, and similarly for $J$.
Note that if $f(x') = f(x)$, then $f(y) = f(x)$ for all $y \in [x',x]$ (the set of global minimisers of a convex function is an interval); hence there can be at most one point $a \in I$ for which $f(a) = f(x)$.
As an aside, note that $f'$ is non-decreasing wherever it exists.
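This monotonicity comes from the standard slope inequality for convex functions: for $x_1 < x_2 < x_3$,

$$\frac{f(x_2)-f(x_1)}{x_2-x_1} \;\le\; \frac{f(x_3)-f(x_1)}{x_3-x_1} \;\le\; \frac{f(x_3)-f(x_2)}{x_3-x_2}.$$

Letting $x_2 \downarrow x_1$ in the leftmost quotient and $x_2 \uparrow x_3$ in the rightmost (where the derivatives exist) gives $f'(x_1) \le f'(x_3)$.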
Also, if $f$ is convex and defined on an open set, it is locally Lipschitz, hence a.e. differentiable.
Addendum: The above reasoning doesn't cover the case where $J$ is a singleton not contained in $A$. Say $J = \{x\}$.
Since $f$ is locally Lipschitz, it is absolutely continuous on any bounded closed interval, hence $f(a) = f(x) + \int_x^a f'(t)\,dt$. If $a < x$, then $f'(t) < 0$ for a.e. $t \in [a,x]$, so $f(a) = f(x) - \int_a^x f'(t)\,dt > f(x)$. Similarly for $a > x$. Hence $x$ is a strict minimiser.
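A concrete sanity check (my own example, not from the question): take $f(t) = |t|$ with $x = 0$. Here $f$ is not differentiable at $0$, but $f'(t) = \operatorname{sgn}(t)$ a.e., and the same computation gives

$$f(a) \;=\; f(0) + \int_0^a \operatorname{sgn}(t)\,dt \;=\; |a| \;>\; 0 \;=\; f(0) \quad (a \ne 0),$$

so $0$ is a strict minimiser even though $f'(0)$ does not exist.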