Probability for a random variable to be greater than its mean
Solution 1:
We can adjust your $X_\varepsilon$ example ever so slightly to get a counter-example to your variance suggestion: $$ Z_\varepsilon = \begin{cases} 0 & \text{wp} \ 1 - \varepsilon, \\ 1 / \sqrt \varepsilon & \text{wp} \ \varepsilon. \end{cases} $$ This has $$ \mathbb E(Z_\varepsilon) = \sqrt \varepsilon \quad\text{and}\quad \mathbb V\text{ar}(Z_\varepsilon) = \mathbb E(Z_\varepsilon^2) - \mathbb E(Z_\varepsilon)^2 = 1 - \varepsilon. $$ So, $\mathbb P(Z_\varepsilon > \mathbb E(Z_\varepsilon)) = \varepsilon$ but its variance is approximately $1$. Of course, higher moments diverge as $\varepsilon \to 0$. If you want the $k$-th moment to be bounded in $\varepsilon$, just replace $1/\sqrt\varepsilon$ with $\varepsilon^{-1/k}$.
This has the property that the only way of being larger than the mean is to be enormously larger than the mean. In particular, it does not obey your second condition of having bounded support: for fixed $\varepsilon$ the support is bounded, but it is not bounded uniformly in $\varepsilon$.
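If you want to sanity-check the numbers, here is a small computation (just an illustration; the particular value $\varepsilon = 10^{-4}$ is an arbitrary choice of mine):

```python
# Sanity check of the two-point counterexample Z_eps (illustrative epsilon).
eps = 1e-4
values = [0.0, eps ** -0.5]   # atoms of Z_eps
probs = [1 - eps, eps]        # their probabilities

mean = sum(p * z for p, z in zip(probs, values))
second_moment = sum(p * z ** 2 for p, z in zip(probs, values))
variance = second_moment - mean ** 2
prob_above_mean = sum(p for p, z in zip(probs, values) if z > mean)

print(mean, eps ** 0.5)       # both 0.01:   E(Z_eps) = sqrt(eps)
print(variance, 1 - eps)      # both 0.9999: Var(Z_eps) = 1 - eps
print(prob_above_mean, eps)   # both 1e-4:   P(Z_eps > E(Z_eps)) = eps
```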
Highly related, but not exactly the same, is the Paley–Zygmund inequality: for a non-negative random variable $Z$ and $\theta \in [0,1]$, $$ \mathbb P(Z \ge (1 - \theta) \mathbb E Z) \ge \theta^2 \, \frac{\mathbb E(Z)^2}{\mathbb E(Z^2)}. $$ Obviously, this is useless when $\theta = 0$.
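On the same two-point example, one can check the bound numerically; the value $\theta = 1/2$ below is an arbitrary choice:

```python
# Check the Paley-Zygmund bound P(Z >= (1 - theta) E Z) >= theta^2 E(Z)^2 / E(Z^2)
# on Z_eps from above (eps and theta are arbitrary illustrative choices).
eps, theta = 1e-4, 0.5
values, probs = [0.0, eps ** -0.5], [1 - eps, eps]

mean = sum(p * z for p, z in zip(probs, values))
second_moment = sum(p * z ** 2 for p, z in zip(probs, values))

lhs = sum(p for p, z in zip(probs, values) if z >= (1 - theta) * mean)
rhs = theta ** 2 * mean ** 2 / second_moment

print(lhs, rhs)   # 1e-4 >= 2.5e-5, as the inequality predicts
assert lhs >= rhs
```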
Regarding your candidate inequality $\mathbb V\text{ar}(Z) \le \mathbb P(Z \ge \mathbb E Z)$, which was added after my answer above, I think it can be proved for finitely supported discrete distributions. I expect this can be extended to continuous ones with a density, maybe beyond. Of course, you can rescale so that $Z$ takes values in $[0,1]$. Also, assume that $Z$ is non-constant; otherwise the variance is $0$ and the inequality holds trivially.
Let $Z$ have law $\sum_{i=1}^k p_i \delta_{z_i}$ with $0 \le z_1 \le \cdots \le z_k \le 1$, all $p_i > 0$ and $\sum_i p_i = 1$. Now, let $\mu = \mathbb E(Z)$ and choose $I$ such that $z_{I-1} < \mu \le z_I$. Then, $\mathbb P(Z \ge \mu) = \sum_{i \ge I} p_i$.
Consider the following modification of $Z$, which cannot decrease the variance but does not change the probability of being at least the mean:
define $Z'$ by moving all the $z_i$ with $i < I$ to $0$ and all those with $i \ge I$ to $1$; that is, $Z'$ has law $$ \textstyle (\sum_{i < I} p_i)\, \delta_0 + (\sum_{i \ge I} p_i)\, \delta_1. $$ Let $\mu' = \mathbb E(Z') \in (0,1)$.
Then, the probability of being larger than the mean is unchanged: $$\textstyle \mathbb P(Z' \ge \mu') = \sum_{i \ge I} p_i = \mathbb P(Z \ge \mu).$$ It shouldn't be so hard to check that the variance cannot decrease, as the mass is pushed further apart: $$\mathbb V\text{ar}(Z') \ge \mathbb V\text{ar}(Z).$$
Now, $Z'$ is just a $\{0,1\}$-valued random variable. It has mean $p$ and variance $p(1-p)$, where $p := \sum_{i \ge I} p_i$. Hence, $$ \mathbb V\text{ar}(Z) \le \mathbb V\text{ar}(Z') = p(1-p) \le p = \mathbb P(Z' \ge \mu') = \mathbb P(Z \ge \mu). $$
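To see the whole argument in action, here is a small numerical sketch (the variable names and the random test distributions are mine, not part of the argument): it draws random finitely supported distributions on $[0,1]$, forms the corresponding $\{0,1\}$-valued $Z'$, and checks that the variance does not decrease and that $\mathbb V\text{ar}(Z) \le \mathbb P(Z \ge \mu)$.

```python
# Numerical check of the discrete argument on random finitely supported
# distributions in [0, 1] (illustrative sketch; names are mine).
import random

random.seed(0)
for _ in range(10_000):
    k = random.randint(2, 6)
    z = sorted(random.random() for _ in range(k))     # atoms z_1 <= ... <= z_k in [0, 1]
    w = [random.random() for _ in range(k)]
    p = [wi / sum(w) for wi in w]                     # probabilities p_1, ..., p_k

    mu = sum(pi * zi for pi, zi in zip(p, z))
    var = sum(pi * (zi - mu) ** 2 for pi, zi in zip(p, z))
    prob_ge_mean = sum(pi for pi, zi in zip(p, z) if zi >= mu)

    # Z' pushes the atoms below mu to 0 and those >= mu to 1,
    # so Z' is Bernoulli(q) with q = P(Z >= mu).
    q = prob_ge_mean
    var_prime = q * (1 - q)

    assert var <= var_prime + 1e-12       # the variance cannot decrease
    assert var <= prob_ge_mean + 1e-12    # Var(Z) <= P(Z >= E Z)
```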
One should be able to approximate distributions with a density by such finitely supported discrete random variables and take limits.
This inequality is tight, in the sense that the constant $1$ cannot be improved, as can be seen by taking a $\operatorname{Bern}(p)$ random variable and letting $p \to 0$. This is natural, because such a random variable is unaffected by the above procedure, so the only inequality in the chain which is not an equality is $p(1-p) \le p$; and there does not exist a constant $c < 1$ such that $p(1-p) \le c p$ for all $p \in (0,1)$.
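For a concrete illustration of the last point: for $\operatorname{Bern}(p)$ the ratio of the two sides is exactly $1 - p$, which tends to $1$ as $p \to 0$.

```python
# For Z ~ Bern(p): Var(Z) = p(1 - p) and P(Z >= E Z) = p, so the ratio is 1 - p.
for p in [0.5, 0.1, 0.01, 0.001]:
    variance, prob = p * (1 - p), p
    print(p, variance / prob)   # ratios 0.5, 0.9, 0.99, 0.999 -> 1 as p -> 0
```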