An upper bound on the binary entropy
Given $I=(0,1)$, for any $x\in I$ let $g(x)=-x\log x$ and $f(x)=g(x)+g(1-x)$.
Obviously $f\in C^{\infty}(I)$, $\,f(x)=f(1-x)$, $\,f>0$ and $$ \lim_{x\to 0^+}f(x)=\lim_{x\to 1^-}f(x) = 0. $$ We have $f'(x)=\log(1-x)-\log(x)$, hence:
$$\begin{eqnarray*} \frac{d}{dx}\frac{f(x)^2}{x(1-x)}&=&f(x)\cdot\frac{-x\log x+(1-x)\log(1-x)}{x^2(1-x)^2}\\&=&\frac{g(x)^2-g(1-x)^2}{x^2(1-x)^2}.\tag{1}\end{eqnarray*}$$ Since $g(x)>g(1-x)$ over $J=\left(0,\frac{1}{2}\right)$ (the proof of this fact comes later), the function $\frac{f(x)^2}{x(1-x)}$ is increasing over $J$, hence:
$$ f(x)\leq 2\cdot f\left(\frac{1}{2}\right)\cdot\sqrt{x(1-x)}=\color{blue}{\left(2\log 2\right)}\sqrt{x(1-x)} \tag{2}$$
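As a quick numerical sanity check of $(2)$ (a minimal Python sketch, not part of the argument itself; the grid size is an arbitrary choice):

```python
import math

def f(x):
    # f(x) = -x ln x - (1 - x) ln(1 - x), the binary entropy in nats
    return -x * math.log(x) - (1 - x) * math.log(1 - x)

# Inequality (2): f(x) <= (2 ln 2) sqrt(x(1-x)) on (0,1),
# with equality at x = 1/2. Record the worst violation on a grid.
n = 10_000
worst = max(f(i / n) - 2 * math.log(2) * math.sqrt((i / n) * (1 - i / n))
            for i in range(1, n))
assert worst <= 0.0
```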
follows on $J$, and by the symmetry $f(x)=f(1-x)$ on all of $I$. Back to the missing part: obviously $g(x)-g(1-x)$ vanishes at $0,\frac{1}{2},1$. In order to prove that these are the only zeros (so that $h>0$ on $J$), it is enough to check that $h(x)=g(x)-g(1-x)$ is concave over $J$. Pretty easy:
$$ \frac{d^2}{dx^2}h(x) = \frac{2x-1}{x(1-x)},\tag{3} $$ which is negative on $J$. The proof is finished by noticing that $f(p)=\log(2)\cdot H_{\mathrm b}(p)$ when $H_{\mathrm b}$ is measured in bits, so $(2)$ becomes $H_{\mathrm b}(p)\leq 2\sqrt{p(1-p)}$.
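The sign claims about $g(x)-g(1-x)$ and $(3)$ on $J$ can be spot-checked numerically (a small Python sketch using only the definitions above; the grid is my own choice):

```python
import math

def g(x):
    # g(x) = -x ln x, as defined in the post
    return -x * math.log(x)

# h(x) = g(x) - g(1-x) should be strictly positive on J = (0, 1/2):
# it vanishes at the endpoints and is concave there by (3).
assert all(g(i / 1000) > g(1 - i / 1000) for i in range(1, 500))

# And (3): h''(x) = (2x - 1)/(x(1-x)) is negative on J.
assert all((2 * (i / 1000) - 1) / ((i / 1000) * (1 - i / 1000)) < 0
           for i in range(1, 500))
```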
A very tight approximation of the binary entropy function is given by:
$$ H_{\mathrm{b}}(p) \approx \left(4p(1-p)\right)^{\frac{3}{4}}.\tag{4}$$
It is neither an upper bound nor a lower bound, but the absolute value of the difference between the RHS and the LHS is always less than $8\cdot 10^{-3}$.
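A brute-force check of this error bound (a Python sketch; the grid size and the slightly slack tolerance in the assertion are my own choices):

```python
import math

def H2(p):
    # binary entropy in bits
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Largest gap between H2(p) and (4p(1-p))^(3/4) on a fine grid;
# the gap peaks for p close to 0.01, at roughly 8e-3.
n = 100_000
max_gap = max(abs(H2(i / n) - (4 * (i / n) * (1 - i / n)) ** 0.75)
              for i in range(1, n))
assert 5e-3 < max_gap < 8.2e-3
```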
You may use the inequality $\ln(x)\le \sqrt{x-1}$, valid for any $x\ge 1$. More precisely, note that \begin{equation} \begin{aligned} H(p)&=p\ln\frac{1}{p} +(1-p)\ln\frac{1}{1-p} \\ &\le p\sqrt{\frac{1}{p}-1} +(1-p)\sqrt{\frac{1}{1-p}-1}\\ &= 2\sqrt{p(1-p)}. \end{aligned} \end{equation} Here the logarithm is the natural logarithm; this is the same shape of bound as $(2)$, with the weaker constant $2$ in place of $2\log 2$.
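Both ingredients, $\ln(x)\le\sqrt{x-1}$ and the resulting bound $H(p)\le 2\sqrt{p(1-p)}$, are easy to spot-check (Python sketch; the grids are arbitrary choices):

```python
import math

# ln(x) <= sqrt(x - 1) for x >= 1 (equality at x = 1), x up to 101
assert all(math.log(1 + t / 100) <= math.sqrt(t / 100)
           for t in range(0, 10_001))

def H(p):
    # binary entropy in nats
    return -p * math.log(p) - (1 - p) * math.log(1 - p)

# The resulting bound H(p) <= 2 sqrt(p(1-p)) on (0, 1)
assert all(H(i / 1000) <= 2 * math.sqrt((i / 1000) * (1 - i / 1000))
           for i in range(1, 1000))
```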