Maximize a function involving binary entropy
I have the following function: $$ f(\gamma)=\log_{2}\left(1+2^{\frac{h\left(p_{0}\right)-h\left(p_{1}\right)}{p_{0}-p_{1}}}\right)+\frac{\left(1-p_{1}\right) h\left(p_{0}\right)-\left(1-p_{0}\right) h\left(p_{1}\right)}{p_{1}-p_{0}}$$ where $h(p) = -p\log_2 p - (1 - p)\log_2(1 - p)$ is the binary entropy function, and $$p_0=1-e^{-\gamma^2/a^2}, \quad p_1=1-e^{-\gamma^2/(a^2+b^2)}.$$
Question: For given values of $a$ and $b$, what value of $\gamma$ maximizes $f(\gamma)$? I am trying to come up with an analytical expression for this optimal value of $\gamma$.
Can someone help me find the optimal $\gamma$ analytically? I am OK with good approximations as well.
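For concreteness, the definitions above transcribe directly into code. Here is a minimal Python sketch (the names `h` and `f` and the clipping guard are mine) that can be used to evaluate $f(\gamma)$ numerically for any candidate $\gamma$:

```python
import numpy as np

def h(p):
    """Binary entropy in bits; a small clip enforces h(0) = h(1) = 0 numerically."""
    p = np.clip(p, 1e-15, 1 - 1e-15)
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def f(gamma, a, b):
    """Direct transcription of f(gamma) from the definitions above."""
    p0 = 1.0 - np.exp(-gamma**2 / a**2)
    p1 = 1.0 - np.exp(-gamma**2 / (a**2 + b**2))
    first = np.log2(1.0 + 2.0 ** ((h(p0) - h(p1)) / (p0 - p1)))
    second = ((1 - p1) * h(p0) - (1 - p0) * h(p1)) / (p1 - p0)
    return first + second
```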
Starting from @River Li's last comment, solving $f'(x)=0$ for a given value of $c$ is not difficult from a numerical point of view, since $f'(x)$ is a smooth function (more or less resembling a hyperbola).
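The reduced form $f(x)$ and the parameter $c$ from that comment are not reproduced here, so purely as an illustration of the approach, here is a rough Newton sketch applied to the stationarity condition of the original $f(\gamma)$ (it reuses the `f` helper from the sketch in the question; taking $f'$ and $f''$ by central differences is my simplification, not necessarily what was done here):

```python
def newton_max(a, b, g0, step=1e-5, tol=1e-8, max_iter=50):
    """Newton iteration on f'(gamma) = 0; f' and f'' are central differences."""
    # assumes f(gamma, a, b) from the sketch in the question above
    def d1(g):                                             # ~ f'(gamma)
        return (f(g + step, a, b) - f(g - step, a, b)) / (2 * step)

    g = g0
    for _ in range(max_iter):
        d2 = (d1(g + step) - d1(g - step)) / (2 * step)    # ~ f''(gamma)
        delta = d1(g) / d2
        g -= delta
        if abs(delta) < tol:
            break
    return g

# Example call (arbitrary parameters and starting guess, purely illustrative):
# gamma_star = newton_max(a=1.0, b=2.0, g0=0.5)
```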
Some results obtained using Newton's method: $$\left( \begin{array}{cc} c & x \\ 0.00 & 0.000000 \\ 0.01 & 0.005313 \\ 0.02 & 0.009586 \\ 0.03 & 0.013397 \\ 0.04 & 0.016907 \\ 0.05 & 0.020195 \\ 0.10 & 0.034588 \\ 0.15 & 0.047012 \\ 0.20 & 0.058343 \\ 0.25 & 0.068978 \\ 0.30 & 0.079138 \\ 0.35 & 0.088957 \\ 0.40 & 0.098522 \\ 0.45 & 0.107890 \\ 0.50 & 0.117101 \\ 0.55 & 0.126179 \\ 0.60 & 0.135141 \\ 0.65 & 0.143996 \\ 0.70 & 0.152751 \\ 0.75 & 0.161408 \\ 0.80 & 0.169966 \\ 0.85 & 0.178426 \\ 0.90 & 0.186784 \\ 0.95 & 0.195039 \\ 0.96 & 0.196677 \\ 0.97 & 0.198311 \\ 0.98 & 0.199941 \\ 0.99 & 0.201567 \end{array} \right)$$ This looks like a power law, and an empirical curve fit $$x=\alpha \,c^{\beta}$$ gives ($R^2=0.999981$) $$\begin{array}{cccc} & \text{Estimate} & \text{Standard Error} & \text{Confidence Interval} \\ \alpha & 0.202615 & 0.000210 & \{0.202183,0.203046\} \\ \beta & 0.781326 & 0.002224 & \{0.776755,0.785897\} \\ \end{array}$$ The maximum absolute error is $\sim 0.0011$ and the average absolute error is $\sim 0.0005$.
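Such a fit is easy to reproduce. A small Python sketch along these lines, using `scipy.optimize.curve_fit` on a subset of the rows tabulated above (the subset and the starting values are my choice, so the numbers may differ slightly from the full-table fit quoted above):

```python
import numpy as np
from scipy.optimize import curve_fit

# A few (c, x) rows from the table above (not the full data set).
c = np.array([0.01, 0.05, 0.10, 0.20, 0.30, 0.40,
              0.50, 0.60, 0.70, 0.80, 0.90, 0.99])
x = np.array([0.005313, 0.020195, 0.034588, 0.058343, 0.079138, 0.098522,
              0.117101, 0.135141, 0.152751, 0.169966, 0.186784, 0.201567])

def power_law(c, alpha, beta):
    return alpha * c**beta

# Fit x = alpha * c^beta; expect values near alpha ~ 0.2026, beta ~ 0.7813.
(alpha, beta), _ = curve_fit(power_law, c, x, p0=(0.2, 0.8))
resid = x - power_law(c, alpha, beta)
print(alpha, beta, np.max(np.abs(resid)), np.mean(np.abs(resid)))
```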