convexity of matrix "soft-max" (log trace of matrix exponential)

$ \DeclareMathOperator{\tr}{tr}$ You can get some information using the concepts of Schur-convexity and majorization:

Let $X$ be an $n \times n$ real symmetric matrix with spectral decomposition $ X = U \Lambda U^T$, where $U$ is orthogonal and $\Lambda$ is diagonal with the eigenvalues $\lambda_1, \dots, \lambda_n$ of $X$ on its diagonal. Since the trace of the matrix exponential $e^{\lambda X}$ is the sum of the exponentials of the eigenvalues of $\lambda X$, your function is $$ g_\lambda(X)= \frac{1}{\lambda} \log \tr e^{\lambda X} = \frac{1}{\lambda} \log\sum_{k=1}^n e^{\lambda \lambda_k}. $$
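As a quick numerical illustration that $g_\lambda$ depends on $X$ only through its eigenvalues, here is a sketch for the $2 \times 2$ case, where the eigenvalues have a closed form (the helper names `eigvals_sym2` and `g` are mine, not standard):

```python
from math import exp, log, sqrt

def eigvals_sym2(a, b, c):
    """Eigenvalues of the symmetric matrix [[a, b], [b, c]], descending."""
    m = (a + c) / 2.0                          # mean of the eigenvalues
    r = sqrt(((a - c) / 2.0) ** 2 + b * b)     # half the eigenvalue gap
    return (m + r, m - r)

def g(lam, eigs):
    """g_lambda(X) = (1/lambda) * log sum_k exp(lambda * lambda_k)."""
    return log(sum(exp(lam * e) for e in eigs)) / lam

eigs = eigvals_sym2(1.0, 0.5, -2.0)
print(g(2.0, eigs))
```

Note that $g_\lambda(X) \ge \lambda_{\max}(X)$ for $\lambda > 0$, with $g_\lambda(X) \to \lambda_{\max}(X)$ as $\lambda \to \infty$, which is why the function is called a soft-max.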

Now, $g_\lambda$ is a Schur-convex function of the eigenvalue vector (the proof is easy; the method given in the Wikipedia article on Schur-convexity works). If $X$ and $Y$ are two real symmetric matrices as above, with eigenvalues $\lambda_i$ and $\omega_i$ respectively (both sorted in decreasing order), then we have the majorization result that the eigenvalue vector of $\mu X + (1-\mu) Y$ (for $0 \le \mu \le 1$) is majorized by the vector $\mu \lambda + (1-\mu) \omega$. Schur-convexity then gives $$ g_\lambda(\mu X + (1-\mu) Y) \le \frac{1}{\lambda} \log \sum_{k=1}^n e^{\lambda(\mu \lambda_k + (1-\mu) \omega_k)}, $$ which might be of some help in further analysis.
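A numerical spot-check of this bound for a pair of $2 \times 2$ symmetric matrices (a sketch only: matrices are stored as their entries $(a, b, c)$, and the helper names are mine):

```python
from math import exp, log, sqrt

def eigvals_sym2(a, b, c):
    """Eigenvalues of [[a, b], [b, c]], sorted in decreasing order."""
    m, r = (a + c) / 2.0, sqrt(((a - c) / 2.0) ** 2 + b * b)
    return (m + r, m - r)

def g(lam, eigs):
    return log(sum(exp(lam * e) for e in eigs)) / lam

lam, mu = 1.5, 0.3
X = (1.0, 0.7, -1.0)    # entries (a, b, c) of a symmetric 2x2 matrix
Y = (-0.5, 0.2, 2.0)
Z = tuple(mu * x + (1 - mu) * y for x, y in zip(X, Y))  # mu*X + (1-mu)*Y

lhs = g(lam, eigvals_sym2(*Z))
# right-hand side: combine the two sorted eigenvalue vectors entrywise
rhs = log(sum(exp(lam * (mu * lx + (1 - mu) * ly))
              for lx, ly in zip(eigvals_sym2(*X), eigvals_sym2(*Y)))) / lam
print(lhs <= rhs + 1e-12)
```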

EDIT: The following paper by A. S. Lewis seems to give an answer: http://people.orie.cornell.edu/aslewis/publications/96-convex.pdf

Your function is convex. The paper concerns spectral matrix functions (of Hermitian or real symmetric argument), i.e., functions that depend only on the eigenvalues of the matrix argument; $g_\lambda$ is spectral in this sense. The result proved there is that the convex spectral functions are exactly the symmetric convex functions of the eigenvalue vector. Now $g_\lambda$ is clearly symmetric in the eigenvalues, and you proved yourself that it is convex as a function of them (log-sum-exp). That is all that is needed (after reading the paper above).
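The convexity conclusion can be spot-checked numerically in the $2 \times 2$ case, again representing a symmetric matrix by its entries $(a, b, c)$ so that entrywise convex combinations agree with matrix convex combinations (a sketch; helper names are mine):

```python
from math import exp, log, sqrt

def eigvals_sym2(a, b, c):
    """Eigenvalues of [[a, b], [b, c]], sorted in decreasing order."""
    m, r = (a + c) / 2.0, sqrt(((a - c) / 2.0) ** 2 + b * b)
    return (m + r, m - r)

def g(lam, mat):
    """g_lambda of the symmetric matrix with entries mat = (a, b, c)."""
    return log(sum(exp(lam * e) for e in eigvals_sym2(*mat))) / lam

lam = 2.0
X, Y = (1.0, 0.7, -1.0), (-0.5, 0.2, 2.0)
# convexity: g(mu*X + (1-mu)*Y) <= mu*g(X) + (1-mu)*g(Y) on a grid of mu
ok = all(
    g(lam, tuple(mu * x + (1 - mu) * y for x, y in zip(X, Y)))
    <= mu * g(lam, X) + (1 - mu) * g(lam, Y) + 1e-12
    for mu in (i / 10.0 for i in range(11))
)
print(ok)
```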