Proof of nonnegativity of KL divergence using Jensen's inequality
Solution 1:
This follows from a rather trivial generalization of Jensen's inequality:
Let $f,g:\mathbb{R} \to \mathbb{R}$ with $f(\cdot)$ convex. Then $E[f(g(X))] \ge f(E[g(X)])$.
The proof is simple: apply Jensen's inequality to the random variable $Y=g(X)$. Notice that no convexity condition (indeed, no condition at all) is required for the function $g$. But also notice that it is only the (convex) function $f$ that "goes outside the expectation" in the inequality.
In your case, take $f(x) = \log(x)$ and $g(x) = q(x)/p(x)$. Since $\log$ is concave rather than convex, the inequality reverses: $E[\log(g(X))] \le \log(E[g(X)])$. (Further: don't let the fact that $q$ and $p$ in $g(x) = q(x)/p(x)$ are densities confuse you; that does not matter at all.)
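Spelling out the resulting chain (a short sketch, writing $D_{\mathrm{KL}}(p\,\|\,q)$ for the KL divergence and assuming $p(x) > 0$ wherever $q(x) > 0$):

$$-D_{\mathrm{KL}}(p\,\|\,q) = E_p\!\left[\log\frac{q(X)}{p(X)}\right] \le \log E_p\!\left[\frac{q(X)}{p(X)}\right] = \log \int p(x)\,\frac{q(x)}{p(x)}\,dx = \log \int q(x)\,dx = \log 1 = 0,$$

so $D_{\mathrm{KL}}(p\,\|\,q) \ge 0$.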
Solution 2:
A little reminder: there seems to be a minor fault in your proof. Since $\log(x)$ is concave, Jensen's inequality gives $\mathbb{E}[f(X)] \leq f(\mathbb{E}[X])$, not the other way around.
You can instead take $f(x) = -\log(x)$, which is convex, so that $\mathbb{E}[f(X)] \geq f(\mathbb{E}[X])$ and your argument still holds.
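For a quick numerical sanity check (a minimal sketch using NumPy; the distributions `p` and `q` below are arbitrary example values, not from the original question):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two arbitrary discrete distributions on the same 5-point support
# (hypothetical example values, used only as a sanity check).
p = rng.random(5)
p /= p.sum()
q = rng.random(5)
q /= q.sum()

# KL(p || q) = sum_x p(x) * log(p(x) / q(x))
kl = np.sum(p * np.log(p / q))

print(f"KL(p||q) = {kl:.6f}")  # nonnegative, and 0 only when p == q
assert kl >= 0.0
```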