If $A,B$ symmetric positive semidefinite, show tr$(AB) \geq 0$
Suppose $V$ is a finite-dimensional vector space over $\mathbb{R}$ of dimension $n$, and $A,B$ are symmetric positive semidefinite linear mappings from $V$ to $V$. How can I show that, in any orthonormal basis, $\mathrm{tr}(AB) \geq 0$?
I noticed that, since $A$ and $B$ are symmetric, $$\mathrm{tr}(AB) = \sum_{i=1}^n\sum_{j=1}^nA_{ij}B_{ji} = \sum_{i=1}^n\sum_{j=1}^nA_{ij}B_{ij},$$ which is the sum of the entries of the element-wise (Hadamard) product of $A$ and $B$. I don't know if this is helpful.
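This identity is easy to sanity-check numerically; here is a short NumPy sketch (not part of the original question) comparing $\mathrm{tr}(AB)$ with the entry-wise sum of the Hadamard product:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two random symmetric positive semidefinite matrices.
n = 3
M = rng.standard_normal((n, n))
N = rng.standard_normal((n, n))
A = M @ M.T
B = N @ N.T

# For symmetric A and B: tr(AB) = sum of all entries of A * B (Hadamard product).
print(np.isclose(np.trace(A @ B), np.sum(A * B)))  # True
```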
As others have remarked, you might as well suppose that $A$ and $B$ are positive semidefinite matrices. We may write $A = X^{t}X$ and $B = Y^{t}Y$ where $X$ and $Y$ are $n \times n$ real matrices. Then ${\rm tr}(AB) = {\rm tr}(X^{t}XY^{t}Y) = {\rm tr}((YX^{t})(XY^{t}))$ by the cyclic property of the trace. The latter trace has the form ${\rm tr}(UU^{t})$ for the real $n \times n$ matrix $U = YX^{t}$, and such a trace is always non-negative, since ${\rm tr}(UU^{t}) = \sum_{i,j} U_{ij}^{2}$.
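A numerical sketch of this factorization argument (NumPy, purely illustrative): writing $A=X^{t}X$, $B=Y^{t}Y$, and $U=YX^{t}$, the trace of $AB$ equals the sum of squared entries of $U$:

```python
import numpy as np

rng = np.random.default_rng(1)

# Build random symmetric PSD matrices A = X^T X and B = Y^T Y.
n = 5
X = rng.standard_normal((n, n))
Y = rng.standard_normal((n, n))
A = X.T @ X
B = Y.T @ Y

# tr(AB) = tr((Y X^T)(X Y^T)) = tr(U U^T) with U = Y X^T,
# and tr(U U^T) is the sum of squares of the entries of U.
U = Y @ X.T
print(np.isclose(np.trace(A @ B), np.sum(U**2)))  # True
print(np.trace(A @ B) >= 0)                       # True
```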
Since this may be homework, I will only give hints.
1. Without loss of generality you may assume that $V=\mathbb{R}^n$.
2. The trace is independent of the basis you use, so it suffices to show the claim in a basis where $A$ is diagonal.
3. A positive semidefinite matrix has a nonnegative diagonal. Why?

Putting 1–3 together, one needs to show that $\mathrm{tr}(AB)\geq 0$ where $A$ is a nonnegative diagonal matrix and $B$ has a nonnegative diagonal.
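For what it's worth, the route sketched in these hints can be checked numerically (NumPy, illustrative only): diagonalize $A$ by an orthogonal change of basis, then pair its eigenvalues with the diagonal of $B$ expressed in that basis:

```python
import numpy as np

rng = np.random.default_rng(2)

# Random symmetric PSD matrices.
n = 4
M = rng.standard_normal((n, n))
N = rng.standard_normal((n, n))
A = M @ M.T
B = N @ N.T

# Hint 2: change to the eigenbasis of A, where A becomes diagonal.
w, Q = np.linalg.eigh(A)   # A = Q diag(w) Q^T, with w >= 0
B_rot = Q.T @ B @ Q        # B in the same basis, still PSD

# Hint 3: tr(AB) = sum_i w_i * (B_rot)_ii, nonnegative weights
# times a nonnegative diagonal.
print(np.isclose(np.trace(A @ B), np.sum(w * np.diag(B_rot))))  # True
print(np.trace(A @ B) >= 0)                                     # True
```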
Here's another derivation (7 years later):
Let $A,B\succeq0$. Then the eigendecomposition of symmetric $B$ gives $B=\sum_{i=1}^n \lambda_i v_i v_i^T$. Therefore,
$$\begin{align} \operatorname{Tr}[AB]&=\operatorname{Tr}[A\sum_{i=1}^n \lambda_i v_i v_i^T]\\ &=\sum_{i=1}^n \lambda_i \operatorname{Tr}[Av_i v_i^T]\\ &=\sum_{i=1}^n \underbrace{\lambda_i}_{\geq0} \underbrace{v_i^TAv_i}_{\geq0} \\ &\geq 0 \end{align}$$
where the second equality uses linearity of the trace and the third uses its cyclic property, $\operatorname{Tr}[Av_i v_i^T] = \operatorname{Tr}[v_i^TAv_i] = v_i^TAv_i$.
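Here is a small NumPy check of this derivation (illustrative, not part of the proof): each term $\lambda_i\,v_i^TAv_i$ is nonnegative, and the terms sum to $\operatorname{Tr}[AB]$:

```python
import numpy as np

rng = np.random.default_rng(3)

# Random symmetric PSD matrices A, B >= 0.
n = 4
M = rng.standard_normal((n, n))
N = rng.standard_normal((n, n))
A = M @ M.T
B = N @ N.T

lam, V = np.linalg.eigh(B)   # B = sum_i lam[i] * outer(v_i, v_i)

# Each term lam_i * v_i^T A v_i is a product of two nonnegative numbers.
terms = np.array([lam[i] * V[:, i] @ A @ V[:, i] for i in range(n)])
print(np.all(terms >= -1e-12))                       # True
print(np.isclose(terms.sum(), np.trace(A @ B)))      # True
```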
Feel free to ask for any clarifications needed.
Edit: Here's the explanation of the eigendecomposition.
In matrix form, the eigen-equation is $BV=V\Lambda$, where $V$ is a matrix whose columns are the eigenvectors $\{v_i\}$ of $B$, and $\Lambda=\operatorname{diag}(\lambda_1,\dots,\lambda_n)$ is a diagonal matrix with the eigenvalues of $B$ along the diagonal. Because $B$ is symmetric, $V$ can be chosen orthogonal, meaning its columns are orthonormal, so $VV^T=V^TV=I_n$. Right-multiplying by $V^T$ then gives $B=BVV^T=V\Lambda V^T$.
$$\begin{align} \rightarrow B &= \begin{bmatrix} v_1 & \cdots & v_n \end{bmatrix} \begin{bmatrix} \lambda_1 & \cdots & 0\\ \vdots & \ddots & \vdots\\ 0 & \cdots & \lambda_n \end{bmatrix} \begin{bmatrix} v_1^T\\ \vdots\\ v_n^T \end{bmatrix}\\ &= \begin{bmatrix} v_1 & \cdots & v_n \end{bmatrix} \begin{bmatrix} \lambda_1v_1^T\\ \vdots\\ \lambda_nv_n^T \end{bmatrix}\\ &= \sum_{i=1}^n \lambda_i v_i v_i^T \end{align}$$
where this is a sum of outer products of the eigenvectors. Note that I wrote the matrices above as block vectors whose entries are themselves vectors. This shorthand is valid and very convenient, but feel free to write it out and check me!
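If you'd rather not expand the block product by hand, the rank-one expansion is easy to verify numerically (NumPy sketch; it works for any symmetric $B$, not just a positive semidefinite one):

```python
import numpy as np

rng = np.random.default_rng(4)

# Any symmetric matrix has this eigendecomposition.
n = 4
B = rng.standard_normal((n, n))
B = B + B.T

lam, V = np.linalg.eigh(B)

# Rebuild B as a sum of rank-one outer products of its eigenvectors.
B_rebuilt = sum(lam[i] * np.outer(V[:, i], V[:, i]) for i in range(n))
print(np.allclose(B, B_rebuilt))  # True
```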