Prove the inequality: $\prod_{j=1}^ka_{jj}\leq\left(\frac{1}{k}\sum_{j=1}^k\lambda_j\right)^k.$

Applying G.M. $\le$ A.M. to $a_{jj}$, $j = 1,\ldots,k$ (these are non-negative since $A$ is positive semidefinite), we have:

$$\prod_{j=1}^k a_{jj} \le \left( \frac{1}{k} \sum_{j=1}^k a_{jj} \right)^{k}$$

Since $A$ is a real symmetric positive semidefinite matrix with non-negative eigenvalues $\lambda_1 \ge \lambda_2 \ge \cdots \ge \lambda_n \ge 0$, we can find an orthogonal matrix $\Omega$ and a diagonal matrix $\Lambda$ such that $A = \Omega^{T} \Lambda \Omega$ and $\Lambda_{ii} = \lambda_i$ for $i = 1,\ldots,n$. In terms of the entries of $\Omega$, we have:

$$\sum_{j=1}^k a_{jj} = \sum_{j=1}^k \sum_{i=1}^n \lambda_i |\Omega_{ij}|^2 =\sum_{i=1}^n\lambda_i \beta_i \,\,\,\text{ where }\,\,\,\beta_i = \sum_{j=1}^k |\Omega_{ij}|^2 $$

Since the rows of $\Omega$ are unit vectors, $\beta_i = \sum_{j=1}^k |\Omega_{ij}|^2 \le \sum_{j=1}^n |\Omega_{ij}|^2 = 1$, so $0 \le \beta_i \le 1$; and since the columns are unit vectors as well, $\sum_{i=1}^n \beta_i = \sum_{j=1}^k \sum_{i=1}^n |\Omega_{ij}|^2 = k$.
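These facts are easy to check numerically. The following sketch (using numpy with a randomly generated positive semidefinite $A$; an illustration, not part of the proof) verifies the properties of the $\beta_i$ and the identity $\sum_{j\le k} a_{jj} = \sum_i \lambda_i \beta_i$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 6, 3

# Random real symmetric positive semidefinite matrix A = B^T B.
B = rng.standard_normal((n, n))
A = B.T @ B

# Eigendecomposition A = Omega^T Lambda Omega with eigenvalues in decreasing order.
vals, V = np.linalg.eigh(A)   # eigh returns ascending eigenvalues
lam = vals[::-1]              # decreasing order
Omega = V[:, ::-1].T          # row i of Omega is the eigenvector for lam[i]

# beta_i = sum over the first k columns of row i of Omega, squared.
beta = (Omega[:, :k] ** 2).sum(axis=1)

assert np.all(beta >= -1e-12) and np.all(beta <= 1 + 1e-12)  # 0 <= beta_i <= 1
assert np.isclose(beta.sum(), k)                              # sum beta_i = k
# Key identity: sum of first k diagonal entries equals sum_i lam_i * beta_i.
assert np.isclose(np.trace(A[:k, :k]), lam @ beta)
```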

Let $x_0 = 0$ and $x_i = \sum_{s=1}^i \beta_s$ for $i = 1,\ldots,n$. Define two functions $\beta(t)$, $\gamma(t)$ on $[0,k)$ by:

$$\begin{align} \beta(t) &= \lambda_i \,\,\,\text{ for }\,\,\, t \in [x_{i-1}, x_i), & i = 1,\ldots, n\\ \gamma(t) &= \lambda_i \,\,\,\text{ for }\,\,\, t \in [i-1, i), & i = 1,\ldots, k \end{align}$$

Since each $\beta_s \le 1$, we have $x_i \le i$ for all $i$; combined with the fact that the sequence $(\lambda_i)$ is non-increasing, this gives $\beta(t) \le \gamma(t)$ on $[0,k)$. From this, we can deduce:

$$\sum_{i=1}^{n} \lambda_i\beta_i = \int_{0}^{k} \beta(t) dt \le \int_{0}^{k} \gamma(t) dt = \sum_{i=1}^k \lambda_i$$

As a result, we can conclude: $$\prod_{j=1}^k a_{jj} \le \left( \frac{1}{k} \sum_{j=1}^k \lambda_j \right)^{k}$$
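The whole chain of inequalities can be sanity-checked numerically. Here is a short sketch (using numpy and a randomly generated real symmetric positive semidefinite matrix; purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 7

# Random real symmetric positive semidefinite matrix.
B = rng.standard_normal((n, n))
A = B.T @ B

lam = np.sort(np.linalg.eigvalsh(A))[::-1]  # eigenvalues, decreasing order
diag = np.diag(A)

# Check prod_{j<=k} a_jj <= ((lam_1 + ... + lam_k)/k)^k for every k.
for k in range(1, n + 1):
    lhs = np.prod(diag[:k])
    rhs = lam[:k].mean() ** k
    assert lhs <= rhs * (1 + 1e-9)
```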

EDIT: Background information.

The statement $\sum_{j=1}^k a_{jj} \le \sum_{j=1}^k \lambda_j$ is actually a corollary of a well known theorem first proved by Schur. Namely,

Let $A$ be an $n\times n$ Hermitian matrix. Let $\text{diag}(A)$ denote the vector whose coordinates are the diagonal entries of $A$, and $\lambda(A)$ a vector whose coordinates are the eigenvalues of $A$ arranged in any order. Then $\text{diag}(A)$ is majorized by $\lambda(A)$.

What this means is that when we sort the components of $\text{diag}(A)$ and $\lambda(A)$ into two $n$-tuples in decreasing order:
$$a^{\downarrow}_1 \ge a^{\downarrow}_2 \ge \cdots \ge a^{\downarrow}_n \,\,\,\text{ and }\,\,\, \lambda^{\downarrow}_1 \ge \lambda^{\downarrow}_2 \ge \cdots \ge \lambda^{\downarrow}_n$$ we have $a^{\downarrow}_1 \le \lambda^{\downarrow}_1$, $\,\,\,a^{\downarrow}_1 + a^{\downarrow}_2 \le \lambda^{\downarrow}_1 + \lambda^{\downarrow}_2$ and in general, $$\sum_{i=1}^k a^{\downarrow}_i \le \sum_{i=1}^k \lambda^{\downarrow}_i \,\,\,\text{ for } 1 \le k < n\,\,\,\text{ and }\,\,\, \sum_{i=1}^n a^{\downarrow}_i = \sum_{i=1}^n \lambda^{\downarrow}_i$$
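Schur's theorem is easy to test numerically. The sketch below (using numpy with a random complex Hermitian matrix; illustrative only, not a proof) checks the partial-sum inequalities and the trace equality:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 6

# Random complex Hermitian matrix (majorization needs only Hermitian, not PSD).
X = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = (X + X.conj().T) / 2

lam = np.sort(np.linalg.eigvalsh(A))[::-1]  # eigenvalues, decreasing
d = np.sort(np.diag(A).real)[::-1]          # diagonal entries, decreasing

# Partial sums of the sorted diagonal never exceed those of the spectrum...
for k in range(1, n):
    assert d[:k].sum() <= lam[:k].sum() + 1e-9
# ...and the full sums agree: both equal the trace of A.
assert np.isclose(d.sum(), lam.sum())
```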


These inequalities hold more generally for Hermitian positive semidefinite $A\in M_n(\mathbb{C})$. My favourite argument uses the min-max (Courant-Fischer) theorem, which says that when the eigenvalues of $A$ are arranged in non-increasing order $\lambda_1\geq \lambda_2\geq\ldots \geq \lambda_n\geq 0$, we have $$ \lambda_j=\max_{\dim F=j}\;\;\min_{x\in F, \|x\|=1} (Ax,x) $$ where the max is taken over all $j$-dimensional subspaces $F$ of $\mathbb{C}^n$.

Denoting by $\{e_1,\ldots,e_n\}$ the canonical basis of $\mathbb{C}^n$, we see that $a_{jj}=(e_j,Ae_j)\geq 0$ for every $j$. By AM-GM applied to $a_{11},\ldots,a_{kk}$, we see that it suffices to prove that $$ \sum_{j=1}^k a_{jj}\leq \sum_{j=1}^k\lambda_j \qquad\forall 1\leq k\leq n. $$ In the case $k=n$, we easily see that this is an equality, as both sides are equal to the trace of $A$. It is more delicate to handle the case $k\leq n-1$.

Now denote by $A_k$ the upper-left $k\times k$ block of $A$, i.e. $A_k=(a_{ij})_{1\leq i,j\leq k}$. Note that it is a Hermitian positive semidefinite matrix in $M_k(\mathbb{C})$, as $(A_kx,x)=(Ax,x)$ for every $x\in \mathbb{C}^k$, identified with the subspace of $\mathbb{C}^n$ spanned by $e_1,\ldots,e_k$. Writing $\mu_1\geq\ldots\geq \mu_k\geq 0$ for the eigenvalues of $A_k$, the min-max theorem applied to $A_k$ yields $$ \mu_j=\max_{\substack{F\subseteq \mathbb{C}^k\\ \dim F=j}}\;\;\min_{x\in F, \|x\|=1} (A_kx,x)=\max_{\substack{F\subseteq \mathbb{C}^k\\ \dim F=j}}\;\;\min_{x\in F, \|x\|=1} (Ax,x). $$ Since the maximum on the right-hand side runs over a subset of the set of all $j$-dimensional subspaces of $\mathbb{C}^n$, it is not greater than the maximum over the latter. Therefore $$ \forall 1\leq j\leq k\qquad \mu_j\leq \lambda_j\quad\Rightarrow\quad \sum_{j=1}^ka_{jj}=\mbox{tr}\,A_k=\sum_{j=1}^k\mu_j\leq \sum_{j=1}^k\lambda_j. $$
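The consequence $\mu_j \le \lambda_j$ used above (eigenvalues of a principal block are dominated by those of the full matrix) can likewise be checked numerically. A numpy sketch with a random Hermitian positive semidefinite matrix (illustrative only):

```python
import numpy as np

rng = np.random.default_rng(3)
n, k = 6, 4

# Random Hermitian positive semidefinite matrix A = X^H X.
X = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = X.conj().T @ X

lam = np.sort(np.linalg.eigvalsh(A))[::-1]          # eigenvalues of A, decreasing
mu = np.sort(np.linalg.eigvalsh(A[:k, :k]))[::-1]   # eigenvalues of A_k, decreasing

# mu_j <= lam_j for every 1 <= j <= k.
assert np.all(mu <= lam[:k] + 1e-9)
# tr(A_k) = a_11 + ... + a_kk = mu_1 + ... + mu_k.
assert np.isclose(np.trace(A[:k, :k]).real, mu.sum())
```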