On the sum of all elements of inverted correlation matrix

Assume I have a correlation matrix $A$, \begin{equation} A_{i,j} = \begin{cases} 1,& \text{if}\ i=j\\ \rho_{i,j},& \text{otherwise} \end{cases} \end{equation} where $0\leq \rho_{i,j} \leq 1$ and $A$ is positive definite.

Now define a correlation matrix $\overline{A}$ as follows: \begin{equation} \overline{A}_{i,j} = \begin{cases} 1,& \text{if}\ i=j\\ \overline{\rho},& \text{otherwise} \end{cases} \end{equation} where $\overline{\rho}=\frac{\sum_{i \neq j} \rho_{i,j}}{n^{2}-n}$ is the average off-diagonal entry of $A$.

I want to prove that the sum of the elements of the inverse matrix $\overline{A}^{-1}$ is at most the sum of the elements of the inverse matrix $A^{-1}$. Any ideas?

I wrote a program that generates random matrices of this sort, and the inequality held for every matrix in many different dimensions, so it can hardly be coincidence. I just don't know how to prove it.
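For reference, a check along those lines can be sketched as follows (a hypothetical construction: random positive-definite correlation matrices with non-negative entries, built as normalised, diagonally loaded Gram matrices):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_corr(n):
    """Random positive-definite correlation matrix with entries in (0, 1)."""
    B = rng.uniform(0.1, 1.0, size=(n, n))
    G = B @ B.T + n * np.eye(n)      # diagonal loading keeps G positive definite
    d = np.sqrt(np.diag(G))
    return G / np.outer(d, d)        # normalise to unit diagonal

for n in (3, 5, 8):
    for _ in range(200):
        A = random_corr(n)
        rho_bar = (A.sum() - n) / (n**2 - n)   # average off-diagonal entry
        A_bar = np.full((n, n), rho_bar)
        np.fill_diagonal(A_bar, 1.0)
        # sum of all entries of M^{-1} is just inv(M).sum()
        assert np.linalg.inv(A_bar).sum() <= np.linalg.inv(A).sum() + 1e-9
print("inequality held in every trial")
```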

Thanks


Solution 1:

This is true because with respect to the partial ordering of positive semidefinite matrices, matrix inverse is convex on the set of all positive definite matrices.

More specifically, if $X,Y\succ0$ and $t\in[0,1]$, then $(1-t)X^{-1}+tY^{-1}\succeq\left[(1-t)X+tY\right]^{-1}$, because positive definite matrices are simultaneously diagonalisable by congruence and the inverse function on positive reals is convex.
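Written out, that argument looks as follows (a sketch: simultaneous diagonalisation by congruence gives an invertible $S$ with $X=SS^T$ and $Y=SDS^T$, $D$ diagonal and positive):

```latex
% X = SS^T, Y = SDS^T, with S invertible and D diagonal, D > 0
\begin{aligned}
(1-t)X^{-1}+tY^{-1}
  &= S^{-T}\left[(1-t)I+tD^{-1}\right]S^{-1}\\
  &\succeq S^{-T}\left[(1-t)I+tD\right]^{-1}S^{-1}
  &&\text{since }(1-t)+\tfrac{t}{d}\ \ge\ \tfrac{1}{(1-t)+td}\text{ for }d>0\\
  &= \left[(1-t)X+tY\right]^{-1}.
\end{aligned}
```

The middle step is exactly the scalar convexity of $x\mapsto 1/x$ applied entrywise to the diagonal matrix.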

Now observe that $\bar{A}=\frac1{n!}\sum_PP^TAP$, where the summation runs over all $n\times n$ permutation matrices $P$: averaging over all permutations leaves the unit diagonal fixed and replaces every off-diagonal entry by the average $\overline{\rho}$. By assumption $A$ is positive definite, hence so is each $P^TAP$, and therefore so is the average $\bar{A}$; in particular $\bar{A}^{-1}$ exists. So, by convexity of the matrix inverse over positive definite matrices, \begin{aligned} \frac1{n!}\sum_PP^TA^{-1}P =\frac1{n!}\sum_P(P^TAP)^{-1} \succeq\left(\frac1{n!}\sum_PP^TAP\right)^{-1} =\bar{A}^{-1}. \end{aligned} Consequently, with $e$ denoting the all-ones vector (so that $Pe=e$ for every permutation matrix $P$), $$ e^TA^{-1}e =\frac1{n!}\sum_Pe^TP^TA^{-1}Pe \ge e^T\bar{A}^{-1}e. $$
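Both the averaging identity and the resulting inequality are easy to confirm numerically for small $n$ (a sketch; the test matrix below is a hypothetical positive-definite correlation matrix built the same way as in the question's experiment):

```python
from itertools import permutations
from math import factorial

import numpy as np

n = 4
rng = np.random.default_rng(1)

# Hypothetical positive-definite correlation matrix: normalised,
# diagonally loaded Gram matrix of random non-negative vectors.
B = rng.uniform(0.1, 1.0, size=(n, n))
G = B @ B.T + n * np.eye(n)
d = np.sqrt(np.diag(G))
A = G / np.outer(d, d)

# Average P^T A P over all n! permutation matrices P.
avg = np.zeros((n, n))
for perm in permutations(range(n)):
    P = np.eye(n)[list(perm)]        # permutation matrix for this ordering
    avg += P.T @ A @ P
avg /= factorial(n)

# The average equals the equicorrelation matrix A-bar.
rho_bar = (A.sum() - n) / (n**2 - n)
A_bar = np.full((n, n), rho_bar)
np.fill_diagonal(A_bar, 1.0)
assert np.allclose(avg, A_bar)

# ...and the claimed inequality on e^T M^{-1} e follows.
e = np.ones(n)
assert e @ np.linalg.inv(A_bar) @ e <= e @ np.linalg.inv(A) @ e + 1e-9
print("identity and inequality verified")
```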