Determinant of a Certain Block Structured Positive Definite Matrix

Solution 1:

I believe the best lower bound you can put on the eigenvalues of $\Gamma$ is 0.

The eigenvalues of $\Gamma=\left[\begin{array}{cc} I & B \\ B^{*} & I \end{array} \right]$ are $1\pm \sigma$ where $\sigma$ is a singular value of $B$. Each eigenvalue $1\pm\sigma$ occurs with the same multiplicity as the corresponding singular value $\sigma$ of $B$. The eigenvectors are $\left(\begin{array}{c} u \\ \pm v \end{array}\right)$ where $u$ is a left singular vector of $B$, and $v$ is the corresponding right singular vector of $B$.
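For the skeptical reader, here is a quick numerical check of this correspondence (a NumPy sketch; the block size and the random $B$ are arbitrary choices, not part of the problem):

```python
import numpy as np

# Check: the eigenvalues of Gamma = [[I, B], [B*, I]] are 1 +/- sigma,
# where sigma ranges over the singular values of B.
rng = np.random.default_rng(0)
m = 8                                              # arbitrary block size
B = rng.standard_normal((m, m)) / np.sqrt(2 * m)   # any B works for this identity

Gamma = np.block([[np.eye(m), B], [B.T, np.eye(m)]])

eigs = np.sort(np.linalg.eigvalsh(Gamma))
sigma = np.linalg.svd(B, compute_uv=False)
predicted = np.sort(np.concatenate([1 - sigma, 1 + sigma]))

print(np.allclose(eigs, predicted))                # True
```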

$B$ is rather constrained. In the real case, $\hat{B}=\sqrt{d}\,B$ is a $d/2 \times d/2$ matrix whose entries are all $\pm 1$. The eigenvalues of such a matrix can be as large as $\pm d/2$ in magnitude. Since the smallest eigenvalue of $\Gamma$ is $1-\sigma_{\max}(B)=1-\sigma_{\max}(\hat{B})/\sqrt{d}$, we want to construct a family of matrices $\hat{B}$ whose smallest eigenvalue is strictly greater than $-\sqrt{d}$ but as close as possible to that value, so that the smallest eigenvalue of $\Gamma$ is positive but nearly $0$.

I first found an example with $d=16$ where $\Gamma$ is positive semidefinite. (In my experience, computational experiments with other $d$ show it isn't hard to find $\hat{B}$ with minimum eigenvalues very close to $-\sqrt{d}$.)

If you set: $$\hat{B}=\left[\begin{array}{cccccccc} -1 & 1 & 1 & 1 &-1 &-1 & 1 & 1 \\ 1 &-1 & 1 & 1 &-1 &-1 & 1 & 1 \\ 1 & 1 &-1 &-1 & 1 &-1 & 1 & 1 \\ 1 & 1 &-1 &-1 &-1 & 1 &-1 & 1 \\ -1 &-1 & 1 &-1 &-1 &-1 &-1 & 1 \\ -1 &-1 &-1 & 1 &-1 &-1 &-1 &-1 \\ 1 & 1 & 1 &-1 &-1 &-1 & 1 &-1 \\ 1 & 1 & 1 & 1 & 1 &-1 &-1 & 1 \\ \end{array}\right]$$ and $B=\hat{B}/4$, then $\Gamma$ is positive semidefinite. This $\Gamma$ is a Gram matrix. You can recover the underlying basis using a Cholesky factorization $\Gamma=V^{*}V$. The factor $V$ has a special structure, $V=\left[\begin{array}{cc} I & B \\ 0 & M \end{array} \right]$, where $M$ is the Cholesky factor of $I-B^{*}B$. That factor is guaranteed to exist because $I-B^{*}B$ is positive semidefinite.
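Here is a small NumPy sketch that checks this claim numerically: it builds $\hat{B}$ as above, forms $\Gamma$ with $B=\hat{B}/4$, and prints the largest singular value of $B$ and the smallest eigenvalue of $\Gamma$, which should be at most $1$ and nonnegative (numerically, essentially zero) if the claim holds:

```python
import numpy as np

# The explicit example with d = 16: according to the claim, B = Bhat / 4
# makes Gamma positive semidefinite, with smallest eigenvalue essentially 0.
Bhat = np.array([
    [-1,  1,  1,  1, -1, -1,  1,  1],
    [ 1, -1,  1,  1, -1, -1,  1,  1],
    [ 1,  1, -1, -1,  1, -1,  1,  1],
    [ 1,  1, -1, -1, -1,  1, -1,  1],
    [-1, -1,  1, -1, -1, -1, -1,  1],
    [-1, -1, -1,  1, -1, -1, -1, -1],
    [ 1,  1,  1, -1, -1, -1,  1, -1],
    [ 1,  1,  1,  1,  1, -1, -1,  1],
])
B = Bhat / 4.0
Gamma = np.block([[np.eye(8), B], [B.T, np.eye(8)]])

print(np.linalg.svd(B, compute_uv=False).max())   # largest singular value of B
print(np.linalg.eigvalsh(Gamma).min())            # smallest eigenvalue of Gamma
```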

I also constructed a family of $B$ which make $\Gamma$ positive semidefinite (and which can be altered to instead have vanishingly small eigenvalues). Take $n$ to be any multiple of 36. In the first column of $B$, set all $n/2$ entries to $\frac{1}{\sqrt{n}}$. In the second column, set the first $n/6$ entries to $\frac{1}{\sqrt{n}}$, and the remaining $n/3$ entries to $-\frac{1}{\sqrt{n}}$. In the third column, set the first $5n/36$ entries to $\frac{1}{\sqrt{n}}$, the second $n/36$ entries to $-\frac{1}{\sqrt{n}}$, the third $n/36$ entries to $\frac{1}{\sqrt{n}}$, and the remaining $11n/36$ entries to $-\frac{1}{\sqrt{n}}$. Fill out the rest of $B$ however you wish.

If you compute the Cholesky factorization, you'll find the entries (in $M$) below the first three columns are: $$\left[\begin{array}{ccc} \frac{1}{\sqrt{2}} & \frac{\sqrt{2}}{6} & \frac{\sqrt{2}}{6} \\ 0 & \frac{2}{3} & -\frac{2}{3} \\ 0 & 0 & 0 \end{array}\right]$$ The last row is significant: it means we can find a linear combination of these three columns of $V$ whose entries in the $M$ block are all zero. Glued with the columns of $B$ above, this combination lies in the span of the first $n/2$ standard basis vectors, so the columns of $V$ are linearly dependent and $\Gamma$ is singular. Note this can be altered slightly to introduce an $\epsilon$ in the lower-right entry; the entries of $M$ would be altered but not the entries of $B$. This would give vanishingly small eigenvalues to $\Gamma$.
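If you want to verify this numerically, the following NumPy sketch builds just the three prescribed columns for $n=36$ (the smallest admissible size; the remaining columns play no role in this check) and confirms the zero on the diagonal of the Cholesky block above, equivalently that the largest singular value of these three columns is exactly $1$:

```python
import numpy as np

# First three columns of B for n = 36 (entries are +/- 1/sqrt(n)); the
# remaining columns do not affect this check.
n = 36
s = 1.0 / np.sqrt(n)
col1 = np.full(n // 2, s)                                   # all +
col2 = np.concatenate([np.full(n // 6, s),                  # first n/6 are +
                       np.full(n // 3, -s)])                # remaining n/3 are -
col3 = np.concatenate([np.full(5 * n // 36, s),             # 5n/36 are +
                       np.full(n // 36, -s),                # n/36 are -
                       np.full(n // 36, s),                 # n/36 are +
                       np.full(11 * n // 36, -s)])          # 11n/36 are -
B3 = np.column_stack([col1, col2, col3])

# According to the construction, the largest singular value of these three
# columns is 1, so I - B3^T B3 is singular: the zero on the Cholesky diagonal.
print(np.linalg.svd(B3, compute_uv=False).max())            # ~1.0
print(np.linalg.eigvalsh(np.eye(3) - B3.T @ B3).min())      # ~0.0
```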

As a little bit of geometric insight as to what's going on, $B$ is a matrix of inner products. Its largest singular value measures the angle between two subspaces: the span of the columns of $\left[\begin{array}{c}I\\0\end{array}\right]$ and the span of the columns of $\left[\begin{array}{c}B\\M\end{array}\right]$. That is, $\sigma_{\max}=\cos(\theta)$ where $\theta$ is the smallest principal angle between the spaces. The columns of $B$ always lie in the span of the columns of $I$. We can construct the entries of $M$ (which is upper triangular) so that they make $\left[\begin{array}{c}B\\M\end{array}\right]$ orthonormal, but so that there are one or more zeros on the diagonal. (The columns of $M$ are linearly dependent.)
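The following sketch illustrates this picture numerically. It assumes SciPy is available for `scipy.linalg.subspace_angles`; the random $B$ and its scaling are arbitrary choices made so that $I-B^{*}B$ is positive definite:

```python
import numpy as np
from scipy.linalg import subspace_angles

# With M upper triangular and M^T M = I - B^T B, the columns of [[B], [M]]
# are orthonormal, and the largest singular value of B equals the cosine of
# the smallest principal angle between span([[I], [0]]) and span([[B], [M]]).
rng = np.random.default_rng(1)
m = 6
X = rng.standard_normal((m, m))
B = X / (2 * np.linalg.norm(X, 2))         # scale so all singular values < 1

L = np.linalg.cholesky(np.eye(m) - B.T @ B)
M = L.T                                    # upper triangular, M^T M = I - B^T B

Q1 = np.vstack([np.eye(m), np.zeros((m, m))])
Q2 = np.vstack([B, M])                     # orthonormal columns by construction

theta_min = subspace_angles(Q1, Q2).min()
print(np.cos(theta_min), np.linalg.svd(B, compute_uv=False).max())  # agree
```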

Questions

Are there other constraints you can put on $B$? For example, if $\hat{B}$ is a Hadamard matrix of order $d/2$, then all of its singular values are $\sqrt{d/2}$ (for a symmetric, trace-zero Hadamard matrix, the eigenvalues are $\pm \sqrt{d/2}$, each with multiplicity $d/4$). The eigenvalues of $\Gamma$ would then be $1\pm 1/\sqrt{2}$.
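A quick check of this case, using `scipy.linalg.hadamard` (assuming SciPy is available), with $d=16$ and the order-8 Sylvester Hadamard matrix:

```python
import numpy as np
from scipy.linalg import hadamard

# Hadamard example with d = 16: Bhat is an 8x8 Hadamard matrix, B = Bhat/sqrt(d).
# All singular values of B are then 1/sqrt(2), so the eigenvalues of Gamma
# are 1 +/- 1/sqrt(2).
d = 16
Bhat = hadamard(d // 2)
B = Bhat / np.sqrt(d)

Gamma = np.block([[np.eye(d // 2), B], [B.T, np.eye(d // 2)]])
print(np.unique(np.round(np.linalg.eigvalsh(Gamma), 10)))
# two distinct eigenvalues: 1 - 1/sqrt(2) and 1 + 1/sqrt(2)
```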

EDIT:

In your third comment, you're describing the Sylvester construction of Hadamard matrices of order $2^n$, right? If so, why not use a Hadamard matrix, and avoid questions like this about bounding the spectrum? If your concern is constructing Hadamard matrices of an order that isn't a power of 2, you should know there are other constructions besides the Sylvester construction. That said, whether a Hadamard matrix exists for every order divisible by 4 (the Hadamard conjecture) is still an open research problem.

What you ask in your first comment sounds like a much more open-ended research question. Could you post a new question with some more details about the problem you're actually trying to solve? It would help to understand where you're coming from, and what you really want to do.

For instance, what restrictions do you face in constructing $B$? Or is $B$ simply given to you from somewhere else? If there are no restrictions, why not use a Hadamard matrix to construct $B$? Why the $\pm 1$ restriction on the entries (or unit size restriction in the complex case)? What restrictions, if any, are there on the basis that's generating the Gram matrix?

What size or range of sizes of $B$ are you interested in? Are you also interested in asymptotics? Are you using these for computing, or are you trying to answer an analytical question?

More generally, where is this problem coming from?

EDIT2: I've edited my answer above to correct an error, and to add an infinite family of examples whose eigenvalues are 0 or can be made as close to 0 as desired.

Solution 2:

Using one of the block matrix determinant formulas, we find that $$ \det(\Gamma) = \det(I - BB^*) $$ So, let $s_1,s_2,\dots,s_{d/2}$ denote the singular values of $B$ in decreasing order. Since the eigenvalues of $BB^*$ are the $s_i^2$, we have $$ \det(\Gamma) = \prod_{i=1}^{d/2} (1-s_i^2) $$ Now, we note that $B$ is a $d/2$ by $d/2$ matrix whose entries have magnitude $1/\sqrt{d}$. Using the Frobenius norm, we have the upper bound $$ s_1 \leq \|B\|_F = \sqrt{\sum_{i,j =1}^{d/2} |B_{ij}|^2} = \sqrt{d/4} = \frac{\sqrt{d}}{2} $$ (We could similarly use any of the Schatten $p$-norms, or any other unitarily invariant norm.)
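A numerical sanity check of both the determinant identity and the Frobenius bound (a NumPy sketch; the random sign pattern and $d=16$ are arbitrary choices):

```python
import numpy as np

# Check det(Gamma) = det(I - B B*) = prod(1 - s_i^2) and s_1 <= ||B||_F
# for a random sign pattern with d = 16.
rng = np.random.default_rng(2)
d = 16
B = rng.choice([-1.0, 1.0], size=(d // 2, d // 2)) / np.sqrt(d)

Gamma = np.block([[np.eye(d // 2), B], [B.T, np.eye(d // 2)]])
s = np.linalg.svd(B, compute_uv=False)

print(np.linalg.det(Gamma), np.linalg.det(np.eye(d // 2) - B @ B.T),
      np.prod(1 - s**2))                         # all three agree
print(s.max(), np.linalg.norm(B, 'fro'))         # s_1 <= ||B||_F = sqrt(d)/2
```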

Thus, since each factor satisfies $1-s_i^2 = (1-s_i)(1+s_i) \geq 1-s_i \geq 1-\frac{\sqrt{d}}{2}$, whenever $\frac{\sqrt{d}}{2} \leq 1$ (i.e. $d \leq 4$) we have $$ \det(\Gamma) = \prod_{i=1}^{d/2} (1-s_i^2) \geq \prod_{i=1}^{d/2} \left(1-\frac {\sqrt d}{2}\right) = \left(1-\frac {\sqrt d}{2}\right)^{d/2} $$ For larger $d$ the factor $1-\frac{\sqrt{d}}{2}$ is negative, so this particular bound carries no information and a sharper estimate of $s_1$ is needed.


In fact, I believe we get a similar bound (if not the same bound) using the Gershgorin circle theorem.
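For what it's worth, here is a sketch of that check: every row of $\Gamma$ has diagonal $1$ and off-diagonal absolute row sum $\frac{d}{2}\cdot\frac{1}{\sqrt{d}}=\frac{\sqrt{d}}{2}$, so the Gershgorin discs confine the eigenvalues to $[1-\frac{\sqrt{d}}{2},\,1+\frac{\sqrt{d}}{2}]$ (the random sign pattern below is an arbitrary choice):

```python
import numpy as np

# Gershgorin check: every row of Gamma has diagonal 1 and off-diagonal
# absolute row sum (d/2) * (1/sqrt(d)) = sqrt(d)/2.
rng = np.random.default_rng(3)
d = 16
B = rng.choice([-1.0, 1.0], size=(d // 2, d // 2)) / np.sqrt(d)
Gamma = np.block([[np.eye(d // 2), B], [B.T, np.eye(d // 2)]])

radii = np.abs(Gamma).sum(axis=1) - np.abs(np.diag(Gamma))
eigs = np.linalg.eigvalsh(Gamma)
print(radii.max())                                   # sqrt(d)/2 = 2.0
print(eigs.min() >= 1 - radii.max() - 1e-12,
      eigs.max() <= 1 + radii.max() + 1e-12)         # True True
```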