How to find the closest positive definite matrix to a non-symmetric matrix
I have a given matrix $A$ and I want to find the matrix $B$ which is closest to $A$ in the Frobenius norm and is positive definite. $B$ does not need to be symmetric.
I found a lot of solutions for the case where the input matrix $A$ is symmetric. Are there any for a non-symmetric matrix $A$? Is it possible to rewrite the problem as a convex minimization problem?
Solution 1:
A real, square matrix $B$ is positive definite iff $v^TBv> 0$ for all $v\neq 0$. But $$v^TBv = \tfrac{1}{2}(v^TBv+v^TB^Tv) = \tfrac{1}{2}v^T(B+B^T)v.$$ It follows then that $B$ is positive definite iff $B+B^T$ is positive definite. Therefore, your model becomes $$\begin{array}{ll} \text{minimize} & \|A-B\|_F \\ \text{subject to} & B+B^T \succ 0 \end{array}$$ This remains a convex optimization problem. For example, in CVX the model is
n = size(A,1);
cvx_begin sdp
    variable B(n,n)                % B is not constrained to be symmetric
    minimize(norm(A - B, 'fro'))   % Frobenius-norm distance to A
    subject to
        B + B' >= 0                % in sdp mode: B + B' is positive semidefinite
cvx_end
(Disclaimer: I am the author of CVX. You do not need to use it to solve this problem, however; any SDP solver can handle it.)
Note that the CVX model relaxes the strict inequality, requiring only that $B$ be positive semidefinite. That relaxation will be necessary with any numerical solver you are likely to employ here. Of course, an interior-point method will give you a sequence of strictly positive definite iterates that converge to an optimum, but that optimum may itself be merely positive semidefinite.
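Assuming the model above has just been run (so $B$ holds the computed optimum), a quick sanity check illustrates this: the smallest eigenvalue of $B+B^T$ will be $\approx 0$ whenever the symmetric part of $A$ is not already psd, because the optimum then lies on the boundary of the semidefinite cone.
min(eig(B + B'))   % ~0 when (A+A')/2 is not psd: the constraint is active
% If a strictly positive definite B is required, one (heuristic) fix is to
% add a small multiple of the identity afterwards, e.g. B = B + 1e-8*eye(n);
% this trades a tiny increase in ||A-B||_F for strict definiteness.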
Solution 2:
I think this is a direct way to compute the closest psd matrix without using numerical optimization; a MATLAB sketch of both steps follows the list below.
1. Find the closest symmetric psd matrix (call it $S$) to $A$ as follows (see the proof of Theorem 2.1 in Higham's 1988 paper):
(i) Compute the symmetric part of $A$: $C=(A+A^T)/2$
(ii) Compute a spectral decomposition $C=UDU^T$, where $D$ is diagonal
(iii) Replace the negative entries in $D$ with zero to get diagonal matrix $D_+$
(iv) Compute $S=UD_+U^T$
2. Add an anti-symmetric matrix $Q$ to $S$ that gets it as close as possible to $A$:
(i) Stack up a generic anti-symmetric matrix $Q$ into a vector $\text{vec}(Q)$ and rearrange it to the form $Px$, where $P$ is a known basis matrix and $x$ is a vector containing the upper-triangular elements of $Q$
(ii) Compute $Q$ from $\text{vec}(Q)=P(P^TP)^{-1}P^T\text{vec}(A-S)$
(iii) The desired closest psd matrix is $B=S+Q$.
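To make the recipe concrete, here is a minimal MATLAB sketch of both steps (variable names are mine). It uses the fact that $P(P^TP)^{-1}P^T$ in step 2 is the orthogonal projector onto the anti-symmetric matrices, which under the Frobenius inner product maps any $M$ to its anti-symmetric part $(M-M^T)/2$; since $S$ is symmetric, the correction reduces to $(A-A^T)/2$.
A = randn(5);          % any square input matrix
% Step 1: closest symmetric psd matrix S (Higham 1988)
C = (A + A')/2;        % (i) symmetric part of A
[U, D] = eig(C);       % (ii) spectral decomposition C = U*D*U'
S = U*max(D, 0)*U';    % (iii)+(iv) zero the negative eigenvalues and rebuild
% Step 2: best anti-symmetric correction Q
Q = (A - A')/2;        % anti-symmetric part of A - S (S drops out)
B = S + Q;             % the desired matrix: B + B' = 2S is psd
min(eig(B + B'))       % sanity check: >= 0 up to roundoff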
Solution 3:
Suppose $C$ is a non-positive-definite correlation matrix with spectral decomposition $$C=Q\Lambda Q^*=Q(\Lambda_+-\Lambda_-)Q^*,$$ where $\Lambda$ is the diagonal matrix of eigenvalues. Replace all negative eigenvalues with zero, i.e. keep $Q\Lambda_+Q^*$, and then use the symmetric, non-negative definite matrix $\rho^2\,Q\Lambda_+Q^*$ with a suitable value of $\rho$. Two choices of $\rho$ are $$\rho_1=\operatorname{tr}(\Lambda)/\operatorname{tr}(\Lambda_+) \qquad\text{and}\qquad \rho_2=\sqrt{\operatorname{tr}(\Lambda)/\operatorname{tr}(\Lambda_+)}.$$ A user-defined $\rho$ is also allowed. For a positive definite matrix, $\Lambda=\Lambda_+$ and $\rho=1$. Another way is to add $cI$ to your correlation matrix, where $c$ is a constant and $I$ is the identity matrix.
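Here is a minimal MATLAB sketch of this recipe, following the notation above (the example matrix is mine, chosen to have one negative eigenvalue):
C = [1 0.9 0.2; 0.9 1 0.9; 0.2 0.9 1];  % an indefinite "correlation" matrix
[Q, L] = eig(C);                % spectral decomposition C = Q*L*Q'
Lplus = max(L, 0);              % replace the negative eigenvalues with zero
rho = trace(L)/trace(Lplus);    % rho_1; use sqrt(trace(L)/trace(Lplus)) for rho_2
Capprox = rho^2 * (Q*Lplus*Q'); % symmetric, non-negative definite
% Alternative mentioned above: C + c*eye(size(C)) for a suitable constant c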
Chan, Grace; Wood, Andrew T.A., An algorithm for simulating stationary Gaussian random fields, J. R. Stat. Soc., Ser. C 46, No.1, 171-181 (1997). ZBL0913.65142.