Given three vectors, how do I find the orthonormal basis closest to them?

I know the Gram-Schmidt process, but that is not what I am looking for. Given three vectors $\{v_1,v_2,v_3\}$ in $\mathbb{R}^3$, I want to find three vectors $\{w_1,w_2,w_3\} \subset \mathbb{R}^3$ such that $$(w_i,w_j) = \delta_{ij}, \quad \forall i,j\in\{1,2,3\},$$ $$D=\sum_{j=1}^3 \|v_j-w_j\|^2\ \mbox{is minimal.}$$ The drawback of the GS process is that it assumes a preferred order on the 3-tuple and only changes the length of the first vector; the changes forced on the second and third vectors are then too large, and the distance $D$ is suboptimal. I want to deform all three in a "fair" manner.
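To make the baseline concrete, here is a minimal NumPy sketch of classical Gram-Schmidt together with the distance $D$ from the question; the vectors are stored as matrix columns, and the function names are mine, purely for illustration:

```python
import numpy as np

def gram_schmidt(V):
    """Orthonormalize the columns of V in order (classical Gram-Schmidt)."""
    W = np.zeros_like(V, dtype=float)
    for j in range(V.shape[1]):
        w = V[:, j].astype(float)
        for i in range(j):
            w -= (W[:, i] @ V[:, j]) * W[:, i]  # subtract projection onto earlier basis vectors
        W[:, j] = w / np.linalg.norm(w)
    return W

def distance(V, W):
    """D = sum_j ||v_j - w_j||^2, with the vectors stored as columns of V and W."""
    return float(np.sum((V - W) ** 2))
```

Note how the first column is only rescaled, exactly the order-dependence complained about above.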


Solution 1:

If the norm is the Euclidean $2$-norm, this is the orthogonal Procrustes problem, and it has been studied to death. Put simply: since $\|x\|^2=x^Tx=\operatorname{trace}(xx^T)$, and since $\operatorname{trace}(WW^T)=3$ for any orthogonal $W$, you can rewrite $D$ as a constant plus $-2\operatorname{trace}(VW^T)$. Writing the singular value decomposition $V=U\Sigma Q^T$, the trace is maximized by $W=UQ^T$, so the answer is trivial (but a little care has to be taken if $W$ is required to be not just an orthogonal matrix but a rotation matrix).
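A minimal NumPy sketch of that recipe, assuming the $v_j$ are the columns of $V$ (the name `nearest_orthonormal` and the `force_rotation` flag are just illustrative):

```python
import numpy as np

def nearest_orthonormal(V, force_rotation=False):
    """Orthogonal W minimizing ||V - W||_F, via the SVD V = U @ diag(s) @ Qt."""
    U, s, Qt = np.linalg.svd(V)
    W = U @ Qt                        # replace all singular values with ones
    if force_rotation and np.linalg.det(W) < 0:
        U[:, -1] *= -1                # flip the column paired with the smallest singular value
        W = U @ Qt                    # now det(W) = +1, the "little care" for rotations
    return W
```

The sign flip in the `force_rotation` branch is the standard fix when a proper rotation is required: flipping the direction paired with the smallest singular value costs the least in the trace being maximized.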

Solution 2:

I had the same question. After googling the keywords "nearest orthogonal matrix", I found this post and two related references:

[1] https://en.wikipedia.org/wiki/Orthogonal_matrix#Nearest_orthogonal_matrix

[2] https://en.wikipedia.org/wiki/Singular_value_decomposition#Applications_of_the_SVD

Quoting the "Nearest orthogonal matrix" section of [2]:

It is possible to use the SVD of a square matrix $A$ to determine the orthogonal matrix $O$ closest to $A$. The closeness of fit is measured by the Frobenius norm of $O - A$. The solution is the product $UV^*$. This intuitively makes sense because an orthogonal matrix would have the decomposition $UIV^*$, where $I$ is the identity matrix, so that if $A = U\Sigma V^*$ then the product $O = UV^*$ amounts to replacing the singular values with ones.

Here $D=\sum_{j=1}^3 \|v_j-w_j\|^2$ is exactly the squared Frobenius norm $\|V-W\|_F^2$, where the $v_j$ and $w_j$ are the columns of $V$ and $W$.
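As a quick sanity check, one can reuse the two sketches above on an arbitrary made-up $V$; since the Gram-Schmidt output is itself orthogonal and $UV^*$ is the global minimizer over orthogonal matrices, the SVD distance can never be larger:

```python
V = np.array([[1.0, 0.9, 0.0],
              [0.1, 1.0, 0.2],
              [0.0, 0.1, 1.1]])

W_gs, W_svd = gram_schmidt(V), nearest_orthonormal(V)
print(distance(V, W_gs), distance(V, W_svd))  # the SVD value is never larger
```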

Hope this helps ;)