Algorithm to find an orthogonal basis (orthogonal to a given vector)
Solution 1:
By the way, most likely a much simpler and more efficient way to compute your matrix $B$ here is to use a single Householder reflection. Consider $w=v\pm\|v\|_2e_1$ (the sign in "$\pm$" is usually chosen to match the sign of the first component of $v$, for numerical reasons). Then set $Q=I-2ww^T/w^Tw$. Since $Qv=\mp\|v\|_2e_1$ (depending on the choice of the sign in $w$), we have $v=\mp\|v\|_2Qe_1$ (multiply by $Q^T=Q$). This means that $v$ is a scalar multiple of the first column (row) of the orthogonal matrix $Q$. Let $Q=[q_1,\ldots,q_K]$. It follows that $q_i^Tv=0$ for $i=2,\ldots,K$. Hence $B=[q_2,\ldots,q_K]$ satisfies $B^Tv=0$ and $B^TB=I$.
In contrast to the $O(K^3)$ complexity of the Gram–Schmidt approach (which is of course generally applicable to compute an orthonormal basis for any given set of vectors), this way you get your $B$ using $O(K^2)$ operations if you need the full matrix explicitly computed, or $O(K)$ if it is enough to have it implicitly represented by $w$.
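A minimal numpy sketch of the Householder construction above (the function name `basis_orthogonal_to` is my own; the sign choice follows the "$\pm$" convention mentioned):

```python
import numpy as np

def basis_orthogonal_to(v):
    """Return B of shape (K, K-1) with B.T @ v = 0 and B.T @ B = I,
    built from a single Householder reflection Q = I - 2 w w^T / w^T w."""
    v = np.asarray(v, dtype=float)
    K = v.size
    w = v.copy()
    # w = v + sign(v[0]) * ||v|| * e_1  (sign chosen to avoid cancellation)
    w[0] += np.copysign(np.linalg.norm(v), v[0])
    Q = np.eye(K) - 2.0 * np.outer(w, w) / (w @ w)
    # first column of Q is parallel to v, so the rest spans its complement
    return Q[:, 1:]
```

Forming $Q$ explicitly as above costs $O(K^2)$; for the $O(K)$ implicit variant one would store only $w$ and apply $x \mapsto x - 2\,(w^Tx/w^Tw)\,w$ as needed.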
Solution 2:
Run Gram–Schmidt orthogonalization on the vectors $v,e_1,...,e_K$ (discarding the one reduction that comes out zero, or near zero if numerics are an issue), to get $v_1,...,v_K$. Then let $B= \begin{bmatrix} v_2 \cdots v_K \end{bmatrix}$.
Details:
By $e_k$ I mean the vector of zeroes with a one in the $k$th position. Note that $e_1,...,e_K$ forms a basis for $\mathbb{R}^K$.
Let $u_1 = v$ and $u_{k+1} = e_k$ for $k=1,...,K$. Note that $u_1,...,u_{K+1}$ is a spanning set but is not a basis.
Then the algorithm (this is theoretical; in a practical situation the comparison in Step 3 would be replaced by something like $\|w_i\| < \epsilon$) looks like:
1. $v_1 = {u_1 \over \|u_1\|}$, $i=2$
2. $w_i = u_i - \sum_{j=1}^{i-1} (v_j^T u_i)\, v_j$
3. If $w_i \neq 0$ then $v_i = { w_i \over \|w_i\|}$, else $v_i = 0$
4. $i \leftarrow i+1$; if $i \le K+1$, go to Step 2
This is basically the Gram–Schmidt algorithm, except that exactly one of the $v_i$s will be zero.
By construction, $\operatorname{sp} \{ u_1,...,u_i \} = \operatorname{sp} \{ v_1,...,v_i \}$, hence the final set will span $\mathbb{R}^K$ (since $u_2,...,u_{K+1}$ is a basis). Consequently exactly one of the $v_i$s will be zero.
Also by construction, $v_j^T v_i = 0$ for all $j<i$, and $\|v_i\| = 1$ for all $i$ except one. Let $v'_i, i=1,...,K$ be the non-zero $v_i$s.
Now let $B = \begin{bmatrix} v'_2 \cdots v'_K \end{bmatrix}$ and note that $B^T v'_1 = 0$, and since $v = \|v\| v'_1$, we are finished.