Finding vectors orthonormal to a given vector set and the Gram-Schmidt process
Here is a likely very simple problem that I am confused about: Let $\{v_k\}$ be a set of vectors ($k=1, ..., n$). I would like to find a set of vectors $\{q_k\}$ such that
$$\langle q_i| v_j \rangle = \delta_{ij}$$
where $\langle .| . \rangle$ is the inner product (assume standard inner product for simplicity). We do not place any additional restrictions on the properties of $\{q_k\}$.
This question has (in modified form) already been asked on the network: Given a set of non orthogonal functions, find another set of functions that are orthogonal to the first set. The answer there links to the Gram-Schmidt process as the solution. I think I understand the latter, but I do not understand how it solves the above problem.
Specifically, I do not understand how the functions $\{u_k\}$ obtained from the Gram-Schmidt process (notation consistent with the Wikipedia article) correspond to the $\{q_k\}$, since the $\{u_k\}$ do not fulfill the required property. This can be seen straightforwardly: since $$u_1=v_1,$$ we have (assuming the $\{v_k\}$ are already normalized) $$\langle u_1| v_1 \rangle = 1$$ but $$\langle u_1| v_2 \rangle = \langle v_1| v_2 \rangle \neq 0 .$$
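To make this concrete, here is a quick numerical check (a minimal numpy sketch with a hypothetical pair of vectors) that the Gram-Schmidt output indeed violates the required property:

```python
import numpy as np

# Hypothetical example: two normalized but non-orthogonal vectors
v1 = np.array([1.0, 0.0])
v2 = np.array([1.0, 1.0]) / np.sqrt(2)

# Gram-Schmidt: u1 = v1, u2 = normalization of v2 - <v2, u1> u1
u1 = v1
w2 = v2 - (v2 @ u1) * u1
u2 = w2 / np.linalg.norm(w2)

print(u1 @ v1)  # 1.0, as required
print(u1 @ v2)  # ~0.707 != 0, so <u_1|v_2> is not delta_12
```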
Also from a conceptual perspective, the two problems look rather different to me. Gram-Schmidt generates an orthonormal basis that spans the same subspace (vectors whose matrix of mutual inner products $\langle u_i|u_j\rangle$ is the identity), while what I am looking for are vectors whose inner products with the original vectors form the identity matrix. In general, the $\{q_k\}$ themselves are likely not orthonormal.
I am probably missing something really simple (physicist here, please be gentle...). Any help would be appreciated. It is not clear to me under what conditions the required set of vectors can be constructed, so assume linear independence or other properties of $\{v_k\}$ where necessary.
Also, it is acceptable if the $\{q_k\}$ lie outside the span of $\{v_k\}$ (or if the vector space and inner product have to be suitably extended for the $\{q_k\}$ to exist). E.g. if the $\{v_k\}$ are functions $\{v_k(r)\}$ and the inner product is the $L^2$ inner product, the functions $\{q_k\}$ do not have to be linear combinations of the $\{v_k\}$. In this sense, we are essentially looking for the vectors that invert the matrix whose rows are the original vectors.
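To make that last sentence concrete: in the square, invertible case this is literally a matrix inverse. A minimal numpy sketch (hypothetical $2\times 2$ example):

```python
import numpy as np

# Hypothetical vectors v_k stored as the rows of V
V = np.array([[1.0, 0.0],
              [1.0, 1.0]])

# If the q_k are the rows of Q, then <q_i, v_j> = delta_ij means Q V^T = I,
# i.e. Q = (V^T)^{-1}, whose rows are the columns of V^{-1}.
Q = np.linalg.inv(V.T)
print(Q @ V.T)  # identity matrix
```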
Solution 1:
Let $q_i = \sum_{j=1}^n \beta_{ij}v_j$. We want it to satisfy
$$\langle q_i, v_j\rangle= \delta_{ij} $$
$$\langle \sum_{k=1}^n \beta_{ik}v_k, v_j\rangle= \delta_{ij} $$
$$\sum_{k=1}^n \beta_{ik}\langle v_k, v_j\rangle= \delta_{ij} $$
This is a linear system of equations.
That is, if we define the Gram matrix $A$ whose $(i,j)$-th entry is $\langle v_i, v_j\rangle$ and write $B = (\beta_{ij})$, the system reads $BA = I$. If $A$ is invertible, then $\beta_{ij}$ is the $(i,j)$-th entry of $A^{-1}$.
Knowing the $\beta_{ij}$, we can now construct the $q_i$.
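A minimal numerical sketch of this construction (numpy; the matrix `V` and function name are illustrative, with the hypothetical $v_i$ stored as rows of `V`):

```python
import numpy as np

def dual_basis(V):
    """Given linearly independent vectors as the rows of V (n x d),
    return Q (n x d) whose rows satisfy <q_i, v_j> = delta_ij."""
    A = V @ V.T           # Gram matrix, A[i, j] = <v_i, v_j>
    B = np.linalg.inv(A)  # beta coefficients: B A = I, so B = A^{-1}
    return B @ V          # q_i = sum_j beta_ij v_j

# Hypothetical example with n = d = 2
V = np.array([[1.0, 0.0],
              [1.0, 1.0]])
Q = dual_basis(V)
print(Q @ V.T)  # identity matrix up to rounding
```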
If $\{ v_1, \ldots, v_n\}$ is not linearly independent, then no such $q$ exists.
WLOG, suppose $v_1 = \sum_{k=2}^n c_k v_k$. If $q_1$ exists, then we have
$$1=\langle v_1, q_1\rangle = \sum_{k=2}^n c_k \langle v_k, q_1 \rangle=0$$
we get a contradiction.
Now, suppose the ambient space has dimension $d > n$ and extend to a basis $\{v_1, \ldots, v_d\}$, where we pick $v_{n+1}, \ldots, v_d$ to be orthonormal and orthogonal to the first $n$ vectors.
Now, let $q_i = \sum_{j=1}^d \beta_{ij}v_j$,
$$\langle \sum_{k=1}^d \beta_{ik}v_k, v_j\rangle= \delta_{ij} $$
$$\sum_{k=1}^d \beta_{ik}\langle v_k, v_j\rangle= \delta_{ij} $$
Since $\langle v_k, v_j\rangle = 0$ whenever $k > n$ and $j \le n$, the terms with $k > n$ drop out, and for $1 \le i, j \le n$ this reduces to
$$\sum_{k=1}^n \beta_{ik}\langle v_k, v_j\rangle= \delta_{ij}, $$
which is the previous case (so we may simply take $\beta_{ik} = 0$ for $k > n$).
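Numerically, the higher-dimensional case therefore needs no extra machinery: the same Gram-matrix recipe applies to the original $n$ vectors, and the result coincides with the Moore-Penrose pseudoinverse. A sketch (hypothetical random vectors in $\mathbb{R}^5$):

```python
import numpy as np

# Hypothetical example: three vectors in R^5, V is n x d with n < d
rng = np.random.default_rng(0)
V = rng.standard_normal((3, 5))

# Same Gram-matrix recipe as before; equivalently Q = pinv(V).T
Q = np.linalg.inv(V @ V.T) @ V
print(np.allclose(Q @ V.T, np.eye(3)))      # True: <q_i, v_j> = delta_ij
print(np.allclose(Q, np.linalg.pinv(V).T))  # True: rows of Q = columns of V^+
```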
Solution 2:
$\langle q_i|v_j\rangle=0$ for $i\neq j$ implies $\langle q_i|w\rangle=0$ for any $w$ that is a linear combination of the $v_j$ with $j\neq i$. In particular, $\langle q_i|v_j\rangle=\delta_{ij}$ for all $i$ and $j$ implies no $v_i$ is a linear combination of $v_j$ with $i\neq j$, i.e. that $\{v_k\}$ is a linearly independent set.
The Gram--Schmidt procedure takes as input a vector $v$ and a finite set $\{u_i\}$ of vectors satisfying $\langle u_i|u_j\rangle=0$ if $i\neq j$ and $\langle u_i|u_i\rangle$ equal to $0$ or $1$. Its outputs are the numbers $\langle u_i|v\rangle$, the normalization $u=w/\|w\|$ of the difference $w=v-\sum_i\langle u_i|v\rangle u_i$ (unless $w=0$, in which case set $u=0$), and the number $\langle u|v\rangle$.
Iterating the Gram--Schmidt process on the sequence of vectors $v_1,\dots,v_n$ results in a sequence of vectors $u_1,\dots,u_n$ such that $\langle u_i|u_j\rangle=0$ for $i\neq j$, $\langle u_i|u_i\rangle$ is $0$ or $1$, and $R=(r_{ij})$ with $r_{ij}=\langle u_i|v_j\rangle$ is an upper triangular matrix, hence $v_j=\sum_{i=1}^j r_{ij}u_i$.
Note that $u_k=0$ (equivalently $r_{kk}=0$) if and only if $v_k$ is a linear combination of the vectors before it. This suggests weakening the condition on the desired set $\{q_k\}$: set $q_k=0$ whenever $v_k$ is a linear combination of the vectors before it, and require $\langle q_i|v_j\rangle=\delta_{ij}$ only for those $j$ with $u_j\neq0$. (For a dependent $v_j$ the requirement $\langle q_i|v_j\rangle=0$ for all $i\neq j$ is unattainable: writing $v_j$ as a combination of the earlier $v_i$ and pairing with $q_i$ reproduces the contradiction from Solution 1.)
This can be achieved as follows. First, set $q_k=0$ if $u_k=0$. Then, assuming we have found $q_{k+1},\dots,q_n$, set $q_k=r_{kk}^{-1}\bigl(u_k-\sum_{j=k+1}^n r_{kj}q_j\bigr)$ whenever $r_{kk}\neq0$. This works because, for $j>k$ with $u_j\neq0$, the inductive hypothesis gives $\langle q_k|v_j\rangle=r_{kk}^{-1}(r_{kj}-r_{kj})=0$; also $\langle q_k|v_k\rangle=r_{kk}^{-1}r_{kk}=1$, and $\langle q_k|v_j\rangle=0$ for $j<k$ since $r_{kj}=0$ and $\langle q_{j'}|v_j\rangle=0$ for every $j'>k>j$.
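A minimal numerical sketch of this recursion (numpy; `np.linalg.qr` stands in for Gram-Schmidt, which is valid here because any factorization $V^\top = UR$ with orthonormal columns $u_i$ and upper triangular $R$ gives $r_{ij}=\langle u_i|v_j\rangle$; linear independence of the $v_k$ is assumed so that every $r_{kk}\neq0$):

```python
import numpy as np

def dual_basis_qr(V):
    """Rows of V are linearly independent vectors v_1, ..., v_n in R^d.
    Returns Q whose rows satisfy <q_i, v_j> = delta_ij, built by the
    back-substitution recursion q_k = (u_k - sum_{j>k} r_kj q_j) / r_kk."""
    n, d = V.shape
    U, R = np.linalg.qr(V.T)  # V^T = U R, columns of U orthonormal,
                              # R upper triangular with r_ij = <u_i, v_j>
    Q = np.zeros((n, d))
    for k in range(n - 1, -1, -1):
        acc = U[:, k].copy()
        for j in range(k + 1, n):
            acc -= R[k, j] * Q[j]
        Q[k] = acc / R[k, k]
    return Q

V = np.array([[1.0, 0.0, 0.0],  # hypothetical v_1, v_2 in R^3
              [1.0, 1.0, 0.0]])
print(dual_basis_qr(V) @ V.T)   # identity up to rounding
```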