Proof: $\det\left(\langle v_i , v_j \rangle\right)\neq0 \iff \{v_1,\dots,v_n\}~\text{linearly independent}$
Let $V=[v_1,v_2,\dots,v_n]$ be the matrix whose columns are the $v_i$. Then $A=V^TV$. So
$$\det(A)=\det(V^TV)=\det(V^T)\det(V)=\det(V)^2.$$
The determinant of a square matrix is non-zero if and only if its columns form a linearly independent set of vectors.
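The identity $\det(A)=\det(V)^2$ is easy to check numerically. A quick sketch with NumPy (the particular vectors below are made up for illustration):

```python
import numpy as np

# Columns of V are the vectors v_1, v_2, v_3 (arbitrary example).
V = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 3.0],
              [4.0, 0.0, 1.0]])

A = V.T @ V                      # Gram matrix: A[i, j] = <v_i, v_j>
det_A = np.linalg.det(A)
det_V = np.linalg.det(V)

print(det_A)                     # equals det(V)^2
print(det_V ** 2)
```

Since the columns of this $V$ are independent, $\det V\neq0$ and hence $\det A=(\det V)^2>0$.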
I'll give an answer that does not assume that $\def\R{\Bbb R}V=\R^n$ or even that $\dim V=n$, even though the essential argument boils down to that case anyway.
One direction is easy: if the $v_i$ satisfy a nontrivial linear dependence relation, then so do the corresponding rows of your matrix (and the columns as well), which forces the determinant to be zero.
In the other direction, suppose the $v_i$ are linearly independent. Now $W=\left<v_1,\ldots,v_n\right>\subseteq V$ is a subspace of dimension$~n$. Then the linear map $f:W\to\R^n$ given by $f(w)=(\left<v_1,w\right>,\ldots,\left<v_n,w\right>)$ is injective: a vector $w$ in the kernel of $f$ is orthogonal to each $v_i$, hence by linearity to each vector of $W$, and this implies $w=0$ because (the restriction to$~W$ of) the bilinear form is non-degenerate. The image under the injective linear map $f$ of the linearly independent $n$-tuple of vectors $v_1,\ldots,v_n$ in$~W$ is a linearly independent $n$-tuple of vectors in$~\R^n$, and the matrix formed by these vectors has nonzero determinant.
A less abstract way to finish off the second part is to choose an orthonormal basis of$~W$ and express the $v_i$ in this basis. The matrix of $f$ with respect to this basis has as row$~i$ the list of coordinates of$~v_i$ in the chosen basis; in other words, it is the transpose of the matrix whose column$~j$ gives the coordinates of$~v_j$, and the two matrices have the same nonzero determinant$~d$. The matrix you are interested in is the product of those two mutually transpose matrices, and its determinant is therefore $d^2\neq0$.
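The argument above does not require the $v_i$ to span the ambient space, and that too can be checked numerically. A sketch with NumPy (the example vectors in $\R^3$ are my own):

```python
import numpy as np

# Two linearly independent vectors in R^3, so dim W = 2 < 3.
v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 1.0, 1.0])
A = np.column_stack([v1, v2])
G = A.T @ A                      # 2x2 Gram matrix of <v_i, v_j>
det_indep = np.linalg.det(G)     # > 0 since v1, v2 are independent

# Replace v2 by a multiple of v1: the Gram determinant collapses to 0.
B = np.column_stack([v1, 3.0 * v1])
det_dep = np.linalg.det(B.T @ B)

print(det_indep, det_dep)
```

The Gram matrix here is $2\times2$ even though the vectors live in $\R^3$, matching the setup with $m<\dim V$.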
Hints (for you to understand, complete and, eventually, prove):
Suppose we have $\,\{v_1,...,v_m\}\subset V\;,\;\;\text{with}\;\;m\le\dim V\;$; then the Gramian is
$$G:=A^tA\;,\;\;\text{with}\;\;A=\left(v_1\;v_2\,\ldots\;v_m\right)=\text{ the matrix with columns $\,v_i\,$}$$
Note that $\,G\,$ is your matrix, and it is an $\,m\times m\;$ square matrix, so:
$$(1)\;\;\;\exists\, 0\neq u\in\Bbb R^m\;\;s.t.\;\;Gu=0\implies 0=u^tGu=u^tA^tAu=\langle\,Au\,,\,Au\,\rangle=\left\|Au\right\|^2\implies Au=0$$
and since $\,u\neq 0\,$ this means the rank of $\,A\,$ isn't full
(2) OTOH, if $\;\text{rk}(A)\;$ isn't full then $\,\exists\,0\neq u\in\Bbb R^m\;\;s.t.\,\,Au=0\;$, so that
$$Gu=A^tAu=0\ldots\ldots$$