A sufficient condition for a symmetric $n\times n$ matrix $C$ to be invertible is that the matrix is positive definite, i.e. $$\forall x\in\mathbb{R}^n\backslash\{0\}, x^TCx>0.$$
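For completeness, here is one way to see why positive definiteness suffices: a positive definite symmetric $C$ has a trivial kernel, since $$Cx=0 \;\Longrightarrow\; x^TCx=0 \;\Longrightarrow\; x=0$$ (the last implication holds because $x^TCx>0$ for every $x\neq 0$), and a square matrix with trivial kernel is invertible.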

We can use this observation to prove that $A^TA$ is invertible: since the $n$ columns of $A$ are linearly independent, we can show that $A^TA$ is not only symmetric but also positive definite.

In fact, using the Gram–Schmidt orthonormalization process, we can build an $n\times n$ invertible matrix $Q$ such that the columns of $AQ$ form a family of $n$ orthonormal vectors, and hence: $$I_n=(AQ)^T (AQ),$$ where $I_n$ is the identity matrix of dimension $n$.
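If it helps to see this construction concretely: Gram–Schmidt applied to the columns of $A$ amounts to a reduced QR factorization $A=\hat Q R$ with $\hat Q$ having orthonormal columns and $R$ upper triangular and invertible, so taking $Q=R^{-1}$ gives $AQ=\hat Q$. Here is a small numerical sketch of that, using NumPy's QR factorization as a stand-in for Gram–Schmidt; the matrix and sizes are made up for illustration.

```python
import numpy as np

# Illustrative sizes: an m x n matrix whose columns are (almost surely) independent.
rng = np.random.default_rng(0)
m, n = 5, 3
A = rng.standard_normal((m, n))

# Gram-Schmidt on the columns of A is what a reduced QR factorization computes:
# A = Q_hat @ R, with Q_hat having orthonormal columns and R upper triangular, invertible.
Q_hat, R = np.linalg.qr(A)

# The invertible n x n matrix called Q in the proof is R^{-1}, so that A @ Q = Q_hat.
Q = np.linalg.inv(R)

# Check that (AQ)^T (AQ) = I_n, up to rounding.
print(np.allclose((A @ Q).T @ (A @ Q), np.eye(n)))  # True
```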

Let $x\in\mathbb{R}^n\backslash\{0\}$.

Since $Q$ is invertible and $x\neq 0$, we have $Q^{-1}x\neq 0$, hence $\|Q^{-1}x\|^2>0$ and so: $$x^T(A^TA)x=x^T(AI_n)^T(AI_n)x=x^T(AQQ^{-1})^T(AQQ^{-1})x \\ = x^T(Q^{-1})^T(AQ)^T(AQ)(Q^{-1}x) = (Q^{-1}x)^T\left((AQ)^T(AQ)\right)(Q^{-1}x) \\ = (Q^{-1}x)^TI_n(Q^{-1}x) = (Q^{-1}x)^T(Q^{-1}x) = \|Q^{-1}x\|^2>0.$$ Since $x$ was arbitrary, it follows that: $$\forall x\in\mathbb{R}^n\backslash\{0\}, x^T(A^TA)x>0,$$ i.e. $A^TA$ is positive definite, and hence invertible.
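As a quick numerical sanity check (not a substitute for the proof), one can verify that $A^TA$ is symmetric positive definite for an arbitrary full-column-rank $A$, for instance by checking that a Cholesky factorization succeeds; the matrix below is just a made-up example.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((6, 4))   # example 6x4 matrix; columns independent with probability 1
G = A.T @ A                       # symmetric by construction

# Cholesky succeeds exactly when G is symmetric positive definite.
np.linalg.cholesky(G)             # would raise LinAlgError if G were not positive definite

# Directly confirm x^T G x > 0 for a few nonzero x, and that G has full rank.
for _ in range(3):
    x = rng.standard_normal(4)
    assert x @ G @ x > 0
print(np.linalg.matrix_rank(G))   # 4, i.e. full rank, hence invertible
```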


@RobertLewis

A Gram matrix is usually defined by giving a set of vectors and taking the $(i,j)$ entry to be the dot product of the $i$-th and $j$-th vectors. In doing so, the set of vectors can clearly be thought of as the column vectors of $A$. So saying "the vectors of $A$" is a completely natural thing to say, and should be unambiguous.
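Concretely, if the vectors are taken to be the columns $a_1,\dots,a_n$ of $A$, the Gram matrix with entries $G_{ij}=a_i\cdot a_j$ is exactly $A^TA$; a quick numerical check, with an arbitrarily chosen matrix:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((5, 3))  # arbitrary example: columns a_1, a_2, a_3

# Gram matrix built entrywise from dot products of the column vectors...
G = np.array([[A[:, i] @ A[:, j] for j in range(3)] for i in range(3)])

# ...coincides with A^T A.
print(np.allclose(G, A.T @ A))   # True
```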

Here is an elegant proof: Gram matrix invertible iff set of vectors linearly independent.