If $\gcd(a_1,\ldots, a_n)=1$ then there's a matrix in $SL_n(\mathbb{Z})$ with first row $(a_1,\ldots, a_n)$

Let $n \geq 2$. Let $a_1, a_2, \ldots, a_n$ be $n$ integers such that $\gcd\left(a_1, a_2, \ldots, a_n\right) = 1$. Prove that there exists a matrix in $\operatorname{SL}_n\left(\mathbb{Z}\right)$ whose first row is $\left(a_1, a_2, \ldots, a_n\right)$.

Since the gcd of the integers $a_1,\ldots, a_n$ is $1$, there exist integers $x_i \in \mathbb{Z}$ such that $a_1x_1+\cdots+ a_nx_n=1$. My two ideas are (a) to construct by brute force an $n\times n$ matrix with first row $a_1,\ldots ,a_n$, choosing the remaining rows so that the determinant equals $\sum a_ix_i=1$, or (b) to use induction.

(a) (Constructive) This is tedious: even if I find a way to construct the remaining $n-1$ rows so that $a_1x_1$ appears in the determinant, I am not sure how to modify those rows so that only the terms $a_ix_i$ survive in the cofactor expansion. If such a matrix exists, I'd like to see it.

(b) (Non-constructive) If I proceed by induction, then the base case $n=2$ is settled, since I can choose the second row to be $(-x_2, x_1)$ so that the determinant is $a_1x_1-a_2(-x_2)=1$. However, I'm not sure how to use the inductive hypothesis to show that if I can construct such an $n\times n$ matrix then I can construct an $\left(n+1\right) \times \left(n+1\right)$ matrix with the desired property. In particular, if $\gcd(a_1,\ldots ,a_{n+1})=1$, it is not necessary that the gcd of any $n$ of these terms is $1$ (e.g. $\gcd(6,10,15)=1$ while no two of the three are coprime), so induction may not apply directly.
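As a sanity check, the $n=2$ base case can be automated with the extended Euclidean algorithm. Here is a quick sketch in Python (`sl2_row` is my own name, not anything standard):

```python
def sl2_row(a1, a2):
    """Return a 2x2 integer matrix with first row (a1, a2) and determinant 1,
    assuming gcd(a1, a2) == 1."""
    # Extended Euclid: find x1, x2 with a1*x1 + a2*x2 == gcd(a1, a2) == 1.
    old_r, r, old_x, x, old_y, y = a1, a2, 1, 0, 0, 1
    while r != 0:
        q = old_r // r
        old_r, r = r, old_r - q * r
        old_x, x = x, old_x - q * x
        old_y, y = y, old_y - q * y
    x1, x2 = old_x, old_y
    # Second row (-x2, x1) gives determinant a1*x1 - a2*(-x2) == 1.
    return [[a1, a2], [-x2, x1]]

M = sl2_row(10, 7)                             # [[10, 7], [-3, -2]]
d = M[0][0] * M[1][1] - M[0][1] * M[1][0]      # determinant, == 1
```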

How can I construct such a matrix or prove that one exists (without necessarily constructing it)?


Suppose the statement holds for some $n\geq2$, and consider integers $a_1,\ldots,a_{n+1}$ whose gcd is $1$. Let $d=\gcd(a_1,a_2,\ldots,a_n)$, and let $a_i'=a_i/d$ for $i\leq n$. Note that $$ 1=\gcd(a_1,\ldots,a_{n+1})=\gcd(d, a_{n+1}) $$ and $$ 1=\gcd(a_1',\ldots,a_n'). $$

By induction, there are matrices $X\in SL_2(\mathbb Z)$ and $Y\in SL_n(\mathbb Z)$ whose first rows are $(d, a_{n+1})$ and $(a_1',\ldots,a_n')$ respectively. Write $$ X=\begin{pmatrix}d&a_{n+1}\\ p&q\end{pmatrix}. $$ Also consider the $1\times n$ row matrix $v=\begin{pmatrix}p&0&\cdots&0\end{pmatrix}$, the $n\times 1$ column matrix $w=\begin{pmatrix}a_{n+1}&0&\cdots&0\end{pmatrix}^T$, and the $n\times n$ diagonal matrix $D$ with diagonal $(d,1,\ldots,1)$. Let $$ Z=\begin{pmatrix}DY&w\\ vY&q\end{pmatrix}. $$

Since the first row of $DY$ is $d$ times the first row of $Y$, the first row of $Z$ is $(a_1,\ldots,a_{n+1})$. Also $$\begin{eqnarray*} Z\begin{pmatrix}Y^{-1}&0\\ 0&1\end{pmatrix} &=&\begin{pmatrix}D&w\\ v&q\end{pmatrix}\\ &=&\begin{pmatrix}d&0&a_{n+1}\\0&I&0\\ p&0&q\end{pmatrix}. \end{eqnarray*}$$ Since $X$ is invertible over $\mathbb Z$, so is the right-hand side, and hence so is $Z$; thus $\det(Z)=\pm1$, and we get the required matrix by flipping the sign of one row of $Z$ if necessary. (In fact, expanding the right-hand side along the rows of the identity block gives determinant $\det X=1$, so $\det Z=\det Y=1$ and no sign flip is needed.)
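The recursion above is concrete enough to run. Below is a sketch in Python (function names are mine) that builds the matrix by exactly this construction and lets you verify the first row and the determinant on an example; it assumes $\gcd(a_1,\ldots,a_n)=1$ and that $a_1,\ldots,a_{n-1}$ are not all zero, so $d\neq0$.

```python
from math import gcd
from fractions import Fraction

def bezout(a, b):
    """Extended Euclid: return (x, y) with a*x + b*y == gcd(a, b)."""
    old_r, r, old_x, x, old_y, y = a, b, 1, 0, 0, 1
    while r != 0:
        q = old_r // r
        old_r, r = r, old_r - q * r
        old_x, x = x, old_x - q * x
        old_y, y = y, old_y - q * y
    return old_x, old_y

def sl_row(a):
    """Return an n x n integer matrix with first row a and determinant 1,
    assuming gcd(a) == 1 (and not all of a[:-1] zero), following the
    inductive construction above."""
    n = len(a)
    if n == 2:
        x1, x2 = bezout(a[0], a[1])            # a1*x1 + a2*x2 == 1
        return [[a[0], a[1]], [-x2, x1]]
    d = 0
    for ai in a[:-1]:
        d = gcd(d, ai)
    p_, q_ = bezout(d, a[-1])                  # d*p_ + a_n*q_ == 1
    p, q = -q_, p_                             # X = [[d, a_n], [p, q]], det X == 1
    Y = sl_row([ai // d for ai in a[:-1]])     # first row (a_1/d, ..., a_{n-1}/d)
    m = n - 1
    # Assemble Z = [[D*Y, w], [v*Y, q]] with D = diag(d, 1, ..., 1),
    # v = (p, 0, ..., 0), w = (a_n, 0, ..., 0)^T.
    Z = [[0] * n for _ in range(n)]
    for j in range(m):
        Z[0][j] = d * Y[0][j]                  # first row of D*Y
        for i in range(1, m):
            Z[i][j] = Y[i][j]                  # remaining rows of D*Y
        Z[m][j] = p * Y[0][j]                  # the row v*Y
    Z[0][m] = a[-1]                            # the column w
    Z[m][m] = q
    return Z

def det(M):
    """Determinant via Gaussian elimination over the rationals."""
    n = len(M)
    A = [[Fraction(x) for x in row] for row in M]
    result = Fraction(1)
    for i in range(n):
        piv = next((r for r in range(i, n) if A[r][i] != 0), None)
        if piv is None:
            return 0
        if piv != i:
            A[i], A[piv] = A[piv], A[i]
            result = -result
        result *= A[i][i]
        for r in range(i + 1, n):
            f = A[r][i] / A[i][i]
            for c in range(i, n):
                A[r][c] -= f * A[i][c]
    return int(result)

Z = sl_row([6, 10, 15])   # gcd(6, 10, 15) == 1, though no two are coprime
```

With this sign convention ($X$ and $Y$ both of determinant $1$), the determinant of the assembled matrix comes out as exactly $1$, matching the remark that the sign flip is not actually needed.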


I can get close; consider the block matrix $$\left(\begin{matrix} a_1x_1 & r\\ c&I\end{matrix}\right)$$ where $r$ is the row vector $(\begin{matrix} a_2 & a_3 & \cdots & a_n\end{matrix})$, $c$ is the column vector $(\begin{matrix} -x_2 & -x_3 & \cdots & -x_n\end{matrix})^T$, and $I$ is the $(n-1)\times (n-1)$ identity matrix.
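To see why this is close, note that its determinant is already $1$: by the Schur complement formula (or by adding $x_i$ times column $i$ to the first column for each $i\geq2$),
$$\det\begin{pmatrix} a_1x_1 & r\\ c & I\end{pmatrix} = \det(I)\left(a_1x_1 - rI^{-1}c\right) = a_1x_1 + a_2x_2 + \cdots + a_nx_n = 1.$$
The only defect is that the $(1,1)$ entry is $a_1x_1$ rather than $a_1$.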