Can every nonsingular $n\times n$ matrix with real entries be made singular by changing exactly one entry?

I was just thinking about this problem:

Can every nonsingular $n\times n$ matrix with real entries be made singular by changing exactly one entry?

Thanks for helping me.


If $A$ is a nonsingular matrix with rows $r_1,r_2,\ldots,r_n$, then $\{r_2,\ldots,r_n\}$ spans an $(n-1)$-dimensional subspace $P$ of $\mathbb R^n$. At least one of the standard basis vectors $e_1,e_2,\ldots,e_n$ is not in $P$, say $e_i$. Then $\{e_i,r_2,r_3,\ldots,r_n\}$ is a basis of $\mathbb R^n$, and it follows that there is a real number $c$ such that $r_1-ce_i$ is in $P$. The matrix $A'$ with rows $(r_1-ce_i),r_2,r_3,\ldots,r_n$ is singular, and it is obtained from $A$ by subtracting $c$ from the entry in the first row and $i^\text{th}$ column.
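A quick numerical sketch of this construction (using NumPy; the particular $3\times 3$ matrix is just an illustrative example): find a standard basis vector $e_i$ outside $P$, write $r_1$ in the basis $\{e_i,r_2,\ldots,r_n\}$, and the coefficient of $e_i$ is the $c$ above.

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
r1, rest = A[0], A[1:]
n = A.shape[0]

# Find a standard basis vector e_i outside P = span(r_2, ..., r_n):
# e_i works iff replacing r_1 by e_i keeps the matrix nonsingular.
for i in range(n):
    B = np.vstack([np.eye(n)[i], rest])
    if abs(np.linalg.det(B)) > 1e-12:
        break

# Write r_1 = c*e_i + (combination of r_2, ..., r_n) by solving B^T x = r_1;
# the first coordinate of x is c.
coeffs = np.linalg.solve(B.T, r1)
c = coeffs[0]

A_singular = A.copy()
A_singular[0, i] -= c              # change exactly one entry
print(np.linalg.det(A_singular))   # ≈ 0
```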


Here's a way to rephrase this somewhat more geometrically. The subspace $P$ is a hyperplane that divides $\mathbb R^n$ into two half-spaces, and $r_1$ lies in one of these halves. The line through $r_1$ in the direction of a vector $v$ has the form $\{r_1+tv:t\in\mathbb R\}$. This line is parallel to $P$ only if $v$ is in $P$; otherwise, the line will cross $P$. Since $P$ can't contain all of the coordinate directions (or else it would fill up all of $\mathbb R^n$, rather than being a hyperplane), there must be a line of the form $\{r_1+te_i:t\in\mathbb R\}$ that crosses $P$, where $e_i$ is the standard basis vector with a $1$ in the $i^\text{th}$ position and $0$s elsewhere. This means that there exists $t_0\in \mathbb R$ such that $r_1+t_0e_i\in P$. And then, linear dependence of the vectors $r_1+t_0e_i,r_2,\ldots,r_n$ means that the matrix with those rows is singular.
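This crossing point $t_0$ can be computed directly, since the determinant of the matrix with first row $r_1+te_i$ is a linear function of $t$: two sample values determine the line, and its root is $t_0$. A small sketch (NumPy; the matrix is an illustrative example, not part of the argument above):

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
n = A.shape[0]

def det_after_shift(i, t):
    """Determinant with the first row replaced by r_1 + t*e_i."""
    M = A.copy()
    M[0, i] += t
    return np.linalg.det(M)

# det_after_shift(i, t) = det(A) + t * slope; the line crosses zero
# exactly when the slope is nonzero, i.e. when e_i is not in P.
for i in range(n):
    slope = det_after_shift(i, 1.0) - det_after_shift(i, 0.0)
    if abs(slope) > 1e-12:
        t0 = -np.linalg.det(A) / slope
        break

print(det_after_shift(i, t0))  # ≈ 0
```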


The determinant is a polynomial of degree at most one in any given entry, so yes.

To see that $$\det\begin{pmatrix} a_{11} & \cdots & a_{1n}\\ \vdots & \ddots & \vdots\\ a_{n1} & \cdots & a_{nn}\end{pmatrix}$$ depends linearly on $a_{k\ell}$ for any given $k$ and $\ell$, note that $$\det\begin{pmatrix} a_{11} & \cdots & a_{1n}\\ \vdots & \ddots & \vdots\\ a_{n1} & \cdots & a_{nn}\end{pmatrix}=\sum_{\sigma\in S_n}\operatorname{sgn}(\sigma)\prod_{i=1}^n a_{i,\sigma(i)}=\sum_{\substack{\sigma\in S_n\\ \sigma(k)=\ell}}\operatorname{sgn}(\sigma)\prod_{i=1}^n a_{i,\sigma(i)}+\sum_{\substack{\sigma\in S_n\\ \sigma(k)\neq \ell}}\operatorname{sgn}(\sigma)\prod_{i=1}^n a_{i,\sigma(i)}$$ $$=\left(\sum_{\substack{\sigma\in S_n\\ \sigma(k)=\ell}}\operatorname{sgn}(\sigma)\prod_{\substack{i=1\\i\neq k}}^n a_{i,\sigma(i)}\right)a_{k\ell} + \left(\sum_{\substack{\sigma\in S_n\\ \sigma(k)\neq \ell}}\operatorname{sgn}(\sigma)\prod_{i=1}^n a_{i,\sigma(i)}\right)$$ As JeffE rightly points out below, we might have that the coefficient of $a_{k\ell}$ in the above expression is 0, and that therefore varying the value of $a_{k\ell}$ won't change the determinant. 
I don't see any way of guaranteeing that won't happen, but we can show that, given an $\ell$, it can't happen for every $k$: if it did, then varying the entire column $$A_\ell=\begin{pmatrix} a_{1\ell} \\ \vdots \\ a_{n\ell}\end{pmatrix}$$ in any way we want would not change the determinant, so (for example) $$\det(A_1\mid \cdots \mid 2A_\ell\mid \cdots \mid A_n)=\det(A_1\mid \cdots \mid A_\ell\mid \cdots \mid A_n).$$ But because the determinant of $A$ is a multilinear function of the columns, $$\det(A_1\mid \cdots \mid 2A_\ell\mid \cdots \mid A_n)=2\det(A_1\mid \cdots \mid A_\ell\mid \cdots \mid A_n),$$ so $$2\det(A_1\mid \cdots \mid A_\ell\mid \cdots \mid A_n)=\det(A_1\mid \cdots \mid A_\ell\mid \cdots \mid A_n),$$ which is impossible because the assumption that $A$ is non-singular means that $$\det(A_1\mid \cdots \mid A_\ell\mid \cdots \mid A_n)\neq0.$$ Thus, given an $\ell$, there is at least one $k$ for which the coefficient of $a_{k\ell}$ is nonzero; the determinant is then a nonconstant linear function of $a_{k\ell}$, so some choice of $a_{k\ell}$ makes it zero, producing a singular matrix.
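The coefficient of $a_{k\ell}$ in the expansion above is exactly the $(k,\ell)$ cofactor, and the cofactor expansion $\det A=\sum_k a_{k\ell}C_{k\ell}\neq0$ guarantees some cofactor in column $\ell$ is nonzero. A numerical sketch of this (NumPy; the matrix and the choice of column are illustrative):

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
n = A.shape[0]
ell = 1  # any column; the argument guarantees a usable row k

def cofactor(M, k, l):
    """(k, l) cofactor: signed determinant of the (k, l) minor."""
    minor = np.delete(np.delete(M, k, axis=0), l, axis=1)
    return (-1) ** (k + l) * np.linalg.det(minor)

# det(A) = sum_k a[k, ell] * C[k, ell] != 0, so some cofactor in
# column ell is nonzero; varying that entry changes the determinant.
for k in range(n):
    C = cofactor(A, k, ell)
    if abs(C) > 1e-12:
        break

# Shifting a[k, ell] by delta shifts det(A) by delta * C;
# choose delta = -det(A)/C to make the determinant zero.
A2 = A.copy()
A2[k, ell] -= np.linalg.det(A) / C
print(np.linalg.det(A2))  # ≈ 0
```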