Let $X=(X_{ij})_{1\le i,j\le n}$ be a matrix of indeterminates over $\mathbb C$. For choices $I,J\subseteq\{1,\ldots,n\}$ with $|I|=|J|=k$ denote by $X_{I\times J}$ the matrix $(X_{ij})_{i\in I,j\in J}$.

I will call the polynomials $\delta_{I\times J}:=\det(X_{I\times J})$ the $k$-minors of $X$, for all choices of $I$ and $J$ as above. It is well known that each of the $\delta_{I\times J}$ is an irreducible polynomial. But suppose we have a linear combination

$$ f = \sum_{\substack{I,J\subseteq\{1,\ldots,n\}\\|I|=|J|=k}} \lambda_{IJ} \cdot \delta_{I\times J} \quad\in \mathbb C[X_{ij}]_k $$

with certain $\lambda_{IJ}\in\mathbb C$, and $f\ne 0$. My question is:

Is $f$ always irreducible?

If not, I would be curious to see a counterexample; I somehow can't manage to come up with one. Are there well-known conditions under which $f$ is irreducible?
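
For what it's worth, small cases can be probed experimentally. Below is a sympy sketch (the parameters $n=3$, $k=2$, the coefficient range and the number of trials are all ad hoc choices) that factors random linear combinations of $k$-minors over $\mathbb Q$. This is only a one-way probe: a nontrivial factor found here would already be a counterexample over $\mathbb C$, but irreducibility over $\mathbb Q$ does not by itself settle the question.

```python
import itertools
import random

import sympy as sp

n, k = 3, 2  # ad hoc small case
X = sp.Matrix(n, n, lambda i, j: sp.Symbol(f"x{i}{j}"))

# all k-minors delta_{I x J} of X
minors = [X[list(I), list(J)].det()
          for I in itertools.combinations(range(n), k)
          for J in itertools.combinations(range(n), k)]

random.seed(0)
for _ in range(20):
    f = sp.expand(sum(random.randint(-3, 3) * m for m in minors))
    if f == 0:
        continue
    # factor_list returns (constant, [(factor, multiplicity), ...]);
    # a single factor of multiplicity 1 means f is irreducible over Q
    _, factors = sp.factor_list(f)
    assert len(factors) == 1 and factors[0][1] == 1, f
print("no reducible combination found")
```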


Let's prove, by induction on $k$, that any linear combination of $k$-minors is an irreducible polynomial. For $k=1$ all is clear. In general one can write $f=X_{11}q+r$, where $q$ is a linear combination of $(k-1)$-minors of the submatrix $X'$ of $X$ obtained by removing the first row and column of $X$, and $r$ is a polynomial in which $X_{11}$ does not appear. (If $f$ is a linear combination of minors none of which contains $X_{11}$, then one can choose another indeterminate instead of $X_{11}$ and all the arguments remain the same.) By the induction hypothesis, $q$ is irreducible. If $f$ is reducible, say $f=gh$ with $g,h$ nonconstant and $X_{11}$ appearing in $g$ only, then writing $g=aX_{11}+b$ and comparing coefficients of $X_{11}$ gives $ah=q$; since $q$ is irreducible and $h$ is nonconstant, $h$ agrees with $q$ up to a constant, so $q\mid f$.

But one can choose a suitable matrix $A$ such that $q(A)=0$ and $f(A)\neq 0$, contradicting $q\mid f$. To do this we impose on $A$ the condition that all the $(k-1)$-minors of $A'$ (the submatrix of $A$ obtained by removing the first row and column of $A$) are $0$, but not all the $(k-2)$-minors are $0$; that is, $\operatorname{rank}A'=k-2$. (The condition makes sense for $k\ge 3$; for $k=2$ we have $A'=0$, but this case is rather easy, see the edit below.) Moreover, in this case every $k$-minor of $A$ that misses the first row or the first column vanishes (such a minor is a minor of the matrix formed by the first column, say, together with $A'$, which has rank at most $1+(k-2)=k-1$), so $f(A)$ is a linear combination of $k$-minors containing the first row and column of $A$.

Now we have to choose suitable values for the entries of the first row and column of $A$ such that $f(A)\neq 0$. For this it is better to keep indeterminates on the first row and column of $A$ and to check that one can choose suitable values for the entries of $A'$ such that the polynomial $f(A)=\sum \alpha_{ij}X_{1i}X_{j1}$ does not have all coefficients equal to $0$. In fact, the $\alpha_{ij}$ are linear combinations of some $(k-2)$-minors of $A'$, with the same coefficients as those from the representation of $f$ as a linear combination of $k$-minors. This suggests choosing $A'$ as simple as possible, e.g. a diagonal matrix with exactly $k-2$ nonzero entries on the main diagonal.

Edit. For $k=2$ the matrix $A$ has indeterminates on the first row and column, and all the other entries are $0$. Then $f(A)$ is a linear combination of the $2$-minors of $A$ containing $X_{11}$, i.e. a linear combination of the products $X_{1i}X_{j1}$ with $i,j\neq 1$. Since distinct minors contribute distinct products $X_{1i}X_{j1}$, such a combination can't be $0$.
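
To see the construction with concrete numbers, here is a sympy sketch of an ad hoc instance ($n=4$, $k=3$, two arbitrarily chosen coefficients; indices are $0$-based in the code, so row $0$ and column $0$ play the role of the first row and column): $A$ keeps indeterminates on the first row and column, and $A'=\operatorname{diag}(1,0,0)$ has rank $k-2=1$.

```python
import sympy as sp

n = 4  # ad hoc instance with k = 3
X = sp.Matrix(n, n, lambda i, j: sp.Symbol(f"x{i}{j}"))

def minor(I, J):
    return X[list(I), list(J)].det()

# a sample linear combination of 3-minors, each containing row 0 and column 0
f = 2 * minor((0, 1, 2), (0, 1, 3)) - 5 * minor((0, 1, 3), (0, 1, 2))

# keep row 0 and column 0 symbolic, specialize A' to diag(1, 0, 0),
# i.e. rank A' = k - 2 = 1
subs = {X[i, j]: (1 if i == j == 1 else 0)
        for i in range(1, n) for j in range(1, n)}
print(sp.expand(f.subs(subs)))  # a nonzero combination of products x0i * xj0
```

The output is $5x_{02}x_{30}-2x_{03}x_{20}$ (up to term order), exactly of the shape $\sum \alpha_{ij}X_{1i}X_{j1}$ appearing in the proof.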


I am inspired by YACP's answer, but I think the following is more "algebraic" and only deals with $f$ as a polynomial, not as a function. It should work over any field: We do induction on $k$ and write $f=X_{ij}q+r$, where $X_{ij}$ is a suitable indeterminate appearing in $f$, $X_{ij}$ does not appear in $r$, and $q\ne 0$; here $q$ is a nonzero linear combination of $(k-1)$-minors, hence irreducible by the induction hypothesis. Now assume $f=gh$. Since $f$ has degree at most one in each variable, $X_{ij}$ can appear in only one of the factors; without loss of generality it appears only in $g$. Write $g=aX_{ij}+b$ for polynomials $a$ and $b$ in which $X_{ij}$ does not appear. Then $$ahX_{ij}+bh = gh = f = X_{ij}q+ r,$$ so $ah=q$, and by irreducibility of $q$ either $h$ is constant, in which case we are done, or $a$ is a nonzero constant and $h\sim q$ (they differ by a nonzero constant). So assume $h$ is nonconstant; then $\deg(g)=\deg(f)-\deg(h)=1$. Performing the above argument with the roles of $g$ and $h$ exchanged (using a variable that appears in $h$) yields $\deg(h)=1$ as well. It follows that $k=2$, i.e. $f$ is the product of two linear forms.
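
As a quick sanity check of the decomposition $f=X_{ij}q+r$, here is a sympy sketch for the ad hoc case $n=3$, $k=2$, splitting at $X_{11}$ (written `x00` in the $0$-based code):

```python
import sympy as sp

n = 3
X = sp.Matrix(n, n, lambda i, j: sp.Symbol(f"x{i}{j}"))

def minor(I, J):
    return X[list(I), list(J)].det()

# an ad hoc linear combination of 2-minors
f = sp.expand(3 * minor((0, 1), (0, 1)) + 7 * minor((0, 2), (0, 2))
              - minor((1, 2), (1, 2)))

x00 = X[0, 0]
q = f.coeff(x00)            # 3*x11 + 7*x22: a combination of 1-minors of X'
r = sp.expand(f - x00 * q)  # the part of f in which x00 does not appear
assert x00 not in r.free_symbols
print(q, r, sep="\n")
```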

Write $g=\sum_{ij} \lambda_{ij} X_{ij}$ and $h=\sum_{ij} \mu_{ij} X_{ij}$. Then, with the index tuples lexicographically ordered, $$ f= gh = \sum_{(i,j)} \lambda_{ij}\mu_{ij}\, X_{ij}^2 \;+\; \sum_{(i,j)} \sum_{(r,s)>(i,j)} (\lambda_{ij}\mu_{rs}+\lambda_{rs}\mu_{ij})\, X_{ij}X_{rs}.$$ We know two things:

  • $\mu_{ij}\lambda_{ij}=0$ for all $(i,j)$;
  • $\lambda_{ij}\mu_{rs}+\lambda_{rs}\mu_{ij} = - (\lambda_{is}\mu_{rj} + \lambda_{rj}\mu_{is})$ whenever $i\ne r$ and $j\ne s$.

The first identity says that $f$ is multilinear (no squares occur in a sum of $2$-minors), and the second follows because in any $2$-minor the monomials $X_{ij}X_{rs}$ and $X_{is}X_{rj}$ occur together, with opposite signs; moreover, a monomial with both entries in the same row or the same column occurs in no $2$-minor at all, so its coefficient in $f$ vanishes. Since $h\ne 0$, there certainly exists some $\mu_{ij}\ne 0$; fix such a pair $(i,j)$. Thus $\lambda_{ij}=0$. Picking $s=j$ and leaving $r$ arbitrary, the monomial $X_{ij}X_{rj}$ has both entries in column $j$, so $\lambda_{ij}\mu_{rj}+\lambda_{rj}\mu_{ij}=0$; as $\lambda_{ij}=0$ and $\mu_{ij}\ne 0$, this gives $\lambda_{rj}=0$ for all $r$. Similarly, choosing $r=i$ and leaving $s$ arbitrary, we get $\lambda_{is}=0$ for all $s$.

On the other hand, since $g\ne 0$, there must be some $\lambda_{rs}\ne 0$; by the previous paragraph, necessarily $r\ne i$ and $s\ne j$. Hence $\mu_{rs}=0$. Running the same argument with the roles of $\lambda$ and $\mu$ exchanged, we get $\mu_{is}=0$ and $\mu_{rj}=0$. But now, we have

$$0 = - (\lambda_{is}\mu_{rj} + \lambda_{rj}\mu_{is}) = \lambda_{ij}\mu_{rs}+\lambda_{rs}\mu_{ij} = \lambda_{rs}\mu_{ij},$$

a contradiction, since $\lambda_{rs}\ne 0$ and $\mu_{ij}\ne 0$.
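
Both coefficient identities are easy to confirm mechanically. Here is a sympy sketch (the random integer coefficients and $n=3$ are ad hoc choices, indices are $0$-based) checking them on a random sum of $2$-minors:

```python
import itertools
import random

import sympy as sp

n = 3
X = sp.Matrix(n, n, lambda i, j: sp.Symbol(f"x{i}{j}"))

random.seed(1)
f = sp.expand(sum(random.randint(-4, 4) * X[list(I), list(J)].det()
                  for I in itertools.combinations(range(n), 2)
                  for J in itertools.combinations(range(n), 2)))

def c(i, j, r, s):
    # coefficient of the monomial X_ij * X_rs in f
    return f.coeff(X[i, j] * X[r, s])

# (1) no squares occur, i.e. f is multilinear
for i in range(n):
    for j in range(n):
        assert f.coeff(X[i, j] ** 2) == 0

# (2) the sign identity, for i != r and j != s
for i, r in itertools.combinations(range(n), 2):
    for j, s in itertools.permutations(range(n), 2):
        assert c(i, j, r, s) == -c(i, s, r, j)

print("both identities hold")
```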