An efficient way to check the coefficient of $x_1 x_2...x_n$ in the product of $m$ polynomials

This is a Hadamard matrix of Sylvester type, written in a different order. There is a massive literature on this topic.

Just use the order $[1,X,Y,XY,Z,ZX,ZY,ZXY]$ which corresponds to lexicographic order over the integers, and apply the corresponding permutation to the columns as well.

Ryan O'Donnell's notes on Analysis of Boolean Functions, available here and now published as a book, and Claude Carlet's chapters on Boolean functions, available here, are easily accessible online.

In the order I gave above, your matrix is simply the 3-fold Kronecker product $$H_2 \otimes H_2 \otimes H_2$$ of the basic $2 \times 2$ Hadamard matrix

$$ H_2=\left(\begin{array}{cc} 1 & 1 \\ 1 & -1 \end{array} \right). $$
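As a quick sanity check, here is a small Python sketch (assuming NumPy is available) that builds the $8 \times 8$ character matrix directly in the order $[1,X,Y,XY,Z,ZX,ZY,ZXY]$ and verifies that it equals the 3-fold Kronecker product of $H_2$:

```python
import numpy as np

# Basic 2x2 Hadamard matrix
H2 = np.array([[1, 1], [1, -1]])

# 3-fold Kronecker product H2 (x) H2 (x) H2
H8 = np.kron(np.kron(H2, H2), H2)

# Build the same matrix directly from the monomials in the order
# [1, X, Y, XY, Z, ZX, ZY, ZXY]: each row is a point (z, y, x) in
# {1, -1}^3, each column a monomial evaluated at that point.
rows = []
for z in (1, -1):
    for y in (1, -1):
        for x in (1, -1):
            rows.append([1, x, y, x*y, z, z*x, z*y, z*x*y])
M = np.array(rows)

assert (H8 == M).all()
```

Note the row index runs through the points in the matching binary-counting order ($z$ outermost), which is exactly the permutation mentioned above.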

This paper considers efficient evaluation of Hadamard representation coefficients.


Here is the proof of the complexity. I am still interested in suggestions for different creative approaches to solve the problem (even partial solutions) and/or ideas or references for other possible applications.

To give a brief proof, I will reduce a problem previously posted here (and answered by Yuval Filmus) to this new problem.

The first step is a simple change of variables, $X' = 2X-1$, applied to all input variables, so that the inputs are now defined over $\left \{ -1,1 \right \}$ instead of $\left \{ 0,1 \right \}$. Hence we now have $X'^2=1$ instead of $X^2=X$. Alternatively, one could directly use the transformation described in the problem to obtain the polynomials. For the 3SAT problem, we only need the polynomial transformation for the following three functions:

$$\begin{matrix} \text{Gate Name} & \text{Boolean} & \text{Polynomial} \\ \text{XOR } (y=\neg{x}) & xy'+x'y & 1-XY \\ \text{AllOrNone } (x=y=z) & xyz+x'y'z' & 1+XY+XZ+YZ \\ \text{atLeastOne } (x\lor y \lor z) & (x'y'z')' & 7+ X + Y + Z -XY-XZ-YZ+ XYZ \end{matrix}$$
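Each of these polynomials is nonzero exactly on the satisfying assignments of its gate (in fact, proportional to the indicator). A brute-force Python check of that claim, using the encoding $X' = 2X-1$:

```python
from itertools import product

# Map a {0,1} bit to the {-1,1} encoding X' = 2X - 1.
enc = lambda b: 2 * b - 1

checked = 0
for x, y, z in product((0, 1), repeat=3):
    X, Y, Z = enc(x), enc(y), enc(z)
    # XOR gate (y = not x): 1 - XY is nonzero iff x != y
    assert (1 - X*Y != 0) == (x != y)
    # AllOrNone (x = y = z): 1 + XY + XZ + YZ is nonzero iff x == y == z
    assert (1 + X*Y + X*Z + Y*Z != 0) == (x == y == z)
    # atLeastOne (x or y or z): 7 + X+Y+Z - XY-XZ-YZ + XYZ,
    # nonzero iff at least one input is true
    assert (7 + X + Y + Z - X*Y - X*Z - Y*Z + X*Y*Z != 0) == bool(x or y or z)
    checked += 1
```

One can also see by hand that $1-XY$ evaluates to $2 \cdot [x \ne y]$ and that the atLeastOne polynomial evaluates to $8 \cdot [x \lor y \lor z]$.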

The second step is multiplying $F$ by $X_1 X_2 \cdots X_n$. This shifts all the powers by one and converts $F = a X^2 + b X + c$ into $F = a X^3 + b X^2 + c X \equiv (a+c) X + b$. It is not hard to show that $a+c$ is indeed the total number of solutions ($F(X=1)$ is the number of solutions with $x$, and $F(X=-1)$ is the number of solutions with $\neg{x}$), and it is zero if and only if the original set of equations is unsatisfiable. Note that in practice, for each variable $X_i$, we pick one polynomial (clause) containing $X_i$, multiply it by $X_i$, and convert $X_i^2$ to $1$ in that clause. This reduces the 3SAT problem to the problem described in this question. In the above calculations I used multiplication by a constant for simplification, but that does not affect the results.
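The whole reduction can be demonstrated on a toy instance. The sketch below (illustrative only; the helper names `multiply` and `clause_poly` are my own) represents multilinear polynomials over $\{-1,1\}$ variables as dictionaries, so that $X_i^2 = 1$ becomes symmetric difference of monomial index sets, and checks that the coefficient of $X_1 \cdots X_n$ in $X_1 \cdots X_n \cdot F$ is a positive multiple of the number of satisfying assignments:

```python
from itertools import product

# Multilinear polynomials over {-1,1} variables (so X_i^2 = 1), stored as
# {frozenset of variable indices: coefficient}.  Multiplying two monomials
# is the symmetric difference of their index sets.
def multiply(p, q):
    r = {}
    for mp, cp in p.items():
        for mq, cq in q.items():
            m = mp ^ mq
            r[m] = r.get(m, 0) + cp * cq
    return r

def clause_poly(lits):
    # "atLeastOne" polynomial 7 + X+Y+Z - XY - XZ - YZ + XYZ, with each
    # variable replaced by s*X_i (s = +1 for a positive literal, -1 for a
    # negated one).  Assumes the three variables are distinct.
    (i, si), (j, sj), (k, sk) = lits
    return {frozenset(): 7,
            frozenset({i}): si, frozenset({j}): sj, frozenset({k}): sk,
            frozenset({i, j}): -si * sj, frozenset({i, k}): -si * sk,
            frozenset({j, k}): -sj * sk, frozenset({i, j, k}): si * sj * sk}

# Toy instance: (x1 or x2 or x3) and (not x1 or x2 or x3), n = 3 variables.
clauses = [[(1, 1), (2, 1), (3, 1)], [(1, -1), (2, 1), (3, 1)]]
n, full = 3, frozenset({1, 2, 3})

F = {frozenset(): 1}
for c in clauses:
    F = multiply(F, clause_poly(c))
G = multiply(F, {full: 1})          # multiply by X1 X2 X3

# Brute-force count of satisfying assignments (x_i in {0,1}).
count = sum(all(any(x[i - 1] == (s + 1) // 2 for i, s in c) for c in clauses)
            for x in product((0, 1), repeat=n))

# Each satisfied clause contributes a factor of 8, so the coefficient of
# X1...Xn equals 8^m * count / 2^n: nonzero iff the instance is satisfiable.
assert G.get(full, 0) == 8 ** len(clauses) * count // 2 ** n
```

The design choice of `frozenset` keys makes the reduction $X_i^2 \to 1$ automatic; the `^` (symmetric difference) in `multiply` is exactly that rule.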


EDIT:

Here is a numerical approach:

Let $\alpha$ be the coefficient of $x_1 x_2 \cdots x_n$ in $F = f_1 f_2 \cdots f_m$. We have:

$$\alpha = \frac{\partial^n F}{\partial x_1\, \partial x_2 \cdots \partial x_n} \bigg|_{x_1 = x_2 = \cdots = x_n = 0}.$$

Is there a way to estimate $\alpha$ numerically (perhaps with a randomized algorithm) and check whether it is zero? For example, if we assign small random values to $x_1, x_2, \ldots, x_n$ and calculate $F$, do we really need exponentially many measurements to approximate the derivative at the origin?
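To make the question concrete, here is a small Python sketch (an illustration of the formula, not an efficient algorithm) that extracts $\alpha$ with a mixed central finite difference. It evaluates $F$ at the $2^n$ corners of a small cube around the origin, which is exactly the exponential cost the question asks whether one can avoid:

```python
from itertools import product

# Polynomials as {tuple of exponents: coefficient} in n = 3 variables.
n = 3
f1 = {(0, 0, 0): 1, (1, 0, 0): 1, (0, 1, 0): 1}      # 1 + x1 + x2
f2 = {(0, 0, 0): 1, (1, 1, 0): -1, (0, 0, 1): 1}     # 1 - x1*x2 + x3
f3 = {(0, 0, 0): 2, (0, 1, 1): 1}                    # 2 + x2*x3

def multiply(p, q):
    r = {}
    for ep, cp in p.items():
        for eq, cq in q.items():
            e = tuple(a + b for a, b in zip(ep, eq))
            r[e] = r.get(e, 0) + cp * cq
    return r

F = multiply(multiply(f1, f2), f3)
exact = F.get((1, 1, 1), 0)          # coefficient of x1*x2*x3

def evalF(x):
    return sum(c * x[0]**e[0] * x[1]**e[1] * x[2]**e[2]
               for e, c in F.items())

# Mixed central finite difference: 2^n evaluations at the corners of a
# cube of half-width h around the origin; the error is O(h^2).
h = 1e-3
est = sum(s[0] * s[1] * s[2] * evalF((s[0]*h, s[1]*h, s[2]*h))
          for s in product((1, -1), repeat=n)) / (2 * h) ** n

assert abs(est - exact) < 1e-4
```

Only monomials in which every variable appears to an odd power survive the signed sum, so the estimator converges to $\alpha$ as $h \to 0$; the open question is whether far fewer than $2^n$ evaluations can suffice.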

Another related approach would be to come up with a system of equations that could be solved numerically. For a subset of polynomials this seems to be feasible (e.g., see here).