In the following I give a partial answer: I prove the strict inequality for rational $y$, which only implies the weak inequality $\det\geq 0$ for real $y$.

First suppose that the $y_j$ are integers. Dividing the $i$-th row of the matrix by the positive number $e^{x_iy_1}$ we may assume $y_1=0$, so that $y_j\geq j-1$ for every $j$. Set $\lambda_j=y_{n+1-j}-(n-j)$; since the $y_j$ are strictly increasing integers, $\lambda_1\geq\ldots\geq\lambda_n=y_1=0$, so $\lambda$ is a partition.

Set $a_i=e^{x_i}$; since $\exp$ is increasing and positive, $0<a_1<\dots<a_n$. By the bialternant formula your determinant becomes $\det(a_i^{y_j})=\det(a_i^{j-1})\, s_{\lambda}(a_1,\dots,a_n)$, a product of two positive factors: the Vandermonde determinant $\det(a_i^{j-1})=\prod_{j<k}(a_k-a_j)$, which is positive because the $a_i$ are increasing, and the Schur polynomial $s_{\lambda}(a_1,\dots,a_n)$, which is positive because it is a sum of monomials in the positive variables $a_i$.
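As a small sanity check (my own illustration, not part of the original argument): for $n=2$ and $y=(0,2)$ one has $\lambda=(1,0)$ and
$$ \det\begin{pmatrix}1 & a_1^{2}\\ 1 & a_2^{2}\end{pmatrix}=a_2^{2}-a_1^{2}=(a_2-a_1)(a_1+a_2)=\det\begin{pmatrix}1 & a_1\\ 1 & a_2\end{pmatrix}\, s_{(1)}(a_1,a_2), $$
with both factors positive when $0<a_1<a_2$.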

Now suppose that the $y_j$ are rational. Let $D$ be a common denominator and replace $x_i$ and $y_j$ by $x_i/D$ and $Dy_j$, which brings us back to the integer case.
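Spelling out the substitution:
$$ e^{x_iy_j}=e^{(x_i/D)\,(Dy_j)}, $$
and the $x_i/D$ are still strictly increasing while the $Dy_j$ are strictly increasing integers, so the previous case applies to the same matrix.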

Finally, for real $y_j$ you can take a limit of rational approximations, which unfortunately just gives you $\det \ge 0$; but maybe you have another way of knowing that your matrix is not singular.


It suffices to prove that given $x_1<\ldots <x_n$ and $y_1<\ldots <y_n$ the determinant $D$ of the matrix $M=\left(e^{x_iy_j}\right)_{1\leq i,j\leq n}$ is nonzero.

Indeed, $D$ is a continuous real function on the open connected subset $$ U=\{(\bar{x},\bar{y}):x_1<\ldots <x_n\ \wedge\ y_1<\ldots <y_n\} \subset \mathbb{R}^n\times\mathbb{R}^n, $$ hence, if it never vanishes, it must be either always positive or always negative. Then you check positivity for a particular choice of the variables (for example, for $x_i=i-1$ you get a Vandermonde determinant in the $e^{y_j}$ and positivity is an easy check).
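To spell out that check (a detail I am adding, following the hint above): with $x_i=i-1$ the entries are $e^{(i-1)y_j}=(e^{y_j})^{\,i-1}$, so
$$ D=\det\big((e^{y_j})^{\,i-1}\big)_{i,j}=\prod_{1\le j<k\le n}\big(e^{y_k}-e^{y_j}\big)>0, $$
since $e^{y_1}<\ldots<e^{y_n}$.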

To prove the nonvanishing of the determinant we use induction on $n$: fix the values of $x_1,\ldots, x_{n-1}$ and of $y_1,\ldots ,y_n$, and let $x_n=z$ vary over $\mathbb{R}$. Consider the determinant $D(z)$ of $M$ as a function of $z$. Clearly $D(x_i)=0$ for $i=1,\ldots, n-1$ (the last row then equals the $i$-th row), and we expect (if the thesis is true) that $D(z)$ does not vanish elsewhere.

Using the Laplace expansion along the last row we see that $D(z)$ has the form $$ D(z)= A_1 (e^{y_1})^z+\ldots+ A_n (e^{y_n})^z $$ with coefficients $A_j\neq 0$ (by the inductive hypothesis).
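To make the inductive step explicit: the expansion along the last row gives
$$ A_j=(-1)^{n+j}\,\det\big(e^{x_iy_k}\big)_{1\le i\le n-1,\ k\ne j}, $$
and each of these $(n-1)\times(n-1)$ determinants is nonzero by the inductive hypothesis, since the $x_i$ with $i\le n-1$ and the $y_k$ with $k\ne j$ are still strictly increasing.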

Now, to conclude, it is natural to recall a well-known fact about exponential polynomials.

Lemma 1. Let $\rho_1,\ldots,\rho_n$ be distinct positive real numbers and let $c_1,\ldots, c_n\in\mathbb{R}$ be nonzero. Then the equation $\displaystyle\sum_{i=1}^n c_i\rho_i^z=0$ has at most $n-1$ solutions $z\in\mathbb{R}$.

Lemma 1 is a direct corollary of the following more general statement (take the $c_i$ to be nonzero constant polynomials).

Lemma 2. Let $\rho_1,\ldots,\rho_n$ be distinct positive real numbers and let $c_1,\ldots, c_n\in\mathbb{R}[x]$ be nonzero polynomials with real coefficients. Then the equation $\displaystyle\sum_{i=1}^n c_i(z)\rho_i^z=0$ has at most $\displaystyle\sum_{i=1}^n (\deg(c_i)+1)-1$ real solutions.

Lemma 2 is easily proven using only induction and Rolle's theorem.
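A sketch of that argument (my own addition, just spelling out the hint): induct on $n$. For $n=1$ the equation $c_1(z)\rho_1^z=0$ has at most $\deg(c_1)$ solutions. For the inductive step, suppose the equation had at least $N:=\sum_{i=1}^n(\deg(c_i)+1)$ real solutions. Dividing by $\rho_n^z$ (which changes neither the zeros nor the hypotheses) we may assume $\rho_n=1$, so the last term is just the polynomial $c_n(z)$. Differentiating $\deg(c_n)+1$ times kills that term, leaves every other term of the form $\tilde c_i(z)\rho_i^z$ with $\deg\tilde c_i=\deg c_i$ (because $\ln\rho_i\neq 0$), and, by Rolle's theorem, yields a function with at least $N-\deg(c_n)-1$ real zeros; this contradicts the inductive hypothesis, which allows at most $N-\deg(c_n)-2$.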


Some ideas:

1) What about induction on $n$? For $\,n=1\,$ the claim is trivial. For $\,n=2\,$ we have to prove that $\,\displaystyle{e^{x_1y_1+x_2y_2}>e^{x_1y_2+x_2y_1}}\,$, which after taking logarithms is equivalent to $\,(x_1-x_2)(y_1-y_2)>0\,$: check... etc. (this looks really awful)

2) For any nonzero $\,X:=(v_1,v_2,...,v_n)\in\mathbb{R}^n\,$, show that $\,X^TAX>0\,$, with $\,A:=\left(e^{x_iy_j}\right)\,$; this would make the matrix positive definite and thus its determinant positive (this looks slightly better...). A quick numerical check of the positivity claim is sketched below.
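Neither idea is a proof, but the underlying claim is easy to test numerically. A minimal sketch (my own addition, using NumPy; it is only a floating-point spot check, not an argument) that looks for counterexamples to $\det\left(e^{x_iy_j}\right)>0$ on random strictly increasing inputs:

```python
import numpy as np

# Spot-check det(exp(x_i * y_j)) > 0 for strictly increasing x and y.
rng = np.random.default_rng(0)
violations = 0
for _ in range(1000):
    n = int(rng.integers(2, 7))        # keep n small to limit round-off issues
    x = np.sort(rng.normal(size=n))    # strictly increasing almost surely
    y = np.sort(rng.normal(size=n))
    M = np.exp(np.outer(x, y))         # M[i, j] = exp(x_i * y_j)
    if np.linalg.det(M) <= 0:          # may also flag nearly singular cases lost to round-off
        violations += 1
print(f"{violations} apparent violations out of 1000 random trials")
```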