Can we show that the determinant of this matrix is non-zero?

$\newcommand{\R}{{\mathbb R}}$ This is a partial answer to the question. We show

  • for any real analytic function $f:[0,\delta[\to\R$, $\delta>0$ that is not a polynomial and any positive integer $n$, the determinant of $$M_n(f)= \begin{bmatrix} f(x) & f(2x) & \dots & f(nx)\\ f(2x) & f(4x) & \dots & f(2nx)\\ \vdots & \vdots & \dots & \vdots\\ f(nx) & f(2nx) & \dots & f(n^2x) \end{bmatrix}$$ does not vanish for sufficiently small $x>0$.
  • A polynomial $f$ with $f(0)=0$ that is strictly increasing does not satisfy condition 2 of the question in the case of positive quadruples $a,b,c,d>0$.

Thus the answer to the question is yes for real analytic functions.
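For illustration only (not part of the proof), here is a minimal numerical sketch of the first item, assuming the concrete non-polynomial analytic example $f(x)=e^x-1$: the determinants $\det M_n(f)$ at a smallish $x>0$ all come out non-zero.

```python
# Sketch (assumed example): f(x) = exp(x) - 1 is analytic, not a polynomial, and f(0) = 0.
import numpy as np

def M(f, n, x):
    # the matrix [f(i*j*x)] for i, j = 1, ..., n
    return np.array([[f(i * j * x) for j in range(1, n + 1)] for i in range(1, n + 1)])

f = np.expm1
for n in (2, 3, 4):
    print(n, np.linalg.det(M(f, n, 0.1)))   # all clearly non-zero
```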

Observe that the function of my comment is a polynomial on $]-\infty,0]$ with two non-vanishing terms ($f(x)=x-x^2$ for $x\leq0$) and hence, as will be shown, $\det M_n(f)=0$ for all $n\geq3$ and $x<0$.

Remark: It is an interesting problem to characterize the set of functions satisfying the conditions of the question, i.e. the set of all continuous, strictly increasing functions $f:\R\to\R$ with $f(0)=0$ such that for all $a,b,c,d\neq0$ with $ab=cd$ and $a+b>c+d$ we have $f(a)f(b)<f(c)f(d)$.

Proof of the first item: For $n=1$ there is nothing to show. For $n=2$, this has been shown in the question without using real analyticity. So consider any integer $n\geq 3$ and a non-polynomial real analytic function $f:[0,\delta[\to\R$. Take the first $n$ non-vanishing terms of the Taylor expansion of $f$ at the origin; there are infinitely many such terms, since otherwise $f$ would be a polynomial. We write $$f(x)=a_1x^{m_1}+...+a_nx^{m_n}+ O(|x|^{m_n+1})$$ with $a_1,...,a_n\neq0$ and integers $1\leq m_1<m_2<\cdots<m_n$, where $O(|x|^{m_n+1})$ contains the remainder of the Taylor series of $f$ and can be estimated by a constant times $|x|^{m_n+1}$.
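For a concrete example (the one reused in the numerical sketches below), the first three non-vanishing Taylor terms of $f(x)=e^x-1$ give $m_1,m_2,m_3=1,2,3$ and $a_1,a_2,a_3=1,\tfrac12,\tfrac16$; a quick symbolic check:

```python
# Sketch: first non-vanishing Taylor terms of the assumed example f(x) = exp(x) - 1 at 0.
import sympy as sp

x = sp.symbols('x')
print(sp.series(sp.exp(x) - 1, x, 0, 4))   # x + x**2/2 + x**3/6 + O(x**4)
```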

With $g(x)=a_1x^{m_1}+...+a_nx^{m_n}$ we can write $M_n(f)=M_n(g)+O(|x|^{m_n+1})$ where $O(...)$ denotes a certain $n$ by $n$ matrix whose entries are power series divisible by $x^{m_n+1}$. Now we factor \begin{eqnarray*}M_n(g)&=&\begin{bmatrix} 1^{m_1}& 1^{m_2} & \dots & 1^{m_n}\\ 2^{m_1} & 2^{m_2} & \dots & 2^{m_n}\\ \vdots & \vdots & \dots & \vdots\\ n^{m_1} & n^{m_2} & \dots & n^{m_n} \end{bmatrix} \begin{bmatrix}a_1x^{m_1}&0&\dots&0\\ 0&a_2x^{m_2}&\dots&0\\ \vdots& &\ddots&\vdots\\ 0&\dots&0&a_nx^{m_n} \end{bmatrix} \begin{bmatrix} 1^{m_1}& 2^{m_1} & \dots & n^{m_1}\\ 1^{m_2} & 2^{m_2} & \dots & n^{m_2}\\ \vdots & \vdots & \dots & \vdots\\ 1^{m_n} & 2^{m_n} & \dots & n^{m_n} \end{bmatrix}\\ &=&V_n ^T\, \mbox{diag}(a_1x^{m_1},\dots,a_nx^{m_n})\,V_n \end{eqnarray*} with a generalized Vandermonde matrix $$V_n=V_n(m_1,\dots,m_n)=\begin{bmatrix} 1^{m_1}& 2^{m_1} & \dots & n^{m_1}\\ 1^{m_2} & 2^{m_2} & \dots & n^{m_2}\\ \vdots & \vdots & \dots & \vdots\\ 1^{m_n} & 2^{m_n} & \dots & n^{m_n} \end{bmatrix}.$$ We will show later that this matrix is invertible for all $n$, $1\leq m_1<m_2<\dots<m_n$.
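Here is a small symbolic sketch of this factorization for $n=3$, assuming the sample exponents $m=(1,3,4)$ (any admissible choice of exponents would do):

```python
# Sketch: verify M_3(g) = V_3^T * diag(a_i x^{m_i}) * V_3 for the assumed exponents m = (1, 3, 4).
import sympy as sp

x, a1, a2, a3 = sp.symbols('x a1 a2 a3')
m = (1, 3, 4)
a = (a1, a2, a3)
g = lambda t: sum(ai * t**mi for ai, mi in zip(a, m))

n = 3
M = sp.Matrix(n, n, lambda i, j: g((i + 1) * (j + 1) * x))   # M_n(g), indices shifted to 0-based
V = sp.Matrix(n, n, lambda i, j: (j + 1)**m[i])              # generalized Vandermonde V_n
D = sp.diag(*[ai * x**mi for ai, mi in zip(a, m)])

print((M - V.T * D * V).applyfunc(sp.expand))                # the zero matrix
```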

As $V_n$ is a constant invertible matrix, we have $$M_n(f)=V_n^T\left(\mbox{diag}(a_1x^{m_1},\dots,a_nx^{m_n})+O(|x|^{m_n+1})\right)V_n.$$ Here again $O(...)$ denotes a certain $n$ by $n$ matrix whose entries are power series divisible by $x^{m_n+1}$, not necessarily the same one as above. This implies that \begin{eqnarray*}\det M_n(f)&=&(\det V_n)^2 \det\left(\mbox{diag}(a_1x^{m_1},\dots,a_nx^{m_n})+O(|x|^{m_n+1})\right)\\ &=&(\det V_n)^2a_1\cdot\dots\cdot a_n x^{m_1+\cdots+m_n}+O(|x|^{m_1+\cdots+m_n+1}). \end{eqnarray*} Here we used that every product in the expansion formula of the last determinant except that of the diagonal elements is divisible by $x^{m_1+\cdots+m_n+1}$.

This implies that $\det M_n(f)$ does not vanish for sufficiently small $x>0$.
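As a numerical sanity check of the leading-order formula (a sketch, again with the assumed example $f(x)=e^x-1$ and $n=3$): here $m=(1,2,3)$, $a=(1,\tfrac12,\tfrac16)$ and $\det V_3(1,2,3)=12$, so $\det M_3(f)/x^{6}$ should approach $12^2\cdot\tfrac1{12}=12$ as $x\to0^+$.

```python
# Sketch: det M_3(f) / x^6 should approach (det V_3)^2 * a1*a2*a3 = 12 as x -> 0+.
import numpy as np

f = np.expm1
for x in (0.1, 0.05, 0.02, 0.01):
    M = np.array([[f(i * j * x) for j in (1, 2, 3)] for i in (1, 2, 3)])
    print(x, np.linalg.det(M) / x**6)       # decreases toward 12
```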

Observe that the above formulas remain valid if $f$ is a polynomial with fewer than $n$ non-vanishing terms, provided we allow $a_j=\dots=a_n=0$ for an appropriate $j\leq n$ and replace all $O(...)$ terms by $0$. In this case the diagonal matrix is singular, so $\det M_n(f)\equiv0$.
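A quick symbolic check of this degenerate case (a sketch, using the two-term polynomial $f(x)=x-x^2$ from the comment mentioned above):

```python
# Sketch: for the two-term polynomial f(x) = x - x**2, det M_n(f) vanishes identically for n >= 3.
import sympy as sp

x = sp.symbols('x')
f = lambda t: t - t**2
for n in (3, 4):
    M = sp.Matrix(n, n, lambda i, j: f((i + 1) * (j + 1) * x))
    print(n, sp.expand(M.det()))            # 0 in both cases
```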

It remains to show that $\det V_n(m_1,\dots,m_n)$ does not vanish. We show

Claim 1: $\det V_n(m_1,\dots,m_n)>0$ for all integers $n>0$ and $0\leq m_1<...<m_n$.

In the case $m_i=i-1$ we have the classical Vandermonde determinant, which is well known to be positive. For general exponents, the quotient $$\det V_n(m_1,\dots,m_n)/\det V_n(0,\dots,n-1)=s_{(m_n-n+1,m_{n-1}-n+2,\dots,m_1)}(1,2,\dots,n)$$ is a Schur polynomial evaluated at $x_i=i$, $i=1,\dots,n$ (the bialternant formula). Schur polynomials have non-negative coefficients and are not identically zero, so this evaluation at positive arguments is positive and our Claim follows.
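The following sketch spot-checks Claim 1 in exact arithmetic for a batch of randomly chosen exponent tuples (the bound $12$ on the exponents and the sizes $n\leq5$ are arbitrary choices for the test):

```python
# Sketch: exact-arithmetic spot check of Claim 1 for random exponent tuples 0 <= m_1 < ... < m_n.
import random
import sympy as sp

def detV(ms):
    n = len(ms)
    return sp.Matrix(n, n, lambda i, j: (j + 1)**ms[i]).det()

random.seed(0)
for _ in range(100):
    n = random.randint(1, 5)
    ms = sorted(random.sample(range(12), n))
    assert detV(ms) > 0
print("all positive")
```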

For completeness, we give here a proof by induction. For $n=1$, there is nothing to show; for $n=2$, the Claim follows from the determinant formula.

Now suppose the Claim is true for all $\det V_{n-1}(m_1,\dots,m_{n-1})$, $0\leq m_1<...<m_{n-1}$. We have to show the Claim for determinants of size $n$ by $n$. First we modify $\det V_n(m_1,\dots,m_n)$ in the following way: Subtract $n^{m_{i+1}-m_i}$ times row $i$ from row $i+1$, in the order $i=n-1,n-2,\dots,1$. In the resulting matrix only the first entry of the last column is non-zero (namely $n^{m_1}$), so expanding along the last column we obtain $\det V_n(m_1,\dots,m_n)=(-1)^{n+1}n^{m_1}\det W$, where $$W=\begin{bmatrix} 1^{m_2}-n^{m_2-m_1}1^{m_1} &2^{m_2}-n^{m_2-m_1}2^{m_1}&\dots& (n-1)^{m_2}-n^{m_2-m_1}(n-1)^{m_1}\\ 1^{m_3}-n^{m_3-m_2}1^{m_2} &2^{m_3}-n^{m_3-m_2}2^{m_2}&\dots& (n-1)^{m_3}-n^{m_3-m_2}(n-1)^{m_2}\\ \vdots&&&\vdots\\ 1^{m_n}-n^{m_n-m_{n-1}}1^{m_{n-1}} &2^{m_n}-n^{m_n-m_{n-1}}2^{m_{n-1}}&\dots& (n-1)^{m_n}-n^{m_n-m_{n-1}}(n-1)^{m_{n-1}} \end{bmatrix} $$ Every entry of column $j$ of $W$ is divisible by $n-j$; more precisely, $W_{i,j}=-(n-j)\,z_{i,j}$ with $$z_{i,j}=\sum_{k=m_i}^{m_{i+1}-1} j^k n^{m_{i+1}-k-1}.$$ Extracting the factor $-(n-j)$ from each of the $n-1$ columns (the two sign factors $(-1)^{n+1}$ and $(-1)^{n-1}$ cancel), we obtain $\det V_n(m_1,\dots,m_n)=n^{m_1}(n-1)!\det Z$ with $Z=[ z_{i,j}]_{i,j=1}^{n-1}$. Viewing row $i$ of $Z$ as a linear combination of the rows $(1^k,2^k,\dots,(n-1)^k)$ appearing in certain matrices $V_{n-1}$, and using the multilinearity of the determinant in its rows, we find that $$\det V_n(m_1,\dots,m_n)=n^{m_1}(n-1)!\sum_{k_1=m_1}^{m_2-1}n^{m_2-k_1-1}\cdots \sum_{k_{n-1}=m_{n-1}}^{m_n-1}n^{m_n-k_{n-1}-1} \det V_{n-1}(k_1,\dots,k_{n-1}).$$ Observe that $0\leq k_1<\dots<k_{n-1}$ in all cases, so every summand is positive by the induction hypothesis. This proves that $\det V_n(m_1,\dots,m_n)>0$, and the proof of the first item is complete.
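The recursion derived above can also be spot-checked in exact arithmetic; here is a sketch for $n=3$ with a few arbitrarily chosen exponent triples:

```python
# Sketch: check det V_n(m) = n^{m_1} (n-1)! * sum of weighted det V_{n-1}(k_1,...,k_{n-1}) for n = 3.
from itertools import product
from math import factorial, prod
import sympy as sp

def detV(ms):
    n = len(ms)
    return sp.Matrix(n, n, lambda i, j: (j + 1)**ms[i]).det()

def rhs(ms):
    n = len(ms)
    ranges = [range(ms[i], ms[i + 1]) for i in range(n - 1)]   # k_i runs from m_i to m_{i+1}-1
    total = 0
    for ks in product(*ranges):
        weight = prod(n**(ms[i + 1] - ks[i] - 1) for i in range(n - 1))
        total += weight * detV(list(ks))
    return n**ms[0] * factorial(n - 1) * total

for ms in [(1, 2, 3), (1, 3, 4), (0, 2, 5)]:
    print(ms, detV(list(ms)), rhs(list(ms)))                   # the two values agree
```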

Proof of the second item. Consider first any continuously differentiable function $f:[0,\infty[\to\R$ that is strictly increasing and satisfies $f(0)=0$. We will use the auxiliary function $q(x)=x\,f'(x)/f(x)$, defined for $x>0$. Note that $q$ takes only non-negative values and is positive on a dense subset of $]0,\infty[$ (since $f$ is strictly increasing, $f'\geq0$ and $f'$ cannot vanish on any interval).

Claim 2: If $f$ satisfies condition 2 of the question for positive $a,b,c,d$ then $q$ is strictly decreasing on $]0,\infty[$.

Proof: Choose any positive $p$. The function $S:]0,\sqrt p]\to\R$, $S(t)=t+p/t$, is strictly decreasing, so for all $a,c$ with $0<a<c\leq\sqrt p$ we have $a+p/a>c+p/c$. Since moreover $a\cdot(p/a)=c\cdot(p/c)=p$, condition 2 yields $f(a)f(p/a)<f(c)f(p/c)$, that is, $$F(a)<F(c)\mbox{ for the function }F(t)=\log f(t)+\log f(p/t)\mbox{ and all }0<a<c\leq\sqrt p.$$ This means that $F$ is strictly increasing on $]0,\sqrt p]$. Hence its derivative is non-negative and cannot vanish on any interval, so $F'(t)>0$ on a dense subset of $]0,\sqrt p]$. Now $$t\,F'(t)=\frac{t\,f'(t)}{f(t)}-\frac{f'(p/t)}{f(p/t)}\frac{p}{t}=q(t)-q(p/t),$$ so $q(p/t)<q(t)$ for every positive $p$ and every $t$ in a dense subset of $]0,\sqrt p]$.
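The differentiation step can be confirmed symbolically; here is a sketch with the concrete (assumed) choice $f(u)=u\,e^{u}$, for which $q(u)=1+u$ (any smooth increasing $f$ with $f(0)=0$ would do).

```python
# Sketch: check t*F'(t) = q(t) - q(p/t) for F(t) = log f(t) + log f(p/t), with f(u) = u*exp(u).
import sympy as sp

t, p, u = sp.symbols('t p u', positive=True)
f = lambda v: v * sp.exp(v)
q = lambda v: (u * sp.diff(f(u), u) / f(u)).subs(u, v)     # q(v) = v f'(v)/f(v)

F = sp.log(f(t)) + sp.log(f(p / t))
print(sp.simplify(t * sp.diff(F, t) - (q(t) - q(p / t))))  # 0
```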

Now let $0<x<y$ and put $p=xy$, so that $x<\sqrt p<y$. Since $q$ is continuous on $]0,\infty[$, the inequality $q(p/t)<q(t)$, valid for $t$ in a dense subset of $]0,\sqrt p]$, extends to $q(p/t)\leq q(t)$ for all $t\in\,]0,\sqrt p]$; taking $t=x$ gives $q(y)\leq q(x)$, and since $x<y$ were arbitrary, $q$ is non-increasing. Moreover, choosing $t$ in the dense subset with $x<t<\sqrt p$ gives $q(x)\geq q(t)>q(p/t)\geq q(p/x)=q(y)$. Hence $q(x)=x\,f'(x)/f(x)$ is strictly decreasing on $]0,\infty[$ and the proof of the Claim is complete.

Suppose now additionally that $f$ is a polynomial. Write its non-vanishing terms as $$f(x)=a_1x^{m_1}+...+a_nx^{m_n}$$ with $a_1,...,a_n\neq0$ and integers $1\leq m_1<\cdots<m_n$ (in the case $n=1$ there is only $m_1$). Then $$\lim_{x\to0^+}\frac{x\,f'(x)}{f(x)}=m_1\mbox{ and } \lim_{x\to\infty}\frac{x\,f'(x)}{f(x)}=m_n.$$ If $q$ were strictly decreasing on $]0,\infty[$, the first of these limits would have to be strictly larger than the second; but $m_1\leq m_n$, so $q(x)=\frac{x\,f'(x)}{f(x)}$ cannot be strictly decreasing on $]0,\infty[$. Hence by Claim 2, no strictly increasing polynomial $f$ with $f(0)=0$ can satisfy condition 2 of the question for positive $a,b,c,d$. This completes the proof of the second item.
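Finally, a sketch illustrating this obstruction with the assumed example $f(x)=x+x^{3}$ (strictly increasing, $f(0)=0$): $q$ increases from $m_1=1$ to $m_n=3$.

```python
# Sketch: for f(x) = x + x**3, q(x) = x f'(x)/f(x) has limits 1 at 0+ and 3 at infinity,
# so it is certainly not strictly decreasing.
import sympy as sp

x = sp.symbols('x', positive=True)
f = x + x**3
q = sp.simplify(x * sp.diff(f, x) / f)
print(q, sp.limit(q, x, 0), sp.limit(q, x, sp.oo))   # limits are 1 and 3
```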