Average Distance Between Zeroes of $\sin(x)+\sin(x\sqrt{2})+\sin(x\sqrt{3})$

QUESTION: What is the average distance between the consecutive real zeroes of the function $$f(x)=\sin(x)+\sin(x\sqrt{2})+\sin(x\sqrt{3})$$ or, more specifically, if $z(x)$ is defined as the number of zeroes $\zeta$ satisfying $|\zeta|<x$, what is the value of $$\lim_{x\to\infty} \frac{2x}{z(x)}=?$$

Here’s some context. I’ve been studying sums of sinusoids with “mutually irrational” periods, such that the sum of the sinusoids is not actually a periodic function. For example, the function $$\sin(x)+\sin(x\sqrt{2})$$ is not periodic, because $\sqrt{2}$ is irrational. In particular, I’ve been looking at the asymptotic distribution of solutions $x$ to equations of the form $$\sin(x)+\sin(\tau x)=\alpha$$ where $\tau \notin \mathbb Q$ and $|\alpha|<2$. I’ve actually come up with a formula for the average distance between the solutions to the above equation along the real line, but it’s messy, so I won’t type it out unless someone cares enough to ask for it. The case of $\alpha = 0$ is almost trivial though, and can be figured out with an easy trig identity.

However, when dealing with three summed sinusoids, the case of $\alpha = 0$ is no longer trivial. For two sinusoids, $$\sin(x)+\sin(\tau x)=2\sin\bigg(\frac{\tau+1}{2}x\bigg)\cos\bigg(\frac{\tau - 1}{2}x\bigg)$$ so we can easily calculate the explicit values of the zeroes. But for three sinusoids with mutually irrational periods, so that $\tau_1, \tau_2, \tau_1/\tau_2 \notin\mathbb Q$, $$\sin(x)+\sin(\tau_1 x)+\sin(\tau_2 x)$$ I haven’t been able to come up with any explicit formula for the zeroes, or even an asymptotic density of (or average distance between) the zeroes.
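To spell out the easy two-sinusoid case: the product form makes the zeroes explicit, $$ x = \frac{2k\pi}{\tau+1} \quad\text{or}\quad x = \frac{(2k+1)\pi}{\tau-1}, \qquad k \in \mathbb Z, $$ and for irrational $\tau > 1$ the two families never coincide, so the asymptotic density of zeroes is $\frac{\tau+1}{2\pi} + \frac{\tau-1}{2\pi} = \frac{\tau}{\pi}$ and the average spacing is $\pi/\tau$.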

Can anyone figure out how to work out this problem for the specific case of $\tau_1 = \sqrt{2}$, $\tau_2 =\sqrt{3}$?


Solution 1:

Partial answer

I thought of a way to transform the problem into a double integral. I don't prove every step, so I can't say I'm 100% sure this is right. I'm pretty confident that this approach works, but let me know if I made a mistake.

I'm gonna add cosines together instead of sines. As far as counting zeroes goes it comes down to the same thing, but cosine is a little easier to work with because it's an even function.

Let $n \ge 2$ be the number of cosine functions we're adding together and let $\tau$ be an $n$-dimensional vector containing the rationally independent positive coefficients. We define: $$ \begin{align} C &= [-\pi, \pi]^n && \text{($n$-dimensional hypercube)} \\ S &= \left\{x \in C\ \middle|\ \sum_{i=1}^n \cos(x_i) = 0\right\} && \text{($(n{-}1)$-dimensional surface)} \\ g(x) &= \sum_{i=1}^n \cos(\tau_i x) \\ l_i(x) &= ((\tau_i x + \pi) \operatorname{mod} 2\pi) - \pi && \text{(line through $C$)} \end{align} $$ For $n=2$ and $n=3$, $S$ looks like this:

[Figure: the surface $S$ for $n = 2$ and $n = 3$]
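For reference, pictures like these can be reproduced in Mathematica with something along the lines of (just a sketch using the built-in contour plotters):

(* the curve/surface S: sum of cosines equal to zero, inside [-Pi, Pi]^n *)
ContourPlot[Cos[x] + Cos[y] == 0, {x, -Pi, Pi}, {y, -Pi, Pi}]
ContourPlot3D[Cos[x] + Cos[y] + Cos[z] == 0, {x, -Pi, Pi}, {y, -Pi, Pi}, {z, -Pi, Pi}]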

The function $l(x)$ is a line that starts at the origin and goes in direction $\tau$. Whenever it reaches a face of $C$, it wraps around and comes back in through the opposite face.

Now $g(x) = 0$ whenever $l(x) \in S$. So to count the zeroes we can follow the line $l(x)$ and see how often it crosses the surface $S$.

Because of the rational independence, the line will travel through each part of $C$ equally often; this is essentially Weyl's equidistribution theorem for a linear flow on the torus. Therefore we can integrate over the surface $S$ to calculate how often $S$ is crossed.
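The equidistribution is easy to see numerically. Here is a small Mathematica sketch (variable names are mine) that plots the wrapped line for $\tau = (1, \sqrt 2, \sqrt 3)$; the sampled points gradually fill the whole cube:

(* sample the wrapped line l(x) = ((tau x + Pi) mod 2 Pi) - Pi and plot the points *)
tau = {1, Sqrt[2], Sqrt[3]};
pts = Table[Mod[tau x + Pi, 2 Pi] - Pi, {x, 0., 2000., 0.1}];
ListPointPlot3D[pts, BoxRatios -> {1, 1, 1}]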

I figured out the following formula for calculating the frequency $f$; the average distance between zeroes is then $1/f$. The function $p(x)$ gives one of the two possible unit normal vectors on the surface $S$ at $x$, and the dot represents the dot product of two vectors. $$ f = \frac{1}{(2\pi)^n} \int_S |p(x) \cdot\tau |\ \mathrm{d}x \label{surfaceint}\tag{1} $$ For $n=2$, the surface $S$ consists of four line segments, with total length $2\pi\sqrt{2}$ for each of the two unit normal directions, so this gives: $$ \begin{align} f_2 &= \frac{1}{(2\pi)^2} \cdot 2 \pi \sqrt{2} \cdot \left(\left| \left[\begin{smallmatrix}\frac12 \sqrt{2} \\ \frac12 \sqrt{2}\end{smallmatrix}\right] \cdot \tau \right| + \left| \left[\begin{smallmatrix}\frac12 \sqrt{2} \\ -\frac12 \sqrt{2}\end{smallmatrix}\right] \cdot \tau \right| \right) \\ &= \frac{1}{(2\pi)^2} \cdot 2 \pi \cdot (|\tau_1 + \tau_2| + |\tau_1 - \tau_2|) \\ &= \max(\tau_1, \tau_2) / \pi \end{align} $$ where the last step uses $|a+b| + |a-b| = 2\max(a,b)$ for $a, b > 0$. For higher $n$, the surface $S$ is more complex and this isn't as easy. Because $S$ is mirror symmetric, we can make things easier for ourselves by integrating only over the positive part of $S$, but then we do have to take the different normals into account.
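As a sanity check of $\eqref{surfaceint}$ for $n = 2$, one can count sign changes of $\cos(x) + \cos(x\sqrt 2)$ directly and compare with $\max(\tau_1, \tau_2)/\pi$. A rough Mathematica sketch (grid-based, so in principle it could miss near-tangent zeroes):

(* empirical zero frequency of cos(x) + cos(Sqrt[2] x) vs. the predicted Max(t1, t2)/Pi *)
g2[x_] := Cos[x] + Cos[Sqrt[2] x];
xmax = 10000.;
vals = Table[g2[x], {x, 0., xmax, 0.01}];
zeros = Count[Differences[Sign[vals]], Except[0]];  (* each sign change counts as one zero *)
Print[{N[zeros/xmax], N[Max[1, Sqrt[2]]/Pi]}];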

$$ \begin{align} R &= \{x \in S\ |\ \forall_i\ x_i \ge 0\} \\ I(x) &= \sum_{d \in \{-1, 1\}^n} \left| \sum_{i=1}^n d_i \cdot p_i(x) \cdot \tau_i \right| \\ f &= \frac{1}{(2\pi)^n} \int_R I(x)\ \mathrm{d}x \end{align} $$ For the normal vector $p(x)$ we can use the normalized gradient of $\sum_{i=1}^n \cos(x_i)$. I multiplied the whole thing by $-1$ to get a positive normal. $$ p_i(x) = \sin(x_i) / \sqrt{\sum_{j=1}^n \sin(x_j)^2} $$

The case $n=3$

Let $u$ be a vector of 3 non-negative elements. The following equation holds: $$ \sum_{d \in \{-1, 1\}^3} |d \cdot u| = 4 \max (2u_1, 2u_2, 2u_3, u_1+u_2+u_3) $$ If we combine that equation with the equation for the normal, we get: $$ I(x) = \frac{ 4 \max \left( \begin{array}{l} 2\sin(x_1) \tau_1, \\ 2\sin(x_2) \tau_2, \\ 2\sin(x_3) \tau_3, \\ \sin(x_1)\tau_1+\sin(x_2) \tau_2+ \sin(x_3) \tau_3 \end{array} \right) } { \sqrt{\sin(x_1)^2 + \sin(x_2)^2 + \sin(x_3)^2} } $$

To rewrite this as an ordinary two-dimensional integral instead of a surface integral, we first eliminate $x_3$: $$ \begin{align} x_3 &= \arccos(-\cos(x_1)-\cos(x_2)) \\ \sin(x_3) &= \sqrt{1 - (\cos(x_1)+\cos(x_2))^2} \\ I(x) &= \frac{ 4 \max \left( \begin{array}{l} 2\sin(x_1) \tau_1, \\ 2\sin(x_2) \tau_2, \\ 2 \tau_3\sqrt{1 - (\cos(x_1)+\cos(x_2))^2} , \\ \sin(x_1)\tau_1+\sin(x_2) \tau_2+ \tau_3\sqrt{1 - (\cos(x_1)+\cos(x_2))^2} \end{array} \right) } { \sqrt{\sin(x_1)^2 + \sin(x_2)^2 + 1 - (\cos(x_1)+\cos(x_2))^2} } \end{align} $$

Now we use the equation $$ \int_R I(x)\ \mathrm{d}x = \int_A I(x)J(x)\ \mathrm{d}x $$ where $$ \begin{align} A &= \left\{ \begin{bmatrix}1 & 0 & 0 \\ 0 & 1 & 0\end{bmatrix}x\ \middle|\ x \in R \right\} \\ J(x) &= \sqrt{\left(\frac{\partial x_3}{\partial x_1}\right)^2 + \left(\frac{\partial x_3}{\partial x_2}\right)^2 + 1} \\ &= \sqrt{\frac{\sin(x_1)^2+\sin(x_2)^2+1-(\cos(x_1)+\cos(x_2))^2}{1-(\cos(x_1)+\cos(x_2))^2}} \end{align} $$

Combining $I$ and $J$, we get: $$ I(x)J(x) = \frac{ 4 \max \left( \begin{array}{l} 2\sin(x_1) \tau_1, \\ 2\sin(x_2) \tau_2, \\ 2 \tau_3\sqrt{1 - (\cos(x_1)+\cos(x_2))^2} , \\ \sin(x_1)\tau_1+\sin(x_2) \tau_2+ \tau_3\sqrt{1 - (\cos(x_1)+\cos(x_2))^2} \end{array} \right) } { \sqrt{1-(\cos(x_1)+\cos(x_2))^2} } $$

So our new integral becomes (the two double integrals below are equal, because $I(x)J(x)$ and the domain are invariant under the symmetry $(x_1, x_2) \mapsto (\pi - x_1, \pi - x_2)$): $$ \begin{align} f_3 &= \frac{1}{(2\pi)^3} \int_A I(x)J(x)\ \mathrm{d}x \\ &= \frac{1}{(2\pi)^3} \left(\int_0^{\frac12\pi} \int_{\arccos(1-\cos(x_1))}^\pi I(x)J(x)\ \mathrm{d}x_2\ \mathrm{d}x_1 + \int_{\frac12\pi}^\pi \int_0^{\arccos(-1-\cos(x_1))} I(x)J(x)\ \mathrm{d}x_2\ \mathrm{d}x_1 \right) \\ &= \frac{1}{4\pi^3} \int_0^{\frac12\pi} \int_{\arccos(1-\cos(x_1))}^\pi I(x)J(x)\ \mathrm{d}x_2\ \mathrm{d}x_1 \end{align} $$

We can get rid of those nasty sines and cosines by integration by substitution. Replacing $x_2$ with $\arccos(v)$ gives: $$ \begin{align} H(x_1, v) &= \frac{ \max \left( \begin{array}{l} 2\sin(x_1) \tau_1, \\ 2\tau_2\sqrt{1-v^2}, \\ 2 \tau_3\sqrt{1 - (\cos(x_1)+v)^2} , \\ \sin(x_1)\tau_1+\tau_2\sqrt{1-v^2} + \tau_3\sqrt{1 - (\cos(x_1)+v)^2} \end{array} \right) } { \sqrt{1-v^2} \cdot \sqrt{1-(\cos(x_1)+v)^2} } \\ f_3 &= \frac{1}{\pi^3} \int_0^{\frac12\pi} \int_{-1}^{1-\cos(x_1)} H(x_1, v) \ \mathrm{d}v\ \mathrm{d}x_1 \end{align} $$ Then replacing $x_1$ with $\arccos(u)$ gives: $$ \begin{align} G(u, v) &= \frac{ \max \left( \begin{array}{l} 2\tau_1\sqrt{1-u^2} , \\ 2\tau_2\sqrt{1-v^2}, \\ 2 \tau_3\sqrt{1 - (u+v)^2} , \\ \tau_1\sqrt{1-u^2}+\tau_2\sqrt{1-v^2} + \tau_3\sqrt{1 - (u+v)^2} \end{array} \right) } { \sqrt{1-u^2} \cdot \sqrt{1-v^2} \cdot \sqrt{1-(u+v)^2} } \\ f_3 &= \frac{1}{\pi^3} \int_0^1 \int_{-1}^{1-u} G(u, v) \ \mathrm{d}v\ \mathrm{d}u \end{align} $$

One possible way to continue is to split this integral into four integrals, one for each of the arguments of $\max$. To do this we need to find the regions of $u$ and $v$ in which each argument is the maximum.
The conditions can be reduced to: $$ \begin{align} 1\colon&&\ \tau_1 \sqrt{1-u^2} &> \tau_2 \sqrt{1-v^2} + \tau_3 \sqrt{1-(u+v)^2} \\ 2\colon&&\ \tau_2 \sqrt{1-v^2} &> \tau_1 \sqrt{1-u^2} + \tau_3 \sqrt{1-(u+v)^2} \\ 3\colon&&\ \tau_3 \sqrt{1-(u+v)^2} &> \tau_1 \sqrt{1-u^2} + \tau_2 \sqrt{1-v^2} \\ 4\colon&&\ \text{otherwise} \end{align} $$
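Before attempting that split, the reduced double integral can be checked numerically. A minimal Mathematica sketch (names are mine; it assumes NIntegrate copes with the integrable $1/\sqrt{\cdot}$ singularities at the boundary, and it may warn about slow convergence):

(* numerically evaluate f3 = (1/Pi^3) * Integral of G(u, v) over 0 <= u <= 1, -1 <= v <= 1 - u *)
tau = {1, Sqrt[2], Sqrt[3]};
G[u_, v_] := Max[
    2 tau[[1]] Sqrt[1 - u^2],
    2 tau[[2]] Sqrt[1 - v^2],
    2 tau[[3]] Sqrt[1 - (u + v)^2],
    tau[[1]] Sqrt[1 - u^2] + tau[[2]] Sqrt[1 - v^2] + tau[[3]] Sqrt[1 - (u + v)^2]
   ]/(Sqrt[1 - u^2] Sqrt[1 - v^2] Sqrt[1 - (u + v)^2]);
f3 = NIntegrate[G[u, v], {u, 0, 1}, {v, -1, 1 - u}]/Pi^3;
Print["Average distance between zeroes: ", 1/f3];

If the reduction above is correct, this should agree with the surface-integral computation below.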

Numerical approximation

The following Mathematica code calculates an approximation to $\eqref{surfaceint}$. I tried to write it to work for any number of dimensions, but it only seems to work in exactly 3 dimensions (Mathematica 11.2).

(* frequency of zeroes of Sum_i cos(t[[i]] x), computed as the surface integral (1) *)
frequency[t_] := Module[{n, vars, x, r},
   n = Length[t];
   vars = Table[x[i], {i, n}];  (* integration variables x[1], ..., x[n] *)
   (* the surface S: Sum_i cos(x_i) = 0 inside the cube [-Pi, Pi]^n *)
   r = ImplicitRegion[Total[Cos[vars]] == 0, Evaluate[{#, -Pi, Pi}& /@ vars]];
   (* integrate |p(x) . tau| over S; Normalize[Sin[vars]] is the unit normal p(x) *)
   NIntegrate[Abs[t.Normalize[Sin[vars]]], vars \[Element] r] / (2Pi)^n
];
Print["Average distance between zeroes: ", 1 / frequency[{1, Sqrt[2], Sqrt[3]}]];

The code outputs $2.22465$. I don't know how many digits of that are right.
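As a rough cross-check that is independent of the surface-integral machinery, one can count sign changes of $\cos(x) + \cos(x\sqrt 2) + \cos(x\sqrt 3)$ on a long interval. Again just a grid-based sketch (names are mine), so near-tangent zeroes could in principle be missed:

(* brute-force count of sign changes of cos(x) + cos(Sqrt[2] x) + cos(Sqrt[3] x) on [0, X] *)
g[x_] := Cos[x] + Cos[Sqrt[2] x] + Cos[Sqrt[3] x];
X = 10000.;
vals = Table[g[x], {x, 0., X, 0.01}];
zeros = Count[Differences[Sign[vals]], Except[0]];  (* each sign change counts as one zero *)
Print["Average spacing from direct count: ", X/zeros];

If everything above is right, the two numbers should agree to roughly the accuracy of the grid.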