Waves of differing frequency are orthogonal - help me understand

I know that sinusoidal waves of different frequencies are orthogonal to each other. For instance:

# Shows that 1 Hz and 2 Hz waves are orthogonal
import numpy
x = numpy.linspace(0, 1, 1000, endpoint=False)  # one full period, evenly sampled
wave_1hz = numpy.sin(1 * 2*numpy.pi*x)
wave_2hz = numpy.sin(2 * 2*numpy.pi*x)
print(numpy.dot(wave_1hz, wave_2hz))
# Prints a value very near 0, showing the waves are orthogonal

I am wondering if someone can give me an analogy to help me understand "why" waves of different frequencies are orthogonal.

To give a better idea of what I am looking for: I intuitively understand that if you have two $\Bbb R^2$ vectors at a right angle, and you view the dot product as projecting one onto the other, the result will be 0 (like shining a flashlight straight down on a vertical pole). This helps me understand what orthogonality means in the context of $\Bbb R^2$ vectors, but I don't have any such analogy for waves of different frequencies.


Solution 1:

Orthogonality in this context means using an inner product like $$\langle\phi_1,\phi_2\rangle = \int_0^{2\pi} \phi_1(x)\phi_2(x)\ dx.$$ This inner product measures scalar projections by averaging two functions together.
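Here is a minimal numerical sketch of that inner product, approximating the integral with scipy.integrate.quad (the helper name inner is mine, not standard):

# A numerical version of the inner product above
import numpy
from scipy.integrate import quad

def inner(phi1, phi2):
    """<phi1, phi2>: integrate phi1(x) * phi2(x) over [0, 2*pi]."""
    value, _ = quad(lambda x: phi1(x) * phi2(x), 0, 2 * numpy.pi)
    return value

print(inner(numpy.sin, numpy.sin))                   # ~3.14159 (= pi): sin is not orthogonal to itself
print(inner(numpy.sin, lambda x: numpy.sin(2 * x)))  # ~0: different frequencies are orthogonal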

So let's look at the integral of the product of two sine curves of differing frequency. Let's use $\phi_1 = \sin(x)$ and $\phi_2 = \sin(2 x)$. Note that the frequency of $\phi_1$ is $1$ and the frequency of $\phi_2$ is $2$.

The basic idea is that if the frequencies of the two sine curves are different, then between $0$ and $2\pi$ the two sine curves are of opposite sign as much as they are of the same sign:

[plot of the two sine curves, courtesy of WolframAlpha]

Thus their product will be positive as much as it is negative. In the integral, those positive contributions will exactly cancel the negative contributions, leading to an average of zero:

[plot of $\sin(x)\sin(2x)$]

That's the intuition. Proving it just takes a bit of Calc 2:

We know from trig that $\sin(mx)\sin(nx) = \frac{1}{2}\bigg(\cos\big((m-n)x\big) - \cos\big((m+n)x\big)\bigg)$, so here for $m=1$ and $n=2$, \begin{align*} \int_0^{2\pi}\sin(x)\sin(2x)\ dx &= \frac{1}{2}\int_0^{2\pi}\cos(-x)-\cos(3x)\ dx\\ &= \frac{1}{2}\bigg(\sin(x)\bigg|_0^{2\pi}-\frac{1}{3}\sin(3x)\bigg|_0^{2\pi}\bigg)\\ &= 0 \end{align*} since $\cos(-x) = \cos(x)$ and $\sin(2\pi k) = \sin(0) = 0$ for every integer $k$.
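If you'd rather let a computer algebra system do the Calc 2, here is a quick symbolic check (assuming sympy is available):

# Symbolic check of the integral above
import sympy

x = sympy.symbols('x')
print(sympy.integrate(sympy.sin(x) * sympy.sin(2 * x), (x, 0, 2 * sympy.pi)))  # prints 0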

I'll leave the general case of two sines/cosines of differing frequencies to you as an exercise.
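(If you want to spot-check your answer numerically first, here is a sketch; it is evidence, not a proof.)

# Numerical spot-check of the general case: <sin(mx), sin(nx)> on [0, 2*pi]
import numpy
from scipy.integrate import quad

for m in range(1, 4):
    for n in range(1, 4):
        value, _ = quad(lambda x: numpy.sin(m * x) * numpy.sin(n * x), 0, 2 * numpy.pi)
        print(m, n, round(value, 6))
# Entries with m != n come out ~0; entries with m == n come out ~pi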


More generally, functions out of some space into $\mathbb{R}$ form a vector space. They can be added, subtracted, and scaled. Thus you can do linear algebra to them.

In particular, you can decompose functions on $[0,2\pi]$ into sinusoidal components by averaging them with sine and cosine curves. This is exactly analogous to shining a flashlight on the function and seeing how much of its shadow projects onto the $\sin(x)$ vector; the projection is $$\frac{\langle f(x),\sin(x)\rangle}{\langle \sin(x),\sin(x)\rangle}\,\sin(x),$$ where dividing by $\langle \sin(x),\sin(x)\rangle = \pi$ compensates for the fact that $\sin(x)$ is not a unit vector.
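As a concrete sketch, here is that projection computed for the test function $f(x) = x$ (my choice of $f$ is arbitrary):

# Projecting f(x) = x onto sin(x) over [0, 2*pi]
import numpy
from scipy.integrate import quad

f = lambda x: x
coeff, _ = quad(lambda x: f(x) * numpy.sin(x), 0, 2 * numpy.pi)   # <f, sin>
norm_sq, _ = quad(lambda x: numpy.sin(x) ** 2, 0, 2 * numpy.pi)   # <sin, sin> = pi
print(coeff / norm_sq)  # ~ -2.0: the sin(x) component of f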

As we saw previously, sine and cosine curves of different frequencies are orthogonal to each other because they average against each other to zero. In fact, they form an orthogonal basis of the vector space of square-integrable functions on $[0,2\pi]$ (divide each by its length and you get an orthonormal basis). Every such function $f$ can be written as a sum of its projections onto these basis vectors: $$f(x) = \sum_{k=0}^\infty \frac{\langle f(x),\sin(kx)\rangle}{\lVert\sin(kx)\rVert^2}\sin(kx) + \frac{\langle f(x),\cos(kx)\rangle}{\lVert\cos(kx)\rVert^2}\cos(kx),$$ with the identically-zero $k=0$ sine term dropped. This is its Fourier series; the study of this decomposition is Fourier analysis.
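A minimal sketch of that decomposition in code, truncating the series after 19 terms (the test function $f(x) = x$ and the cutoff are my choices):

# Rebuilding f(x) = x on [0, 2*pi] from its projections onto sines and cosines
import numpy
from scipy.integrate import quad

f = lambda x: x
x = numpy.linspace(0, 2 * numpy.pi, 500)
approx = numpy.full_like(x, numpy.pi)      # k = 0 term: <f, 1> / ||1||^2 = pi
for k in range(1, 20):
    b, _ = quad(lambda t: f(t) * numpy.sin(k * t), 0, 2 * numpy.pi)
    a, _ = quad(lambda t: f(t) * numpy.cos(k * t), 0, 2 * numpy.pi)
    approx += (b / numpy.pi) * numpy.sin(k * x) + (a / numpy.pi) * numpy.cos(k * x)
# approx tracks f away from the jump at the endpoints, improving as terms are added
print(numpy.abs(f(x) - approx)[100:400].max())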


Here's one more neat trick. The second derivative $\frac{d^2}{dx^2} = \Delta$ is a linear operator on the vector space of functions on $[0,2\pi]$. If you integrate by parts twice (the boundary terms vanish for periodic functions), you can see that it's a symmetric linear operator, like a symmetric matrix. It turns out that the sines and cosines are eigenvectors of $\Delta$, a fact you can easily verify for yourself by differentiating.
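For instance, $$\Delta\sin(kx) = \frac{d}{dx}\,k\cos(kx) = -k^2\sin(kx),$$ so $\sin(kx)$ is an eigenvector of $\Delta$ with eigenvalue $-k^2$; the same computation works for $\cos(kx)$.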

Abstract fun fact: different-eigenvalue eigenvectors of a symmetric operator on any vector space with inner product are orthogonal, for if $v$ and $w$ are eigenvectors with eigenvalues $\lambda_v$ and $\lambda_w$, respectively, and $\lambda_v\neq \lambda_w$, \begin{align*} \lambda_v\langle v,w\rangle &= \langle \lambda_vv,w\rangle \\ &= \langle \Delta v,w\rangle \\ &= \langle v,\Delta w\rangle \\ &= \langle v,\lambda_w w\rangle\\ &= \lambda_w\langle v,w\rangle, \end{align*} so $\langle v,w\rangle = 0$.
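You can watch this fun fact happen in finite dimensions (the matrix and seed below are arbitrary):

# Eigenvectors of a symmetric matrix with distinct eigenvalues come out orthogonal
import numpy

rng = numpy.random.default_rng(0)
a = rng.standard_normal((4, 4))
sym = a + a.T                                # symmetrize, so the fun fact applies
eigvals, eigvecs = numpy.linalg.eig(sym)     # eig does not orthogonalize its output
print(numpy.round(eigvecs.T @ eigvecs, 8))   # ~ identity: pairwise orthogonal unit vectors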

Solution 2:

Vectors in $\mathbb{R}^2$

As you already know, we write a vector in $\mathbb{R}^2$ as a pair

$\textbf{v}=(v_1, v_2)$

where $v_1$ and $v_2$ are the components of the vector. We define the norm (length) of $\textbf{v}$ as

$\vert\vert \textbf{v} \vert\vert = \sqrt{v_1^2 + v_2^2}$

Now let's try to define orthogonality using the Pythagorean theorem.

Let $\textbf{u}$, $\textbf{v}$, and $\textbf{w}$ be vectors in $\mathbb{R}^2$ such that $\textbf{u} = (u_1, u_2)$, $\textbf{v}=(v_1, v_2)$, and $\textbf{w}=\textbf{u}+\textbf{v}$. If we demand that $\textbf{u}$, $\textbf{v}$, and $\textbf{w}$ form a right triangle with $\textbf{w}$ as the hypotenuse, then the Pythagorean theorem tells us that

$\vert \vert \textbf{w}\vert\vert^2 = \vert\vert \textbf{u}+\textbf{v}\vert\vert^2 = \vert\vert \textbf{u}\vert\vert^2+\vert\vert\textbf{v}\vert\vert^2$

$(u_1 +v_1)^2 + (u_2+v_2)^2 = (u_1^2+u_2^2)+(v_1^2+v_2^2)$

$(u_1^2+2 u_1 v_1 +v_1^2)+(u_2^2+ 2u_2v_2 +v_2^2) = (u_1^2+u_2^2)+(v_1^2+v_2^2)$

After some cancellation we arrive at

$\begin{equation}u_1 v_1 + u_2 v_2 = 0\end{equation} ~~~~~~~~(*)$

So, for two vectors to be orthogonal they must satisfy this condition; the quantity on the left is what you may know as the dot product or inner product:

$\textbf{u} \cdot \textbf{v} = u_1 v_1 + u_2 v_2 $.
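As a quick sanity check of $(*)$ with a concrete pair (my choice of vectors is arbitrary):

# A concrete pair satisfying (*), checked with numpy
import numpy

u = numpy.array([3.0, 1.0])
v = numpy.array([-1.0, 3.0])                 # u1*v1 + u2*v2 = -3 + 3 = 0
print(numpy.dot(u, v))                       # 0.0: condition (*) holds
print(numpy.linalg.norm(u + v) ** 2)         # 20.0
print(numpy.linalg.norm(u) ** 2 + numpy.linalg.norm(v) ** 2)  # 20.0: Pythagoras holds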

There's more to say about this, but let's not.

Functions in $L^2([0, 1])$

Now you can apply the same ideas to functions. So to say that two functions are orthogonal means that their norms satisfy the Pythagorean theorem.

We define the $L^2$ norm as

$|| f || = \left(\int\limits_0^1 | f(t)|^2\, dt\right)^{\frac{1}{2}}$

so the Pythagorean theorem, for real functions $f$ and $g$, is now

$|| f + g ||^2 = ||f||^2 +||g||^2$

$\int\limits_0^1 ( f(t)+g(t))^2 dt = \int\limits_0^1 f(t)^2 dt +\int\limits_0^1 g(t)^2 dt$

$\int\limits_0^1 ( f(t)^2 + 2f(t) g(t) +g(t)^2) dt = \int\limits_0^1 f(t)^2 dt +\int\limits_0^1 g(t)^2 dt$

which after some cancellation gives

$\int\limits_0^1 f(t) g(t) dt =0 ~~~~~~~~(**)$

So, for two functions to be orthogonal in $L^2([0,1])$ they must satisfy this condition. As we did with vectors in $\mathbb{R}^2$, we will now do with functions in $L^2([0,1])$ and define the inner product for real functions $f$ and $g$ in $L^2([0,1])$ as

$(f, g) = \int\limits_0^1 f(t) g(t) dt$
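Tying this back to the code in the question, here is a small numerical sketch of $(**)$ for the 1 Hz and 2 Hz waves (the helper norm_sq is my own name):

# Checking (**) and the Pythagorean theorem for the waves from the question
import numpy
from scipy.integrate import quad

f = lambda t: numpy.sin(2 * numpy.pi * t)    # 1 Hz on [0, 1]
g = lambda t: numpy.sin(4 * numpy.pi * t)    # 2 Hz on [0, 1]

inner, _ = quad(lambda t: f(t) * g(t), 0, 1)
print(inner)                                 # ~0: condition (**) holds

norm_sq = lambda h: quad(lambda t: h(t) ** 2, 0, 1)[0]
print(norm_sq(lambda t: f(t) + g(t)))        # 1.0
print(norm_sq(f) + norm_sq(g))               # 1.0: equal, so Pythagoras holds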

Comments

In conclusion, two vectors are orthogonal if their inner product is zero, or equivalently, when the Pythagorean theorem is satisfied. Should you have a mental picture of what it means for two sine functions to be orthogonal? I don't know; I don't have one. I guess you could think of it as the total signed area on the interval $[0, 1]$ of the product of the two functions being zero.