What is the norm measuring in function spaces

In ordinary Euclidean vector spaces, the norm is an intuitive concept: it measures the distance of a vector from the null vector and from other vectors.

The generalization to function spaces is quite a mental leap (at least for me).

My question:
Is there some kind of intuition for what "norm" or "Euclidean norm" is even supposed to "mean" here? What is the null vector, and what does a "distance" between two functions reveal?

Edit: Please also see this follow-up question: Visualization of 2-dimensional function spaces


Solution 1:

Since you used the word "distance", I am going to focus more on metric spaces rather than norms. But once you see what I'm driving at, the basic idea is the same.

In ordinary Euclidean space, there is an intuitive sense of "distance" between two points. This allows us to define, intuitively, a metric function, which takes as input two points in Euclidean space and outputs the "distance" between them.

Now, there is a lot we know about the Euclidean distance (the triangle inequality, etc.). In an effort to work out the logical relationships between the various properties of the Euclidean distance, we try to distill out the bare necessities among the properties of the metric function. In particular, we try to see how much of the usual Euclidean intuition can be removed while still leaving a logically consistent whole.

From this we were eventually led to the definition of a metric space as nothing more than a set $M$ and a distance function $d$, taking as input two points in $M$ and outputting a non-negative real number, with certain basic properties imposed on $d$.
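For reference, the basic properties required of $d$ are the standard metric axioms: for all $x, y, z \in M$,

$ d(x,y) \ge 0, \qquad d(x,y) = 0 \iff x = y, \qquad d(x,y) = d(y,x), \qquad d(x,z) \le d(x,y) + d(y,z). $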

You can think of this abstraction as a way to offer insight into when we can draw an analogy between whatever it is we are looking at and the distance function of Euclidean space. The abstraction says that "for all intents and purposes, the really useful properties of the Euclidean distance are captured in such and such properties." So whenever we are faced with a situation where we have a set $M$ and a function $d: M\times M\to\mathbb{R}$ with those properties, we can, by analogy, think of Euclidean space and its distance function.

What I am trying to say is that perhaps looking for the "meaning" of the norm-induced metric on a function space is going about it the wrong way. It is not because we intuit certain qualities of functions that remind us of the Euclidean distance that we consider these metrics. Rather, you should think of the norms and metrics as something we impose on the space of functions so that we can appeal to our intuitions about the Euclidean distance function by analogy.

Solution 2:

Let's start with the $L_{2}[0,1]$ norm.

Start with a single function $f$. The 2-norm of $f$ is

$ \| f \|_{2}=\sqrt{\int_{0}^{1} f(x)^2 dx} $
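For a quick concrete example, take $f(x) = x$ on $[0,1]$:

$ \| f \|_{2} = \sqrt{\int_{0}^{1} x^2 \, dx} = \sqrt{\tfrac{1}{3}} \approx 0.577. $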

Now, $\| f \|_2$ will be 0 only if the integral of $f(x)^2$ is 0. This certainly happens if $f(x)=0$ everywhere, but it can also happen when $f$ is nonzero only at some isolated points (for example, at isolated discontinuities). Thus the "null vector" in this space is the zero function, and all of the functions that are zero "almost everywhere" are considered equivalent to $f(x)=0$ with respect to this norm.

If you're familiar with electrical engineering, where $f(x)$ might be a time-varying voltage, then $\| f \|_2^{2}$ is essentially the energy associated with that signal.

Now, suppose that $f(x)$ and $g(x)$ are two functions defined on $[0,1]$, and consider the 2-norm of the difference:

$ \| f-g \|_{2}=\sqrt{\int_{0}^{1} (f(x)-g(x))^2 dx} $

If the functions $f$ and $g$ are very nearly identical on $[0,1]$, then this norm will be close to 0.

If you're familiar with electrical engineering terminology, you might have heard of the "root mean square (RMS)" difference between two signals. Since the interval $[0,1]$ has length 1, this norm is exactly the RMS difference between $f$ and $g$.
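As a rough numerical sketch of this idea (the particular functions f and g below are just made-up examples, and the grid-based RMS is only an approximation of the integral):

```python
import numpy as np

# Two illustrative functions on [0, 1]; g is a slightly perturbed copy of f.
def f(x):
    return np.sin(2 * np.pi * x)

def g(x):
    return np.sin(2 * np.pi * x) + 0.01 * x   # small perturbation of f

# Sample [0, 1] on a fine grid and approximate
#   ||f - g||_2 = sqrt( integral of (f - g)^2 over [0, 1] )
# by the root mean square of the sampled differences
# (on an interval of length 1, the mean approximates the integral).
x = np.linspace(0.0, 1.0, 100_001)
rms_difference = np.sqrt(np.mean((f(x) - g(x)) ** 2))

print(rms_difference)   # close to 0, since f and g nearly coincide on [0, 1]
```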

If $f(x)=g(x)$ except perhaps at some isolated points (for instance, at discontinuities), then the norm will actually be 0. Thus the $L_{2}$ distance between two functions is 0 if the functions are equal "almost everywhere."
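For example, if $g(x) = f(x)$ for every $x \neq \tfrac{1}{2}$ but $g(\tfrac{1}{2}) = f(\tfrac{1}{2}) + 1$, then $(f(x)-g(x))^2$ is nonzero only at the single point $x = \tfrac{1}{2}$, so the integral, and hence $\| f-g \|_{2}$, is 0 even though $f$ and $g$ are not identical.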