What is the intuition behind uniform continuity?
There’s another post asking for the motivation behind uniform continuity. I’m not a huge fan of it since the top-rated comment spoke about local and global interactions of information, and frankly I just did not get it.
Playing with the definition, I want to say uniform continuity implies there’s a maximum “average rate of change”: not literally a derivative, but the rate of change between any two points is bounded across the domain. I’m aware that this is essentially Lipschitz continuity, and that Lipschitz implies uniform continuity but not conversely, so there must be more to uniform continuity than just having a bounded average rate of change.
And also, how is it that $f(x)=x$ is uniformly continuous yet $g(x) = f(x)f(x) = x^2$ is not? I understand why it isn’t; I can prove it. But I just don’t understand the motivation and importance of uniform continuity.
Solution 1:
The real "gist" of continuity, in its various forms, is that it's the "property that makes calculators and measurements useful". Calculators and measurements are fundamentally approximate devices which contain limited amounts of precision. Special functions, like those which are put on the buttons of a calculator, then, if they are to be useful, should have with them some kind of "promise" that, if we only know the input to a limited amount of precision, then we will at least know the output to some useful level of precision as well.
Simple continuity is the weakest form of this. It tells us that if we want to know the value of a target function $f$ to within some tolerance $\epsilon$ at a target value $x$, using an approximating value $x'$ of limited precision instead of the true value $x$ (to which we may not have access, or may not know to unlimited precision), i.e. we want
$$|f(x) - f(x')| < \epsilon$$
then we can achieve this provided we make our measurement of $x$ suitably accurate, i.e. provided that
$$|x - x'| < \delta$$
for some $\delta > 0$ which, in general, depends on both $\epsilon$ and the point $x$.
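To see why $\delta$ can genuinely depend on $x$, take $f(x) = x^2$ (a worked example added for concreteness):
$$|f(x) - f(x')| = |x - x'|\,|x + x'|,$$
so near a large $|x|$ the factor $|x + x'|$ is large, and keeping the product below $\epsilon$ forces $\delta$ to be on the order of $\epsilon/(2|x|)$; no single $\delta$ works for all $x$ at once.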
Uniform continuity is stronger. It tells us that not only do we have the above property, but in fact the same $\delta$ threshold on $x'$'s accuracy will be sufficient to get $\epsilon$ worth of accuracy in the approximation of $f$ no matter what $x$ is. Basically, if the special function I care about is uniformly continuous, and I want 0.001 accuracy, and the $\delta$ required for that is, say, 0.0001, then by measuring to that tolerance I am assured of 0.001 accuracy in the output no matter what $x$ I am measuring. If, on the other hand, the function were merely continuous but not uniformly so, measuring at one value of $x$ with 0.0001 accuracy might be sufficient to get 0.001 accuracy in the function output, but at another value of $x$ the same tolerance might give me only 0.5 accuracy - terrible!
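To see this degradation concretely, here is a minimal Python sketch (my own illustration, not part of the original answer): with the same input tolerance $\delta = 10^{-4}$ everywhere, the output error of $x^2$ grows with the base point, while for the uniformly continuous $\sin x$ it stays below $\delta$ no matter where we measure.

```python
import math

# Fix one input tolerance delta and watch how the output error
# |f(x + delta) - f(x)| behaves as the base point x moves around.
delta = 1e-4

for x in [1.0, 10.0, 100.0, 1000.0]:
    err_square = abs((x + delta) ** 2 - x ** 2)           # f(x) = x^2: error grows with x
    err_sine = abs(math.sin(x + delta) - math.sin(x))     # f(x) = sin x: error stays below delta
    print(f"x = {x:6.0f}   x^2 error: {err_square:.1e}   sin error: {err_sine:.1e}")
```

At $x = 1000$ the $x^2$ error is already about $0.2$, two thousand times the input tolerance, while the $\sin$ error never exceeds $10^{-4}$.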
Lipschitz continuity is even better: it tells us that the max error in approximating $f$ is proportional to that in approximating $x$, i.e. $\epsilon \propto \delta$, so that if we make our measurement 10 times more accurate, say (i.e. one more significant figure), we are assured 10 times more accuracy in the function (i.e. gaining a significant figure in the measurement lets us gain one in the function result as well).
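In symbols, Lipschitz continuity with constant $K$ says
$$|f(x) - f(x')| \le K\,|x - x'| \quad \text{for all } x, x',$$
so $\delta = \epsilon/K$ always works, and shrinking the input tolerance by a factor of 10 shrinks the guaranteed output error by the same factor.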
And in fact, all the functions on your real-life calculator (the real-analytic ones, not combinatorial functions like nCr and whatnot) are at least locally Lipschitz continuous. The proportionality factor (effectively, how many significant figures of output you get for a given number of significant figures of input) may not be the same everywhere, but you can still be assured that, in relative terms, adding 10x precision to your measurement, i.e. one more significant figure, will always make the approximation returned by your calculator (however good it actually is) 10x more accurate, i.e. accurate to one more significant figure.
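For a concrete instance (my example, not the answerer's): $e^x$ is Lipschitz on any interval $[-b, b]$ with constant $e^b$, the maximum of its derivative there, but no single constant works on all of $\mathbb{R}$; the absolute accuracy guarantee varies from place to place even though the one-more-significant-figure promise holds locally.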
And to top it all off, all these forms of continuity - at least in their local variants, that is, over any closed bounded interval - are implied by continuous differentiability.
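The mechanism is the mean value theorem: if $|f'| \le K$ on an interval, then for any $x, x'$ in it
$$|f(x) - f(x')| = |f'(\xi)|\,|x - x'| \le K\,|x - x'|$$
for some $\xi$ between $x$ and $x'$, and a continuous derivative is automatically bounded on any closed bounded interval.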
Solution 2:
While I really like The_Sympathizer's answer, none of the answers describe the way I think about uniform continuity.
Uniform continuity is about horizontal shifts not changing the graph too much
In precalculus we learn how to move graphs around. If we have a function $f(x)$, then we can shift the graph of the function to the right by an increment $\Delta$ by graphing the function $f(x-\Delta)$.
Then let's take a look at the definition of uniform continuity: $f$ is uniformly continuous if for all $\epsilon > 0$, there is some $\delta > 0$ such that for all $x,x'$ with $|x-x'|<\delta$, $$|f(x)-f(x')| < \epsilon.$$
Another way to say this is to let $x' = x-\Delta$, and say that when $|\Delta| < \delta$, then $|f(x)-f(x-\Delta)|<\epsilon$.
Intuitively, $f$ is uniformly continuous if, when we bump the graph of $f$ left or right by a small enough amount, then the vertical distance between the shifted graph and the original graph will also be small.
Here's an example of how this works on Desmos. The slider controls how much we shift the graph by. The function in the fourth slot measures the vertical distance between the graphs. Unless we make the shift zero, the vertical distance between the shifted graph and the original graph is unbounded, no matter how small the shift is. In other words, $f(x)=x^2$ is not uniformly continuous: no matter how small the left or right shift is, the shifted graph gets arbitrarily far away from the original graph.
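In formulas: for $f(x) = x^2$,
$$|f(x) - f(x - \Delta)| = |2x\Delta - \Delta^2| = |\Delta|\,|2x - \Delta|,$$
which for any fixed $\Delta \neq 0$ grows without bound as $|x| \to \infty$, so no horizontal shift, however small, keeps the vertical gap below a fixed $\epsilon$.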
Alternative view: Uniform continuity is about the difference between horizontal and vertical shifts
Another (basically equivalent) way to say this is by comparing to vertical shifts.
Imagine the region bounded by the graph of $f$ shifted up by $\epsilon$ and the graph of $f$ shifted down by $\epsilon$. Do small horizontal shifts of the original graph stay in this region?
If the answer is yes, that is, for every $\epsilon$ all sufficiently small horizontal shifts stay in the region, then $f$ is uniformly continuous. If the answer is no, that is, for some $\epsilon$ arbitrarily small shifts escape the region, then $f$ is not uniformly continuous.
Here's a Desmos (again with $x^2$) for this viewpoint.
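To make the tube test concrete, here is a small Python sketch (the grid and function choices are mine, not from the answer): it estimates the largest vertical gap between $f$ and its horizontally shifted copy over a sample grid, comparing the uniformly continuous $\sqrt{x}$ against $x^2$.

```python
import math

def max_shift_gap(f, shift, xs):
    """Largest sampled vertical gap between f and its horizontal shift f(x - shift)."""
    return max(abs(f(x) - f(x - shift)) for x in xs)

# Sample grid on (0, 1000]; shift the graphs right by 0.001.
xs = [i * 0.01 for i in range(1, 100_001)]
shift = 1e-3

print("sqrt(x) gap:", max_shift_gap(math.sqrt, shift, xs))        # small (~0.005)
print("x^2     gap:", max_shift_gap(lambda x: x * x, shift, xs))  # ~2.0 already on this grid
```

Enlarging the grid makes the $x^2$ gap as large as you like, while the $\sqrt{x}$ gap can never exceed $\sqrt{\text{shift}}$, so $\sqrt{x}$ stays inside every $\epsilon$-tube once the shift is small enough.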
Solution 3:
I'd like to point out one misconception in the problem statement:
... the rate of change between any two points is bounded across the domain
This is incorrect: the function $f:[0,\infty) \to [0,\infty)$ defined by
$$f(x)=\sqrt{x}$$
is uniformly continuous over the whole domain $[0,\infty)$, despite having an unbounded derivative near $0$. For any given $\epsilon > 0$, we can choose $\delta=\epsilon^2$, which fulfills the uniform continuity condition:
$$|x_1 - x_2| \le \delta \Rightarrow |\sqrt{x_1}-\sqrt{x_2}| \le \epsilon$$
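To fill in the step: for $x_1, x_2 \ge 0$,
$$|\sqrt{x_1} - \sqrt{x_2}|^2 \le |\sqrt{x_1} - \sqrt{x_2}|\,(\sqrt{x_1} + \sqrt{x_2}) = |x_1 - x_2| \le \epsilon^2,$$
and taking square roots gives $|\sqrt{x_1} - \sqrt{x_2}| \le \epsilon$.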
The difference from cases like $y=x^2$ or $y=\tan(x)$ is that $f$ itself stays bounded near the point where its derivative blows up.