Help me correct my ideas of continuity

I've been studying real analysis over the past few months, and I'm having trouble organizing the different notions of continuity and ideas related to continuity in my head geometrically. I will explain my notions in general terms (in other words, without delta-epsilon definitions). Can you tell me how to sharpen my intuitive thinking wherever my ideas are incorrect or too general?

Continuous - The preimage of every open set is an open set.

Lipschitz Continuous - The absolute values of the slopes of all secant lines for the function are bounded.

Uniformly Continuous - The best notion I have is that it's just what a continuous function over a closed and bounded set is. One idea I have is that it means the function is the uniform limit of some sequence of piecewise linear functions, but does this hold for uniformly continuous functions over domains that are not compact?

Absolutely Continuous - This is my shakiest notion geometrically. The delta-epsilon definition gives me a loose notion of being able to break up the condition for uniform continuity over disconnected unions of open intervals. I know absolutely continuous functions have to be of bounded variation, which carries the geometric notion of a continuous function whose graph over the domain has finite arc length, but I can't see what makes this notion stronger visually, nor can I grasp how it connects so well to the Fundamental Theorem of Calculus.

Another question: how do these different types relate to one another, and what examples can show these relations? I know the Cantor function is a good example of a uniformly continuous function (one of bounded variation) that is not absolutely continuous, but is it Lipschitz? If not, is there a function that is Lipschitz but not absolutely continuous?

I appreciate your input, and I apologize if this question is too general and lacking in rigor - I am still learning my way around this site!


Partial answer:

  • Uniformly continuous: This can be arbitrarily bad. Hölder continuity implies uniform continuity (a short check of the implication is written out after this list), and functions with Hölder exponent below $1/10$ will not look continuous in attempts to compute and draw their graph; some of them do not look like the graph of a function at all. Not surprisingly, these are fractal constructions.

  • Lipschitz implies absolute continuity: Since $|f(x)-f(y)|\le L\,|x-y|$, the variation over any collection of disjoint intervals is bounded by $L$ times the sum of the interval lengths; keeping that total length below $\varepsilon/L$ therefore keeps the variation below $\varepsilon$, which is exactly absolute continuity.

  • Absolutely continuous and the Riemann integral: Look up Riemann's Habilitationsschrift; one cannot describe the connection between integrability and variation in a shorter or more intuitive way.
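To make the first point concrete, here is the short check that a Hölder condition implies uniform continuity (the constant $C>0$ and exponent $0<\alpha\le 1$ are my notation): if $|f(x)-f(y)| \le C\,|x-y|^{\alpha}$, then
$$ |x-y| < \left(\frac{r}{C}\right)^{1/\alpha} \;\Longrightarrow\; |f(x)-f(y)| < r, $$
so $\delta = (r/C)^{1/\alpha}$ works for every $r>0$. For small $\alpha$ this $\delta$ shrinks extremely fast as $r\to 0$, which is one way to see why graphs of low-exponent Hölder functions look so rough.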

All in all, your understanding of continuity already looks solid; everyone has different visualizations.

One that I used to explain continuity of real functions: Continuity means boxes on the graph, with solid top and bottom, that can be made arbitrarily small in height. Uniform continuity means you can freely move these boxes along the graph without changing their shape. Lipschitz continuity allows you to replace the boxes by butterfly shapes: a box crossed by the two lines of slope $\pm L$ through its center, with the center on the graph, so that the graph stays inside the two triangles at the sides and leaves only through the vertical sides.
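In formulas (a sketch of the same picture, with $x_0$ denoting the center of the butterfly): the butterfly condition is just the Lipschitz condition read off pointwise,
$$ f(x_0) - L\,|x - x_0| \;\le\; f(x) \;\le\; f(x_0) + L\,|x - x_0|, $$
i.e. the graph stays between the two lines of slope $\pm L$ through $(x_0, f(x_0))$, which bound the two side triangles of the butterfly.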


Continuous - The preimage of every open set is an open set.

It is perfect. The intuition: if you choose a neighborhood around $f(x)$, you can find a neighborhood around $x$ whose image is included in the first one.
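A concrete instance (my own toy example): for $f(x)=x^2$ on $\Bbb R$,
$$ f^{-1}\big((1,4)\big) = (-2,-1)\cup(1,2), $$
an open set; the same happens for every open set you feed into $f^{-1}$, which is exactly the topological definition you quoted.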

Lipschitz Continuous - The absolute values of the slopes of all secant lines for the function are bounded.

Perfect.
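A quick example of how the bound can fail (the function is my own choice): $f(x)=\sqrt{x}$ on $[0,1]$ has secant slopes
$$ \frac{\sqrt{x}-\sqrt{0}}{x-0} = \frac{1}{\sqrt{x}} \xrightarrow[\;x\to 0^+\;]{} \infty, $$
so it is not Lipschitz, although it is uniformly continuous (it will reappear below) and even absolutely continuous.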

Uniformly Continuous - The best notion I have is that it's just what a continuous function over a closed and bounded set is. One idea I have is that it means the function is the uniform limit of some sequence of piecewise linear functions, but does this hold for uniformly continuous functions over domains that are not compact?

Let me first remind you that this notion is reserved for metric topologies (or metric-like ones, such as Fréchet spaces). In the metric framework, a uniformly continuous function is one such that for all $r>0$ there exists $\delta>0$ such that $$ d(x,y) < \delta \Rightarrow d(f(x),f(y)) < r. $$ In this case I will write $\omega(r)$ for the $\sup$ of the set of values of $\delta$ that make the previous implication work (so every $\delta < \omega(r)$ works).

Uniform continuity can then be reformulated as: $r > 0 \Rightarrow \omega(r)>0$.
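Two examples with this $\omega$ (the computations are mine, on the real line with the usual distance): for $f(x)=\sqrt{x}$ on $[0,\infty)$ one has $|\sqrt{x}-\sqrt{y}|\le\sqrt{|x-y|}$, hence
$$ \omega(r) = r^2 > 0 \quad\text{for every } r>0, $$
and $f$ is uniformly continuous. For $g(x)=x^2$ on $\Bbb R$ the secant slopes are unbounded, so no $\delta>0$ works for any $r$, i.e. $\omega(r)=0$: $g$ is not uniformly continuous.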

What you are referring to is Heine's theorem (the Heine–Cantor theorem), which holds under the assumption that the space is compact.

Be careful! As soon as the dimension is infinite, compactness is no longer equivalent to being closed and bounded (note that boundedness is a metric notion!): those two conditions are not sufficient (see for example Riesz's theorem).
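A standard example of the warning (not needed for the real-line questions, but it shows what goes wrong): in $\ell^2$ the closed unit ball is closed and bounded, yet the orthonormal sequence $e_1, e_2, \ldots$ inside it satisfies
$$ \|e_n - e_m\|_{\ell^2} = \sqrt{2} \quad\text{for all } n\neq m, $$
so it has no convergent subsequence and the ball is not compact.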

Absolutely Continuous - This is my shakiest notion geometrically. The delta-epsilon definition gives me a loose notion of being able to break up the condition for uniform continuity over disconnected unions of open intervals. I know absolutely continuous functions have to be of bounded variation, which carries the geometric notion of a continuous function whose graph over the domain has finite arc length, but I can't see what makes this notion stronger visually, nor can I grasp how it connects so well to the Fundamental Theorem of Calculus.

Here again we are in the framework of metric spaces. The definition is:

For all $r>0$ there is a $\delta>0$ such that, for every $n\in \Bbb N$ and every finite collection of pairwise disjoint intervals $(x_k, y_k)$, $$ \sum_{k=1}^n d(x_k, y_k ) < \delta \Rightarrow \sum_{k=1}^n d(f(x_k), f(y_k) ) < r, $$ so taking $n=1$ shows that if $f$ is absolutely continuous, then it is uniformly continuous.
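This is also the right place for the Cantor function $c$ you mentioned (a sketch of the standard argument): at stage $n$ of the construction, the Cantor set is covered by $2^n$ disjoint intervals $[x_k, y_k]$ of total length $(2/3)^n$, and $c$ does all of its climbing on those intervals, so
$$ \sum_{k=1}^{2^n} |y_k - x_k| = \left(\tfrac{2}{3}\right)^n \to 0 \qquad\text{while}\qquad \sum_{k=1}^{2^n} |c(y_k) - c(x_k)| = 1 \ \text{ for every } n, $$
which violates the definition above: $c$ is uniformly continuous and of bounded variation, but not absolutely continuous. It is not Lipschitz either, since its secant slope over a single stage-$n$ interval is $(3/2)^n$.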

Now, in the case of $\Bbb R$: if $f(x) = f(a) + \int_a^x f'(t)\, dt$ with $f'\in L^1[a,b]$, then $$ \sum_{k=1}^n |f(x_k)- f(y_k)| = \sum_{k=1}^n \left|\int_{x_k}^{y_k} f'(t)\, dt\right| \le \sum_{k=1}^n \int_{[x_k,y_k]} |f'(t)|\, dt \to 0 $$ as $\sum_{k=1}^n |x_k- y_k|\to 0$, by the monotone convergence theorem (the truncation argument is sketched just below). Conclusion: this condition implies absolute continuity.
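Here is the truncation argument behind that last limit, if you want the detail (a sketch; $g_n$ is my notation): set $g_n = \min(|f'|, n)$, so $g_n \uparrow |f'|$ and, by the monotone convergence theorem, $\int_a^b (|f'| - g_n)\, dt \to 0$. Given $r>0$, pick $n$ with $\int_a^b (|f'| - g_n)\, dt < r/2$ and take $\delta = r/(2n)$; then for any collection of intervals of total length less than $\delta$,
$$ \sum_{k} \int_{[x_k,y_k]} |f'(t)|\, dt \;\le\; \int_a^b \big(|f'(t)| - g_n(t)\big)\, dt + n \sum_{k} |y_k - x_k| \;<\; \frac r2 + \frac r2 = r. $$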

Now suppose $f$ is absolutely continuous. Let $$ V_f(x) = \sup_{\{a = x_0 < \ldots <x_n = x\}} \sum_{i=1}^n|f(x_i) - f(x_{i-1})|,\\ f^+(x) = \frac12(V_f(x) + f(x)),\\ f^-(x) = \frac12(V_f(x) - f(x)). $$ Both $f^\pm$ are increasing (a one-line check is given below) and absolutely continuous, and $f = f^+-f^-$: hence it suffices to give the proof in the increasing case.
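The check that $f^\pm$ are increasing is one line (nothing beyond the definitions above is needed): for $x < y$, any partition of $[a,x]$ extends by the point $y$, so $V_f(y) - V_f(x) \ge |f(y) - f(x)|$, and therefore
$$ f^{\pm}(y) - f^{\pm}(x) = \tfrac12\Big(V_f(y) - V_f(x) \pm \big(f(y) - f(x)\big)\Big) \;\ge\; 0. $$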

In this case, $f$ is almost everywhere differentiable (Lebesgue's theorem on monotone functions), and by the decomposition of an increasing function, $$ f(x) - f(a) = \int_a^x f'(t)\, dt + \sum_{a\le y\le x} \Delta f(y) + s(x), $$ where $\Delta f(y)$ is the jump of $f$ at $y$ and $s$ is the singular continuous part. Because $f$ is absolutely continuous it has no jumps, and its singular part vanishes as well (the Cantor function shows that ruling out jumps alone would not be enough), so $f(x) - f(a) = \int_a^x f'(t)\, dt$.
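Putting the two directions together gives exactly the Lebesgue form of the Fundamental Theorem of Calculus you were asking about: for $f:[a,b]\to\Bbb R$,
$$ f \text{ is absolutely continuous} \iff f(x) = f(a) + \int_a^x g(t)\, dt \ \text{ for some } g \in L^1[a,b], $$
and in that case $g = f'$ almost everywhere.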