Why is this combination of nearest-integer functions --- surprisingly --- continuous?
Alright, I didn't know the best way to formulate my question. Basically, whilst doing some physics research, I naturally came upon the function
$$ f(x) = 2x[x] - [x]^2 $$
where I use $[x]$ as notation for the `nearest-integer function' (i.e. rounding to the nearest integer). Usually such a function needs a caveat about how exactly it is defined at half-integers $x \in \mathbb Z + \frac{1}{2}$, but interestingly for this function it does not matter, since it turns out to be continuous! In fact, it turns out $f(x)$ is exactly the function obtained by gluing together the tangent lines of $x^2$ at integer values of $x$:
(Note: the tangent lines of $x^2$ at consecutive integers intersect exactly at half-integer values of $x$.)
So my question is not literally `why is it continuous?', but rather: given that it is continuous, and that this is not a generic property of functions defined in terms of nearest-integer functions, is there a better (i.e. more insightful) way of expressing $f(x)$? Relatedly, is there some part of mathematics where functions similar to these naturally arise?
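For concreteness, here is a quick numerical check of both observations (a sketch in Python; the two explicit rounding helpers are only there to show that the half-integer convention is irrelevant):

```python
import numpy as np

def f(x, rounding=np.round):
    """f(x) = 2*x*[x] - [x]**2, with [x] given by the chosen rounding rule."""
    n = rounding(x)
    return 2 * x * n - n ** 2

def round_half_up(x):
    return np.floor(x + 0.5)    # rounds 2.5 -> 3

def round_half_down(x):
    return np.ceil(x - 0.5)     # rounds 2.5 -> 2

halves = np.arange(-2.5, 3.0, 1.0)   # a few half-integers

# 1) the value at half-integers does not depend on the rounding convention
assert np.allclose(f(halves, round_half_up), f(halves, round_half_down))

# 2) no jump at half-integers: values just left and right agree numerically
h = 1e-7
assert np.allclose(f(halves - h), f(halves + h), atol=1e-5)

# 3) tangent lines of x^2 at consecutive integers k and k+1 meet at x = k + 1/2
k = np.arange(-3, 3)
x_cross = ((k + 1) ** 2 - k ** 2) / 2.0   # solve 2*k*x - k**2 = 2*(k+1)*x - (k+1)**2
assert np.allclose(x_cross, k + 0.5)
```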
Solution 1:
Let $g(x,y)$ be any continuous function such that $g(x,-\frac12)=g(x,\frac12)$ for all $x$. Then $f(x)=g(x,x-[x])$ is continuous: the only discontinuities of $x \mapsto x-[x]$ are jumps at half-integers between the one-sided limits $\frac12$ and $-\frac12$, and $g$ takes the same value at both.
In particular, your function is given by $g(x,y)=x^2-y^2$. Consequently, we can write $f(x)=x^2 - (x-[x])^2$.
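The identity is easy to confirm numerically; a minimal sketch (the helper names `g` and `nearest` are just illustrative):

```python
import numpy as np

def nearest(x):
    return np.round(x)          # any nearest-integer convention works here

def g(x, y):
    return x ** 2 - y ** 2      # continuous, with g(x, -1/2) == g(x, 1/2)

x = np.linspace(-4, 4, 8001)
f_original = 2 * x * nearest(x) - nearest(x) ** 2
f_via_g = g(x, x - nearest(x))  # i.e. f(x) = x^2 - (x - [x])^2
assert np.allclose(f_original, f_via_g)
```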
Solution 2:
Call a function $f : \mathbb{R} \rightarrow \mathbb{R}$ weird-continuous iff for all $x \in \mathbb{R}$, the left and right limits of $f$ at $x$ exist, whether or not they are equal. Define the weird derivative $\Delta f$ of a weird-continuous function $f : \mathbb{R} \rightarrow \mathbb{R}$ to be the function $\Delta f : \mathbb{R} \rightarrow \mathbb{R}$ given by $$\Delta(f)(x) = (\lim_+f)(x)-(\lim_-f)(x),$$ where $(\lim_+f)(x)$ denotes the limit of $f(x')$ as $x'$ approaches $x$ from the right, and $(\lim_-f)(x)$ the limit from the left.
Also, for each $a \in \mathbb{R}$, write $\langle a \rangle$ for the function $\mathbb{R} \rightarrow \mathbb{R}$ defined as follows:
$$\langle a \rangle (x) = \begin{cases} 1 & x = a \\ 0 & x \neq a\end{cases}$$
For example:
- if $H$ is the Heaviside step function, then $\Delta (H) = \langle 0\rangle$.
- similarly, $\Delta (x \mapsto 3H(x-1)+4H(x-2)) = 3\langle 1\rangle+4\langle 2\rangle$.
- if $f$ is continuous, then $\Delta(f) = 0$.
Letting $f$ and $g$ denote weird-continuous functions, the basic results (I think) are:
Proposition 0. Suppose $f : \mathbb{R} \rightarrow \mathbb{R}$ has no removable discontinuities. Then $f$ is continuous iff $\Delta f = 0$.
Proposition 1 (Additivity). $\Delta(f+g) = \Delta(f)+\Delta(g)$ and $\Delta(0) = 0$.
Proposition 2 (Product rule). $\Delta (fg) = \Delta(f)\,g + f\,\Delta(g)$.
(As stated this needs care: in general $\Delta(fg) = \Delta(f)\,(\lim_+ g) + (\lim_- f)\,\Delta(g)$, which reduces to the formula above whenever one of the factors is continuous. Only the corollary below is needed in the computation.)
Corollary to the product rule. Weird derivatives are linear with respect to continuous functions, meaning that if $f$ is continuous, then $\Delta(fg) = f\Delta(g)$.
Now think of $x$ as the identity function $\mathbb{R} \rightarrow \mathbb{R}$. Let $f = 2 x [x] - [x]^2$.
To prove that $f$ is continuous, we'll show that $\Delta(f) = 0$. We have:
$$\Delta (f) = \Delta(2 x [x] - [x]^2) = 2 x\, \Delta([x]) - \Delta([x]^2),$$ using additivity and the corollary (the factor $2x$ is continuous).
Also, since $[x]$ jumps by $1$ at each half-integer and is constant elsewhere: $$\Delta([x]) = \sum_{n \in \mathbb{Z}+\frac{1}{2}}\langle n\rangle$$
Furthermore, since $[x]$ jumps from $n-\frac12$ to $n+\frac12$ at each half-integer $n$: $$\Delta([x]^2) = \sum_{n \in \mathbb{Z}+\frac{1}{2}}\left((n+\tfrac12)^2 - (n-\tfrac12)^2\right)\langle n\rangle = \sum_{n \in \mathbb{Z}+\frac{1}{2}}2n\,\langle n\rangle$$
Hence, using that $x\,\langle n\rangle = n\,\langle n\rangle$: $$\Delta (f) = 2x\sum_{n \in \mathbb{Z}+\frac{1}{2}}\langle n\rangle-\sum_{n \in \mathbb{Z}+\frac{1}{2}}2n\langle n\rangle = \sum_{n \in \mathbb{Z}+\frac{1}{2}}2n\langle n\rangle-\sum_{n \in \mathbb{Z}+\frac{1}{2}}2n\langle n\rangle = 0$$
So $f$ is continuous.
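As a sanity check, the three $\Delta$-computations above can be approximated numerically by comparing values just to the right and left of each half-integer (a rough sketch; the helper `jump` and the step size `h` are just for illustration):

```python
import numpy as np

h = 1e-7                               # small step approximating one-sided limits
nearest = np.round                     # [x]
halves = np.arange(-3.5, 4.0, 1.0)     # some points of Z + 1/2

def jump(func, x):
    """Numerical stand-in for Delta(func)(x) = (lim_+ func)(x) - (lim_- func)(x)."""
    return func(x + h) - func(x - h)

def f(x):
    return 2 * x * nearest(x) - nearest(x) ** 2

assert np.allclose(jump(nearest, halves), 1.0)                            # Delta([x]): jumps of size 1
assert np.allclose(jump(lambda x: nearest(x) ** 2, halves), 2 * halves)   # Delta([x]^2) = 2n at n
assert np.allclose(jump(f, halves), 0.0, atol=1e-5)                       # Delta(f) = 0
```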
Solution 3:
> is there a better (i.e. more insightful) way of expressing $f(x)$?
Maybe one way you could do that is by noticing that $$ f(x) = \max_{k\in\mathbb{Z}} \left( 2kx-k^2 \right). $$ Since, on any bounded interval, the maximum is attained among finitely many $k$, there $f$ is the maximum of finitely many continuous functions, and hence it is continuous.
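A quick numerical confirmation of that formula (a sketch; restricting $k$ to a finite range around the window is an assumption of the check, harmless because far-away $k$ never attain the maximum):

```python
import numpy as np

# On the window [-B, B], only |k| <= B + 1 can attain the maximum,
# so truncating the max over Z to a finite range of k changes nothing.
B = 5
x = np.linspace(-B, B, 4001)
ks = np.arange(-(B + 1), B + 2)                       # k = -6, ..., 6
max_form = np.max(2 * ks[:, None] * x[None, :] - ks[:, None] ** 2, axis=0)

n = np.round(x)                                       # [x]
assert np.allclose(max_form, 2 * x * n - n ** 2)      # max formula == 2x[x] - [x]^2
```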
> is there some part of mathematics where functions similar to these naturally arise?
Personally, I have encountered this a lot in information theory, where several lower/upper bounds on communication rates are derived together, so that the max/min of those bounds holds as well. This is especially true when many different linear inequalities can be derived naturally and make intuitive sense. For instance, $$ \begin{cases} R_1 \ge 2-4R_2\\ R_1 \ge 1-R_2 \end{cases} \implies R_1 \ge \max\left\{ 2-4R_2, 1-R_2 \right\}. $$ If you really want an example, off the top of my head I can think of Theorem 2 in this paper.
Solution 4:
The difference $x - [x]$ between $x$ and the rounding function (which is piecewise constant) is a sawtooth wave: piecewise linear, periodic, ranging over $[-\frac12,\frac12)$, with a jump from $\frac12$ down to $-\frac12$ at each half-integer.
If you square it, both sides of each jump have the same square, so you get a continuous, piecewise quadratic function. Since $f(x) = x^2 - (x-[x])^2$, this is enough to explain your observation.
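A tiny numerical illustration of that cancellation (a sketch; `s` is just a name for the centered fractional part $x - [x]$):

```python
import numpy as np

def s(x):
    return x - np.round(x)     # centered fractional part x - [x], a sawtooth

h = 1e-7
halves = np.arange(-2.5, 3.0, 1.0)

# s jumps by (about) 1 at each half-integer ...
assert np.allclose(np.abs(s(halves + h) - s(halves - h)), 1.0, atol=1e-5)
# ... but its square does not jump, so x^2 - s(x)^2 is continuous
assert np.allclose(s(halves + h) ** 2 - s(halves - h) ** 2, 0.0, atol=1e-5)
```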