What determines whether a function has a least positive period?

Solution 1:

For a given function $f$, consider the set $P = \{ p \in \mathbb{R} \mid \forall x \in \mathbb{R} : f(x+p) = f(x) \}$. Clearly $0 \in P$, and if $p, q \in P$, then also $p - q \in P$, since $f(x + p - q) = f((x + p - q) + q) = f(x + p) = f(x)$. That means $P$ is a subgroup of the additive group of the real numbers. Several cases can arise:

  1. $P = \{0\}$. In that case $f$ is not periodic at all.
  2. $P = p\mathbb{Z}$ for some $p > 0$. Then $p$ is the least period of $f$.
  3. $P = \mathbb{R}$. This is the case for constant functions.
  4. $P$ is a dense proper subgroup of $\mathbb{R}$, as in your example of the indicator function of the rationals, where $P = \mathbb{Q}$; see the sketch after this list.
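
The density in case 4 can be checked with exact arithmetic. Below is a minimal Python sketch, assuming an ad hoc encoding of numbers $a + b\sqrt{2}$ as pairs of `Fraction`s (the name `dirichlet` is invented here): every rational shift is verified to be a period of the indicator of $\mathbb{Q}$.

```python
from fractions import Fraction

# Work inside Q(sqrt(2)), encoding a + b*sqrt(2) exactly as a pair
# (a, b) of Fractions; such a number is rational iff b == 0.
def dirichlet(a, b):
    """Indicator function of Q, evaluated exactly on Q(sqrt(2))."""
    return 1 if b == 0 else 0

# Any rational p is a period: shifting x by p changes only the rational
# part a, so membership in Q is unchanged, i.e. P contains all of Q.
p = Fraction(1, 7)
for a, b in [(Fraction(2, 3), Fraction(0)),   # a rational point
             (Fraction(0), Fraction(1))]:     # the irrational point sqrt(2)
    assert dirichlet(a + p, b) == dirichlet(a, b)
```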

No other subgroups exist. To construct an arbitrary periodic function, you can take any nontrivial subgroup $P$ and define an arbitrary function on the quotient $\mathbb{R}/P$. This determines the whole function, since it must be constant on every coset of $P$. In case 2 the quotient can be represented as the interval $[0, p)$; for case 4 this tends to be trickier, but a slightly more interesting example would be $$ f(x) = \begin{cases} p & \text{for}\, x = q + \sqrt{p},\; q \in \mathbb{Q}, p \,\text{prime}, \\ 0 & \text{otherwise}. \end{cases} $$ This is well defined because $\sqrt{p_1} - \sqrt{p_2}$ is irrational for distinct primes $p_1$ and $p_2$, so no $x$ admits two such representations.
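Concretely, the case 2 recipe (choose values on the fundamental domain $[0, p)$ and transport them to every coset) can be sketched as follows; `make_periodic` and `g` are names invented for this illustration:

```python
import math

def make_periodic(g, p):
    """Extend g, given on the fundamental domain [0, p), to a
    p-periodic function on all of R (the case P = pZ)."""
    def f(x):
        # x % p picks the coset representative of x in [0, p),
        # so f is constant on every coset of pZ by construction.
        return g(x % p)
    return f

# Example: transport g(t) = t*(2 - t) from [0, 2) to a 2-periodic f.
f = make_periodic(lambda t: t * (2.0 - t), 2.0)
assert math.isclose(f(0.7), f(0.7 + 2.0))    # shift by the period 2
assert math.isclose(f(-1.3), f(-1.3 + 4.0))  # shift by 2*2, negative input
```

For a dense $P$ such as $\mathbb{Q}$ there is no analogous explicit set of coset representatives (such a set is a Vitali set, which requires the axiom of choice), which is why case 4 examples look more contrived.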

Solution 2:

The other answers are very good. Here is a 1915 theorem of Burstin that also seems relevant:

If a Lebesgue measurable function $f: \mathbb{R} \rightarrow \mathbb{R}$ has arbitrarily small periods, then $f$ is constant almost everywhere.

A nice proof is given in this one-page MONTHLY note of J. M. Henle. A comment at the bottom claims that Burstin's original proof was faulty.
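
For intuition, the result can be argued along standard lines (this sketch may or may not match Henle's note): for each $c$, the sublevel set $E_c = \{x : f(x) < c\}$ is measurable and invariant under translation by every period of $f$, hence by a dense set of reals. By the Lebesgue density theorem, a measurable set invariant under a dense group of translations is either null or co-null: otherwise a small translation from the dense group would carry a point where $E_c$ has density $1$ next to a point where its complement has density $1$. Taking $c^* = \sup\{c : E_c \text{ is null}\}$ then gives $f = c^*$ almost everywhere.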

Solution 3:

A nonconstant continuous periodic function must have a least positive period. First, suppose there were positive periods arbitrarily close to zero, and pick any two points $x$ and $z$. By translating $z$ by integer multiples of ever smaller periods, we obtain a sequence $z_n \rightarrow x$ with $f(z_n) = f(z)$ for all $n$. Hence $f(x) = f(z)$ by continuity, and since $x$ and $z$ were arbitrary, $f$ would be constant. So the infimum $L$ of the set of positive periods is positive.

It remains to see that $L$ is itself a period. Note that the difference of two periods is again a period. If $L$ were not attained, there would be distinct periods $p_i \neq p_j$ arbitrarily close to $L$, and then $|p_i - p_j|$ would be an arbitrarily small positive period, contradicting $L > 0$. So $L$ is a period, and it is the least positive one.
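
As a numerical illustration of this conclusion (not a proof; the grid and the helper `period_defect` are inventions for this sketch), one can probe candidate periods of $\cos$:

```python
import numpy as np

# Numerical probe: for a continuous nonconstant function such as cos,
# every candidate p below the least period 2*pi should visibly fail.
xs = np.linspace(0, 20, 4001)

def period_defect(f, p):
    """Largest mismatch |f(x + p) - f(x)| over the sample grid;
    near machine epsilon iff p is (numerically) a period."""
    return np.max(np.abs(f(xs + p) - f(xs)))

print(period_defect(np.cos, 2 * np.pi))  # ~1e-15: 2*pi is a period
print(period_defect(np.cos, np.pi))      # = 2.0:  pi is not a period
print(period_defect(np.cos, 1.0))        # ~0.96:  neither is 1
```

Only multiples of $2\pi$ drive the defect down to floating-point noise, consistent with $2\pi$ being the least positive period of $\cos$.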