Sample path of Brownian Motion within epsilon distance of continuous function

Let $f: [0,1] \to \mathbb{R}$ be a continuous function and fix $\varepsilon > 0$. Since $[0,1]$ is compact, $f$ is uniformly continuous on $[0,1]$, i.e. we can choose $n \in \mathbb{N}$ such that

$$|f(s)-f(t)| < \frac{\varepsilon}{2} \quad \text{for all $|s-t| \leq \frac{1}{n}$.}$$

If we set $t_j := j/n$ for $j=0,\ldots,n$, then

$$\begin{align*} \mathbb{P} \left( \sup_{0 \leq t \leq 1} |B_t-f(t)| <\varepsilon\right) &\geq \mathbb{P}\left( \forall j=0,\ldots,n-1: \sup_{t \in [t_j,t_{j+1}]} |B_t-f(t_j)| < \frac{\varepsilon}{2 (n-j)} \right) \\ &= \mathbb{E}\left( \prod_{j=0}^{n-1} 1_{A_j} \right) \end{align*}$$

for

$$A_j := \left\{\sup_{t \in [t_j,t_{j+1}]} |B_t-f(t_j)| < \frac{\varepsilon}{2(n-j)} \right\} \in \mathcal{F}_{t_{j+1}}.$$

It follows from the Markov property of Brownian motion and the tower property of conditional expectation that

$$\begin{align*} \mathbb{E}\left( \prod_{j=0}^{n-1} 1_{A_j} \right) &= \mathbb{E} \left[ \left(\prod_{j=0}^{n-2} 1_{A_j} \right) \mathbb{P}^{B_{t_{n-1}}} \left(\sup_{t \in [0,1/n]} |B_t-f(t_{n-1})| < \frac{\varepsilon}{2} \right) \right]. \end{align*}$$

It suffices to show that

$$\mathbb{P}^x \left( \sup_{t \in [0,1/n]} |B_t-f(t_{n-1})| < \frac{\varepsilon}{2} \right)>c>0 \tag{1}$$

for all $x \in B(f(t_{n-1}),\varepsilon/4)$. (Then we can iterate the procedure and obtain the desired lower bound.) To this end, we note that

$$\begin{align*} \mathbb{P}^x \left( \sup_{t \leq 1/n} |B_t-f(t_{n-1})| < \frac{\varepsilon}{2} \right) &= \mathbb{P} \left( \sup_{t \leq 1/n} |B_t+x-f(t_{n-1})| < \frac{\varepsilon}{2} \right) \\ &\geq \mathbb{P} \left( \sup_{t \leq 1/n} |B_t| < \frac{\varepsilon}{4} \right) \end{align*}$$

for all $x \in B(f(t_{n-1}),\varepsilon/4)$. By the reflection principle, $M_{1/n} := \sup_{t \leq 1/n} B_t \sim |B_{1/n}|$, so by symmetry $\mathbb{P} \left( \sup_{t \leq 1/n} |B_t| \geq \frac{\varepsilon}{4} \right) \leq 2 \, \mathbb{P} \left( M_{1/n} \geq \frac{\varepsilon}{4} \right) = 2 \, \mathbb{P} \left( |B_{1/n}| \geq \frac{\varepsilon}{4} \right)$; the right-hand side is strictly less than $1$ after enlarging $n$ if necessary, and $(1)$ follows.
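As a numerical sanity check of the lower bound, here is a small Monte Carlo sketch (assuming numpy; the target $f(t) = \tfrac12 \sin(2\pi t)$, the tolerance $\varepsilon = 0.75$, and the grid size are illustrative choices, not part of the argument):

```python
import numpy as np

rng = np.random.default_rng(0)
n_paths, n_steps = 20_000, 500
dt = 1.0 / n_steps

# Brownian paths on [0, 1] via cumulative Gaussian increments, with B_0 = 0 prepended
dB = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
B = np.hstack([np.zeros((n_paths, 1)), np.cumsum(dB, axis=1)])

t = np.linspace(0.0, 1.0, n_steps + 1)
f = 0.5 * np.sin(2 * np.pi * t)      # illustrative continuous target with f(0) = 0

eps = 0.75
p_hat = np.mean(np.max(np.abs(B - f), axis=1) < eps)
print(p_hat)   # strictly positive, as the argument above shows it must be
```

The estimate is crude (a discrete grid underestimates the supremum), but it exhibits the positivity that the proof establishes.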


Remark: The asymptotics of the probability $\mathbb{P} \left( \sup_{t \in [0,1]} |B_t-f(t)| < \varepsilon \right)$ as $\varepsilon \to 0$ are the subject of so-called small-ball estimates.
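A quick simulation illustrates the remark (a sketch with illustrative parameters; the comparison value $-\pi^2/(8\varepsilon^2)$ is the known small-ball rate for $\log \mathbb{P}(\sup_{t \leq 1} |B_t| < \varepsilon)$):

```python
import numpy as np

rng = np.random.default_rng(1)
n_paths, n_steps = 20_000, 500
dB = rng.normal(0.0, np.sqrt(1.0 / n_steps), size=(n_paths, n_steps))
M = np.max(np.abs(np.cumsum(dB, axis=1)), axis=1)   # discretised sup_{t<=1} |B_t|

small_ball = {eps: np.mean(M < eps) for eps in (1.0, 0.75, 0.5)}
for eps, p in small_ball.items():
    # the small-ball rate: log p ~ -pi^2 / (8 eps^2) as eps -> 0 (rough at these eps)
    print(eps, p, -np.pi**2 / (8 * eps**2))
```

The probabilities decay quickly as $\varepsilon$ shrinks, in rough agreement with the rate.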


Here is a proof (a combination of an exercise from Steele's stochastic calculus text and a lemma which I learned from Freedman's 'Brownian motion and diffusion processes').

It is the case that $P( \sup_{0 \leq s \leq 1} |B_s| \leq \epsilon) > 0$ for any $\epsilon > 0$; I've written a proof in the linked question: How to show that $P( \sup_{0 \leq s \leq 1} |B_s| \leq \epsilon) > 0$ for any $\epsilon > 0$?

We assume that we can write $f(t) = \int_0^t h(s) \, ds$ with $\int_0^1 h(s)^2 \, ds < \infty$, i.e. that $f$ lies in the Cameron-Martin space. (I don't think this is a huge assumption, morally.)

We define a stochastic process by $Z_t = B_t - \int_0^t h(s) ds$.

Call $(\Omega, \mathcal{F}, P)$ the probability space underlying the process $(B_t)_{0 \leq t \leq 1}$.

Then Girsanov's theorem (Steele, Theorem 13.2) tells us that there is a strictly positive continuous martingale $(M_t)$ such that, under the measure $Q$ on $\Omega$ defined by $Q(A) = E_P[ 1_A M_1]$, $Z_t$ becomes a Brownian motion. That is, the distribution on the path space $C[0,1]$ induced by using $Z_t$ to push forward the measure $Q$ is standard Wiener measure.
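The change of measure can be sketched numerically. Below, the drift $h(s) = 2s$ (so $f(t) = t^2$) is an illustrative choice, and the density is the Euler-discretised $M_1 = \exp\!\big(\int_0^1 h \, dB - \tfrac12 \int_0^1 h^2 \, ds\big)$; we check that it has unit mean and that reweighting by it makes $Z_1$ standard normal:

```python
import numpy as np

rng = np.random.default_rng(2)
n_paths, n_steps = 50_000, 200
dt = 1.0 / n_steps
s = (np.arange(n_steps) + 0.5) * dt   # midpoints of the time grid
h = 2 * s                             # illustrative drift, so f(t) = t^2

dB = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
B1 = dB.sum(axis=1)

# Discretised Girsanov density M_1 = exp(int h dB - 1/2 int h^2 ds)
M1 = np.exp(dB @ h - 0.5 * np.sum(h**2) * dt)
print(M1.mean())                      # ≈ 1: (M_t) is a unit-mean martingale

Z1 = B1 - 1.0                         # Z_1 = B_1 - f(1), with f(1) = 1
print(np.average(Z1, weights=M1))     # ≈ 0: Z_1 is standard normal under Q
print(np.average(Z1**2, weights=M1))  # ≈ 1
```

The weighted averages approximate expectations under $Q$, which is exactly how $Q(A) = E_P[1_A M_1]$ operates.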

Let $A = \{ \sup_{ 0 \leq t \leq 1} | Z_t | \leq \epsilon \}$. We wish to show that $P(A) > 0$.

From the linked argument, we know that $Q(A) > 0$, because under $Q$, $Z_t$ is a Brownian motion. Since $Q(A) = E_P[1_A M_1]$, $P(A) = 0$ would force $Q(A) = 0$; hence $P(A) > 0$.

Since $A = \{ \sup_{ 0 \leq t \leq 1} | Z_t | \leq \epsilon \} = \{ \sup_{ 0 \leq t \leq 1} | B_t - f(t) | \leq \epsilon \}$, it follows that $P( \sup_{ 0 \leq t \leq 1} | B_t - f(t) | \leq \epsilon) > 0$ for any $\epsilon$.

We can actually get something better than what you asked for: as long as we have a good enough microscope, we can find $f(t)$ up to $1/n$ accuracy, for any $n$, in a Brownian motion path almost surely...

  1. Define $X_t^{(a,b)} = (b - a)^{-1/2} ( B_{a + t(b - a)} - B_a )$ for $0 \leq t \leq 1$. This is a Brownian motion on $[0,1]$, because of the Markov property and scaling.

  2. Let $a_k = 2^{-k - 1}$ and $b_k = 2^{-k}$, for $k = 0,1,2,\ldots$. Then the processes $X_t^{(a_k,b_k)}$ are independent, since they are built from increments of $B$ over the disjoint intervals $[a_k, b_k]$. (This follows from the independent increments property of Brownian motion.)

  3. Moreover, because of the work above (applied to a continuous target $g$ satisfying the same assumption as $f$), $P ( \sup_{0 \leq t \leq 1} |X_t^{(a_k,b_k)} - g(t)| \leq \epsilon) = P( \sup_{0 \leq t \leq 1} |B_t - g(t) | \leq \epsilon) > 0$, for any $\epsilon > 0$.

  4. Name $A_k = \{ \sup_{0 \leq t \leq 1} |X_t^{(a_k,b_k)} - g(t)| \leq \epsilon \}$. These events are independent, and from 3., they all have the same positive probability. Hence $\sum_k P(A_k) = \infty$, and from the second Borel-Cantelli lemma we know that a.s. infinitely many of the $A_k$ occur. In particular, one of them occurs. (But maybe one would like to observe that they occur arbitrarily close to time zero, so any moment of time will be enough to observe something which looks like our function.)

  5. Setting $\epsilon = 1/n$ and intersecting the resulting almost-sure events over all $n$ gives you what I claimed.
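The construction in steps 1-4 can be sketched numerically (illustrative choices throughout: the target $g(t) = \tfrac14 \sin(2\pi t)$, $\epsilon = 1$, twelve dyadic blocks, a grid of $2^{16}$ points): the rescaled blocks have unit variance and are uncorrelated, and most simulated paths already contain an $\epsilon$-copy of $g$ in one of their dyadic blocks.

```python
import numpy as np

rng = np.random.default_rng(4)
n_steps = 2**16
dt = 1.0 / n_steps
K, eps = 12, 1.0
g = lambda t: 0.25 * np.sin(2 * np.pi * t)        # illustrative target on [0, 1]

def rescale(path, k):
    """X^{(a_k, b_k)} on the dyadic block [2^{-k-1}, 2^{-k}], read off the grid."""
    ia, ib = n_steps >> (k + 1), n_steps >> k
    return (path[ia:ib] - path[ia]) / np.sqrt((ib - ia) * dt)

paths = np.cumsum(rng.normal(0.0, np.sqrt(dt), size=(200, n_steps)), axis=1)

# Steps 1-2: endpoints of the first two rescaled blocks
X0 = np.array([rescale(p, 0)[-1] for p in paths])  # k = 0: interval [1/2, 1]
X1 = np.array([rescale(p, 1)[-1] for p in paths])  # k = 1: interval [1/4, 1/2]
print(X0.var(), X1.var(), np.corrcoef(X0, X1)[0, 1])   # ≈ 1, ≈ 1, ≈ 0

# Steps 3-4: does some dyadic block of a path stay within eps of g?
def has_hit(path):
    for k in range(K):
        m = n_steps >> (k + 1)        # number of grid points in block k
        t = np.arange(m) / m          # local time grid in [0, 1)
        if np.max(np.abs(rescale(path, k) - g(t))) < eps:
            return True
    return False

frac = np.mean([has_hit(p) for p in paths])
print(frac)   # most paths already contain an eps-copy of g among 12 blocks
```

With infinitely many blocks, Borel-Cantelli upgrades "most paths" to "almost every path".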

Truly mind-boggling!