Considering Brownian bridge as conditioned Brownian motion

Let $B$ be a standard Brownian motion. Define a Brownian bridge $b$ by $b_t=B_t-tB_1$. Let $\mathbb{W}'$ be the law of this process.

According to Wikipedia,

> A Brownian bridge is a continuous-time stochastic process B(t) whose probability distribution is the conditional probability distribution of a Wiener process W(t) (a mathematical model of Brownian motion) given the condition that B(0) = B(1) = 0.

Surely it makes no sense to condition on a probability 0 event? So I'm trying to show that $\mathbb{W}'$ is the weak limit as $\epsilon\to 0$ of Brownian motion conditioned upon the event $\{|B_1|\leq \epsilon\}$. How do we prove this?

Thank you.


Brownian motion $B_t$ on the interval $[0,1]$ can be decomposed into two independent components: the process $X_t=B_t-tB_1$ and the random variable $Y=B_1$. As these are jointly normal, to prove that they are independent it is enough to show that the covariance vanishes: ${\rm Cov}(X_t,Y)={\rm Cov}(B_t,B_1)-t\,{\rm Var}(B_1)=t-t=0$.
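This independence can be checked empirically. Here is a minimal Monte Carlo sketch in NumPy (the path count, step count, and seed are arbitrary choices) estimating ${\rm Cov}(X_t, Y)$ across a grid of times:

```python
import numpy as np

rng = np.random.default_rng(0)
n_paths, n_steps = 50_000, 50
dt = 1.0 / n_steps

# Brownian motion on [0, 1] via cumulative sums of Gaussian increments.
increments = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
B = np.cumsum(increments, axis=1)      # B[:, k] approximates B at time (k+1)*dt
t = np.arange(1, n_steps + 1) * dt

Y = B[:, -1]                           # Y = B_1
X = B - t * Y[:, None]                 # X_t = B_t - t * B_1 (the bridge part)

# Both X_t and Y are centered, so the mean product estimates Cov(X_t, Y).
cov = (X * Y[:, None]).mean(axis=0)
print(np.max(np.abs(cov)))            # small for every t, up to sampling noise
```

The empirical covariances are zero up to Monte Carlo error of order $1/\sqrt{n_{\rm paths}}$.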

Since $B_t = X_t + tY$, the distribution of $B$ conditional on $\vert B_1\vert < \epsilon$ is the same as that of $X$ plus the independent term $tY$, with $Y$ conditioned on $\vert Y\vert < \epsilon$. As $\epsilon$ goes to zero, the term $tY$ tends uniformly to zero, so this converges to the distribution of $X$, which is a Brownian bridge.
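To see the convergence numerically, one can keep only the simulated paths with $\vert B_1\vert<\epsilon$ and compare their variance at each time with the bridge variance $t(1-t)$. A rough sketch (the sample sizes, $\epsilon$, and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
n_paths, n_steps, eps = 200_000, 50, 0.1
dt = 1.0 / n_steps

inc = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
B = np.cumsum(inc, axis=1)             # B[:, k] approximates B at time (k+1)*dt
t = np.arange(1, n_steps + 1) * dt

# Brownian motion conditioned to end near 0: keep paths with |B_1| < eps.
cond = B[np.abs(B[:, -1]) < eps]

# The bridge b_t = B_t - t*B_1 has Var(b_t) = t*(1 - t).
var_cond = cond.var(axis=0)
var_bridge = t * (1.0 - t)
print(len(cond), np.max(np.abs(var_cond - var_bridge)))
```

Shrinking `eps` moves the conditioned variances closer to $t(1-t)$, at the cost of fewer retained paths.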


The question says: "Surely it makes no sense to condition on a probability 0 event?"

Suppose $X,Y$ are jointly normally distributed random variables. "Jointly" means every linear combination of them is a normally distributed random variable. Suppose their means and variances are $\mu_X$, $\mu_Y$, $\sigma_X^2$, and $\sigma_Y^2$, and their correlation is $\rho$. It is commonplace to read in textbooks that the conditional distribution of $X$ given the probability-0 event that $Y=y$ is normal with expected value $$E(X\mid Y=y) = \mu_X + \rho\sigma_X\left(\frac{y-\mu_Y}{\sigma_Y}\right)$$ and variance $$(1-\rho^2)\sigma_X^2.$$ So you're conditioning on a probability-$0$ event.
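The textbook formula can itself be checked by the same limiting procedure: sample the pair, keep only draws with $Y$ in a small window around $y$, and compare the empirical conditional mean and variance with the formulas. A sketch (the particular parameter values and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
mu_x, mu_y = 1.0, -2.0
sigma_x, sigma_y, rho = 2.0, 0.5, 0.6
n, y, eps = 2_000_000, -1.5, 0.02

# Jointly normal pair with the prescribed moments and correlation rho.
Z1, Z2 = rng.normal(size=(2, n))
Y = mu_y + sigma_y * Z1
X = mu_x + sigma_x * (rho * Z1 + np.sqrt(1.0 - rho**2) * Z2)

# Approximate conditioning on the probability-0 event {Y = y}
# by the small-probability window |Y - y| < eps.
sel = np.abs(Y - y) < eps
emp_mean, emp_var = X[sel].mean(), X[sel].var()

th_mean = mu_x + rho * sigma_x * (y - mu_y) / sigma_y
th_var = (1.0 - rho**2) * sigma_x**2
print(emp_mean, th_mean)   # close, up to Monte Carlo error
print(emp_var, th_var)     # close, up to Monte Carlo error
```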

Going a small step further, one can condition on a random variable rather than on an event, and get $$E(X\mid Y) = \mu_X + \rho\sigma_X\left(\frac{Y-\mu_Y}{\sigma_Y}\right),$$ and that is a random variable in its own right. It is the random variable whose expectation one finds in the "law of total expectation" $$ E(E(X\mid Y))=E(X), $$ and the "law of total variance": $$ \operatorname{var}(X) = \operatorname{var}(E(X\mid Y)) + E(\operatorname{var}(X \mid Y)). $$
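Both identities are easy to verify by simulation for a jointly normal pair, plugging the sampled $Y$ into the conditional-mean formula (the parameter values and seed below are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)
mu_x, mu_y = 1.0, -2.0
sigma_x, sigma_y, rho = 2.0, 0.5, 0.6
n = 1_000_000

Z1, Z2 = rng.normal(size=(2, n))
Y = mu_y + sigma_y * Z1
X = mu_x + sigma_x * (rho * Z1 + np.sqrt(1.0 - rho**2) * Z2)

# E(X | Y) as a random variable: substitute the sampled Y into the formula.
EX_given_Y = mu_x + rho * sigma_x * (Y - mu_y) / sigma_y
var_X_given_Y = (1.0 - rho**2) * sigma_x**2   # constant in the normal case

# Law of total expectation: E(E(X|Y)) = E(X).
print(EX_given_Y.mean(), X.mean())
# Law of total variance: Var(X) = Var(E(X|Y)) + E(Var(X|Y)).
print(X.var(), EX_given_Y.var() + var_X_given_Y)
```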