Examples of analysis results using probability theory

Sometimes, nice results from analysis appear unexpectedly in probability theory.

Here are a couple of examples:

$1.$ If $Z \sim \mathcal{N}(0,1)$, then $Z^2 \sim \Gamma(1/2,2)$

To prove this, one computes that $Z^2$ has density function $x \mapsto \frac{1}{\sqrt{2\pi}} x^{-1/2} e^{-x/2}$ for $x > 0$. Comparing this with the density of the $\Gamma(1/2,2)$ distribution, and using the fact that $\int_{-\infty}^{+\infty} f(x)\,dx = 1$ for every density function $f$, it follows that $\boxed{\Gamma(1/2) = \sqrt{\pi}}$
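To see where this density comes from (the standard change-of-variables step, spelled out): for $x > 0$,

$$\mathbb{P}(Z^2 \leq x) = \mathbb{P}(-\sqrt{x} \leq Z \leq \sqrt{x}) = \int_{-\sqrt{x}}^{\sqrt{x}} \frac{1}{\sqrt{2\pi}} e^{-t^2/2}\,dt,$$

and differentiating in $x$ gives $f_{Z^2}(x) = \frac{1}{\sqrt{2\pi}} x^{-1/2} e^{-x/2}$. The $\Gamma(1/2,2)$ density is $\frac{1}{\Gamma(1/2)\,2^{1/2}} x^{-1/2} e^{-x/2}$, so matching the normalizing constants gives $\Gamma(1/2)\sqrt{2} = \sqrt{2\pi}$, i.e. $\Gamma(1/2) = \sqrt{\pi}$.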

$2.$ If $X \sim \Gamma(\alpha_1, \beta), Y \sim \Gamma( \alpha_2, \beta)$ and $X,Y$ are independent, then $X+Y \sim\Gamma(\alpha_1 + \alpha_2, \beta)$

While proving this, one can find the identity

$$\boxed{\int_0^1 u^{\alpha_1 -1}(1-u)^{\alpha_2 -1}du = \frac{\Gamma(\alpha_1)\Gamma(\alpha_2)}{\Gamma(\alpha_1 + \alpha_2)}}$$
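To sketch how the identity appears: by independence, the density of $X+Y$ at $z > 0$ is the convolution

$$f_{X+Y}(z) = \int_0^z \frac{x^{\alpha_1-1}e^{-x/\beta}}{\Gamma(\alpha_1)\beta^{\alpha_1}} \cdot \frac{(z-x)^{\alpha_2-1}e^{-(z-x)/\beta}}{\Gamma(\alpha_2)\beta^{\alpha_2}}\,dx = \frac{z^{\alpha_1+\alpha_2-1}e^{-z/\beta}}{\Gamma(\alpha_1)\Gamma(\alpha_2)\beta^{\alpha_1+\alpha_2}} \int_0^1 u^{\alpha_1-1}(1-u)^{\alpha_2-1}\,du,$$

where we substituted $x = uz$. Since $f_{X+Y}$ has the shape of the $\Gamma(\alpha_1+\alpha_2,\beta)$ density and must integrate to $1$, the remaining integral has to equal $\Gamma(\alpha_1)\Gamma(\alpha_2)/\Gamma(\alpha_1+\alpha_2)$.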

So my question is: what are other examples where we can find interesting results from analysis (or other branches of mathematics) using probability theory?


Solution 1:

  1. The first example I can think of is definitely Weierstrass's approximation theorem, which says that every continuous function on a closed interval can be uniformly approximated by polynomials. This can be proved as follows (source: Grimmett & Welsh, Probability: An Introduction, 2nd edition):

Let $(X_i)$ be a sequence of i.i.d. Bernoulli$(p)$ random variables, so that $\mathbb{P}(X_i=0)=1-p$ and $\mathbb{P}(X_i=1)=p$ for all $i\in\mathbb{N}$.

a) Let $f$ be a continuous function on $[0,1]$ and prove that $$B_n(p)=\mathbb{E}\left(f\left(\frac{\sum_{i=1}^nX_i}{n}\right)\right)$$ is a polynomial in $p$ of degree at most $n$.

b) Use Chebyshev's inequality to prove that for all $p$ such that $0\leq p\leq1$ and for any $\epsilon>0$, $$\sum_{k\in K}\binom{n}{k}p^k(1-p)^{n-k}\leq\frac{1}{4n\epsilon^2},$$ where $K=\{k:0\leq k\leq n, |k/n-p|>\epsilon\}$.

c) Using this and the fact that $f$ is bounded and uniformly continuous on $[0,1]$, prove the following version of the Weierstrass approximation theorem: $$\lim_{n\to\infty}\sup_{0\leq p\leq1}|f(p)-B_n(p)|=0.$$ (A numerical sketch of these Bernstein polynomials follows below.)
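To illustrate (a)–(c) numerically, here is a minimal sketch; the test function $f(x) = |x - 1/2|$ and the evaluation grid are my own arbitrary choices:

```python
import numpy as np
from scipy.stats import binom

# Any continuous f on [0, 1] works; this particular choice is arbitrary.
f = lambda x: np.abs(x - 0.5)

def bernstein(f, n, p):
    """B_n(p) = E[f(S_n / n)] with S_n ~ Binomial(n, p), a polynomial in p."""
    k = np.arange(n + 1)
    return binom.pmf(k, n, p) @ f(k / n)

ps = np.linspace(0, 1, 201)
for n in (10, 100, 1000):
    sup_err = max(abs(f(p) - bernstein(f, n, p)) for p in ps)
    print(n, sup_err)  # the sup-norm error shrinks as n grows
```

The decrease is slow for this non-smooth $f$, but the uniform convergence predicted by (c) is clearly visible.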

  2. Another nice result is $$\lim_{n\to\infty}\left(\exp(-n)\sum_{k=0}^n\frac{n^k}{k!}\right)=\frac12,$$ which can be proved by applying the Central Limit Theorem to a sequence of i.i.d. $\mathrm{Pois}(1)$ distributed random variables: the sum $S_n$ of $n$ such variables is $\mathrm{Pois}(n)$ distributed, the left-hand side equals $\mathbb{P}(S_n\leq n)$, and by the CLT this converges to $\Phi(0)=\frac12$. (A quick numerical check follows below.)
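Since the partial sum $e^{-n}\sum_{k=0}^n n^k/k!$ is exactly the CDF of a $\mathrm{Pois}(n)$ variable evaluated at $n$, it can be checked directly; a minimal sketch (the values of $n$ are arbitrary):

```python
from scipy.stats import poisson

# e^{-n} * sum_{k=0}^{n} n^k / k!  ==  P(Pois(n) <= n)
for n in (10, 100, 10_000, 1_000_000):
    print(n, poisson.cdf(n, n))  # approaches 0.5 (slowly, from above)
```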

  3. Stirling's approximation, which first appeared in de Moivre's The Doctrine of Chances, was originally a probabilistic result; it has since become just as important in analysis. Three quite elementary probabilistic proofs can be found here.

  4. There is a probabilistic proof on this site of the fact that $1/\zeta(s) = \prod_p(1-p^{-s})$, where $\zeta(\cdot)$ is the Riemann zeta function and the product on the right-hand side ranges over all primes. The accepted answer to the linked post only uses probability theory! (A numerical sanity check of the identity follows below.)
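Not a proof, of course, but the identity is easy to sanity-check numerically; a minimal sketch (the choice $s = 2$, where $1/\zeta(2) = 6/\pi^2$, and the prime cutoff are arbitrary):

```python
import math

def primes_up_to(n):
    """Sieve of Eratosthenes."""
    sieve = [True] * (n + 1)
    sieve[0] = sieve[1] = False
    for i in range(2, int(n**0.5) + 1):
        if sieve[i]:
            sieve[i*i::i] = [False] * len(sieve[i*i::i])
    return [i for i, is_prime in enumerate(sieve) if is_prime]

s = 2.0
partial_product = 1.0
for p in primes_up_to(10_000):
    partial_product *= 1 - p ** (-s)

print(partial_product)   # partial Euler product over primes <= 10^4
print(6 / math.pi**2)    # 1/zeta(2)
```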

  5. [This is more of a graph-theory result.] There is even a Wikipedia page on the probabilistic method, pioneered by Erdős. I especially like the first example on that page, which gives a non-constructive proof that (for a complete graph which is not too large relative to $r$) the edges can be coloured with two colours so that no complete subgraph on $r$ vertices is monochromatic. The counting bound behind that example is sketched below.
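A minimal sketch of that first-moment bound (the parameter values are arbitrary): under a uniformly random two-colouring of the edges of $K_n$, the expected number of monochromatic copies of $K_r$ is $\binom{n}{r} 2^{1-\binom{r}{2}}$, and whenever this is less than $1$, some colouring with no monochromatic $K_r$ must exist.

```python
from math import comb

def expected_monochromatic_cliques(n, r):
    """E[# monochromatic K_r] when each edge of K_n gets one of two
    colours independently with probability 1/2."""
    return comb(n, r) * 2 ** (1 - comb(r, 2))

r = 5
for n in (10, 11, 12, 13):
    e = expected_monochromatic_cliques(n, r)
    print(n, e, "good colouring exists" if e < 1 else "bound inconclusive")
```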

Solution 2:

There exist continuous functions $f:[0,1] \to \mathbb{R}$ which are nowhere differentiable.

The so-called Brownian motion is a stochastic process whose sample paths are (with probability one) Hölder continuous of every order less than $1/2$, yet nowhere differentiable. This shows, in particular, the existence of functions with the above properties.

Moreover, there is a close connection between PDEs and Brownian motion, so Brownian motion can be used to give probabilistic proofs of PDE results, for instance to study existence and uniqueness of solutions to the heat equation or the Dirichlet problem. Take a look at the book Brownian Motion by Schilling & Partzsch if you are interested in the topic. (A small numerical illustration of the heat-equation connection follows.)
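As a small illustration of that connection (a minimal sketch, not taken from the book): for the heat equation $\partial_t u = \frac{1}{2}\partial_x^2 u$ with initial condition $u(0,\cdot) = f$, the solution admits the stochastic representation $u(t,x) = \mathbb{E}[f(x + B_t)]$, which can be estimated by plain Monte Carlo. The initial condition $f(x) = x^2$ is chosen because the exact solution $u(t,x) = x^2 + t$ is available for comparison.

```python
import numpy as np

rng = np.random.default_rng(0)

f = lambda x: x**2  # initial condition; exact solution is u(t, x) = x**2 + t

def u_monte_carlo(t, x, n_samples=1_000_000):
    """Estimate u(t, x) = E[f(x + B_t)] by sampling B_t ~ N(0, t)."""
    b_t = rng.normal(0.0, np.sqrt(t), n_samples)
    return f(x + b_t).mean()

t, x = 0.5, 1.0
print("Monte Carlo estimate:", u_monte_carlo(t, x))
print("exact solution      :", x**2 + t)
```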


Lipschitz continuous functions are almost everywhere differentiable.

There is a probabilistic proof of this statement which relies on the martingale convergence theorem, see this question here for details.


Numerical calculation of $\pi$

The strong law of large numbers can be used to compute $\pi$ numerically. Indeed, if we consider a sequence of independent random variables $(X_n)_{n \geq 1}$ which are uniformly distributed on the square $[-1,1] \times [-1,1]$, then

$$\frac{1}{n} \sum_{i=1}^n 1_{\{|X_i| \leq 1\}}(\omega) = \frac{1}{n} \#\{1 \leq i \leq n : |X_i(\omega)| \leq 1\}$$

converges almost surely to $\pi/4$ as $n \to \infty$; here $|\cdot|$ is the Euclidean norm, so $\{|X_i| \leq 1\}$ is the event that $X_i$ lands in the unit disk, whose probability is the ratio of the disk's area to the square's, namely $\pi/4$. Sampling such a sequence $(X_n)_{n \geq 1}$ is pretty easy, and therefore this is a nice way to calculate $\pi$ numerically; a minimal sketch follows.
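A minimal sketch of the estimator (sample size and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(42)

n = 1_000_000
# Independent points, uniform on the square [-1, 1] x [-1, 1].
points = rng.uniform(-1.0, 1.0, size=(n, 2))
# Fraction inside the unit disk converges a.s. to pi/4 (SLLN).
inside = (points**2).sum(axis=1) <= 1.0
print(4 * inside.mean())  # ~3.14...; the error decays like n^{-1/2}
```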


Fundamental theorem of algebra

There is a probabilistic proof of the fundamental theorem of algebra; it relies on a martingale convergence theorem and the (neighbourhood) recurrence of Brownian motion in dimension $d=2$; see here or the book by Rogers & Williams for details.


Open mapping theorem

There is a probabilistic proof of the open mapping theorem for analytic functions, see this article; the proof relies on the conformal invariance of Brownian motion.


There exist normal numbers.

The existence of normal numbers can be shown by applying the strong law of large numbers: if the digits of a number in $[0,1]$ are drawn i.i.d. uniformly, then every fixed block of digits occurs with its expected asymptotic frequency almost surely. In this way Borel used probabilistic methods to prove that Lebesgue-almost all real numbers are normal.


Remark: Note that there are two similar threads on MathOverflow (No. 1, No. 2) with plenty of examples!

Solution 3:

1) A classic problem first investigated by Erdős is:

Let $a_{0}=1$ and $$a_{n}=a_{\left\lfloor n/2\right\rfloor}+a_{\left\lfloor n/3 \right\rfloor}+a_{\left\lfloor n/6\right\rfloor}.$$ Show that

$$\lim_{n\to\infty}\dfrac{a_{n}}{n}=\dfrac{12}{\log{432}},$$

where $\lfloor x \rfloor$ is the largest integer not greater than $x$. There's a wonderful solution involving optional stopping and Markov processes: How prove this nice limit $\lim\limits_{n\to\infty}\frac{a_{n}}{n}=\frac{12}{\log{432}}$
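The limit can also be checked numerically. A minimal sketch (memoised recursion is cheap here, since repeated floor divisions by $2$, $3$ and $6$ reach only $O(\log^2 n)$ distinct indices):

```python
import math
import sys
from functools import lru_cache

sys.setrecursionlimit(100_000)

@lru_cache(maxsize=None)
def a(n):
    # a_0 = 1 and a_n = a_{floor(n/2)} + a_{floor(n/3)} + a_{floor(n/6)}
    if n == 0:
        return 1
    return a(n // 2) + a(n // 3) + a(n // 6)

for n in (10**6, 10**9, 10**12):
    print(n, a(n) / n)  # converges (slowly) to the limit below
print(12 / math.log(432))  # ~1.9774
```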