Step in the proof of the Kolmogorov 0-1 law

I'm trying to understand the proof of the Kolmogorov 0-1 law, and I'm stuck on the following lemma:

Let $A_1, A_2, ..., B_1, B_2, ...$ be a collection of independent events. Then $\sigma(A_1, A_2, ...), B_1, B_2, ...$ are independent.

It relies on the next statement:

Let $B_1, B_2, ...$ be independent. Then $\sigma(B_1, ..., B_{i-1}, B_{i+1}, ...)$ and $\sigma(B_i)$ are independent classes.

The problem is that the author says the proof of this statement is quite lengthy and involves results on extension theorems, so he omits it.

Maybe there is another way to prove this lemma, or do you know a proof of the statement?


Dynkin's $\pi$-$\lambda$ theorem allows us to prove many results of this type. It is a very useful technical lemma from measure theory, and its proof isn't too hard either, so you should definitely familiarize yourself with it (I'm not going to write out the proof; you can easily google it). It is one of those "extension theorems": we start with a property which we know to be true on a "small"/manageable collection of sets, and then with a little bit of work we deduce that it is true for a much larger collection of sets (the $\sigma$-algebra generated by the collection we started from).
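
For reference, here is one standard formulation, and the one invoked below (recall that a $\lambda$-system on $\Omega$ is a class containing $\Omega$ which is closed under complements and under countable disjoint unions):
\begin{align}
\mathscr{P}\text{ a $\pi$-system},\quad \mathscr{L}\text{ a $\lambda$-system},\quad \mathscr{P}\subset\mathscr{L}\quad\implies\quad \sigma(\mathscr{P})\subset\mathscr{L}.
\end{align}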


Here is an example theorem (useful for what you're asking about):

Theorem $1$.

Let $(\Omega,\mathscr{A}, \Bbb{P})$ be a probability space, and let $\{\mathscr{P}_i\}_{i\in I}$ be $\pi$-systems contained in $\mathscr{A}$. If $\{\mathscr{P}_i\}_{i\in I}$ is independent, then the collection of generated $\sigma$-algebras $\{\sigma(\mathscr{P}_i)\}_{i\in I}$ is also independent.

Recall that a $\pi$-system on a set $\Omega$ just means a collection $\mathscr{P}\subset \text{power set of $\Omega$}$ which is closed under finite intersections: $A,B\in\mathscr{P}$ implies $A\cap B\in \mathscr{P}$. Also, we say an arbitrary indexed collection of classes of subsets of $\Omega$ is independent if every finite subcollection is independent, i.e. for any finitely many distinct indices $i_1,\dots,i_n\in I$ and any sets $A_{i_1}\in\mathscr{P}_{i_1},\dots,A_{i_n}\in\mathscr{P}_{i_n}$, we have $\Bbb{P}(A_{i_1}\cap\dots\cap A_{i_n})=\Bbb{P}(A_{i_1})\cdots\Bbb{P}(A_{i_n})$.
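
As a tiny example that is directly relevant to your question: for a single event $B$, the class $\{B\}$ is trivially a $\pi$-system (since $B\cap B=B$), and
\begin{align}
\sigma(\{B\})=\{\emptyset,\ B,\ B^c,\ \Omega\},
\end{align}
so $\sigma(B_i)$ in your second statement is just this four-element $\sigma$-algebra.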

By this definition, we just have to prove the following: for each $n\in\Bbb{N}$, if $\mathscr{P}_1,\dots, \mathscr{P}_n$ are independent $\pi$-systems, then $\sigma(\mathscr{P}_1),\dots, \sigma(\mathscr{P}_n)$ are independent $\sigma$-algebras (i.e. there is no loss of generality in assuming the index set is $I=\{1,\dots, n\}$). We may also assume $\Omega\in\mathscr{P}_i$ for each $i$ (adjoining $\Omega$ changes neither the $\pi$-system property, nor the independence, nor the generated $\sigma$-algebras), so that choosing one set from each $\mathscr{P}_i$ covers all finite subcollections. This is easy: define \begin{align} \mathscr{L}:= \{A\in \mathscr{A}\,|\,&\text{for any $A_2\in\mathscr{P}_2,\dots, A_n\in \mathscr{P}_n$,}\\ &\text{we have $\Bbb{P}(A\cap A_2\cap \dots \cap A_n)= \Bbb{P}(A)\cdot \Bbb{P}(A_2)\cdots \Bbb{P}(A_n)$} \} \end{align} Now, $\mathscr{L}$ contains $\Omega$ (because $\mathscr{P}_2,\dots,\mathscr{P}_n$ are independent and $\Bbb{P}(\Omega)=1$), it contains $\mathscr{P}_1$ (this is exactly the independence hypothesis), it is closed under complementation (use basic set identities and the fact that $\Bbb{P}$ is a probability measure; the computation is written out below), and it is clearly closed under countable disjoint unions (countable additivity on both sides). So $\mathscr{L}$ is a $\lambda$-system containing the $\pi$-system $\mathscr{P}_1$, hence by Dynkin's theorem, $\sigma(\mathscr{P}_1)\subset \mathscr{L}$. In words, this says that $\sigma(\mathscr{P}_1),\mathscr{P}_2, \dots, \mathscr{P}_n$ are independent.
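
To spell out the closure under complementation: if $A\in\mathscr{L}$ and $A_2\in\mathscr{P}_2,\dots,A_n\in\mathscr{P}_n$, then
\begin{align}
\Bbb{P}(A^c\cap A_2\cap\dots\cap A_n) &= \Bbb{P}(A_2\cap\dots\cap A_n)-\Bbb{P}(A\cap A_2\cap\dots\cap A_n)\\
&= \Bbb{P}(A_2)\cdots\Bbb{P}(A_n)-\Bbb{P}(A)\,\Bbb{P}(A_2)\cdots\Bbb{P}(A_n)\\
&= \big(1-\Bbb{P}(A)\big)\,\Bbb{P}(A_2)\cdots\Bbb{P}(A_n)
= \Bbb{P}(A^c)\,\Bbb{P}(A_2)\cdots\Bbb{P}(A_n),
\end{align}
where the first equality holds because $A\cap(A_2\cap\dots\cap A_n)$ and $A^c\cap(A_2\cap\dots\cap A_n)$ partition $A_2\cap\dots\cap A_n$, and the second uses $\Omega\in\mathscr{L}$ together with $A\in\mathscr{L}$. Hence $A^c\in\mathscr{L}$.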

Now, we can repeat this argument: define $\mathscr{P}_1':= \mathscr{P}_2,\dots, \mathscr{P}_{n-1}':=\mathscr{P}_n, \mathscr{P}_n':=\sigma(\mathscr{P}_1)$. Then $\{\mathscr{P}_j'\}_{j=1}^n$ are also independent $\pi$-systems (a $\sigma$-algebra is in particular a $\pi$-system), so by the above argument it follows that $\sigma(\mathscr{P}_1'),\mathscr{P}_2',\dots, \mathscr{P}_n'$ are independent. In other words, $\sigma(\mathscr{P}_2),\mathscr{P}_3,\dots, \mathscr{P}_{n},\sigma(\mathscr{P}_1)$ are independent. Continuing inductively (the case $n=3$ is written out below), it easily follows that $\sigma(\mathscr{P}_1),\dots, \sigma(\mathscr{P}_n)$ are independent, thereby completing the proof of the theorem.
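
For instance, with $n=3$ the successive conclusions read
\begin{align}
&\sigma(\mathscr{P}_1),\ \mathscr{P}_2,\ \mathscr{P}_3\ \text{independent}\\
\implies\ &\sigma(\mathscr{P}_2),\ \mathscr{P}_3,\ \sigma(\mathscr{P}_1)\ \text{independent}\\
\implies\ &\sigma(\mathscr{P}_3),\ \sigma(\mathscr{P}_1),\ \sigma(\mathscr{P}_2)\ \text{independent},
\end{align}
where each implication is an application of the $\lambda$-system argument, with $\mathscr{P}_2$ and then $\mathscr{P}_3$ playing the role of $\mathscr{P}_1$.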


Now, we can take this one step further: if we have independent $\pi$-systems, then we can group some of them together, generate a $\sigma$-algebra from each group, and the resulting $\sigma$-algebras will still be independent. More precisely:

Theorem $2$ (Grouping theorem).

Let $(\Omega,\mathscr{A}, \Bbb{P})$ be a probability space, and $\{\mathscr{P}_i\}_{i\in I}$ be independent $\pi$-systems contained in $\mathscr{A}$. Suppose we partition our index set into $I=\bigcup_{t\in T}I_t$; for each $t\in T$, define \begin{align} \mathscr{A}_t:=\sigma\left(\bigcup_{i\in I_t}\mathscr{P}_i\right). \end{align} Then, the $\sigma$-algebras $\{\mathscr{A}_t\}_{t\in T}$ are independent.

For each $t\in T$, let $\mathscr{C}_t$ be the $\pi$-system generated by $\bigcup\limits_{i\in I_t}\mathscr{P}_i$; concretely, $\mathscr{C}_t$ consists of all finite intersections of sets from $\bigcup_{i\in I_t}\mathscr{P}_i$. Then it is easy to see that $\mathscr{A}_t= \sigma(\mathscr{C}_t)$, so by Theorem 1 it is enough to show that the $\pi$-systems $\{\mathscr{C}_t\}_{t\in T}$ are independent. This follows almost immediately from the independence of $\{\mathscr{P}_i\}_{i\in I}$ (I feel like the more I try to explain this, the more complicated it will sound; you just write out the condition "probability of the intersection equals the product of the probabilities" and it becomes obvious — see the computation below). This completes the proof of Theorem 2.
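
To write it out in the case of two groups $t\neq s$: since each $\mathscr{P}_i$ is itself a $\pi$-system, a generic element of $\mathscr{C}_t$ can be written as $A_{i_1}\cap\dots\cap A_{i_k}$ with distinct $i_1,\dots,i_k\in I_t$ and $A_{i_r}\in\mathscr{P}_{i_r}$ (sets coming from the same $\mathscr{P}_i$ may be merged into one). Then
\begin{align}
\Bbb{P}\big((A_{i_1}\cap\dots\cap A_{i_k})\cap(A_{j_1}\cap\dots\cap A_{j_m})\big)
&= \Bbb{P}(A_{i_1})\cdots\Bbb{P}(A_{i_k})\cdot \Bbb{P}(A_{j_1})\cdots\Bbb{P}(A_{j_m})\\
&= \Bbb{P}(A_{i_1}\cap\dots\cap A_{i_k})\cdot \Bbb{P}(A_{j_1}\cap\dots\cap A_{j_m}),
\end{align}
where all three factorizations use the independence of $\{\mathscr{P}_i\}_{i\in I}$, and the indices $i_1,\dots,i_k\in I_t$ and $j_1,\dots,j_m\in I_s$ are all distinct because $I_t\cap I_s=\emptyset$. More than two groups works the same way.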


I hope you now realize why $\pi$-systems are so natural when talking about independence: the definition of $\pi$-systems is about closure under finite intersection, while independence is a statement about probabilities of intersections of events.

In your special case, you're given two sequences of events $A_1,A_2,\dots$ and $B_1,B_2,\dots$ which together form an independent collection. The singleton classes $\{A_1\}, \{A_2\},\dots, \{B_1\},\{B_2\},\dots$ are very clearly $\pi$-systems, and they are independent. Hence, by the grouping theorem, we can group all the $A$'s together and all the $B$'s together to conclude that $\sigma\left(\{A_k\}_{k=1}^{\infty}\right)$ and $\sigma\left(\{B_k\}_{k=1}^{\infty}\right)$ are independent $\sigma$-algebras. This proves your first statement (and is actually slightly stronger). The second statement you quoted follows in exactly the same way: group $\{B_i\}$ by itself, and all the remaining $\{B_j\}$, $j\neq i$, together.
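
Explicitly, in the notation of Theorem 2, the grouping for your second statement is
\begin{align}
T=\{1,2\},\qquad I_1=\{i\},\qquad I_2=\Bbb{N}\setminus\{i\},
\end{align}
which gives $\mathscr{A}_1=\sigma(B_i)$ and $\mathscr{A}_2=\sigma(B_1,\dots,B_{i-1},B_{i+1},\dots)$, and Theorem 2 says exactly that these two $\sigma$-algebras are independent.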