An informal description of forcing.

I don't know how "non-jargonish" you want your answer to be, but I'll try a very short outline and hopefully it will work:

Given a model $M$ (usually a transitive model of ZFC), any poset $(P,<)$ in it is a notion of forcing, and its elements are forcing conditions. A filter $G\subseteq P$ is said to be generic over $M$ if it has nonempty intersection with every dense subset of $P$ that belongs to $M$. A central theorem states that for a transitive model $M$ of ZFC and a generic filter $G\subseteq P$ there is a transitive model $M[G]$ of ZFC that extends $M$. Associated with this construction we define a forcing relation $\Vdash$: for any formula $\varphi$ of the forcing language and any generic $G$, we have $M[G]\vDash \varphi$ iff $(\exists p \in G)\, p\Vdash \varphi$.
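To make these definitions concrete, here is a toy sketch of my own (not part of the standard presentation): conditions in the Cohen poset are finite partial functions from $\omega$ to $\{0,1\}$, ordered by extension, and for each $n$ the set $D_n$ of conditions defined at $n$ is dense. A truly generic filter must meet every dense set in $M$; below we only meet finitely many, in the spirit of the Rasiowa–Sikorski lemma.

```python
# Toy Cohen poset: conditions are finite partial functions omega -> {0,1},
# represented as dicts, ordered by extension (stronger = more information).

def extends(p, q):
    """p is stronger than q: p extends q as a partial function."""
    return all(p.get(n) == v for n, v in q.items())

def dense_dom(n):
    """D_n = conditions that decide the value at n -- a dense set:
    any condition can be extended into D_n."""
    def extend_into(p):
        if n in p:
            return p            # p is already in D_n
        q = dict(p)
        q[n] = 0                # any extension deciding n works
        return q
    return extend_into

# Build a condition meeting D_0, ..., D_9 by repeatedly extending:
p = {}
for n in range(10):
    p = dense_dom(n)(p)

print(sorted(p))                  # p now decides the values at 0..9
print(extends(p, {0: 0, 3: 0}))   # True: p is below this weaker condition
```

The point of the toy is only the shape of the definitions: density means "every condition can be strengthened into the set," and genericity means the filter actually meets all such sets from the ground model.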

In summary, forcing is a way of extending models to produce new ones in which certain formulas can be shown to hold; this is what allows us to carry out (or complete) independence proofs. The new model is determined by a poset and a generic filter, and these give rise to a forcing relation that can be used to show that the extension indeed satisfies the desired formulas.

With that said, given the "right" choice of $P$ and $G$, we can produce from $M$ a model in which $\neg \textbf{CH}$ (the negation of the continuum hypothesis) holds. Together with the fact that there is a model in which $\textbf{CH}$ holds (this can be shown more "easily," without the need of forcing; you can find some proofs in the books I'll recommend), this completes a proof of the independence of $\textbf{CH}$. A similar argument (with some adjustments) also yields the independence of the Axiom of Choice, and much more.

Now I'll give you some directions on what you need to study to understand forcing at a technical level. First you must know some basic logic (the basics of syntax, how formulas are defined recursively, and some basic metatheorems) and basic model theory (basic definitions, soundness, consistency, completeness, compactness, and the Löwenheim–Skolem theorems); it's good if you also understand Gödel's incompleteness theorems, but only the main results; you don't have to dive into their proofs unless you are interested in doing so. With that background, you then have to study some axiomatic set theory to get a more solid grasp of notions such as ordinals, cardinals, transitivity, rank, $\Delta$-systems, and order theory. The last step is to study some basic properties of Boolean algebras, as the (in my opinion) most intelligible and modern approach uses Boolean-valued models.

All this and more can be found in the following books:

  • Set Theory: The Third Millennium Edition, Revised and Expanded (Jech);

  • Axiomatic Set Theory;

  • Set Theory: Boolean-Valued Models and Independence Proofs (Bell).


The existing answers are great; let me take a different tack and describe names.

Let's suppose I have some unknown set $X$. I can define "recipes" for building sets relative to $X$. (The technical term here is "names.") For example:

  • $Y=\emptyset$ if $7\in X$, and $Y=\mathbb{N}$ if $7\not\in X$.

  • $Y=\{n\in\mathbb{N}: 2n\in X\}$.

  • $Y=\{\{\{...\}\}\mbox{ ($n$ many brackets)}: n\in X\}$.

  • And so on.

Write "$Y[X]$" to mean "The evaluation of $Y$ given $X$." (So e.g. if $Y$ is the first recipe described above, and $X=\{2, 3, 4\}$, then $Y[X]=\mathbb{N}$.) We can even have recipes which call other recipes! Suppose I've defined recipes $Y_i$ ($i\in\mathbb{N}$). Now "$Z=\{Y_i[X]: i\in X\}$" is a recipe! And we can have recipes calling recipes calling recipes calling . . . and so on.
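Made concrete in Python (the function names here are mine, purely for illustration), the recipes above are just functions of the unknown set $X$:

```python
# Recipes ("names") modeled as functions of the unknown set X.
# N is a finite stand-in for the natural numbers; all names are illustrative.

N = set(range(20))

def recipe1(X):
    """Y = {} if 7 in X, and Y = N if 7 not in X."""
    return set() if 7 in X else N

def recipe2(X):
    """Y = {n in N : 2n in X}."""
    return {n for n in N if 2 * n in X}

def recipe3(X):
    """A recipe calling other recipes: Z = {Y_i[X] : i in X},
    where Y_i[X] = {n in X : n < i} (an arbitrary indexed family)."""
    return {frozenset(n for n in X if n < i) for i in X}

X = {2, 3, 4}
print(recipe1(X) == N)   # True: 7 is not in X, matching the example above
print(recipe2(X))        # {1, 2}, since 2*1 and 2*2 are in X
```

Evaluation "$Y[X]$" is just function application; nested recipes, like `recipe3`, simply call other recipes inside themselves.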

This gives a method for attempting to expand a model $V$ of ZFC. Take a set $X\subseteq V$ (maybe $X\not\in V$!), and let $V[X]$ be the set of all recipes in $V$ evaluated at $X$. This makes perfect sense. But . . .

Question. Is this groovy?

Note that on the face of it, there's no reason to expect anything nice to happen at all! Cohen amazingly showed (among other things) the following:

Theorem. For certain types of $X$ - namely, if $X$ is a $V$-generic filter on some poset $\mathbb{P}\in V$ - we have $V[X]\models ZFC$.

The proof of this is quite technical, and I think it's here that we need to actually do some work; but hopefully this helps explain what sort of object the generic extension (this is $V[X]$) is, and what it is we need to prove about it.


Let me say a little bit about the proof. The key idea is the forcing relation:

Definition. For $\mathbb{P}\in V$ a poset and $p\in\mathbb{P}$, we say $p$ forces $\varphi$ - and write "$p\Vdash\varphi$" - if for every generic (over $V$) filter $X$ containing $p$, $V[X]\models\varphi$. (Here $\varphi$ is a sentence that maybe also refers to recipes; and when I write "$V[X]\models \varphi$," we look at the version of $\varphi$ where all recipes are evaluated at $X$.)
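As a concrete instance (a standard fact, though this toy rendering is my own): in Cohen forcing, where conditions are finite partial functions $p:\omega\to\{0,1\}$ and the recipe in question is the canonical name for the generic set itself, the forcing relation for atomic membership statements can be read off directly from $p$:

```python
# Cohen forcing toy: conditions are finite partial functions omega -> {0,1},
# represented as dicts.  For the canonical name of the generic set,
# p forces "n in X" iff p(n) = 1, and p forces "n not in X" iff p(n) = 0;
# if n is outside dom(p), extensions of p can go either way, so p decides nothing.

def forces_in(p, n):
    """Every generic filter containing p yields a set with n in it."""
    return p.get(n) == 1

def forces_not_in(p, n):
    """Every generic filter containing p yields a set avoiding n."""
    return p.get(n) == 0

def decides(p, n):
    """p settles the statement 'n in X' one way or the other."""
    return n in p

p = {0: 1, 2: 0}
print(forces_in(p, 0))      # True
print(forces_not_in(p, 2))  # True
print(decides(p, 5))        # False
```

This illustrates the punchline of the definition: although "generic filter" quantifies over objects outside $V$, what $p$ forces is computable from $p$ alone.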

It turns out that the forcing relation is definable inside $V$, even though of course $V$ can't directly talk about generic filters! This turns out to be a very powerful tool; let me sketch an application.

Suppose $A\in V$ is a countable set, and $\mathbb{P}$ is countably closed - if $p_0\ge p_1\ge p_2\ge . . .$ is a descending $\omega$-chain of conditions, then there is some $p$ such that $p\le p_i$ for every $i$. Let $X$ be $\mathbb{P}$-generic over $V$. Then I claim that every subset of $A$ which is in $V[X]$ is already in $V$.
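For concreteness (an example of my own choosing, not one named in the argument), a standard countably closed poset is $$\mathbb{P}=\{p : p\mbox{ a countable partial function }\omega_1\to 2\},\qquad p\le q\iff p\supseteq q.$$ If $p_0\ge p_1\ge p_2\ge . . .$ is a descending chain, then $p=\bigcup_i p_i$ is still a countable partial function, hence a condition, and $p\le p_i$ for every $i$.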

Why? Well, suppose $B$ is a subset of $A$ which is in $V[X]$. Then $B=\nu[X]$ for some recipe $\nu$. Suppose WLOG that $\Vdash \nu\subseteq A$. (The fact that this is WLOG is not at all obvious, but skip that for now.) Now let $$E=\{p\in\mathbb{P}: \exists C\subseteq A, C\in V,\mbox{ such that }p\Vdash \nu=C\}$$ be the set of conditions which guarantee that $\nu$ isn't "new." I claim $E$ is dense in $\mathbb{P}$. If so, we're done, since $X$ (being generic) contains an element of $E$, and hence $\nu[X]\in V$.

To see this, let $q\in\mathbb{P}$ and write $A=\{a_0, a_1, a_2, . . .\}$. Now, since the forcing relation is definable, inside $V$ we may define a sequence of conditions $p_0, p_1, p_2, . . .$ such that

  • $q\ge p_0\ge p_1\ge p_2\ge . . .$, and

  • for each $i$, $p_i\Vdash a_i\in \nu$ or $p_i\Vdash a_i\not\in\nu$.

(Why the latter? Well, if we can't find a condition forcing $a_i\in\nu$, that must be because we've already forced $a_i\not\in\nu$! This takes proof, but isn't too hard - it's a good exercise.)

But since $\mathbb{P}$ is countably closed, and the sequence $\{p_i\}$ exists in $V$, we must have some $p\in\mathbb{P}$ such that $p\le p_i$ for every $i$. But then $p$ is in $E$, since $V$ can tell which $a_i$ are forced by $p$ to be in $\nu$!

So every element of $\mathbb{P}$ lies above some element of $E$ - that is, $E$ is dense.

This is the key step in showing how we can force the Continuum Hypothesis to be true. For forcing the Continuum Hypothesis to be false, we instead analyze a different combinatorial property - the countable chain condition. The key takeaway is that combinatorial properties of the poset translate into properties of the generic extension. But I think I'll stop here.