Estimation of $\pi$ using dice

Solution 1:

Each die roll generates $\log_2 6 \approx 2.58$ bits of entropy, and a sequence of die rolls can generate a uniformly distributed random real number in $[0,1]$ to any desired degree of precision. (For example, treat the die rolls as a sequence of base-6 digits, where a roll of 6 represents the digit 0.)
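As a small sketch of this encoding (the function name is mine, not part of the answer), here is the base-6 interpretation of a list of die rolls:

```python
def rolls_to_real(rolls):
    """Interpret die rolls (1..6) as base-6 digits of a number in [0, 1),
    with a roll of 6 contributing the digit 0."""
    x = 0.0
    for i, r in enumerate(rolls, start=1):
        digit = r % 6          # 6 -> 0; other rolls map to themselves
        x += digit / 6**i
    return x

# Rolls 3, 6, 1 encode 0.301 in base 6, i.e. 3/6 + 0/36 + 1/216.
print(rolls_to_real([3, 6, 1]))
```

Each additional roll narrows the number's location to an interval one-sixth as wide, which is exactly the refinement Solution 1 relies on.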

Generate two such random numbers, $x$ and $y$, appending digits to each until there are enough digits in both numbers to establish with certainty whether $x^2+y^2 < 1$ or $x^2+y^2>1$. (Equality occurs with probability $0$ and can be disregarded.) If $x^2+y^2<1$, add a tally to the "in" column; otherwise add a tally to the "out" column.

After generating $n = \mbox{in}+\mbox{out}$ such pairs, and so accumulating a total of $n$ tallies, we have $$\pi\approx 4\frac{\mbox{in}}{\mbox{in}+\mbox{out}}.$$

The idea here is that $x$ and $y$ determine a random point in the square $[0,1]^2$ that is uniformly distributed. The area of the quarter-circular region $x^2+y^2<1$ is $\frac\pi4$, and so a uniformly selected point in the square will lie in that region with probability $\frac\pi4$.
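The whole procedure can be sketched in a few lines. This is my illustration, not code from the answer: each simulated die roll appends one base-6 digit to $x$ and $y$, pinning each number to an interval of shrinking width, and the comparison with $1$ is decided as soon as the interval bounds on $x^2+y^2$ allow it.

```python
import random

def estimate_pi(n_pairs, rng=random.Random(0)):
    """Estimate pi by generating pairs (x, y) digit by digit from simulated
    die rolls and tallying whether x^2 + y^2 < 1."""
    inside = 0
    for _ in range(n_pairs):
        xl = yl = 0.0      # lower bounds on x and y
        width = 1.0        # width of the interval each number lies in
        while True:
            # Append one base-6 digit to each number (a roll of 6 is digit 0).
            xl += (rng.randint(1, 6) % 6) * width / 6
            yl += (rng.randint(1, 6) % 6) * width / 6
            width /= 6
            # Now x is in [xl, xl + width) and y in [yl, yl + width).
            lo = xl**2 + yl**2
            hi = (xl + width)**2 + (yl + width)**2
            if hi < 1:          # certainly inside the quarter circle
                inside += 1
                break
            if lo > 1:          # certainly outside
                break
            # Otherwise undecided: roll more digits.
    return 4 * inside / n_pairs

print(estimate_pi(20_000))
```

In practice only a handful of digits per pair are needed; the comparison is undecided only when the point lies very close to the circle's boundary.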

Solution 2:

Let $N$ be the number of darts thrown uniformly at random over a square of side $R$. The largest inscribed circle has area $\pi R^2/4$. If the darts land at unit density, so that $N=R^2$, then the expected number of darts that fall inside the circle is approximately the area of the circle, $N_{in}\approx\pi R^2/4$, i.e., $$\pi\approx\dfrac{4N_{in}}{N}.$$

Edit: What I did seems almost parallel to http://www.cs.cornell.edu/courses/cs100j/2004sp/Notes/h0506.pdf, and hence I cite it here as a reference.
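A minimal sketch of this dart-throwing estimate (my code, assuming a unit square with the inscribed circle of radius $1/2$ centered at $(1/2, 1/2)$):

```python
import random

def pi_from_darts(n, rng=random.Random(1)):
    """Throw n darts uniformly on the unit square and count those landing
    inside the inscribed circle of radius 1/2; the hit fraction estimates pi/4."""
    n_in = sum(
        1 for _ in range(n)
        if (rng.random() - 0.5)**2 + (rng.random() - 0.5)**2 < 0.25
    )
    return 4 * n_in / n

print(pi_from_darts(100_000))
```

Note that unlike Solution 1, this sketch uses a continuous uniform source rather than die rolls; the geometry (hit probability $\pi/4$) is the same in both solutions.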