Estimation of $\pi$ using dice

Each die roll generates $\log_2 6 \approx 2.58$ bits of entropy, and a sequence of die rolls can generate a uniformly distributed random real number in $[0,1]$ to any desired degree of precision. (For example, treat the die rolls as a sequence of base-6 digits, where a roll of 6 represents a 0 digit.)

Generate two such random numbers, $x$ and $y$, appending digits to each until there are enough digits in both numbers to establish with certainty whether $x^2+y^2 < 1$ or $x^2+y^2>1$. (Equality occurs with probability $0$ and can be disregarded.) If $x^2+y^2<1$, add a tally to the "in" column; otherwise add a tally to the "out" column.

After generating $n = \mbox{in}+\mbox{out}$ such pairs, and so accumulating $n$ tallies in total, we have $$\pi\approx 4\frac{\mbox{in}}{\mbox{in}+\mbox{out}}.$$

The idea here is that $x$ and $y$ determine a random point in the square $[0,1]^2$ that is uniformly distributed. The area of the quarter-circular region $x^2+y^2<1$ is $\frac\pi4$, and so a uniformly selected point in the square will lie in that region with probability $\frac\pi4$.
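A minimal Python sketch of this procedure (the die is simulated with `random.randint` as a stand-in for physical rolls, and the function names are mine). After $k$ rolls, $x$ is pinned to an interval $[X/6^k,\,(X+1)/6^k)$ for some integer $X$, so the comparison is certain once $(X+1)^2+(Y+1)^2\le 6^{2k}$ (inside) or $X^2+Y^2\ge 6^{2k}$ (outside); Python's arbitrary-precision integers keep this test exact at every step.

```python
import random

def roll():
    """One die roll read as a base-6 digit (a rolled 6 counts as 0)."""
    r = random.randint(1, 6)
    return 0 if r == 6 else r

def classify_pair():
    """Append base-6 digits to x and y until x^2 + y^2 < 1 or > 1 is certain.
    Returns True for "in", False for "out"."""
    X = Y = 0      # after k digits, x lies in [X/6^k, (X+1)/6^k), likewise y
    scale = 1      # 6^k
    while True:
        X = 6 * X + roll()
        Y = 6 * Y + roll()
        scale *= 6
        if (X + 1) ** 2 + (Y + 1) ** 2 <= scale ** 2:
            return True                # certainly inside the quarter circle
        if X ** 2 + Y ** 2 >= scale ** 2:
            return False               # certainly outside

def estimate_pi(n):
    """pi is approximately 4 * in / (in + out) over n pairs."""
    inside = sum(classify_pair() for _ in range(n))
    return 4 * inside / n

print(estimate_pi(100_000))  # typically prints something near 3.14
```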


Let $N$ be the number of dice thrown randomly over a square of side $R$. The largest inscribed circle has area $\pi R^2/4$, while the square has area $R^2$, so the fraction of dice landing inside the circle approximates the ratio of the areas: $N_{in}/N \approx \pi/4$. (Taking $N = R\times R$ makes the expected count inside numerically equal to the circle's area, $N_{in}\approx\pi R^2/4$.) Either way, $$\pi\approx\dfrac{4N_{in}}{N}$$ Edit: What I did seems almost parallel to http://www.cs.cornell.edu/courses/cs100j/2004sp/Notes/h0506.pdf, and hence I cite it here as a reference.
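A quick float-based sketch of this estimate (the function name and the use of `random.uniform` are mine; the circle of radius $R/2$ sits inscribed in the $R\times R$ square):

```python
import random

def estimate_pi_darts(N, R=1.0):
    """Throw N points uniformly over an R-by-R square and count
    those landing inside the inscribed circle of radius R/2."""
    inside = 0
    for _ in range(N):
        x = random.uniform(0, R)
        y = random.uniform(0, R)
        if (x - R / 2) ** 2 + (y - R / 2) ** 2 < (R / 2) ** 2:
            inside += 1
    return 4 * inside / N

print(estimate_pi_darts(1_000_000))  # typically near 3.14
```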


Suppose the die is a 1-inch cube. Draw a bunch of parallel lines spaced 1 inch apart. When you roll the die onto this field of lines, it will always cross at least one line (landing exactly between two lines isn't impossible, but there's a $0$ chance that it will). Sometimes, the die will cross two lines.

You can compute the theoretical likelihood that it will cross two lines, and it involves $\pi$. Then you can roll the die and get an empirical probability. Set the two things equal, solve for $\pi$, and you have an approximation.

This is a lot like Buffon's famous needle problem, where you view a face-diagonal of the die as the needle.

To calculate the theoretical probability, we only care about where the center of mass of the die lands along an axis orthogonal to the lines we sketched, and the angle that the face-diagonal makes with the drawn lines. In the expression below, the "$2$" and "$8$" exploit symmetry to make the integral easier to write. The probability is $$\frac{2\cdot\int_{y={1-\sqrt{2}/2}}^{y=1/2}8\cdot\int_{\theta=\arcsin\left(\frac{1-y}{\sqrt{2}/2}\right)}^{\theta=\pi/2}\,d\theta\,dy}{\int_{y=0}^{y=1}\int_{\theta=0}^{\theta=2\pi}\,d\theta\,dy}$$ which simplifies to

$$\frac{4}{\pi}-1$$
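For completeness, here is a sketch of that simplification (the substitution $t=\sqrt2\,(1-y)$ does the work). The denominator is $2\pi$, and the inner integral of the numerator is $\frac\pi2-\arcsin\bigl(\sqrt2\,(1-y)\bigr)=\arccos\bigl(\sqrt2\,(1-y)\bigr)$, so the numerator equals
$$16\int_{1-\sqrt2/2}^{1/2}\arccos\bigl(\sqrt2\,(1-y)\bigr)\,dy=\frac{16}{\sqrt2}\int_{\sqrt2/2}^{1}\arccos t\,dt=\frac{16}{\sqrt2}\Bigl[t\arccos t-\sqrt{1-t^2}\,\Bigr]_{\sqrt2/2}^{1}=8-2\pi,$$
and $\frac{8-2\pi}{2\pi}=\frac{4}{\pi}-1$.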

So eventually, after enough die tosses, a fraction $0.2732\ldots$ of them will have crossed two lines. Adding $1$ to this, dividing by $4$, and taking the reciprocal will give an estimate of $\pi$.
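Here is a hedged simulation sketch of the experiment (my modeling assumptions: the toss is reduced to the die's square footprint at a uniform random angle $\theta$, the lines sit at integer positions, and the die crosses two lines exactly when the footprint's projection onto the axis orthogonal to the lines, an interval of width $|\cos\theta|+|\sin\theta|$, contains two integers):

```python
import math
import random

def crosses_two_lines():
    """One toss: the unit-square footprint at angle theta spans two lines
    iff its projected interval covers two integer positions."""
    theta = random.uniform(0, 2 * math.pi)
    w = abs(math.cos(theta)) + abs(math.sin(theta))  # projected width, in [1, sqrt(2)]
    c = random.random()                              # center position, mod line spacing
    lo, hi = c - w / 2, c + w / 2
    return math.floor(hi) - math.ceil(lo) + 1 >= 2   # count of integers in [lo, hi]

def estimate_pi_lines(n):
    p = sum(crosses_two_lines() for _ in range(n)) / n  # empirical 4/pi - 1
    return 4 / (p + 1)                                  # invert: pi = 4 / (p + 1)

print(estimate_pi_lines(1_000_000))  # typically near 3.14
```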

If you want to know how many tosses it takes to be reasonably sure of $\pi$ to three digits, that's roughly the same as getting the empirical probability to within an error of $0.001$. Pretending we don't already know a decimal value for $\frac{4}{\pi}-1$, we assume the worst case for the standard error, $p=0.5$. To have an error under $0.001$ with $95\%$ certainty, we would solve $$0.001=1.96\sqrt{\frac{0.5\cdot0.5}{n}},$$ which gives $n=960{,}400$, so you'd need over $960{,}000$ tosses. I'd be surprised if any of the dice-based methods for approximating $\pi$ were markedly faster than this.
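A quick check of that arithmetic (variable names are mine):

```python
from math import sqrt

z, err, p = 1.96, 0.001, 0.5   # 95% z-score, target error, worst-case proportion
n = (z * sqrt(p * (1 - p)) / err) ** 2
print(round(n))  # 960400
```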
