# Pi to 10 Trillion Digits

After more than a year of computation, Alexander Yee and Shigeru Kondo have reached 10 trillion digits of π. This was a follow-up to last year's computation of π to 5 trillion digits.

The hardware used was a desktop with a 3.33 GHz processor and 96 GB of DDR3 RAM. About 44 TB of hard disk space was used for the computation, and another 7.6 TB was used to store the value of π. The program used was called *y-cruncher* (see details).
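This article doesn't describe y-cruncher's internals, but record π computations of this era are reported to rely on the rapidly converging Chudnovsky series, which delivers roughly 14 new decimal digits per term. Below is a minimal sketch of that series using Python's standard-library `Decimal`; the function name and guard-digit choices are illustrative, not y-cruncher's actual code, which uses far more sophisticated arbitrary-precision arithmetic.

```python
from decimal import Decimal, getcontext

def chudnovsky_pi(digits):
    """Approximate pi to about `digits` decimal places via the Chudnovsky series."""
    getcontext().prec = digits + 10          # extra guard digits for rounding
    C = 426880 * Decimal(10005).sqrt()
    K, M, L, X = 6, 1, 13591409, 1
    S = Decimal(L)                           # running sum, starting at the k=0 term
    for i in range(1, digits // 14 + 2):     # ~14 digits gained per term
        M = M * (K**3 - 16 * K) // i**3      # exact integer update of the factorial ratio
        L += 545140134
        X *= -262537412640768000             # (-640320)^3 per step; signs alternate
        S += Decimal(M * L) / X
        K += 12
    return C / S

print(str(chudnovsky_pi(50))[:52])
```

Each loop iteration updates the huge factorial quotient incrementally with exact integer arithmetic, so only one high-precision division per term is needed; the real record-setting programs go further and use binary splitting to batch these operations.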

π is the ratio of the circumference of a circle to its diameter. That means that if we take any circle, regardless of its size, and divide its circumference by its diameter, the quotient will always be π, or approximately 3.14159.

π is an irrational number: it cannot be written as a ratio of two integers. Its decimal digits neither terminate nor repeat, and no computer, however powerful, can compute its exact value. No matter how many digits we compute, the result is always an approximation.

The computation of π began in antiquity. The first approximations date to as early as 2500 BC (see Chronology of Computation of Pi). Archimedes (about 200 BC), one of the most famous mathematicians of his time, inscribed a regular polygon in a circle of radius 1 (and also circumscribed the circle with a regular polygon) to approximate π. The area of a polygon inscribed in (or circumscribed about) a circle of radius 1 approximates π, and the greater the number of sides, the more accurate the value (see approximating pi).
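Archimedes' side-doubling idea can be sketched in a few lines: start from an inscribed regular hexagon (side length 1 in a unit circle) and repeatedly double the number of sides using only square roots; half the polygon's perimeter then approximates π. The function and its parameter below are illustrative, not a historical reconstruction.

```python
import math

def archimedes_pi(doublings=15):
    """Approximate pi by repeatedly doubling the sides of an inscribed polygon."""
    n, s = 6, 1.0                    # regular hexagon in a unit circle: 6 sides of length 1
    for _ in range(doublings):
        # New side length after doubling; algebraically sqrt(2 - sqrt(4 - s^2)),
        # rearranged here to avoid floating-point cancellation for small s.
        s = s / math.sqrt(2 + math.sqrt(4 - s * s))
        n *= 2
    return n * s / 2                 # half the perimeter; circumference of unit circle is 2*pi

print(archimedes_pi())
```

Archimedes himself stopped after four doublings, at 96 sides, which was already enough to pin π between 3 10/71 and 3 1/7 (about 3.1408 and 3.1429).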

In 1873, William Shanks published his computation of π to 707 decimal places. The calculation took him 15 years, but 72 years later it was discovered that only the first 527 places were correct.

The invention of desk calculators and computers revolutionized the computation of π. In 1949, D. F. Ferguson and John Wrench used a desk calculator to compute π to 1,120 decimal places. The same year, ENIAC, the first electronic computer, was used to compute π to 2,037 decimal digits; the computation took 70 hours.

Since then, more and more mathematicians and computer scientists have tried to surpass the accuracy of previous computations. In 1973, Jean Guilloud and Martin Bouyer computed π to more than a million digits, and in 1989 Yasumasa Kanada and Yoshiaki Tamura computed π to more than a billion digits.

As of this writing, the latest computation of π is the one discussed above: 10 trillion digits. The computation started on October 10, 2010 and ended on October 16, 2011.