Pi Day: Celebrating The Math Gods Who Made IT Possible
A celebration of math and IT in honor of Pi Day of the Century.
We invite you to join us in celebrating Pi Day of the Century. In case you don't know what this is: today at 9:26:53, the date and time (3/14/15 9:26:53) will match Pi to 9 digits past the decimal point, 3.141592653. This is the only chance you will have to celebrate it until March 14, 2115.
Not only is Pi Day of the Century just cool, but Pi, mathematics, and information technology have a long and illustrious history together. Pi is the ratio of a circle's circumference to its diameter. The interesting thing about Pi is that its decimal representation never ends and never settles into a repeating pattern (unlike, for example, 1 divided by 3, which is .3333 with the 3s repeating forever in an obvious pattern). You can keep computing more digits of Pi for ever greater accuracy. The current record for calculating Pi is 13.3 trillion decimal places.
Obviously, we don't need Pi to 13.3 trillion decimal places for any practical purpose. Even astrophysicists use a far more manageable number of digits. But calculating Pi is a great way to test a supercomputer's speed and accuracy, and it serves as a good benchmark for the state of computing power in general. Recently, the record for calculating Pi has grown by about a trillion places per year. The current record took 208 days to compute.
The relationship between computing and Pi, and math in general, deserves recognition. So in honor of Pi Day of the Century, we're celebrating the mathematicians who got us closer to understanding Pi, and those who laid the foundations for the modern computer. As you'll soon see, many did both.
Have a slice of pie, flip through the following pages, watch the clock at 9:26:53, and celebrate math and IT with us by remembering the math gods who made it all possible.
Archimedes of ancient Greece is widely credited as the first person to rigorously estimate Pi. By comparing the perimeters of regular polygons inscribed in and circumscribed around a circle, he squeezed Pi between two bounds and gave the famous approximation 22/7, which is accurate to about two decimal places. Archimedes is credited with other inventions and mathematical discoveries that are the basis for modern physics. He is often described as the "father of calculus," or the "father of mathematical physics." He is also known for burning enemy ships with mirrors (though that probably didn't happen), which beats the heck out of frying ants with a magnifying glass.
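As a quick illustration (not Archimedes' actual arithmetic, which used fractions rather than floating point, and a function name of my own invention), here is a sketch in Python of his polygon-doubling idea:

```python
import math

def archimedes_bounds(doublings=4):
    # Start with a regular hexagon inscribed in a unit circle:
    # side length 2*sin(pi/6) = 1, so the lower bound for Pi is 6*1/2 = 3.
    n, s = 6, 1.0
    for _ in range(doublings):
        # Side length of the inscribed polygon with twice as many sides.
        s = math.sqrt(2 - math.sqrt(4 - s * s))
        n *= 2
    lower = n * s / 2                                   # inscribed half-perimeter
    upper = n * s / (2 * math.sqrt(1 - (s / 2) ** 2))   # circumscribed half-perimeter
    return lower, upper

lo, hi = archimedes_bounds(4)   # 96-sided polygons, as Archimedes used
print(f"{lo:.4f} < pi < {hi:.4f}")  # prints 3.1410 < pi < 3.1427
```

Rounding that upper bound up gives 22/7 ≈ 3.1429, Archimedes' famous fraction.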
Ramon Llull, a Catalan philosopher of the 13th and 14th centuries, is often called the father of information science. He created a machine out of paper discs that was designed to convert Muslims to Christianity through logic. The wheels turned and were associated with certain logic concepts, and the machine combined different thoughts represented by letters and symbols. Llull believed that with this machine (or something like it) he could construct all thoughts the human mind could produce. To be honest, it doesn't look all that different from a toy to me, but it inspired many great thinkers and mathematicians of the future, especially Leibniz, who is also on this list.
An Indian mathematician of the 14th and 15th centuries, Madhava was the first to accurately calculate Pi to roughly as many digits as Pi Day of the Century celebrates. Without him, we wouldn't even know it was Pi Day of the Century. He is credited with bringing the idea of an infinite series into mathematics, which is fitting, since Pi's decimal expansion is infinite. He is also responsible for torturing you in school with sines and cosines. He founded the Kerala School of astronomy and mathematics, which may have helped transfer many early concepts of calculus to Europe through trade.
Leibniz invented calculus at roughly the same time as Newton, though the two almost certainly did so independently. He also built mechanical calculators, and he is one of the great minds of the 17th and early 18th centuries. Leibniz worked out binary arithmetic, the number system at the heart of every modern computer, and even anticipated the concept of punch cards. Funny enough, the biggest weakness of his calculating machine was that it couldn't carry digits during calculations without physical intervention. Leibniz sits prominently as a bridge between the early fathers of logic and math and Babbage, more than a century later. In many ways, he's the intellectual father of 19th-century computing. And that's some great hair.
Charles Babbage, in the minds of many, invented the world's first programmable computers in the 1820s and '30s. They were essentially giant mechanical calculators that worked in strikingly similar ways to today's computers. Old-time gamers like me might remember there used to be a computer game store called Babbage's; after several iterations, it is now called GameStop. Babbage was also known for counting broken windows, and other things he didn't like, and reporting them to the authorities as a nuisance. So he was not only a genius but a busybody. Sadly, because of funding issues, Babbage rarely got to finish any of his machines, so, to my knowledge, he never used them to calculate Pi to an extreme number of digits.
No discussion of Babbage would be complete without mentioning Ada Lovelace, the world's first computer programmer. She worked with Babbage, creating the first algorithm intended for Babbage's analytical engine. She also happened to be Lord Byron's daughter, and she was friends with Charles Dickens. Think about that: the world's first computer program was written at roughly the same time as A Tale of Two Cities. Lovelace married, became a baroness, and died at age 37, barely tapping into her potential. Due to many social factors, she's the only woman on this list, but she (and the long legacy of women mathematicians) should not be taken lightly. She did go out with a great nickname, though: the "Enchantress of Numbers."
If you're a programmer, or just someone who likes to know the names of things, you've probably heard of George Boole, creator of Boolean algebra, which is the basis for digital logic. Born to a shopkeeper in Victorian England, Boole didn't get much formal education at first, struggling to become a teacher. When he didn't like the math books he had to teach from, he started studying the subject on his own. Many computer languages, including Java and Pascal, pay tribute to him with Boolean data types or other references to his name. He has a lunar crater named after him, which shows you how much he matters.
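Boolean algebra is easy to demonstrate. As a small illustration, this Python snippet exhaustively verifies De Morgan's laws, two Boolean identities that digital logic circuits and compilers rely on every day:

```python
from itertools import product

# De Morgan's laws: NOT(a AND b) == (NOT a) OR (NOT b)
#                   NOT(a OR b)  == (NOT a) AND (NOT b)
# With only two truth values, we can simply check every combination.
for a, b in product([False, True], repeat=2):
    assert (not (a and b)) == ((not a) or (not b))
    assert (not (a or b)) == ((not a) and (not b))

print("De Morgan's laws hold for all inputs")
```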
It would possibly be more accurate to create a heading for the "great logicians" here, rather than singling out Bertrand Russell. It would be wrong not to mention his inspiration, Gottlob Frege (he of the Frege programming language), and G. E. Moore, among others, as founders of analytic philosophy, which inspired much of the way computer logic is written. I picked Russell to feature here not only because he is one of the great philosophers of the 20th century, but also because he is so darn interesting.
Jailed for pacifism during World War I, he also protested imperialism, nuclear armament, and the Vietnam War. He won a Nobel Prize for literature. One of the reasons he got the prize was his brilliant Principia Mathematica, which he co-wrote with Alfred Whitehead. It laid out the principles of mathematical logic that we still use in computing today. To my knowledge, Russell is the only person ever given a Nobel Prize in literature for a contribution related to computer science or math. There isn't even a Nobel Prize for math, though math and computer science are certainly represented in other Nobel Prizes. That pipe of his is bringing sexy back, too.
Alan Turing is not the only early 20th century computing pioneer. In fact, picking only one seems unfair. At the same time, if you're going to pick one, he is the one. Turing not only invented early computers and the algorithms to run them, he also devised the test, now called the Turing test, for determining whether a machine can convincingly imitate human intelligence. The race for the first "Turing complete" computer is the IT equivalent of the race to be the first to fly a plane across the Atlantic.
Of course, Turing is finally getting his due with the Oscar-nominated movie The Imitation Game. While his chemical castration for his homosexuality is well known, one of the more interesting mysteries surrounds his death. He was found dead of cyanide poisoning with a half-eaten apple at his feet. The death was originally ruled a suicide, but many have speculated that it was an accident involving a device for electroplating spoons in his house. The most interesting speculation is that he may have been recreating the scene where Snow White eats the poisoned apple, apparently one of his favorite movie scenes. Whatever happened, Turing was only 41 and, had he lived longer, would likely have made many more contributions to IT.
There are many other great mathematicians who contributed to the field of information technology. We should mention Alexander Yee, who wrote y-cruncher, the program record-setters use to find more digits of Pi. Emil Post, who invented a machine similar to Turing's, is also an important figure. Joseph Marie Jacquard built the first machine controlled by punch cards, the Jacquard loom. If we had gone later in years, Grace Hopper, one of the developers of COBOL, would be another great addition to the list of important female IT pioneers. And let's not forget that Pi Day is also Albert Einstein's birthday. He's no IT pioneer, but he's certainly worth acknowledging. The list goes on. For Pi Day, I suggest you have a slice of pie for every one of them. Sure, it is a lot of calories, but like these great pioneers, such an occasion only comes around once in a century or so.
