In this seventh installment in our series, Howard Anderson profiles the men behind the ENIAC.
John Mauchly and J.P. Eckert
Philadelphia is a lot like Merry Old England. The first digital general-purpose computer, the ENIAC, was born in Philadelphia in 1946, so you might have expected a flourishing computer industry to spring up there, just as you might have expected Great Britain to become a technology force given its industry pioneers. But Philadelphia today is known more for its cheesesteaks than its high-tech prowess. (By the way, the best cheesesteak can be found at The White House in Atlantic City, N.J. Mention my name.)
The credit for this first digital computer should go first to the U.S. government. The U.S. Census Bureau, facing the 1890 count after the 1880 census had taken eight full years to compile, needed a faster solution, so it turned to Herman Hollerith, an inventor and statistician, who built the first mechanical tabulators. He used punch cards, a technology the textile industry had used for years to program its looms. Hollerith's Tabulating Machine Company eventually merged into what became International Business Machines.
IBM, recognizing something significant, bankrolled Howard Aiken and Grace Hopper, a Navy 2nd Looie, at Harvard in 1944. Their "Harvard Mark I" was an electro-mechanical device that could handle differential equations. Hopper, troubleshooting the machine, found a dead moth trapped in a relay and removed it ... or, as she said, "debugged it." By 1952, the Mark IV was all-electronic and used magnetic drum and magnetic core memory.
"Amazing Grace" was a computer scientist of the first order. She conceptualized the idea of machine-independent programming languages, which was the forerunner of COBOL. Hopper eventually was promoted to rear admiral in the U.S. Navy.
Aiken, the senior dude on this team, gave credit to Charles Babbage, the English mathematician who in the early 1800s originated the concept of a programmable computer, though he never got one to work. The need was for a better way to produce tables of logarithms, which at the time were calculated by hand. Babbage designed a machine in which the data and program memory were separated and a separate unit handled input and output. It was the forerunner of the computer as we now know it.
The British government sponsored that program for 10 years but eventually gave up. The wimps! In 1991, the London Science Museum actually built Babbage's Difference Engine -- about 150 years too late!
Back we go to Philadelphia in the 1940s. World War II is on and the U.S. Department of War (renamed the Department of Defense in 1949) had a problem: It had to calculate firing tables for artillery. Suppose you're on a ship that's rolling in the swell, the wind is blowing at 15 miles per hour and you're moving at 15 knots. Now what? The military had hired hundreds of young women, called "computers," to work out the complex math, but even that solution was too slow.
While Aiken and Hopper were working on computers in Cambridge, Mass., John Mauchly and J.P. Eckert were pursuing parallel development in Philadelphia, sponsored by the U.S. government. They developed the ENIAC (Electronic Numerical Integrator And Computer). Containing 17,000 vacuum tubes, weighing 30 tons and consuming 160 kilowatts, it could perform 5,000 additions or 357 multiplications in one second -- 1,000 times faster than those human "computers." The ENIAC could do in 30 seconds what a person could do in half a week.
Just as no one quite knew what to do with the first personal computers (store recipes?), no one was quite sure what could be done with the first electronic computers. There was no software, no applications and no trained IT industry.
Rumor had it that the first time the ENIAC was fired up, the lights dimmed in all of West Philadelphia. This is the first known benefit of digital computers. Others suggest that there were only dim lights at Mauchly and Eckert's University of Pennsylvania, which was foolish enough to grant me a degree. (And no, they can't have it back!)
Still, the score was Penn 1, Harvard 0 in the race to build the first real computer.
Mauchly, the physicist, developed the mathematical theory for the ENIAC. Eckert, the electrical engineer, developed the hardware. Eckert had been the instructor when Mauchly took an early EE course at Penn. Later, having had enough of academia, both left to start the Eckert-Mauchly Computer Corp., which began making machines for commercial use and, after its acquisition by Remington Rand, became the Univac division.
The ENIAC wasn't finished until after World War II ended, but by late 1945 it was already grinding through millions of discrete calculations for top-secret work on the hydrogen bomb.
Twenty years later, there were really just six mainframe computer companies: IBM, Burroughs, Univac, NCR, Control Data and Honeywell. Most of them would ignore the revolutionary minicomputer to their ultimate detriment.
So within 70 years, technology leaped forward. The infrastructure was in place: energy/electricity. The early light bulbs morphed into the early vacuum tubes. U.S. government funding pushed the technology world forward, and that technology jumped the Atlantic.
The trouble with all those vacuum tubes is that they burn out, a problem tackled by three guys at AT&T Bell Labs: William Shockley, John Bardeen and Walter Brattain. AT&T needed a labor-efficient way to amplify long-distance signals, which meant finding a replacement for those unreliable vacuum tubes. So in this case it was the burgeoning communications industry that drove the demand. All three would win the Nobel Prize, and they "germinated" (inside joke) Silicon Valley. It's a subject we will cover in our eighth installment in this series.