Formulated in the earliest days of electronic computing, in 1952-53, by Dr. Herbert R. J. Grosch, a scientist at IBM's Watson Scientific Computing Laboratory at Columbia University, the law postulated that the costs of computer systems increase at a rate equivalent to the square root of their power.
Grosch's Law became a standard for measuring data processing power as computers began to be mass produced. Now 87, in good health and teaching at the University of Toronto, Grosch said in an interview Tuesday that his famous law could still have relevance if applied rigorously.
"Grosch's Law was originally intended as a means for pricing computing services," said Grosch. "Tom Watson Jr. ordered me to start a service bureau in Washington. The first question was 'how much do I charge.' So I developed what became known as Grosch's Law."
Grosch noted that before the law, people who wanted computers had to build them from scratch themselves. The popularity of Grosch's Law grew when IBM announced its first mainframe in the 1950s.
The law has been the subject of several doctoral theses, a few of them seeking to disprove it, Grosch said. Grosch's Law is still discussed and debated in the data processing community. A typical discussion on the Internet finds an Australian computer scientist asserting that Grosch's Law prevailed until about 1980; until then, computing power had been growing exponentially along with price.
In those days, Grosch partially explained his law by saying: "There is a fundamental rule, which I modestly call Grosch's Law, giving added economy only as the square root of the increase in speed; that is, to do a calculation 10 times as cheaply you must do it 100 times as fast." One wag said the law could be translated to mean that no matter how much performance the hardware engineers provide, the software people will manage to use it up.
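The arithmetic behind that quote can be spelled out in a few lines. This is a minimal sketch, not anything Grosch published; the function name and the reference values are hypothetical, chosen only to illustrate the cost-proportional-to-square-root-of-power relationship:

```python
import math

def grosch_cost(power, base_cost=1.0, base_power=1.0):
    """Hypothetical illustration of Grosch's Law: system cost scales
    as the square root of computing power, relative to a reference
    machine with base_power power and base_cost cost."""
    return base_cost * math.sqrt(power / base_power)

# A machine 100 times as fast costs only 10 times as much...
cost_100x = grosch_cost(100)          # 10.0
# ...so each unit of computation costs one-tenth as much.
cost_per_unit = cost_100x / 100       # 0.1
```

This matches the quote's example: a 100-fold speedup raises total cost only 10-fold, making each calculation 10 times cheaper.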
Grosch observed that the law was more useful when the computer user community was strong and better organized in the 1960s and 1970s. The law doesn't see much use today, and Grosch said IT managers don't need it any longer, primarily because they have their own ways of carrying out metrics on their existing computer installations to figure costs for their next computing purchases.
Grosch has had a long and varied career. After he received his PhD in astronomy at the University of Michigan in 1942, he put in two stints at IBM; he boasts that he was fired each time, although he is proud that Tom Watson later gave him a wooden duck to signify his status as a creative IBM "wild duck," who did imaginative work in the buttoned-down company.
Grosch has been in and out of academia, teaching at Columbia and Boston University before finally landing at the University of Toronto, where he teaches a course on early computing. "I cover everything from the marks on cave walls, up to 1953," he said. "I may mention Grosch's Law, but it's not too relevant today."
In related news, Intel is looking for a copy of the 1965 magazine where co-founder Gordon Moore first laid out his famous "Moore's Law."