We now say, "Software is eating the world." Back in 1984, it was already clear that software was going to change it.

Charles Babcock, Editor at Large, Cloud

July 31, 2017


I sat down to breakfast once a long time ago with John Cullinane, founder and CEO of Cullinet Software, and a small group of his executives. I was the brand new, inexperienced software editor for the publication, Computerworld, and Cullinet was one of its major advertisers.

When breakfast was over, Cullinane took me aside to explain that there'd be no need to waste a lot of time covering a new type of software, the relational database. Mainframe users, the biggest software buyers, didn't need one. They had his product, IDMS, and the whole hoo-ha over the emergence of something called relational database was just smoke and mirrors.

It was that 1986 conversation more than any other that convinced me relational database was here to stay. The establishment was trying hard to disregard its existence -- but was clearly worried.

Another sign came later that year. Sitting down with Peter Tierney, a senior VP at Relational Technology Inc., at the McCormick Center in Chicago, I listened as he tried to explain why a brash young executive at Relational Software was saying such terrible things about his product. The database market was going to be worth "millions," and Larry Ellison with his Oracle system wanted to own it.

"Welcome to the database wars," he said with a hint of world weariness. Little did I know at that time, the battle was just getting underway.


It's hard to remember how much the world has changed from the days when even IBM salesmen weren't quite sure how to present the relational system DB2 and its "experimental" capabilities. It was nice that a database system could reach deep into a data vault and pluck out the one thing that you wanted, via a query. But what was the business use for that?

Two years later, DB2 was firmly established and I was sitting talking to the head of Software AG USA in Reston, Va. His name was Stuart J. Miller, president and CEO, and with a lot of pain, he said: "There was no need for IBM to introduce DB2; we had that segment of the industry fully covered." He said it almost with anguish, since he was terribly worried about the future of Software AG's Adabas and Natural. I hadn't realized up until then that a highly paid executive could sit around wringing his hands.

Software was a dynamic field, and the leading companies had gained their positions by riding the changes underway in the industry. To keep those positions, they needed to keep riding. As a breed, I came to respect the execs of major software companies as keen observers of change and of how to position their firms in the midst of upheaval. One should listen closely to what they had to say.

Sitting in an office in the Buckhead section of Atlanta, John Imlay, CEO of Management Science America, was among the most entertaining, well-informed and likeable of all the people I've talked to. He understood that IBM's substantial research effort was going to produce products, and third parties such as MSA that wanted to compete were going to have their hands full. Instead of dwelling on that, he'd launch into one story after another, until it was time to tell his favorite joke about a man caught roasting an eagle in the wilderness. When the judge asked him why, he explained he had been lost and starving. The judge dismissed the charges in sympathy and asked, "So out of curiosity, how did it taste?"

"Not as gamey as the Whooping Crane but sweeter than the Spotted Owl," he responded, and Imlay would roar at his own joke.

I remember talking Supra programming issues with the founder of Cincom Systems -- was it Tom Nies? -- while riding back to the hotel on a paddle wheeler on the Ohio River. And I recall getting a rare interview with Charlie Wang at Computer Associates International on Long Island, when his press-averse, quick-to-anger brother, Anthony, burst into the room to accuse me of putting words into Charlie's mouth.

When it came to the leading software brass, I learned early on that I liked nearly all of them -- but shouldn't necessarily believe any of them.

The software business is hazardous duty. It's on the frontlines, lines that are constantly shifting. For those who achieve a certain amount of success, it often leads to a digging in, a narrowed field of vision, a disbelief that anything better could ever come along that might dislodge you. But in the software industry, something better always comes along, sometimes sooner rather than later.

In 1992, I paused while at Computerworld to take a look backward. "I carry with me a copy of the June 19, 1968, issue of Computerworld, all 16 pages of it, as a reminder of how changes sweep through the computer industry," I wrote at the time. On that 1968 issue was the headline: First Patent Issued for Software; Full Implications Not Yet Known.

The third-party software industry was legitimized with that patent. It was awarded to Marty Goetz, CEO of Applied Data Research, supplier of Datacom/DB and Ideal. At the time, the independent software industry was already in a youthful stage; the precedent the patent set ratified it as legal. Software writers were free to create programs for hardware platforms and sell them without the permission of the hardware supplier. Up until then, IBM and other hardware makers had insisted only they could do so. And with the proliferation of software writing, the hardware suppliers slowly realized their platforms took on more value. The concept of a software ecosystem had been born.

Today we echo Marc Andreessen in saying, "Software is eating the world." In 1984, when I first entered this business, the IBM PC was all the rage and everyone wanted to cover the hardware. But the software was the flexible, frequently adaptable part of computing, I concluded, and I would write about that. I've been toiling in this vineyard ever since, and it's one decision I've never regretted.

[Editor's note: After more than 30 years of exceptional work in covering the world of software, Charles Babcock is heading into retirement after this week. While we wish him well we know that the software industry and the IT community will miss him.]

About the Author

Charles Babcock

Editor at Large, Cloud

Charles Babcock is an editor-at-large for InformationWeek and author of Management Strategies for the Cloud Revolution, a McGraw-Hill book. He is the former editor-in-chief of Digital News, former software editor of Computerworld and former technology editor of Interactive Week. He is a graduate of Syracuse University where he obtained a bachelor's degree in journalism. He joined the publication in 2003.
