Look ahead to grid computing and open source or you'll be left to deal with a morass of legacy systems, Lou Bertin says.
My friend Jack has a headache. He sat slumped with chin on chest during a recent InformationWeek roundtable and, far from being the lively, insightful presence I've come to know and admire, he looked every bit a man in considerable pain.
Jack's the IT director at a company that sits in the lower middle of a global value chain that, at the top, has companies that manufacture the conveyances that transport people and things via land and sea and air. His company is in the middle of things but is hardly the top dog in terms of dictating standards to partners.
So it was that when I interrupted his self-contained lamentations during a conversation about gathering and sifting and delivering management information from multiple individual sources, he lifted his head and half-croaked, half-growled the following: "Right now, I have to support 12 different platforms. Have to. The companies we do business with aren't going to change their platforms to suit us. And my management isn't going to spend a penny on integration as long as things are working.
"I can make a case for efficiency, but all they see is the spend, not the benefit. Right now, they're not exactly interested in spending."
His statement alone bears commenting upon (and we'll get to that later), but the rejoinder that followed was the crystallizing moment of that particular conversation. The speaker was a former IBMer who confessed: "I worked at IBM for 30 years, and we had a lot to do then with creating the problems you're facing today." The unvarnished truth, spoken simply and neatly, summed up the causes of many (if not most) of the technology migraines afflicting so many companies and institutions.
To be sure, the platform propagation that took place in the pre-Windows world was an unavoidable consequence of the technologies and technology realities of the day. The notion of desktop business computing even into the mid-'80s was beyond the grasp of most. The model was simple and linear and was built first, last, and always around customized code written for a single, proprietary platform.
Do those archaic technology investments continue to deliver value? Apparently so, based on the observations from the majority of you I speak with regularly. Legacy systems, viewed intramurally by their enterprise proprietors, are sources of immense value. However, legacy systems, especially when shoehorned into extramural value chains that were inconceivable when those systems were installed, can represent elements at the fringes of one of the seven circles of hell.
Solutions abound, but as our pal Jack can attest, it's a long way from theory to a reality that can be brought about only with the investment of actual legal tender. Surely, the best and the brightest, and not only the biggest, of the enterprise users have found their way through the pain and awkwardness of their technology adolescence. The folks there, whether their lineage is (to apply the archaic tribal definitions) on the business side of the house or the technology side, have seen the ultimate benefit that comes from fully exploiting technologies and the unrelentingly evolving techniques for deploying them. Justification for investment is slowly replacing return on investment as the ultimate measure of where, how, and how many dollars will be deployed. So it ought to be, and so it is becoming, albeit at a glacial pace, based on what I hear at our reader gatherings. Patience and perseverance remain required in large doses.