"Essentially, all models are wrong, but some are useful." --George E. P. Box
One of my most embarrassing memories from college:
My lab partner and I went to a nuclear research reactor to expose samples to thermal neutrons and measure the decay spectrum. The reactor, one of only two such facilities in the country, had to be started up for us. Technicians in white lab coats were busy turning knobs, and a large vacuum tube display showed the rising power output. Around the reactor core, the bluish glow of Cherenkov radiation outlined the structure deep underwater. The machine was amazing.
Then the chief scientist looked at our lab results from the previous week's experiment and noticed that we skipped the error calculations. He promptly gave us Fs and sent us home. The reactor had to be powered down. We wasted the time of many people.
How many times do we prioritize our IT projects and services based on "objective" scores while all the numbers are well within one standard deviation of one another? How many IT decisions should get an F because, behind the numbers, there's an unknown degree of uncertainty?
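The standard-deviation trap can be sketched in a few lines of Python. The scores, rater counts, and noise levels below are invented for illustration; the point is only that when rater noise exceeds the true gap between options, the "winner" of a scoring exercise is indistinguishable from chance:

```python
import random
import statistics

random.seed(7)  # fixed seed so the illustration is reproducible

def score_option(true_value, noise_sd, n_raters=8):
    """Simulate n_raters independent evaluations of an option:
    the true value plus Gaussian rater noise."""
    return [random.gauss(true_value, noise_sd) for _ in range(n_raters)]

# Hypothetical vendor evaluation: Option A is slightly "better" on paper,
# but the raters' noise (1.0) dwarfs the true gap (0.2).
scores_a = score_option(true_value=7.2, noise_sd=1.0)
scores_b = score_option(true_value=7.0, noise_sd=1.0)

mean_a, sd_a = statistics.mean(scores_a), statistics.stdev(scores_a)
mean_b, sd_b = statistics.mean(scores_b), statistics.stdev(scores_b)

print(f"Option A: {mean_a:.2f} +/- {sd_a:.2f}")
print(f"Option B: {mean_b:.2f} +/- {sd_b:.2f}")
print(f"Gap between means: {abs(mean_a - mean_b):.2f}")
# If the gap is well within one standard deviation, report the
# uncertainty alongside the scores instead of declaring a winner.
```

Publishing the spread next to the mean, rather than a single "objective" number, is the error calculation our IT scorecards usually skip.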
All IT decisions should be based on forecasts. We try to predict the immediate and future impact of our decisions, such as picking a product, choosing a vendor, funding a project or launching a new IT service. Forecasting is nothing but collecting information and improving the signal-to-noise ratio until we find the option with the best probable outcome.
Here is where it gets complicated. We have to evaluate tangible and intangible criteria such as risk, capex, opex, TCO, ROI, strategic value, business outcome, competitive advantage, fit in the current environment, institutional memory, customer perception, future IT trends and sustainability.
Since testing the different options -- implementing them and waiting a few years to see the results -- isn't feasible, we use models instead. These models can get rather sophisticated. We can bring in the vendors for short intro projects or install proof-of-concept pilot systems.
We usually make decisions within a group of business and IT experts who come from various areas and share little common ground or knowledge. And when it comes to new information technologies, often the only deep expert in the room is a biased vendor.
So how can we improve the IT decision-making process? Here's a small collection of ideas:
-- Express technology in terms of business outcome and value.
-- Assemble a well-balanced team of business and IT experts.
-- Deploy devil's advocates to shake the group out of groupthink.
-- Don't give short shrift to culture and institutional memory.
-- Be aware of biases such as anchoring, availability heuristics and loss aversion.
Other IT worst practices and "core incompetencies" are discussed on the AntipatternZOO.