"Essentially, all models are wrong, but some are useful." --George E. P. Box
One of my most embarrassing memories from college:
My lab partner and I went to a nuclear research reactor to expose samples to thermal neutrons and measure the decay spectrum. The reactor, one of only two such facilities in the country, had to be started up just for us. Technicians in white lab coats busily turned knobs, and a large vacuum tube display showed the rising power output. Around the reactor core, the bluish glow of Cherenkov radiation outlined the structure deep underwater. The machine was amazing.
Then the chief scientist looked at our lab results from the previous week's experiment and noticed that we skipped the error calculations. He promptly gave us Fs and sent us home. The reactor had to be powered down. We wasted the time of many people.
How many times do we prioritize our IT projects and services based on "objective" scores while all the numbers are well within one standard deviation of one another? How many IT decisions should get an F because, behind the numbers, there's an unknown degree of uncertainty?
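To make the point concrete, here is a minimal sketch (with made-up vendor names and illustrative scores, not real data) showing how easily a "winning" score can sit well inside the noise of the panel's own ratings:

```python
# Hypothetical example: three vendor options scored by a review panel.
# The names and numbers are invented purely for illustration.
from statistics import mean, stdev

scores = {
    "Vendor A": [7.2, 6.8, 7.9, 6.5, 7.1],
    "Vendor B": [6.9, 7.4, 6.6, 7.8, 6.7],
    "Vendor C": [7.0, 7.5, 6.4, 7.3, 6.9],
}

# Mean and sample standard deviation per option.
summary = {name: (mean(s), stdev(s)) for name, s in scores.items()}
for name, (m, sd) in summary.items():
    print(f"{name}: {m:.2f} +/- {sd:.2f}")

# If the gap between the best and worst mean is smaller than one
# standard deviation, the ranking is indistinguishable from noise.
means = [m for m, _ in summary.values()]
sds = [sd for _, sd in summary.values()]
spread = max(means) - min(means)
print("Difference meaningful?", spread > max(sds))  # prints: Difference meaningful? False
```

Here the means differ by less than a tenth of a point while each panelist's scores scatter by half a point, so declaring Vendor A "the objective winner" would deserve exactly the F my lab report got.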
All IT decisions should be based on forecasts. We try to predict the immediate and future impact of our decisions, such as picking a product, choosing a vendor, funding a project or launching a new IT service. Forecasting is nothing but the collection of information and the improvement of the signal/noise ratio until we find the option with the best probable outcome.
Here is where it gets complicated. We have to evaluate tangible and intangible criteria such as risk, capex, opex, TCO, ROI, strategic value, business outcome, competitive advantage, fit in the current environment, institutional memory, customer perception, future IT trends and sustainability.
Since testing the different options -- implementing them and waiting a few years to see the results -- isn't feasible, we use models instead. These models can get rather sophisticated. We can bring in the vendors for short intro projects or install proof-of-concept pilot systems.
We usually make decisions within a group of business and IT experts who come from different areas and share little common ground. And when it comes to new information technologies, often the only deep expert in the room is a biased vendor.
So how can we improve the IT decision-making process? Here's a small collection of ideas:
-- Express technology in terms of business outcome and value.
-- Assemble a well-balanced team of business and IT experts.
-- Deploy devil's advocates to shake the group out of groupthink.
-- Don't give short shrift to culture and institutional memory.
-- Be aware of biases such as anchoring, availability heuristics and loss aversion.
Other IT worst practices and "core incompetencies" are discussed on the AntipatternZOO.