Reflecting on many of the technology-related articles that I've read lately, I think "resistance" should be considered for Word of the Year honors. There are repeated references to it: IT "resisting" cloud strategies, business managers "resisting" analytics and data-driven decision making, operations and dev teams "resisting" DevOps, and the workforce "resisting" artificial intelligence.
Don't you get the impression that everyone is trying to derail great ideas and there is a cabal of end users and front-line IT staff conspiring in the lunch room to maintain the status quo with legacy systems?
Yes, there are individuals in every organization who fear and fight change, whether it is in how they do their jobs or even out of concern that change could cost them their jobs.
However, they aren't in the majority. I'll argue that what advocates for new concepts view as resistance is closer to passivity than to active resistance. Keep in mind that resistance implies a conscious decision to fight back. When it comes to adopting new technologies, the greatest challenge is overcoming ignorance -- and I don't mean that in an insulting way -- on the part of end users, business leaders, and IT staff.
I've witnessed this alleged resistance over the decades, through many generations of tech breakthroughs, dating back to when something called voicemail came into the workplace as a replacement for stacks of pink message slips.
There's a common flaw in most technology implementations. The people who are expected to use the technology are taught -- if trained at all -- how to make the technology work, as in which buttons to push or where to click. It tends to be about step-by-step procedures documented by people who already know those procedures and the lingo of the technology supplier.
All of that is about the "how" and not the "why" of technology adoption. If I'm going to use a new application I want to know how it fits into my workday and how it helps me do my job. If people don't know why that technology is going to help them, they simply are going to ignore it.
A complicating factor is that when you do try to use the new technology, it might not work. Maybe it's user error, maybe it's poor documentation, or maybe it's a product flaw. As the user, you can't be bothered to keep fighting with it. Without a clear reason to use the tool, it becomes easy to give up on it.
Whose job is it?
If someone shows you how that technology really can help you do your job you might stick with it, despite any flaws, because you see potential value.
Highlighting that value falls on the shoulders of the technology advocates, those who select the tech products, those who implement the tech, and even those in the upper pay grades who make the enterprise decisions.
Of course, the process of identifying that potential value starts when someone decides to acquire or build the new system. Smart companies do that already, defining the "business case" for the project. However, that tends to happen in the upper echelons, where the benefits fall into categories such as cost savings, acquiring new customers, and more efficient customer support. The decision makers might even bring in a handful of line workers who can show the tech team how they complete tasks and do their jobs. That information helps the tech team design the system and the interfaces, but can be easily forgotten in the months until system rollout.
When rollout does happen, project advocates are apt to say, "Here's a new app; be sure to use it." They forget to shout, "Here's how it will help you, and you, and you do your job."
Take the case of a new analytics initiative for the sales department. The sales team may be infamous for relying on their "gut" rather than data. If the tech/analytics team pitches a new analytics app as something that will tell sales reps how they should approach customers -- the type of pitch that many companies are learning to avoid -- the sales reps are almost guaranteed to ignore the app.
However, if that app is pitched as a tool that will supplement the sales reps' experience, providing data about factors such as why people say they won't buy a product or the impact of price changes, those reps can see the real business value.
Defining that value can be challenging. It means really getting to know how end users do their jobs and where they encounter roadblocks. Maybe it requires more face time for the tech team right in the user departments. Maybe it means bringing more user representatives into the development process (something that some DevOps experts advocate). Maybe it requires a commitment from the business unit manager to take the lead in exploring and explaining how the new tech will make a difference on a day-to-day basis.
Whatever the approach, it's time for new technology's advocates to recognize that shiny new tools will rust away if people can't see the real value they offer.

Jim Connolly is a versatile and experienced technology journalist who has reported on IT trends for more than two decades. As editorial director of InformationWeek and Network Computing, he oversees the day-to-day planning and editing on the site.