Application Insight: Don't Let Process Become an Innovator's Dilemma
Businesses must be extremely vigilant that they aren't "automating out" exploratory innovation all in the name of process automation.
Is your company focused on exploiting existing approaches and processes, or does the corporate culture encourage innovation? Everything we're hearing about the value of business process management and process modeling revolves around this somewhat oblique question, and how you answer it may determine how closely you want to embrace the business process automation, service-oriented architecture and business process modeling trends du jour.
The crux of the issue, raised in a 2002 article by two business school professors, Mary Benner and Michael Tushman, is that highly developed processes may be an excellent way of making companies more efficient, but they can become an inhibitor of the out-of-the-box creativity that drives what the authors call "exploratory innovation."
Process management can allow companies to excel at exploiting innovation by leveraging great ideas through the use of a highly efficient set of processes. But the moment you try to be truly innovative, investigating radically new approaches or technologies, process becomes an inhibitor. "Our results," Benner and Tushman wrote, "suggest that exploitation crowds out exploration." In other words, too much process can be as bad as, or worse than, too little process.
These findings have been borne out in recent research by Robert Cole, professor emeritus at the Haas School of Business in Berkeley, Calif. In his new book, Recovering from Success: Innovation and Technology Management in Japan, Cole looks into why NTT, a leading Japanese systems integrator, failed to jump on the Internet and its TCP/IP protocol, and thereby lost an opportunity to be a market leader. Again, process versus progress was at the crux of the problem.
TCP/IP is based on packet-switching technology that has an extremely high tolerance for network failure. The assumption of failure built into the protocol was antithetical to NTT's culture. The Japanese integrator had a "powerful and effective reliability culture that stressed progressive elimination of error," Cole says. A protocol like TCP/IP, which assumes that error is a given, was at odds with NTT's notion of process, and so its strategists rejected TCP/IP, and the Internet, as a domain in which they should direct their efforts.
What does this all mean for business process management, modeling and the future of service-oriented architectures? Automated processes often take on a life of their own, regardless of whether the requirements have changed or the environment has evolved. We have all seen this in our jobs, in our interactions with the service economy and in our dismay at how our political and social systems work.
It's bad enough when there's an option for human intervention in largely people-oriented processes. But when process for process's sake meets the world of highly automated software, watch out. As the push to encode process in software continues, businesses must be extremely vigilant that they aren't "automating out" exploratory innovation, or stifling good ideas, all in the name of process automation.
It's important to note that the companies Benner and Tushman researched had all implemented ISO 9000, Six Sigma or Total Quality Management, process-improvement methodologies that, it turns out, were just as successful at creating a straitjacket for innovation as they were at making quality a major company goal. In Cole's case study, NTT was a former Deming Prize winner, the quality world's equivalent of a Nobel or a Pulitzer. The bottom line: Even though these companies were good at process, they missed the opportunity to be innovators.
So is process an innovator's dilemma? Only if you let efficiency get in the way of creativity. Because, fundamentally, process rigidity is in the mind of the developer. Most business process management, modeling and SOA tools on the market can support a fluid process model that allows exploitation to be replaced by exploration. The onus is on the user to know when such a replacement is necessary and to make sure "that's the way we always do it" isn't a prelude to "why didn't we try to do something else?" After all, you can't take ISO 9000 certification or a Deming Prize to the bank. No matter how hard you try.
Joshua Greenbaum is a principal at Enterprise Applications Consulting. Write to him at email@example.com.