Eight Comebacks on 'BI and Technology'

Excel comes with lots of baggage, and that has to be addressed. If there is a company policy that you don't wear flip-flops to work, why can't there be a company policy that unaudited spreadsheet models can't be used in presentations, or that data has to be sourced with adequate provenance before a spreadsheet can be shared?

Neil Raden, Contributor

April 14, 2008


There were lots of provocative questions and comments on my previous two posts ("Technology is Not the Driver of BI Adoption" and "BI and Technology: Part II"), so I thought I'd just batch all my responses together.

Andy:

Sure, the vast majority of users need Excel, but I wouldn't stop there. Excel comes with lots of baggage, and that has to be addressed. One way is to look at it as a management issue, not a software issue. If there is a company policy that you don't wear flip-flops to work, why can't there be a company policy that unaudited spreadsheet models can't be used in presentations, or that data has to be sourced with adequate provenance before a spreadsheet can be shared? There are even some enterprise-level software tools to assist with this, such as www.compassoft.com. I know there is some sentiment that all BI interfaces should be through spreadsheet interfaces, but I don't really buy that either. There is too much information that is not matrix-oriented. Anyway, pervasive BI as we know it today is sort of silly, and I think that's what you're getting at, so we're in agreement on that.

Jake Freivald:

When you mention BI can spread to a wide audience "…when they can use it as part of their day-to-day lives," I agree 100 percent. I always felt that BI should be a transformative process, but that doesn't imply that people have to change their lives or their work (though it can). In many cases, the transformation is just improving what they do by making it faster, easier, more accurate, timelier, more efficient and/or more informed. But most BI implementations I've seen assume that it will change the way people work. The problem is, BI implementers, whether internal or external, are rarely qualified to understand the nuances of the work that people actually do, nor do they have the tools or the portfolio to run a successful change management process. "Good idea, Neil, but aren't you the data warehouse guy? Maybe we should call McKinsey or Booz Allen to do this." I've had this conversation more than once. I call it the Moses effect - we get them to the Jordan River but can't get them across.

Unlike an ERP application, in BI, participation is usually optional. There are alternatives. No one ever mounted a get-them-to-use-spreadsheets campaign. People used spreadsheets, and continue to use them instead of BI for good reasons. It may not be a good solution as a multi-user application development platform from an enterprise point of view, but this is what people chose when left to their own devices. Spreadsheets provide some features that BI does not. Spreadsheets are expressive, BI is not. You get what's been provided and wait for the rest. Spreadsheets have no layers of security and roles that restrict your functionality or access. Spreadsheets allow people to construct what is relevant for them at the time. Spreadsheets are subversive and can be used to work the back channels, which is where the real work gets done. I'm not suggesting that BI roll over and play dead and let spreadsheets take over (at least, not any more than they already have). But it would be wise to learn why they work and emulate those features.

So I agree that BI needs to grow in organizations, it has the potential to be useful, but it has to be implemented in a way that is cognizant of the fact it won't be picked up if it means retooling the way people work. That can happen over time, but it won't get out of the starting blocks if that is the intention going in.

Cindi Howson:

When you mentioned the survey that found that half of the analysts weren't using BI, that's exactly what I meant: they are not potential users, and they are not going to use it.

Pervasive BI is also a little troubling. It's a term that was invented by people who have a vested interest in selling more licenses, but I haven't heard too many convincing arguments about how it is going to do anyone else any good, at least not in its current form. You can talk about alerts and other sorts of "embedded" BI, but unless you draw a clear connection between how people work (and its variations) and how you can accommodate that with existing BI practices that are highly contrived before-the-fact, I don't buy it. It just sounds like more of, "We're from IT, we're here to help" while people duck and cover. James Taylor and I are pursuing decision automation as one avenue, but the application of BI tools in an EDM architecture is before and after the events. BI and predictive models are used in the construction and refinement of business rules, and adaptive control mechanisms employ analytics to evaluate the accuracy of the decisions being made. But that's a lot different than putting a dashboard on everyone's desk.


I was with you until your comment about, "help people realize and communicate understanding." I don't think that's where pervasive BI should be headed. We need to aim lower. There are other tools that are better suited to this, such as enterprise search that combines content and structured data, collaboration tools, semantic technologies that can apply deductive reasoning, and automated attendants. This is all beyond the scope of BI.

Kurt Schlegel:

I don't think we disagree. I was trying to make the point that technology is not the driver, and these five technologies were not going to get BI out of the desert. It will take a lot more than that, like, for example, rethinking the whole proposition, which is still "getting the right data to the right person at the right time in order to make better decisions." This was a marketing phrase that became an aphorism, without merit. Let's just say that it may be possible that it is necessary to have good information to make decisions, but it surely isn't sufficient. In fact, it may not even be necessary. Predictive models make good decisions with lousy data all the time.

I'm not even sure it was good technology that made the Internet. The technology for the Internet, access to it and the content on it were all laughable a dozen years ago, but that's when it took off, with version 1.0 software. The need for it and the purposes that it served were so overwhelming that people (like me) tolerated slow dial-up speeds, shaky Web sites and horrible ergonomics just to get to use it.

So, I believe that some of these five technologies will be important for the next phase of BI, but if it is going to succeed, it needs to take a different direction. Existing BI is very useful and will not go away, but the growth is going to come from deploying new BI concepts to a different group of people. Not by turning the rank and file into analysts with whiz-bang technology. I don't believe in competency centers, having witnessed them as a user, an implementer and an analyst. Centralizing expertise is just an admission that no one can use it.


I was tracking with you for a while, but I think you got off the track when you said, in reference to decision automation, "First, we must be dealing with an operational activity, rather than an after-the-fact analysis. Second, the person responsible for the activity must be, as you put it, 'numerate.' Finally, the person must be empowered to actually MAKE a decision."

In decision automation, there is no person. In your example, when the bank Customer Service Rep picks up the phone, the call has either been routed to that person as a result of the decision service, or some information is provided to him/her, such as prioritized suggestions, or both. That's the end of the decision service until the next event, perhaps. But the CSR does not need to be numerate; he or she only needs to do the job. Being empowered to make a decision may or may not be part of it. Presumably, they know when to say thank-you.
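To make the flow concrete, here is a minimal sketch of the kind of decision service described above. All names, the scoring rule and the thresholds are hypothetical illustrations, not any real bank's or vendor's logic; the point is only that the decision happens on the call event, before the CSR is involved.

```python
def decision_service(call):
    """Decide routing and prioritized suggestions for an inbound call.

    The service runs automatically on the call event; the CSR only
    receives its output and never performs any analysis.
    """
    # Toy scoring rule (hypothetical): weight balance and tenure.
    score = call["balance"] * 0.001 + call["tenure_years"]
    if score > 10:
        queue = "priority_desk"
        suggestions = ["offer retention discount", "mention premium card"]
    else:
        queue = "general_desk"
        suggestions = ["standard greeting"]
    return {"route_to": queue, "suggestions": suggestions}


call = {"caller_id": "555-0100", "balance": 12000, "tenure_years": 4}
result = decision_service(call)
# The CSR simply sees the routed call and the suggestion list; no
# numeracy or decision authority is required on their part.
```

The design choice this illustrates is the one in the text: the decision service fires per event and then goes quiet, so "empowerment" and "numeracy" are properties of the service's builders, not its beneficiaries.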


You hit the nail on the head when you said, "to understand how to make BI a part of their lives in all aspects." I assume you mean in all aspects of their work. What I interpret this to mean is to adopt the orientation that BI exists to serve the (work) needs of the organization. To date, most BI implementations have been the other way around - let's adjust the way people work with this new technology. You can for some people, but not for everyone, only those who are temperamentally and intellectually capable. That has proven to be between 10 and 20 percent of the population, though I think it is actually less than that.

However, when you said, "We can't manage what we can't measure," I have some serious reservations. Many studies have shown that overreliance on measurement can lead to disastrous distortion and dysfunction. I'm not going to measure my kids by their grades and their SAT scores, and I hope my wife isn't going to measure me by my waistline or BMI.

I think I mentioned this in a previous article: there is an excellent book on this topic, "Measuring and Managing Performance in Organizations," Dorset House Publishing, 1996, by Robert Austin. He makes the case that performance measurement often leads, paradoxically, to distortion and dysfunction instead of improvement. Of the many reasons he cites, two stand out for me. First, any measurement of performance rests on a multidimensional model. If you are measuring the length of a board, a one-dimensional measure, it's pretty likely you'll get it right, or at least close within some tolerance level. But when you measure the performance of a salesman or the success of an advertising campaign, there are lots of variables to consider, and unless your model is a good proxy for the phenomena you are trying to measure, the risk is high that your metrics will be completely misleading. Designing good models of performance is difficult to begin with, which compounds the problem. Second, unlike 2 x 4's, human beings know they are being measured and are ingenious at defeating the system by gaming it, thereby generating the desired measurements but not adopting the desired behaviors.

So my suggestion is that what you measure is what you get, though it may not be what you're after.

John Patton:

I already answered John in my blog comments before I decided to compile all of these responses together, but I might as well repeat it here.

I agree with your sentiment, but I disagree that all BI is reporting. A report is a static display of information, so even an ad hoc report or a parameterized report is static in its final form, but the process of generating the latter two is not static. So you have to separate the report from the reporting process. OLAP may render each navigation as a static display, but the process of data exploration is hardly a static report.

Many BI tools offer a form of low-latency alerting based either on events or background routines watching for certain conditions. Again, what is rendered on the screen may look like a report, but the nature of its creation is not reporting.
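As a rough illustration of the "background routine watching for certain conditions" style of alerting mentioned above, here is a minimal sketch. The rule structure, metric names and thresholds are all hypothetical; real BI tools each have their own alerting configuration.

```python
def check_conditions(metrics, rules):
    """Evaluate each alert rule against a snapshot of metrics.

    Returns one alert line per rule whose condition is tripped; what
    gets rendered may look like a report, but it is produced by
    condition watching, not by a reporting process.
    """
    alerts = []
    for rule in rules:
        value = metrics.get(rule["metric"])
        if value is not None and rule["predicate"](value):
            alerts.append(
                f"ALERT: {rule['metric']} = {value} ({rule['message']})"
            )
    return alerts


# Hypothetical rules a background routine might re-check on each event.
rules = [
    {"metric": "inventory", "predicate": lambda v: v < 100,
     "message": "inventory below reorder point"},
    {"metric": "error_rate", "predicate": lambda v: v > 0.05,
     "message": "error rate above 5%"},
]

snapshot = {"inventory": 40, "error_rate": 0.01}
alerts = check_conditions(snapshot, rules)
# Only the inventory rule fires for this snapshot.
```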

When you speak of signals and context, you have my attention because what happens in BI, whether it is reporting or something else, is still pretty constrained by the models and templates that are created ahead of time. We have a long way to go before signals and context work their way into mainstream BI. I've seen oodles of fascinating start-ups dealing with this in many creative ways, but it's going to be a while before any of them start showing up on the short lists.
