Now that we have more than a decade of experience with modern business intelligence (BI) technology, an increasing number of "new" BI applications aren't really new at all. They're second-generation projects. We use this term to describe replacements for existing applications that have become obsolete for several reasons, including poor scalability and flexibility and limited acceptance by the user community.
Even highly successful implementations must someday be replaced. The technology platforms upon which they were built may have fallen out of favor, or the original developers may have moved on without leaving a well-documented legacy. Other possible reasons for replacement include new decision processes, new business challenges, changes in user profiles, merger and acquisition activity, and so on.
Some projects can certainly move forward with existing toolsets. But many organizations will choose to put the application "in play" and go through a vendor selection process. Their intent will be to break with the past and go with the best currently available tools — as opposed to what was available when the original toolset was chosen.
BI professionals have proven methods for driving a software selection process. However, best practices must change if you're dealing with a second-generation project. BI users and developers bring biases to the process that can hinder objectivity and lead to something other than true, fact-based decision-making. In this article, we'll explore some of the unique challenges of second-generation BI projects and identify where you must keep bias in check so that your organization doesn't get trapped by history on the way to reaching strategic business objectives.
The most obvious obstacle to effective vendor selection is the set of biases decision makers have acquired during their individual histories with BI. Some will say that it's important to keep the competition "fair" and level the playing field. However, the real goal should be to bring objectivity and efficiency to the process. Then, your organization can find the best possible fit of tools to applications and pursue strong return on investment (ROI). The products that offer the best fit will rightly have an advantage going into a competitive sales situation.
Recent experience is perhaps the strongest source of bias. Both users and IT professionals tend to associate their opinions about tools and vendors with their views on recent experiences with the application of these tools. Some even go so far as to name the application after the vendor: for example, "the organization gets its daily detail reports from Cognos" — meaning, of course, the application built using the vendor's BI tools, not from Cognos itself. If the application is successful, such close brand-name identification works in the vendor's favor.
Users and developers will, however, associate the weaknesses of the existing application with the vendor's tools, whether merited or not. As a result, incumbent vendors often go into the application replacement process with one strike against them. You'd think that the opposite would be true: The incumbent vendor already has license arrangements with the client, developers and users are already trained, and a code base has been established. Such advantages are negated if the vendor isn't perceived as supportive when the system hasn't proved to be satisfactory.
Bias based on recent experience can lead users and IT professionals to overestimate their knowledge of the tools' capabilities. In recent years, we've seen rapid enhancement and innovation in mainstream BI tools, much of it the result of the current wave of industry consolidation. Thus, previous vendor evaluations could be obsolete if they haven't been revised. It's rare to see applications rebuilt to leverage new product-release enhancements, even when those releases are installed.
The IT perspective generally harbors strong biases both for and against the status quo. For example, if the original technology buyers are still around, they'll tend to be defensive about their decisions unless convinced that requirements have changed or that another vendor has come up with a demonstrably superior product. Also, over time, vendors will often cultivate "preferred customer" relationships with IT personnel. Trusted relationships form between developers and vendor support personnel. IT managers become active in vendor user organizations. They occasionally benefit directly or indirectly from making their organization a valued vendor reference site.
IT managers are frequently reluctant to abandon existing expertise. To switch from a current toolset, their organization will have to acquire new skills, rewrite otherwise usable code, and retrain users and help-desk personnel. In some cases, the introduction of new tools requires changes to the underlying DBMS and/or OS platforms, which can compound the overall disruption and increase the risk.
There are other situations, however, where organizations favor change for its own sake. This powerful bias occurs, for example, when there's a change in IT management. As new buyers, they bring their own product preferences with them. They immediately look for an opportunity to exert influence. Coming into a situation where there's already dissatisfaction with an application, some new IT managers may feel a need to make a change solely to appease a disgruntled user base.
A third source of IT bias is the tendency of some technical specialists to do some "resume building" by gaining experience with a currently popular product. This factor creates a strong incentive to favor change.
Even when strong impetus exists to replace a BI solution, users may show significant resistance to changing vendors. Opinion leaders in the user community are often the power users who've developed expertise with current tools, along with the status and job security that comes with that expertise. Going to a new toolset endangers their positions. In view of this situation, user management will consider carefully the cost and disruption of a major retraining effort.
Of course, politics often play a key role in the decision process. Given the opportunity, users who didn't play a major role in the development of the original BI application will try to assume more control over its replacement. They'll do this either by attempting to take ownership of the technology selection process or by choosing a product that they feel they can implement with a minimum of IT support. Such decisions are sometimes made regardless of functional or technical fit.
As is the case with other technologies, BI users are also influenced by their peers in other organizations and will have a tendency to follow their lead with regard to vendor preferences. At times, a single highly visible feature or feature set will become a singular focus in a way that's completely inconsistent with its true importance. A common example is the manner in which BI tools work with Microsoft Office. Smooth Excel integration can provide a basis for compelling vendor demonstrations, but isn't, in and of itself, a preemptive reason to buy one product over another, especially if the application is to be deployed primarily over the Web.
To have success with a second-generation selection process, a fresh look is essential. BI vendor selection methodologies are generally aimed at "clean slate" scenarios, where buyers have little or no experience with the technology. Clearly, the presence of incumbent technology injects new factors into the process. The following sections discuss ways that we've learned to adapt vendor selection best practices to second-generation BI initiatives.
No matter how thoroughly your organization analyzed data volumes, response time, and other technical requirements when the original tools were selected, it's essential to completely reassess those requirements to establish a more current baseline. In the world of data warehousing, some reasons are obvious: You might have new data sources and an expanding user community. More subtle reasons, discussed in the sections that follow, include evolving security needs, new architectural standards, metadata management, and real-time data demands.
Security needs are another critical factor. Very often, first-generation BI environments were deployed with security standards set at the minimum that any mainstream toolset could provide. After all, part of the purpose was to make data more accessible. Now, with organizations typically much smarter and more demanding about security, second-generation BI systems must be better. BI products vary considerably with respect to security models. Choose your new toolset based on today's and tomorrow's security needs, not yesterday's.
Additionally, since you deployed your current BI system, the core technologies that comprise commercial software products — BI or otherwise — have almost certainly evolved. Very likely, your organization has some sort of "architectural blueprint" that governs permissible standards and platform technologies for all new applications; the blueprint offers a "playbook" of allowable interfaces among systems and components. If, for example, Web services, a common portal, and directory services integration are now required capabilities for all new deployments — whether transactional or analytic — then the technical compatibility requirements factored into your BI tool evaluation need to reflect this changing landscape.
Since BI is an end-to-end proposition, it's likely that both the ETL and user-facing tools (reporting, OLAP, and so forth) must conform as a group to a set of forward-looking technical standards. The key point here is that you proceed at great peril if you undertake second-generation tool evaluation and selection based on an obsolete set of technical requirements.
More often than not, first-generation BI applications do little beyond producing a mixture of standard and parameter-driven, after-the-fact reports and analyses that can't be altered without IT involvement. This lack of flexibility won't satisfy today's decision makers who value self-service and want the ability to look ahead. Most likely, your organization now employs several packaged applications. The desired data must pass through a new generation of ETL tools and reach users via the new BI tools that you're evaluating. While it's clear that you'll need to factor in issues related to these new classes of data, don't neglect the metadata issues associated with the new data sources in your evaluation.
A critical shortcoming of most first-generation BI environments is inadequate metadata management, due in large part to the product-centric nature of most early tools. Develop a broader and bolder vision of the role metadata management will play in your future BI environment; thoroughly evaluate the candidate tools to see if they can implement that vision.
Many organizations now have functional requirements that call for real-time data flows from source systems into the BI environment. This becomes more than just a technical issue; the business success of the BI environment may depend on recognizing that, for example, the widely used "Daily Sales Activity Report" currently produced overnight must now be available, on demand, several times a day to users across the enterprise. This sort of data demand has far-reaching implications not only for extract, transform, and load (ETL) tools, which must support multiple interface protocols between source systems and the data warehouse, but also for reporting and analysis tools. Consider how the system will alert users to the exact "age" or latency of the information they're analyzing. You'll also need to think about how the BI system will update information in OLAP cubes and caches.
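To make the latency point concrete, here is a minimal sketch, not drawn from any particular BI product's API, of how a report footer might stamp a dataset with its load time and present its "age" to users. The function name and format are our own illustrative assumptions:

```python
from datetime import datetime, timezone

def data_age_label(loaded_at: datetime, now: datetime = None) -> str:
    """Return a human-readable 'age' for a dataset, given its load timestamp.

    A real BI platform would pull loaded_at from ETL audit metadata;
    here it's simply passed in for illustration.
    """
    now = now or datetime.now(timezone.utc)
    minutes = int((now - loaded_at).total_seconds() // 60)
    if minutes < 60:
        return f"Data as of {minutes} min ago"
    hours, mins = divmod(minutes, 60)
    return f"Data as of {hours} h {mins} min ago"

# Example: a report footer showing that the morning load is 90 minutes old
loaded = datetime(2024, 1, 5, 8, 0, tzinfo=timezone.utc)
print(data_age_label(loaded, now=datetime(2024, 1, 5, 9, 30, tzinfo=timezone.utc)))
# prints: Data as of 1 h 30 min ago
```

The design point is that latency should be computed from load metadata and displayed alongside results, rather than left for users to infer from report schedules.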
A common mistake made during second-generation product selection is to evaluate against existing functionality that doesn't accurately reflect current (and certainly future) functional requirements. At one company, we encountered a BI environment largely centered on "The Management Dashboard" — a ubiquitous application that in reality functioned like a quasi-portal, with single-interface access to an assortment of reports and analyses covering almost every functional area in the enterprise. It had no true architecture; rather, it had evolved through the addition of components put in place to meet specific needs of the moment. Rather than evaluate replacement candidates against the current backdrop of wildly diverse functionality, we used new functional requirements, closely aligned with the company's strategic business goals, as the foundation for evaluating new ETL and user-facing tools.
Finally, when setting up the vendor Proof of Concept, include the current functionality that you want to retain. But make sure you highlight the differences among toolsets for essentially the same capabilities.
Other key factors, beyond technical issues, are important to consider. First, vendors fervently guard their "turf": They'll do everything they can to leverage existing relationships with key decision makers. You'll likely hear the ROI pitch, which emphasizes quantifiable benefits of staying the course with the current product investment — even if some glaring product shortcomings have tarnished the vendor's reputation among those who work most closely with its product.
At the same time, however, nonincumbent vendors covet "converted references": that is, organizations that have dumped rival products in favor of their own. If it makes sense to become a converted reference, don't forget that your conversion could give you some leverage in pricing negotiations.
Second, you'll see that pricing strategies vary considerably among BI software vendors. It's hard to count on even apples-to-oranges; you're more likely to face an "apples-to-end table" comparison. We recently issued a Request for Solution (RFS) document to five leading BI tool vendors for a second-generation implementation. From the exact same set of explicitly defined requirements for user population and functionality, pricing ranged from $750,000 to $15 million! When we analyzed the vendors' responses, it was clear that the wide disparity stemmed from challengers trying desperately to unseat the two incumbent vendors included in the evaluation.
A final pricing consideration to look out for is the up-front concessions and discounts that come only at the cost of excessive, back-loaded maintenance costs or expenses for upgrades and enhancements. In the RFS situation just mentioned, one nonincumbent actually refused to provide three-year maintenance pricing in concert with its lowball estimate. The vendor wrote that it wouldn't "provide maintenance pricing at this time and [would] do so only during final pricing negotiations."
In the sidebar "Vendor Evaluation: Words From the Wise," we've summarized a few "best practices" points to remember as you embark on vendor evaluations for second-generation BI projects. Here, we'd like to close with a few final recommendations about tool evaluation.
As BI environments mature, most organizations will find fewer and fewer areas where they can start with a clean slate. The good news is that more modern technology holds the opportunity of getting you closer to achieving your objectives. The trick is to move forward with open eyes, wiser from experience, and ready to balance all the factors to achieve a successful conclusion.
Steve Robinson is the Business Intelligence practice director for Verity Partners, LLC and co-author of Principles and Practice of Information Security (Pearson Education, 2003).
Alan Simon is vice president of the Business Intelligence Practice at Alliance Consulting; he is the author of Data Warehousing for Dummies (Wiley, 1997) and was the data warehousing columnist for Database Programming & Design.
VENDOR EVALUATION: WORDS FROM THE WISE