What are the latest directions in business intelligence? In our seventh annual report, we take a look at the major drivers impacting how you'll architect and implement BI in your organization. Could search, semantics and master data management push BI over the top?

Neil Raden, Contributor

August 15, 2006

"In all affairs it's a healthy thing now and then to hang a question mark on the things you have long taken for granted. " --Bertrand Russell

In the next five years, business intelligence as we know it will disappear.

To be more precise, BI as we know it will disappear from view. Existing BI will linger, but the real action will be elsewhere. In an accelerating arc of progress with enterprise systems, organizations are moving beyond purely operational systems and toward embedding analytics into operational processes. The need to take action on analytics, not just be informed by them, has never been more urgent. The relentless externalization of business, the rapid emergence of loosely coupled computing environments based on standards and the Web-as-the-platform paradigm are all driving BI in a new and exciting direction. This is an extraordinary opportunity for the BI community, but only for those who can adapt to the changing circumstances.

We'll take a fresh look at BI "megatrends"--the major drivers that are impacting how you'll architect and implement BI in your organization. Guided search, master data management, Web services, semantics and open source are just some of the exciting topics that we'll discuss in this year's tour of the changing BI landscape.

Bye-Bye, BI?

Last year's megatrends article predicted that BI would be dominated by a few large players. One year later, some of those players are struggling and many new ones are emerging to address the gaps in conventional BI. Open-source vendors like Pentaho and JReport are making their presence felt. Actuate provides BIRT (Business Intelligence and Reporting Tools), a report-authoring tool based on the open-source Eclipse development environment. Endeca Technologies, which offers a guided search, navigation and analysis platform, threatens to redefine the meaning of BI by concentrating on the user experience instead of the data.

Celequest is breaking new ground by delivering real-time agent technology with analytical capability and, more recently, an all-in-one BI appliance. And through interactive visualization and data manipulation, Spotfire is vastly expanding what users can do with analytics. Fueling the momentum toward software as a service (SaaS), Sharp Analytics and similar companies are providing expert-level domain knowledge bundled with software and services for a flat fee.

Small companies aren't the only innovators. SAP's NetWeaver architecture encourages third parties to add value to the SAP environment by developing hybrid, operational/analytical applications as combinations of services. Database vendors are incorporating analytics and also developing strategies or even new products for SOA. Farthest along is NCR's Teradata Division with its Teradata Application Protocol (TAP); it lets Web Services leverage Teradata data-warehouse engines.

Microsoft is moving its desktop tools, particularly Excel, into a Web Services format. Office 2007 will bring Excel Web Services, Excel integration with SharePoint portal and HTML output for viewing spreadsheets and manipulating them through a browser. Microsoft is tightening BI and Reporting Services integration with its SQL Server and Analysis Services. As its PerformancePoint plans unfold, the company will incorporate the proven capabilities found in the acquired ProClarity tools.

Typically, BI has been a disconnected activity, with users unable to traverse the big gap between being informed and being able to do anything about it. Currently, stacks of BI functionality, such as reporting, ad hoc analysis and OLAP (online analytical processing), serve only a small fraction of the user community (as measured by licenses)--10 percent to 20 percent, according to most studies, though actual use is probably even lower. Although the tools generally require no programming and the basics can be taught in a couple of days, conceptually BI still demands a level of understanding about the data, data models and data manipulation processes beyond most knowledge workers' skills or patience. Consequently, the level of IT involvement in BI is still high, leading to reduced ROI and an overall lack of agility and timeliness, except for those few power users who invest the time and effort to develop proficiency.

Despite these drawbacks, there are many BI success stories; it would be unfair to call BI an all-out failure. What factors become evident when you look at BI successes? The first is an unusually receptive company culture, often driven to BI by pressing crises that only BI technology can address. The second is a case where everything is simply done right: excellent application of sound practices along with management encouragement. Unfortunately, such conditions are not prevalent in most situations, causing BI to suffer from a "failure to thrive."

The BI community is replete with methodologies, layered architecture diagrams, technical product reviews, secrets of success, and mistakes to avoid. However, most of what's discussed focuses on the "plumbing" enablers: data movement, quality, integration, management and modeling. This stuff more properly falls under the heading of "data warehousing." What we should be talking more about is simply what real people need from BI to do their work.

Now, Adam Smith's invisible hand is moving; the BI market is beginning to shift as players bring into focus new opportunities to thrive. Some of the largest vendors perceive this movement and are reacting. Smaller players are pressing their advantages and making their presence known.

Megatrend Categories

Where are the strongest signs of a shift? Factors influencing BI's future fall into three categories, each with several important subjects: new capabilities, including search and guided navigation, master data management, semantics and operational BI; technology architecture changes, including service-oriented architecture, Web 2.0, Web services, emerging standards, Ajax and intelligent, real-time agents; and drivers external to the BI industry, such as open source, business externalization and Moore's Law.

• Guided search and navigation. Navigation through internal, structured data is provided by BI tools through prompted reporting, OLAP or dashboards. After years of trial and error with visual metaphors and sometimes ridiculous classification of people into "roles," such as farmers and miners, BI adoption is still well below expectations. Google, on the other hand, has effectively one interface, no roles and returns too many hits with little or no guidance. Yet, in a few years' time, Google has attracted billions of users. How can this be reconciled?

BI tools assume that users know what they are looking for and understand the meaning of the terms and the derivation of values and relationships, explicit and implicit, among all elements. BI's low acceptance rate shows the flaw in this assumption. Relatively feeble attempts at informing users through "metadata" are barely definitional. Cognitive studies show that people will pursue multistep navigation only if they are reasonably certain that they will find what they want. For a BI tool to get and keep someone's interest, it has to reorient the data without being captive to hierarchies and taxonomies that don't fit that person's frame of reference.

The application of search technology to the interactive BI experience offers a long-overdue improvement. It also carries the promise of linking indexed unstructured text, though real apps are a year or two away. BI vendors Cognos, SAS and others are responding to customer interest in search and navigation by inking partnerships with Google, Inxight Software and other search and unstructured content analysis vendors. Alternatively, rather than plug search into a BI product, the law firm Morrison & Foerster is using RecomMind's MindServer tools to go beyond keyword search and use concepts and relationships to gather information from assorted sources, including documents, databases, Web pages and e-mail. The objective is to keep its more than 1,000 attorneys informed about external developments in subjects important to clients and ongoing cases.

• Master Data Management (MDM). MDM focuses on how companies can share reference data among different constituencies, including business functions and external partners. Although several application and information management domains are trying to "own" MDM, the discipline is emerging as something separate--and thus, taking a big bite out of that mega-enabler of BI, the data warehouse.

Creating a single repository of integrated data--the so-called "single version of the truth"--is a core data-warehousing promise. However, application of this hard-fought utility has been limited to those processes that access the data warehouse. A single, common set of master data is too important to an organization to bury in the murky processes of a data warehouse. Data definitions mediated by parties that participate in the data-warehouse design do not necessarily serve the enterprise view, which is more operational than analytical.
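
To make the idea of shared reference data concrete, here is a minimal sketch--a toy illustration, not any vendor's MDM product--of a master crosswalk: each source system keeps its own local customer keys, and a shared mapping resolves them to a single golden record that both operational and analytical processes can use. The system names and IDs are invented.

```python
# Toy illustration of master data sharing (not any vendor's MDM product):
# a crosswalk maps each source system's local customer IDs to one "golden"
# master record, so every process resolves to the same entity.

MASTER_CUSTOMERS = {
    "M-1001": {"name": "Acme Industrial Supply", "country": "US"},
    "M-1002": {"name": "Bolt & Sprocket Ltd.", "country": "UK"},
}

# Each source system keeps its own keys; the crosswalk reconciles them.
CROSSWALK = {
    ("CRM", "C-778"):    "M-1001",
    ("ERP", "40021"):    "M-1001",   # same customer, different local key
    ("WEB", "acme-ind"): "M-1001",
    ("CRM", "C-790"):    "M-1002",
}

def resolve(system: str, local_id: str) -> dict:
    """Return the master record for a source system's local customer ID."""
    master_id = CROSSWALK[(system, local_id)]
    return {"master_id": master_id, **MASTER_CUSTOMERS[master_id]}

if __name__ == "__main__":
    # Records from two different systems roll up to the same master customer.
    print(resolve("CRM", "C-778"))
    print(resolve("ERP", "40021"))
```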

• Semantics. Because the data warehouse can only play a limited role in delivering MDM capabilities, semantic technology becomes relevant. The Semantic Web of frictionless information exchange that Tim Berners-Lee and the World Wide Web Consortium envision is still far off, but technology created in its pursuit is showing promise. Ontologies not only provide richer and more flexible ways to represent definitions, meanings and relationships; they also let computers draw inferences. In other words, by asserting things in an ontology, you could create a vast new resource of "implicit" information that has its own discoverable patterns and relationships.

Not everyone is convinced that ontologies are promising. Pessimists argue that the production of syllogisms is applicable only to that part of our lives where we currently use syllogisms: in other words, a very small part. Semantic technology is hot in the intelligence and military communities, where it is being applied to tasks such as unifying databases across law enforcement and intelligence agencies. However, the technology has yet to gain significant traction in commercial enterprises--at least not directly: Google, Yahoo and others are pursuing Semantic Web technology, which then becomes part of what commercial enterprises use to discover and interpret search results.

Thus, semantic technology is finding its way into data integration, search and navigation and as the basis for "supercharged" metadata efforts. IBM, which acquired Unicorn Solutions in May, is an example of a vendor applying advanced metadata management to SOA (service-oriented architecture) Web service governance and management, among other business-integration challenges. But what problems might prompt user organizations to bite on semantics themselves? One that draws interest: Most organizations exist as tightly bound groups that communicate internally in languages that group participants understand, but present barriers to communication for those not in the groups. As businesses outsource, offshore and otherwise "externalize" processes outside the borders of their organizations, fluid communication and understanding become essential.

Unfortunately, individual groups or functions are rarely willing to invest the time to exhaustively delineate all the elements and combinations required for translation and better understanding. They feel it's too big a job, and they're right. Here is where the opportunity may lie for semantic models, especially those based on first-order logic (as ontologies are). The models can allow structure to emerge over time and let machines draw inferences, without human participants needing to explicitly code each relationship.
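
As a toy illustration of that inference step--plain Python, not OWL or a real reasoner--the sketch below asserts a handful of explicit "is-a" relationships and derives the implicit ones by transitive closure; the entity names are hypothetical.

```python
# Toy illustration of inference over asserted facts (not OWL or a real reasoner):
# assert a few explicit "is-a" relationships, then derive the implicit ones.

ASSERTED = {
    ("gold_customer", "customer"),
    ("customer", "trading_partner"),
    ("trading_partner", "legal_entity"),
    ("supplier", "trading_partner"),
}

def infer_is_a(asserted):
    """Compute the transitive closure: every 'is-a' fact implied by the assertions."""
    inferred = set(asserted)
    changed = True
    while changed:
        changed = False
        for a, b in list(inferred):
            for c, d in list(inferred):
                if b == c and (a, d) not in inferred:
                    inferred.add((a, d))
                    changed = True
    return inferred

if __name__ == "__main__":
    implicit = infer_is_a(ASSERTED) - ASSERTED
    for subject, category in sorted(implicit):
        print(f"{subject} is a {category}")   # e.g. gold_customer is a legal_entity
```

No one typed "gold_customer is a legal_entity," yet the machine can discover it; that derived layer is the "implicit" information described above.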

• Operational BI. Although this subject is getting lots of attention--including through the adoption of technologies for business activity monitoring (BAM) and performance management--operational BI offers a classic example of a "boundary object": that is, something identified by different domains and therefore given very different meanings. Classical data warehousing and BI attempted to meet some operational reporting requirements with the addition of the operational data store (ODS). The role of the ODS has been to integrate "atomic"-level data, generally for later aggregation or summarization by data warehouses, OLAP and BI tools.

Unfortunately, the challenges of developing an ODS to meet the low-to-zero-latency demands of hybrid operational/analytical applications have mostly proven beyond existing data-warehouse infrastructure and mainstream BI tools. Melding operations and analytics requires that analytical tools step up to the service levels of operational software, something that is impossible to achieve with prevailing best practices. Also, it was never clear whether an ODS offered truly integrated data--and was thus part of the data warehouse and therefore lined up with other data structures in that edifice.

Most database vendors can provide at least near-real-time access to operational data on a physical level, but Teradata seems to be the farthest along in providing what it calls an "Active Data Warehouse." In this scenario, the warehouse supports tactical decision-making with performance levels and data refreshes approaching what you'd see in transactional systems. However, most existing BI environments provide weak support for the kind of real-time query federation between a data warehouse and other operational data sources necessary for operational-analytical hybrids to support business processes.
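
The sketch below is a hedged illustration of the federation problem, not any vendor's engine: a historical aggregate from a stand-in "warehouse" is blended at query time with live rows from a stand-in "operational" store, both simulated here with in-memory SQLite databases. Table and column names are invented.

```python
# Hedged sketch of real-time query federation: join a historical aggregate from
# a "warehouse" with current data from an "operational" store at query time.
# Two in-memory SQLite databases stand in for the real systems.
import sqlite3

warehouse = sqlite3.connect(":memory:")
warehouse.execute("CREATE TABLE sales_history (customer TEXT, total REAL)")
warehouse.executemany("INSERT INTO sales_history VALUES (?, ?)",
                      [("acme", 125000.0), ("bolt", 48000.0)])

operational = sqlite3.connect(":memory:")
operational.execute("CREATE TABLE open_orders (customer TEXT, amount REAL)")
operational.executemany("INSERT INTO open_orders VALUES (?, ?)",
                        [("acme", 7300.0), ("acme", 1200.0)])

def federated_customer_view(customer: str) -> dict:
    """Blend warehouse history with live operational data for one customer."""
    hist = warehouse.execute(
        "SELECT total FROM sales_history WHERE customer = ?", (customer,)
    ).fetchone()
    live = operational.execute(
        "SELECT COALESCE(SUM(amount), 0) FROM open_orders WHERE customer = ?",
        (customer,)
    ).fetchone()
    return {"customer": customer,
            "historical_total": hist[0] if hist else 0.0,
            "open_orders": live[0]}

if __name__ == "__main__":
    print(federated_customer_view("acme"))
```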

One alternative is to be very selective about the data elements you involve and use real-time agents to gather the data as it flows through a message queue or read the applications' logs for changed data. This is BI's version of event-stream processing. By providing such capabilities (and others) as part of its core technology, Celequest has found a way for BI to participate in an SOA cooperatively. The event orientation helps move analytical information out to users, processes or applications in real-time--either as a standalone set of dashboards or as services employed by composite applications.
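
A minimal sketch of that event-stream idea follows; it is a hypothetical illustration, not Celequest's implementation. An agent reads order events from an in-process queue (standing in for a real message bus), keeps running aggregates, and raises an alert the moment a threshold is crossed rather than waiting for a nightly batch load.

```python
# Hedged sketch of "BI's version of event-stream processing": an agent reads
# order events from a queue, keeps running aggregates, and alerts immediately
# when a threshold is crossed instead of waiting for a batch load.
import queue

events = queue.Queue()
for amount in (120.0, 80.0, 9500.0, 60.0):        # stand-in for a real message bus
    events.put({"type": "order", "amount": amount})
events.put(None)                                   # sentinel: end of stream

def run_agent(event_queue, alert_threshold=5000.0):
    count, total = 0, 0.0
    while True:
        event = event_queue.get()
        if event is None:
            break
        count += 1
        total += event["amount"]
        if event["amount"] >= alert_threshold:
            print(f"ALERT: order of {event['amount']:.2f} exceeds threshold")
    print(f"processed {count} events, running total {total:.2f}")

if __name__ == "__main__":
    run_agent(events)
```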

Operational BI or, more accurately, operational analytics, must be as lightweight and configurable as services. Grabbing a piece of historical data from a data warehouse, aligning it with current information from an operational process, perhaps dynamically generating a forecast based on trend analysis or even a stochastic process like Monte Carlo simulation to produce a range of outcomes--all these activities must happen transparently and in near-real time. Although it's true that many operational-analytical hybrids can operate in a more relaxed timeframe, the demand for analytical services will drive the development of fast, thin applets.
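
As a hedged sketch of such a lightweight analytical applet, the code below fits a naive drift-and-volatility trend to a short sales history and runs a Monte Carlo simulation to return a range of outcomes (10th, 50th and 90th percentiles) rather than a single-point forecast. The data and function names are invented for illustration.

```python
# Hedged sketch of a thin analytical "applet": fit a naive trend to a short
# history, then run a Monte Carlo simulation to return a range of outcomes
# instead of a single-point forecast.
import random
import statistics

def monte_carlo_forecast(history, periods_ahead=1, trials=10_000):
    diffs = [b - a for a, b in zip(history, history[1:])]
    drift = statistics.mean(diffs)          # average period-over-period change
    vol = statistics.stdev(diffs)           # variability of that change
    outcomes = []
    for _ in range(trials):
        value = history[-1]
        for _ in range(periods_ahead):
            value += random.gauss(drift, vol)
        outcomes.append(value)
    outcomes.sort()
    pick = lambda p: outcomes[int(p * (trials - 1))]
    return {"p10": pick(0.10), "p50": pick(0.50), "p90": pick(0.90)}

if __name__ == "__main__":
    weekly_sales = [100.0, 104.0, 101.0, 108.0, 112.0, 111.0, 118.0]
    print(monte_carlo_forecast(weekly_sales, periods_ahead=4))
```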

Technology Architecture

SOA, Web Services, W3C standards and Ajax all came to us through exploding use of the Internet. Ajax, which is a critical element of Web 2.0, has in its brief life spawned new and exciting applications that get us to visual environments as rich and interactive as a Windows client, but through zero-footprint browsers. Other Web 2.0 things of interest are standards and tools for collaboration, such as RSS feeds and Friend of a Friend (FOAF) for social networking--both of which are based on Semantic Web technology. Good analysis is only useful if it can be shared and vetted. Blogging is emerging as a respectable way to do that.

The Web 2.0 era is also spreading Google-like software licensing and distribution, featuring version-less upgrades. The agony of BI software version upgrades will simply not cut it in a loosely coupled, dynamic environment. Today, even in a fairly straightforward BI environment, managers face version upgrades with a dozen or more pieces of software, all on different upgrade schedules. This leads to excessive downtime and wasted effort, not to mention cost.

Polling databases is another resource-consuming BI effort. However, systems are beginning to incorporate technology such as unattended agents that know both what to look for and the most efficient way to find it. TIBCO Software, WebMethods and a few other vendors offer agents that can automatically choose the right means of communicating analysis results and information. In the past, software tools were managed with scarcity in mind: There was never enough CPU, memory or bandwidth. Today, scarcity isn't so much the problem; as the polling issue attests, the problem is finding and exploiting information resources. As SOA and Web services mature, it will be simple to devise and deploy "bots" to do the work of polling databases and searching for information automatically.
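
A simple sketch of such an unattended agent appears below--a hypothetical illustration using an in-memory SQLite table, not a vendor's product. Rather than re-reading whole tables, the agent remembers a watermark and polls only for rows changed since its last pass.

```python
# Hedged sketch of an unattended polling agent: instead of re-reading whole
# tables, it remembers a watermark and asks only for rows changed since then.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL, updated_at INTEGER)")
db.executemany("INSERT INTO orders (amount, updated_at) VALUES (?, ?)",
               [(250.0, 100), (99.0, 105), (410.0, 112)])

class PollingAgent:
    def __init__(self, conn):
        self.conn = conn
        self.watermark = 0          # last 'updated_at' value already seen

    def poll(self):
        rows = self.conn.execute(
            "SELECT id, amount, updated_at FROM orders WHERE updated_at > ? "
            "ORDER BY updated_at", (self.watermark,)
        ).fetchall()
        if rows:
            self.watermark = rows[-1][2]
        return rows

if __name__ == "__main__":
    agent = PollingAgent(db)
    print("first pass:", agent.poll())         # picks up all three rows
    db.execute("INSERT INTO orders (amount, updated_at) VALUES (?, ?)", (75.0, 120))
    print("second pass:", agent.poll())        # only the newly changed row
```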

Drivers External To BI

Three external factors will drive BI out of its comfort zone and into new modes of usefulness. The open-source movement is one of them. However, it remains to be seen whether open source's impact will be simply to drive down software costs or to start something bigger. Can the open-source community marshal its participants' creative resources to develop truly breakthrough products? Everyone knows about Linux, but BI software doesn't have a similarly vast number of interested parties behind it. On the other hand, the pressure might be a good influence on BI vendors, who might otherwise feel they can get away with mediocre software at exorbitant prices.

Two other factors look more influential: the continued externalization of business and Moore's Law. Today, from the smallest to the largest, all organizations have to transact business with partners and customers electronically. Historically, a single large customer such as Wal-Mart or producer such as Procter & Gamble could dictate the format of business-to-business electronic interchange. Those participating had to make a big investment in proprietary technology, including mainframe systems and EDI (electronic data interchange). Those that didn't were locked out.

With open standards solidifying and technology lowering the cost barriers to almost zero, nearly anyone can participate in electronic commerce. Now, the stumbling block in business externalization is getting all parties to accurately and effectively communicate and share information. It's been tough enough to get parties within an organization to find common ground; with business partners connecting over the Internet, commonality can't be taken for granted. Accurate interpretation of information must happen in real time.

In this new world, last-century solutions, such as "single version of the truth" in data warehousing, are unacceptable. Facing a global, externalized business environment, leading organizations must push beyond conventional BI and data-warehousing approaches and seek adaptable, agile solutions.

Finally, what megatrends discussion would be complete without a reference to Moore's Law? As mentioned earlier, BI and data warehousing came of age when resource scarcity confined technology innovation. Scarcity is baked into BI's circuitry. However, since conventional BI and data warehousing methods were first laid out, the capacity of physical resources has been riding a Moore's Law-like curve while costs have plummeted.

The time has come to radically rethink BI's basic methodologies. Caching, virtual data warehousing, query federation, autonomous agents, real-time and direct access to operational data stores are all more feasible now and must be built into the new BI. Doing so will help companies alleviate the delays and rigidity that confound current approaches.

No Longer A Solo Act

As mentioned at the outset, with analytics submerging into ERP, CRM and human resources management and other applications, we can no longer view BI as a separate discipline. The division of analytical from operational systems was artificial in the first place; it has proven inefficient, has made it difficult for managers and operational employees to connect analysis with decisions and action and has left open time-consuming and expensive gaps in overall workflow.

Embedded analytics, composite applications and operational BI all lead us to further co-processing of operational and analytical data and formulae. Since SOA will increasingly be the foundation of applications, companies will prefer to distribute and deploy analytical applets as needed. Monolithic BI suites with expensive per-seat licenses will lose favor--as will BI and data warehousing "stacks" cobbled together from different generations of technology. Market advantage will go to newer, less comprehensive entrants. Even older applications that provide specific functionality, such as visualization, Monte Carlo simulation or other stochastic processes, and industry-specific analytics, will get a leg up.

BI has not yet lived up to its expectations because those who were to be served most routinely rejected it in favor of less grandiose alternatives. BI needs to win hearts and minds by doing a better job of connecting with the new user experience. Data exploration that lacks guided search and a more useful abstraction from the physical data representation won't capture attention. And those layered architectures that look so good on PowerPoint slides? They won't cut it in a future dominated by flatter architectures characterized by SOA. Rather than a typical data architecture that looks like a layer cake, with all sorts of arrows and no explanation of what the arrows do, advanced organizations will embrace the cooperative, peer relationships among services.

Analysis is a collaborative effort, not a singular one. If BI is going to become pervasive in the enterprise, it must thrive within the network of business processes. In the not-too-distant future, the best BI may be the BI you don't even see.

Neil Raden is the founder and president of consulting firm Hired Brains. He is an author, analyst, consultant and implementer of decision-support environments. Write to him at [email protected].

Debriefing

Oz Benamram, Practice Resources Attorney, Morrison & Foerster LLP

You are in charge of firm-wide knowledge management. What's at the top of your agenda?

Our goal is to support actions. Ten years ago, we didn't have enough information. Now we have too much, but most of our systems were built to solve the last decade's problem. Our job is not simply to be a librarian and make information available online. We have to learn from retailers, who are measured by demand. We have to understand what the mission-critical actions are in our organization and support those.

How do you apply knowledge about users to assembling information?

We try to understand who you are and what you want. If I know that you are a second-year associate and you're working on your first merger, we want to push you video training or other materials. If a tax partner in the San Francisco office does a search and two identical documents come up, we want to make sure they know which one is more relevant. For BI and text search to work together, you have to have the "I": intelligence. Dumb systems don't understand that information demands change as the process and decision-making situations change. We use RecomMind MindServer [enterprise search and machine learning] to basically do what the e-commerce guys do so well.

How are you improving processes at Morrison & Foerster?

Rather than tag each document, we let MindServer tag them for us. Even if it is not 100 percent accurate, the software is faster and more consistent than a human. We then have one person tag the project, doing what the software cannot do. And we have automated agents reading court filings and matching information with billing and other systems so the proper people are notified when a client is involved in a legal action. This helps us shorten processes dramatically and assemble the right team to address our client's needs without wasting time.

Executive Summary

Far from resting on its laurels, the business intelligence (BI) community must stay true to its mission of leveraging information for strategic advantage. No matter how the universe is defined, most surveys report that the current BI tool user base is smaller than it should be. Competitive advantages once delivered by standard BI query, reporting and analysis products are becoming commoditized. Consolidation of separate tools into larger suites and the "mainstreaming" effect of Microsoft's growing presence mean that the adventurous must push out further to capture the next information edge.

Part of BI's new frontier comes into focus by incorporating tools for search and analysis of unstructured information, which through metadata and semantic integration can let decision-makers combine internal and external sources to gain a complete, single view of objects of interest. BI's other adventure may involve the loss of its separate identity. Process management, service-oriented architecture and real-time information applications are fueling growth in embedded analytics, activity monitoring and agent-based event alerting.

No matter how adventurous, BI is successful only if project leaders gain business users' support by demonstrating strategic value. A solid underlying data strategy is essential. Tool standardization will save money and improve BI coherence, but shouldn't come at the expense of satisfying the never-ending urge for insight.
