News | April 11, 2006

Dashboard: Web 2.0's Effect on Business Intelligence

Web 2.0 is a murky term whose definition even its creators can't articulate. But aspects of Web 2.0 are having an impact on business intelligence that will prove to be profound in a few years.

Tim O'Reilly, who organized the first Web 2.0 conference in 2004, wrote, "You can visualize Web 2.0 as a set of principles and practices that tie together a veritable solar system of sites." So one might conclude that prior to Web 2.0, the Web was a protosolar system, a hot swirling bubble of gas that hadn't coalesced into anything concrete yet. Many observers believe the bubble-bust of late 2001 swept away all the posers and pretenders and left standing only those Web businesses that had a bona fide chance of being viable, like Google. This may be a good metaphor from a market/investor point of view. From a BI perspective, however, we have to dig a little deeper to understand the architectural and functional aspects of Web 2.0 and what effect it will have.

For example, though BI vendors have ported most desktop functionality to browsers, for the most part we do not leverage the Web; we just deploy "WebTops." While Web 1.0 gave us portals, process integration and Web services, the wider capabilities made possible by Web 2.0, such as collaboration, advisers, personal agents and cognitive engines, have not entered the BI space yet. The catalyst for far better browser applications is a technique called Asynchronous JavaScript and XML (Ajax), which lets a page exchange small bits of data with a remote Web app instead of clunkily reloading the entire page with each change. When Google Maps, built with Ajax, was introduced, a new industry was born: writing browser apps that behave like desktop applications.
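The Ajax pattern described above can be sketched in a few lines. This is a minimal illustration, not any vendor's implementation: the `/api/revenue` endpoint, the `revenue` element ID and the dashboard fields are all hypothetical, and the delta-merge is deliberately simplified to a shallow object merge.

```javascript
// Current state of a hypothetical BI dashboard page.
const pageState = { region: "EMEA", revenue: 0, updatedAt: null };

// Merge a small JSON delta from the server into the page state,
// rather than rebuilding the whole page. Pure function, easy to test.
function applyDelta(state, delta) {
  return { ...state, ...delta };
}

// Classic Ajax wiring (browser-only; /api/revenue is hypothetical):
// fetch a small payload asynchronously and repaint one widget,
// while the rest of the page never reloads.
function refreshRevenue() {
  const xhr = new XMLHttpRequest();
  xhr.open("GET", "/api/revenue?region=" + encodeURIComponent(pageState.region));
  xhr.onload = function () {
    if (xhr.status === 200) {
      const next = applyDelta(pageState, JSON.parse(xhr.responseText));
      document.getElementById("revenue").textContent = next.revenue;
    }
  };
  xhr.send();
}
```

In a browser, `refreshRevenue` would be wired to a control (for example, `button.onclick = refreshRevenue`); Google Maps applied the same small-payload idea to map tiles as the user dragged the map.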

Another fundamental shift in Web 2.0 is the process of software distribution itself. The way BI software is licensed and deployed is still firmly rooted in pre-Internet thinking, unlike Web 2.0 leaders such as Google, which ships no software upgrades, just continuous improvement. There's no upfront licensing cost, only usage, all served from massively scalable commodity hardware running open-source operating systems.


P2P technologies break down the client/server model by turning every client into a server and shredding files so they can be served from multiple locations. An intriguing aspect of this for data warehousing is that the more often a file is requested, the better the performance, because more resources are engaged in servicing the request. Serving up popular queries and data this way would be very different from DBAs creating aggregations, denormalizations and data marts. But new issues will arise, too. For instance, when Web 2.0 extends your business processes to your suppliers, customers, partners and regulators in a truly distributed architecture, who will own the data? We might even ask: Who owns the metadata?
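The "shredding" idea can be made concrete with a toy sketch. This assumes fixed-size chunks and a simple round-robin assignment of chunks to peers; real P2P systems such as BitTorrent add per-chunk hashing, swarming and rarest-first scheduling, none of which is modeled here.

```javascript
// Shred a file into fixed-size chunks so each can be served
// from a different peer (toy model; real systems hash each chunk).
function shred(data, chunkSize) {
  const chunks = [];
  for (let i = 0; i < data.length; i += chunkSize) {
    chunks.push(data.slice(i, i + chunkSize));
  }
  return chunks;
}

// Round-robin assignment of chunks to peers: as more peers join,
// each request can draw on more sources at once.
function assign(chunks, peers) {
  return chunks.map((chunk, i) => ({ peer: peers[i % peers.length], chunk }));
}

// Reassemble the original file from chunks fetched from many peers.
function reassemble(assignments) {
  return assignments.map((a) => a.chunk).join("");
}

const file = "SELECT region, SUM(revenue) FROM sales GROUP BY region";
const placed = assign(shred(file, 8), ["peerA", "peerB", "peerC"]);
```

The more peers hold copies of a popular chunk, the more parallel sources a requester can pull from, which is the inversion of today's DBA-tuned aggregation approach that the paragraph above describes.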

A tour of large organizations' data warehouse/BI environments would reveal not only a few software and tool vendors garnering most of the business, but little turnover in the group from a decade ago. Open source, a staple of Web 2.0, should have a very large impact on that, posing a threat to the largest DW/BI vendors. --Neil Raden


[ KEY PERFORMANCE INDICATORS ]
Business Continuity Planning

In spite of last year's devastating natural disasters, a recent survey by OpenSky Research found that nearly half of U.S. businesses don't have a business continuity plan, and nearly 13 percent don't plan to use business continuity solutions. More than 45 percent suffered an IT failure last year, and only 12 percent of respondents are confident their company is doing enough to protect IT.
Competitive Advantage

Author and consultant Tom Davenport told attendees at a recent InformationWeek event that he had studied 32 companies and found only a third of them were truly "competing on analytics" (the topic of his January Harvard Business Review article). The rest were making some kind of effort toward obtaining and using analytics, but without any real competitive success.
Merrill Lynch

The SEC has concluded its investigation into Merrill Lynch's "systemic failure to furnish promptly" to SEC staff company e-mails from October 2003 through February 2005. The firm's e-mail system, the SEC found, captured neither messages absent from a user's inbox during scheduled backups nor certain "bcc" addresses. Merrill Lynch agreed to pay $2.5 million, and the SEC is dropping the case.
