Kicking off the new year, we're going for seven trends that represent the kind of moving and shaking in business and IT that will have repercussions beyond just the next release. Forget the little stuff--we're talking tectonic shifts.

InformationWeek Staff, Contributor

December 13, 2006

Are you ready for 2.0? It's in vogue to affix that software-style upgrade digit to just about every industry. Driven by the Internet, 2.0 proclaims the start of a new generation of business collaboration and leadership. Kicking off the new year, we're going for 7.0--seven trends that represent the kind of moving and shaking in business and IT that will have repercussions beyond just the next release.

Our story is a collaborative effort. Editor Doug Henschen gets us going with a look at the importance of knowledge capture in the face of a brain drain caused by key personnel departures. Noted consultant Neal McWhorter discusses the growing role of business analysts as companies look to achieve agility. Executive editor Penny Crosman writes about electronic medical records, which, if standardized, will overhaul several industries, not to mention enable better health care. Then, manufacturing expert Michael McClellan talks about how firms must get out of application and information silos if they are ever to deliver on enterprise goals.

Our final three focus on technology. We're honored to have David Patterson, long one of the industry's leading experts in computer architecture, describe how the time has come to truly exploit parallelism. We then close it out with David Stodder's look at how real-time information demand is changing business intelligence and how service-oriented architecture is doing a 2.0 on business integration and collaboration.

1. Capture Expertise Before Boomers Retire. Think of the impact when one of your most trusted, experienced employees retires. Now multiply that impact by 3 to 4 million employees each year over the next two-plus decades and you'll get some idea of the looming brain drain organizations will face as baby boomers retire (see "By the Numbers," right).

By the Numbers

3.4 million
Number of U.S. baby boomers turning 62 in 2008

62
Average retirement age reported by the Bureau of Labor Statistics in 2000

26.6 million
Number of U.S. workers who will be 55 or older in 2010

55
Age at which 2/3 of government workers are eligible to collect pensions

4.3 million
U.S. births per year in '57, the peak of the Baby Boom

3.1 million
U.S. births per year in '74, the nadir of the Baby Bust

Knowledge transfer is an ongoing business problem, whether it's tied to retirement or to employees changing jobs, transferring to new assignments or being laid off. Meanwhile, the very nature of work is changing.

"The problem we're really up against is that we're moving into a knowledge economy, yet most companies and nearly all individuals are ill-prepared for working in that economy," says Jonathan Spira, chief analyst at Basex and author of Managing the Knowledge Workforce (Mercury Business Press, 2005).

Not only is intellectual property (such as software) and expertise (such as services) increasingly the product, the value of intellectual capital behind physical goods routinely outweighs that of factories and infrastructure (which can be outsourced). At the same time, companies are being transformed by globalization, mergers and acquisitions, outsourcing and telecommuting. These trends have been enabled, in whole or in part, by technology, but they also are being held back by information silos and inefficient ways of sharing knowledge.

The Airbus debacle is a case in point. Airbus (and its parent company, EADS) bet the company on building the A380, a 555-seat, super-jumbo jet that surpasses Boeing's 747 as the largest commercial airliner. The German-French conglomerate had 164 orders for the plane, but that was before manufacturing problems set back production by two years, which, in turn, led to $6 billion in losses and cancelled orders. The biggest problem has been design flaws in the aircraft's wiring.

"The simple explanation was that their PLM systems were on different versions, but their internal organizations [and widely scattered subassembly plants] were so disconnected that they didn't find out the wiring didn't fit until it arrived at the factory in Toulouse, [France]," Spira says. "That shows what can go wrong and why we need to address the need to share knowledge and collaborate."

Imagine a portal-like workspace in which you can both view all needed information, regardless of source, and do your job. Spira calls it the "collaborative business environment."

"The system provides contextually embedded community and collaboration tools that appear as you do your work," he says. "It could be as simple as a search-result window having a presence awareness icon so you can start sharing messages with the author of a document."

Aspects of content management, expertise management, e-learning and various collaboration tools will all play in this future, and Spira says recent IBM and Microsoft moves toward blending personal productivity tools and portals (with Notes/Workplace/WebSphere and the Microsoft Office Systems, respectively) will be "giant leaps ahead in integrating all the components we believe will be part of the collaborative business environment."

Blogs and wikis also will have a role. As the New York Times Magazine reported in the Dec. 3 cover story "Open-Source Spying," the U.S. intelligence community is putting these tools to work to capture tacit knowledge, turn it into explicit knowledge and make connections between bits of information that might uncover the next terrorist plot. In the fall of 2005, a CIA team built Intellipedia, "a prototype wiki that any intelligence employee with classified clearance could read and contribute to," the Times reported. "By this fall, more than 3,600 members of the intelligence services had contributed 28,000 pages," and developers are expanding the project.

Encouragingly, several trends are helping organizations capture knowledge that otherwise might be lost. Because authors aren't necessarily good archivists, many content-management systems automatically tag and retain new documents based on the context of creation and use. Business rules systems are pulling business logic out of code and turning it into human-readable information. And process-modeling tools are being used to document and diagram core business processes.
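As a rough illustration of that second trend, here is a minimal sketch, with invented rule names and thresholds, of what "pulling business logic out of code" can look like: the rules become declarative, human-readable data, and a small generic engine evaluates them.

```python
# Rules expressed as data rather than buried in application code, so that
# analysts can read and change them. Names and thresholds are illustrative.
RULES = [
    {"name": "Large order needs approval",
     "field": "order_total", "op": "gt", "value": 10000},
    {"name": "Rush order needs expedited shipping",
     "field": "days_to_deliver", "op": "lt", "value": 3},
]

OPS = {"gt": lambda a, b: a > b, "lt": lambda a, b: a < b}

def fired_rules(record):
    """Return the names of all rules the record triggers."""
    return [r["name"] for r in RULES
            if OPS[r["op"]](record[r["field"]], r["value"])]

order = {"order_total": 25000, "days_to_deliver": 7}
print(fired_rules(order))  # ['Large order needs approval']
```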

The bigger problem will be transforming routine collaboration without forcing uncomfortable changes in work habits (like asking everyone to blog) or instituting heavy-handed debriefings that might scare older workers not quite ready to retire. "Attempting to actively document particular systems or procedures in anticipation of a pending retirement might be viewed as an indication that an 'age-related' layoff is planned," writes technology blogger Dennis McDonald.

Like many experts, McDonald suggests building lexicons or "expertise maps" that list topics, systems and processes that are important to the organization. You can then start to compile information topically with informal tools, such as blogs and wikis, or use more formal efforts to document processes.
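An expertise map can start out as something as simple as a structured lexicon. The sketch below is hypothetical (topics, systems and names are invented) but shows how such a map could flag which knowledge-capture efforts to prioritize when an expert announces retirement.

```python
# Toy "expertise map": topics important to the organization mapped to the
# systems and people who know them. All entries are illustrative.
EXPERTISE_MAP = {
    "mainframe billing batch": {"systems": ["BILL01"], "experts": ["R. Alvarez"]},
    "plant QA reporting":      {"systems": ["QATrack"], "experts": ["M. Osei"]},
}

def capture_priorities(departing):
    """List topics whose only listed experts are departing employees."""
    return [topic for topic, info in EXPERTISE_MAP.items()
            if set(info["experts"]) <= set(departing)]

print(capture_priorities(["R. Alvarez"]))  # ['mainframe billing batch']
```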

McDonald also suggests a strategy of post-retirement engagement. "When one considers the operation of an expertise management system, retirees [could] continue to function as 'online experts' after they formally retire," he writes. "In this way, the organization could continue to take advantage of the knowledge and expertise of retirees even after they formally leave the organization."

--Doug Henschen

2. Designing for Agility: Business Analysts Step Up. The role of the business analyst has emerged as a focal point for enterprises trying to wring more out of their automation-technology investments. The business analyst has always been a key player in identifying and articulating business objectives to IT and bringing projects through to completion. But there are new factors changing not only business/IT alignment, but also the role of the business analyst.

Ironically, outsourcing is feeding a frenzy to bring aboard more analysts. The experience of outsourcing IT has taught firms their technology-project specifications were in much worse shape than they believed. When companies would deliver specs to outside firms and individuals who didn't have deep experience with their existing automation solutions, the knowledge gaps became painfully apparent. The solution has been to re-emphasize comprehensive specs that leave no room for interpretation of the desired business behavior but to leave decisions about IT implementation to the outsourcer.

What does this mean for business analysts? Companies are looking to analysts to lead them into the Promised Land of business agility. The analyst role is evolving into something more akin to what we would think of in the consumer-products field as a product designer. This shift puts the business analyst in the vanguard of many organizations' efforts to differentiate by reducing the time it takes to complete a development cycle and deliver design refreshes quickly.

Thus, organizations are changing their thinking about the business-analyst role and the tools he or she needs. (See the "Debriefing" with business analyst Denise Birdsell below.) Business analysts must be able to design a complete business specification without connection to any particular IT implementation. This is equivalent to what manufacturers did to enable themselves to construct computer-based product simulations before ever having to cut a single piece of material.

Tools for this new role are either available or on the way. These include business process management and business rules management platforms as well as sophisticated, business-level modeling tools based on standards, such as UML (Unified Modeling Language). These tools and platforms make it possible for the analyst to model requirements in business terms and then execute directly from those models.

A key development to watch will be the growth in adoption by tool vendors and user organizations of the Object Management Group's BPMN (Business Process Modeling Notation). And as Executable UML becomes more prevalent, we should see the emergence of virtual execution environments, which will let organizations model an automation solution from soup to nuts, then execute it.
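As a toy illustration of "model it, then execute it": the sketch below declares a process as data in the spirit of BPMN (real BPMN is far richer) and walks it with a generic interpreter. Step names are invented.

```python
# A business-level process declared as data, executed by a tiny generic
# interpreter -- a drastically simplified stand-in for an executable model.
PROCESS = {
    "start": {"do": "receive order", "next": "check"},
    "check": {"do": "check credit",  "next": "ship"},
    "ship":  {"do": "ship goods",    "next": None},
}

def run(model, step="start"):
    """Interpret the process model one step at a time."""
    while step is not None:
        print("Executing:", model[step]["do"])
        step = model[step]["next"]

run(PROCESS)
```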

But tools alone won't empower business analysts to step up. Organizations must build expertise and establish a common skill set that supports these goals. Until recently, there was no trade organization to define and champion the business analyst's new role; the International Institute of Business Analysis is emerging to fill that void. The institute is developing a certification program to give business analysts a formal path for professional development and to define a common body of knowledge so that organizations can set expectations for this role.

The transition will require other organizational changes, and a difficult decision: Does this role belong on the business side or within IT? The struggle with this question reflects the deep divide that exists between these two sides in most organizations. But as businesses realign themselves to focus energy and resources on innovation, the importance of the business analyst role may force them to finally settle the business/IT divide.

Neal McWhorter is principal at Enterprise Agility, which helps organizations transform their business goals into business solutions. You can write to him at [email protected]

3. Integration as Cure: the Future of Medical Records. During Hurricane Katrina, there was a crying need for electronic medical records (EMRs) that could be read by emergency rooms and doctors. "A patient would walk in and a doctor would ask him what medications he was on," says Dr. Lynn Harold Vogel, CIO at The University of Texas MD Anderson Cancer Center. "A typical response was, 'I take two yellow ones in the morning, three blue ones at lunch and a pink one with dinner.' We had physicians sitting with patients going through the Physicians' Desk Reference, which has pictures of pills, pointing out which pills they took."

Electronic health records could improve care, reduce errors and eventually lower costs, but obstacles abound. Most hospitals and doctors' offices use noninteroperable software and databases. More than 25 health-care software providers, including Cerner and Epic Systems, offer programs written in arcane languages like MUMPS and databases such as Caché, and they're apparently unwilling to rewrite their software to meet the HL7 standard for sharing health-care data. Also, some say HL7 is no panacea. "HL7 is a lot like Unix," Vogel says. "My version of Unix may be slightly different than yours, so it's always a problem to get everybody to do exactly the same thing."
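For readers unfamiliar with HL7, version 2 messages are pipe-delimited segments, and much of the interoperability pain Vogel describes comes from institutions populating those segments differently. The sketch below parses a simplified, invented message; it is illustrative, not a compliant HL7 parser.

```python
# A minimal look at HL7 v2-style structure: segments separated by carriage
# returns, fields by pipes. The sample message is simplified and invented.
MESSAGE = (
    "MSH|^~\\&|SENDAPP|HOSP_A|RECVAPP|HOSP_B|200612130930||ADT^A01|0001|P|2.3\r"
    "PID|1||12345^^^HOSP_A^MR||DOE^JOHN||19600101|M"
)

def parse_segments(message):
    """Split an HL7 v2-style message into {segment_id: field list}."""
    segments = {}
    for line in message.split("\r"):
        fields = line.split("|")
        segments[fields[0]] = fields[1:]
    return segments

msg = parse_segments(MESSAGE)
# The patient name lives in PID-5, itself ^-delimited into components --
# unless the sending system put it somewhere slightly different.
print(msg["PID"][4])  # DOE^JOHN
```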

Who will pay for the necessary new technology and national framework? Hospitals prefer to keep their patients in their hospital, and their data in-house. Doctors' practices tend to be undercapitalized and have tiny IT budgets. Medicare, Medicaid and publicly traded insurance companies like Aetna and Prudential are all under increasing pressure to keep costs down. Employers already pay a lot for health care--General Motors spends more on employee health care than it does on steel for its cars; Starbucks spends more on health care than it does on coffee beans. Although EMRs could drive such costs down, someone has to make a large up-front investment to make it happen.

Yet e-medical record initiatives abound:

• More than 200 Regional Health Information Organizations (RHIOs) have been established in the past few years to promote the sharing of health information. Two of the most successful are in Santa Barbara, Calif., and in Indianapolis, cities that have histories of institutional cooperation. In November, Vermont's RHIO chose GE Healthcare to provide a statewide hosted EMR system that will store medication history for consenting patients; it will initially be available at two pilot hospitals. Vermont's RHIO is also building an exchange to connect providers, payers and patients throughout the state. So far the project is being funded by $1.4 million in seed money from the state legislature. But who will fund it when that money is used up?

• Some vendors, such as RedMedic, offer patients hosted access to a secure, Web-based electronic record the patients maintain and control. In December, five big U.S. employers--Applied Materials, BP, Intel, Pitney Bowes and Wal-Mart--launched a health-record system that will provide a similar service for their 2.5 million employees starting in mid-2007. It uses the Connecting for Health Common Framework developed by a collaboration of medical groups, insurers and nonprofit groups. Will other facilities adopt this framework?

• Taking the idea of a portable health record to a new level, John Halamka, CIO at Boston's CareGroup Healthcare System, had an RFID chip implanted in his arm a year ago. The VeriChip contains a 16-digit identification number that can be read with a special reader and is linked to a database at a CareGroup facility. Issues with this approach include the fact that a health-care provider can't tell whether a patient has a chip implanted and that readers could be used to obtain patient information without the patient's consent (not to mention that most people don't want a chip implanted in their body).

• A handful of medical institutions, including The University of Texas MD Anderson Cancer Center, are building their own electronic health record systems. MD Anderson is re-architecting its clinical software from Microsoft Visual Basic to .NET. With the help of consulting firm Avanade, the center is using Web services to gather data from legacy databases, and it plans to make its EMRs accessible to authorized outside doctors through a portal (a hypothetical sketch of this aggregation pattern follows below). Such hospital-initiated efforts could be linked through common patient identification numbers, the way state motor-vehicle bureaus use a common driver's license database.
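The hypothetical sketch below (not MD Anderson's actual design; source names and record shapes are invented) shows the general pattern: a thin service layer aggregates one patient's records from several legacy stores and refuses unauthorized callers.

```python
# Hypothetical service layer gathering a patient's records from several
# legacy sources into a single view for an authorized portal. The sources
# here are stand-ins; real systems would call out over Web services.
LEGACY_SOURCES = {
    "pharmacy_db": lambda pid: [{"med": "atenolol", "dose": "50mg"}],
    "lab_db":      lambda pid: [{"test": "A1C", "value": 6.1}],
}

def patient_summary(patient_id, authorized):
    """Aggregate records across sources; refuse unauthorized callers."""
    if not authorized:
        raise PermissionError("caller not authorized for this record")
    return {name: fetch(patient_id) for name, fetch in LEGACY_SOURCES.items()}

print(patient_summary("12345", authorized=True))
```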

So, plenty of smart people are working on the challenge of integrating systems and information via EMRs. Maybe 2007 will be the year they start talking to each other and working together. --Penny Crosman

4. Manufacturing's Job One: Improve Information Flow. In 2007 and beyond, manufacturing companies know they must make progress in both integrating and reducing the number of applications within the operations side of the enterprise. The latest management initiatives, such as demand-driven manufacturing and the real-time enterprise, not to mention improved regulatory compliance, all require high-level process access to information. Because this information is generated in real time across many applications and locations, gaining access to it is difficult, to say the least.

From product development to production to customer delivery, the problems in operations have become even more glaring because information systems on the business side of most manufacturing companies are now better integrated and consolidated. In a typical small-to-midsize manufacturing facility, it's not unusual to find as many as 50 standalone data-generating applications, none of which share or easily give up information. I know of one company that has 24 different, nonintegrated applications on its 200-person plant floor. This does not include engineering or downstream warehouse and shipping applications.

A quality-assurance application instance, for example, might be running on a desktop computer in the office of the plant quality manager--and it was probably purchased 10 years ago. When installed, the application fit the manager's exact requirements--and still does. But in 2007, let's say the company decides that it wants to collect and coordinate quality-assurance information across multiple plant locations and access similar information at its supplier's facilities. And the company wants to do this as events occur.

Such an enterprise initiative will stall quickly. Historically, plant applications have been purchased by department managers according to the organizational chart. Within a manufacturing facility, there is usually a maintenance department manager, a quality assurance manager, a scheduling manager, a warehouse manager and so on. Each has the authority to buy technology solutions based on his or her department's needs. As a result, many data functions overlap--or go unused. From a higher-level process perspective, too many gaps exist. And layers of complexity grow thicker as enterprises acquire companies, reorganize or in other ways increase the number of nonintegrated systems.

Companies are finding that they must frequently enter (and re-enter) common information, such as the product order number or identification number into the many plant and value-chain applications. This is expensive and prone to error. Any attempt to apply IT department standards to the variety of disparate systems is difficult and costly. Finally, the portion of the budget dedicated to maintaining older, narrow applications is growing, making it difficult to adopt new systems that might be broader and more accessible.

Product-production data is needed to support regulatory compliance. Real-time information demands coming from process- and equipment-control systems, as well as plant-technology applications, also make the status quo untenable. Response to these pressures will require manufacturing companies to change, among other things, the departmental approach they've taken with purchasing technology.

Manufacturing executives are getting the religion that if they can solve the problem of gathering information from their company's portfolio of data sources in the plant or value chain--and build that capability into a linked, enterprise process--they will have a considerable competitive advantage. Thus, in the coming year we will see the leaders in manufacturing focused on lessening redundancy and improving data quality through integration and consolidation. And they are on the lookout for more powerful and accessible systems that can support enterprise objectives and regulatory compliance.

Michael McClellan is president of Collaboration Synergies, Inc. and author of Collaborative Manufacturing: Using Real-Time Information to Support the Supply Chain. Write to him at [email protected].

5. Parallel or Bust: Computing at a Crossroads. When Intel announced in 2004 that it had canceled its successor to the Pentium 4, the news marked a milestone in the history of computing. Rather than delivering on its promise of 10-GHz clock rates by the end of this decade, Intel joined AMD, IBM and Sun Microsystems in saying it could no longer build microprocessors with much higher clock rates.

The media's mischaracterization of Moore's Law is now evident. Gordon Moore predicted the regular doubling of the number of transistors on a chip. The job of computer architects was to turn twice as many transistors into twice as much performance. Between 1986 and 2002, architects succeeded, and we saw the greatest sustained increase in performance in computing history. The problem was that they kept increasing the power dissipated per chip, and in 2004 it was obvious that the industry had hit a power wall. Today, microprocessors are about a factor of three slower than if we could keep increasing power and doubling performance every 18 months (see "Microprocessors Hit the Wall," right). Thus, while Moore's Law continues, power dissipation hit the wall.

Fortunately, parallel processing is more power-efficient. Power is a function of the square of the voltage, so if you double the number of processors but lower the clock rate and voltage somewhat, you can increase potential performance significantly without increasing power.
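A back-of-the-envelope calculation shows why. Dynamic power scales roughly with capacitance times voltage squared times frequency; the figures below are illustrative, not measurements of any real chip.

```python
# Rough dynamic-power model: P ~ cores * V^2 * f, with voltage and
# frequency expressed relative to a single full-speed core (both = 1.0).
# The 0.8 scaling factors are illustrative, not vendor data.
def relative_power(cores, voltage, frequency):
    """Power relative to one core at nominal voltage and clock."""
    return cores * (voltage ** 2) * frequency

baseline = relative_power(1, 1.0, 1.0)    # one full-speed core
dual     = relative_power(2, 0.8, 0.8)    # two cores, each scaled back

print(f"baseline power:  {baseline:.2f}")  # 1.00
print(f"dual-core power: {dual:.2f}")      # 1.02 -- about the same
print("potential speedup:", 2 * 0.8)       # 1.6x, if software is parallel
```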

Hence, all microprocessor companies have been forced to bet their futures on multiple processors, or "cores," per chip. Conservative companies like AMD and Intel sell just two cores per chip, while radical companies like Sun offer eight. To give the media a new version of Moore's Law: the number of cores per chip is likely to double every 18 months.

This milestone means that we have reached the end of the "La-Z-Boy" programming era, where programmers could lounge about waiting for computer architects to double program performance every 18 months. Going forward, if programmers need speed improvements to, say, add new features, they are going to have to start writing parallel code that turns the potential performance of twice as many cores into delivered performance. Much of the future of enterprise applications and analytical systems will be told by how well programmers can take advantage of parallelism.
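What that shift looks like in practice: the sketch below runs the same invented CPU-bound workload first serially, then spread across cores with a process pool. Real speedups depend entirely on how parallelizable the task is.

```python
# The same computation written serially and in parallel. The workload is
# a stand-in; measure before assuming any particular speedup.
from concurrent.futures import ProcessPoolExecutor

def work(n):
    """A stand-in for a CPU-bound task."""
    return sum(i * i for i in range(n))

jobs = [2000000] * 8

def serial():
    return [work(n) for n in jobs]         # one core, one job at a time

def parallel(workers=4):
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(work, jobs))  # jobs spread across cores

if __name__ == "__main__":                 # guard required for process pools
    assert serial() == parallel()
```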

There's some confusion about terms now that we have passed this parallel crossroads. Multicore means multiple processors per chip. Some use the term socket instead of chip--the holder of the microprocessor package on the printed circuit board--because some companies are putting multiple chips into one package as their multicore product.

Another buzzword is multithreading. Since the programmer must create a thread per processor to keep processors busy, it sounds like the same idea. It is not. Multithreading is a trick architects use to try to keep a single processor busy. It does this by having extra copies of some hardware resources, like registers, but much less than another whole processor. Multithreaded processors switch to run another thread instead of waiting for, say, a cache miss. The Sun Niagara microprocessor, for example, has hardware support for four threads per processor. Although it appears to the programmer that the microprocessor is executing four times as many threads, it only executes one at a time. Hence, multithreading may improve performance per watt or performance per dollar, but it can't increase the peak performance of a microprocessor.

To add to the confusion, these terms are not mutually exclusive. The Sun Niagara uses both multicore and multithreading. It has eight cores, each four-way multithreaded, presenting the programmer with the illusion of 32 threads executing simultaneously.

Companies are betting their futures on the success of parallel computing. The bet is not because of a research breakthrough that made parallel programming easy; it is because no hardware design can deliver faster single-program performance given the power wall. It is not clear that this bet will be successful; it has been tried many times over the decades and largely failed.

We need researchers and practitioners to address the biggest challenge and opportunity to face the IT industry in 50 years. If we solve the problem of making it easy to program thousands of processors efficiently, the future is rosy. If we don't, the IT industry will have to learn to live without the performance rush that it has been addicted to for decades.

David Patterson is co-author, with John L. Hennessy, of Computer Architecture: A Quantitative Approach (Fourth Edition), from which this article was adapted. He holds the Pardee Chair of Computer Science at the University of California, Berkeley, where he has been teaching computer architecture since 1977. You can write to him at [email protected].

6. Let's See Action: BI Adapts to Real Time. Some people climb mountains because they're there. But with business intelligence, the drive to speed up information access, analysis and reporting is not a journey taken just for the sake of it. For one thing, it's neither easy nor inexpensive; there's a reason why data warehouses were born in a batch environment. And when you start to involve heterogeneous data sources, not to mention unstructured content sources, "hurry up and wait" becomes a fair description--albeit one that's less and less adequate.

With hundreds, if not thousands, of users always on and frequently networked into information-rich applications, BI's environment today is radically different from what it was during the technology's formative stages. Real-time, actionable information is a competitive advantage. Employee and customer interaction with business processes depends on activity monitoring, timely insight and alerts about exceptions to business rules. Customer-facing employees need all the information they can get to increase the odds of an up-sell or cross-sell success.

In other words, speed rules. Can search engines accelerate users toward the information they seek--or at least to the point where a traditional BI query can take them the rest of the way? If so, then companies will gladly invest in the tough metadata and semantic integration necessary to bring the two technologies together. But they won't do it just to do it.

Performance management also depends on BI technology's ability to back up insight with metrics based on relevant information. Can dashboards and visualization tools increase information clarity so that users can take action faster? If not, managers will quickly tire of the pretty but irrelevant pictures.

Beyond speed, BI's other demon is data quality. Bad data quickly erodes users' trust, and once lost, a good reputation is hard to regain. In 2007, we'll hear much about master data management, real-time data capture and other technologies that enable organizations to match speed with quality.

Real time doesn't mean the same thing to all users or systems; organizations must assess requirements carefully. But whether embedded in processes, hidden behind search functions or on display in a performance dashboard, all forms of BI have a common mission: Get faster. --David Stodder

7. Build Up From SOA to Business Integration. Establishing a service-oriented architecture (SOA) is a major accomplishment--so great that even after all the hype, most organizations have progressed only as far as the drawing board. While SOA looks good as a cheaper, "thinner" way of integrating legacy applications, organizations don't want to stop there. They want more than just another layer over the "cement" that brings them no closer to full business agility. And they want the sizzle: loosely coupled, reusable business services and composite applications made from plug-and-play Web services.

"SOA should represent a shift away from integration as middleware to integration as architecture," says Ronald Schmelzer, senior analyst with ZapThink. In other words, to understand the full potential of SOA, organizations need to look beyond SOA to what they might build once they have it.

Thus, when leading-edge firms today talk about SOA, they also talk about business process management, process modeling and orchestration via an expanded enterprise service bus. Or they talk about using SOA to increase the number of users they can support and the variety of service configurations they can create, often through a portfolio of solutions from software-as-a-service providers.
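At its simplest, the composite-application idea looks like the hypothetical sketch below: independent services behind uniform interfaces, orchestrated into one business flow. Service names and payloads are invented; a production SOA stack would use WSDL/SOAP or similar, typically with an enterprise service bus in between.

```python
# Toy "composite application": two loosely coupled services orchestrated
# into a single business process. Everything here is illustrative.
def credit_service(customer):
    return {"customer": customer, "approved": True}

def shipping_service(order):
    return {"order": order, "carrier": "ground", "eta_days": 5}

SERVICES = {"credit": credit_service, "shipping": shipping_service}

def place_order(customer, order):
    """Orchestrate the credit and shipping services into one flow."""
    credit = SERVICES["credit"](customer)
    if not credit["approved"]:
        return {"status": "rejected"}
    return {"status": "accepted", **SERVICES["shipping"](order)}

print(place_order("ACME Corp", "order-42"))
```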

Organizations that have moved forward with SOA as a new platform for business integration are trailblazers, feeling their way toward solutions to sometimes severe XML scalability, performance and management challenges inherent in SOA designs. In 2007, we may see Web 2.0 options take some of the performance pressure off with more "human-centered" approaches to information delivery and integration. After all, business integration is about more than middleware: It's about connecting people so they can collaborate like never before. --David Stodder

Debriefing

Denise Birdsell, Business Analyst,
Zurich North America

As a business analyst, you're involved in complex system-development projects for various business areas. How does it feel to be in the middle of what makes or breaks business initiatives?

Well, if you're doing it right, you're exhausted. And if you're doing it right, you really know a lot. You're not an expert in any one field, but what you become is the expert in the delivery of the project. You begin by helping to figure out what the problem statement is and what the objectives are. Then, you live the project for the business: You translate objectives into IT requirements, new UI standards, functional specs and more. You nurse it through all phases and iterations.

Are you working on pulling business rules out of applications and centralizing them?

We recently did some work that involved pulling out database events, application triggers and security rules. We did a project to basically automate the diaries for claims handlers. The objective was to reduce "leakage," or the amount we overpay on a claim. We automated notifications from best-practice triggers that would tell people if a claim exceeded a certain dollar amount. If we can pull them out, it creates less dependency on IT. But you have to make sure that you maintain the domain knowledge.

Why is that important?

You can't just extract things and then say to the businesspeople, "Here you go." This is not going to be business as usual; you're changing the culture. You're creating a whole new level of expertise that has to be managed and maintained. If businesspeople are not properly prepared, the project won't succeed.

Executive Summary

Onward and upward. That classic phrase describes not only optimism as IT heads into a new year, but also a literal expression of what organizations must do to maintain their competitive edge.

Through layoffs, outsourcing and the first disruptions from what will eventually become the great baby-boomer retirement bubble, businesses must capture core knowledge so they can push onward, despite the loss of institutional wisdom. And to become smarter organizations that can twist and turn with market changes, business and IT execs must focus upward to a higher level and bring clarity to processes, rules and services.

Capturing the knowledge of departing employees and empowering business analysts so they can design agile organizations are two of seven trends identified by INTELLIGENT ENTERPRISE as we mark the turn into 2007. A third is the business imperative to improve data quality and integration, exemplified by the halting progress toward standard electronic medical records. Three key technology trends are the shift to parallel programming to take advantage of new computing architectures; changing strategies for business intelligence aimed at delivering real-time information value; and the race to establish new forms of business integration through service orientation.

As Internet innovation combines with growing information sophistication, "2.0" visions are upending entire industries. Thriving in the new age will demand more than tools and technology; success will also require evolved thinking about people and processes.
