In Focus: Up Close With HP's Content Management Guru
There's plenty of hype about "enterprisewide" content management, but few companies have come as close as Hewlett-Packard to taking a truly holistic approach.
I recently had a chat with Mario Queiroz, HP's vice president of content and product data management, who led the company's three-year effort to rationalize taxonomies, metadata, technologies and management approaches spanning 17 business units. The deployment touches some 85 percent of the products sold by an $83-billion technology giant, yet the practices aimed at efficient content reuse are pertinent to any size organization.
Doug Henschen (DH): What was the impetus for HP's enterprise content management (ECM) initiative?
Mario Queiroz (MQ): Things really started with the merger of HP and Compaq. Both companies had content management systems, but they were fairly fragmented, with a lot of departmental deployments. We knew we wanted something streamlined and consolidated, so in May of 2002 we brought together resources and decided to take a strategic look at the problem.
ECM can mean a lot of things, but our project has really been about sales and marketing content. That includes marketing collateral and product content as well as solutions information created by the business units. We're working with more than 3,000 authors across product management and marketing communications (marcom) units, and we're funneling their content--both chunks and larger documents--into a create-once, use-many approach.
DH: How does ECM fit into the larger context of information management at HP?
MQ: We're part of an E-Business, Customer and Sales Operation group that handles activities including pricing, order management, business intelligence, e-commerce, hp.com, CRM, partner relationship management and other infrastructure areas. Our piece is getting data and information to our customers and internal systems on behalf of the business units--Imaging and Printing, Personal Systems, Servers, Storage, Software, Services and so on. Day-to-day, that means working with the product marketing and engineering organizations at the worldwide and regional levels and funneling their content into the right repositories so we have it structured the right way.
DH: Just what do you mean by structured?
MQ: We've set standards for taxonomy that enable business units to create content efficiently and that drive our strategy of creating content once and using it many times. That's what we do on the "back end." We then offer a subscription service to more than 40 internal publishers, including marcom organizations that print collateral material as well as sales force portals and the hp.com e-commerce Web site that make content available online. The service has a standardized interface for pulling or pushing content, and there's also an interface to the product lifecycle and product data management (PLM/PDM) systems that serve the supply chain. If a product description has to appear on an invoice, for example, we want to make sure that it's the same description that's found on the Web site.
DH: Why is the one-to-many approach so important?
MQ: Without it, we'd have employees in regional markets separately calling up, for example, the marcom people in Boise, Idaho, for information on printing and imaging products. If you start drawing the point-to-point connections between 17 business units and more than 65 localized markets, you quickly get to a very complicated environment. Now that we have a centralized resource, we're showing all our constituents where to go to find the content and how to get it out.
DH: What's the nature of the content you're managing?
MQ: It varies from sales and marketing literature, which can be very granular, with technical specs such as processor speeds or page-per-minute ratings, to more conventional documents, such as solutions white papers used by the sales force. We didn't think we should take a single approach, so we put much of the granular product content in a product information management solution from Trigo [since acquired by IBM], with similar solutions in Europe and Asia. That system manages more than one million data elements. For more unstructured content, we use a Documentum repository shared with the support management team. That repository currently manages about 1.8 million documents.
DH: That's a lot of content, yet you say it's all covered by a single taxonomy?
MQ: We've internally agreed upon a taxonomy and done work on metadata standards, but we're still overcoming the pain of having multiple taxonomies and marketing hierarchies [stemming in part from the HP/Compaq merger]. Over the last three years, we've been driving region by region and product team by product team to a single taxonomy. The documents within our next-generation Documentum implementation, for example, and the product data elements in our product-information management systems [Trigo/IBM and homegrown EMEA/Asia solutions] are structured according to a standard, seven-layer marketing hierarchy that starts with the general product category and then drills down to product families, models and SKUs underneath each model. We're upgrading to Documentum 5 in part because a different group within the company had already begun configuring it to the corporate standards. Our approach was to complete that work and deploy it aggressively across the corporation.
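To make the layered-hierarchy idea concrete, here is a minimal sketch of how content might be addressed against a seven-layer marketing taxonomy. The article names only the top layer (general product category) and the bottom three (family, model, SKU); the three intermediate layer names below are hypothetical placeholders, not HP's actual terms.

```python
# Sketch of a seven-layer marketing hierarchy for content placement.
# Only "category", "family", "model" and "sku" come from the article;
# the intermediate layer names are invented placeholders.

from dataclasses import dataclass

LAYERS = [
    "category",      # general product category (from the article)
    "subcategory",   # hypothetical intermediate layer
    "product_line",  # hypothetical intermediate layer
    "series",        # hypothetical intermediate layer
    "family",
    "model",
    "sku",
]

@dataclass(frozen=True)
class HierarchyNode:
    path: tuple  # one value per layer, from category down to this node

    @property
    def layer(self) -> str:
        return LAYERS[len(self.path) - 1]

def classify(path: tuple) -> str:
    """Return the layer name a content item is attached to."""
    if not 1 <= len(path) <= len(LAYERS):
        raise ValueError("path must address one of the seven layers")
    return LAYERS[len(path) - 1]
```

The payoff of addressing every document and data element by such a path is that a publisher can request "everything under this family" without knowing which team authored it.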
DH: What was the metadata work about?
MQ: We needed terminology standards, particularly for the unstructured documents. We were having a terrible time sticking to one version of a document, so we went through a modeling exercise, breaking down product content into different attributes. One key piece of functionality we exploit is inheritance because it can automatically apply metadata based on parent-child relationships between documents.
DH: How do you enforce consistent use of the metadata?
MQ: At the very beginning, three years ago, we worked with each business unit to assign people to help us define the metadata standards, and that gave us a lot more leverage to promote adoption. Now, when somebody enters a document into the system, certain key metadata is required, but there's a degree of inheritance. If they want to create a document that applies to all X SKUs, for example, they don't have to enter that metadata over and over again.
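The inheritance mechanism Queiroz describes can be sketched as a nearest-first metadata lookup: a document's own attributes win, and anything it doesn't specify is inherited from its parent in the hierarchy. The field names below ("audience", "region", "product_line") are illustrative, not HP's actual schema.

```python
# Minimal sketch of parent-child metadata inheritance: a document
# inherits attributes from its ancestors unless it overrides them.
# Field names are invented for illustration.

from collections import ChainMap

def effective_metadata(doc_meta: dict, *ancestor_meta: dict) -> dict:
    """Resolve metadata nearest-first: the document's own values win,
    then its parent's, then the grandparent's, and so on."""
    return dict(ChainMap(doc_meta, *ancestor_meta))

# A product-family node carries shared metadata once...
family_meta = {"product_line": "LaserJet", "region": "worldwide"}
# ...so an author entering a new datasheet supplies only what's new.
datasheet_meta = {"audience": "channel partners"}

resolved = effective_metadata(datasheet_meta, family_meta)
```

The author typed one field, but the resolved record carries all three, which is exactly what spares 3,000 authors from re-keying SKU-level metadata.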
DH: Are you exploiting XML to promote content reuse?
MQ: Absolutely. If you don't have documents in a form that makes it easy to pull out chunks of content, you're going to end up with many, many instances of the same content. Our direction is to break documents down and use them interchangeably. Thus far, we've transformed somewhere between one-third and one-half of our documents in the Documentum repository into XML.
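The chunking approach can be illustrated with a short sketch: once a document is tagged in XML, the same fragment can be pulled by ID into many outputs (a web page, an invoice, printed collateral). The element and attribute names here are invented for illustration; the article does not describe HP's actual schema.

```python
# Hedged sketch of "chunked" XML content reuse: tagged fragments are
# extracted by ID so one piece of content feeds many publications.
# The <datasheet>/<chunk> schema is invented for this example.

import xml.etree.ElementTree as ET

SOURCE = """\
<datasheet>
  <chunk id="desc">HP LaserJet: a fast, reliable office printer.</chunk>
  <chunk id="specs">Up to 30 pages per minute.</chunk>
</datasheet>"""

def get_chunk(xml_text: str, chunk_id: str) -> str:
    """Extract one reusable content chunk by its id attribute."""
    root = ET.fromstring(xml_text)
    node = root.find(f"./chunk[@id='{chunk_id}']")
    if node is None:
        raise KeyError(chunk_id)
    return node.text

# The same chunk can now feed hp.com, an invoice, or collateral:
description = get_chunk(SOURCE, "desc")
```

Because every consumer pulls the same chunk, the product description on the invoice is guaranteed to match the one on the web site, which is the consistency goal Queiroz describes earlier in the interview.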
We're also using [Blast Radius] XMetal [an XML authoring tool] on the creation side, among other selected tools. Right now, the more technical content creators are using XMetal, but we're trying to get some of the marketing types to use our content creation tools.
DH: Does XML also figure in translation and localization?
MQ: Yes. We're using Trados' translation memory technology companywide, and we're ramping up more and more of the sales and marketing content [in addition to product manuals, which are managed by another group]. We probably have 10 to 15 percent of our content flowing through translation and localization.
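The economics of translation memory, the core idea behind tools like Trados, can be sketched in a few lines: previously translated segments are stored and reused so identical sentences are never paid for twice. Real TM systems also do fuzzy (near-match) lookup; this exact-match version is illustrative only.

```python
# Toy sketch of translation-memory reuse. Segments already in the
# memory are reused for free; only new segments go out for (costly)
# translation, and their results are stored for next time.

def translate(segments, memory, translator):
    """Return translated segments plus the list sent out for new work."""
    out, sent_out = [], []
    for seg in segments:
        if seg in memory:
            out.append(memory[seg])   # free reuse from memory
        else:
            t = translator(seg)       # costs money and time
            memory[seg] = t
            sent_out.append(seg)
            out.append(t)
    return out, sent_out

memory = {"Fast, reliable printing.": "Impression rapide et fiable."}
translated, new_work = translate(
    ["Fast, reliable printing.", "New duplex feature."],
    memory,
    lambda s: f"[fr] {s}",  # stand-in for a human or machine translator
)
```

Scaled to dozens of localized markets, this reuse is where savings like the $6 million per year Queiroz cites below come from: only genuinely new content incurs translation cost.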
DH: How much has HP spent and what can you say about ROI?
MQ: I can only say that the investment is in the millions. It's a major strategic initiative that's been three years in the making. We've had efficiency gains of about 30 percent per year. As an example, we figure we're saving about $6 million per year just from our translation and localization infrastructure. We've also lowered the cost of developing content for new products by more than half. One of the big reasons we made the investment is that it has allowed the company to scale without having to scale equally in manpower, which we would not be able to afford.