Business information that's redundant, outdated, or flat-out wrong trips up organizations large and small--but there are fixes in the offing.
A home in the small town of Valparaiso, Ind., valued at $121,900 somehow wound up recorded in Porter County's computer system as being worth a whopping $400 million. Naturally, the figure ended up on documents used to calculate tax rates. By the time the blunder was uncovered in February, the damage was done: Valparaiso, its school district, and government agencies were forced to slash their budgets by $3.1 million when they found they wouldn't be getting the tax dollars after all.
It's a nightmare scenario--and one like it could be yours. Bad data remains a major cause of botched marketing campaigns, failed CRM and data warehouse projects, angry customers, and lunkhead decisions. Despite all we know about the importance of data scrubbing and quality management, many companies are still using data that's redundant, incomplete, conflicting, outdated, and just plain wrong.
Bad data isn't a new problem, but urgency in dealing with it is at an all-time high. Customers are voicing anger at the mistargeted marketing pitches and poor service that result from off-the-mark data, and they're taking their business elsewhere. Companies are investing billions of dollars in CRM applications and data integration projects to gain a better view of their customers--only to discover that conflicting data makes them blind. "Our marketing effectiveness leads to our sales effectiveness, which leads to our service effectiveness. Data quality is key to the success of that," says Chuck Scoggins, VP of customer solutions at Hilton Hotels. "If you don't have quality data, that whole chain breaks down."
Managers and employees increasingly base decisions on insights gleaned from performance management applications and dashboards. But business intelligence tools are only as good as the data that goes into them; faulty data leads to ill-informed decisions. The ramifications range from ticked-off customers to misled investors to testy regulators. Executives can face jail time under the Sarbanes-Oxley Act if they don't have financial data in order. Bad data can even increase the cost and time involved in completing mergers by making it more difficult to integrate operations and combine customer lists.
The problem is getting harder to manage as the amount of data generated and maintained by many businesses doubles every 12 to 18 months. And as more businesses share information with outside partners and customers, more bad data is being exposed to others. Lax quality is familiar to anyone with a mailbox: Consumers get credit card pitches from issuers with which they already have cards, mailings from charities in triplicate with slightly different name spellings, and warranty extension offers from auto dealers for cars they no longer own.
Occasional inconvenience for consumers aside, low-quality data is foremost a problem for the company holding it. Bad data can be an embarrassment--companies are loath to talk openly about internal data disasters. Businesses may be legally bound to share information about security breaches that result in consumers' personal information being compromised, but that's not the case with bad data. As a result, tales of mishaps are hard to come by, even as the problem persists.
The biggest obstacle to fixing the mess is that business managers view data quality as a technical problem, when business processes are really what's broken. IT has little control over the sales rep who gets a customer address wrong on an order or the manufacturing manager who enters an incorrect part number in an inventory database. A Gartner survey of 600 executives in November found that 49% think the IT department is responsible for their organizations' data quality; much smaller numbers say responsibility lies with top execs, data quality teams, line-of-business managers, and others.
"Business has to accept the fact that it has primary responsibility for data quality. Data is a business asset," says Nigel Turner, who as project lead manager for data quality programs at BT Group (formerly British Telecom) in the late '90s helped get that company's data cleanup efforts off the ground.
Gartner estimates that more than 25% of critical data within large businesses is somehow inaccurate or incomplete. And that imprecise data is wreaking havoc. Fifty-three percent of the 750 IT professionals and business executives surveyed by the Data Warehousing Institute late last year said their companies had experienced problems and suffered losses or increased costs because of poor-quality data, up from 44% in a similar survey in 2001.
While IT managers may not own the processes that spew bad data, they can make the business case to change those processes to improve data quality. Moreover, they can provide the technology to support those improved processes and, since no process is perfect, operate the tools needed to automate the downstream steps of identifying and correcting bad data.
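One of those automated downstream steps is duplicate detection--catching the "triplicate mailings with slightly different name spellings" problem described earlier. As a rough illustration (not any particular vendor's tool), a minimal sketch using Python's standard-library string matching might flag likely duplicate customer names like this; the threshold value and normalization rules are assumptions that a real cleansing tool would tune far more carefully:

```python
from difflib import SequenceMatcher

def normalize(name):
    # Lowercase, drop punctuation, and collapse whitespace before comparing.
    cleaned = "".join(c for c in name.lower() if c.isalnum() or c.isspace())
    return " ".join(cleaned.split())

def likely_duplicates(records, threshold=0.85):
    # Compare every pair of names; flag pairs whose normalized
    # similarity ratio meets or exceeds the (assumed) threshold.
    pairs = []
    for i in range(len(records)):
        for j in range(i + 1, len(records)):
            a, b = normalize(records[i]), normalize(records[j])
            if SequenceMatcher(None, a, b).ratio() >= threshold:
                pairs.append((records[i], records[j]))
    return pairs

mailing_list = ["John Q. Smith", "Jon Q Smith", "Jane Doe"]
print(likely_duplicates(mailing_list))  # flags the two Smith variants
```

The pairwise comparison is fine for a mailing list of thousands but not millions; production data-quality tools use blocking and phonetic keys to avoid comparing every record against every other.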