Executive says poor-quality data makes regulatory compliance a challenge

Rick Whiting, Contributor

May 14, 2004


Data-quality management is gaining more interest from top executives who oversee broad business challenges, such as complying with the Sarbanes-Oxley financial-reporting rules or developing revenue-growth strategies.

Quality of data was a focus last week at a conference hosted by SAS Institute Inc., which includes data-cleansing tools in its software product line. "It's difficult to be a CFO and not know with certainty about the integrity of the data," says conference attendee David Klementz, CFO and senior VP at Progress Rail Services, a subsidiary of Progress Energy Inc. Klementz spends a lot of time reconciling data collected from multiple sources throughout the company that managers need for complying with Sarbanes-Oxley regulations.

While Klementz is considering buying data-cleansing software to help, he says what's critical is having consistent data-management processes in place, ensuring they're adhered to, and hiring competent people to manage them. Other vendors selling data-cleansing tools include Ascential Software, FirstLogic, and Trillium Software.

More than a quarter of critical data used within large companies is flawed, according to a research report Gartner issued last week. Problems include errors made during data entry, incompatible data formats, conflicting data definitions, and inconsistent compilation of data across divisions. Gartner analyst Ted Friedman says that poor-quality data is causing many customer-relationship-management, business intelligence, and business-to-business initiatives to fail.
