Software // Information Management
Commentary
12/4/2007 02:22 PM
Neil Raden

Performance Management or Measurement Tyranny?

In "Measuring and Managing Performance in Organizations" (Dorset House Publishing, 1996), Robert Austin made a very clear case that performance measurement often leads, paradoxically, to distortion and dysfunction instead of improvement. According to Austin - and I agree with him, having witnessed this phenomenon firsthand more than once - measuring an indicator of performance (since we usually can't measure the actual performance itself) raises the risk of making things worse. How can that be?

First of all, the actual phenomenon we usually try to measure is how well we're performing. It's a multidimensional problem, because the causal aspects are usually numerous and there are obvious dependencies among the variables. Modeling it correctly requires a great deal of precision, which is typically not employed in measurement systems. Those who are energized to put performance management systems together are often too enthusiastic about the "project" and deliver models that are plausible but too easily distorted. Unless the entire problem is modeled correctly, missing just one minor dimension can render the whole system valueless. Deming himself opined that performance measurement was "the most powerful inhibitor to quality and productivity in the Western world" (Andrea Gabor, "Catch a Falling Star System," U.S. News and World Report, June 5, 1989, p. 43).
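The distortion Austin describes can be sketched in a few lines of code. This toy model is my own illustration, not from his book: true performance depends on two dimensions (call them quality and speed), but the measurement system only captures one of them. When people shift their fixed effort toward the measured proxy, the indicator improves while real performance declines.

```python
# Toy illustration of proxy-metric distortion. All names and weights
# here are hypothetical; the point is the shape of the problem.

def true_performance(quality: float, speed: float) -> float:
    """The real outcome depends on both dimensions (usually unmeasurable)."""
    return 0.7 * quality + 0.3 * speed

def measured_indicator(quality: float, speed: float) -> float:
    """The dashboard captures only speed -- the easy-to-measure proxy."""
    return speed

def allocate(effort_on_speed: float, budget: float = 10.0):
    """A worker splits a fixed effort budget between quality and speed."""
    return budget - effort_on_speed, effort_on_speed  # (quality, speed)

# Before the metric exists: a balanced allocation of effort.
q0, s0 = allocate(5.0)
# After the metric is introduced: effort shifts to what is measured.
q1, s1 = allocate(9.0)

print(measured_indicator(q1, s1) > measured_indicator(q0, s0))  # True: the number goes up
print(true_performance(q1, s1) < true_performance(q0, s0))      # True: performance goes down
```

The missing dimension (quality) is exactly the "one minor dimension" whose omission renders the system valueless: the report shows improvement while the organization gets worse.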

The greater problem is that humans are aware they are being measured and are quite creative at finding ways to beat the system, from salespeople sandbagging sales from one period to the next to exploit inconsistencies in the compensation system, to entire organizations fixated on a small set of metrics. That is what brought this subject to mind today. One aspect of the problem is the interpretation of measurement. I read a piece today in the New York Times about the effect of college rankings:

"Like Florida, more leading public universities are striving for national status and drawing increasingly impressive and increasingly affluent students, sometimes using financial aid to lure them. In the process, critics say, many are losing force as engines of social mobility, shortchanging low-income and minority students, who are seriously underrepresented on their campuses.

"'Public universities were created to make excellence available to all qualified students,' said Kati Haycock, director of the Education Trust, an advocacy group, 'but that commitment appears to have diminished over time, as they choose to use their resources to try to push up their rankings. It's all about reputation, selectivity and ranking, instead of about the mission of finding and educating future leaders from their state.'"

In other words, by chasing the ranking model of a magazine or other service in the drive to make the Top 10 Public Universities, Florida is failing at what it was chartered to do. TWA, before it foundered and was gobbled up by American, instituted an ill-fated initiative to drive everything by profitability. Needless to say, all of the important things an airline needs to do were disrupted. Financial institutions have also gone the profitability route, only to see their book of business deteriorate, leading to reserve deficiencies as the better business fled.

As in the rest of the IT industry, there is too much discussion of the BI industry itself - the vendors, the products and who's on top. Performance management is an art, not a science, and I for one would like to see more thoughtful pieces like Austin's (now more than ten years old) on the craft of building models that work.

Neil Raden is the founder of Hired Brains, providers of consulting, research and analysis in Business Intelligence, Performance Management, real-time analytics and information/semantic integration. Neil is co-author of the just-released book "Smart Enough Systems," with business rules expert James Taylor.
