Just how good--or bad--is the quality of Microsoft's software? It's a tricky question if you're a Microsoft executive with responsibility for developing or selling products. On the one hand, you want customers to have confidence in your operating systems, databases, and applications. On the other, you want them to know you're fully committed to improvement. That dilemma is resulting in some mixed messages at the highest levels.
Bill Gates, Microsoft's chairman and chief software architect, brought attention to the subject in January with his memo on trustworthy computing, which was ostensibly intended only for Microsoft's employees but which quickly made its way into the mainstream. "As an industry leader, we can and must do better," Gates wrote at the time (see "Software's Challenge," Jan. 21). All of a sudden, Microsoft's software was in a glass house, though not the kind of glass house (the corporate data center) the company has been aiming for with its DataCenter Server. Many of us on the outside are now looking in, our noses pressed against the, well, windows.
What do we see? Certainly, there's greater awareness of software-reliability issues on the Microsoft campus, with many of the company's developers being trained to write better, more secure software. It's safe to assume that this all-hands-on-deck approach will show up, in a positive way, in products now in the development pipeline, to be delivered in the months and years ahead.
At the same time the company is going through this cathartic process, however, there's a defensiveness that could undermine the well-meaning and praiseworthy goal of establishing greater trust in its products among customers. Over the past few weeks, InformationWeek editors have talked to several top Microsoft officials--Gates earlier this month; group VP Jim Allchin and senior VP Paul Flessner in March--and they not only defended the quality of Microsoft's software but even blamed others for some of its problems.
Here's Flessner on Gates' trustworthy computing memo: "I was glad to see Bill capture what we've been doing for a long time. We're doing a huge amount of work, I think ground-breaking work, in software quality."
And Allchin: "I wouldn't say that we're any better or any worse than anyone else. I think it's a disservice to point fingers at us. ... I do believe that systems have worked just fine if they've not been under malicious attack. Malicious attacks don't have anything to do with quality, per se."
Gates: "We're more of a testing, a quality software organization than we're a software organization. ... We love to have people compare our quality to other people's quality. We will win in that any day."
Gates blamed many of the problems familiar to customers on a variety of other things: device drivers, the failure of customers to download updates and patches, and opening E-mail attachments when users have been warned against it.
So, there's the rub. Microsoft execs agree that developing clean, bulletproof software is their top priority, but they find it hard to admit, or simply don't believe, that poor programming practices or past mistakes on their part are to blame for the hacks, crashes, and glitches that have created a perception of buggy software among customers. Yet they understand that perception is reality, which explains the urgent effort to get things right.
There's a vast gray area in the concept of trustworthy computing, a fact not lost on Gates. When we asked him to rate Microsoft's software on a scale of 1 to 10, Gates gave Microsoft a 9--but acknowledged that, from the customer's point of view, it's probably a 1. That's quite a gap to span, regardless of who's to blame.
In a survey by InformationWeek Research on software quality featured in this issue, Microsoft scored last among 16 software companies in customer satisfaction (see story, "Quality First"). That raises the question: Are Microsoft's practices--past and present--really as good as its veteran managers think? If not, trustworthy computing risks becoming an oxymoron.