IT employees hold the keys to the treasure rooms, so their honesty or dishonesty is obviously of great interest to us and our organizations. From a societal perspective, one could even argue that in the era of voting machines and complex stock market transactions, IT is the one ring that rules them all.
It's not just about keeping IT personnel from looting our riches or rigging our elections, of course. Rampant, dishonest behavior can lead to a corporate brain drain, as an organization's best and brightest, sick of being part of the Enron Nation, say "to hell with it, I'm not playing anymore."
For all of these reasons, most of us try to support an atmosphere of honesty. We check references, reject resumes with obvious falsehoods, and implement processes that support honest behavior. Turns out, we're probably doing it wrong.
I just finished Duke professor Dan Ariely's latest book, The (Honest) Truth About Dishonesty, and was wide-eyed at a couple of passages. For example, when people deal with tokens of value instead of actual value (such as poker chips instead of dollars), they cheat more. Ariely discusses the rampant overstating of billable hours in most professional services organizations. A personal example involves the "near money" of frequent flier miles and other loyalty programs, all of which are tracked by the systems IT supports.
For years I've assumed that putting folks into teams would cut down on cheating. But Ariely's experiment-based research identifies something called "altruistic cheating," the cheating someone does because he's working on a team and wants others on the team to get a bonus or promotion or avoid punishment.
Ariely urges readers to leave behind the outmoded "simple model of rational crime" (SMORC), which assumes that when the risk-benefit ratio of a dishonest act is favorable, the act will be committed, and when it isn't, it won't be. Ariely buries that theory through the experiments he details in the book.
Dishonesty is contagious. If it's well known in your organization that individuals are stealing a competitor's secrets, or if you're at one of those professional services organizations that regularly overstates billable hours, don't expect that the rest of employee behavior will be a model of honesty. The Talmud compares keeping bad company with hanging around in a tannery--the stink will stay with you. (But quoting the Talmud or your mom or scoutmaster at a meeting probably won't be as effective as pointing to serious behavioral science, and for that we can thank Ariely.)
When small dishonest acts accumulate, individuals start feeling what Ariely calls the "what the hell" factor. That is, once you've committed a bunch of dishonest acts, you stop resisting, and the acts get bigger.
I had an email conversation with Ariely regarding Scott Thompson, the former Yahoo CEO who claimed (falsely) that he had a degree from Stanford. I wondered what would happen if I didn't correct the folks who refer to me in email as "Dr. Feldman." Might I one day be listed as such on a conference brochure and then not correct the organizers because I didn't want to cause them grief? Might someone then ask me where I had gotten my PhD? Would I make something up and watch it snowball after a colleague overheard the conversation?
If someone told me that I would act this dishonestly, I would argue against that notion vigorously, but wouldn't we all? Trouble is, I could see it happening to someone else, which--to be (honestly) honest--probably means it could happen to me.
Ariely agrees that "small slips that get larger over time are the main problem," and that this may well have been the case with Thompson. My point: It could happen to any of us if we don't watch the small stuff.