The ability to spot an anomaly in hundreds of lines of code isn't all that different from Neanderthal man picking out the weakest woolly mammoth -- and don't let anyone tell you different.
A blog post titled "The tech utopia nobody wants: why the world nerds are creating will be awful" has caused a mild social media stir. The author complained about everything from Soylent to Google Glass and how technology is destroying the basic human experience.
It's a popular meme. It's also dead wrong. These technologies might be new, but the skills they depend on are as old as the instincts our hunter/gatherer ancestors used to avoid becoming dinner for a hungry saber-toothed tiger.
That's not exactly obvious. In fact, the other day a friend and I tried to develop an elevator pitch to explain to non-IT folks what exactly we do all day. I'm a cloud and security architect; he's a sysadmin manager and practitioner, and a developer in now-arcane languages. We started to throw ideas against the wall.
"Today, during an all-day whiteboarding session, we defeated the shadow IT contingent by developing an architecture that keeps our multi-vendor cloud strategy fully within the support of the core NOC, while continuing to support our DevOps initiative."
Yeah. We lost grandma.
"I am on the phone all day."
Never want to say that.
"I work with very large computers for a very large company, and I talk with other people about how to make them work most effectively."
We segued into how what we do in 2014 is so challenging that we've evolved and adapted and developed entirely new abilities unique to our era. Surely anthropologists are impressed that we've developed the ability to use our thumbs as primary rather than helper digits, spot one danger signal among thousands of log messages scrolling at high speed across a screen, or orchestrate our fingers to type 100 or even 200 words a minute on a keyboard. Right?
Not really. Yes, these tasks are unique to our era. But we're simply adapting innate, age-old human capabilities to accomplish them using the tools of today.
Say you're a systems administrator and your job is to find an error message in an Apache log. You dump the file and watch it scroll across your screen, waiting for the error message to jump out. How is that different from being on a hunt, lurking still in the forest, waiting for prey to catch your eye? You discern irregularity. There it is, you see it, you kill it. Same basic skill, different application.
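In practice, of course, most of us let the tools do some of the lurking for us. A minimal sketch of that hunt, with the grep doing the pattern-spotting (the log path and severity keywords are assumptions; adjust for your distribution and LogLevel):

```shell
#!/bin/sh
# Scan an Apache error log for the lines a hunter's eye would catch.
# LOG path is an assumption: /var/log/apache2/error.log on Debian/Ubuntu,
# typically /var/log/httpd/error_log on RHEL-family systems.
LOG=${LOG:-/var/log/apache2/error.log}

# -i: case-insensitive; -E: extended regex over the common severity keywords
grep -iE 'error|crit|alert|emerg' "$LOG"
```

Swap `grep` for `tail -f ... | grep --line-buffered` and you're back to watching prey move across the screen in real time.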
Our eyes perceive text patterns now, where they used to spot animals. Our hands build iPhones and plug in RAM/EEPROM modules where they used to pound in dowels and nails, and before that, chip stones into arrowheads. Eliminating shadow IT is like recognizing early that someone has the plague and drowning him before he takes down the whole tribe. Instinct.
So what does this mean to today's IT pros? Not that you can club the next person who falls for a phishing email. But you can understand that we humans are controlling technology in the way that it needs to be controlled. It might not be immediately intuitive, and yes, it means that you're not some super-evolved wonder of nature.
Sometimes technology develops in a way that lets us build on skills of the past, sometimes it doesn't. We have to adapt our brains, fingers, and eyes. But that doesn't negate our humanity.
We will continue to use our hands to type/touch/push/pull/thumb-the-nubby/triple-finger-swipe. We will use our eyes to pick out flaws or notice that our screen layout is a few pixels off. Maybe a wearable or a robot will help us; maybe we'll need to teach ourselves ergonomic habits so our bodies keep functioning.
So next time you're in an elevator with the CIO and you're asked what you do for the company, lift your head proudly and exclaim: "I'm a hunter!"
Scott has spent the last 15 years in the banking, education, and payment sectors perfecting the art of sys-admining, cloud-ifying, and keeping mission-critical systems from falling to pieces. He speaks at conferences about cloud security, the software-defined data center, and ...