In the midst of an era when adoption of big data analytics is expected to generate more than 4 million jobs worldwide between 2013 and 2016, an IT professional might think that dropping the name "Hadoop" is a ticket to a salary bump, a promotion, or a new job with a new company.
Of course, they would be wrong. Employers increasingly are looking for just the right Hadoop-related skillsets, according to Nixon Patel, VP in charge of the Emerging Technology Competency Unit for staffing firm Collabera.
The Hadoop field is going through a transition on multiple fronts, and IT pros need to know where the technology is heading and which skills will be in demand in the years to come, according to Patel. For example, enterprises are shifting from the original batch-oriented Hadoop implementations toward a real-time model. That opens the door for Apache Spark, which is designed to run 10 to 100 times faster than MapReduce, the batch processing engine that has been in use for several years. So, advantage to Hadoop pros with Spark experience. Even those who are experienced with MapReduce should be familiar with YARN, the newer resource-management framework introduced in Hadoop 2, said Patel.
Besides having experience with emerging Hadoop tools, the staff-level IT pro who is in the best position to advance has five to seven years of IT experience. They should display analytical abilities and, Patel added, a "zeal and passion for learning new things and to want to commit some time."
[What is it worth to move ahead in the big data field? Read $60,000 Online Degree: A Lesson In Digital Business.]
"Having the name [Hadoop] on their resume is not going to be getting them far. They need to have hands-on experience in terms of how the architecture is going to work," said Patel, who heads a Collabera "train and deploy" program that trains IT pros for several months in emerging technologies.
Another approach for mid-career IT pros who want to advance with their current employers is to identify which Hadoop distribution their company wants to use. Whether the company opts for Hortonworks, Cloudera, MapR, or another distribution, IT pros can position themselves as specialists in it.
However, the real bonus in doing so, according to data scientist Dan Gutierrez of Amulet Analytics, is that the price is just right for training on one distribution: free.
IT professionals and their employers are accustomed to paying anywhere from $150 to more than $700 for training and certification in areas such as Windows administration and networking. "Hadoop companies know that's a model they can't follow. So they are giving away a lot of resources. They do have for-fee training, but they also have a lot of resources for free," said Gutierrez in a phone interview.
He noted that companies providing Hadoop distributions need a core of specialists at customer sites who understand the architectures and languages of their particular software. In addition to providing free training, it's not unusual for the software companies to wrap up training sessions, and even appearances at local tech meetups, by telling IT pros that they themselves are hiring, said Gutierrez, who teaches Hadoop courses through Coursera. So, the free training exposes the software companies to newly qualified job candidates.
He said experienced IT professionals can guide their employers into the world of big data by taking training courses and then setting up pilot, proof-of-concept applications on a public cloud infrastructure. With that skunkworks-style approach, there is no in-house infrastructure investment.
This isn't an overnight exercise, however, as Gutierrez said that IT pros who want to get into Hadoop and big data analytics should allow a year for training and building out a pilot project for their current company.
In addition to having real-world Hadoop experience, Collabera's Patel suggests that an IT pro who wants to advance their career should have experience with some of the core tools that go into Hadoop development and administration. He cited Python, Java, and R as examples, along with an interest in machine learning. One advantage that an experienced IT pro does have when moving into Hadoop, he said, is that the basic Hadoop commands have their roots in Unix and Linux, which many IT pros already know. That experience with legacy systems can help to position them for growth with today's emerging technology.
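Patel's point about Unix and Linux roots is easy to see in practice: the HDFS file-system shell mirrors everyday Unix commands almost one-for-one. Here is a minimal sketch; the local commands on the left run anywhere, while the `hdfs dfs` equivalents in the comments assume a configured Hadoop cluster, and the `/demo` paths are purely illustrative:

```shell
# Everyday file operations: local Unix commands on the left,
# the HDFS equivalent (requires a running cluster) in comments.
mkdir -p /tmp/demo                    # hdfs dfs -mkdir -p /demo
echo "hello" > /tmp/demo/f.txt        # (hdfs dfs -put copies a local file into HDFS)
cp /tmp/demo/f.txt /tmp/demo/g.txt    # hdfs dfs -cp /demo/f.txt /demo/g.txt
ls /tmp/demo                          # hdfs dfs -ls /demo
cat /tmp/demo/g.txt                   # hdfs dfs -cat /demo/g.txt
rm -r /tmp/demo                       # hdfs dfs -rm -r /demo
```

An admin who already knows `ls`, `cp`, `cat`, and `rm` is most of the way to navigating an HDFS cluster; the main adjustment is the `hdfs dfs -` prefix and thinking in terms of a distributed file system rather than a local one.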
Jim Connolly is a versatile and experienced technology journalist who has reported on IT trends for more than two decades. As editorial director of InformationWeek and Network Computing, he oversees the day-to-day planning and editing on the site.