While some employers insist on searching for mythical perfect candidates, mere mortals can win Hadoop jobs. Search wisely and demonstrate the right blend of skills.
In a Shel Silverstein kids' song the unicorns ignored the warnings of rain and missed the boat as Noah's Ark sailed away.
Today, despite a couple of years of warnings from experts, employers are wasting time searching for their own unicorns: all-knowing data scientists who can run big data initiatives.
These unicorns might lack the traditional forehead appendage but they are expected to bring to the table great technical chops, including not only Hadoop development and architectural expertise, but analytical skills, problem solving ability, business/industry experience, leadership talent, and great communication skills. (It probably doesn't hurt if they also can cook and sing.)
"This happens whenever there is a new tech paradigm; employers always want that unicorn type of person. At that time people are still exploring, and they eventually realize they need more than one person to fill those roles," said Nixon Patel, vice president with staffing company Collabera, in a recent phone interview. "However, after a couple of years they might be able to create that unicorn themselves, with one person learning much from all of the team members."
In the Hadoop sector, IT pros might not need deep experience in a half dozen highly diverse skill areas but the job postings floating around on the web make it clear that from staff-level developer to director-level data scientist, employers want Hadoop pros with a range of experiences. Not all of those skills will be readily thought of as big data related, but job candidates might be smart to include a decent mix at the top of their resumes.
Take some of the postings on the job site Dice.com, for example. A listing for a systems administrator, a classic IT title, carries the additional heading of "Hadoop administrator." It calls for five years of Hadoop management work, strong Linux and systems management experience, knowledge of multiple big data oriented tools, and storage experience.
Another listing for a "Hadoop application developer technician" -- carrying a salary in the $120,000 range -- calls for analytical and problem solving skills, two years of experience with Hadoop and other big data tools, work with Linux and other open source software systems, ETL experience, and sound knowledge of databases.
A listing for a "Hadoop senior architect" calls for at least five years of customer-facing experience as a consultant, two years of experience deploying large-scale Hadoop environments, data transformation experience with Apache Pig, and knowledge of MapReduce, all of which fall into the Hadoop realm. Yet the job also requires more traditional IT experience with Linux or Unix, multiple enterprise security solutions, an understanding of networking, and a "solid" background in database administration and design.
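For readers who haven't touched MapReduce, the model cited in postings like the architect listing above, here is a minimal, single-machine sketch of the pattern in Python. This is a toy stand-in for illustration only; production Hadoop jobs are typically written as Java Mapper/Reducer classes or Pig Latin scripts, and the framework, not your code, handles distribution across the cluster.

```python
from collections import defaultdict

# Toy, single-process illustration of the MapReduce pattern that
# Hadoop distributes across a cluster: map emits key/value pairs,
# a shuffle groups them by key, and reduce aggregates each group.

def map_phase(records):
    # Emit (word, 1) for every word -- the classic word-count mapper.
    for line in records:
        for word in line.split():
            yield (word.lower(), 1)

def shuffle(pairs):
    # Group values by key, as Hadoop's shuffle/sort step does.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    # Aggregate each key's values into a final count.
    return {key: sum(values) for key, values in grouped.items()}

lines = ["big data big jobs", "big skills"]
counts = reduce_phase(shuffle(map_phase(lines)))
print(counts)  # {'big': 3, 'data': 1, 'jobs': 1, 'skills': 1}
```

The same transformation is a one-liner in Pig Latin (roughly: GROUP words, then COUNT each group), which is why postings pair Pig experience with MapReduce knowledge.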
Often, job seekers have to look beyond job titles to find entry to the roles they want. Searches for "Hadoop" on Dice.com and Robert Half Technology's website revealed that about one third of the results returned didn't mention Hadoop in the job title. Hadoop fell under titles such as Backend Software Engineer, Vice President of Data Science, Linux Engineer, BI Consultant, Data Warehousing ETL Engineer, and Application Architect.
In a recent interview, Dice president Shravan Goli advised job seekers in the Hadoop space to highlight their skills beyond just Hadoop, such as experience with Java and NoSQL. "If you are looking to get a higher paying job in the tech field and have any of these big data skills, you can train yourself to be a Hadoop developer and become more valuable," he said. He noted that because of the scarcity of Hadoop-qualified candidates, companies are becoming more creative and flexible in who they hire.
The Hadoop concept itself calls for systems management and clustering skills, an understanding of data management, and problem-solving capabilities, along with other traditional IT skills. When a job candidate layers on the Hadoop-specific skillset, it might not add up to a unicorn, but at least a horse of a different color.
Jim Connolly is a versatile and experienced technology journalist who has reported on IT trends for more than two decades. He has written about enterprise computing, the PC revolution, client/server, the evolution of the Internet, networking, and IT management.