Hadoop's Second Generation Offers More To Enterprises
The first Hadoop tools weren't easy to deploy or manage. But the second-wave tools deliver great advances in usability.
Hadoop is one of the most disruptive recent innovations in enterprise IT, promising to turn the ever-growing tide of data into profit. In my own industry alone, telecommunications and media, Hadoop supports analytic uses as diverse as network planning, customer support, security operations, fraud detection and targeted advertising.
Yet realizing this potential has been challenging for many mainstream enterprises. Many started experimenting with some of the 13 functional modules that make up Apache Hadoop, a set of technologies that early adopters such as eBay, Facebook and Yahoo needed large teams and several years to master.
The first wave of Hadoop technology, the 1.x generation, was not easy to deploy or manage. The many moving parts that make up a Hadoop cluster were difficult for new users to configure, and seemingly minor details, such as patch versioning, mattered a lot. As a result, services failed more often than expected, and many problems showed up only under severe load. Skills were, and still are, in short supply, although there is no shortage of good training available from leading vendors such as Hortonworks and Cloudera.
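To illustrate those moving parts, here is a minimal sketch of two of the configuration files a 1.x cluster required; the hostname, port and replication value are hypothetical examples, and a real cluster needed many more properties kept consistent across every node:

```xml
<!-- core-site.xml: points every daemon and client at the NameNode.
     (The property was renamed fs.defaultFS in Hadoop 2.x.) -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://namenode.example.com:8020</value>
  </property>
</configuration>

<!-- hdfs-site.xml: the block replication factor must suit the cluster
     size -- exactly the kind of seemingly minor detail that mattered. -->
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>3</value>
  </property>
</configuration>
```

Hand-editing files like these on each node, and keeping them in sync, is the manual work the tooling described below was built to eliminate.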
Fortunately, the second generation of Hadoop, which Hortonworks calls HDP 2.0 and which was announced at Hadoop Summit 2013, fills in many of the gaps.
Manageability is a key expectation, particularly for the more business-critical use cases that service providers experience. Hadoop has made great advances here with Ambari, an intuitive Web user interface that makes it much easier to provision, manage and monitor Hadoop clusters. Ambari allows the automation of initial installation, rolling upgrades without service disruption, high availability and disaster recovery, all critical to efficient IT operations.
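Ambari's automation of initial installation works through "blueprints," JSON documents submitted to its REST API that describe the desired cluster layout. A minimal sketch follows; the host-group names and cardinalities are hypothetical:

```json
{
  "Blueprints": { "stack_name": "HDP", "stack_version": "2.0" },
  "host_groups": [
    {
      "name": "master",
      "components": [
        { "name": "NAMENODE" },
        { "name": "RESOURCEMANAGER" }
      ],
      "cardinality": "1"
    },
    {
      "name": "workers",
      "components": [
        { "name": "DATANODE" },
        { "name": "NODEMANAGER" }
      ],
      "cardinality": "3"
    }
  ]
}
```

Registering a blueprint like this and then mapping real hosts to its host groups lets Ambari install and configure every service in one automated step, rather than requiring an administrator to hand-edit configuration files on each node.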
Moreover, the independent software vendor ecosystem that supports Hadoop distributions is broadening and deepening. This is important for two reasons. First, in our experience, much of a buying decision boils down to how Hadoop fits with existing technology assets; in most cases, that means traditional business intelligence and data warehouse vendors. Second, a deeper vendor bench alleviates concerns over the skills shortage.