LexisNexis Takes On Hadoop With Open Source HPCC

LexisNexis Risk Solutions' High Performance Computing Cluster will be offered as open source code, pitting it against Hadoop and other emerging data handling systems.

Charles Babcock, Editor at Large, Cloud

June 17, 2011

4 Min Read

Slideshow: Yahoo's Hadoop Implementation

LexisNexis Risk Solutions, whose systems are used by law enforcement, insurance companies, and the federal government to verify that people are who they say they are, is making its big data handling engine available to compete with Hadoop and other emerging data handling systems.

Big data frequently refers to the masses of data collected at websites and other customer interaction points, data too voluminous and loosely structured to be handled by relational database systems.

LexisNexis has spun out its core data management system, which it calls its High Performance Computing Cluster, in order to make it available as both open source code and a commercially licensed product. Its development will be overseen by a new HPCC Systems unit that will remain part of LexisNexis.

The HPCC data management engine can handle 5,000 complex queries per second while running on a server cluster of only 100 nodes, according to Armando Escalante, head of the new unit, in an interview. The best-known Hadoop clusters run into the thousands of nodes. Yahoo, a major Hadoop developer and user, runs it on clusters with up to 4,000 servers.

At the same time, Escalante, who still serves as senior VP and CTO of LexisNexis Risk Solutions, said LexisNexis invented a language, ECL, that needs one-tenth as many lines of code to build a query program as the Java programs that exploit Hadoop. ECL "is a very high level language that tells the HPCC cluster what is to be done, not how to do it ... Our solution has a significant programmer productivity boost," he said.
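
To give a sense of that declarative style, here is a minimal, hypothetical ECL sketch; the record layout and logical filename are invented for illustration. The programmer states what result is wanted, and the platform decides how to distribute the work across the cluster:

    // Hypothetical record layout and logical filename, for illustration only
    PersonRec := RECORD
        STRING25 FirstName;
        STRING25 LastName;
        STRING2  State;
    END;

    // Point at a distributed file already loaded on the cluster
    Persons := DATASET('~example::people', PersonRec, THOR);

    // Declare WHAT is wanted; HPCC works out how to parallelize it
    FloridaPersons := Persons(State = 'FL');
    OUTPUT(COUNT(FloridaPersons));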

Nevertheless, the contrast with Hadoop, an open source project of the Apache Foundation, is something of an apples-to-oranges comparison. Hadoop is a file distribution system that can map a query to the cluster nodes closest to the data needed. It works on one query at a time, or on many queries pushed sequentially through its architecture in a batch-like process. A high-level, Hadoop-specific language, Pig, is being developed to speed up the process; likewise, Facebook is developing Hive for the same purpose.

HPCC, on the other hand, pairs two related clusters, Thor and Roxie, that work together to reduce many queries swiftly to the relevant answers. Hundreds of concurrent queries can be answered in near real time, in milliseconds rather than multiple seconds or minutes, said Escalante. Thor is a brute-force extract, transform, and load system that handles the data and indexes it in the background. Roxie is a parallel, distributed query engine that operates alongside Thor on the cluster, "tailored to do data delivery with low latency," he said.
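
As a rough illustration of that division of labor, the hypothetical ECL sketch below has Thor do the heavy lifting of building an index, while a parameterized lookup of the kind Roxie serves reads it back with low latency. The layouts, filenames, and field names are invented, and publishing a real Roxie query involves more packaging than shown here:

    // Hypothetical layouts and logical filenames, for illustration only
    PersonRec := RECORD
        STRING25 FirstName;
        STRING25 LastName;
        STRING2  State;
    END;
    Persons := DATASET('~example::people', PersonRec, THOR);

    // Thor side: brute-force ETL work, building an index in the background
    ByName := INDEX(Persons, {LastName}, {FirstName, State}, '~example::idx::byname');
    BUILD(ByName);

    // Roxie side: a low-latency keyed lookup against the prebuilt index;
    // STORED marks searchName as a runtime input to the published query
    STRING25 searchName := '' : STORED('searchName');
    OUTPUT(ByName(KEYED(LastName = searchName)));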

Both clusters fit within the 100 servers of Escalante's projection.

"We believe that HPCC Systems will take big data computing to the next level," said James Peck, CEO of LexisNexis Risk Solutions, in the announcement of the HPCC project on Wednesday. "We've been doing this quietly for years for our customers with great success." LexisNexis has used its core engine for ten years.

If Peck and Escalante are correct, HPCC is a potentially powerful big data engine that requires far fewer commodity servers to do its job than previously assumed, and HPCC Systems will develop it further through an open source project.

Even so, Hadoop has many devotees and will still be used in many big data settings, especially for truly gargantuan tasks such as capturing and sorting all the data gained in a complete crawl of the Internet. Its high-level languages are being developed at a rapid pace and may bring its capabilities more in line with those of HPCC.

Escalante said LexisNexis was making its core engine open source code because "we don't want to run a proprietary system. We want to build a community around the system and get the new innovation that people within the community will produce for it." HPCC is written in C++ and will require open source programmers skilled in C++ to advance its state of the art.

To underscore its new commitment to open source code, LexisNexis, which runs HPCC on Linux servers, joined the Linux Foundation on Friday.


About the Author

Charles Babcock

Editor at Large, Cloud

Charles Babcock is an editor-at-large for InformationWeek and author of Management Strategies for the Cloud Revolution, a McGraw-Hill book. He is the former editor-in-chief of Digital News, former software editor of Computerworld, and former technology editor of Interactive Week. He is a graduate of Syracuse University, where he obtained a bachelor's degree in journalism. He joined the publication in 2003.
