Teradata's Big Data Appliance: Analysis For Everyone
With the latest release of its Teradata Appliance For Hadoop, the vendor keeps its focus on pre-loaded racks with commodity hardware that are pre-configured with a Hadoop solution and its attendant management applications.
Teradata wants big data to be as easy to use as a toaster.
Well, that analogy may be over-reaching a bit, but the company announced the 5.0 release of its Teradata Appliance for Hadoop, supporting Cloudera Enterprise 5.4 or Hortonworks HDP 2.3.
"We are seeing a growing adoption of Hadoop and a growing adoption of utilization of appliances," said Chris Twogood, vice president for product marketing at Teradata. "Users want something easier to use and easier to integrate."
Teradata will be pitching its Hadoop appliance in three configurations. First is a Performance package with more CPUs and memory but less storage. It is intended for streaming applications such as Spark and Storm, as well as for SQL-on-Hadoop engines such as Presto and Impala.
The Capacity configuration packs more storage but fewer CPUs and less memory than the Performance model. Users will pick the Capacity model to store and fiddle with greater quantities of infrequently used data.
The third model, called Balance, is optimal for extracting, transforming, and loading jobs. Its configuration falls between the Performance and Capacity models.
Teradata's Hadoop appliance starts at $75,000 for 100 TB of capacity. Configurations and costs go up from there.
Following the hardware release will be a rolling series of software updates. Teradata's QueryGrid 14.10 will ship on the new boxes this quarter, with the 15.0 release expected next quarter as an upgrade. Various supporting applications for the appliance will arrive on 30-day, 60-day, and 90-day schedules, some for purchase and others as standard upgrades, Twogood explained.
Big data was once the province of "expert users," basically data scientists who applied Ph.D.-level math and statistics skills to craft algorithms that could find the smallest needles of insight in the biggest haystacks of terabytes. As a result, many companies in recent years have developed products that make big data easier to use, moving the threshold of utility downward from expert users to the non-expert but technologically adept.
To this end, Teradata maintains its focus on "appliances," basically pre-loaded racks with commodity hardware but pre-configured with a Hadoop solution and its attendant management applications. "We do all the pre-integration," Twogood said, while the customer still has the capability to configure the appliance for Hadoop to suit usage as needed.
"Half the appliances we ship go to people who don't know what they are doing," Twogood said. If these users had to configure the Hadoop appliance, they would be learning from their mistakes until they got it right. The other half is made up of expert users who want to lighten their workload by simply going straight to work using a pre-configured device.
"Hadoop is becoming more like a database in governance and management," added Clarke Patterson, senior director for product marketing at Cloudera. Hadoop is still "rough around the edges," and it needs to get to the point where its stability, reliability, and usage policies are supported and no different from current, established databases, Patterson explained. "Then it blends into the architecture seamlessly."
That threshold may not be reached for a few years, added Twogood.
Making big data tools turnkey is a major theme for Teradata. Earlier this year, the company introduced the Aster App Center, a collection of templates for analytic applications designed to run specifically on Teradata's Aster database management system.