Splunk slashes cost of big data analysis platform by 33% while adding 100% uptime guarantee. Customer MindTouch says it's no gimmick.
Big data analysis vendor Splunk pushed the appeal of its Splunk Cloud service on Tuesday by announcing a 33% price cut, a 100% uptime guarantee, free developer sandboxes, and higher-scale service plans for capturing and analyzing up to 5 terabytes per day.
Most of Splunk's 7,400-plus customers use its IT-operations-oriented big data analysis platform on-premises. Splunk Cloud was introduced last year as a way to speed and simplify access to the software on Amazon Web Services (AWS).
Splunk's price cut is no shock given that AWS itself has cut its storage and compute prices more than 40 times over the last six years. Amazon's largest-ever price cut was announced in March, one day after Google announced a 32% price cut on its Compute Engine services. Amazon responded with a 51% average price reduction on its Simple Storage Service and a 38% price reduction on M3 general-purpose server instances.
Splunk credited AWS price cuts as well as its own "improved operational efficiencies" for the 33% Splunk Cloud price cut and free new cloud sandbox instances for testing and development.
Splunk said the Splunk Cloud 100% uptime guarantee is backed by financial penalties if the service is disrupted. According to Splunk Cloud customer MindTouch, it's an important assurance to offer customers.
"It's not a difficult problem, across multiple Amazon availability zones, to set up cloud infrastructure to guarantee uptime, so I'm confident they'll be able to hit 100% and I don’t think it's just a BS marketing gimmick," said MindTouch CEO Aaron Fulkerson in a phone interview with InformationWeek.
MindTouch captures and hosts help-desk and support knowledgebase content for customers including Intuit, Intel, Salesforce.com, and SAP SuccessFactors, and it uses Splunk Cloud to monitor and report on its services, which are also hosted on AWS. MindTouch gives its customers up to a 99.9% uptime guarantee, depending on the customer's licensing level.
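The gap between a 100% guarantee and a 99.9% one is easier to grasp in downtime terms. As a rough back-of-the-envelope illustration (this is generic SLA arithmetic, not Splunk's or MindTouch's actual contract math), a short Python sketch:

```python
# Downtime budget implied by an uptime percentage, over a 30-day month.
# Illustrative only -- real SLAs define their own measurement windows and exclusions.

MINUTES_PER_MONTH = 30 * 24 * 60  # 43,200 minutes in a 30-day month

def allowed_downtime_minutes(uptime_pct, minutes=MINUTES_PER_MONTH):
    """Return the downtime (in minutes) a given uptime percentage permits."""
    return minutes * (1 - uptime_pct / 100)

for sla in (99.0, 99.9, 100.0):
    print(f"{sla}% uptime -> {allowed_downtime_minutes(sla):.1f} min/month allowed")
```

At 99.9%, a provider can be down about 43 minutes a month and still meet its guarantee; at 100%, any disruption at all triggers the penalty clause.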
MindTouch selected Splunk Cloud in 2013 after a two-month bakeoff against competitors including Sumo Logic, ElasticSearch, and New Relic. MindTouch chose Splunk Cloud for what Fulkerson described as its "screamingly fast" performance and its "intuitive" query language. Before Tuesday's announcement, Splunk Cloud didn't offer any uptime guarantees, and Fulkerson said he was glad to see his "only objection" to the service removed.
The new service plans introduced on Tuesday extend Splunk Cloud to capture and analyze up to 5 terabytes of data per day; the previous limit was 1 terabyte of new data per day. Plans are offered at 5, 10, 20, 50, and 100 gigabytes per day, with numerous pricing levels extending up to the 5-terabyte threshold.
Doug Henschen is Executive Editor of InformationWeek, where he covers the intersection of enterprise applications with information management, business intelligence, big data, and analytics.