Tibco Spotfire, already popular for finding the proverbial needle in the haystack, now works smarter with the biggest haystacks.
Tibco Spotfire handles data visualization, analytic dashboards and applications, and forward-looking predictive analysis. Spotfire 5.0, a significant upgrade announced on Tuesday, is designed to do all of the above at big-data scale.
Tibco Spotfire is already a popular product for finding the proverbial needles in the haystack, but customers are telling Tibco that the haystacks are getting bigger. Oil and gas operations, for example, are gathering more information from well sensors. Manufacturers are capturing more production data off the shop floor. And marketers are collecting more customer analysis and segmentation information from click streams and social networks. As the data stacks up, the nature of the analysis changes, according to Steve Farr, a Tibco Spotfire senior product marketing manager.
"It's tempting to think you can identify the important trends in a small subset of the data, but big data analytics is about considering all of the data so you also see the exceptions and outliers," Farr told InformationWeek. "It's in the outliers that you find fraud, risk, and the things that are going wrong." You can also find latent patterns in that bigger picture that reveal opportunities.
Spotfire competes with the likes of Tableau Software on data visualization and QlikTech on delivering analytic dashboards and applications. All three vendors are growing quickly on the appeal of fast and intuitive in-memory analysis. To deliver that capability at a bigger scale, a rewrite of the in-memory engine in Spotfire 5.0 takes better advantage of high-capacity, multi-core servers, according to Farr.
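Tibco's engine is proprietary, but the divide-the-work-across-cores idea Farr alludes to can be sketched generically. This is an illustration only, not Spotfire code; the column of values and the worker count are invented for the example.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical in-memory column of a million values (stand-in data).
column = list(range(1_000_000))

def chunk_sum(chunk):
    # Each worker aggregates its own slice of the column.
    return sum(chunk)

# Split the column into one contiguous chunk per worker.
n_workers = 4
size = len(column) // n_workers
chunks = [column[i * size:(i + 1) * size] for i in range(n_workers)]

# Fan the chunks out across workers, then combine the partial results.
with ThreadPoolExecutor(max_workers=n_workers) as pool:
    partials = list(pool.map(chunk_sum, chunks))

total = sum(partials)
print(total)  # 499999500000
```

The pattern — partition the data, aggregate each partition on its own core, merge the partials — is what lets an in-memory engine add rows with little performance degradation, as Farr claims.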
"We're seeing very little degradation of performance as we add rows of data because we've rewritten in a way that throws all the power of the hardware at the data analysis," Farr said.
But in-memory capacity takes you only so far. When it comes to truly high-volume data analysis up into the tens or hundreds of terabytes or more, in-database analysis is the emerging analytical approach. It lets you apply analytics within the powerful database platforms in which large data sets reside. Users benefit because they don't waste time extracting and moving data, handling analyses on under-powered analytic servers, and then returning result sets back to the database platform. It all happens inside the database.
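The contrast between extract-and-analyze and in-database analysis can be shown in a few lines. This is a generic sketch, not Spotfire code: SQLite stands in for a warehouse platform such as Oracle or Teradata, and the table and values are invented for the example. The point is where the computation runs.

```python
import sqlite3

# SQLite stands in for the data warehouse in this sketch.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sensor_readings (well_id INTEGER, pressure REAL)")
conn.executemany(
    "INSERT INTO sensor_readings VALUES (?, ?)",
    [(1, 101.5), (1, 98.2), (2, 110.0), (2, 112.3), (2, 109.1)],
)

# Extract-and-analyze: pull every matching row out of the database,
# then compute the statistic on the analytic server.
rows = conn.execute(
    "SELECT pressure FROM sensor_readings WHERE well_id = 2"
).fetchall()
local_avg = sum(p for (p,) in rows) / len(rows)

# In-database: push the aggregation into the database engine;
# only the single result value moves back.
(db_avg,) = conn.execute(
    "SELECT AVG(pressure) FROM sensor_readings WHERE well_id = 2"
).fetchone()

print(round(local_avg, 2), round(db_avg, 2))  # same answer either way
```

With five rows the difference is invisible; at tens or hundreds of terabytes, skipping the extract-move-return round trip is the whole argument for in-database analytics.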
Tibco is a latecomer to this approach, as SAS, SPSS, Alpine, and other analytics vendors have already ventured into in-database analytics. But Spotfire customers don't want to be left out, according to Farr, so the 5.0 release supports in-database analysis in conjunction with Oracle, Microsoft SQL Server, and Teradata data warehouse platforms. Support for in-database analysis within EMC Greenplum, IBM Netezza, and SAP Hana is also being discussed, but those platforms didn't make the first cut.
The customers most likely to take advantage of in-database processing are those with the largest data volumes. Procter & Gamble, for example, uses Spotfire to serve up data visualizations to some 60,000 employees. It currently uses SAS for most predictive analyses, but the 5.0 release appears to have opened the door to use of Spotfire's predictive capabilities. "We are excited by the prospect of Spotfire 5.0 being able to efficiently analyze and visualize extreme data volumes by executing analytics directly within our database architecture," said Alan Falkingham, P&G's director, business intelligence, in a statement from Tibco. P&G's database platform is Oracle Exadata.
Tibco was an early supporter of analytics based on the popular R programming language with a release back in 2010. The 5.0 release improves on that support with a new Tibco Enterprise Runtime for R, which embeds the open-source R runtime engine into the Spotfire statistical server. This lets customers expose R to a much larger audience of users--beyond the Ph.D. crowd--within a simple, object-oriented programming environment.
"Customers have been concerned that when you have a lot of users on an R-based application it becomes unstable," Farr said. "By running R within Spotfire and giving users access using our Web player, you can deploy R-based apps to thousands of users."
In one final enhancement aimed at scaling up Spotfire, the 5.0 release supports more users with the aid of a more robust Web player. Where Spotfire 4.5 scaled up to thousands of Web users, the 5.0 upgrade scales to serve tens of thousands of users on the Web, according to Farr. What's more, with a new option to add Tibco Silver Fabric private-cloud management software, administrators running Spotfire deployments at large companies can scale Spotfire servers up and down and manage analytic workloads with the speed and flexibility of the private-cloud deployment model.
Spotfire 5.0 is slated for general release in November.