It's still costly and complicated, but recent software vendor partnerships aim to make it less so.

Mary Hayes Weier, Contributor

December 7, 2007

Bon-Ton Stores recently mined 10 million customer records from its clothing stores nationwide and pulled a sample set of 100 million transactions. From that, it analyzed 200 separate factors, including what types of products customers bought, the associations between products in their shopping carts, and how many discounted products they purchased. The reason for all this data crunching: to create a model for direct-mail campaigns that could better predict which customers are likely to shop at a Bon-Ton store in the next 30 days.
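
In rough terms, a model like that scores each customer on the likelihood of a purchase in the next 30 days, then ranks the mailing list by that score. The sketch below shows the general shape of such a propensity model; the file name, column names, features, and the choice of scikit-learn's logistic regression are illustrative assumptions, not Bon-Ton's or SPSS's actual implementation.

```python
# Hypothetical sketch of a 30-day purchase-propensity model.
# File, column names, and features are assumptions for illustration;
# this is not Bon-Ton's or SPSS's actual implementation.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# One row per customer: engineered features summarizing past transactions,
# e.g. product-category counts, basket associations, share of discounted items.
customers = pd.read_csv("customer_features.csv")   # hypothetical file

features = [c for c in customers.columns
            if c not in ("customer_id", "shopped_next_30_days")]
X = customers[features]
y = customers["shopped_next_30_days"]              # 1 = shopped within 30 days

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

# Rank held-out customers by predicted likelihood of shopping soon;
# the highest-scoring names would get the direct-mail piece.
scores = model.predict_proba(X_test)[:, 1]
print("AUC:", roc_auc_score(y_test, scores))
```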

For retailers like Bon-Ton, predictive analytics software is increasingly viewed as an edge. "It's a long and difficult process, and therein lies the competitive advantage," says Mike Hayes, Bon-Ton's senior VP of marketing. "A lot of companies don't want to hire the skills to do that or take the time to do it." Bon-Ton has seen "marked improvement" in sales from campaigns that use the predictive model, which is based on software purchased from SPSS earlier this year.

Advanced analytics--of which predictive analytics is a subset--has long been a separate, pricier tier from the business intelligence reporting and query tools provided by the likes of Business Objects, Cognos, and Hyperion (which were or are being acquired by SAP, IBM, and Oracle, respectively). Last year, the advanced analytics market grew 11.3% to $1.24 billion, according to IDC analyst Dan Vesset, who predicts it will grow 10% a year through 2011.

What's changing, however, is that companies specializing in analytics, led by SAS Institute and SPSS, are making it somewhat easier to run programs like predictive analytics and to access the results. While the price and complexity will likely keep such tactics the province of big companies with large customer rolls, a number of recent deals show progress on making the technology more accessible.

[Chart: Strength in Software]

Business Objects, for example, finalized a deal last week with SPSS to offer SPSS's predictive analytics to Business Objects XI customers. The intention is to make it easier for companies to present dashboards, graphics, and reports--Business Objects' strong suit--that deliver SPSS-driven predictions about such things as which customers are most likely to stop doing business with a company in the next six months. IBM and SPSS struck a similar deal in October, as did Teradata and SAS.

This week, SAP is announcing a deal to embed Visual Numerics' numerical library into the data processing engine in SAP's NetWeaver platform. The numerical library contains thousands of algorithms intended to enable predictive analysis from SAP applications and those from partners supporting NetWeaver. Potential uses, says SAP, include shopping-bag analysis similar to what Bon-Ton is doing, analyzing keyword searches to forecast popular future words for better online ad buying, and assessing IT systems to predict bottlenecks.
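
The shopping-bag analysis SAP mentions boils down to counting how often products appear together in the same basket. Below is a minimal sketch of that idea using invented baskets; it computes support, confidence, and lift for each product pair, and is not the Visual Numerics library or SAP's actual code.

```python
# Minimal market-basket (shopping-bag) sketch: pairwise support, confidence, lift.
# Baskets and item names are invented for illustration only.
from itertools import combinations
from collections import Counter

baskets = [
    {"jeans", "belt", "socks"},
    {"jeans", "belt"},
    {"dress", "shoes"},
    {"jeans", "socks"},
    {"belt", "socks"},
]
n = len(baskets)

item_counts = Counter(item for basket in baskets for item in basket)
pair_counts = Counter(pair for basket in baskets
                      for pair in combinations(sorted(basket), 2))

for (a, b), count in pair_counts.items():
    support = count / n                       # share of baskets containing both
    confidence = count / item_counts[a]       # P(b in basket | a in basket)
    lift = confidence / (item_counts[b] / n)  # >1 means a and b co-occur more than chance
    print(f"{a} -> {b}: support={support:.2f} "
          f"confidence={confidence:.2f} lift={lift:.2f}")
```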

MORE THAN BUSINESS INTELLIGENCE

Consumer goods, financial services, health care, retail, and telecommunications are the industries embracing predictive analytics most aggressively. What they have in common is the need to understand the behavior of millions of customers who've become smarter and more fickle, thanks to how quickly they can compare trends--whether for prices or fashions or mortgages--over the Web. Conventional BI tools offer some predictive analytics. The difference, Hayes says, is the ability to build customized models.

Cost and talent remain barriers. Even with the new vendor partnerships, there's a lot of work involved in cleansing and formatting data for such analysis, and the talent sits at the top of IT pay scales. In InformationWeek's 2007 Salary Survey, data mining pros ranked as the second-highest-paid staff job, averaging $93,000 in total pay, and data mining managers averaged $125,000.

As CIO of Catalina Marketing, Eric Williams runs one of the world's largest data repositories. One 45-terabyte database holds information on 70% of the transactions made on a given day at U.S. grocery stores--850 billion rows of data. But Williams insists companies can make the analytics process easier. He has practically automated predictive analytics using Netezza's data warehouse appliance, SAS's Enterprise Miner, and a research tool developed in-house that directs people to surveys about their buying habits; in return, they get store discounts. Catalina analyzes customer data for 35,000 retail stores, plus consumer goods companies such as Procter & Gamble and Unilever.

Predictive analytics also can help find "crazy things you'd never anticipate," Williams says. Catalina built a model to determine why previously loyal customers stopped shopping at a retailer, and it found a connection to milk purchases. For a certain segment of regular milk buyers, Catalina found that if they stopped buying milk for several weeks, they were likely to stop coming to the store entirely within six to eight months. Catalina concluded those customers had started buying milk at a competitor and soon switched all their kitchen staples there. By identifying other customers with similar buying habits, the retailer could study what it needed to do to keep them.
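
An early-warning flag built on that milk-purchase pattern might look like the sketch below: it scans purchase histories for regular milk buyers whose milk purchases have stopped for an extended stretch. The file layout, six-week gap, and 10-trip threshold are assumptions for illustration, not Catalina's model.

```python
# Hypothetical churn early-warning flag based on the milk-purchase pattern
# described above. File layout and thresholds are illustrative assumptions.
import pandas as pd

# One row per purchase: customer_id, date, product category.
purchases = pd.read_csv("purchases.csv", parse_dates=["date"])  # hypothetical file
milk = purchases[purchases["category"] == "milk"]

as_of = purchases["date"].max()
last_milk = milk.groupby("customer_id")["date"].max()
milk_trips = milk.groupby("customer_id")["date"].nunique()

# Flag regular milk buyers (say, 10+ past milk trips) who haven't bought milk
# in over six weeks; per the pattern above, they may defect within 6-8 months.
at_risk = (milk_trips >= 10) & ((as_of - last_milk).dt.days > 42)
print(at_risk[at_risk].index.tolist())
```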

It can be a complicated process, but the results are what today's shoppers increasingly expect. As Williams says, "It's the new 'me' generation. Don't tell me about dog food if I don't have a dog."
