In the high-stakes game of marketing a new drug, understanding what's working and what's not can make the difference between success and expensive failure.
For top-20 pharmaceutical company Astellas, a new product launch last year presented the marketing analytics team with a data-velocity problem: Data from the field was arriving much faster than before.
"We started getting weekly patient data, instead of monthly," Chad Dau, associate director of marketing analytics at Astellas, told InformationWeek by phone. The accelerated pace meant the team couldn't finish processing and reporting on one batch of data before the next one arrived.
Adding complexity, Astellas and other drug companies increasingly are incorporating new data sets into their analyses, including unstructured text, audio, and video.
Dau turned to NewVantage, a Boston-based data and analytics consultancy with expertise in big data and optimized Hadoop configurations, to pilot a cloud-based Hadoop application.
"We went from a week to three hours," said Dau, adding that Astellas plans to expand its cloud Hadoop approach next year.
"Their in-house solutions took a long time to run, even though they had a data warehouse and analysis tools," NewVantage managing partner Paul Barth told InformationWeek. In one case -- analyzing how consistently patients take their medication, known as "patient persistence" in medical circles -- the NewVantage configuration processed the data 100 times faster.
"Their older style took a week or so to complete," Barth said. "We did it in under an hour."
Nor is that kind of performance boost atypical, Barth said.
"In our experience with a dozen of these set-ups, that speed-up is pretty typical," he said, noting that in some cases simply translating the data processing logic from mainframe SQL to Hadoop results in that kind of speed increase.
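The SQL-to-Hadoop translation Barth describes can be sketched in miniature. The example below rewrites a simple SQL GROUP BY aggregation as a Hadoop-style mapper and reducer, with the shuffle phase simulated locally; the input layout (`patient_id,fill_date`), the refill-count metric, and all names are hypothetical illustrations, not Astellas' actual schema or NewVantage's implementation.

```python
# Sketch: a SQL GROUP BY/COUNT rewritten in map/reduce style.
# Equivalent SQL: SELECT patient_id, COUNT(*) FROM fills GROUP BY patient_id
# All field names and data are hypothetical.

from itertools import groupby

def mapper(line):
    """Emit (patient_id, 1) for each prescription-fill record."""
    patient_id, fill_date = line.strip().split(",")
    return (patient_id, 1)

def reducer(key, values):
    """Sum the fill counts for one patient."""
    return (key, sum(values))

def run(lines):
    # Hadoop sorts mapper output by key before the reduce phase;
    # sorted() + groupby simulates that shuffle step locally.
    pairs = sorted(mapper(l) for l in lines)
    return {k: reducer(k, (v for _, v in grp))[1]
            for k, grp in groupby(pairs, key=lambda p: p[0])}

records = [
    "p001,2013-01-05",
    "p001,2013-02-04",
    "p002,2013-01-10",
]
print(run(records))  # {'p001': 2, 'p002': 1}
```

On a real cluster the same mapper and reducer logic would run in parallel across many nodes -- which is where the order-of-magnitude speed-ups Barth cites come from.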
Paralleling these performance gains is the speed of the deployments themselves, Barth said. "Unlike data warehouses and SQL that took a year or two to figure out if they worked, these [solutions] tell you if they're valuable in six weeks," he said.
But Barth doesn't think this inevitably drives every customer to real-time analytics.
"Some business processes are naturally slowly changing," he said. For these businesses, it wouldn't make sense to change go-to-market strategies on a daily basis. In other scenarios, such as fraud detection, time is of the essence, of course.
A Gartner survey found that 64% of respondents were investing or planning to invest in big data technology in 2013. In the June survey of 720 of its worldwide clients, Gartner found 55% were addressing enhanced customer experience using big data, while 49% were using big data to improve process efficiency.
On the other hand, Barth said there does seem to be a trend toward capturing the raw data and keeping it in a place where ad-hoc analysis is possible, such as a stock trading application that can access a repository with 10 years of transaction history in it.
"This allows you to see trends, patterns you hadn't seen before," he said.
Ellis Booker, based in Evanston, Ill., covers big data, data analytics, and education technology for InformationWeek. A familiar name in the computer trade press, he has held senior editorial posts at top IT publications including InternetWeek, Web Week, and IDG's Computerworld.