"We're on a journey right now to transform our focus, organizationally, from data to insight, to foresight to action," said Anthony Scriffignano, senior VP of worldwide data and insight. His title was newly created as part of a new organization based on global data insight and analytics.
"If you note the name of the organization, it's not a 'data' organization," he said. "It's a data, insight and analytics organization. That's a huge shift for us."
Scriffignano speaks articulately about this strategic shift and about what it means for D&B customers. "A lot of our customers look at us as a place to get data, a report or a score," he said. But the endgame is to take D&B data, merge it with customer data, figure out what's going to happen, and help customers take action. "All our initiatives are around that," he said.
InformationWeek spoke to Scriffignano in April via phone.
Name and title: Anthony Scriffignano, SVP, Worldwide Data & Insight
Tenure at current job: 12 years at D&B; about two years in current title.
Career accomplishment of which I'm most proud: [When I joined D&B], I was very adamant that data is multilingual and multinational, and often transformed quite a bit. So we've introduced multilingual identity resolution.
Decision I wish I could do over: Wish I'd pursued my MBA sooner.
My most important career influencer: It would have to be 9/11. Something happens that disrupts the whole world and reminds you to think about the bigger world.
Current top initiatives: An Asian language semantic disambiguation system, which addresses the fundamental problem of identity resolution of a business entity, regardless of language or writing system. We're also very focused on the analytic components of data and delivering multi-dimensional insight using our data and the customers'.
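One ingredient of language-independent identity resolution is normalizing entity names before comparing them, so that the same business written in different scripts or character forms can match. A minimal sketch of that idea using only the Python standard library (the normalization steps and similarity measure here are illustrative assumptions, not D&B's actual system):

```python
import unicodedata
from difflib import SequenceMatcher

def normalize_name(name: str) -> str:
    # NFKC normalization folds compatibility forms (e.g. full-width
    # characters common in Asian-language records) to canonical ones;
    # casefold lowercases aggressively across scripts.
    name = unicodedata.normalize("NFKC", name).casefold()
    # Strip punctuation and collapse whitespace.
    cleaned = "".join(ch if ch.isalnum() or ch.isspace() else " " for ch in name)
    return " ".join(cleaned.split())

def name_similarity(a: str, b: str) -> float:
    # Similarity ratio in [0, 1] between the two normalized names.
    return SequenceMatcher(None, normalize_name(a), normalize_name(b)).ratio()
```

For example, the full-width string "ＡＢＣ　Ｃｏｒｐ．" and "ABC Corp" both normalize to "abc corp" and match exactly. Real entity resolution would layer transliteration, legal-form expansion, and probabilistic matching on top of a step like this.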
Most disruptive force in my industry: Cloud-based capabilities are extremely disruptive. In an era in which we can conceive of the storage and application space as virtually limitless, what new possibilities might emerge as the costs are driven down? The closest analogy I can think of is when virtual memory made it possible to develop applications that no longer needed to "fit" in the memory space available. That evolution gave rise to large memory models that enabled problems to be solved that couldn't even have been articulated in the past. If storage and applications are virtual, computing can converge on a virtually unrestricted way of problem solving.
Biggest misconception about big data: That we're going to solve [big data] problems with technologies to curate more data. It's what I call the "bigger hard drives" phenomenon. Yes, of course we need those things. [But] fundamentally what we need is the ability to step back and say, "Let me assume I had all of the data I could possibly conceive of, and I had it right here, at my keyboard, now what am I going to do?"
The reasons big data projects go wrong: There's not enough focus on problem formulation and opportunity formulation. We tend to jump right to, "How are we going to store it? What databases are we going to use? What cloud solution are we going to use? Who are the providers for services, middleware, consulting?" The word "data" comes up a lot. Look, you wouldn't go to the supermarket and buy all the food. You actually plan the meal. Yet we behave that way with data.
A promising technology: Semantic disambiguation, the ability to figure out what words mean. Sentiment analysis, as an example. There's a lot of focus now on finding the nuanced meaning [of words], tokenizing that meaning, and putting scores on it so it can be operated on [by] heuristics that try to approach the behavior of people when they read.
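The simplest instance of "tokenizing meaning and putting scores on it" is a lexicon-based sentiment heuristic: look each token up in a word-to-score table and aggregate. A minimal sketch (the tiny lexicon and the one-token negation rule are illustrative assumptions, far cruder than production sentiment systems):

```python
# Toy sentiment lexicon: words and weights are made up for illustration.
LEXICON = {"great": 1.0, "good": 0.5, "poor": -0.5, "terrible": -1.0}

def sentiment_score(text: str) -> float:
    """Sum lexicon scores over tokens, flipping the sign of a
    word that immediately follows 'not'."""
    score = 0.0
    negate = False
    for tok in text.casefold().split():
        tok = tok.strip(".,!?")
        if tok == "not":
            negate = True
            continue
        val = LEXICON.get(tok, 0.0)
        score += -val if negate else val
        negate = False
    return score
```

So "The service was great" scores 1.0, while "not good" scores -0.5. The heuristics Scriffignano describes aim much higher, capturing nuance and context, but the pipeline shape is the same: tokenize, score, operate on the scores.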
Companies want more than they're getting today from big data analytics, but vendors small and big are working to solve the key problems. Also in the new, all-digital Analytics Wish List issue of InformationWeek: Jay Parikh, Facebook's infrastructure VP, discusses the company's big data plans.