Q&A: Ambuj Goyal on IBM's Path to Prediction and Optimization
Touting new industry- and domain-specific alternatives, Big Blue's Information Management GM says predictive and statistical modeling are being used as "a hammer" for many business problems for which they are not the right tool.
Advanced analytics, particularly predictive and statistical modeling, have gained a reputation as a sort of gold standard for prediction and a sure route to competitive advantage. Challenging this thinking, Ambuj Goyal, general manager of IBM's Information Management division, asserts that predictive and statistical modeling are being used as a hammer for many business problems that should not be treated like nails.
Critics might charge that that's sour grapes given that Cognos, IBM's core business intelligence platform, has not been competing on analytics. Nonetheless, Goyal outlines an emerging IBM vertical-industry and domain-specific approach to applying "the right techniques" for prediction and optimization. The results are delivered in Cognos-supported dashboards and key performance indicators (KPIs), and Goyal says it starts with IBM's "trusted information layer."
Intelligent Enterprise was offered this exclusive interview to discuss your belief that "every company already has all the information it needs to become more predictive, forward-thinking and insightful in its business decisions." So how do we start?
Well, prediction is a tough game. Lots of people talk about the future, but they're almost always wrong. The soundness of a prediction is based on how much you can trust the information that you're getting. More than 50 percent of the people who do analytics do not believe that they have the latest or the most trusted information available.
Whose statistic is that?
There are surveys in the industry that have found that 50 percent of managers and people who make decisions based on information do not believe that they can trust the information that they're getting. So IBM's strategy has been to create a trusted information layer for companies, so that when words like "expense" or "revenue" or "risk" or "organization" or "customer" or "policy" come up, those business terms are consistent across the enterprise and the information populating those terms is trusted information.
Our InfoSphere platform is associated with creating trusted information, with our Information Server and Master Data Management [MDM] capabilities for example. But we have recently launched Business Glossary and Data Governance and Management capabilities that help build trust at the business terms level rather than being about rules and columns at a table level. Customers are using our Business Glossary and the Data Governance tools so they can say "when we say 'expense,' it means the same thing to the business people and the technical people. Everyone is using the same language and the same definitions across the organization."
The Business Glossary sounds like part of MDM. How is it different?
In the past when we offered Information Server or MDM, it was about rows and tables and columns, but these technical terms were not associated with business terms, so reuse did not happen. We have added the Business Glossary capability in all our product lines, from Cognos to MDM to Information Server and everything else. We can now associate every term inside the Information Server or MDM or Cognos or even our data warehouse with the Business Glossary. This sets up a fundamental difference between our data warehouse or our Information Server or MDM Server versus any other vendor offerings, in that you can now address things from a business perspective rather than a technical perspective.
MDM, data governance and data stewardship initiatives are always about getting agreement on terminology between business and IT, aren't they? What are you doing differently than other vendors here?
If you follow what we do in data modeling — our insurance data model or our retail banking data model, for example — those are the business terms. We have mapped those business data models to the MDM servers, the Information Server, the data warehouse and so on... Business Glossary is how we take our data warehousing and Information Server and MDM offerings that much closer to the end goal of defining and creating trusted information about, say, an insurance company or about expense management at a financial services company or about risk within any company. It's how we map those business data models to the infrastructure models like the MDM Server, Information Server and data warehouse.
Once you have these populated data models and you run them through Cognos, you can get predictive analytics. Now you have trusted information associated with business terminology rather than just a data warehouse or an MDM server. If it's a bank, for instance, and you're getting trusted information associated with risk, you can feel comfortable that everybody is talking the same language because it's in the business terms that the bank uses. In November, Cognos released a banking financial risk model, and it includes the Business Glossary and the data models, so we've connected the business and the technology much better than we used to do.
You made a jump there from having trusted information to getting predictive risk analytics out of Cognos. Where does the prediction come in?
Our industry is confused [in thinking] that predictive and statistical models are the only way to predict the future. For example, when you do payment collection, whether it is for a loan origination company or a credit card company or even tax payment collection, the best analytic models for doing a better job at that are Markov decision processes, which have nothing to do with statistical models. The best way to model a computer chip production line and the future yield of chip production is constrained logic programming. The best way to do things like real-time decisioning for cross-selling or up-selling through the call center is through a policy engine and a rules engine.
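To make the first of Goyal's examples concrete: a Markov decision process models a system that moves between states under actions with known probabilities and rewards, and can be solved exactly without any statistical fitting. The following is a minimal sketch in Python, solving a toy collections scenario by value iteration; the states, probabilities, and costs are entirely hypothetical and are not drawn from any IBM product.

```python
# Toy Markov decision process for a collections scenario, solved by
# value iteration. All states, actions, and numbers are illustrative.

def value_iteration(states, actions, transition, reward, gamma=0.9, tol=1e-6):
    """Return the optimal value function and policy for a finite MDP."""
    V = {s: 0.0 for s in states}
    while True:
        delta = 0.0
        for s in states:
            best = max(
                sum(p * (reward(s, a, s2) + gamma * V[s2])
                    for s2, p in transition(s, a).items())
                for a in actions
            )
            delta = max(delta, abs(best - V[s]))
            V[s] = best
        if delta < tol:
            break
    policy = {
        s: max(actions, key=lambda a: sum(
            p * (reward(s, a, s2) + gamma * V[s2])
            for s2, p in transition(s, a).items()))
        for s in states
    }
    return V, policy

# An account is either "current" or "delinquent"; the collector can send
# a "reminder" (cheap) or make a "call" (costly but more effective).
states = ["current", "delinquent"]
actions = ["reminder", "call"]

def transition(s, a):
    if s == "current":
        return {"current": 0.9, "delinquent": 0.1}
    # Delinquent: a call recovers the account more often than a reminder.
    if a == "call":
        return {"current": 0.7, "delinquent": 0.3}
    return {"current": 0.3, "delinquent": 0.7}

def reward(s, a, s2):
    action_cost = {"reminder": -1.0, "call": -5.0}[a]
    return action_cost + (10.0 if s2 == "current" else 0.0)

V, policy = value_iteration(states, actions, transition, reward)
print(policy)  # {'current': 'reminder', 'delinquent': 'call'}
```

The solver prescribes the cheap action when the account is healthy and the costly one when it is not, which is exactly the kind of operational decision Goyal is describing, arrived at with no statistical model at all.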
So IBM has created a dashboard environment around Cognos, and we are adding predictive models like data mining, which we introduced in Cognos 8.4, constrained logic programming, Markov decision processes, simple statistical models, rules engines based on technology from our iLog acquisition, policy engines and various other techniques. We are putting all of those into a Cognos API. So based on the specific business problem, you can run the appropriate model.
I take it, then, that these are industry-specific, prebuilt models for particular vertical industries?
There are about 15 to 20 analytical techniques for dealing with industry-specific or problem-domain-specific challenges. If you're doing workforce analytics, for example, we have a Cognos offering that uses the right technique. Cognos also has a Financial Risk Solution. Underneath these offerings are dashboards and KPIs, and we apply the right analytical techniques.
[Author's note: A post-interview request for specific examples yielded the following list:]
Banking Risk Performance - Credit Risk
Customer Care & Insight
Crime Information Warehouse
Clinical Trial Management
Life Sciences Regulatory Compliance
Enterprise Health Analytics
Retail Business Intelligence Solution
Aggregate Spend Management
Automotive Quality Insight
Performance Analytics for JDE and Oracle
Banking Fraud and Abuse
Risk Management Cockpit
Risk Adjusted Profitability
Workforce Resource Management
You're clearly challenging the likes of SAS and SPSS, which are known for tools and models that are applied in many different scenarios and industries, but these vendors also have industry- and domain-specific applications and long-standing relationships with companies in lots of vertical industries. Are you suggesting their technologies are being misused?
Predictive modeling has become a hammer for these vendors, and in many cases it's overkill. For example, we were recently talking to a client that has created a 100-terabyte model, using software from one of those vendors, to do prediction. They're using an amazing amount of computing capacity because the only method that they had available was a statistical model coming from one of these vendors. But the problem could have been solved much more simply with a policy engine, which could have been created for less than $1 million rather than requiring tens of millions of dollars of expense.
So I'd say these vendors have been using what they have as a hammer regardless of the business problem at hand. As we've taken this problem-specific and domain-specific approach, we find that different techniques are the right ones to solve a business problem. In many cases, people don't need to spend the kind of money they are currently spending to be able to solve their problem.
Are those vendors very good at the data mining capability of predictive models? Yes, they are very good in the statistical domain of predictive models. Are they very good at constrained logic programming or Markov decision processes or rules engines? No, they are not.
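The "policy engine" alternative Goyal keeps returning to can be as simple as an ordered list of business rules evaluated first-match-wins. Here is a minimal sketch in Python; the rules, customer fields, and offers are hypothetical illustrations, not a representation of IBM's iLog engine.

```python
# Minimal first-match rules engine for cross-sell decisions.
# All rule conditions and offer names are illustrative.

from typing import Callable

Rule = tuple[Callable[[dict], bool], str]

RULES: list[Rule] = [
    (lambda c: c["balance"] > 50_000 and not c["has_investment_account"],
     "offer_investment_account"),
    (lambda c: c["card_utilization"] > 0.8,
     "offer_credit_limit_increase"),
    (lambda c: c["tenure_years"] > 5,
     "offer_loyalty_rate"),
]

def decide(customer: dict, rules: list[Rule] = RULES) -> str:
    """Return the action of the first rule whose condition matches."""
    for condition, action in rules:
        if condition(customer):
            return action
    return "no_offer"

customer = {"balance": 80_000, "has_investment_account": False,
            "card_utilization": 0.2, "tenure_years": 2}
print(decide(customer))  # offer_investment_account
```

A table of explicit, auditable rules like this is cheap to build and maintain, which is the cost contrast Goyal draws against fitting a 100-terabyte statistical model to reach the same decision.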
In what areas would you give "the hammer" its due, and what would you say to all those professionals out there who are applying various analytic techniques using tools from the likes of SAS and SPSS?
You can apply those tools in any domain, and in many cases they do a very good job. But it can also be overkill... A combination of techniques can sometimes solve the problem better and less expensively.
Are there companies out there that understand all the techniques and combinations of techniques you're talking about, or is that insight that can only be obtained from IBM Global Services Consultants?
Many people and organizations are starting to embrace our approach. In one of the showcases that we ran at our Information On Demand conference, there were about nine global systems integrators highlighted who are leveraging our Information On Demand stack to solve specific problems in specific domains. And a huge number of our partners are also deploying these techniques.
Many view the real bottleneck in spreading the use of advanced analytic techniques as being the scarcity and expense of experienced analytics professionals. Did that factor in IBM's prepackaged, industry- and domain-specific approach?
You're right that there's not enough talent out there. But companies also don't always care [about having that level of expertise]... They just want to solve a business problem, and they may feel that they don't need to have deep technical experts who know how to use the hammer.
In each of the industries we have addressed, and really in every industry, there are only three things that people are trying to do to leverage information. One is getting better operational efficiency. Second, they are trying to do a better job at compliance. And third, they are trying to do a better job of taking care of customers and partners. In each of these three areas, we have identified top issues, so in automotive, for instance, there are three top issues: sales operations and planning, partner management and trade-up management. Those are the three areas where the industry is struggling, so we have captured those into a set of solutions with KPIs, dashboards and a set of underlying analytic techniques. People can deploy it without the huge amounts of expertise associated with creating a 100-terabyte model for doing prediction....
From our point of view, this approach is bridging the gap between business need and technology, and we can help companies take a simpler, less expensive approach.