"I'm glad to see people here from the business side as well as IT because it tends to be a challenge explaining to the business why they should invest in enterprise data," said Sweeney. "The business-side mentality is often 'get me the information I need to get my job done, but why do I need to spend a little extra - or a lot extra - to help somebody else get their job done.' What they don't always understand is that the interrelationships between various business units often drive the need for enterprise information management."
Sweeney offered an example in which a person within one business unit might classify GE as a bank customer in the industrial manufacturing business. "They're perfectly correct, except that GE Capital is one of the largest financial institutions in the world," he explained. "If that's not reflected in the information supporting the people making decisions about our target markets, they might conclude that 'we do 90 percent of our business with industrial companies, so why should we focus on the financial services industry?' You have to have a streamlined way for people who are maintaining the data to see the impact across the enterprise."
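To make the problem concrete, here is a minimal sketch of the classification trap Sweeney describes; the customer records and field names are hypothetical illustrations, not Citigroup's data model. A single-valued industry field hides GE's financial-services business from an enterprise rollup, while a multi-valued classification surfaces it:

```python
from collections import Counter

# Single-valued classification: each customer carries exactly one industry
# code, so GE Capital's financial-services business is invisible.
customers_single = [
    {"name": "GE", "industry": "industrial"},
    {"name": "Acme Manufacturing", "industry": "industrial"},
    {"name": "First Bank", "industry": "financial"},
]

# Multi-valued classification: the same customer can span several industries.
customers_multi = [
    {"name": "GE", "industries": {"industrial", "financial"}},
    {"name": "Acme Manufacturing", "industries": {"industrial"}},
    {"name": "First Bank", "industries": {"financial"}},
]

print(Counter(c["industry"] for c in customers_single))
# Counter({'industrial': 2, 'financial': 1}) -- understates financial exposure

exposure = Counter()
for c in customers_multi:
    exposure.update(c["industries"])
print(exposure)
# Counter({'industrial': 2, 'financial': 2}) -- GE now counts in both markets
```

The aggregate numbers, not the individual records, are what mislead: each business unit's entry is locally "correct," yet the enterprise-level conclusion is wrong.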
Compliance requirements are compounding information management challenges, said Sweeney. A decade ago, regulators imposed much simpler, cut-and-dried requirements, such as setting aside 8 percent of loan portfolios to offset risk, but Basel II has required more rigorous processes to minimize operational risk and much more detailed analyses to uncover unobserved market risk and better understand credit risk. Financial institutions have responded with much more complex formulas and predictive models to forecast risk more accurately and satisfy regulators without needlessly tying up capital. "Over a six-year period, we reduced our regulatory capital requirements by about $6 billion, which translates into about $2 billion in extra revenue recurring every year," Sweeney explained. "To do that, we had to go from examining about 1,000 market risk variables and 200,000 market risk movements up to about 30,000 market risk variables and 24 million market risk movements and a huge covariance matrix."
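For readers unfamiliar with the machinery Sweeney is describing, here is a minimal sketch of one standard technique in this family: parametric value at risk computed from a covariance matrix of market risk factors. The three-factor portfolio, the weights and the simulated return history are illustrative assumptions, not Citigroup's model:

```python
import numpy as np

rng = np.random.default_rng(0)
returns = rng.normal(0.0, 0.01, size=(250, 3))   # 250 days, 3 risk factors
cov = np.cov(returns, rowvar=False)              # factor covariance matrix

weights = np.array([0.5, 0.3, 0.2])              # portfolio exposures
portfolio_sigma = np.sqrt(weights @ cov @ weights)

z_99 = 2.326                                     # one-sided 99% normal quantile
var_99 = z_99 * portfolio_sigma                  # 1-day 99% value at risk
print(f"1-day 99% VaR: {var_99:.4%} of portfolio value")
```

At Sweeney's scale - roughly 30,000 risk variables - the covariance matrix alone has on the order of 450 million distinct entries, which is why the data feeding it has to be trustworthy.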
But despite all that sophistication, regulators know that analytics are only as good as the data, so financial institutions are required to prove that their models work. At Citigroup, that's done with a separate "profit attribution analysis" system that regularly compares the value-at-risk models' predictions against actual profits and losses.
"So we have one system that looks forward and one that looks back to prove that our predictions make sense," said Sweeney. "Both systems rely on different people, processes and technologies, so Citigroup eventually has to merge all that information in an enterprise data store to ensure the $6 billion in capital relief."
Few banks are in a position to trim $6 billion from capital reserves, but Sweeney offered a bit of advice that can be applied to any business that bases its decisions on data that has to be aggregated, integrated, cleansed and normalized. "It's not about the data warehouse at the end," he said. "The warehouse has a very high risk of being an expensive collection of useless information if nobody is taking care of the quality up front."