Enterprise Risk Management: Illuminate the Unknown

Taking risk is how businesses grow; managing risk is how they sustain that growth -- especially under pressure from regulators. Here's how to assemble a risk management architecture that anticipates dangers ahead, translates data into useful decision-support information and ensures compliance. Explore how operations can benefit from analytics proven for credit and market appraisal.

InformationWeek Staff, Contributor

November 16, 2005

All Together Now: An ERM Framework

The next step is to map risk management components into an enterprise architecture framework. In this section, I will talk about some of enterprise architecture's technology characteristics and requirements as they pertain to risk management and analysis.

Transactional systems and services dominate most enterprises. They record business activity and thus require frequent additions and updates. A good example would be on a securities trading floor, where you'd find some kind of fixed-income derivative trading system used not only for booking trades but also for pricing potential trades, an operation that requires significant computation.

Referring back to our COSO components, transactional services are mostly used in risk assessment and control activities. The most effective way to mitigate risk is to measure and limit potential risk exposures before transactions reach the enterprise. Some risk assessment techniques, especially those for complex risks, involve a significant amount of computation. In recent years, Value at Risk (VaR) has become popular. A statistical measure of the maximum loss expected at a given confidence level, VaR often requires computationally intensive techniques such as Monte Carlo simulation.
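
To make the idea concrete, here is a minimal sketch of a Monte Carlo VaR calculation. The normal-returns assumption, portfolio size, volatility and confidence level are all illustrative, not figures from any particular trading system.

```python
import numpy as np

def monte_carlo_var(portfolio_value, mu, sigma, confidence=0.99,
                    horizon_days=1, n_sims=100_000, seed=42):
    """Estimate one-period Value at Risk by simulating portfolio returns.

    Assumes (for illustration only) that daily returns are normally
    distributed with mean `mu` and volatility `sigma`.
    """
    rng = np.random.default_rng(seed)
    # Simulate returns over the horizon and translate them into P&L.
    returns = rng.normal(mu * horizon_days, sigma * np.sqrt(horizon_days), n_sims)
    pnl = portfolio_value * returns
    # VaR is the loss exceeded only (1 - confidence) of the time.
    return -np.percentile(pnl, (1 - confidence) * 100)

# Example: $10M portfolio, 1% daily volatility, 99% one-day VaR.
print(round(monte_carlo_var(10_000_000, mu=0.0, sigma=0.01), 2))
```

Real implementations simulate full revaluations of each instrument rather than a single normal return, which is where the heavy computation comes from.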

If we consider the trading floor example again, risk management authorities would have to monitor and limit both the credit and market risks that an individual trader could take. Such an assessment must happen with blinding speed, even though measuring and controlling the potential trade risk requires intensive computation.
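
A pre-trade control of this kind boils down to comparing the incremental exposure of a proposed trade against the trader's remaining limits before the order is accepted. The sketch below illustrates the pattern; the limit figures and exposure fields are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Trade:
    trader_id: str
    notional: float         # market exposure added by the trade
    counterparty: str
    credit_exposure: float  # potential credit exposure added by the trade

# Hypothetical per-trader market limits and per-counterparty credit limits.
MARKET_LIMITS = {"trader_42": 50_000_000}
CREDIT_LIMITS = {"ACME Corp": 20_000_000}

def check_trade(trade, market_used, credit_used):
    """Return (approved, reason). Called synchronously before a trade books."""
    if market_used + trade.notional > MARKET_LIMITS.get(trade.trader_id, 0):
        return False, "market risk limit breached"
    if credit_used + trade.credit_exposure > CREDIT_LIMITS.get(trade.counterparty, 0):
        return False, "counterparty credit limit breached"
    return True, "within limits"

ok, reason = check_trade(Trade("trader_42", 10_000_000, "ACME Corp", 2_000_000),
                         market_used=45_000_000, credit_used=5_000_000)
print(ok, reason)  # False, market risk limit breached
```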

The large number of users and volume of simultaneous activity involved in this scenario puts heavy demands on system scalability and ease of maintenance. Thin-client, application-server architectures have become popular. The compute-intensive nature of this sort of risk assessment and control is a reason why financial services firms are at the leading edge of grid computing and other paradigms that promise increasing power at an affordable price. Risk assessment automation is also opening the door to extending the systems for customer self-service.

Decision-support environments support the assessment, response, information, communication and monitoring components of risk management. Analyzing risk exposure likely involves querying a lot of data in an unpredictable manner; risks must be analyzed both individually and as a group (to study portfolio effects, for example). Although response-time demands are less onerous than with transaction systems, risk modeling also involves significant computation.

Analysts are sophisticated in their use of data, and their access patterns often span heterogeneous sources. This usually calls for a data warehouse supported by BI tools with functionality for canned "burst" reporting, ad hoc querying and OLAP analysis. Data mining and neural network-based optimizers frequently add to the load carried by decision-support systems.

Corporate stakeholder services meet the requirements of the board, senior management, operations, internal audit groups, regulatory authorities and investors. These users typically need risk management information ranging from tactical control and monitoring reports to dashboard-based strategic reporting. They access systems in unpredictable ways: drilling down from high-level summaries to relevant granular detail, for example. The data's quality and currency are of paramount importance.

To be responsive, systems serving corporate stakeholders must be built rapidly and reconfigured easily, often by users and without the aid of technologists. SOA, layered over decision-support and selected transactional data stores, is gaining favor to increase agility. Another important technology is the operational data store, which can feed near real-time information via message-oriented middleware (MOM).
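
As a rough illustration of that pattern, the sketch below uses an in-process queue to stand in for the message bus. In production the events would arrive from a MOM product and the store would be the operational data store, so every name here is a placeholder.

```python
import queue

# Stand-in for the message bus; in practice a MOM product delivers these events.
bus = queue.Queue()
operational_store = {}  # stand-in for the operational data store (position by book)

def publish_position_update(book, delta):
    """Transactional systems publish position changes as they happen."""
    bus.put({"book": book, "delta": delta})

def apply_updates():
    """The ODS consumer applies each message, keeping near real-time positions."""
    while not bus.empty():
        msg = bus.get()
        operational_store[msg["book"]] = operational_store.get(msg["book"], 0) + msg["delta"]

publish_position_update("rates_desk", 1_500_000)
publish_position_update("rates_desk", -250_000)
apply_updates()
print(operational_store)  # {'rates_desk': 1250000}
```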

Executive Summary

Managing risk, always an essential business activity for companies in the insurance and financial services industries, is breaking out of its niche. With regulatory compliance touching multiple data sources and cross-functional business processes, risk management now requires an enterprise approach. More operations and lines of business face risk management scrutiny. And the kind of data, information and analysis generated by risk management has potential for strategic business objectives, such as improving customer profitability.

Companies are blind to security, fraud and competitive threats if they can't use intelligent systems to bring the big picture of enterprise risk management (ERM) into focus. Relevant technology for risk management includes business intelligence, regulatory compliance tools, data warehousing and integration, message-oriented middleware, and business process management. Risk management will shake up existing distinctions between transactional and decision-support systems as organizations try to deal with problems close to the source and apply intelligence ahead of potentially damaging events and transactions.

The first step toward ERM is to map architectural components to the types of risk that threaten your organization. Then, you can look at the timeliness and quality of data and information sources. An architectural approach will help you eliminate redundant efforts and match risk management with dynamic business needs.

One large institution (preferring to remain unnamed) has layered a flexible reporting architecture over a high-performance data warehouse to give senior executives unparalleled access to risk data. Immediately after a large credit event (for example, something on the scale of Enron's bankruptcy), the chief risk officer could discover the bank's overall exposure to that customer within a few clicks on a Web page. At most banks, it would take weeks to get this information.

BPM automation is essential given the quickening pace of business and the need for proactive risk management. In financial services, BPM automation plays a role in program trading, dynamic hedging engines, credit monitoring and limit trigger applications. A dynamic hedging engine, for example, needs up-to-date information about portfolio positions, which it will combine with market prices of traded instruments to perform risk calculations and execute trades — and then automatically rebalance the customer's portfolio. Organizations look to MOM to feed the hedging engines with timely data for CPU-intensive calculations.
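
To illustrate the flow just described, here is a stripped-down rebalancing step for a delta hedge: fresh positions and prices arrive (in practice via MOM), the engine computes the residual exposure, and it emits an offsetting order. The deltas, tolerance and figures are invented for the example.

```python
def rebalance_hedge(position_deltas, hedge_position, hedge_delta=1.0, tolerance=1_000):
    """Compute the hedge trade needed to bring net delta back toward zero.

    position_deltas: current price sensitivity of each portfolio position,
                     typically recomputed as fresh market prices arrive.
    hedge_position:  units currently held in the hedging instrument.
    """
    net_delta = sum(position_deltas) + hedge_position * hedge_delta
    if abs(net_delta) <= tolerance:
        return 0.0  # within tolerance, no trade
    # Trade enough of the hedge instrument to offset the residual exposure.
    return -net_delta / hedge_delta

# Portfolio is long 120,000 of delta; we already hold -100,000 in the hedge.
order = rebalance_hedge([80_000, 40_000], hedge_position=-100_000)
print(order)  # -20000.0: sell 20,000 more units of the hedge instrument
```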

Data repositories are broadly divided today into transactional and decision-support database systems. The former support operational systems and are "deep but narrow" in data modeling terms. For example, a transactional database may have all the detailed information necessary for trade pricing, trading and settlement for a particular product or line of business. Data models are optimized for insert and update activity using the vocabulary of the business line.

Decision-support databases (usually data warehouses) tend to be "wide and shallow": that is, the information is standardized to make it comparable across business lines. These systems usually contain a lot of history. The risk management user base for decision-support systems has tended to be small, numbering in the tens or hundreds, with cyclical spikes in activity at the end of the month or quarter. However, growing interest in risk management is raising the intensity of performance requirements for these systems.

Data typically moves between transactional and decision-support systems via batch snapshots of business-line-specific aspects of the data. Often with the help of ETL tools, the snapshot extracts are transformed into standardized views. Data warehousing professionals know only too well the myriad data consistency and quality problems this process usually uncovers. However, transformation is key because risk assessment, measurement and monitoring processes rely on timely, complete and high-quality information.
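
The core of that transformation step is mapping each business line's vocabulary onto a shared one. The sketch below shows the idea; the field names and business lines are invented for illustration, not drawn from any real warehouse model.

```python
# Each business line exports its snapshot with its own vocabulary; the ETL step
# maps those fields onto one standardized view.
FIELD_MAPS = {
    "fixed_income":   {"cpty": "counterparty", "mtm_usd": "market_value", "trade_dt": "as_of_date"},
    "retail_lending": {"customer": "counterparty", "balance": "market_value", "snapshot_date": "as_of_date"},
}

def standardize(record, business_line):
    """Rename business-line-specific fields into the shared warehouse vocabulary."""
    mapping = FIELD_MAPS[business_line]
    out = {std: record[src] for src, std in mapping.items()}
    out["business_line"] = business_line
    # Basic quality check: surface missing values instead of silently loading them.
    missing = [field for field, value in out.items() if value is None]
    if missing:
        raise ValueError(f"data quality problem, missing fields: {missing}")
    return out

print(standardize({"cpty": "ACME Corp", "mtm_usd": 1_250_000.0, "trade_dt": "2005-11-16"},
                  "fixed_income"))
```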

Drowning in Data, Dying for Information

Information is the set of explicit and unambiguous facts about the enterprise at any given point in time. Data, on the other hand, is merely the raw material of information. Data arises out of facts recorded throughout the enterprise, mostly by transaction systems. The process of transcribing facts into systems is corruptible: facts might be recorded inaccurately or spuriously — inadvertently or with malicious intent. Risk management architecture must prevent, or at least be able to unwind, data corruption to deliver quality information.

Banks struggling with Basel II compliance have discovered gaps in how they collect required transaction data, but the most pressing problem is translating that data into useful decision-support information. Inconsistent data semantics and poor data quality are the stumbling blocks.

Thus, organizations are formulating data governance frameworks that encompass a mix of technology, processes and policies covering all aspects of data management, including metadata consistency and data quality improvement. BI tools, ETL engines and metadata repositories are among the relevant technology solutions aimed at these problems. Data governance frameworks must define the links between transactional and decision-support systems; they must show IT how to detect quality problems and address them at the source rather than through data cleansing procedures.
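
One way to act on that guidance is to validate feeds against an agreed metadata contract before they leave the source system, rather than patching values downstream. The fields, rules and rating scale below are purely illustrative assumptions.

```python
# Illustrative metadata contract for one feed: expected type and validity rule per field.
CONTRACT = {
    "counterparty": (str, lambda v: len(v) > 0),
    "market_value": (float, lambda v: v == v),  # rejects NaN
    "rating":       (str, lambda v: v in {"AAA", "AA", "A", "BBB", "BB", "B", "CCC"}),
}

def validate(record):
    """Return the list of violations so the source system can fix them."""
    problems = []
    for field, (expected_type, rule) in CONTRACT.items():
        value = record.get(field)
        if not isinstance(value, expected_type) or not rule(value):
            problems.append(field)
    return problems

print(validate({"counterparty": "ACME Corp", "market_value": 1.25e6, "rating": "A+"}))
# ['rating'] -- flagged back to the source rather than silently cleansed downstream
```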

Illuminate the Unknown

Spurred especially by regulatory compliance, risk management has come a long way. Financial services organizations, for example, have become reasonably adept at managing market and credit risk. Operational risk is the new frontier. As organizations learn the precise drivers of operational risk, the ERM architecture and its attendant analytics will extend into more aspects of how a company operates.

Risk integration — the attribution of losses by risk type and the study of the interplay between them — is another growing area of interest. If a market portfolio suffers a drop in a company's bond prices due to a reduction in the issuer credit rating, for example, should the company look at the event as a market loss, credit loss or both? Answering such questions requires sophisticated analytic engines and increased integration between risk management systems.
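
A simple way to frame the attribution question is to split the bond's price change into the portion explained by general market moves and the portion explained by issuer-specific spread widening. The duration-based arithmetic below is a deliberately crude illustration with invented inputs.

```python
def attribute_bond_loss(duration, notional, rate_move_bp, spread_move_bp):
    """Split a bond P&L into market (rates) and credit (spread) components.

    Uses the first-order approximation: price change ~ -duration * yield change.
    """
    market_pnl = -duration * (rate_move_bp / 10_000) * notional
    credit_pnl = -duration * (spread_move_bp / 10_000) * notional
    return {"market": market_pnl, "credit": credit_pnl, "total": market_pnl + credit_pnl}

# Rates rallied 10bp, but the downgrade widened the issuer's spread by 80bp.
print(attribute_bond_loss(duration=5.0, notional=10_000_000,
                          rate_move_bp=-10, spread_move_bp=80))
# {'market': 50000.0, 'credit': -400000.0, 'total': -350000.0}
```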

Finally, risk management is increasingly an important source of knowledge for competitive advantage, a subject covered in depth in the next article in this cover package. Recent attention to capital measures (due partly to Basel II and similar initiatives) has fueled interest in using risk measures to determine risk-based pricing. Information on capital, best calculated and allocated in enterprise data repositories, must be fed to transactional systems where products are priced. This "reverse flow" of information from decision-support to transactional systems has huge implications for how both kinds of systems will be developed and deployed. Decision-support databases must step up to OLTP-level performance, availability and security requirements.
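
The "reverse flow" can be as simple as feeding an allocated-capital figure from the enterprise repository into the pricing calculation at the point of sale. The sketch below shows a RAROC-style minimum price for a loan; the hurdle rate, cost and loss assumptions are hypothetical.

```python
def risk_based_price(amount, funding_rate, expected_loss_rate,
                     allocated_capital, hurdle_rate, operating_cost):
    """Minimum annual price (in currency) for a loan to clear the capital hurdle.

    allocated_capital would come from the enterprise risk repository; every
    other input here is an illustrative assumption.
    """
    funding_cost = amount * funding_rate
    expected_loss = amount * expected_loss_rate
    capital_charge = allocated_capital * hurdle_rate
    return funding_cost + expected_loss + capital_charge + operating_cost

# $1M loan, 3% funding, 0.5% expected loss, $60K allocated capital, 15% hurdle.
print(risk_based_price(1_000_000, 0.03, 0.005, 60_000, 0.15, operating_cost=2_000))
# 46000.0 -> implies a minimum lending rate of about 4.6%
```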

To succeed in risk management and discover new business opportunities with risk information, we must shine a brighter light on the ERM business function and its supporting architecture. As interest in risk analysis grows, distinctions between data repositories will blur. Organizations must formulate an architectural view so they can handle not only unknown risks to the business but also unknown requirements for information transformation.

R. Dilip Krishna, a principal with ProRisk Group, has more than 15 years' experience in technology and business consulting. He has worked on risk management and Basel II implementations at several large North American banks and was chief architect of the Basel II program at CIBC. His current focus is on enterprise risk warehouses. Write to him at [email protected].
