Hurricanes: Risk Analytics in a World of Uncertainty
The already-devastating hurricane season highlights the need for predictive analytical models that account for exceptional events.
Record-shattering Tropical Storm Harvey proved once again that Mother Nature is an extraordinary force that cannot be easily predicted. Risk modeling, catastrophe, and response simulation models are being tested this month by Harvey and now Irma. As actual data on storm damage flows in, companies like EigenRisk have been updating risk-model forecasts and issuing real-time loss-assessment alerts.
Modeling extreme events
RMS, a global risk modeling and analytics firm, posted preliminary estimates of potential economic losses from Hurricane Harvey's wind, storm surge, and catastrophic inland flooding as high as $70-90 billion, and later estimates ranged still higher. Most of these losses will be uninsured, because standard policies exclude inland flooding and private flood insurance options are limited. More than seven million properties valued at $1.5 trillion are located in flooded areas, an exposure many times larger than that of devastating Hurricane Katrina.
Unprecedented rainfall, 51.9 inches over several days, caused widespread flooding. Harvey has broken all United States records for tropical-storm-driven extreme rainfall, far exceeding Allison in 2001, Claudette in 1979, and Amelia in 1978, not only in volume but also in regional extent. Scientists say traditional analytical measures of hurricane strength don't accurately predict destruction. A proposed new scale, the Cyclone Damage Potential Index, better projected Harvey's damage outcome.
Don’t remove outliers
Experienced analytics professionals know that even the best analytic techniques are imperfect. Baselines are built from historical data, but extreme situations may not be present in the data being modeled, and analysts designing models can underestimate worst-case and best-case variable values. If you are not including outlier events, your risk models may not be reliable. Per Douglas Hubbard, author of How to Measure Anything: Finding the Value of Intangibles in Business, the frequency of rare catastrophic events is much higher than most models assume.
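To see why excluding outliers is dangerous, consider a minimal sketch with entirely made-up numbers: annual losses drawn from a heavy-tailed (Pareto-like) distribution, compared against a normal model fit to that same history. The normal fit, which effectively ignores tail events, drastically underestimates how often catastrophic years occur.

```python
from math import erf, sqrt

import numpy as np

rng = np.random.default_rng(1)

# Hypothetical annual-loss history ($M) with a heavy Pareto-like tail
losses = (rng.pareto(2.5, 10_000) + 1) * 10

# Fit a normal model to the same history
mu, sigma = losses.mean(), losses.std()

# How often does a "catastrophic" $100M+ year actually occur,
# versus what the normal model predicts?
threshold = 100
empirical = (losses > threshold).mean()
normal_tail = 0.5 * (1 - erf((threshold - mu) / (sigma * sqrt(2))))

print(f"Empirical tail frequency: {empirical:.4f}")
print(f"Normal-model estimate:    {normal_tail:.6f}")
```

The gap between the two numbers is Hubbard's point in miniature: a model that assumes thin tails will report that a catastrophic year is vanishingly rare, while the data itself says otherwise.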
Data-driven decision makers are constantly faced with uncertainty, ambiguity, variability, and even extremes. In the case of Harvey flooding scenarios, probabilistic Monte Carlo and geospatial risk-modeling simulation techniques outperformed predictions from older rainfall-runoff models and design-event approaches that ignore probability. Modern probabilistic risk models, which treat event variables as random rather than fixed values, give a far better picture of potential variability. From what we witnessed, it is evident that more investment in risk modeling and early warning systems is needed.
Using risk models in your business
All analytics professionals should learn about risk modeling and apply it to their own domains. Techniques like Monte Carlo simulation give you insight not only into what could happen in the future, but also into what actions can be taken to mitigate risks.
Monte Carlo simulation modeling combines value ranges for non-deterministic or unknown variables with probability theory to create thousands of "what-if" scenarios. It is a stochastic analysis technique often used in stock market analysis, financial services, software development, the military, healthcare, utilities, transportation, pricing, and research. Simulation helps in situations where many decision-model factors carry inherent uncertainty, such as weather conditions, supplier costs, unknown market demand, and competitor pricing.
Computer-automated simulation models substitute analyst-provided ranges of random variable values according to expected statistical probability distributions. It is these probability distributions, and evaluation across the many combinations of the variables, that allow realistic assessment of scenario uncertainty.
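The mechanics above can be sketched in a few lines of Python. Everything here is illustrative: the input distributions and the toy flood-loss formula are assumptions invented for the example, not a real catastrophe model.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000  # number of "what-if" scenarios

# Analyst-provided distributions for uncertain inputs (all hypothetical):
rainfall_in = rng.normal(40, 12, N)                  # total rainfall, inches
drainage = rng.uniform(0.5, 1.5, N)                  # drainage efficiency factor
exposure_m = rng.triangular(200, 500, 900, N)        # exposed property value, $M

# Toy loss model: losses grow once rainfall exceeds drainage capacity
flood_depth = np.maximum(rainfall_in - 25 * drainage, 0)
loss_m = exposure_m * np.minimum(flood_depth / 40, 1.0)

# The output is a full distribution of outcomes, not a single point estimate
print(f"Mean loss:            ${loss_m.mean():.0f}M")
print(f"95th percentile loss: ${np.percentile(loss_m, 95):.0f}M")
```

Each of the 100,000 scenarios is one random draw of every input; the resulting loss distribution lets you report tail percentiles instead of a single deterministic answer.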
One of the most actionable outputs of Monte Carlo simulation is sensitivity analysis, which provides a ranked shortlist of the variables that have a statistically significant impact on outcomes. That knowledge helps decision makers focus on managing the influential variables instead of wasting time or money on irrelevant ones.
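A simple way to get such a ranking is to rank-correlate each sampled input against the simulated output. The sketch below uses the same kind of toy flood-loss model as above, with made-up distributions, and orders the inputs by the absolute value of a Spearman rank correlation computed by hand.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 50_000

# Hypothetical uncertain inputs (assumed distributions)
inputs = {
    "rainfall": rng.normal(40, 12, N),
    "drainage": rng.uniform(0.5, 1.5, N),
    "exposure": rng.triangular(200, 500, 900, N),
}

# Toy loss model for one Monte Carlo run
loss = inputs["exposure"] * np.clip(
    (inputs["rainfall"] - 25 * inputs["drainage"]) / 40, 0, 1)

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx = x.argsort().argsort()
    ry = y.argsort().argsort()
    return np.corrcoef(rx, ry)[0, 1]

# Rank inputs by the strength of their association with the loss
ranking = sorted(inputs, key=lambda k: -abs(spearman(inputs[k], loss)))
print("Most influential first:", ranking)
```

In this contrived setup rainfall drives most of the variance, so it lands at the top of the list; in a real model the ranking tells you where risk-mitigation effort pays off.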
If you analyze data for a living, Monte Carlo simulation should be in your repertoire of analytical skills. Advances in GPU-powered computing, augmented analytics, automated feature engineering, deep learning, and other areas should also enhance the catastrophe risk models we rely on to ultimately save more lives.
Jen Underwood, founder of Impact Analytix, LLC, is a recognized analytics industry expert. She has a unique blend of product management, design, and over 20 years of hands-on development of data warehouses, reporting, visualization, and advanced analytics solutions.