The already-devastating hurricane season highlights the need for predictive analytical models that account for exceptional events.

Jen Underwood, Impact Analytix

September 7, 2017

Record-shattering Tropical Storm Harvey proved once again that Mother Nature is an extraordinary force that cannot be easily predicted. Risk modeling, catastrophe, and response simulation models are being tested this month by Harvey and now Irma. As actual data on storm damage flows in, companies like EigenRisk are updating risk modeling forecasts with real-time loss-assessment alerts.


Modeling extreme events

RMS, a global risk modeling and analytics firm, posted preliminary estimates of potential economic losses from Hurricane Harvey's wind, storm surge, and catastrophic inland flooding as high as $70-90 billion, and later estimates ranged still higher. Most of these losses will be uninsured because of inland flooding and limited private flood insurance options. Over seven million properties valued at $1.5 trillion are located in flooded areas, an exposure many times larger than that of devastating Hurricane Katrina.

Unprecedented rainfall, 51.9 inches in a matter of days, caused widespread flooding. Harvey has already broken all United States records for tropical storm-driven extreme rainfall, far exceeding Allison in 2001, Claudette in 1979, and Amelia in 1978, not only in volume but also in regional extent. Scientists say traditional analytical models of hurricane strength don't accurately predict destruction. A proposed new scale, the Cyclone Damage Potential Index, better projected Harvey's damage outcome.

Don’t remove outliers

Experienced analytics professionals know that even the best analytic techniques are imperfect. Historical data is used to build baselines, but extreme situations may not be present in the data being modeled, and analysts designing models may underestimate worst- and best-case variable values. If you are not including outlier events, your risk models may not be reliable. Per Douglas Hubbard, author of How to Measure Anything: Finding the Value of Intangibles in Business, the frequency of rare catastrophic events is much higher than most models assume.
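
To put Hubbard's point in rough numbers, here is a minimal sketch (my own illustration with made-up loss figures, not data from the article): fit a thin-tailed normal distribution and a heavy-tailed lognormal distribution to the same sample of annual losses, then compare how likely each model says a catastrophic year is.

```python
# Minimal sketch: thin-tailed vs. heavy-tailed models of the same loss history.
# All figures are hypothetical; the point is the gap between the two tail estimates.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical annual losses (in $M) drawn from a heavy-tailed process.
losses = rng.lognormal(mean=2.0, sigma=1.0, size=500)

threshold = 100.0  # a "catastrophic" year, far out in the right tail

# Thin-tailed view: normal distribution fitted to the sample mean and std.
p_normal = stats.norm.sf(threshold, loc=losses.mean(), scale=losses.std())

# Heavy-tailed view: lognormal distribution fitted to the same sample.
shape, loc, scale = stats.lognorm.fit(losses, floc=0)
p_lognorm = stats.lognorm.sf(threshold, shape, loc=loc, scale=scale)

print(f"P(loss > {threshold}) under the normal model:    {p_normal:.2e}")
print(f"P(loss > {threshold}) under the lognormal model: {p_lognorm:.2e}")
# The heavy-tailed fit typically assigns the catastrophic year a probability
# several orders of magnitude higher, which is Hubbard's point in miniature.
```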

Data-driven decision makers constantly face uncertainty, ambiguity, variability, and even extremes. In the case of Harvey flooding scenarios, probabilistic Monte Carlo and geospatial risk modeling simulation techniques outperformed predictions from older rainfall-runoff models and design event approaches that ignored probability. Modern probabilistic risk models, which treat event variables as random rather than fixed values, give a far better picture of potential variability. From what we witnessed, it is evident that more investment in risk modeling and early warning systems is needed.
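
As a rough illustration of that fixed-value versus random-variable distinction (a toy example with hypothetical numbers, not a hydrology model or any vendor's method), a single design-storm value can miss a flood outcome that a probabilistic ensemble reveals as rare but entirely possible:

```python
# Toy contrast: one fixed design-event rainfall vs. rainfall as a random variable.
# Numbers are hypothetical and chosen only to illustrate the idea.
import numpy as np

rng = np.random.default_rng(7)

design_event_in = 24.0     # single "design storm" rainfall assumption, in inches
flood_threshold_in = 40.0  # hypothetical total that overwhelms local drainage

# Probabilistic view: simulate many storm totals from a heavy-tailed distribution.
storm_totals = rng.gumbel(loc=10.0, scale=6.0, size=100_000)

print("Design event predicts flooding?  ", design_event_in > flood_threshold_in)
print("Simulated exceedance probability:", (storm_totals > flood_threshold_in).mean())
# The fixed value says "no flood"; the simulation says a flood is unlikely in any
# one event but far from impossible, which is exactly what planners need to know.
```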

Using risk models in your business

All analytics professionals should learn about risk modeling and apply it to their own domains. By using techniques like Monte Carlo Simulation, you not only get insight into what could happen in the future, you also get invaluable insight into what actions can be taken to mitigate risks.

Monte Carlo Simulation combines value ranges for non-deterministic or unknown variables with probability theory to create thousands of "what-if" scenarios. It is a stochastic analysis technique often used in stock market analysis, financial services, software development, the military, healthcare, utilities, transportation, pricing, and research. Simulation helps in situations where many decision model factors have inherent uncertainty, such as weather conditions, supplier costs, unknown market demand, and competitor pricing.

Computer-automated simulation models substitute analyst-provided ranges of random variable values according to expected statistical probability distributions. It is these probability distributions, evaluated across all possible combinations of the variables, that allow for a realistic assessment of scenario uncertainty.
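
As a concrete, if simplified, example, here is what such a simulation might look like in Python. The article does not prescribe a tool, and the distributions and figures below are hypothetical; commercial simulation add-ins and risk platforms follow the same basic pattern.

```python
# Minimal Monte Carlo sketch: draw each uncertain input from an assumed
# probability distribution, evaluate thousands of "what-if" scenarios,
# and summarize the resulting range of outcomes. All inputs are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000  # number of simulated scenarios

# Analyst-provided distributions for the uncertain variables.
demand_units = rng.normal(loc=50_000, scale=8_000, size=n)          # market demand
unit_cost = rng.triangular(left=8.0, mode=9.5, right=12.0, size=n)  # supplier cost
unit_price = rng.uniform(low=14.0, high=18.0, size=n)               # competitive pricing
fixed_costs = 150_000.0                                             # known, deterministic

# Evaluate the decision model across every simulated combination of inputs.
profit = demand_units * (unit_price - unit_cost) - fixed_costs

print(f"Mean profit:           ${profit.mean():,.0f}")
print(f"5th-95th percentile:   ${np.percentile(profit, 5):,.0f} to ${np.percentile(profit, 95):,.0f}")
print(f"Probability of a loss: {(profit < 0).mean():.1%}")
```

Instead of a single point estimate, the output is a distribution of outcomes, which is what makes the view of uncertainty realistic.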

One of the most actionable aspects of Monte Carlo simulation is Sensitivity Analysis. Sensitivity Analysis provides a ranked shortlist of the variables that have a statistically significant impact on outcomes. That knowledge helps decision makers focus on managing the influential variables instead of wasting time or money on irrelevant areas.
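
One common way to do this, sketched below rather than prescribed by the article, is to rank-correlate each simulated input with the simulated outcome; tornado charts are another popular option. The model and inputs are the same hypothetical ones used in the Monte Carlo example above.

```python
# Simple sensitivity analysis: rank inputs by how strongly they drive the
# simulated outcome, using Spearman rank correlation. Inputs are hypothetical.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n = 10_000

inputs = {
    "demand_units": rng.normal(50_000, 8_000, n),
    "unit_cost": rng.triangular(8.0, 9.5, 12.0, n),
    "unit_price": rng.uniform(14.0, 18.0, n),
}
profit = inputs["demand_units"] * (inputs["unit_price"] - inputs["unit_cost"]) - 150_000

# Rank-correlate each input with the outcome; a larger |rho| means more influence.
ranked = sorted(((name, spearmanr(values, profit)[0]) for name, values in inputs.items()),
                key=lambda item: abs(item[1]), reverse=True)

for name, rho in ranked:
    print(f"{name:>14}: {rho:+.2f}")
# The ranked list shows decision makers which variables are worth managing
# and which have little influence on the outcome.
```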

If you analyze data for a living, Monte Carlo simulation should be in your repertoire of analytical skills. Advances in GPU-powered computing, augmented analytics, automated feature engineering, deep learning, and other areas should also enhance the catastrophe risk models we rely on to ultimately save more lives.

About the Author(s)

Jen Underwood

Impact Analytix

Jen Underwood, founder of Impact Analytix, LLC, is a recognized analytics industry expert. She has a unique blend of product management, design and over 20 years of "hands-on" development of data warehouses, reporting, visualization and advanced analytics solutions. In addition to keeping a constant pulse on industry trends, she enjoys digging into oceans of data. Jen is honored to be an IBM Analytics Insider, SAS contributor, former Tableau Zen Master, and active analytics community member.

In the past, Jen has held worldwide product management roles at Microsoft and served as a technical lead for system implementation firms. She has launched new analytics products and turned around failed projects. Today she provides industry thought leadership, advisory, strategy, and market research.

Jen has a Bachelor of Business Administration in Marketing, cum laude, from the University of Wisconsin-Milwaukee and a post-graduate certificate in Computer Science - Data Mining from the University of California, San Diego.
