Risk in Context

How Better CAT Modeling Data Enables Smarter Risk Capital Decisions

Posted by Michael Ducey October 29, 2015

Three years ago this week, Superstorm Sandy hit the East Coast, pushing ashore the massive storm surge that caused most of its damage. Because the storm surge was larger than predicted, modeling firms have since updated their catastrophe (CAT) models.

However, such post-storm adjustments don’t remove the human factor. And models are only as good as the data entered into them. Despite significant investments over the past decade, developing high-quality modeling data remains a challenge.

Unfortunately, getting data right — especially across a large property portfolio — requires administrative resources and technical skills that many companies don’t have. For example, many still submit inaccurate and incomplete data about their properties’ primary and secondary characteristics, resulting in unreliable projections of catastrophe losses.

Those projections are the foundation of insurers’ calculations to price property insurance and of insureds’ decisions about program limits, retentions, and other terms and conditions. This means that poor data can lead to valuable capital being misused, for example by overpaying for insurance. Or it could leave an organization underinsured and overexposed, potentially leading to extensive, unforeseen recovery costs after a loss.

But there’s good news. Several tools and resources are available to property owners to improve data quality and the accuracy of modeling results. For example, you can improve data quality through:

  • A natural hazard data management program. Review your portfolio to identify changes in the portfolio and related characteristics since your last catastrophe modeling exercise. You should also maintain and update natural hazard data in the same way that you maintain and update property values and construction, occupancy, protection, and exposure (COPE) data.
  • A pre-modeling engineering assessment. This can help you identify deficiencies and gaps in primary data and secondary attributes and formulate a plan to gather the high-priority data needed to quantify risk.
  • An initial model assessment. A broker’s data verification tools can help you identify any items that may be inaccurate or incomplete.
  • An engineering review. Site construction drawings, discussions with internal and external real estate resources, and loss prevention data are all good starting points from which natural hazard engineering specialists can validate existing property data and fill in many gaps.
  • Site visits. Engineering experts can perform detailed construction characteristic evaluations at high-impact or highly exposed facilities to validate and fill in remaining gaps in your data.
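To make the data-quality steps above concrete, here is a minimal sketch of the kind of pre-modeling audit a broker's data verification tools might perform: flagging property records with missing primary attributes or unknown secondary modifiers before they are fed into a CAT model. The field names and schema here are purely illustrative assumptions, not any modeling firm's actual format.

```python
# Hypothetical pre-modeling data audit: flag property records whose
# COPE (construction, occupancy, protection, exposure) attributes are
# missing or unknown. Field names are illustrative assumptions only.

REQUIRED_PRIMARY = ["construction", "occupancy", "year_built", "stories"]
SECONDARY_MODIFIERS = ["roof_geometry", "roof_anchor", "flood_protection"]


def audit_record(record: dict) -> list[str]:
    """Return a list of data-quality issues found in one property record."""
    issues = []
    for field in REQUIRED_PRIMARY:
        if not record.get(field):
            issues.append(f"missing primary attribute: {field}")
    for field in SECONDARY_MODIFIERS:
        if record.get(field) in (None, "", "unknown"):
            issues.append(f"unknown secondary modifier: {field}")
    return issues


def audit_portfolio(portfolio: list[dict]) -> dict[str, list[str]]:
    """Map each property ID to its outstanding issues; clean records are omitted."""
    return {
        rec["property_id"]: problems
        for rec in portfolio
        if (problems := audit_record(rec))
    }


# Example: one complete record and one with gaps.
portfolio = [
    {"property_id": "P1", "construction": "steel", "occupancy": "office",
     "year_built": 1998, "stories": 12, "roof_geometry": "flat",
     "roof_anchor": "bolted", "flood_protection": "barriers"},
    {"property_id": "P2", "construction": "masonry", "occupancy": "",
     "year_built": 1975, "stories": 3, "roof_geometry": "unknown"},
]
print(audit_portfolio(portfolio))
```

In this sketch, only the incomplete record (`P2`) appears in the output, with one issue per missing or unknown field. A real audit would, of course, also validate value ranges and cross-check against engineering and loss prevention data, as the steps above describe.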

Your risk advisor should have access to construction and engineering experts who can help you validate and update your primary and secondary modifier data, leading to less uncertainty in catastrophe models. That usually will translate into lower estimates of your catastrophic loss, premium savings for your organization, and improved recovery potential.

For more on this topic, read A Decade of Advances in Catastrophe Modeling and Risk Financing.

Related to:  Marsh Risk Consulting

Michael Ducey

Mike Ducey leads the Property Risk Consulting Practice in Marsh Risk Consulting’s Chicago office, where he manages a group that specializes in property-related consulting.