Why P&C insurers need a better way to operationalize hazard data

by Monique Nelson on August 13, 2019

Our industry is facing two major problems related to hazard data:

  1. There are more hazard and event data providers producing higher resolution footprints for a larger number of catastrophic events than ever before.
  2. This new quantity and quality of data is difficult (and, in some cases, impossible) for insurers to process fast enough to deploy timely response to their insureds.

If these problems sound all too familiar, you're not alone. At SpatialKey, working with our clients has highlighted a consistent struggle: there is a gap between the wealth of data available and a carrier's ability to quickly process, contextualize, and derive insight from it. Carriers who try to go it alone by relying on in-house data teams may find that they're spending more time operationalizing data than deriving value from it, particularly during time-sensitive events.

Catastrophe data has evolved tremendously, with our data partners, such as KatRisk, JBA, and Impact Forecasting, becoming more agile and producing outlooks not only during and after events, but well ahead of them. We're seeing a push among our data partners to be first to market with their forecasts as a means to establish competitive advantage. And, while this data race has the benefit of generating more information (and views of risk) around a given event, it also creates a whole lot of data for you, as a carrier, MGA, or broker, to keep up with and consume.

Three key considerations that arise while operationalizing data during time-sensitive events include:

  1. Continuous file updates make it difficult to keep up with and make sense of data
  2. Processing sophisticated data requires a new level of machine power, and without it, you may struggle to extract insights from your data
  3. Overworking key players on your data and/or GIS team leads to backlogs, delays, and inefficiencies

1) Continuous file updates throughout the life of an event

File updates can bring you steps closer to understanding the actual risk to your portfolio and the potential financial impact when an event is approaching and/or happening. At the same time, they can make it exceedingly difficult for in-house data teams and GIS experts to keep pace and understand what has changed in a given model. Data providers such as KatRisk, for example, are continuously refining their forecasts (see below) as more information becomes available during events like last year's hurricanes Michael and Florence.

KatRisk Florence inland flood

Using SpatialKey’s slider comparison tool, you can see KatRisk’s initial inland flood model for Hurricane Florence on the left, compared with the final footprint on the right. This prolonged flooding event led to multiple updates from KatRisk, enabling insurers to gain a solid understanding of potential flood extents throughout the event and well in advance of other industry data sources.
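
As a rough illustration of what "understanding what has changed" between model updates can involve, the sketch below compares two versions of a flood-depth raster cell by cell. It is a minimal example using the open-source rasterio and numpy libraries, not SpatialKey's tooling; the file names are hypothetical, and it assumes both versions share the same grid and a projected CRS in meters.

```python
# Minimal sketch: quantify how a flood footprint changed between two model updates.
# File names are hypothetical; assumes both rasters share the same grid and a
# projected CRS in meters, with band 1 holding flood depth.
import numpy as np
import rasterio

with rasterio.open("florence_flood_initial.tif") as src_initial, \
     rasterio.open("florence_flood_final.tif") as src_final:
    depth_initial = src_initial.read(1, masked=True)
    depth_final = src_final.read(1, masked=True)
    transform = src_final.transform

# Treat nodata cells as dry (depth 0) for the comparison
initial = depth_initial.filled(0)
final = depth_final.filled(0)

# Cells that were dry in the initial outlook but wet in the final footprint
newly_flooded = (initial <= 0) & (final > 0)
cell_area_km2 = abs(transform.a * transform.e) / 1e6  # cell size in km^2

print(f"Newly flooded area: {newly_flooded.sum() * cell_area_km2:,.1f} km^2")
print(f"Max depth increase: {np.max(final - initial):.2f} m")
```

Even a simple comparison like this has to be rerun every time a new file version arrives, which is part of why keeping pace manually becomes so difficult.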

To put the issue of frequency in perspective: over the course of Hurricane Florence, SpatialKey received five different file updates from just one data provider. Across all of the data partners we integrate with during an event like Florence, that adds up to upwards of 30 different datasets loaded into SpatialKey! If you’re bringing this type of data processing in-house, it’s both time-consuming and tedious; and, regrettably, you may end up with limited actionable information because you can’t effectively keep up with and make sense of all the data.

A solution that supports a data ecosystem and interoperability, like SpatialKey, creates efficiencies and eases the burden of operationalizing data, especially during back-to-back events like those we’ve seen over the last two hurricane seasons.

2) Hazard data sophistication

Beyond just keeping up with the sheer volume of data during the course of catastrophe events, being able to process high-resolution models and footprints is now a requirement. While SpatialKey has been architected to process large, sophisticated files, many legacy insurance platforms cannot consume data at the quality and resolution that today’s data providers are churning out.

High-resolution files are massive and a challenge to work with, especially if your systems were not designed for their size and complexity. Attempting to work with them in-house, even for a single small-scale event, requires a lot of machine power. Even the most sophisticated organizations will struggle to onboard files at 5-, 10-, or 30-meter resolution, such as the KatRisk example above, and the effort can become prohibitive, meaning you’ll have spent time and money on data that you won’t be able to use.
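
To make the machine-power point concrete, here is a minimal sketch of one common workaround when a footprint is too big to load whole: streaming the raster window by window. It uses the open-source rasterio library rather than anything SpatialKey-specific, and the file name is a hypothetical placeholder; a real workflow would also need to handle reprojection, mosaicking, and nodata conventions.

```python
# Minimal sketch: stream a very large hazard raster window by window instead of
# reading it into memory all at once. The file name is hypothetical.
import rasterio

max_depth = 0.0
wet_cells = 0

with rasterio.open("event_footprint_10m.tif") as src:
    print(f"Raster is {src.width} x {src.height} cells "
          f"(~{src.width * src.height / 1e6:.0f} million cells)")
    for _, window in src.block_windows(1):            # iterate the file's native tiles
        block = src.read(1, window=window, masked=True)
        if block.count():                              # skip all-nodata tiles
            max_depth = max(max_depth, float(block.max()))
            wet_cells += int((block > 0).sum())

print(f"Max depth: {max_depth:.2f} m, wet cells: {wet_cells:,}")
```

A footprint gridded at 5 or 10 meters over a Florence-scale flood extent can easily run to hundreds of millions of cells, which is why this kind of chunked processing, and the infrastructure to run it repeatedly as files update, matters.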

3) Dependency on in-house GIS specialists

The job of handling data around the clock puts an enormous strain on data teams, especially during seasons when back-to-back events are common. For example, during hurricanes Michael and Florence, our SpatialKey data team processed and made available more than 50 different datasets over the course of four weeks. This is an intense, all-hands-on-deck effort. Insurers who lack the expertise and resources to consume and work with the sheer volume and complexity of data put out by multiple data providers during an event may find the effort downright grueling, or even impossible.

Additionally, an influx of data often means overworking a key player on your data and/or GIS team, leading to backlogs and delays in making the data consumable for business users, who are under pressure to pinpoint impacted accounts, understand financial impact, and report to stakeholders.
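
For a sense of what "making the data consumable" ultimately feeds into, the sketch below shows the kind of overlay a business user needs: portfolio locations joined against a hazard footprint to flag impacted accounts and total their exposure. This is a minimal illustration using the open-source geopandas library, not SpatialKey's implementation; the file names and the "tiv" (total insured value) column are hypothetical placeholders.

```python
# Minimal sketch: overlay portfolio locations on a hazard footprint polygon layer
# to flag impacted accounts and total their exposure. File and column names
# (e.g., "tiv" for total insured value) are hypothetical placeholders.
import geopandas as gpd

portfolio = gpd.read_file("portfolio_locations.geojson")        # point geometries
footprint = gpd.read_file("event_footprint_polygons.geojson")   # hazard extent polygons

# Match coordinate reference systems before the spatial join
footprint = footprint.to_crs(portfolio.crs)

# Keep only locations that fall inside the footprint
impacted = gpd.sjoin(portfolio, footprint, how="inner", predicate="within")

print(f"Impacted locations: {len(impacted)} of {len(portfolio)}")
print(f"Exposed TIV: {impacted['tiv'].sum():,.0f}")
```

The analysis itself is straightforward; the bottleneck is getting every incoming footprint version cleaned, loaded, and matched to the portfolio fast enough for answers like these to still be useful.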

Fortunately, with a solution like SpatialKey, the role of a data team can be easily outsourced so your insurance professionals can go about analyzing, managing, and mitigating risk.

It’s time to automate how you operationalize data   

As catastrophe events grow in frequency and severity, it’s time to explore how you can easily integrate technology that will automate the process of operationalizing data. SpatialKey takes expert data from disparate sources and puts it into usable formats that insurers can instantly derive insight from and deploy throughout their organizations.

Imagine how much time and effort could be diverted toward extracting insight from data and reaching out to your insureds, rather than processing data, during time-critical events. There’s an opportunity cost to every hour your team spends on processing that could be put to better use elsewhere. With a solution like SpatialKey, data is readily consumable for your risk professionals, so they can focus on what matters most: understanding the financial impact, mitigating losses, and making customers happy through proactive outreach.

Check back next week for Part 3 of this series where we’ll quantify the actual time and inefficiencies involved in a typical manual event response workflow. Subscribe below to ensure you don't miss it! 

For a complete overview of how to make your event response operations run like clockwork, get the guide.

How can SpatialKey help you spend less time processing data and more time on driving better decisions? Reach out to me directly: monique.nelson@spatialkey.com

Monique Nelson has an extensive background serving the insurance industry, with 11 years in various business development roles at both SpatialKey and CoreLogic. Monique has a passion for the industry and holds a master's degree in actuarial science from Boston University. She currently runs SpatialKey's partnership program as Director of Data Product Management.

Topics: Event response, hazard data, catastrophe risk management

Visit our resources page for quick access to our P&C guides and webinars.