Bridging the gap: Why data is a problem...and an opportunity

by Bret Stone on March 16, 2018

At industry events I generally come away invigorated by my conversations with clients and prospects. This year, however, the energy has been different: there's more urgency and emotion behind it. It’s clear the unprecedented events of 2017 took a toll on people, and there’s a compelling need to do something about it (especially with the 2018 hurricane season just around the corner).

Individuals and teams alike worked tirelessly through the catastrophes of 2017, and while the events have passed, the emotional fatigue remains. I can empathize. While insurers worked diligently to serve insureds during back-to-back events, at SpatialKey we worked around the clock to serve up timely, expert data to our insurance clients. Delivering data 24/7 put an enormous strain on our own employees, and we have a dedicated data team! Insurers who lack the expertise or resources to consume and work with the sheer volume and complexity of data being put out by multiple providers may have found it grueling. That exhaustion still lingers in the faces of the people I speak with at industry events.

And, what’s bubbling to the surface now is the underlying problem:

There’s a gap between the wealth of data now available and insurers’ ability to quickly process, contextualize, and derive insight from that data.

Not just an event response problem.

While the problem of transforming data into insight was illuminated by 2017’s catastrophic events, this is not just an event response problem. This is not an underwriting problem. This is not even a new problem! Events like those of 2017 touch the entire insurance community, insurers and solutions providers alike, and together we need to solve it.

What I've heard time and again is that everyone is frustrated by the lack of a process, and of an easy way, to consume the frequent and sophisticated data that expert providers put out during events like Harvey, Irma, Maria, the Mexico City earthquake, and the California wildfires. Insurance professionals are expected to use legacy or complex GIS tools to extract and consume that expert data. It doesn’t make sense.

There’s an opportunity cost in all the productivity those employees could be putting elsewhere.

Nobody has the time to teach themselves a complicated GIS solution to look at data when they’re working to deploy help to their customers in the wake of catastrophe.

No underwriter has the time to get up to speed on a GIS solution that takes years to learn when they’re trying to win business quickly.

It’s like giving your star quarterback a basketball and expecting him to win the Super Bowl with it. He’s talented, he can throw that ball, but he’ll never throw a winning pass with a basketball. It’s clunky, it’s cumbersome, and it just doesn’t fly as fast. In the same way, folks across claims, exposure management, and underwriting can’t quickly consume and understand data with legacy or complex tools that weren't created for their specific use cases.

With all the data comes challenge, and a call for ways to interpret information more efficiently.

We’d all like to think 2017 was an anomaly, that we won’t see a replay of such extreme events. In truth, it may be only a precursor of what’s to come. Even so, the insurance industry is poised to handle events like these better than ever before, because there’s now a wealth of expert data and models. That’s a good thing, and it energizes me! Data quality and modeling are improving all the time, with more accuracy, better science, and higher resolution, as we can attest from working with providers like NOAA, USGS, KatRisk, JBA, RedZone, Swiss Re, Impact Forecasting, and HazardHub.

But with all this data choice comes challenge, and a call for ways to interpret information more efficiently. We know it’s possible because we see our insurance clients succeeding every day at accessing, analyzing, and interpreting data within SpatialKey. While late 2017 was exhausting and overwhelming, it was also inspiring. I’m inspired to see so much data come to life in platforms like ours, and energized to see how empowering it is for the people using it.

InsurTech fills the data gap

Insurers, don’t try to solve this problem alone.

The solution is collaboration: partnering with experts whose technology is purpose-built to consume data quickly and produce intelligence that insurers can readily act on. And I’m not advocating collaboration simply because I’m at the helm of a company that fills this data gap. I have seen a lot of pain in the faces of my insurance friends, and quite honestly there’s a simple way to solve this.

Processing information is a basic need that has become incredibly complex and time consuming for insurers.

This work can be easily outsourced, freeing insurance professionals to focus on analyzing, managing, and mitigating risk. Insurers have an opportunity right now, before this year's hurricane and wildfire seasons, to empower underwriters with the intelligence they need to keep losses on the scale of 2017 from happening again, and to understand data without complex GIS solutions.

Start now. Your shareholders will thank you later.


We're here to help with your data needs. Contact us.

