Expertise can mean many things. Our last post from our CTO Brandon Purcell, How collaborating on technology changes the buy vs. build debate once and for all, touched on an important but often overlooked part of the equation. Brandon pointed out that to obtain a solution that truly helps you understand your data and identify the insights you need to make decisions, you have to collaborate with experts. I wholeheartedly agree. But I would add that you have to use the right kind of expert—with the right skill set—to successfully build and deploy a data analytics solution that will be readily adopted by your team.
What do I mean by that statement? Let me elaborate. Just because I understand the metrics used to describe hurricanes doesn’t mean I have the knowledge or the ability to collect wind speed data and construct a footprint. As a result, it makes absolute sense for me to rely on others who are far better equipped to supply this critical data to me. And yet there are insurers out there who, instead of leveraging expert data, will attempt to reinvent the wheel or simply go without.
I suppose it makes sense in some regard. Who else knows what insurers need when it comes to data better than the insurers themselves? After all, insurance is a complex business with each company developing a unique workflow that suits the geography, lines of business, and culture of their operations. (I should know, I’ve been helping to develop products specifically for insurers for the past two decades.)
But with all due respect, I have to say that while you absolutely know insurance, you don’t have the same expertise in technology, especially for managing the increasing amounts of data you need to make decisions every day. Portfolio data, claims data, hazard data—the volume and breadth of data is growing exponentially. To keep up, you have to leverage the experts in data technology.
Let’s consider managing hazard data for a moment. First, you have to set up and manage the licensing agreements to access data from the provider. But once you have the agreement in place, license restrictions may limit how many people on your team can access the data, so you may not be able to easily scale to meet changing volumes or demands. Then, once you have access, you need to make it available to the right people in your organization. With large amounts of hazard or other kinds of data, this is no trivial matter.
Simple choices about how to warehouse the data and in what format to store it can affect the utility your company can get out of it. If the data is hard to use, you may fail to achieve the benefits you were seeking in the first place. In the end, you will have a variety of costs on the front end just to have access, as well as maintenance costs to manage and update the data. And that’s before you even begin interpreting the data for underwriting decisions or comparing it to your own portfolio data. Not good.
Think about it this way. If you’re a property and casualty writer, and a client asks you for a workers’ compensation insurance policy, what do you do? You refer them to an underwriter who specializes in workers’ comp. Why? Because insuring employees for work-related injuries comes with a host of different concerns than insuring a building. Even inside your own area of expertise, you wouldn’t send an underwriter to do a claims adjuster’s job or have a customer service representative analyze your portfolio, would you?
This kind of expertise also applies to the data you need to drive critical decisions every day. Providers specialize in different types of data just like you specialize in different lines of insurance. JBA Risk Management, for example, is known for hazard mapping and catastrophe modeling for flood. Location, Inc., on the other hand, focuses on providing granular data for crime risk. By specializing, each provider can offer more valuable insights within their chosen area of expertise to the benefit of the insurers who partner with them.
As insurance experts, you might think your internal IT department has the design expertise to build standalone tools with an optimal user experience. But with what could amount to terabytes of hazard data—the complex kind of data we’re really talking about—your solution has to handle geospatial lookups and apply scoring matrices across one or more hazard layers. Connecting those dots is beyond an IT generalist who doesn’t specialize in Geographic Information Systems (GIS) or user experience design.
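To make that pairing of geospatial lookup and scoring matrix concrete, here is a deliberately stripped-down sketch. Everything in it is invented for illustration—the zone boundaries, layer names, and score values are hypothetical—and a production system would use real polygon geometries with a spatial index rather than bounding boxes:

```python
# Hypothetical sketch: place a location in a hazard zone via a geospatial
# lookup, then apply a scoring matrix for the hazard layer. Real systems
# query indexed polygon geometries; axis-aligned bounding boxes stand in
# for those here.

# Each zone: zone id -> (min_lon, min_lat, max_lon, max_lat)  [invented]
FLOOD_ZONES = {
    "high":   (-90.3, 29.8, -89.9, 30.1),
    "medium": (-90.8, 29.8, -90.3, 30.1),
    "low":    (-91.5, 29.8, -90.8, 30.1),
}

# Scoring matrix: hazard layer -> zone id -> score (5 = highest risk)  [invented]
SCORES = {
    "flood": {"high": 5, "medium": 3, "low": 1},
}

def lookup_zone(lon, lat, zones):
    """Return the id of the first zone whose bounding box contains the point."""
    for zone_id, (x0, y0, x1, y1) in zones.items():
        if x0 <= lon <= x1 and y0 <= lat <= y1:
            return zone_id
    return None  # location falls outside every mapped zone

def hazard_score(lon, lat, layer="flood"):
    """Combine the geospatial lookup with the scoring matrix for one layer."""
    zone = lookup_zone(lon, lat, FLOOD_ZONES)
    return SCORES[layer].get(zone) if zone else None
```

Even this toy version hints at why the work is specialized: multiply it by many layers, millions of locations, and polygon geometries instead of boxes, and efficient lookup becomes a GIS problem in its own right.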
On the other hand, choosing to work with the experts who specialize in making data easy to digest ensures you get the high-quality, durable solutions—backed by a dedicated Research and Development team—you need to stay ahead of your competitors. In addition, software from these kinds of experts can provide a better user experience while meeting the specific needs of each department and the broader needs of your entire enterprise.
More importantly, partnering with software experts that can provide an out-of-the-box solution tailored for the insurance workflow means you don’t have to wait to begin making better, more informed decisions. A top-tier insurer recently discovered this for themselves. Because they decided to purchase an insurance analytics solution, they discovered they could have avoided a $1 million claim. If they had tried to build something in-house to understand this important data, it likely would have taken far too long, and worse, the project may never have been finished in time to uncover that insight. So just like that, their decision to collaborate with an outside expert helped them develop a competitive advantage. Not to mention, the software virtually paid for itself.
The truth is that staying on top of all of this is complicated. After all, there’s never been more data with which to inform your decisions. As the number of specialized data providers continues to grow, your internal IT team can’t easily stay on top of bringing together all of the data sources, managing provider license agreements, and performing routine maintenance to keep everything working smoothly. At the same time, managing internal claims and portfolio data is challenging as well. To compete successfully, you need to be able to quickly visualize and understand what all of this information is really telling you so you can act on it...you need insight at your fingertips.
At SpatialKey, we collaborate with data providers to ensure our clients don’t ever have to be bothered with that kind of administrative overhead and costly infrastructure and maintenance. (Just sayin’.) Plus, we’ve already done the due diligence to ensure we’re working with high-quality data providers. That means accessing data is a snap for SpatialKey customers who aspire to write better risk, respond quickly to catastrophes, and build resilient portfolios.
From a purely business perspective, executives and shareholders expect your underwriters, exposure managers, and claims managers to inform their approach and make educated decisions with relevant data. To set your teams up for success—and enable them to focus on the specialty that drives your bottom line—you need them to spend less time finding and managing data and more time interpreting and acting on it. The added bonus? When multiple departments use the same data sources for their decision making, you gain consistent understanding and efficiency across your entire organization.
Today, every insurer understands the importance of being a data-driven enterprise. But to truly achieve this goal, you have to depend on the right partners, ones who specialize in simplifying how you access, interpret, and analyze data, whether it’s your own or from a third party. Being able to quickly access and interpret expert data sources is what will set you apart, and keep your business competitive today and for years to come.