Whether you buy or build, getting ahead depends on leveraging the right experts

Posted on October 19, 2016 by Derek Blum



Expertise can mean many things. Our last post from our CTO Brandon Purcell, How collaborating on technology changes the buy vs. build debate once and for all, touched on an important but often overlooked part of the equation. Brandon pointed out that to obtain a solution that truly helps you understand your data and identify the insights you need to make decisions, you have to collaborate with experts. I wholeheartedly agree. But I would add that you have to use the right kind of expert—with the right skill set—to successfully build and deploy a data analytics solution that will be readily adopted by your team.

What do I mean by that statement? Let me elaborate. Just because I understand the metrics used to describe hurricanes doesn’t mean I have the knowledge or the ability to collect wind speed data and construct a footprint. As a result, it makes absolute sense for me to rely on others who are far better equipped to supply this critical data to me. And yet there are insurers out there who, instead of leveraging expert data, will attempt to reinvent the wheel or go without.

I suppose it makes sense in some regard. Who else knows what insurers need when it comes to data better than the insurers themselves? After all, insurance is a complex business with each company developing a unique workflow that suits the geography, lines of business, and culture of their operations. (I should know, I’ve been helping to develop products specifically for insurers for the past two decades.)

But with all due respect, I have to say that while you absolutely know insurance, you don’t have the same expertise in technology, especially for managing the increasing amounts of data you need to make decisions every day. Portfolio data, claims data, hazard data—the volume and breadth of data is growing exponentially. To keep up, you have to leverage the experts in data technology.

Let’s consider managing hazard data for a moment. First, you have to set up and manage the licensing agreements to access data from the provider. But once you have the agreement in place, license restrictions may limit how many people on your team can access the data, so you may not be able to easily scale to meet changing volumes or demands. Then, once you have access, you need to make it available to the right people in your organization. With large amounts of hazard or other kinds of data, this is no trivial matter.

Simple choices about how to warehouse the data and in what format to store it can affect the utility your company can get out of it. If the data is hard to use, you may fail to achieve the benefits you were seeking in the first place. In the end, you will have a variety of costs on the front-end just to have access, as well as maintenance costs to manage and update the data. And that’s before you even begin interpreting the data for underwriting decisions or comparing it to your own portfolio data. Not good.

Think about it this way. If you’re a property and casualty writer, and a client asks you for a workers compensation insurance policy, what do you do? You refer them to an underwriter who specializes in workers comp. Why? Because insuring employees for work-related injuries comes with a host of different concerns than insuring a building. Even inside your own area of expertise, you wouldn’t send an underwriter to do a claims adjuster’s job or have a customer service representative analyze your portfolio, would you?

This kind of expertise also applies to the data you need to drive critical decisions every day. Providers specialize in different types of data just like you specialize in different lines of insurance. JBA Risk Management, for example, is known for hazard mapping and catastrophe modeling for flood. Location, Inc., on the other hand, focuses on providing granular data for crime risk. By specializing, each provider can offer more valuable insights within their chosen area of expertise to the benefit of the insurers who partner with them.

As insurance experts, you might think your internal IT department has the design expertise to provide an optimal user experience when building standalone tools. But with what could amount to terabytes of hazard data—the kind of complex data we’re really talking about—your solution has to handle geospatial lookups as well as apply scoring matrices for one or more hazard layers. Connecting those dots isn’t something an IT team without specialists in Geographic Information Systems (GIS) or design can do.
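To make that lookup-and-score step concrete, here is a minimal, hypothetical sketch of the idea: given a location, find which hazard zone it falls in and translate that zone into an underwriting score. The layer, zone names, and score values are invented for illustration, and the shapely library stands in for a real GIS stack; a production solution would run indexed spatial queries against far larger provider datasets.

```python
from shapely.geometry import Point, Polygon

# Hypothetical hazard layer: each zone label maps to a polygon.
# Real layers come from data providers and contain thousands of shapes.
flood_layer = {
    "flood_zone_high": Polygon([(-81.7, 30.2), (-81.5, 30.2), (-81.5, 30.4), (-81.7, 30.4)]),
    "flood_zone_low":  Polygon([(-81.9, 30.0), (-81.7, 30.0), (-81.7, 30.2), (-81.9, 30.2)]),
}

# Hypothetical scoring matrix translating a zone into an underwriting score.
score_matrix = {"flood_zone_high": 90, "flood_zone_low": 40, "outside": 10}

def score_location(longitude: float, latitude: float) -> int:
    """Return the hazard score for the zone containing the location."""
    point = Point(longitude, latitude)
    for zone, shape in flood_layer.items():
        if shape.contains(point):
            return score_matrix[zone]
    return score_matrix["outside"]

print(score_location(-81.6, 30.3))  # falls in the high zone -> 90
```

Multiply this by several hazard layers, millions of locations, and regular data updates, and the need for specialized GIS and engineering expertise becomes clear.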

On the other hand, choosing to work with the experts who specialize in making data easy to digest ensures you get the high-quality, durable solutions—backed by a dedicated Research and Development team—you need to stay ahead of your competitors. In addition, software from these kinds of experts can provide a better user experience while meeting the specific needs of each department and the broader needs of your entire enterprise.

More importantly, partnering with software experts that can provide an out-of-the-box solution tailored for the insurance workflow means you don’t have to wait to begin making better, more informed decisions. A top-tier insurer recently learned this firsthand: because they purchased an insurance analytics solution, they were able to see that they could have avoided a $1 million claim. Had they tried to build something in-house to understand this important data, it likely would have taken far too long, and worse, it may never have been finished in time to uncover that information at all. Just like that, their decision to collaborate with an outside expert helped them develop a competitive advantage. Not to mention, the software virtually paid for itself.

The truth is that staying on top of all of this is complicated. After all, there’s never been more data with which to inform your decisions. As the number of specialized data providers continues to grow, your internal IT team can’t easily stay on top of bringing together all of the data sources, managing provider license agreements, and performing routine maintenance to keep everything working smoothly. At the same time, managing internal claims and portfolio data is challenging as well. To compete successfully, you need to be able to quickly visualize and understand what all of this information is really telling you so you can act on it...you need insight at your fingertips.

At SpatialKey, we collaborate with data providers to ensure our clients don’t ever have to be bothered with that kind of administrative overhead and costly infrastructure and maintenance. (Just sayin’.) Plus, we’ve already done the due diligence to ensure we’re working with high-quality data providers. That means accessing data is a snap for SpatialKey customers who aspire to write better risk, respond quickly to catastrophes, and build resilient portfolios.

From a purely business perspective, executives and shareholders expect your underwriters, exposure managers, and claims managers to inform their approach and make educated decisions with relevant data. To set your teams up for success—and enable them to focus on the specialty that drives your bottom line—you need them to spend less time finding and managing data and more time interpreting and acting on it. The added bonus? When multiple departments use the same data sources for their decision making, you gain consistent understanding and efficiency across your entire organization.

Today, every insurer understands the importance of being a data-driven enterprise. But to truly achieve this goal, you have to depend on the right partners, ones who specialize in simplifying how you access, interpret, and analyze data, whether it’s your own or from a third party. Being able to quickly access and interpret expert data sources is what will set you apart, and keep your business competitive today and for years to come.

Posted in Insurance, Technology, Collaboration | Leave a reply

Hurricane Matthew by the numbers: What we know so far

Posted on October 12, 2016 by Heather Munro




Blue skies may be back, but the impact from Hurricane Matthew is still being felt. As your claims team works around the clock managing your event response, information on the extent of the damage is critical to your ability to respond as quickly as possible.

SpatialKey seamlessly integrates the latest event footprints from third-party data providers like KatRisk, Impact Forecasting, and NOAA so you have access to the up-to-date data you need to take action. You’ll be glad to know that footprints for wind, surge and inland flood for Hurricane Matthew from KatRisk are now integrated within SpatialKey and available for use.

Below, we’ve rounded up the latest statistics on the initial losses from Matthew, so you can continue to evaluate the impact of the storm on your book of business.

Risk modeling firm CoreLogic estimates insured losses from Matthew will likely range between $4 billion and $6 billion. CoreLogic’s initial figure includes losses from wind and storm surge, but not additional flooding, business interruption, or building contents. The firm also places the storm’s losses higher than those from hurricanes Floyd (1999) and David (1979), but well below hurricanes Katrina (2005) and Sandy (2012).

On top of losses from wind and storm surge, early estimates indicate that the evacuations may result in $10 to $15 billion in losses from economic disruption, according to Chuck Watson, a disaster modeler with Enki Research.

Hurricane Matthew was notable for ending a nine-year streak without an Atlantic Basin Category 5 hurricane. The powerful storm, which weakened as it moved from Haiti to the U.S., forced three million people to evacuate. By the time it was over, it had caused coastal erosion, wind damage, and freshwater flooding across five states: Florida, Georgia, North Carolina, South Carolina, and Virginia.


In North Carolina, rising floodwaters left more than 1,500 people in need of rescue. Photo credit: Associated Press

In the Caribbean, Matthew reached Category 5 status with peak gusts up to 166 mph. Haiti was the hardest hit, with more than 1,000 people killed. Now, a resulting cholera outbreak threatens to cause further devastation in an area with a history of earthquakes and hurricanes and a low insurance penetration rate.

At the height of the storm, about 2.2 million people—1 million in Florida alone—lost power.


 Images from NASA showing power outages during Hurricane Matthew. Photo credit: NASA

Peak wind gusts in the U.S. during Matthew ranged from 69 to 107 mph. At press time, the latest U.S. death toll is 33, but that number could go up as flooding continues over the next few days.

Outside of power loss and severe winds, storm surge was another significant concern. Fernandina Beach, Florida, experienced a storm surge of 9.88 feet above normal, and Ft. Pulaski, Georgia, encountered record tide levels and a storm surge just under eight feet.

While it’s too early to tally Matthew’s total impact in the U.S., insured losses will undoubtedly increase due to flooding, especially in North Carolina where flooded rivers that wash through farms and coal ash sites may spread toxins through miles of waterways.

The American Red Cross is already spearheading recovery efforts for the many people affected by the powerful event. To support this effort or learn more about how you can help, click here.

In the meantime, as your claims adjusters manage the growing number of claims, remember SpatialKey can help. To find out more or to access the latest Hurricane Matthew data and analyze it against your own portfolio in SpatialKey, contact us today.

Posted in Hurricane, Insurance, Flood, Event response | Leave a reply

How collaborating on technology changes the buy vs. build debate once and for all

Posted on October 11, 2016 by Brandon Purcell



Our last post by CEO Tom Link, It’s time to evolve how you collaborate in a data-driven world, got me thinking about the importance of partnerships. Who you collaborate with to meet your business objectives matters, especially when it comes to technology. Having the right UX, GIS, and R&D resources on your side can mean the difference between amazing usability and software that slows you down.

Prospects often tell me they’re frustrated. Developing software for an industry as complex—and as data-driven—as insurance isn’t easy. I’ve been developing software for 18 years, and I can tell you that the decision to “buy or build” a solution isn’t as black and white as it sounds.

More than half of IT projects fail, according to CIO Magazine. A lack of product design resources and a failure to align outcomes with business goals are two reasons why. Solutions never get deployed, don’t meet the business’s requirements, or are so difficult to use that they are never adopted by the users they are supposed to help. No wonder you’re frustrated. I can’t help but think how much time and money have been wasted.

Chances are, you already have some sort of homegrown data analytics system. And you’re considering whether to extend that to add new features or to retire the solution entirely and buy a new one off the shelf. I get why insurers might decide to create their own in-house solution. After all, compared to buying something new, it can seem like there are lower up-front costs. Not to mention that third-party providers haven’t always delivered solutions that keep pace with the demands of your business. Plus, having control over the end product makes it seem more customizable.

But building software is more complicated (and riskier) than it sounds. Especially for something like risk selection analytics for underwriting. These solutions need to support complex hazard models from different kinds of content providers. I've heard from many prospects that building something like that is just “too risky.” That’s because it’s difficult to maintain and scale as new intelligence becomes available in the market.

When it comes to building software, I wish senior management understood a fundamental truth I’ve learned over the years: The true costs go up significantly when you try to do it all on your own.

In-house development costs are still costs

Spending your budget in-house can often feel like you are saving money. After all, you have the hardware, the programmer’s time is already paid for, and you have a trusted team of folks ready to get the job done. Not so fast. If your company is using this type of thinking, you are (or will be) in for a surprise.

Development is much more expensive than you might think. The real dollar amount you need to consider includes: the cost of your development team, the time taken from your business operations during all phases of development, and the opportunity cost of the work not done on another IT project. Of course, this only applies if you have the expertise within your in-house team to build and add the desired features to your existing solution.

I can’t tell you how many times I’ve seen internal projects end up costing way over the estimate. Or worse, the in-house IT team doesn’t have the expertise to deliver the final functionality. The truth is that developing the solution in-house is never a one-and-done cost.

Once it’s built, you still have to budget for maintenance

Maintaining an application and keeping it running is an expensive proposition. Software has bugs and requires a team to address them as they arise. Once a solution is built, the team typically moves on to another project. I’ve even seen cases where the solution’s primary developer leaves the company. With no one dedicated to keeping the solution running smoothly, your users are often stuck having to figure out a workaround and can lose valuable time, which as you know, is another cost to your business.

At the same time, once your users get the solution in their hands, you can bet they’ll ask for new features and capabilities in future releases. Evolving an in-house solution from version to version also requires a dedicated team—and if your internal resources are on to the next IT project, your users are left with a version 1.0 with limited functionality. If they’re using your solution at all.

An annual enterprise software license can cost upwards of $100,000 a year and yes, over ten years, that’s $1 million. But unlike building in-house, included in that price is maintenance, updates, support, and most importantly, a collaborative partner who understands that you want software to solve your headaches, not cause them. You also get the added benefit of having someone outside your company focused solely on meeting your deliverables on time and on budget.

The right partnership delivers returns on your investment

Collaborating with the right partner gives you a competitive edge. (Stay tuned for more on this—our next blog post will cover collaborating with the right experts and partners to get ahead.) When you can get a more advanced solution faster and for less money, you have to ask yourself: why would you build one on your own? I like to tell clients that building your own solution is a lot like building your own car from scratch: costly and time-consuming. Ultimately, it’s faster, easier, and less expensive to buy a car with the features you like than to try to design and build one yourself.

Just as you might choose a car with a back-up camera or built-in Bluetooth interface, you have options today that allow you to adapt and integrate your existing systems across your unique enterprise. Hybrid solutions excel by leveraging the best of your in-house systems and integrating with third-party solutions through APIs. In short, you can have the flexibility you want while lowering long-term costs. You can also be up and running a heck of a lot faster.

Ideally, integrating your core underwriting system with your advanced risk selection solution gives your business the power to act on information quickly. Let’s say you have an in-house underwriting solution that handles your complex rules and workflow. And, you need to modernize and select better risks using risk models for flood, hail, and tornado. Building a system to integrate this third-party data, perform spatial lookups, and maintain the risk data will require a robust and flexible infrastructure and sophisticated geospatial analytics.

That’s, in part, because the amount of data available—from government sources, third-party providers, and various risk models—is growing at an exponential rate.  Managing, manipulating, and sharing this information efficiently across your organization so it’s easy to understand and act on will take advanced systems that can handle the load. While this sounds simple enough to build, it's pretty complex. But by leveraging an outside solution—one that gives you access to data in a workflow tailored to your needs—it’s possible to have the best of both worlds. Really.
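As a rough illustration of this hybrid, API-based approach, the sketch below shows an in-house underwriting step calling out to a third-party risk-scoring service. The endpoint, parameters, and response fields are hypothetical placeholders rather than an actual SpatialKey or vendor interface; the point is simply that the spatial lookups and hazard data maintenance stay with the provider while your own system keeps the rules and workflow.

```python
import requests

# Hypothetical third-party endpoint; a real integration would use the
# provider's documented API, authentication, and response schema.
HAZARD_API = "https://api.example-hazard-provider.com/v1/score"

def fetch_hazard_scores(latitude: float, longitude: float, perils: list) -> dict:
    """Ask the provider for hazard scores at a location for the given perils."""
    response = requests.get(
        HAZARD_API,
        params={"lat": latitude, "lon": longitude, "perils": ",".join(perils)},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()  # e.g. {"flood": 72, "hail": 15, "tornado": 30}

def underwriting_decision(location: dict) -> str:
    """In-house rule: refer the risk if any peril score exceeds an appetite threshold."""
    scores = fetch_hazard_scores(location["lat"], location["lon"],
                                 ["flood", "hail", "tornado"])
    return "refer" if max(scores.values()) > 80 else "accept"
```

Your complex underwriting rules and workflow stay in-house; the heavy geospatial lifting happens behind the API.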

Working with software experts, you can also create economies of scale. At SpatialKey, we develop purpose-built solutions that bridge the gap between underwriting, exposure management, and claims, so our clients can easily make decisions from the same baseline of understanding. When you tap into a third-party solution designed specifically for insurance, you and your internal IT team no longer have to reinvent the wheel. And unlike purchasing a car, your solution will just keep getting better and better over time, thanks to seamless updates by your provider.

So the next time you debate the merits of building or buying a solution, consider all of the shades of gray that will enable you to truly meet your business objective. Who you choose to collaborate with has to be part of your discussion. Partnering with software experts, instead of building in-house, is a clear choice that will keep the wheels of your business turning and more than pay for itself in the long run.
Posted in Insurance, Technology, Collaboration | Leave a reply

Hurricane Matthew: 3 best practices your claims team can use this weekend

Posted on October 7, 2016 by Heather Munro



Photo credit: Mark Wilson via Getty Images.

Now that Hurricane Matthew is hovering about 35 miles off Florida’s east coast, it’s time for your claims team to begin estimating potential exposure and jump into action. The Category 3 hurricane is expected to pack a punch this weekend, with a powerful combination of storm surge flooding, rainfall flooding, and destructive winds impacting areas from northeast Florida to the southern part of North Carolina. Jacksonville, Savannah, and Charleston are expected to be hard hit. While it’s too early to say what will happen, the level of risk is clearly elevated and insurers are on high alert.

At the same time, Tropical Storm Nicole picked up speed near Bermuda yesterday afternoon, setting a new record. This is the latest time of year that two storms in the North Atlantic Ocean have had winds over 105 mph simultaneously, according to NASA’s Goddard Space Flight Center.

As Hurricane Matthew evolves, you can access the current storm path in SpatialKey and adjust your response strategy as needed. Now’s the time to begin gearing up for the surge in claims, informing senior management, stakeholders, and investors about the possible financial impact, and most importantly, staying in touch with and providing relief to your customers.

While you are mobilizing your claims team to handle the region’s most powerful storm in nearly ten years, we’ve put together three best practices to help you proactively prepare for the incoming claims from Hurricane Matthew.

1. Understand your exposure and estimate potential claims costs. Now that it’s clear Matthew will cause flooding and wind damage in Florida, Georgia, and the Carolinas, it’s critical to get a sense of the storm’s magnitude and its potential impact to your portfolio. As the storm unfolds, be sure to monitor, analyze, and re-analyze the changing impact—SpatialKey makes it easy to understand portfolio impact throughout the storm. Being able to filter by construction type, location, or line of business with a tool like SpatialKey can also help you understand the drivers of loss and how they contribute to storm-related claims in the aggregate.

Quickly determine your exposure and estimate potential claims costs from Hurricane Matthew within SpatialKey.

2. Communicate the potential financial impact to senior management, stakeholders, and investors.
If your book of business is heavily concentrated in Jacksonville, for example, an area that hasn’t been under a hurricane warning for 17 years, your chief underwriting officer will most likely want to see a summary of the impact. With a click of a button, quickly run reports to visualize the Total Insured Value or Exposed Limits in the affected area and share them with your team using the SpatialKey Hurricane Forecasts event response app.

3. Quickly get in touch with affected customers. While many of your customers may have already evacuated, you can still contact those further inland with tips for preparing their properties to weather the storm. By understanding the path and potential impact of the storm, you can proactively reach out to your customers to demonstrate that you're invested in their well-being, making them more likely to renew when the time comes.

While Matthew is your main concern this weekend, there are nearly two months left in the 2016 Atlantic hurricane season. SpatialKey Hurricane Forecasts can help you manage your response for Matthew, Nicole, and any other hurricanes on the horizon. To find out more about how SpatialKey can help, contact us today.

Posted in Hurricane | Leave a reply

Category 3 Hurricane Matthew barrels toward Southeastern US: How insurers can act now with confidence

Posted on October 5, 2016 by Heather Munro



Easily track Hurricane Matthew’s storm path in SpatialKey.

As the most powerful Atlantic storm since 2007, Hurricane Matthew is the first Category 3 hurricane to make landfall in Haiti in 52 years. Forecasters are predicting Matthew will continue to head north towards Florida, Georgia, South Carolina, and North Carolina starting Thursday evening. Florida and South Carolina have already begun evacuations, as the states prepare for the possibility of significant wind and storm surge damage.

As you know, meteorologists have some sense of what might happen, but weather patterns can change quickly. The hurricane could strengthen or weaken, head out to sea, or reverse course and veer inland. Forecasters identify the most likely track of the storm as it evolves, but the uncertainty increases as they project further into the future. The cone of uncertainty, as it’s typically called, captures the fact that we may see changes in speed, size, severity, and direction.

As the storm unfolds, analyzing Hurricane Matthew’s storm track against your own portfolio data will help you quickly prepare for the event.

  1. Access Hurricane Matthew’s current storm path to outline your plan of action. Once you have the storm path, which is provided by NOAA and seamlessly integrated within SpatialKey, you can track the storm as it’s happening and efficiently manage your event response.
  2. Understand your exposure concentration, not only to determine potential claims costs, but also to effectively mobilize your claims team to respond to insureds. To quantify how much of your portfolio is likely to be exposed to hurricane-force winds, you can filter and segment that data based on characteristics such as wind speed, construction type, or line of business (see the sketch after this list).
  3. Evaluate the path of the storm and run what-if scenarios to understand the potential impact to your portfolio. As the storm shifts, accessing the latest data will help your team respond to senior management and support business operations as quickly as possible.
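To illustrate the filtering in point 2, here is a minimal, hypothetical sketch using pandas on a portfolio table that has already been joined to a wind footprint. The column names, wind threshold, and figures are invented for illustration; in practice this segmentation happens inside SpatialKey or your own exposure-management tooling against real portfolio and footprint data.

```python
import pandas as pd

# Hypothetical portfolio snapshot, with the modeled peak gust at each location.
portfolio = pd.DataFrame({
    "policy_id":         [101, 102, 103, 104],
    "construction_type": ["wood frame", "masonry", "wood frame", "steel"],
    "line_of_business":  ["homeowners", "commercial", "homeowners", "commercial"],
    "tiv":               [350_000, 2_100_000, 275_000, 5_400_000],  # total insured value
    "peak_gust_mph":     [96, 74, 112, 58],
})

# Keep only locations exposed to hurricane-force winds (74 mph and above).
exposed = portfolio[portfolio["peak_gust_mph"] >= 74]

# Segment exposed insured value by line of business and construction type.
summary = exposed.groupby(["line_of_business", "construction_type"])["tiv"].sum()
print(summary)
```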

It’s important to understand that about one-third of the time, the most intense winds and storm surge impacts associated with the storm will veer outside of the cone of uncertainty, according to the National Hurricane Center. All the more reason to stay on alert—and track the storm—as Hurricane Matthew unfolds. Using SpatialKey’s Hurricane Forecasts app can help you turn the cone of uncertainty into decisive action.

To find out more about accessing the latest Hurricane Matthew data and analyzing it against your own portfolio in SpatialKey, contact us today.

Posted in Hurricane, Event response | Leave a reply

Summer 2016 catastrophe wrap-up: Understand your exposure and prepare for future events

Posted on September 30, 2016 by Heather Munro



Photo credit: shutterstock.com

With hail, hurricane, and wildfire seasons in full swing, summer is always a busy time for claims adjusters. This past summer was no exception, with several major events happening across the globe.

While California battled multiple simultaneous wildfires, Hurricane Earl caused mudslides in eastern Mexico. A tropical storm dropped 30 inches of rain in parts of Louisiana, which experienced its worst flooding since Hurricane Katrina in 2005. The picturesque Italian town of Amatrice was completely destroyed by a magnitude-6.0 earthquake. Most notably, Canada experienced one of the insurance industry’s costliest wildfire events when the Fort McMurray wildfires spread rapidly and caused $2.5 billion in insured losses.

From June to August, 10 different weather disasters each caused more than $1 billion in losses in the U.S., according to CBS News. In addition, Swiss Re reports that global insured losses for the first half of 2016 reached $31 billion, 51% higher than in the same period last year.

With so much activity, you were undoubtedly busy determining your exposure to these events and sending adjusters into the field to assess damage and pay claims as quickly as possible. Now that summer is officially over, it’s a good time to look back at these events to examine your losses sustained, improve your claims response efforts, and identify potential loss drivers to better mitigate future claims costs.

Assessing the impact, thinking ahead

No matter if it’s a flood, hurricane, or wildfire, your first priority during an event is to respond quickly to your insureds. Visualizing which of your policyholders are affected within flood footprints, hurricane paths, and wildfire extents is the starting point to getting the information you need to take action.

During the active summer event season, SpatialKey clients used our event response apps, which integrate data from a variety of third-party providers, including KatRisk, JBA Risk Management, Impact Forecasting, Willis Re, NOAA, GeoMAC, and USGS, to manage their claims response efforts quickly and efficiently.

Not only did clients use our apps to understand events as they were happening, they also gained valuable insights about their portfolio performance, which they can now use to prepare for future events. With SpatialKey’s geospatial insurance analytics, you can compare your actual claims against forecasts, understand the drivers of loss by looking at hazard and location characteristics, and determine whether changes are necessary to your underwriting guidelines.

In addition, you can perform a “what-if” analysis to see how a past event would impact your portfolio if it were to strike today. Sharing a snapshot of what you discover with your exposure management team will give them the data they need to determine appropriate actions to maintain or build a resilient portfolio.

Summer’s end may mean the end of hail season, but as you know, weather patterns are far from predictable. Wildfires continue to burn in California. In addition, hurricane season, which runs through November, could still bring plenty of wind and flood damage. SpatialKey’s Hurricane Forecasts and Past Hurricane Scenarios are always available and up-to-date to help you prepare for the next tropical storm or hurricane.

Now’s the time to look back at what happened this summer, so you can proactively adjust your underwriting strategies and be prepared to take action during the next big event. To learn more about how SpatialKey can help you better understand and respond to events to reduce your claims costs, contact us today.
Posted in Wildfire, Hurricane, Event Analysis, Flood, Hail, Event response | Leave a reply

Could your portfolio weather another Kyrill?

Posted on September 29, 2016 by Derek Blum



Photo credit: scienceblogs.com

With the tenth anniversary of Windstorm Kyrill coming up this January, I can still recall the buzz in the insurance market when it first hit. Insurers were still reeling from a brutal 2005 U.S. hurricane season that brought Katrina, Dennis, Rita, Wilma, and several other large events. So the news of Kyrill came at a time when event response was emerging as both a proactive and reactive discipline, and insurers felt more prepared to respond. The tools they used weren’t quite as sophisticated as what exist today, but they expected to have access to quality data and to be able to provide a rapid supply of information and services to policyholders, management, and investors.

While Kyrill wasn’t the largest windstorm ever to impact Europe, it was the most significant storm, in terms of financial impact, since Lothar and Martin in 1999. In addition to causing 47 deaths, the powerful storm was responsible for more than €4 billion in insured losses (in 2016 euros), according to PERILS.

Nearly ten years later, Kyrill still serves as an important reminder to examine your windstorm exposure carefully, minimize your catastrophe risk wherever possible, and take steps to improve how quickly you can respond to claims. As the Vice President of Product Marketing at SpatialKey, I’m excited to share that our European Windstorm app can help you accomplish these goals in a way that just wasn’t possible back in 2007. The app, which we developed with Willis Re, is releasing just in time for European windstorm season.

Today’s tech can simplify how you respond to risk

On average, six windstorms threaten Europe every year. Not only do you have to be prepared for storm activity that may impact multiple countries throughout the season, you just never know which storm could turn out to be the next Kyrill...or worse.

“When looking over a longer time series, such an event is not rare,” notes Tim Edwards, Head of Europe Catastrophe Analytics, Willis Re. “We estimate that an event the size of Kyrill will hit the European insurance market every five to seven years.”

That’s why it’s critical to evaluate how your portfolio would be impacted—and how quickly your claims team would be able to respond—if a storm of equal or greater severity than Kyrill happened today.


With SpatialKey’s new app, you can track a windstorm as it occurs and view multiple layers of portfolio and claims data to understand the event and effectively deploy your claims response.

Since Kyrill, advancements in geospatial analytics have made it possible to see exactly where your insureds are located in relation to windstorm forecast data or an event footprint. Today, you can visualize information like this easily on a map and know exactly how to prioritize the deployment of your claims response. If you think that’s cool, these days, you can also understand your exposure and potential claims costs based on insured values or limits by overlaying an event footprint on your portfolio data. That was not an easy feat back in 2007.
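As a rough illustration of that overlay step, the sketch below sums insured value for the portfolio locations that fall inside a windstorm footprint. The footprint polygon, coordinates, and values are invented placeholders, and real footprints from providers are far more detailed; the shapely library again stands in for the geospatial engine.

```python
from shapely.geometry import Point, Polygon

# Hypothetical windstorm footprint (one simplified polygon over central Europe).
footprint = Polygon([(5.0, 50.0), (9.0, 50.0), (9.0, 53.0), (5.0, 53.0)])

# Hypothetical portfolio locations: (longitude, latitude, total insured value).
portfolio = [
    (6.1, 51.2, 1_200_000),
    (8.7, 52.4, 850_000),
    (3.9, 51.0, 640_000),  # outside the footprint
]

# Exposure estimate: total insured value located inside the footprint.
exposed_tiv = sum(tiv for lon, lat, tiv in portfolio
                  if footprint.contains(Point(lon, lat)))
print(f"Exposed TIV: {exposed_tiv:,}")  # Exposed TIV: 2,050,000
```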

Just as important, you can now more easily track a windstorm as it’s happening. SpatialKey and Willis Re partnered with EuroTempest to bring you the windstorm forecast and footprint data you need to estimate potential damage, understand where to deploy claims personnel, and get an early sense of likely losses. As I’m sure you know, your customers’ expectations for timely response have only increased in the past ten years, so advancements like these are critical to maintaining your customer retention.

Since Kyrill, technology isn’t the only thing that’s changed. Your portfolio has likely evolved due to gradual market changes or even through acquisition. Greater population and building density, along with increased coastal erosion may also drive more risk. Evaluating your current portfolio against an actual windstorm footprint offers valuable information about your potential claims costs from a similar event. These ways of analyzing data can offer valuable insights for managing the health of your portfolio against future windstorms.

With European windstorm season just around the corner, now is the time to make sure you have access to the data you need to understand your exposure, respond to claims faster, and proactively handle whatever blows your way from November to March.

SpatialKey European Windstorm can help you manage risk and prepare for incoming claims with expert data and advanced analytics. To learn more about how SpatialKey can help you, contact us today.

Posted in windstorm | Leave a reply

It’s time to evolve how you collaborate in a data-driven world

Posted on September 26, 2016 by Tom Link



Photo credit: shutterstock.com

When I first discovered how well the insurance industry collaborates, I was blown away. As an outsider, learning how agents, brokers, carriers, and reinsurers all work together to identify, write, and share risk was both overwhelming and fascinating. I was impressed at how multiple insurers participate in sharing a single risk and how each does so leveraging its unique strategy and specialty. An industry that initially seemed boring to me became artful and intriguing, even noble. I continually reflect on how this level of collaboration and risk sharing makes tremendously ambitious projects—from skyscrapers to city centers—possible. And yet, surprisingly, this industry that is so fundamentally built upon collaboration seems to lag far behind its peers when it comes to technology innovation and collaboration.

Before starting SpatialKey, I co-founded Universal Mind, a digital agency dedicated to helping businesses figure out how to use technology to differentiate themselves and improve customer experience. SpatialKey is a natural extension of that simple idea, except we focus specifically on helping insurers. As I’ve worked to build companies I’m proud of, I’ve learned some invaluable lessons I think can help everyone participating in the insurance space.

To me, the industry’s call to be more data-driven is a call to collaborate on solutions that simplify the process of interpreting data. After all, consider the caveman. Probably not where you thought I was going with that, right? Bear with me.

Neanderthals died out for one simple reason: they failed to learn how to collaborate with groups outside of their tribe. Homo sapiens, on the other hand, thrived because of their ability to trade their specialized skills with outside groups, according to Matt Ridley, author of The Evolution of Everything.

Today, it’s really not all that different. Our economy is built on this simple exchange of skills, and I’m betting that’s hardly surprising to anyone. But I see business leaders thinking they can go it alone when it comes to technology in insurance. In talking with clients, I hear a lot of stories about data being difficult to interpret, contextualize, or provide in a timely fashion to decision makers. I hear about internal teams building systems that stray from their areas of expertise, and a cry for a way to get work done faster (not to mention cheaper). I can’t help thinking we’re ignoring some of the lessons of our ancestors. This is why I deeply believe in the power of focus, specialization, and exchange. The only way to survive—and let’s face it, evolve—is to hone our own skills and have the good sense to collaborate with other experts along the way.

I invite you to consider that outsourcing to experts is something you already do. You may use third-party administrators to manage claims adjustments during an event or underwriters and managing agents who specialize in writing certain regions and classes. So it makes sense to do this with the data you use to drive your business forward, right?

Data, to put it simply, has no value unless you understand it. Over the past several years, the explosion of available data sources you can access to make decisions has been amazing. But the creators of that data are frustrated because few people can deploy that data effectively (or sometimes, know it even exists). And you are challenged by having to manually manage disparate data sources and provide relevant data meaningfully and effectively to people making decisions. SpatialKey was designed to help both parties—making more data available and useful so it can be interpreted and acted upon. I’m proud of how we’ve made that happen—and I appreciate that it could never have happened if we hadn’t collaborated with our clients.

That’s why I’m passionate about building an ecosystem where we’re an effective contributor and where we can be smart about how we collaborate with others. One of the ways we do that at SpatialKey is by partnering with data experts. We don’t make flood footprints or wildfire extents, so we understand the value of working with experts like JBA Risk Management, KatRisk, NOAA, USGS, and Location, Inc. (to name only a few), who do incredible work with data. Exchanging our skill of visualizing data with their talent of creating it helps us move technology forward for insurers. Without this exchange, there’s no way we’d be able to give our users the ability to visualize and interpret the data they need to make the decisions that positively impact their bottom line.

The truth is this: maintaining the status quo will hinder our long-term success. I knew this when SpatialKey was in its early days and was determined to make sure we were always thinking creatively to help our users stay ahead of their competition (while maintaining our own edge in the software space). We’ve always known that our specialization is building innovative software with a keen focus on purpose-built analytics and great design. What we didn’t always know is that it would be for insurers. One of our first clients, Willis Re, is a veteran in the insurance space. Willis Re are experts in placing reinsurance and needed a way to develop and convey their strategy and recommendations to their clients clearly, interactively, and fluidly. Rather than build their own internal solutions, as was the traditional route for other reinsurance brokers, Willis decided to collaborate with us. They appreciated that what we brought to the table could help them differentiate and create a better and more impactful solution for their clients.

“We are at the forefront of risk and reinsurance, but we recognize where others can provide exceptional capabilities for us—especially when it comes to software,” says Vaughn Jensen, Executive Vice President, Willis Re. “Investing in our partnership with SpatialKey has given us a real competitive advantage because we can stay focused on being experts on what matters most to our clients and business.”

As it turned out, we needed Willis Re to help us see how beneficial our solution could be for geospatial insurance analytics. This exchange is one of many that pushed us to actively pursue developing our software specifically for insurers. It was through this collaboration that I saw the benefit of offering, not just to Willis but to the insurance industry, one of the key things they’re always trying to provide their own clients: peace of mind. What excites me the most about collaborating in this way is the remarkable ripple effect it has on product advancement. The more we collaborate with our clients, the better the solutions we can offer the entire industry.

It’s no secret that many companies in the insurance industry are working to catch up when it comes to technology. Collaboration is the clearest way for you to get where you need to be. Together, we’re building something that is advancing technology in insurance. The more collaboration that happens, the more we’ll see companies grow, innovate with technology, and maximize profits.

The cool part is that insurance is one of the most prevalent and necessary things in our world today, and it’s only going to keep evolving. I know there are endless opportunities for exchange, specialization, and advancement—new players in the industry, new ideas, new approaches. With so many ways to move forward, if you embrace collaboration as a way to innovate, you will not just survive, you’ll thrive.
Posted in Insurance, Technology, Collaboration | Leave a reply

SpatialKey and Impact Forecasting are giving insurers greater access to global peril data

Posted on September 20, 2016 by Sarah Stadler


SpatialKey to include hazard data for more countries

SpatialKey now offers its insurance clients access to global peril data from Impact Forecasting, a catastrophe model development center within Aon Benfield. Designed to highlight potential risks for underwriters and exposure managers, SpatialKey geospatial analytic solutions now seamlessly integrate with expert content from Impact Forecasting. Insurers can now access the same, expert hazard data across both underwriting and exposure management disciplines, enabling them to make more confident business decisions.

“We’re excited to offer our clients access to a greater breadth of global hazard and risk data,” said Bret Stone, COO of SpatialKey. “Insurers rely on expert content to write and manage their exposure. Working with Impact Forecasting means our clients can easily access the information they need to inform their risk management strategies across the globe.”

In 2015, insured catastrophe losses totaled roughly $27 billion, according to the Insurance Information Institute. Risk experts, like Impact Forecasting, and geospatial insurance analytics providers, like SpatialKey, play an increasingly strategic role in helping insurers improve their bottom line. Both companies help insurers perform more accurate underwriting and risk assessment—two key factors in keeping insurers profitable.

Impact Forecasting provides data for eight kinds of perils in more than 60 countries, including emerging markets. With that data integrated into SpatialKey, insurers can identify new market opportunities and gain insights where their competitors lack understanding.

“We’re delighted to make our risk and hazard data models available to SpatialKey clients for the first time,” commented Adam Podlaha, CEO of Impact Forecasting. “Now, underserved markets can easily access, visualize, and analyze the data they need to better manage risk and write more profitable business.”

To learn more, please contact us.

Posted in Press Releases, Insurance, Underwriting | Leave a reply

Hazard data from Willis Re View of Risk now available to Willis Re clients via SpatialKey

Posted on September 16, 2016 by Heather Munro


Willis Re, the reinsurance division of Willis Towers Watson (NASDAQ: WLTW), the global advisory, broking and solutions company, today announces the availability of proprietary Willis Re hazard data to its clients via partnership with SpatialKey.

The Willis Re View of Risk assists Willis Re clients in developing their own view of risk, through a combination of enhancing and evaluating existing models, and developing new models where there are none. Constantly advancing Willis Re’s in-house modelling solutions by combining insurance industry experience with expert knowledge, Willis Re Catastrophe Analytics teams help quantify the financial impact of natural and man-made catastrophes on their clients’ portfolios across the world. 

Data developed by the team specifically focuses on gaps in the market for global perils where no reliable risk quantification tools exist, or where Willis Re can help clients enhance limited coverage or improve less detailed information.

Willis Re’s partnership with SpatialKey enables Willis Re clients to access this proprietary Willis Re information alongside data that clients may license from other industry sources, via a powerful and user-friendly geospatial platform. 

The territories and perils covered are broad in scope, ranging from volcanic risk in Italy, to tsunamis in Japan, to flood and storm event assessments in Europe and flood in Australia and New Zealand.


Karl Jones, Managing Director, Willis Re, said: “It’s important for us to help clients understand their risk, especially for perils and regions that get less attention from the commercial vendors. The expertise of our analysts around the globe, combined with the resources of the Willis Research Network and the power of the SpatialKey platform, puts the latest science at our clients’ fingertips.”

Tom Link, CEO of SpatialKey, said: “We’re pleased to broaden the range of information that Willis Re clients can access through our platform. SpatialKey’s vision is to empower insurers with all the information they need, enabling them to make optimal use of their risk capital to enhance profitability and resiliency.”

Posted in Press Releases, Insurance, Flood | Leave a reply