Punxsutawney Phil predicts it's time to reevaluate your underwriting approach

Posted on February 4, 2017 by Jen Smoldt



So, what does Groundhog Day have to do with underwriting?

On February 2nd, Punxsutawney Phil saw his shadow. And you're probably thinking, "So what does this have to do with my underwriting?" Just bear with the logic for a moment. Do you know how often Phil's forecasts have been right? If he were right, say, even 70 to 80 percent of the time, would you see animal predictions as rooted in science? A sound piece of data, perhaps?

Let’s consider how rodent prognostication works: If Phil sees his shadow, then expect six more weeks of winter; however, no shadow means an early spring. The ritual dates back to 1887 when Phil first shared his meteorological insights and was then quickly devoured as part of the Gobbler’s Knob celebration (ouch, for Phil). Back in the day, there was a belief that Phil’s forecasting was connected to a larger animal consciousness.

The question we’re all asking is: How reliable is Phil, really? We decided to find out.

THE DATA 
The Washington Post actually did the math. They calculated the average daily temperature during the six weeks after Groundhog Day for each of the past 30 years, then compared the temperatures in years when Phil saw his shadow to those in years when he did not.
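The Post's methodology is simple enough to sketch in a few lines of Python. Everything below is illustrative: the temperatures and shadow calls are made up for demonstration, not the Post's actual 30-year dataset.

```python
# Compare average temperatures in the six weeks after Groundhog Day for
# shadow years vs. no-shadow years. All numbers here are invented.

# avg temperature (F) for the six weeks after Feb 2, by year
post_groundhog_temps = {2010: 34.1, 2011: 31.8, 2012: 40.2, 2013: 35.5, 2014: 29.9}
saw_shadow = {2010: True, 2011: True, 2012: False, 2013: True, 2014: True}

shadow_temps = [t for y, t in post_groundhog_temps.items() if saw_shadow[y]]
no_shadow_temps = [t for y, t in post_groundhog_temps.items() if not saw_shadow[y]]

avg_shadow = sum(shadow_temps) / len(shadow_temps)
avg_no_shadow = sum(no_shadow_temps) / len(no_shadow_temps)

# "Six more weeks of winter" predicts shadow years should be colder on average
print(f"Shadow years: {avg_shadow:.1f}F, no-shadow years: {avg_no_shadow:.1f}F")
```

Run this against 30 years of real station data per city and you get the Post's result: whether the shadow years actually come out colder depends on which city's temperatures you feed in.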

Drum roll please…

It turns out that Phil was right more often than not, but only in some cities. The results basically come down to chance: temperatures do not vary uniformly across the country, so Phil is bound to be right in some places and wrong in others.

I know what you're thinking: "This makes sense, but give me a number already!" Stormfax Almanac data puts Phil's accuracy at about 39 percent.

There's only one prognosticating groundhog making a very broad prediction for the entire country, though. According to The Post's findings, every region of the country would basically need its own Phil for any real accuracy. For example, Punxsutawney Phil and his ancestors have been right only 39 percent of the time, whereas Staten Island Chuck has an 80 percent accuracy rate.

What does this mean for your underwriting practices? Paying attention to more than one data source, to the accuracy of your data, and to how that data informs predictions leads to more informed decision making. Data and analytics are absolutely critical for accurate risk assessment, or let's call it "underwriting prognostication" in honor of Groundhog Day.

PREDICTABILITY
Remember when Bill Murray woke up to a screeching alarm only to face the same day over and over again in the movie Groundhog Day?

Are you stuck in that same cycle when it comes to underwriting, approaching it the same way you always have? It's likely that your practices are "good enough," so why make a change? Here's why: Eventually, the failure to move forward into a new day will hinder performance. You will lose your competitive edge by being stuck in the same mode and missing out on key advancements (namely, InsurTech) that move your business forward. Many insurers realize this; that's why three in four insurance companies (74 percent) believe that some part of their business is at risk of disruption.

ACCURACY 
In the movie, Bill Murray, who plays a weatherman, is able to accurately predict what will happen next. He can do this because he’s already lived the day before—he’s been there and is able to manipulate every situation to his benefit.

How great would it be if this were true for insurers? Imagine having the power to know exactly what's going to happen when new business comes in. Knowing for certain which catastrophes were going to hit (and when) would seriously increase success. But the fact is, risk is the reason insurance exists. People need insurance because they can't predict the future. Every day in the insurance business is a new day, a new risk. And anything that can help you more accurately predict risk is as good as gold.

There's a lot of information out there that can help with more accurate risk assessment; the big challenge is bringing it all together to empower more insightful decisions. Which leads us to how the combination of data, analytics, and technology (InsurTech in a nutshell) can enrich your insights for better "underwriting prognostication" (i.e., decisioning) and profitability:

“Insurers that embrace predictive modeling complexity by focusing on data enrichment, advanced analytics and technology can achieve a significant return on their investment,” said Klayton Southwood, director, P&C practice, Willis Towers Watson. “Carriers that catapult beyond their competition do so, in part, by leveraging superior data organization and analysis. For those insurers aspiring to unlock the potential of big data, they must be strategic, persistent and consistent.”

MORAL OF THE STORY
It’s essential to constantly evaluate your current underwriting rituals and determine if they’re still relevant for today or stuck in Groundhog Day. And the good news is, insurers don’t have to be stuck because the new day is offering incredible new solutions.

Like Punxsutawney Phil, and our friend Bill (Murray), you can't stay "holed up" in good enough. You need a broader view to accurately assess risk and gain insights. It's time to leverage InsurTech strategies like geospatial insurance analytics, which, in the case of SpatialKey, provide access to a spectrum of content providers along with consistency and collaboration across departments, not to mention real-time insights that bring your data to life and help you make quality decisions. Because, let's face it, data is useless if we can only see a shadow of it.

Connect with us to find out more about SpatialKey’s underwriting solutions. Or, see SpatialKey in action at the upcoming RAA conference:

HEADING TO ORLANDO FOR RAA NEXT WEEK? COME SEE US! - 4:45 EST, February 14

Join Jonathan Ward, AVP Risk Services, RLI, and Bret Stone, President, SpatialKey, on February 14 at 4:45 EST. They'll discuss today's underwriting challenges and demonstrate how geospatial insurance analytics has helped RLI harness the power of data to accelerate decisioning and much more. Don't miss it! 

 

Posted in Insurance, Underwriting, Analytics

Whether you buy or build, getting ahead depends on leveraging the right experts

Posted on October 19, 2016 by Derek Blum



Expertise can mean many things. Our last post from our CTO Brandon Purcell, How collaborating on technology changes the buy vs. build debate once and for all, touched on an important but often overlooked part of the equation. Brandon pointed out that to obtain a solution that truly helps you understand your data and identify the insights you need to make decisions, you have to collaborate with experts. I wholeheartedly agree. But I would add that you have to use the right kind of expert, with the right skill set, to successfully build and deploy a data analytics solution that will be readily adopted by your team.

What do I mean by that statement? Let me elaborate. Just because I understand the metrics used to describe hurricanes doesn't mean I have the knowledge or the ability to collect wind speed data and construct a footprint. As a result, it makes absolute sense for me to rely on others who are far better equipped to supply this critical data to me. And yet there are insurers out there who, instead of leveraging expert data, will attempt to reinvent the wheel or go without.

I suppose it makes sense in some regard. Who else knows what insurers need when it comes to data better than the insurers themselves? After all, insurance is a complex business with each company developing a unique workflow that suits the geography, lines of business, and culture of their operations. (I should know, I’ve been helping to develop products specifically for insurers for the past two decades.)

But with all due respect, I have to say that while you absolutely know insurance, you don’t have the same expertise in technology, especially for managing the increasing amounts of data you need to make decisions every day. Portfolio data, claims data, hazard data—the volume and breadth of data is growing exponentially. To keep up, you have to leverage the experts in data technology.

Let’s consider managing hazard data for a moment. First, you have to set up and manage the licensing agreements to access data from the provider. But once you have the agreement in place, license restrictions may limit how many people on your team can access the data, so you may not be able to easily scale to meet changing volumes or demands. Then, once you have access, you need to make it available to the right people in your organization. With large amounts of hazard or other kinds of data, this is no trivial matter.

Simple choices about how to warehouse the data and in what format to store it can affect the utility your company can get out of it. If the data is hard to use, you may fail to achieve the benefits you were seeking in the first place. In the end, you will have a variety of costs on the front-end just to have access, as well as maintenance costs to manage and update the data. And that’s before you even begin interpreting the data for underwriting decisions or comparing it to your own portfolio data. Not good.

Think about it this way. If you’re a property and casualty writer, and a client asks you for a workers compensation insurance policy, what do you do? You refer them to an underwriter who specializes in workers comp. Why? Because insuring employees for work-related injuries comes with a host of different concerns than insuring a building. Even inside your own area of expertise, you wouldn’t send an underwriter to do a claims adjuster’s job or have a customer service representative analyze your portfolio, would you?

This kind of expertise also applies to the data you need to drive critical decisions every day. Providers specialize in different types of data just like you specialize in different lines of insurance. JBA Risk Management, for example, is known for hazard mapping and catastrophe modeling for flood. Location, Inc., on the other hand, focuses on providing granular data for crime risk. By specializing, each provider can offer more valuable insights within their chosen area of expertise to the benefit of the insurers who partner with them.

As insurance experts, you might think your internal IT department has the design expertise to provide an optimal user experience when building standalone tools. But with what could amount to terabytes of hazard data (the complex kind of data we're really talking about), your solution has to be able to handle geospatial lookups and apply scoring matrices for one or more hazard layers. Connecting those dots isn't something an IT person who doesn't specialize in Geographic Information Systems (GIS) or design can do.
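To get a feel for what those dots involve, here is a deliberately toy sketch of a geospatial lookup feeding a scoring matrix. The hazard zones are simplified to axis-aligned bounding boxes and the scores are invented; a production system would use indexed polygon geometries and provider-supplied scoring, which is exactly where the GIS specialization comes in.

```python
# Toy hazard lookup: find which zone a location falls in for each hazard
# layer, then apply a scoring matrix. Zones and scores are illustrative only.

# each layer maps zone name -> bounding box (min_lon, min_lat, max_lon, max_lat)
hazard_layers = {
    "flood": {"high": (-81.7, 30.2, -81.3, 30.6), "low": (-82.5, 29.5, -81.7, 30.2)},
    "wind":  {"high": (-82.0, 29.8, -81.2, 30.7)},
}

# scoring matrix: (layer, zone) -> hazard score
scores = {("flood", "high"): 10, ("flood", "low"): 3, ("wind", "high"): 8}

def lookup_zone(layer, lon, lat):
    """Return the first zone in a layer whose box contains the point, else None."""
    for zone, (x0, y0, x1, y1) in hazard_layers[layer].items():
        if x0 <= lon <= x1 and y0 <= lat <= y1:
            return zone
    return None

def hazard_score(lon, lat):
    """Sum scores across all hazard layers for a location."""
    total = 0
    for layer in hazard_layers:
        zone = lookup_zone(layer, lon, lat)
        if zone is not None:
            total += scores[(layer, zone)]
    return total

# a point inside both the "high" flood zone and the "high" wind zone
print(hazard_score(-81.5, 30.4))  # 10 + 8 = 18
```

Real hazard layers are irregular polygons at national scale, so the naive linear scan above gives way to spatial indexes and polygon containment tests. That jump in complexity is the point.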

On the other hand, choosing to work with the experts who specialize in making data easy to digest ensures you get the high-quality, durable solutions—backed by a dedicated Research and Development team—you need to stay ahead of your competitors. In addition, software from these kinds of experts can provide a better user experience while meeting the specific needs of each department and the broader needs of your entire enterprise. 

More importantly, partnering with software experts that can provide an out-of-the-box solution tailored for the insurance workflow means you don't have to wait to begin making better, more informed decisions. A top-tier insurer recently discovered this for themselves. Because they decided to purchase an insurance analytics solution, they discovered they could have avoided a $1 million claim. If they had tried to build something in-house to understand this important data, it likely would have taken far too long and, even worse, might never have been finished in time to uncover that information. So just like that, their decision to collaborate with an outside expert helped them develop a competitive advantage. Not to mention, the software virtually paid for itself.

The truth is that staying on top of all of this is complicated. After all, there’s never been more data with which to inform your decisions. As the number of specialized data providers continues to grow, your internal IT team can’t easily stay on top of bringing together all of the data sources, managing provider license agreements, and performing routine maintenance to keep everything working smoothly. At the same time, managing internal claims and portfolio data is challenging as well. To compete successfully, you need to be able to quickly visualize and understand what all of this information is really telling you so you can act on it...you need insight at your fingertips.

At SpatialKey, we collaborate with data providers to ensure our clients don’t ever have to be bothered with that kind of administrative overhead and costly infrastructure and maintenance. (Just sayin’.) Plus, we’ve already done the due diligence to ensure we’re working with high-quality data providers. That means accessing data is a snap for SpatialKey customers who aspire to write better risk, respond quickly to catastrophes, and build resilient portfolios.

From a purely business perspective, executives and shareholders expect your underwriters, exposure managers, and claims managers to inform their approach and make educated decisions with relevant data. To set your teams up for success—and enable them to focus on the specialty that drives your bottom line—you need them to spend less time finding and managing data and more time interpreting and acting on it. The added bonus? When multiple departments use the same data sources for their decision making, you gain consistent understanding and efficiency across your entire organization.

Today, every insurer understands the importance of being a data-driven enterprise. But to truly achieve this goal, you have to depend on the right partners, ones who specialize in simplifying how you access, interpret, and analyze data, whether it’s your own or from a third party. Being able to quickly access and interpret expert data sources is what will set you apart, and keep your business competitive today and for years to come.

Posted in Insurance, Technology, Collaboration

Hurricane Matthew by the numbers: What we know so far

Posted on October 12, 2016 by Heather Munro



 

Blue skies may be back, but the impact from Hurricane Matthew is still being felt. As your claims team works around the clock managing your event response, information on the extent of the damage is critical to your ability to respond as quickly as possible.

SpatialKey seamlessly integrates the latest event footprints from third-party data providers like KatRisk, Impact Forecasting, and NOAA so you have access to the up-to-date data you need to take action. You’ll be glad to know that footprints for wind, surge and inland flood for Hurricane Matthew from KatRisk are now integrated within SpatialKey and available for use.

Below, we’ve rounded up the latest statistics on the initial losses from Matthew, so you can continue to evaluate the impact of the storm on your book of business.

Risk modeling firm CoreLogic estimates insured losses from Matthew will likely range between $4 billion and $6 billion. CoreLogic's initial figure includes losses from wind and storm surge, but not additional flooding, business interruption, or building contents. The firm also places the storm's losses higher than those of hurricanes Floyd (1999) and David (1979), but well below hurricanes Katrina (2005) and Sandy (2012).

On top of losses from wind and storm surge, early estimates indicate that the evacuations may result in $10 to $15 billion in losses from economic disruption, according to Chuck Watson, a disaster modeler with Enki Research.

Hurricane Matthew was notable for ending a nine-year streak without an Atlantic Basin Category 5 hurricane. The powerful storm, which weakened as it moved from Haiti to the U.S., forced three million people to evacuate. By the time it was over, it had caused coastal erosion, wind damage, and freshwater flooding across five states: Florida, Georgia, North Carolina, South Carolina, and Virginia.


In North Carolina, rising flood waters threatened more than 1,500 people in need of rescue. Photo credit: Associated Press

In the Caribbean, Matthew reached Category 5 status with peak gusts up to 166 mph. Haiti was the hardest hit, with more than 1,000 people killed. Now, a resulting cholera outbreak threatens further devastation in an area with a history of earthquakes and hurricanes and a low insurance penetration rate.

At the height of the storm, about 2.2 million people (1 million in Florida alone) lost power.


 Images from NASA showing power outages during Hurricane Matthew. Photo credit: NASA

Peak wind gusts in the U.S. during Matthew ranged from 69 to 107 mph. At press time, the latest U.S. death toll is 33, but that number could go up as flooding continues over the next few days.

Outside of power loss and severe winds, storm surge was another significant concern. Fernandina Beach, Florida experienced a storm surge of 9.88 feet above normal, and Ft. Pulaski, Georgia encountered record tide levels and a storm surge just under eight feet.

While it’s too early to tally Matthew’s total impact in the U.S., insured losses will undoubtedly increase due to flooding, especially in North Carolina where flooded rivers that wash through farms and coal ash sites may spread toxins through miles of waterways.

The American Red Cross is already spearheading recovery efforts for the many people affected by the powerful event. To support this effort or learn more about how you can help, click here.

In the meantime, as your claims adjusters manage the growing number of claims, remember SpatialKey can help. To find out more or to access the latest Hurricane Matthew data and analyze it against your own portfolio in SpatialKey, contact us today.

Posted in Hurricane, Insurance, Flood, Event response

How collaborating on technology changes the buy vs. build debate once and for all

Posted on October 11, 2016 by Brandon Purcell



Our last post by CEO Tom Link, It’s time to evolve how you collaborate in a data-driven world, got me thinking about the importance of partnerships. Who you collaborate with to meet your business objectives matters, especially when it comes to technology. Having the right UX, GIS, and R&D resources on your side can mean the difference between amazing usability and software that slows you down.

Prospects often tell me they’re frustrated. Developing software for an industry as complex—and as data driven—as insurance isn’t easy. I’ve been developing software for 18 years. I can tell you that the decision to “buy or build” a solution isn’t as black and white as it sounds.

More than half of IT projects fail, according to CIO Magazine. A lack of product design resources and a failure to align outcomes with business goals are two reasons why. Solutions never get deployed, don't meet the business's requirements, or are so difficult to use that they never get adopted by the users they are supposed to help. No wonder you're frustrated. I can't help but think how much time and money have been wasted.

Chances are, you already have some sort of homegrown data analytics system. And you’re considering whether to extend that to add new features or to retire the solution entirely and buy a new one off the shelf. I get why insurers might decide to create their own in-house solution. After all, compared to buying something new, it can seem like there are lower up-front costs. Not to mention that third-party providers haven’t always delivered solutions that keep pace with the demands of your business. Plus, having control over the end product makes it seem more customizable.

But building software is more complicated (and riskier) than it sounds. Especially for something like risk selection analytics for underwriting. These solutions need to support complex hazard models from different kinds of content providers. I've heard from many prospects that building something like that is just “too risky.” That’s because it’s difficult to maintain and scale as new intelligence becomes available in the market.

When it comes to building software, I wish senior management understood a fundamental truth I’ve learned over the years: The true costs go up significantly when you try to do it all on your own.

In-house development costs are still costs

Spending your budget in-house can often feel like saving money. After all, you have the hardware, the programmers' time is already paid for, and you have a trusted team of folks ready to get the job done. Not so fast. If your company is using this type of thinking, you are (or will be) in for a surprise.

Development is much more expensive than you might think. The real dollar amount you need to consider includes: the cost of your development team, the time taken from your business operations during all phases of development, and the opportunity cost of the work not done on another IT project. Of course, this only applies if you have the expertise within your in-house team to build and add the desired features to your existing solution.

I can’t tell you how many times I’ve seen internal projects end up costing way over the estimate. Or worse, the in-house IT team doesn’t have the expertise to deliver the final functionality. The truth is that developing the solution in-house is never a one-and-done cost.

Once it’s built, you still have to budget for maintenance

Maintaining an application and keeping it running is an expensive proposition. Software has bugs and requires a team to address them as they arise. Once a solution is built, the team typically moves on to another project. I’ve even seen cases where the solution’s primary developer leaves the company. With no one dedicated to keeping the solution running smoothly, your users are often stuck having to figure out a workaround and can lose valuable time, which as you know, is another cost to your business.

At the same time, once your users get the solution in their hands, you can bet they’ll ask for new features and capabilities in future releases. Evolving an in-house solution from version to version also requires a dedicated team—and if your internal resources are on to the next IT project, your users are left with a version 1.0 with limited functionality. If they’re using your solution at all.

An annual enterprise software license can cost upwards of $100,000 a year and yes, over ten years, that’s $1 million. But unlike building in-house, included in that price is maintenance, updates, support, and most importantly, a collaborative partner who understands that you want software to solve your headaches, not cause them. You also get the added benefit of having someone outside your company focused solely on meeting your deliverables on time and on budget.

The right partnership delivers returns on your investment

Collaborating with the right partner gives you a competitive edge. (Stay tuned for more on this: our next blog post will cover collaborating with the right experts and partners to get ahead.) When you can get a more advanced solution faster and for less money, you have to ask yourself: Why would you build one on your own? I like to tell clients that building your own solution is a lot like building your own car from scratch: costly and time consuming. Ultimately, it's faster, easier, and less expensive to buy a car with the features you like than to try to design and build one yourself.

Just as you might choose a car with a back-up camera or built-in Bluetooth interface, you have options today that allow you to adapt and integrate your existing systems across your unique enterprise. Hybrid solutions excel by leveraging the best of your in-house solution and integrating with third-party solutions through APIs. In short, you can have the flexibility you want while lowering long-term costs. You can also be up and running a heck of a lot faster.

Ideally, integrating your core underwriting system with your advanced risk selection solution gives your business the power to act on information quickly. Let’s say you have an in-house underwriting solution that handles your complex rules and workflow. And, you need to modernize and select better risks using risk models for flood, hail, and tornado. Building a system to integrate this third-party data, perform spatial lookups, and maintain the risk data will require a robust and flexible infrastructure and sophisticated geospatial analytics.

That’s, in part, because the amount of data available—from government sources, third-party providers, and various risk models—is growing at an exponential rate.  Managing, manipulating, and sharing this information efficiently across your organization so it’s easy to understand and act on will take advanced systems that can handle the load. While this sounds simple enough to build, it's pretty complex. But by leveraging an outside solution—one that gives you access to data in a workflow tailored to your needs—it’s possible to have the best of both worlds. Really.
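As a rough illustration of that hybrid shape, the sketch below has an in-house underwriting rule delegate peril scoring to a pluggable third-party service behind a small interface. The service, perils, scores, and decline threshold are all hypothetical stand-ins, not any particular vendor's API.

```python
# Hybrid pattern sketch: the in-house workflow owns the rules; an external
# analytics service (stubbed here) owns the peril scoring.

from typing import Protocol

class RiskScorer(Protocol):
    def score(self, lat: float, lon: float, peril: str) -> float: ...

class StubGeospatialService:
    """Stand-in for a third-party geospatial analytics API."""
    def score(self, lat, lon, peril):
        # a real implementation would call the provider's API here
        canned = {"flood": 7.2, "hail": 2.1, "tornado": 4.5}
        return canned[peril]

def underwrite(location, scorer: RiskScorer, threshold=10.0):
    """In-house rule: decline if the combined peril score exceeds the threshold."""
    lat, lon = location
    total = sum(scorer.score(lat, lon, p) for p in ("flood", "hail", "tornado"))
    return ("decline" if total > threshold else "accept", total)

decision, total = underwrite((30.3, -81.6), StubGeospatialService())
print(decision, round(total, 1))  # combined score 13.8 exceeds the threshold
```

The interface is the whole trick: the in-house system keeps its complex rules and workflow, while the scorer behind the interface can be swapped or upgraded as new models and content providers come online.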

Working with software experts, you can also create economies of scale. At SpatialKey, we develop purpose-built solutions that bridge the gap between underwriting, exposure management, and claims, so our clients can easily make decisions from the same baseline of understanding. When you tap into a third-party solution designed specifically for insurance, you and your internal IT team no longer have to reinvent the wheel. And unlike purchasing a car, your solution will just keep getting better and better over time, thanks to seamless updates by your provider.

So the next time you debate the merits of building or buying a solution, consider all of the shades of gray that will enable you to truly meet your business objective. Who you choose to collaborate with has to be part of your discussion. Partnering with software experts, instead of building in-house, is a clear choice that will keep the wheels of your business turning and more than pay for itself in the long run.
Posted in Insurance, Technology, Collaboration

It’s time to evolve how you collaborate in a data-driven world

Posted on September 26, 2016 by Tom Link



When I first discovered how well the insurance industry collaborates, I was blown away. As an outsider, learning how agents, brokers, carriers, and reinsurers all work together to identify, write, and share risk was both overwhelming and fascinating. I was impressed at how multiple insurers participate in sharing a single risk and how each does so leveraging its unique strategy and specialty. An industry that initially seemed boring to me became artful and fascinating, even noble. I continually reflect on how this level of collaboration and risk sharing makes tremendously ambitious projects, from skyscrapers to city centers, possible. And yet, surprisingly, this industry that is so fundamentally built upon collaboration seems to lag far behind its peers when it comes to technology innovation and collaboration.

Before starting SpatialKey, I co-founded Universal Mind, a digital agency dedicated to helping businesses figure out how to use technology to differentiate themselves and improve customer experience. SpatialKey is a natural extension of that simple idea, except we focus specifically on helping insurers. As I’ve worked to build companies I’m proud of, I’ve learned some invaluable lessons I think can help everyone participating in the insurance space.

To me, the industry’s call to be more data-driven is a call to collaborate on solutions that simplify the process of interpreting data. After all, consider the caveman. Probably not where you thought I was going with that, right? Bear with me.

Neanderthals died out for one simple reason: they failed to learn how to collaborate with groups outside of their tribe. Homo sapiens, on the other hand, thrived because of their ability to trade their specialized skills with outside groups, according to Matt Ridley, author of The Evolution of Everything.

Today, it’s really not all that different. Our economy is built on this simple exchange of skills, and I’m betting that’s hardly surprising to anyone. But I see business leaders thinking they can go it alone when it comes to technology in insurance. In talking with clients, I hear a lot of stories about data being difficult to interpret, contextualize, or provide in a timely fashion to decision makers. I hear about internal teams building systems that stray from their areas of expertise, and a cry for a way to get work done faster (not to mention cheaper). I can’t help thinking we’re ignoring some of the lessons of our ancestors. This is why I deeply believe in the power of focus, specialization, and exchange. The only way to survive—and let’s face it, evolve—is to hone our own skills and have the good sense to collaborate with other experts along the way.

I invite you to consider that outsourcing to experts is something you already do. You may use third-party administrators to manage claims adjustments during an event or underwriters and managing agents who specialize in writing certain regions and classes. So it makes sense to do this with the data you use to drive your business forward, right?

Data, to put it simply, has no value unless you understand it. Over the past several years, the explosion of available data sources you can access to make decisions has been amazing. But the creators of that data are frustrated because few people can deploy that data effectively (or sometimes, know it even exists). And you are challenged by having to manually manage disparate data sources and provide relevant data meaningfully and effectively to people making decisions. SpatialKey was designed to help both parties, making more data available and useful so it can be interpreted and acted upon. I'm proud of how we've made that happen, and I appreciate that it could never have happened if we hadn't collaborated with our clients.

That’s why I’m passionate about building an ecosystem where we’re an effective contributor and where we can be smart about how we collaborate with others. One of the ways we do that at SpatialKey is by partnering with data experts. We don’t make flood footprints or wildfire extents, so we understand the value of working with experts like JBA Risk Management, KatRisk, NOAA, USGS, and Location, Inc. (to name only a few), who do incredible work with data. Exchanging our skill of visualizing data with their talent of creating it helps us move technology forward for insurers. Without this exchange, there’s no way we’d be able to give our users the ability to visualize and interpret the data they need to make the decisions that positively impact their bottom line.

The truth is this: maintaining the status quo will hinder our long-term success. I knew this when SpatialKey was in its early days and was determined to make sure we were always thinking creatively to help our users stay ahead of their competition (while maintaining our own edge in the software space). We've always known that our specialization is building innovative software with a keen focus on purpose-built analytics and great design. What we didn't always know is that it would be for insurers. One of our first clients, Willis Re, is a veteran in the insurance space. Willis Re specializes in placing reinsurance and needed a way to effectively develop and convey its strategy and recommendations to clients clearly, interactively, and fluidly. Rather than build their own internal solutions, as was the traditional route for other reinsurance brokers, Willis decided to collaborate with us. They appreciated that what we brought to the table could help them differentiate and create a better and more impactful solution for their clients.

“We are at the forefront of risk and reinsurance, but we recognize where others can provide exceptional capabilities for us—especially when it comes to software,” says Vaughn Jensen, Executive Vice President, Willis Re. “Investing in our partnership with SpatialKey has given us a real competitive advantage because we can stay focused on being experts on what matters most to our clients and business.”

As it turned out, we needed Willis Re to help us see how beneficial our solution could be for geospatial insurance analytics. This exchange is one of many that pushed us to actively pursue developing our software specifically for insurers. It was through this collaboration that I saw the benefit of offering, not just to Willis but to the entire insurance industry, one of the key things insurers are always trying to provide their own clients: peace of mind. What excites me most about collaborating this way is the remarkable ripple effect it has on product advancement. The more we collaborate with our clients, the better the solutions we can offer the entire industry.

It’s no secret that many companies in the insurance industry are working to catch up when it comes to technology. Collaboration is the clearest way for you to get where you need to be. Together, we’re building something that is advancing technology in insurance. The more collaboration that happens, the more we’ll see companies grow, innovate with technology, and maximize profits.

The cool part is that insurance is one of the most prevalent and necessary industries in our world today, and it's only going to keep evolving. I know there are endless opportunities for exchange, specialization, and advancement: new players in the industry, new ideas, new approaches. With so many ways to move forward, if you embrace collaboration as a way to innovate, you won't just survive, you'll thrive.
Posted in Insurance, Technology, Collaboration

SpatialKey and Impact Forecasting are giving insurers greater access to global peril data

Posted on September 20, 2016 by Sarah Stadler


SpatialKey to include hazard data for more countries

SpatialKey now offers its insurance clients access to global peril data from Impact Forecasting, a catastrophe model development center within Aon Benfield. Designed to highlight potential risks for underwriters and exposure managers, SpatialKey geospatial analytics solutions now seamlessly integrate with expert content from Impact Forecasting. Insurers can now access the same expert hazard data across both underwriting and exposure management disciplines, enabling them to make more confident business decisions.

“We’re excited to offer our clients access to a greater breadth of global hazard and risk data,” said Bret Stone, COO of SpatialKey. “Insurers rely on expert content to write and manage their exposure. Working with Impact Forecasting means our clients can easily access the information they need to inform their risk management strategies across the globe.”

In 2015, insured catastrophe losses totaled roughly $27 billion, according to the Insurance Information Institute. Risk experts, like Impact Forecasting, and geospatial insurance analytics providers, like SpatialKey, play an increasingly strategic role in helping insurers improve their bottom line. Both companies help insurers perform more accurate underwriting and risk assessment—two key factors in keeping insurers profitable.

Impact Forecasting provides data for eight kinds of perils in more than 60 countries, including emerging markets. With that data integrated into SpatialKey, insurers can identify new market opportunities and gain insights where their competitors lack understanding.

“We’re delighted to make our risk and hazard data models available to SpatialKey clients for the first time,” commented Adam Podlaha, CEO of Impact Forecasting. “Now, underserved markets can easily access, visualize, and analyze the data they need to better manage risk and write more profitable business.”

To learn more, please contact us.

Posted in Press Releases, Insurance, Underwriting

Hazard data from Willis Re View of Risk now available to Willis Re clients via SpatialKey

Posted on September 16, 2016 by Heather Munro


Willis Re, the reinsurance division of Willis Towers Watson (NASDAQ: WLTW), the global advisory, broking and solutions company, today announces the availability of proprietary Willis Re hazard data to its clients via partnership with SpatialKey.

The Willis Re View of Risk assists Willis Re clients in developing their own view of risk, through a combination of enhancing and evaluating existing models and developing new models where there are none. Willis Re Catastrophe Analytics teams constantly advance Willis Re's in-house modelling solutions, combining insurance industry experience with expert knowledge to help quantify the financial impact of natural and man-made catastrophes on clients' portfolios across the world.

Data developed by the team specifically focuses on gaps in the market for global perils where no reliable risk quantification tools exist, or where Willis Re can help clients enhance limited coverage or improve less detailed information.

Willis Re’s partnership with SpatialKey enables Willis Re clients to access this proprietary Willis Re information alongside data that clients may license from other industry sources, via a powerful and user-friendly geospatial platform. 

The territories and perils covered are broad in scope, ranging from volcanic risk in Italy and tsunami in Japan to flood and storm event assessments in Europe and flood in Australia and New Zealand.


Karl Jones, Managing Director, Willis Re, said: "It's important for us to help clients understand their risk, especially for perils and regions that get less attention from the commercial vendors. The expertise of our analysts around the globe, combined with the resources of the Willis Research Network and the power of the SpatialKey platform, puts the latest science at our clients' fingertips."

Tom Link, CEO of SpatialKey, said:  “We’re pleased to broaden the range of information that Willis Re clients can access through our platform.  SpatialKey’s vision is to empower insurers with all the information they need, enabling them to make optimal use of their risk capital to enhance profitability and resiliency.”

Posted in Press Releases, Insurance, Flood

How to be the 007 of underwriting

Posted on June 20, 2016 by Heather Munro

Photo credit: fullhdpix.com

James Bond makes it look so easy. Whether he’s sneaking into a super villain’s secret hideout, skiing down a mountain pursued by assassins, or matching wits with a femme fatale, he never breaks a sweat.

If you watch closely, however, there’s a reason he always lands on his feet—and it’s not just because of camera angles and Hollywood tricks. As he secretly scuba dives onto a private island, he makes sure to use all of the relevant information at his disposal. The amount of oxygen in his tank, the number of miles to shore, the best location to make landfall—these are the critical data points he needs to plan his approach.

While the data is critical to his mission, it’s Bond’s intuition that will help him carry it out. Bond’s gut tells him to look back on his way to the island, and he sees the enemy henchmen in time to stop their pursuit.

Like Bond, underwriters rely on both key information and their best judgment to get the job done. Here's how you can easily use the most relevant data to write the best, most profitable risks, especially when it comes to terrorism.

Use the most advanced technology

Whether it’s a poisonous-dart-shooting watch or a car armed with lasers, Bond has access to tech that does more than you expect. So do you. Today, sophisticated analytics designed specifically for the insurance workflow can help you improve your terrorism underwriting, diversification, and overall profitability.

We have created a solution built for the Bonds of underwriting. SpatialKey Underwriting enables you to immediately determine a new risk's contribution to your portfolio from a single, interactive environment.


Just imagine:

  • Visualizing high-interest terrorist targets on a map and the proximity of new and/or existing locations to them.
  • Observing details about potential targets, such as a skyscraper and its surrounding building environment.
  • Displaying a schedule of locations on a map and performing an analysis to identify peak accumulations of exposure.
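To make the "peak accumulations" idea in the list above concrete, here is a minimal, hypothetical sketch (not SpatialKey's actual implementation): given a schedule of locations with insured values, it finds the location whose surrounding radius contains the largest total exposure, using a brute-force great-circle distance check.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def peak_accumulation(locations, radius_km):
    """Brute force: for each location, sum the insured value of all
    locations within radius_km of it, and return the largest cluster."""
    best_center, best_total = None, 0.0
    for lat, lon, _ in locations:
        total = sum(v for lat2, lon2, v in locations
                    if haversine_km(lat, lon, lat2, lon2) <= radius_km)
        if total > best_total:
            best_center, best_total = (lat, lon), total
    return best_center, best_total

# Toy schedule: (latitude, longitude, insured value in $M)
schedule = [
    (40.7484, -73.9857, 120.0),  # near the Empire State Building
    (40.7527, -73.9772, 95.0),   # near Grand Central Terminal
    (40.6892, -74.0445, 60.0),   # near the Statue of Liberty
]
center, total = peak_accumulation(schedule, radius_km=1.0)
```

Real platforms use spatial indexes rather than this O(n²) scan, but the output is the same kind of answer an underwriter needs: where exposure stacks up, and how much.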

Having the ability to derive meaningful insights from multiple sources of information—and being able to do so quickly with self-serve analytics—keeps you at the top of your game. Using your intuition about these insights keeps you ahead of your competition.

Know your location inside and out

Every 007 mission usually begins in M's office, where Bond receives his confidential orders to travel to dangerous parts of the world. As Britain's top secret agent, Bond never begins an assignment without understanding the territory he's about to enter. As a terrorism underwriter, neither should you. Fortunately, you don't need M to get the relevant information you need.

When it comes to assessing a terrorism risk, location is one of the most important factors to consider. After all, what’s near your insured’s property—iconic landmarks, public transportation stations, and other potential targets—could increase the likelihood of an attack. The more you know about the locations you are insuring, the more accurately you can price the risk.

Data is meaningless, however, if you can’t make sense of it. Using geospatial insurance analytics to visualize what’s in and around the area you’re considering bringing into your portfolio is a key first step to making a confident decision. Just as you use past event data to better understand a potential weather catastrophe risk, using keen visualizations and analytics to evaluate a potential bomb blast equips you with the right information to avoid making a blind decision.

Rely on the experts

Just like Bond gets classified intel from the brains at MI6, you, too, have partners you can lean on for pertinent information. You can gain insight into the location of a potential attack from modeling firms like IHS, RMS, or Verisk as well as data providers who compile lists of probable targets. You can then visualize that data to understand exactly how it impacts a specific location.

Armed with this information, you can quote rates commensurate with the level of risk, differentiating between office buildings in different parts of the world and pricing their policies accordingly. Replacing doubt with compelling information brings science to the art of terrorism underwriting, helping you decide the best path forward for your business more quickly.

While none of us, not even James Bond, knows where the next terrorist attack will take place, we do know that one will. Underwriters who can suit up and perform like a quick-thinking secret agent will be the ones to reap potentially profitable business opportunities.

Connect today
Discover how to get the intelligence you need to competitively underwrite risk.

Posted in Insurance, Underwriting

SpatialKey Hires Product Marketing VP to Expand Insurance Analytics Offerings

Posted on April 12, 2016 by Sarah Stadler


We’re delighted to welcome Derek Blum as our new Vice President of Product Marketing.

In this newly created role, Derek will ensure SpatialKey delivers high-quality, relevant insurance analytics products, expanding existing capabilities and developing new offerings to meet our clients' changing needs.

Derek’s background made him a natural choice for SpatialKey. For most of his 20-year career, he has helped insurers—and reinsurers—use technology to mitigate, manage, and transfer catastrophe risk.

“Derek’s extensive experience in the definition and development of insurance analytics products makes him the ideal person to help us identify innovative product opportunities,” explains SpatialKey COO Bret Stone. “He’ll be instrumental in further developing our product strategy and reinforcing our presence and commitment within the insurance industry.”

In addition, Derek is committed to innovation and will draw on his experience bringing natural and man-made catastrophe risk and exposure analysis products to market across multiple insurance lines and segments.

“Over the years, it has been fascinating to watch insurers embrace analytics and use those insights to shape their risk management, underwriting, and claims decisions,” explains Derek. “I’m excited to join SpatialKey’s smart, innovative team and to continue identifying ways to make the business of writing risk easier—and more profitable—for insurers.”

Posted in Press Releases, Insurance

Brussels Attack Highlights Need for Sophisticated Insurance Analytics Tools

Posted on April 4, 2016 by Sarah Stadler

Photo credit: REUTERS/Charles Plateau.

Sadly, the recent bombings in Brussels—just a few months after the attacks in Paris—are a grim reminder that terrorism has become a fact of life in Western cities.

For Property & Casualty insurers, this unfortunate reality means keeping terrorism risk top of mind when underwriting properties. The three coordinated explosions in Brussels—two at Zaventem Airport and one at Maelbeek metro station—underscore the complex nature of terrorism risk.

To help underwriters write terror risk with more confidence, SpatialKey has added concentric rings models that simulate realistic bomb blast scenarios and provide insurers with more accurate risk assessments.

Get a More Accurate View of Terror Risk

From a single, interactive environment, SpatialKey helps you quickly evaluate terror risk and the impact on your portfolio.

With the new concentric rings model, you can evaluate the worst-case scenario, or peak accumulation, if an attack, such as a bomb blast, were to occur near prospective locations. The model simulates damage radiating away from the blast, so you can see complete or partial damage diminishing as buildings get farther from the original blast site.
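To illustrate the concentric-rings idea (this is a simplified sketch, not SpatialKey's model, and the ring radii and damage factors below are purely illustrative assumptions), each location is assigned the damage factor of the ring band it falls in, with damage diminishing by distance from the blast site:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in meters."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371000.0 * asin(sqrt(a))

# Hypothetical ring bands: (outer radius in meters, damage factor).
RINGS = [(100, 1.00), (250, 0.50), (500, 0.10)]

def estimated_loss(blast, locations):
    """Apply the damage factor of the innermost ring each location falls in;
    locations outside the outermost ring contribute no loss."""
    total = 0.0
    for lat, lon, value in locations:
        d = haversine_m(blast[0], blast[1], lat, lon)
        for outer_radius, factor in RINGS:
            if d <= outer_radius:
                total += value * factor
                break
    return total

# Toy example: blast at the origin, locations at increasing distances
blast = (0.0, 0.0)
schedule = [
    (0.0, 0.0005, 100.0),  # ~56 m away: full damage
    (0.0, 0.0010, 200.0),  # ~111 m away: partial damage
    (0.0, 0.0040, 300.0),  # ~445 m away: light damage
    (0.0, 0.0100, 400.0),  # ~1.1 km away: outside all rings
]
loss = estimated_loss(blast, schedule)
```

Running the same loop over many candidate blast sites (e.g., a list of high-interest targets) is one way to surface the worst-case accumulation across a portfolio.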

“We developed the concentric rings model because our clients and prospects recognized the need to consider terrorism risk in their underwriting and portfolio management processes,” explains SpatialKey Product Manager Angie Olivero. “And now they can differentiate the damages within different radii for a richer understanding of terror accumulations.”


In addition, SpatialKey gives you the option to run your analysis based on specific metrics in your portfolio, on exposed limits, or on a specific area of interest within custom geographic extents.

And with the SpatialKey Accumulations app, you can dive deeper into terrorism risks using the Rings and Target features, which allow you to:

    • Identify and evaluate peak accumulations of exposure within your portfolio
    • Analyze your portfolio against terrorist activity, high-interest properties, or historical terror claims
    • Look at terror target datasets (locations identified as having terrorism risk such as skyscrapers, churches, train stations, etc.)


And thanks to an advanced understanding of exposure concentrations, including where terror attacks are most likely to occur, you can analyze your current book of business and apply guidelines to select profitable new business.

What’s more, SpatialKey can help you evaluate the potential for claims given a recent or future terrorist attack. By considering attacks at credible terrorist targets, insurers can begin to understand their overall risk and formulate a claims response plan.

To learn more about how SpatialKey can help you feel confident writing terror risks and minimize the impact on your portfolio, contact us today.

Posted in Insurance, Underwriting, Risk Assessment