Tag Archives: geocoding

New insurtech product, Milliman PinPoint, selected by North Carolina Rate Bureau to deliver granular flood rating plan

Milliman today announced the launch of Milliman PinPoint, an innovative insurtech software solution that enables insurers to cost-effectively evaluate, price, and market residential property and flood products through location-level geospatial information that is customizable for each user.

Insurers are looking for ways to take advantage of geographic information systems (GIS) technology and apply it to better understand their current and future customers. PinPoint offers a way for insurers to zero in on a property’s risk, and can be especially valuable for companies thinking of entering new markets such as private flood or adopting more granular rating strategies in their existing markets.

For example, Milliman PinPoint has already been implemented by the North Carolina Rate Bureau (NCRB), which is using the insurtech solution for members who wish to offer admitted private flood insurance in the state.

“PinPoint provides a rating solution for insurance companies that can be readily adopted without spending IT resources and significant up-front costs,” says Andy Montano, Personal Lines Director at the NCRB. “PinPoint is an important option we offer our member companies, making it easy to implement the recently approved NCRB flood program.”

Using a simple application programming interface (API), PinPoint delivers data and insights to customer systems at the point of decision. It provides a level of granularity not frequently seen in insurance products, including distance calculations (such as distance to coast), elevation statistics (such as elevation relative to surrounding areas), and market data (such as Census information or competitor premiums). The API also provides company-specific rating algorithms, delivering premium calculations and customizable rating territories across all 50 states. This can be especially valuable for insurers and managing general agents (MGAs) looking to quickly and efficiently launch new products in emerging markets, such as private flood in the U.S. Because PinPoint is built and implemented by Milliman’s property insurance experts, it is tailor-made to fit specific business objectives, giving clients quick time to market in addition to the full customization they need.
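As an illustration of the kind of point-of-decision lookup described above, here is a minimal sketch of a client consuming a location-level rating payload. The payload shape, field names, and values are assumptions for illustration only, not the actual PinPoint API.

```python
# Hypothetical sketch of consuming a location-level rating payload.
# All field names and values below are invented for illustration.
from dataclasses import dataclass

@dataclass
class LocationRating:
    distance_to_coast_mi: float   # distance from the property to the coastline
    relative_elevation_ft: float  # elevation relative to the surrounding area
    territory: str                # rating territory assigned to this point

def parse_response(payload: dict) -> LocationRating:
    """Map a (hypothetical) JSON payload onto a typed record."""
    return LocationRating(
        distance_to_coast_mi=payload["distance_to_coast_mi"],
        relative_elevation_ft=payload["relative_elevation_ft"],
        territory=payload["territory"],
    )

sample = {"distance_to_coast_mi": 3.2, "relative_elevation_ft": 14.0,
          "territory": "NC-017"}
rating = parse_response(sample)
print(rating.territory)  # NC-017
```

The point of the typed record is that downstream rating code works against named, unit-labeled fields rather than raw JSON keys.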

For more information, click here.

Flood warning: Working to provide better coverage

Flood is one of the most devastating catastrophic perils: a single event can create tens of billions of dollars in losses. It is also one of the least insured perils, affecting people in every part of the United States. Advanced risk models now provide granularity, assessing flood risk at local levels. Such technological development presents insurers with the opportunity to offer affordable, risk-based coverage within a private insurance market. Milliman colleagues Nancy Watkins, Matt Chamberlain, Andrei Stoica, and Garrett Bradford offer perspective in this video.

To learn more about Milliman’s flood expertise, click here.

Reading list: Florida’s private flood insurance market

Advances in catastrophe models and new state insurance regulations have opened the door for an affordable, risk-based private insurance market in Florida. This reading list highlights articles focusing on various issues and implications related to the market. The articles feature Milliman consultants Nancy Watkins and Matt Chamberlain, whose knowledge and experience are helping insurers to understand and price flood risk more precisely.

Forbes: “The private flood insurance market is stirring after more than 50 years of dormancy”
The reemergence of private flood insurance has piqued the interest of carriers seeking to enter the market. Some catastrophe (CAT) modeling companies are creating flood models to help insurers price policies. Here’s an excerpt:

Nancy Watkins, a principal consulting actuary for Milliman, likened the current level of interest from insurers to enter the private flood insurance market to popcorn.

“We are at that stage where you can hear the space between pops. You can hear one kernel at a time,” she said. “What I think is going to happen is, in one to two years, there’s going to be a lot more going on.”

Bradenton Herald: “Important for homeowners to compare flood insurance options”
Florida homeowners must consider the issues related to the National Flood Insurance Program (NFIP) and private flood policies. Private insurers can use predictive modeling technology to determine a home’s distinct flood risk.

Tampa Bay Times: “Remember the flood insurance scare of 2013? It’s creeping back into Tampa Bay and Florida”
Real estate and insurance experts comment on the possible effects that high flood insurance rates may have on homeowners. Insurers express interest in the granular modeling of flood-prone territories.

Tampa Bay Business Journal: “Why some Tampa Bay property insurers are offering flood coverage and others are not” (subscription required)
Insurers need to weigh the risks and rewards associated with the underwriting of flood insurance. A few carriers have already decided to participate in Florida’s private flood insurance market.

Geographic information systems can help insurers price flood risk

Insurers have been cautious about reentering the homeowners flood insurance market, largely because of the high risks floods pose. In his Best’s Review article “High water mark,” Milliman’s Matt Chamberlain discusses the reasons behind the industry’s trepidation. He also provides perspective on how geographic information systems (GIS) can help insurers develop granular rating plans. Here is an excerpt:

There are several reasons why flood has been considered an uninsurable risk. First, flood is a localized peril; a distance of a few hundred feet, or less, can make a large difference in risk. This produces an information asymmetry, because the insured has a clear understanding of the local topography, while the insurer does not. The insured knows how far the house is from water, and whether it is on the top of a hill or if it is in a depression.

Insurers, on the other hand, typically use large rating territories for homeowners insurance, in some cases larger than a county. If these territories were to be used for flood insurance, it would create the potential for adverse selection. Insureds that were at highest risk of a flood would be most likely to want the coverage, and if insurance companies did not have a means of distinguishing higher-risk from lower-risk policies, anti-selection would result….

Geographic Information Systems, when coupled with the new flood catastrophe models to provide a very granular rating plan, may help insurance companies overcome these risks. Territories can be based on “hydrological units,” or watersheds, so that areas that water is not likely to flow between are not grouped together. Within a territory, appropriate rating factors are distance-to-coast (relating to storm surge risk), distance-to-river/stream (relating to river flood risk), and elevation (because all else being equal, there is lower flood risk at higher elevations).

Using all of these rating factors produces a rating plan that is able to distinguish different levels of risk even among points that are near each other. This produces true risk-based pricing that is likely to be sustainable in the long run. The top map at right shows this approach and compares it to the traditional method of rating flood insurance used by the NFIP, shown at bottom.
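To make the excerpt's idea concrete, here is a minimal sketch of a multiplicative rating plan built on the three factors it names. The base rate, distance and elevation bands, and factor values are invented for illustration; a real plan would be calibrated against catastrophe-model output.

```python
# Illustrative multiplicative flood rating plan. Bands and factors are
# made-up values, not an actual filed rating plan.

def flood_rate(base_rate, dist_coast_mi, dist_river_mi, elevation_ft):
    """Each rating factor scales the base rate; nearer/lower means riskier."""
    coast_factor = 2.0 if dist_coast_mi < 1 else (1.4 if dist_coast_mi < 5 else 1.0)
    river_factor = 1.8 if dist_river_mi < 0.25 else (1.3 if dist_river_mi < 1 else 1.0)
    elev_factor = 1.5 if elevation_ft < 10 else (1.2 if elevation_ft < 30 else 1.0)
    return base_rate * coast_factor * river_factor * elev_factor

# Two properties a short distance apart can rate very differently:
low_risk = flood_rate(100.0, dist_coast_mi=6, dist_river_mi=2, elevation_ft=40)
high_risk = flood_rate(100.0, dist_coast_mi=0.5, dist_river_mi=0.1, elevation_ft=5)
```

With these made-up factors, the inland, elevated property keeps the base rate of 100 while the low-lying coastal one rates at 540, which is the kind of within-territory differentiation the excerpt describes.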

The video below presents an example of how GIS can improve pricing strategy.

Hurricane Sandy reading list

Reuters is just out with preliminary analysis from Eqecat indicating the insurance industry has ample capacity to cover losses associated with Hurricane Sandy. This analysis offers an estimate of $5 billion to $10 billion, which represents “a preliminary forecast that could change after the storm makes landfall.”

In the same article, Morgan Stanley provides this observation:

“With $500 billion-plus of capital … we expect the (property and casualty) industry is once again well prepared to pay all Frankenstorm insured losses,” Morgan Stanley analyst Gregory Locraft said in a report on Monday, using the nickname for the Sandy-nor’easter combo.

Rather than predicting what’s about to happen, we’ll point you to some reading on events in the rearview mirror. You can learn a lot about insurance just through how the industry has evolved following natural catastrophes.

  • David Sanders frames up the challenge of insuring natural catastrophes in his 2006 paper, “The Price of Civilization.” Sanders builds off of something Voltaire said after witnessing the wreckage following the 1755 Lisbon earthquake: “Is this the price mankind must pay for civilization?” Sanders tries to answer this question by looking at how we pay the price that natural catastrophes exact and examining who bears the brunt of that expense. Here’s a helpful excerpt:

To assess how dangerous an insurance risk is, it is often convenient to apply the Pareto parameter. This rule—commonly known as the 80/20 rule—states that 20% of the claims in a particular portfolio are responsible for more than 80% of the total portfolio claim amount. With the Pareto parameter as a baseline, we can assess a portfolio’s vulnerability. If a single event can spell financial ruin, there may be a problem.

Hurricane data in the Caribbean indicates that insurers can make a profit for a number of years, and then find themselves hit by a “one-in-1,000-year” hurricane, which swallows up 95% of the sum insured in one go. For example, when Hugo hit the U.S. Virgin Islands, the total cost of the loss for residential property insurers was equal to 1,000 years’ worth of premiums.

The regulators of the insurance industry generally target a one-in-100-year to one-in-200-year insolvency level. They do not cater to the one-in-1,000-year event. Typical solvency levels for major developed insurance markets that cover catastrophes are on the order of three to six times the cost of a once-in-a-century event. However, Katrina-type losses are not one-in-100-year events. Recent history indicates that they are more like one-in-five-year events, which means every five years the insurance industry can expect a $50 billion loss [most recently, Katrina].

There is a finite amount of reinsurance capacity, with billions available worldwide. A company might find a reinsurer willing to insure a $100 million loss, but can they find one willing to cover single-year losses that exceed that threshold? It is difficult to adequately spread around risk and fill a reinsurance dance card when aggregate losses reach ten digits, which is why “securitizing” insurance risks in the capital markets has become an attractive option. Cat bonds were born in the early 1990s following Hurricane Andrew, a seminal event because it revealed the limits of reinsurance. Since then, companies, insurers, and reinsurers have used cat bonds to provide another layer of insurance, often protecting against an insurer’s unlikely third or fourth hundred-million-dollar loss, the one that finally exhausts insurance or reinsurance capacity.
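The 80/20 diagnostic described in the excerpt above is easy to sketch: sort the claims, take the largest 20%, and measure their share of total losses. The claim amounts below are synthetic, chosen only to show a heavy-tailed portfolio.

```python
# Sketch of the 80/20 (Pareto) check: what share of total claim amount
# comes from the largest 20% of claims? Claim amounts are synthetic.

def top_share(claims, top_frac=0.2):
    """Share of the total paid out by the largest top_frac of claims."""
    ordered = sorted(claims, reverse=True)
    k = max(1, int(len(ordered) * top_frac))  # number of claims in the top slice
    return sum(ordered[:k]) / sum(ordered)

# A heavy-tailed portfolio: a couple of large claims dominate the total.
claims = [1, 1, 2, 2, 3, 5, 8, 20, 150, 900]
share = top_share(claims)
```

Here the top two claims account for over 96% of total losses, well past the 80% baseline, which under the excerpt's reasoning flags the portfolio as vulnerable to a single ruinous event.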

  • Companies looking to triage the flood of claims that will come their way in the next two to five days might look to text mining as a way of comprehending Sandy’s claims. Phil Borba’s article on text mining shows how insurers can analyze claim notes to better screen and triage their claims.
  • Matt Chamberlain’s recent article examines how something called geocoding can lead to more precise pricing for homeowners policies in hurricane-prone parts of Florida.

Although it may seem like defining the “coastline” is clear-cut, it is actually quite ambiguous when considering a property’s exposure to a hurricane. Does the coastline follow bays, such as Tampa Bay? Does it follow barrier islands? Does it follow rivers and, if so, how far? After a company decides that it should organize its territories based on distance to the coast, that company’s first instinct may be to use an existing coastline. However, such a coastline may not be suitable for the purpose. Off-the-shelf coastlines may follow many small-scale features that do not, in fact, affect hurricane risk.
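A toy example shows why the choice matters: distance-to-coast is computed against some chosen set of coastline points, so adding or dropping a bay shoreline changes the answer. The coordinates below are rough illustrative values near Tampa Bay, not surveyed data, and the haversine formula gives great-circle distance between latitude/longitude points.

```python
# Illustration: the measured "distance to coast" depends on which points
# we treat as the coastline. Coordinates are rough, made-up values.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two lat/lon points."""
    R = 6371.0  # mean Earth radius, km
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * R * asin(sqrt(a))

def distance_to_coast_km(lat, lon, coastline):
    """Distance to the nearest vertex of whatever coastline we chose."""
    return min(haversine_km(lat, lon, clat, clon) for clat, clon in coastline)

# The same property measures very differently depending on whether the
# "coast" is only the open Gulf or also includes the bay shoreline:
open_gulf = [(27.70, -82.85), (27.90, -82.90)]
with_bay = open_gulf + [(27.95, -82.55)]  # add a Tampa Bay shoreline point
home = (27.96, -82.48)
d_gulf = distance_to_coast_km(*home, open_gulf)
d_bay = distance_to_coast_km(*home, with_bay)
```

Whether the shorter bay distance is the right measure depends on whether bay shoreline actually carries storm-surge exposure comparable to open coast, which is exactly the judgment the paragraph above describes.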

Maybe we’ll have to publish a follow-up article analyzing the potential for geocoding in Lower Manhattan.

Wherever you are, stay safe and dry. And if you are lucky enough to maintain electricity and Internet, feel free to post related reading below.