
Mitigating Flood Risk with Location Intelligence

While some weather patterns are predictable, extreme weather events tend to run on their own calendar. Unfortunately, severe weather events tend to cause the most damage and are occurring more frequently, resulting in a substantial financial impact on Canadian insurers. According to Catastrophe Indices and Quantification Inc. (CatIQ), insured damage from severe weather events across Canada reached a staggering $1.9 billion in 2018.

Canada will no doubt continue to experience catastrophic weather events, but insurance professionals have solutions available that can help mitigate risk and reduce the financial impact of these events. Location intelligence is an emerging field that offers substantial benefits to the insurance industry, from the policy quote and approval process through to claims management and customer service.

Advances in cloud computing are enabling insurers to migrate traditional geospatial analytics from an offline, manual process to real-time integrated solutions and automated workflows. Using location as a common point of reference, users can connect policy, property, claims and third-party data to gain insight into a property and its surrounding area, allowing them to quickly analyze, visualize and assess property risk.

One common use case is the real-time, automated analysis of a proposed policy against flood zones, earthquake zones and other potential perils. A high-precision view of where a property sits relative to identified risks allows for more accurate assessments. The availability of real-time location data offers two primary benefits. The first is increased productivity, delivered by automating assessments and consolidating the output on one screen so that agents can make better, faster decisions. The second is pricing that properly reflects the risk scenarios being proposed. The key to enabling both benefits is empowering the end user with information in real time.
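
To make this concrete, here is a minimal sketch of such an automated screen, assuming the property has already been geocoded and that flood-zone polygons are available from a hazard-data provider. It uses the open-source shapely library; the coordinates and the zone itself are hypothetical.

```python
# Minimal flood-zone screen: test a geocoded property against a
# hazard polygon. The zone and coordinates below are hypothetical.
from shapely.geometry import Point, Polygon

# Hypothetical 1/100-year flood-zone polygon (lon/lat vertices).
flood_zone_100yr = Polygon([
    (-79.40, 43.64), (-79.37, 43.64),
    (-79.37, 43.66), (-79.40, 43.66),
])

def screen_property(lon: float, lat: float) -> dict:
    """Flag whether a geocoded property falls inside the flood zone."""
    location = Point(lon, lat)
    return {
        "in_100yr_flood_zone": flood_zone_100yr.contains(location),
        # Distance is in degrees here; a production system would project
        # to a metric coordinate system before measuring distance.
        "distance_to_zone": location.distance(flood_zone_100yr),
    }

print(screen_property(-79.385, 43.650))
```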

While an agent is assessing the risk of a potential property, location data can answer a multitude of questions: Is this property exposed to a 1/20, 1/100, 1/200 or 1/1500 year flood event? What are the types and magnitudes of historical claims in the surrounding area? Is my accumulation risk too high, or can I take on more policies in this area? Are current policies priced to cover the identified risk? The answers to these questions are essential when an insurer is trying to accurately calculate total risk exposure from perils such as flooding, and to quote the policy both accurately and quickly.
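
As an illustration of the accumulation question above, the sketch below totals hypothetical in-force insured values within a radius of a proposed property. The portfolio, radius and limit are invented for the example; a real system would query the insurer's own policy data.

```python
# Accumulation-risk check: total insured value already in force within
# a radius of a proposed property. All figures are hypothetical.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 \
        + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

# Hypothetical in-force policies: (lat, lon, total insured value).
in_force = [
    (43.651, -79.383, 750_000),
    (43.655, -79.380, 1_200_000),
    (43.700, -79.400, 900_000),
]

def accumulation_within(lat, lon, radius_km=1.0, limit=2_000_000):
    """Sum exposure near the proposed property and compare to a limit."""
    exposed = sum(tiv for plat, plon, tiv in in_force
                  if haversine_km(lat, lon, plat, plon) <= radius_km)
    return exposed, exposed > limit

exposure, over_limit = accumulation_within(43.652, -79.382)
print(f"Accumulated exposure: ${exposure:,} (over limit: {over_limit})")
```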

Severe weather events and flooding will continue to present risks to Canadian insurers, but access to real-time location data, including flood information, can enable insurers to make more informed decisions during the approval process. The weather will remain unpredictable; location intelligence is a powerful tool to protect your bottom line.

To learn more, read our white paper: Mitigating Risk with Location Intelligence

Additional Reading:

Top 3 Ways Location Intelligence Empowers Underwriters

5 Hidden Data Points Insurance Companies Can Use to Improve their Customer Experience

Creating exceptional customer experience is becoming a top priority for many insurance providers as an essential way of differentiating themselves in a fiercely competitive marketplace. If you and your competitor offer similar pricing, but your competitor provides a better customer experience, who do you think will land the customer?

Location data offers actionable information that insurance providers can use to deliver high quality customer experiences at every touch point. Interested in using data to outpace your competition by offering stronger customer experiences? Check out these 5 hidden data points:

1) Accurate Postal Codes  

You want to ensure you’re offering the most accurate quotes possible when customers come to you seeking coverage. If you’re assessing risk based on postal codes alone, you aren’t working from the most accurate location information, and this will be reflected in your pricing. Flooding and earthquakes, for example, don’t stop at postal code boundaries. It is more accurate to assess properties based on latitude, longitude and elevation to truly understand risk.
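
The toy example below shows why this matters, assuming a simple elevation-based flood test: a postal-code centroid and a rooftop geocode for the same hypothetical property can land on opposite sides of the risk threshold.

```python
# Postal-code centroid vs rooftop geocode: same property, different
# answers. Coordinates, elevations and the threshold are hypothetical.
FLOOD_ELEVATION_M = 76.0  # local flood elevation threshold (assumed)

def at_risk(elevation_m: float) -> bool:
    """Flag locations sitting below the flood elevation."""
    return elevation_m < FLOOD_ELEVATION_M

postal_centroid = {"lat": 43.6505, "lon": -79.3820, "elev_m": 82.0}
rooftop_geocode = {"lat": 43.6489, "lon": -79.3774, "elev_m": 74.5}

print("Centroid assessment: at risk =", at_risk(postal_centroid["elev_m"]))
print("Rooftop assessment:  at risk =", at_risk(rooftop_geocode["elev_m"]))
```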

Using a location intelligence platform like DMTI’s Location Hub, insurers can better identify and rate risks for the properties of interest. Actionable information ensures underwriters make data-backed decisions based on the specific locations being reviewed. You get real-time, high-precision geocoding delivered to trusted industry standards, and customers get pricing they can trust.

2) True Risk Concentration

You cannot gain a true understanding of a property’s overall risk if you examine perils in silos. Location intelligence offers the ability to review the total concentration of risk, such as flood, fire and other natural hazards, assess the likelihood of any of these occurring, and see how close the property is to emergency services.
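
As a rough illustration, the snippet below folds several peril flags and proximity to emergency services into a single composite view. The flags, weights and adjustment are illustrative assumptions, not an actuarial model.

```python
# Combine individual peril checks into one composite view instead of
# assessing each in a silo. Weights and flags are hypothetical.
hazard_flags = {"flood_100yr": True, "wildfire": False, "earthquake": False}
weights = {"flood_100yr": 0.5, "wildfire": 0.3, "earthquake": 0.2}
distance_to_fire_station_km = 2.4  # hypothetical proximity input

score = sum(weights[h] for h, flagged in hazard_flags.items() if flagged)
if distance_to_fire_station_km > 5:
    score += 0.1  # farther from emergency services, slightly higher score

print(f"Composite hazard score: {score:.2f}")
```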

Here’s how this improves your customer’s experience. Aside from providing customers with fair, accurate pricing, you also have a comprehensive understanding of their exact insurance needs, including needs they may not realize they have. For example, a customer coming to you for property insurance may not realize they also require coverage for wildfire or flooding. Using location intelligence, you’ve got the data to show customers what they need, making them feel confident in your services and completely protected by the coverage you offer.

3) Reduce Processing Time

Average processing time for quotes and claims is an essential data point every insurance provider must be aware of. Most people are accustomed to immediate gratification, and staying competitive in the insurance industry requires providers to deliver information quickly.

Leveraging the right digital tools is the key to tracking and reducing processing time. Location intelligence easily feeds into platforms that automate the underwriting and claims process according to your company’s rules and guidelines.
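
A minimal sketch of what that hand-off might look like appears below: location-derived flags routed through a simple set of underwriting rules. The thresholds and outcomes are hypothetical stand-ins for a company's own guidelines.

```python
# Route an application based on location-derived flood flags.
# Rules and outcomes are hypothetical examples only.
def route_application(location: dict) -> str:
    """Apply simple underwriting rules to location-derived flags."""
    if location["in_1_in_20_flood_zone"]:
        return "refer_to_underwriter"    # too exposed for straight-through
    if location["in_1_in_100_flood_zone"]:
        return "auto_approve_with_flood_surcharge"
    return "auto_approve_standard"

print(route_application({"in_1_in_20_flood_zone": False,
                         "in_1_in_100_flood_zone": True}))
```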

4) Customer Information

Location intelligence platforms enable customer data to be seamlessly integrated, allowing you to autofill information and limit the number of manual inputs required when generating quotes. Automating the entry of personal information reduces costs for you while providing customers with a better digital experience. Location intelligence also allows you to review customer claims patterns for real-time, data-based decision making throughout the underwriting process.
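
For example, a validated address record might be mapped onto quote-form fields along the lines sketched below; the record structure and field names are illustrative assumptions.

```python
# Autofill quote-form fields from a validated address record,
# cutting manual inputs. Field names are illustrative.
address_record = {
    "street": "123 Example St", "city": "Toronto",
    "province": "ON", "postal_code": "M5V 0A1",
    "lat": 43.6532, "lon": -79.3832,  # used downstream for risk lookups
}

form_fields = ("street", "city", "province", "postal_code")
quote_form = {field: address_record.get(field, "") for field in form_fields}
print(quote_form)
```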

5) Complete Customer Portfolios

Instant access to customers’ portfolios provides an overview of data revealing areas where your insurance company could offer additional services. For example, with a location intelligence platform you’ll be able to see all properties owned by a customer that may be covered by other insurance providers. With complete property information paired with hazard data, you’ll also be able to see whether your customer has adequate coverage to fully protect their assets. This lets you provide a great customer experience because you enter conversations prepared with complete information, and with options to offer better rates and coverage than your competition.

Click here to learn more.


Leveraging Flood Analysis to Mitigate Risk

Condo popularity is on the rise. For those financing these properties, whether a reputable financial institution or the bank of Mom and Dad, is your investment protected? What happens in the event of a flood, and are you asking all the right questions before investing? For example, are you investing in properties at high risk of flooding? With help from DMTI, we can take a closer look at a few regions to understand your exposure.

What risk does flood pose in Canada?

Floods are the most frequently occurring natural hazard in Canada. According to the Institute for Catastrophic Loss Reduction (ICLR), the Canadian Disaster Database indicates that 241 flood disasters occurred in Canada between 1900 and 2005, almost five times as many as the next most common disaster (wildfire). Over the past few decades, urban flooding has been a growing problem, resulting in more than $20 billion in flood damage between 2003 and 2012, according to the federal government.

What is the risk of flood peril to condos in Canada?

To answer this question, a condominium database for Canada created by Teranet and DMTI Spatial was combined with flood hazard maps highlighting areas that could be impacted by river flood (where water rises over the banks), surface water (where water pools due to elevation differences) and storm surge (coastal flooding). The analysis focused on three key markets: Toronto, Vancouver and Montreal.
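
The overlay behind figures like these can be sketched as a point-in-polygon test over the building points, as below. The condo points and hazard polygon here are hypothetical; the actual analysis used the Teranet/DMTI condominium database and flood hazard maps.

```python
# Share of condo buildings falling inside a flood hazard polygon.
# Geometries are hypothetical; a real run would use full hazard maps.
from shapely.geometry import Point, Polygon

river_flood_100yr = Polygon([(-79.36, 43.64), (-79.34, 43.64),
                             (-79.34, 43.67), (-79.36, 43.67)])

condos = [Point(-79.350, 43.650),
          Point(-79.380, 43.650),
          Point(-79.355, 43.660)]

affected = sum(1 for c in condos if river_flood_100yr.contains(c))
print(f"{affected / len(condos):.1%} of condos fall in the river flood zone")
```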

Toronto, ON

The flood risk analysis (using 1/100 year return period) for Toronto revealed that approx. 1.2% of all condo buildings may be impacted by river flood risk and approx. 5.9% of all condo buildings may be impacted by surface water flood risk.


Figure 1: Toronto, Ontario – Condos falling within the river flood hazard map for the 1/100 year return period.

Vancouver, BC

The flood risk analysis (using the 1/100 year return period) in Vancouver revealed that approx. 7.3% of all condo buildings may be impacted by surface water flood risk and approx. 3.2% of all condo buildings may be impacted by storm surge flood risk.

Figure 2: Vancouver, British Columbia – Condos falling within the surface water hazard map for the 1/100 year return period.

Montreal, QC

The flood risk analysis (using the 1/100 year return period) in Montreal revealed that approx. 15.0% of all condo buildings may be impacted by river flood risk and approx. 11.4% of all condo buildings may be impacted by surface water flood risk.

Figure 3: Montreal, Quebec – Condos falling within the surface water hazard map for the 1/100 year return period.

What does this mean to my business?

According to the Insurance Bureau of Canada (IBC), 20% of Canadian households could qualify as high risk for flood perils, and about 10% would be considered very high risk, which equates to about 1.8 million households. Understanding the impact of natural disasters such as catastrophic flooding is a complex issue. Many customers are challenged with identifying and mitigating the total risk and exposure within their existing portfolios. Here are some additional areas for consideration that would benefit from this type of analysis:

  • Risk Mitigation: Enhance real-time mortgage adjudication processes, speed time to decision and reduce manual intervention with enhanced insight into the precise location of the property as it relates to a flood zone.
  • Risk Analysis: Validate capital adequacy requirements and better understand and reduce exposure by being able to assess the total accumulated risk to a portfolio as it relates to proximity within flood plains.
  • Site Planning: Enhance infrastructure and site planning analysis by understanding the potential risk of flood before deployment.

This analysis was conducted by DMTI Spatial using its Location Hub platform, which supports real-time flood risk analysis, portfolio accumulation risk analysis and real-time visualization of potential exposure to flood zones. This provides key data to better forecast exposure and mitigate risk.
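
A simplified sketch of such a portfolio-level screen is shown below: each property is flagged as inside a flood plain, near it, or clear. Geometries and the buffer distance are hypothetical, and distances are left in degrees for brevity where a production analysis would use a projected, metric coordinate system.

```python
# Portfolio proximity screen against a flood plain polygon.
# All geometries and the buffer width are hypothetical.
from shapely.geometry import Point, Polygon

flood_plain = Polygon([(-73.58, 45.49), (-73.55, 45.49),
                       (-73.55, 45.52), (-73.58, 45.52)])
buffer_zone = flood_plain.buffer(0.01)  # roughly 1 km at this latitude

portfolio = {"prop_a": Point(-73.570, 45.500),   # inside the plain
             "prop_b": Point(-73.545, 45.510),   # near the plain
             "prop_c": Point(-73.500, 45.550)}   # well clear

for prop_id, location in portfolio.items():
    if flood_plain.contains(location):
        status = "in flood plain"
    elif buffer_zone.contains(location):
        status = "within buffer of flood plain"
    else:
        status = "clear"
    print(prop_id, status)
```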

Contact us to learn more.


How Spatial Technology is Transforming Customer Experiences in the Insurance Industry

Using Spatial Technology to Improve the Customer Experience in the Insurance Industry…While Increasing Profits

Customer experience in the insurance industry is a hot topic, and it’s easy to see why. A strong customer experience is the key to growth and profitability, requiring many companies to focus on differentiating their customer experience to stay ahead of the competition. However, despite the obvious benefits of cultivating happy customers, many insurance providers struggle with how to deliver a great customer experience and increase profits.

You must find ways to maintain the integrity of your business while serving customers better, because if you don’t, someone else will. In his 2016 letter to shareholders, Amazon CEO Jeff Bezos explained that being customer-centric is about staying ahead of customers and developing new ways to keep them happy. Bezos warned that once you shift your focus away from creating great customer experiences, you’re already on your way down.

Insurance providers are concerned that providing better customer experiences could conflict with the established rules and processes they’ve developed for risk selection and pricing. This certainly doesn’t have to be the case. The solution is using a spatial technology platform that supports and strengthens existing workflows while helping develop meaningful digital experiences for customers.

Digital Customer Experience in the Insurance Industry

In today’s tech-enabled world, your customers’ digital experience plays a huge role in how they perceive your company. Your website is often a customer’s first touch-point with your company and the first time they assess whether you can meet their needs. Studies show you’ve got less than a minute to convince website visitors to try your product or service. In insurance, that translates to showing customers they can expect fair pricing and an efficient turnaround time.

That’s where your spatial technology platform comes in.

Use Spatial Technology to Deliver What Customers Want

People expect online experiences to be simple and intuitive, and they are accustomed to instantly accessing the information they want. In the insurance industry, providing a great customer experience means being able to quickly deliver information or a plan for coverage. Leveraging a spatial technology platform enables automation and self-service, making the process of generating insurance quotes and pricing simple, fast and efficient.

More and more insurance companies are experiencing the transformative effects of automation, including improved accuracy and efficiency. Data gathered from customer inputs can be delivered automatically into your underwriting process while remaining consistent with the risk selection and pricing models developed by your company’s actuarial/risk group. A well-defined process can easily interface with a spatial technology platform to automate inputs for calculation, resulting in instant assessment and pricing.

Spatial technology platforms deliver location-based insights, including tools like geocoding, digital mapping, data analytics and visual dashboards, helping insurers use location intelligence throughout the policy lifecycle to deliver on client needs without sacrificing profitability. Using automated workflows, your spatial technology platform can geocode addresses in real time, providing high-precision coordinates, and automatically cross-reference this data with the appropriate risk factors in your risk assessment models and pricing engine.
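
A skeleton of that workflow might look like the sketch below; the geocoder, peril lookup and rating factors are hypothetical placeholders rather than any real vendor API.

```python
# Geocode an address, look up peril factors at the coordinates, and
# feed the result into a pricing function. All values are placeholders.
def geocode(address: str) -> tuple[float, float]:
    # Placeholder: a production workflow would call a geocoding service.
    return (43.6532, -79.3832)

def peril_factors(lat: float, lon: float) -> dict:
    # Placeholder: cross-reference hazard layers at this location.
    return {"flood": 1.15, "earthquake": 1.00, "wind": 1.05}

def price(base_premium: float, factors: dict) -> float:
    """Apply multiplicative peril factors to a base premium."""
    premium = base_premium
    for factor in factors.values():
        premium *= factor
    return round(premium, 2)

lat, lon = geocode("123 Example St, Toronto, ON")
print(price(1_000.00, peril_factors(lat, lon)))
```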

In addition to improving profitability by streamlining customers’ access to plans and policies, spatial data paired with customer information can also be used to identify additional plans that consumers may not realize they could benefit from, offering ample opportunity for your company to cross-sell or upsell additional plans and features.

Using Spatial Intelligence to Improve Risk Assessment and Pricing

A strong spatial technology platform provides the location-based insights and analytics necessary to support underwriting, exposure management and claims. This data allows you to improve auditability with defined automated rules and quickly aggregate and visualize location information for more effective and accurate analysis when determining risk.

Real-time location intelligence helps insurers respond quickly and accurately to policy applications, catastrophic events and claims. Applying geographic coordinates to a property of interest (geocoding) lets insurers associate complex data sets, covering earthquakes, flooding, windstorms and more, with those coordinates, enabling deeper insight into concentration risk and hazard exposure. These insights can enable no-touch or low-touch adjudication processes, or support exception handling and portfolio risk analysis through visualization.

Finally, a strong spatial technology platform is continually updated with the most current, precise location data, improving risk assessment and ensuring accurate pricing.

DMTI Helps Insurance Companies Improve the Customer Experience

When it comes to developing a great customer experience in the insurance industry, it’s important to keep things simple. Companies now have the tools to gather data on what customers want and need, keeping them happy, loyal to your brand and likely to refer you to other potential customers. Using spatial technology, you can manage and monitor risk exposure in real time, with complete location data on one platform.

DMTI is the gold standard for GIS and location-based data in Canada, offering insurers scalable, on-demand tools that support real-time workflows. DMTI Spatial’s Location Hub® & UAID® is the only solution of its kind in Canada, supporting your company’s risk assessment process by delivering data visualization tools and data delivery infrastructure that provide high-precision location accuracy.

Click here to find out how DMTI will help your insurance company experience greater growth and profitability.

From Data Analytics to the Cloud

Top Finance Execs Discuss Industry Hot Topics

In an increasingly digitized world, data carries undeniable clout. A source of insight for optimizing business operations, data analytics also supports the development of previously unimagined products and services as new sources of social and sensor information, and the technology to manage them, come online.

But if the vision for data’s potential is coming into sharper focus, for many organizations the actual integration of new volumes and varieties of data to create business value is less clear. In this situation, sharing experiences of data successes and ongoing challenges can galvanize discussion and, ultimately, more successful deployment of data and analytics solutions. Within a specific industry, this sharing takes on added import as common language and circumstance create quick sympathy, a phenomenon demonstrated at the June installment of DMTI Spatial’s Strategic Insights Sessions, where key players in the financial sector considered top-of-mind issues in the use of analytics in banking and insurance.

Data Analytics in Finance

To kick off this month’s pre-game session at the Rogers Centre in Toronto, DMTI invited four panelists to outline their involvement with different analytics projects. These presentations provided a springboard for session attendees to discuss their own knowledge and experience of data practices. Like the topics covered in the panel presentations, conversation in the post-panel breakout sessions was animated and broad-based. Several key themes arose in the panel, breakouts and Q&A segments, however, that reveal much about the current state of data analytics adoption.


Analytics-based customer experience vs. privacy

In her presentation, Susan Doniz, global chief information officer of Aimia, a provider of marketing and loyalty analytics solutions, led with the notion of ‘customer experience’, a buzz topic that is an increasing preoccupation of customer-facing organizations on the cutting edge of service delivery.

According to Doniz, loyalty programs based on sophisticated analytics can help businesses understand the consumer, but the key is to use data to do things for people rather than for the business alone. To introduce the issues around this practice of ‘personalization’, she provided a couple of examples: UK-based grocer Sainsbury, which analyzed what people were buying to discover who might be prone to heart conditions and hence improve targeting for medication and wellness marketing campaigns, and Sephora, which developed “Inform” analytics to remind customers what they had bought in the past to ease the shopping experience.

While personal information can inform service delivery, Doniz argued that consumers today also want transparency around how their data is being used, the ability to turn the data flow off, and information on who has accessed their personal data. Though largely a government responsibility, the privacy of personal data is also a challenge for CIOs, she argued, who typically do not have a lot of experience in this area. What limits should be placed on information sharing, what is necessary to collect, and which institutions consumers trust with information like a “digital ID” are all questions that still need to be resolved.

Cloud a panacea for Big Data requirements?

Kognitive Marketing CFO Atif Ansari made a strong case for creating customer experience through cloud computing, illustrating the benefits of this data delivery approach by describing his work with Bank of America. Ansari asked: “Performance measurement takes a long time… how do you simplify it, [and move data from back office systems] so that it serves the front office, who can use it to better serve customers?”

Since bank transactions are typically managed on a quarterly basis, Ansari explained, it has traditionally been necessary to build huge computing systems that were poorly matched to this kind of schedule but could provide capacity when demand was at its peak. While this practice is common, it is extremely costly. As a result, the bank moved to the cloud, dramatically reducing the costs associated with data housing and management (savings in the millions, Ansari added), while also delivering instant access to data for field workers: a financial investment advisor, for example, is now able to access cloud data to show a customer everything about their individual portfolio in real time, in the field.

Ansari acknowledged the security and data residency concerns around cloud technology that linger in this sector, but argued that now that service providers can deliver virtual private clouds located in Canada, there is increasing adoption of cloud within the financial services industry. Ansari’s perspective on cloud was not universal, however, and other session attendees voiced more reservations. For example, Curtis Gergley-Garner, chief risk officer at Canada Guaranty Mortgage Insurance Co., noted that cloud tends to be problematic at his organization since it works with bank data, and the banks in turn require that Canada Guaranty’s systems be as robust as their own: banks need to become comfortable with cloud data first, before insurers take this step, he insisted.

Similarly, Dion Yungblut, VP at Capital One, observed that while people are moving to cloud, the banks and regulators are not really there yet; a leader’s job, he said, involves understanding how to stay on top of that: how to take advantage of the “exponential growth of access to data and computing” and to create “nimble IT infrastructure” if cloud is not ubiquitous. To clarify, Brad Struthers, director of collateral management and strategic alliances at RBC, pointed to the importance of separating cloud computing from cloud storage in this debate, since the two may have different security management requirements. A lack of understanding of issues like this, he added, may be preventing banking organizations from tapping into new technologies.

Tech for operational efficiencies

David Bradshaw’s data story was one of growth management. As the VP of client business support at Tangerine Bank explained, the Tangerine group predicted their transactional volume was going to grow by 48 percent as operations ramped up, and built a “scalability model” based on Excel spreadsheets to figure out how to prevent operating costs from growing at the same rate. This model helped Tangerine identify the areas that were going to expand most quickly and would benefit most from process optimization: areas such as deposits, where manual management of fuzzy cheque photos proved a real bottleneck, as did the mailroom, mortgages and payments.

Through this analytics exercise, the organization was also able to estimate the potential savings to be gained by replacing manual processes with technology, and to plug in tools to achieve significant improvements in focus areas like the mailroom. By streamlining processes and identifying the right software tools to increase operational efficiency and speed time to market, Tangerine managed to increase volumes by 45 percent with no increase in operating costs.

Fill in the right data blanks

Curtis Gergley-Garner from Canada Guaranty launched his presentation with the observation that, as a relatively new company in the mortgage insurance space with two very large competitors that have far more data, the firm has to be very efficient with the data it does have.

From a business perspective, the goal of Canada Guaranty’s data strategy was to improve the efficiency, consistency and speed with which mortgage applications are approved, decrease the cancellation rate (as frustrated customers look elsewhere), improve customer service and, ultimately, improve the overall quality of Canada Guaranty’s credit. To achieve this, the firm mounted two specific data projects. One involved the creation of a proprietary scorecard, based on trended credit bureau information over five quarters, which enhanced the company’s ability to predict forward default. On the property side, Canada Guaranty also worked with DMTI Spatial to implement the Location Hub platform, which addresses problems with ambiguous addressing through the Unique Address Identifier (UAID) standard and by analyzing the address quality in a customer data set with automated error checking.

As Gergley-Garner explained, when individuals input address data, it’s not uncommon for errors to occur; by the same token, it is difficult to achieve an “automated value model” when address data is wrong and easy to “miss a lot of hits” in address search.

The importance of data cleansing, data integrity and completeness of the data set, in location information and other areas, to implementing solutions that can support business objectives was echoed by other session participants. Noting regulators’ move from structured data models to real-time analytics models, Parag Gupta of Northbridge Financial Corporation asked, “How can you clean up their data?” and “What kind of problem does this cause for the insurance business?” Pointing to the fact that it’s typical for an organization to have lots of data in one area and shortages in others, Brad Struthers from RBC asked, “How is it possible to fill in the gaps?”

While no definitive answers to these data quality issues, or to the questions that linger around privacy and the use of cloud, emerged at the session, by drawing together individuals with similar industry experience who ask the right questions, DMTI’s hosts are helping to shape an ongoing and healthy dialogue on best practice in the use of financial data and analytics.

Contact us to learn more.