Posts

UAID Territories

4 Ways to Visualize, Build and Manage your Territories for Canada

Territory management is a process that helps organizations define geographical territories based on factors such as sales, infrastructure locations and service delivery zones. Many companies still manage this type of information in spreadsheets and need a solution that helps them build and visualize territories based on key metrics such as revenue goals, market opportunities, custom client information, government areas and proximity to salespeople’s base locations.

Customer Experience Location Intelligence and Insurance

5 Hidden Data Points Insurance Companies Can Use to Improve their Customer Experience

Creating an exceptional customer experience is becoming a top priority for many insurance providers as an essential way of differentiating themselves in a fiercely competitive marketplace. If you and your competitor offer similar pricing, but your competitor provides a better customer experience, who do you think will land the customer?

Location data offers actionable information that insurance providers can use to deliver high quality customer experiences at every touch point. Interested in using data to outpace your competition by offering stronger customer experiences? Check out these 5 hidden data points:

1) Precise Location Data, Not Just Postal Codes

You want to ensure you’re offering the most accurate quotes possible when customers come to you seeking coverage. If you’re assessing risk based on postal codes alone, you aren’t working from the most precise location information available, and this will be reflected in your pricing. Flooding and earthquakes, for example, don’t stop at postal code boundaries. It is more accurate to assess properties based on latitude, longitude and elevation to truly understand risk.
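
To make the contrast concrete, here is a minimal sketch (hypothetical coordinates and flood polygon, not DMTI data or the Location Hub API) of how a postal-code centroid and a rooftop-level geocode can fall on opposite sides of a flood-hazard boundary:

```python
# Minimal sketch: rooftop geocode vs. postal-code centroid against a flood zone.
# Coordinates and the polygon are made up for illustration.
from shapely.geometry import Point, Polygon

# Hypothetical 1/100-year flood extent (lon/lat pairs)
flood_zone = Polygon([(-79.40, 43.63), (-79.38, 43.63),
                      (-79.38, 43.65), (-79.40, 43.65)])

postal_centroid = Point(-79.41, 43.64)  # centre of the whole postal code
rooftop = Point(-79.39, 43.64)          # geocoded rooftop of the property

print(flood_zone.contains(postal_centroid))  # False -> exposure missed at postal-code level
print(flood_zone.contains(rooftop))          # True  -> the property itself is exposed
```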

Using a location intelligence platform like DMTI’s Location Hub, insurers can better identify and rate risks for properties of interest. Actionable information ensures underwriters make data-backed decisions based on the specific locations being reviewed. You get real-time, high precision geocoding delivered with trusted industry standards, and customers get pricing they can trust.

2) True Risk Concentration

You cannot gain a true understanding of a property’s overall risk if you examine individual risks in silos. Location intelligence offers the ability to review the total concentration of risk, such as flood, fire and other natural hazards, assess the likelihood of any of these occurring, and see how close the property is to emergency services.
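
As a rough illustration, the sketch below (hypothetical hazard polygons, fire-station points and distance threshold, not DMTI’s scoring model) rolls several hazard checks and proximity to emergency services into a single per-property summary:

```python
from shapely.geometry import Point, Polygon

def risk_summary(property_pt, hazard_zones, fire_stations, max_km=5.0):
    """List the hazards whose extent contains the property and flag whether a
    fire station lies within max_km (rough degrees-to-km conversion)."""
    hazards = [name for name, zone in hazard_zones.items()
               if zone.contains(property_pt)]
    near_station = any(property_pt.distance(st) * 111 <= max_km
                       for st in fire_stations)
    return {"hazards": hazards, "fire_station_within_5km": near_station}

# Hypothetical hazard extents and a fire station
zones = {"flood_100yr": Polygon([(-114.10, 51.03), (-114.05, 51.03),
                                 (-114.05, 51.06), (-114.10, 51.06)]),
         "wildfire":    Polygon([(-114.20, 51.00), (-114.15, 51.00),
                                 (-114.15, 51.02), (-114.20, 51.02)])}
stations = [Point(-114.07, 51.05)]

print(risk_summary(Point(-114.08, 51.04), zones, stations))
# {'hazards': ['flood_100yr'], 'fire_station_within_5km': True}
```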

Here’s how this improves your customer’s experience. Aside from providing customers with fair, accurate pricing, you also have a comprehensive understanding of their exact insurance needs, including needs they may not realize they have. For example, a customer coming to you for property insurance may not realize they also require coverage for wildfire or flooding. Using location intelligence, you’ve got the data to show customers what they need, making them feel confident in your services and completely protected by the coverage you offer.

3) Reduce Processing Time

Average processing time for quotes and claims is an essential data point every insurance provider must be aware of. Most people are accustomed to immediate gratification, and staying competitive in the insurance industry requires providers to deliver information quickly.

Leveraging the right digital tools is the key to tracking and reducing processing time. Location intelligence easily feeds into platforms that automate the underwriting and claims process according to your company’s rules and guidelines.

4) Customer Information

Location intelligence platforms enable customer data to be seamlessly integrated, allowing you to autofill information and limit the number of manual inputs required when generating quotes. Automating the entry of personal information reduces costs for you while providing customers with a better digital experience. Location intelligence also allows you to review customer claims patterns for real-time, data-based decision making throughout the underwriting process.

5) Complete Customer Portfolios

Instant access to customers’ portfolios provides an overview of data revealing areas where your insurance company could offer additional services. For example, if you’re using a location intelligence platform, you’ll be able to see all properties owned by a customer that may be covered by other insurance providers. With complete property information paired with hazard data, you’ll also be able to see if your customer has adequate coverage to fully protect their assets. This data allows you to provide a great experience for customers because you enter conversations prepared with complete information and options to provide better rates and coverage than your competition.

Click here to learn more.


Flood Data and Location Intelligence

Leveraging Flood Analysis to Mitigate Risk

Condo popularity is on the rise, and for those financing the properties, whether a reputable financial institution or the bank of Mom and Dad, is your investment protected? What happens in the event of a flood, and are you asking all of the right questions before investing? For example, are you investing in properties that are at high risk of flooding? With help from DMTI, we’re able to take a closer look at a few regions to understand your exposure.

What risk does flood pose in Canada?

Floods are the most frequently occurring natural hazard in Canada. According to the Institute for Catastrophic Loss Reduction (ICLR), the Canadian Disaster Database indicates that 241 flood disasters occurred in Canada between 1900 and 2005, almost five times as many as the next most common disaster (wildfire). Over the past few decades, urban flooding has been a growing problem, resulting in more than $20 billion in flood damage between 2003 and 2012, according to the federal government.

What is the risk of flood peril to condos in Canada?

To answer this question, Teranet and DMTI Spatial created a condominium database for Canada and combined it with flood hazard maps highlighting areas that could be impacted by river flooding (where a river rises over its banks), surface water flooding (where water pools due to elevation differences) and storm surge (coastal flooding). The analysis focused on three key markets: Toronto, Vancouver and Montreal.
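
The percentages below come from exactly this kind of overlay. As a rough sketch of the approach (hypothetical file names; the Teranet/DMTI condominium and hazard datasets are proprietary), a spatial join between condo points and hazard polygons yields the share of buildings exposed:

```python
# Sketch only: requires geopandas >= 0.10 for the `predicate` keyword.
import geopandas as gpd

condos = gpd.read_file("condo_points_toronto.geojson")     # one point per condo building
river_flood = gpd.read_file("river_flood_100yr.geojson")   # 1/100-year hazard polygons

# Keep condos that fall inside any flood polygon
exposed = gpd.sjoin(condos, river_flood, how="inner", predicate="within")

# A condo touching several polygons should only be counted once
pct_exposed = 100.0 * exposed.index.nunique() / len(condos)
print(f"{pct_exposed:.1f}% of condo buildings fall within the river flood hazard area")
```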

Toronto, ON

The flood risk analysis (using 1/100 year return period) for Toronto revealed that approx. 1.2% of all condo buildings may be impacted by river flood risk and approx. 5.9% of all condo buildings may be impacted by surface water flood risk.


Figure 1: Toronto, Ontario – Condos falling within the river flood hazard map for the 1/100 year return period.

Vancouver, BC

The flood risk analysis (using the 1/100 year return period) in Vancouver revealed that approx. 7.3% of all condo buildings may be impacted by surface water flood risk and approx. 3.2% of all condo buildings may be impacted by storm surge flood risk.

Figure 2: Vancouver, British Columbia – Condos falling within the surface water hazard map for the 1/100 year return period

Montreal, QC

The flood analysis (using the 1/100 year return period) in Montreal revealed that approx. 15.0% of all condo buildings may be impacted by river flood risk and approx. 11.4% of all condo buildings may be impacted by surface water flood risk.

Figure 3: Montreal, Quebec – Condos falling within the surface water hazard map for the 1/100 year return period

What does this mean for my business?

According to the Insurance Bureau of Canada (IBC), 20% of Canadian households could be qualified as high risk for flood perils, and about 10% of those would be considered very high risk, which equates to about 1.8 million households. Understanding the impact of natural disasters such as catastrophic flooding is a complex issue, and many customers are challenged with identifying and mitigating the total risk and exposure within their existing portfolio. Here are some additional areas for consideration that would benefit from this type of analysis:

  • Risk Mitigation: Enhance real-time mortgage adjudication processes, speed time to decision and reduce manual intervention with enhanced insight into the precise location of the property as it relates to a flood zone.
  • Risk Analysis: Validate capital adequacy requirements and better understand and reduce exposure by being able to assess the total accumulated risk to a portfolio as it relates to proximity within flood plains.
  • Site Planning: Enhance infrastructure and site planning analysis by understanding the potential risk of flood before deployment.

The analysis was conducted by DMTI Spatial using its Location Hub platform, which supports real-time flood risk analysis, portfolio accumulation risk analysis and real-time visualization of potential exposure to flood zones. This provides the key data needed to better forecast exposure and mitigate risk.

Contact us to learn more

Underwriters DMTI Spatial CanMap Canadian GIS Data

Top 3 Ways Location Intelligence Empowers Underwriters

Today, modern location intelligence solutions are transforming the underwriting process. They give front-line underwriters the ability to quickly and intuitively understand the exposures associated with one address – or an entire portfolio. Geospatial analysis has evolved from a back-office, ‘after-the-fact’ function to a leading role in real-time underwriting decisions. Below are the top 3 ways location intelligence empowers underwriting.

1) Individual Risk Assessment and Pricing 

True location intelligence solutions allow underwriters to more accurately assess risk at the individual property level, resulting in higher quality underwriting, more profitable business and cost savings through reduced claims.

Proper risk assessment starts at the point of sale. In the case of personal and commercial properties, that involves the validation and cleansing of addresses. Location intelligence solutions allow insurers to quickly validate the accuracy of new addresses or addresses currently on file for existing policyholders.

2) Risk Accumulation and Portfolio Management 

Accumulation or concentration of risk is an ongoing concern for insurance companies. An accumulation of risk occurs when a portfolio of business contains a concentration of risks that might give rise to exceptionally large losses from a single event. Such an accumulation might occur by location (property insurance) or occupation (employers’ liability insurance), for example.

Insurance underwriters require real-time visibility into their policy accumulations, perils risk data and claims history across their book of business. With an aggregated view of risk (perils and accumulations), underwriters can better manage their overall exposure across their entire portfolio.

Location intelligence solutions offer an accumulation and perils management tool to analyze various risk levels against individual addresses, streets and postal codes to produce hazard ratings. This means underwriters can make decisions using up-to-date data, such as flood or earthquake risk, contaminated land or proximity to potential risk sites, such as gas or propane storage facilities.
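
On the accumulation side, a simple roll-up like the sketch below (hypothetical policy records and field names, not Location Hub output) is enough to surface where insured value is concentrating:

```python
from collections import defaultdict

# Hypothetical policies already tagged with the hazard zone of their address
policies = [
    {"address": "123 Main St",  "zone": "flood_100yr", "tiv": 450_000},
    {"address": "88 River Rd",  "zone": "flood_100yr", "tiv": 610_000},
    {"address": "9 Hilltop Cr", "zone": "none",        "tiv": 380_000},
]

accumulation = defaultdict(float)
for p in policies:
    accumulation[p["zone"]] += p["tiv"]   # total insured value per zone

for zone, tiv in sorted(accumulation.items(), key=lambda kv: -kv[1]):
    print(f"{zone}: ${tiv:,.0f} total insured value")
```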

3)  Segmentation 

A powerful result of location intelligence technology for underwriters is segmentation. Instead of looking at risks on a “blanket” basis, they have new tools to slice individual exposures by specific rooftop locations. In the example of a flood-prone region, insurers can identify which specific properties are at risk, instead of relying on broad postal code or FSA boundaries.

By assessing risk at the property level, insurers have the ability to underwrite business they may have previously declined based on an inaccurate assessment of the risk location. With a better understanding of geographical risk (perils and accumulations), insurers can be more aggressive with rates in low-risk areas. Underwriters can identify under-exposed areas and target them with marketing efforts and competitive premiums. Using a location intelligence capability, the insurer can hone in on a given region’s hot and cold spots.

Conclusion 

Location-based technology has become an invaluable strategic tool to many leading insurers. Precise mapping and geocoding allow underwriters to quickly access information, recognize patterns and drill down into data for detailed analysis and sound decision-making. The real danger for those carriers not adopting location intelligence is adverse selection – the writing of poor risks without accurate or detailed information.

Understanding where a property is in relation to risk elements is a key factor affecting an insurance carrier’s profitability. Using location intelligence is a simple solution to a complex, real-world problem for the insurance industry. Insurers depend on geographic and demographic information to assess underwriting risk, develop appropriate pricing models, match coverage, expand markets, serve existing customers and develop new or niche business.

To learn more, download our Property & Casualty Insurance: Getting Risk Right white paper.

From Data Analytics to the Cloud

Top Finance Execs Discuss Industry Hot Topics

In an increasingly digitized world, data carries undeniable clout. At once a source of insight into optimizing business operations, data analytics also supports the development of products and services that were previously unimagined as new sources of social and sensor information – and the technology to manage them – come online.

But if the vision for data’s potential is coming into sharper focus, for many organizations the actual integration of new volumes and varieties of data to create business value is less clear. In this situation, sharing experiences of data successes and ongoing challenges can serve as a useful tool to galvanize discussion and, ultimately, more successful deployment of data and analytics solutions. Within a specific industry, this sharing takes on added import as common language and circumstance create quick sympathy – a phenomenon demonstrated in the June installment of DMTI Spatial’s Strategic Insights Sessions, at which key players in the financial sector considered top-of-mind issues in the use of analytics in banking and insurance.

Data Analytics in Finance

To kick off this month’s pre-game session at the Rogers Centre in Toronto, DMTI invited four panelists to outline their involvement with different analytics projects. These presentations provided a springboard for session attendees to discuss their own knowledge and experience of data practices. Like the topics covered in the panel presentations, conversation in post-panel breakout sessions was animated and broadly based. Several key themes arose in the panel, breakouts and Q&A segments, however, that reveal much about the current state of data analytics adoption.

Finance Industry Discusses Data Analytics

Analytics-based customer experience vs. privacy

In her presentation, Susan Doniz, global chief information officer for Aimia, a provider of marketing and loyalty analytics solutions, led with the notion of ‘customer experience’, a buzz topic in the industry that is an increasing preoccupation of customer-facing organizations on the cutting edge of service delivery.

According to Doniz, loyalty programs based on sophisticated analytics can help businesses understand the consumer, but the key is to use data to do things for people rather than for the business alone. She provided a couple of examples – UK-based grocer Sainsbury, which analyzed what people were buying to discover who might be prone to heart conditions and hence improve targeting for medication and wellness marketing campaigns, and Sephora, which developed “Inform” analytics to remind customers what they had bought in the past to ease the shopping experience – in order to introduce issues around this practice of ‘personalization’.

While personal information can inform service delivery, Doniz argued that consumers today also want transparency around how their data is being used, the ability to turn the data flow off, as well as information on who has accessed their personal data. Largely a government responsibility, the privacy of personal data is also a challenge for CIOs, she argued, who typically do not have a lot of experience in this area. What kinds of limits should be placed on information sharing, what is necessary to collect and what institutions do consumers trust with information like a “digital ID” are all questions that still need to be resolved.

Cloud a panacea for Big Data requirements?

Atif Ansari, CFO of Kognitive Marketing, made a strong case for creating customer experience through the use of cloud computing, illustrating the benefits of this data delivery approach by describing his work with Bank of America. Ansari asked: “performance measurement takes a long time…. how do you simplify it, [and move data from back office systems] so that it serves the front office who can use it to better serve customers”?

Since bank transactions are typically managed on a quarterly basis, Ansari explained, it has traditionally been necessary to build huge computing systems that were not attuned to this kind of schedule but could provide capacity when demand was at its peak. While this practice is common, it is extremely costly. As a result, the bank moved to the cloud, dramatically reducing the costs associated with data housing and management (savings in the millions, Ansari added), while also delivering instant access to data for field workers. A financial investment advisor, for example, is now able to access cloud data in the field to show a customer everything about their individual portfolio in real time.

Ansari acknowledged the security and data residency concerns around cloud technology that linger in this sector, but argued that now that service providers can deliver virtual private clouds located in Canada, there is increasing adoption of cloud within the financial services industry. Ansari’s perspective on cloud was not universal, however, and other session attendees voiced more reservations. For example, Curtis Gergley-Garner, chief risk officer at Canada Guaranty Mortgage Insurance Co., noted that cloud tends to be problematic at his organization since it works with bank data, and the banks in turn require that Canada Guaranty systems be as robust as their own: banks need to become comfortable with cloud data first, before insurers take this step, he insisted.

Similarly, Dion Yungblut, VP at Capital One, observed that while people are moving to cloud, the banks and regulators are not really there; a leader’s job, he said, involves understanding how to stay on top of that – how to take advantage of the “exponential growth of access to data and computing” and to create “nimble IT infrastructure” if cloud is not ubiquitous. To clarify, Brad Struthers, director of collateral management and strategic alliances at RBC, pointed to the importance in this debate of separating out cloud computing and cloud storage, which may have different security management requirements. A lack of understanding of issues like this, he added, may be preventing banking organizations from tapping into new technologies.

Tech for operational efficiencies

David Bradshaw’s data story was one of growth management. As the VP of client business support at Tangerine Bank explained, the Tangerine group predicted that its transactional volume was going to grow by 48 percent with the ramp-up of operations, and built a “scalability model” based on Excel spreadsheets to figure out how to prevent operating costs from growing at the same rate. This model helped Tangerine identify the areas that were going to expand most quickly and would benefit most from process optimization – areas such as deposits, where manual handling of fuzzy on-screen cheque photos proved a real bottleneck, as did the mailroom, mortgages and payments.

Through this analytics exercise, the organization was also able to estimate the potential savings that could be gained via the replacement of manual processes with technology, and to plug tools in to achieve significant improvements in focus areas like the mailroom. By streamlining processes and identifying the proper software tools to help increase operational efficiency and speed time to market, Tangerine managed to increase volumes by 45 percent with no increase in operating costs.

Fill in the right data blanks

Curtis Gergley-Garner from Canada Guaranty launched his presentation with the observation that, as a relatively new company in the mortgage insurance space with two very large competitors that have a lot more data, the firm has to be very efficient with the data it does have.

From a business perspective, the goal of Canada Guaranty’s data strategy was to improve the efficiency, consistency and speed with which mortgage applications are approved, decrease the cancellation rate (as frustrated customers look elsewhere), improve customer service and, ultimately, improve the overall quality of Canada Guaranty’s credit. To achieve this, the firm mounted two specific data projects. One involved the creation of a proprietary scorecard, based on trended credit bureau information over five quarters, which enhanced the company’s ability to predict forward default. On the property side, Canada Guaranty also worked with DMTI Spatial to implement the Location Hub platform, which addresses problems with ambiguous addressing through the Unique Address Identifier (UAID) standard and by analyzing the address quality in a customer data set based on automated error checking.

As Gergley-Garner explained, when individuals input address data, it’s not uncommon for errors to occur; by the same token, it is difficult to achieve an “automated value model” when address data is wrong and easy to “miss a lot of hits” in address search.

The importance of data cleansing, data integrity and completeness of the data set – in location information and other areas – to the implementation of solutions that can support business objectives was echoed by other session participants. Noting regulators’ move from structured data models to real time analytics models, Parag Gupta of the Northbridge Financial Corporation asked “How can you clean up their data?” and “what kind of problem does this cause for the insurance business?” Pointing to the fact that it’s typical for an organization to have lots of data in one area, and shortages in others, Brad Struthers from RBC asked “how is it possible to fill in the gaps?”

If no definitive answers to these data quality issues or to questions that linger around privacy and the use of cloud emerged at the session, by drawing together individuals with similar industry experience who ask the right questions, DMTI hosts are helping to shape an ongoing and healthy dialogue on best practice in the use of financial data and analytics.

Contact us to learn more.

Calgary Flood

Calgary Flood – 2 years later. Where are we now?

It has been 2 years since the 2013 Calgary floods that occurred in Southern and Central Alberta.  What’s changed?

Overland Flood Insurance Availability

In 2014, Canada’s Economic Action Plan noted that “Canada is the only G-8 country without residential flood insurance coverage, leaving many Canadian homeowners with inadequate protection against losses from overland flood events.”

In 2015, Canadian insurance providers began offering overland water protection for residential property owners across Canada.

Aviva Canada was the first to introduce this change to the market, followed shortly after by The Co-operators, with Alberta being the immediate focus and other provinces to be rolled out over time.

 A better understanding of flood risk

According to the Canadian Underwriter:

Flooding is the most common type of natural disaster in Canada and the flood in southern Alberta in 2013 was the most costly storm in Canadian history. “In general,” overland flooding is not currently covered on home insurance policies, the Insurance Bureau of Canada said recently on its website.

A number of vendors have begun to offer flood hazard maps that help insurance companies determine overland flood risk.

The first vendor to market was JBA Risk Management and in May 2015 Aon Benfield began offering this type of data to Canadian insurance companies.

A better understanding of portfolio risk

As their use of flood hazard maps increases, insurers will also seek detailed property location information to ensure that they understand where current and new customers are located in relation to these hazard boundaries.

Insurance companies typically use three different levels of geography, from the first three digits of the postal code down to the individual address, when analyzing flood risk:

  • Forward Sortation Areas (FSA): 1.6K unique records in Canada (2015)
  • Postal Codes (FSA LDU): 857K unique records in Canada (2015)
  • Addresses: 15M unique records in Canada (2015)

Address-level accuracy should be considered when mapping (geocoding) your portfolio against flood hazard boundaries, rather than relying on postal codes, to better understand risk. Using postal codes without understanding how many individual properties are associated with them in relation to an event boundary may lead to the stigmatization of an entire postal code even though only a few addresses may be impacted.
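
A minimal sketch of that point (hypothetical coordinates; a real workflow would geocode every address and load the published flood extent) counts how many of a postal code’s addresses actually fall inside the boundary before the whole code is flagged:

```python
from shapely.geometry import Point, Polygon

# Hypothetical flood boundary and a few geocoded addresses sharing one postal code
flood_boundary = Polygon([(-114.24, 50.68), (-114.22, 50.68),
                          (-114.22, 50.70), (-114.24, 50.70)])
addresses = [Point(-114.23, 50.69), Point(-114.26, 50.69), Point(-114.21, 50.69)]

impacted = sum(flood_boundary.contains(pt) for pt in addresses)
print(f"{impacted} of {len(addresses)} addresses in this postal code are inside the flood boundary")
# Flagging the entire postal code would stigmatize the unaffected addresses.
```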

Below is an example for the municipality of Black Diamond, Alberta which has one postal code (T0L0H0) and over 1,000 addresses associated with it. The blue boundary represents the flood boundary from the Alberta 2013 floods.


The map below depicts the same area where the blue boundary represents the flood boundary from the Alberta 2013 floods:


Data maintenance is essential to ensuring high-precision accuracy.

New addresses added in Alberta municipalities since 2014:

  • Calgary: 31,897
  • High River: 281

Address-level precision should be utilized when comparing flood information to policies and when performing other forms of analysis, such as concentration analysis and proximity to other perils (e.g., underground tanks).


To learn more about how your book of business may be impacted by overland flood in Canada, please contact us.


Flood risk

Are Insurance Companies Measuring Flood Risk Accurately?

On February 19, 2015, Aviva Canada announced the availability of an overland water endorsement. This meant homeowner coverage in Ontario and Alberta would start in May and roll out to additional provinces throughout 2015.

In Canada, we now have national flood hazard maps and the ability to easily map each of our policies.

“What gets measured, gets managed” – Peter Drucker

Data Visualization Tools for Flood Risk

Patrick Lundy, CEO of Zurich Canada, said:

“Having the right tools, maps and predictive models is key to charging an accurate price for the risk, and capacity in certain areas may become harder to come by. Updated flood zone maps for Canada are of the utmost importance in being able to respond accurately to the increased flooding activity.” (Canadian Underwriter, 2013)

Today, insurance organizations have the ability to:

  • Identify and assess significant exposures in their portfolio
  • Identify new business without growing their 1/n year flood loss
  • Determine where they should not write new business
  • Identify flood risk which may require a more detailed assessment

“Opportunities multiply as they are seized” – Sun Tzu

The Insurance Bureau of Canada (IBC) has identified that overland flooding is a risk for a small percentage of the population: those who live in floodplains or flood-prone areas close to rivers or lakes.

Leveraging this knowledge may lead to the creation of a new niche product offering for overland flood.

“Once we know something, we find it hard to imagine what it was like not to know it” – Chip & Dan Heath, Authors of Made to Stick, Switch

The Real Flood Risk

Van Bakel of Crawford recalled discussions in which insurance companies said they shouldn’t worry about catastrophic events because everything was accounted for internally.

Fast forward about six weeks: two of the most populated areas of Canada would never flood within two weeks of each other, would they?

Overland flood hazard maps and precise mapping (or geocoding) technology allow insurance companies to:

  • Understand the risk to your book of business
  • Identify which markets may have flood risk
  • Create new pricing models based on this risk
  • Generate new product revenue for the business

Click here to learn more about how your book of business may be impacted by overland flood in Canada, or contact us at info@dmtispatial.com.


Risk management for earthquakes

The Importance of Managing Earthquake Risk

Do your risk management processes consider the risk of earthquakes?

October 16th marks the 7th annual ShakeOut, when over 24 million participants worldwide will practice how to drop, cover and hold on at 10:16 a.m. during Great ShakeOut Earthquake Drills.

“ShakeOut BC Day” started in 2011, and this year over 660,000 people in British Columbia will participate in drills.  The Charlevoix region in Quebec started participating in 2013, and the entire province has joined in for 2014, with over 80,000 participants registered.

Canadian Regions at Risk for Earthquake

When thinking about earthquake risk in Canada, most people would assume that British Columbia is the region most at risk.  However, parts of Quebec and Eastern Ontario are also at risk for earthquakes.  At a recent earthquake response seminar held in Toronto by the Catastrophe Response Unit (CRU), Dr. Kristy Tiampo, professor of geophysical modeling methods at Western University’s department of earth sciences in London, Ont., who also works with the Institute for Catastrophic Loss Reduction (ICLR), stated that “Montreal and Ottawa are both at significant risk of ground shaking” and noted that both cities have seen earthquakes measuring around 6 on the Richter scale.

In an October 2013 report commissioned by the Insurance Bureau of Canada, titled “Study of Impact and the Insurance and Economic Cost of a Major Earthquake in British Columbia and Ontario/Québec,” two hypothetical earthquakes were modeled by AIR Worldwide: one off the west coast of British Columbia measuring 9.0 on the Richter scale and one northeast of Quebec City measuring 7.1.  These two hypothetical scenarios would result in combined estimated total insured losses of over $30 billion.

It is imperative for insurance companies to have a complete and accurate picture of the location of the property they are insuring in the context of the risks that surround that property.  This allows them to rate the policy correctly and to determine whether or not they want to assume the risk.  Understanding where the property is in relation to an earthquake zone is very important, but it is equally important to know where that property sits in relation to other items that could be impacted by an earthquake.  For example, what if the property is close to a natural gas pipeline or a propane processing facility?  Knowing about these potential risks in isolation is important to the underwriting and rating decision, but what about when you also factor in earthquake?  An earthquake of small magnitude may not be enough to cause much damage to the property itself, but what if it is enough to cause a gas leak that then leads to a fire and an explosion?  Having this level of information could make a big difference.

Disaster Risk Management for Insurance Companies

Another factor for insurance companies to consider is the accumulation of risk.  While the risk for a single property may be acceptable, knowing where all your existing policyholders are at the time you underwrite a new policy, and how they relate to risks such as earthquake zones, is critical in determining whether you are willing to assume the additional risk or whether your exposure is already too high.  If there were two major events in a given year, would your exposure be so high that you could not pay out all the claims?

In Canada, various location boundaries, such as postal codes, municipalities and Catastrophe Risk Evaluating and Standardizing Target Accumulations (CRESTA) zones (for earthquakes), are used to determine the accumulation of risk.

As per the ICLR, Canadian reinsurers, insurers and regulators use Catastrophe Risk Evaluating and Standardizing Target Accumulations (CRESTA) zones as the minimum standard for the capture of data and first level of calculation of probable maximum loss (PML).  PML evaluations can influence underwriting decisions, and the amount of reinsurance allowed on a risk can be predicated on the PML valuation.
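
As a rough, first-level illustration (made-up policies and damage ratios, not an actuarial model), accumulating sums insured by CRESTA zone and applying a per-zone damage factor gives a starting point for a PML estimate:

```python
# Hypothetical policies and per-zone damage ratios for a first-level PML sketch
policies = [
    {"cresta_zone": "BC-1", "sum_insured": 750_000},
    {"cresta_zone": "BC-1", "sum_insured": 500_000},
    {"cresta_zone": "QC-3", "sum_insured": 400_000},
]
damage_ratio = {"BC-1": 0.15, "QC-3": 0.08}

accumulation = {}
for p in policies:
    accumulation[p["cresta_zone"]] = accumulation.get(p["cresta_zone"], 0) + p["sum_insured"]

for zone, total in accumulation.items():
    pml = total * damage_ratio[zone]
    print(f"{zone}: accumulation ${total:,.0f}, first-level PML ${pml:,.0f}")
```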

The original CRESTA zones were established in 1981 and introduced in Canada in 1986.  They have been recently re-worked globally and have been re-launched to the market for 2012/2013.

All businesses can use this information to help define their contingency plans in the event of an earthquake.  Which of my existing store or branch locations might be impacted?  Where are my employees situated?  How would I deploy resources to help my customers most efficiently, and where would I situate them?  Insurance underwriting and exposure analysis is only one area where this information can be used.  Other examples include:

  • Public Safety departments within governments can use this to build contingency plans for their citizens, determine where they would locate remote relief sites, sites for temporary housing or medical facilities.
  • Telecommunication companies could use this to gain a better understanding of the risks associated with building out infrastructure in various parts of the country

Click here to see how DMTI’s disaster risk management tools help insurance companies effectively plan for every possibility.

Canadian Flood Data

Are ‘inadequate’ flood-hazard maps impacting your business?

The flood-hazard maps currently available for Calgary are inadequate, according to a recent Calgary Herald article.  Hundreds of properties outside of the designated 1/100 year flood plain were impacted during the Alberta flood event of 2013.  Canada experienced the costliest and third-costliest disasters in Canadian history within two weeks of each other in 2013.  This past summer has also seen its share of flooding. The Insurance Bureau of Canada (IBC) estimated insured damage from summer flooding in the Prairies at $60 million.

The Institute for Catastrophic Loss Reduction (ICLR) advocates for the implementation of the recommendations made in the Alberta provincial report “Provincial Flood Mitigation Report: Consultation and Recommendation.”  Flood risk maps are needed to identify urban flood risk areas.

Effective Disaster Risk Management for Insurance Companies

Disaster risk management is crucial. In addition to the report’s recommendations, the following should be considered for flood-hazard maps (see the sketch after this list):

  • Use multiple flood-hazard return periods (1/20, 1/100, 1/200 and 1/1,500 year) for analysis
  • Ensure flood-hazard data is regularly assessed and maintained annually
  • Integrate flood-hazard data and location services into your on-line underwriting applications to get real-time access to proximity to flood and water features
  • Access address level services that provide risk related information
  • Assess your accumulated risk based on location of properties within the 1/20, 1/100, 1/200 and 1/1,500 year flood zones.
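
Here is a sketch of the multi-return-period check suggested in the first point above (hypothetical hazard extents; real maps would come from a flood data provider):

```python
from shapely.geometry import Point, Polygon

# Hypothetical hazard extents for three return periods (larger extent = rarer event)
return_periods = {
    "1/20":  Polygon([(-114.09, 51.04), (-114.07, 51.04),
                      (-114.07, 51.05), (-114.09, 51.05)]),
    "1/100": Polygon([(-114.10, 51.03), (-114.06, 51.03),
                      (-114.06, 51.06), (-114.10, 51.06)]),
    "1/200": Polygon([(-114.11, 51.02), (-114.05, 51.02),
                      (-114.05, 51.07), (-114.11, 51.07)]),
}

def flood_flags(pt):
    """Return the return periods whose hazard extent contains the property."""
    return [rp for rp, zone in return_periods.items() if zone.contains(pt)]

print(flood_flags(Point(-114.08, 51.055)))   # ['1/100', '1/200']
```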

What about flood maps for Canada?

A national file of flood-hazard maps for Canada will allow you to understand the risk associated with one address or your entire portfolio.

For insurers, this data allows you to get the answers to imperative questions such as:

  • What are the exposures across your portfolio?
  • Where can you write new business without growing your 1/n year flood loss?
  • How do you accurately rate policies based on the potential risk of flood?
  • Where should you not write new business?
  • Which areas have a potential flood risk that may require further assessment?

Learn more about how to take advantage of vital flood information that can be used in real-time transactional risk analysis. Click here to learn more about how DMTI provides Insurance Providers with accurate information.

Developing Location Intelligence

Transforming Location Intelligence into Profit

Over the next few weeks, this blog series will provide an overview of some of the basic uses of location intelligence (LI) at an enterprise level, its capacity to optimize business processes, and the hierarchy of benefits that positively impact profitability and competitiveness.

Here’s what readers can expect to learn:

Location Intelligence: Definition and Context

Some progressive organizations are starting to recognize the value of location as an organizing principle. They see how it is embedded in corporate information, and can be applied to current business problems. Through the use of location intelligence technology, these organizations are finding ways to leverage a latent asset.

As a result, telecommunications companies are improving the serviceability of products across their customer base to increase profitability, insurance companies are better understanding risk and pricing to contain costs, utilities are more accurately meeting compliance requirements, and civil authorities are improving threat detection and emergency management capabilities. These are just some of the applications at an enterprise level.

The strategic use of location intelligence is being propelled by several key business drivers, including the need to increase revenue while simultaneously containing or reducing costs. These strategic imperatives form two sides of the profitability equation, which location intelligence is well suited to solve.

Location intelligence describes the capacity of an entity or organization to use the principles of location to organize, reason, plan and problem-solve. It is not defined by the mere presence of location-enabled technology, but rather by the degree to which information is enriched by the perspective of location and successfully integrated into a process of decision making.

Specifically, location intelligence is the capability to organize and understand complex phenomena through the use of geographic relationships inherent in all information. Applied in a business context, the outcomes are meaningful, actionable and can provide a sustainable competitive advantage. Building location intelligence successfully requires business specific domain knowledge, formal frameworks, and a relentless focus on desired business outcomes. It’s about transforming business processes and creating opportunities.

Dimensions of Location Intelligence

Location intelligence applications are generally industry specific. However, within that framework, uses can be sorted into three sub-categories:

Enterprise decision support: enterprise applications, often vertically focused, that illuminate optimal business strategy. For example, a telecom company consolidating newly acquired customers can identify common customers and determine how to offer services to achieve the greatest value. An insurance company can link geography dependent risk elements such as proximity to a flood zone or density of coverage in specific neighborhoods to better contain costs, and mitigate or more accurately price for risk.

Customer service: applications that facilitate customer service and self-service to improve the overall customer experience. For example, a government agency can more efficiently measure service levels or plan for the distribution of services that are in many cases dependent on variables that change over space, such as household income or number of children. Governments may also be able to better protect constituents by applying location intelligence to existing workflows so as to enhance fraud detection or threat detection capabilities.

Consumer applications: enterprise applications that build loyalty among customers and influence purchasing behaviors. For example, retailers can execute store-specific promotions with more accuracy, and profile and target their markets, resulting in the identification of higher value customers. Or retailers may use location intelligence to augment loyalty program services via internet channels, as in neighbourhood smart store offerings.

Table 1.0 – Use scenarios for location intelligence by industry

Communications & Media

Marketing
  • micro-marketing
  • assessing penetration levels
  • identifying competitive threats

Customer Service
  • pre-sales qualification
  • dispatch efficiencies
  • multi-product eligibility

Operations
  • customer serviceability
  • cost avoidance
  • network planning

Insurance and Finance

Portfolio Analysis
  • predictive analytics
  • pricing and loss reserving
  • assessing policy saturation

Marketing Services
  • property-level campaigns
  • neighbourhood context
  • repeat marketing tracking

Sustainable Compliance
  • risk assessment
  • improved monitoring
  • compliance auditing & reporting

Government Services

Address Management
  • address validation
  • data cleansing
  • data maintenance

Information Integration
  • improved data integrity
  • a “one client view”
  • reducing cascading error

Entity Authentication
  • fraud detection
  • risk profiling and scoring
  • advanced analytics

Up Next: Challenges, Drivers and the Need for Location Intelligence