
Are Six 100 Year Storms Across the GTA Rare Over a 14 Year Period When Considering Probabilities of Observing Extremes at over 150 Rain Gauges?

Roll a 100-sided die once. That is what looking for a 100 Year storm at a single rain gauge in a single year is like.
A motion at the City of Toronto notes the following regarding extreme rainfall in the GTA: "According to the Insurance Bureau of Canada, the Greater Toronto Area has had six “100 Year Storms” since 2005". See Mike Layton motion here: https://www.toronto.ca/legdocs/mmis/2019/mm/bgrd/backgroundfile-131063.pdf

CBC has reported on this: link

While we are all concerned about flooding, the question on large storm frequency is "So what?". More specifically, from a statistical, mathematical, and logical point of view, is more than five 100 Year storms over a 14 year period (2005 to 2018) rare and unexpected, or does this have a high probability of occurring? As we know, the Insurance Bureau of Canada does not always rely on proper statistics to support statements on extreme weather, and has confused theoretical shifts in probabilities of extreme events with real data (see IBC's Telling the Weather Story, where IBC ignores Environment and Climate Change Canada's Engineering Climate Datasets).

Let's do some math to see if over five 100 Year storms is rare or not.

First, consider that a 100 Year storm has a probability of occurrence of 1/100 = 1 percent in any given year.

***

Second, count up the rain gauges that have proliferated across the GTA to support inflow and infiltration studies for wastewater systems and to support operational needs. Here are some counts from various sources:

i) City of Toronto (https://www.toronto.ca/city-government/data-research-maps/open-data/open-data-catalogue/water/#09dee024-b840-174f-7270-29c1a1773d14) - 46 rain gauges

ii) Region of York (https://www.york.ca/wps/wcm/connect/yorkpublic/b22ae2f3-5140-48f2-869e-a803d2552893/2017+Inflow+and+Infiltration+Reduction+Strategy+Annual+Report.pdf?MOD=AJPERES) - 71 rain gauges

iii) Peel Region (https://www.peelregion.ca/council/agendas/pdf/ipac-20110811/4b.pdf) - 6 rain gauges (correction July 25, 2019 - Peel has 28 rain gauges ... probabilities in this blog post will go up a bit)

iv) Halton Region (https://www.peelregion.ca/budget/2018/pdf/conservation-halton.pdf) - 14 rain gauges

v) Toronto and Region Conservation Authority (http://199.103.56.152/xcreports/Precipitation/precipitationOverview.aspx) - 14 rain gauges

Total number of gauges = 151. A good first estimate - certainly there are more. (correction July 25, 2019 - as Peel has 28 rain gauges the total is 173 stations)

***

Third, assuming each rain gauge observes rainfall events independently year to year, what is the chance of getting at least one 100 Year event at a single gauge in 14 years?

Probability = 1 - (1 - 1/100)^14 = 13.1% chance of a 100 Year storm at a single gauge. That seems pretty big.

The number of 'trials' or samples is equivalent to 14 rolls of a 100-sided die, meaning 14 independent observations or 'samples' from the statistical population of events.
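To make this concrete, here is a minimal Python check of the figure above (a sketch for illustration only; the variable names are mine, not from any source):

# Chance of at least one 100 Year event at a single gauge over 14 years
p_annual = 1 / 100                       # 1% annual exceedance probability
years = 14
p_at_least_one = 1 - (1 - p_annual) ** years
print(f"{p_at_least_one:.1%}")           # prints 13.1%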

Is it reasonable to assume that a single rain gauge can record a 100 Year event but not the surrounding gauges? Yes indeed. The August 2018 storm in Toronto exceeded 100 Year rainfall totals at only one gauge. So this does occur for smaller, spatially isolated rainfall events.

***

Fourth, assuming all rain gauges observe rain independently, what is the chance of getting at least one 100 Year event across all 151 gauges in 14 years?

The number of trials/samples/observations = 151 x 14 = 2114

Probability = 1 - (1 - 1/100)^2114 = over 99.9% chance of at least one 100 Year storm across 151 independent gauges. That is almost a certainty.

(Additional comment: we know that storms exceeding 100 Year volumes can cover large areas, such that observations at adjacent gauges are not completely independent, especially if they are spatially very close - so this fourth scenario is considered an upper bound in a sensitivity analysis of gauge independence; below, another bound is evaluated assuming less independence.)

What about more than five 100 Year storms over 14 years? We then have to consider combinations of events (we do not care which of the 2114 samples has the events) and approach this by subtracting the probabilities of 0, 1, 2, 3, and 4 events from 1. This spreadsheet summarizes the approach (thanks so much FP!):

(spreadsheet image: binomial probability calculations, including cells F22 and L22 referenced below)
The probability of 5 or more 100 Year events is again over 99.9% (see cell F22), showing that when there are many, many trials, the probability of observing multiple rare events is very high.
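For readers who want to reproduce the cell F22 figure without the spreadsheet, here is a minimal binomial sketch in Python (the prob_at_least helper is my own naming, assuming a 1% chance per independent trial as above):

from math import comb

def prob_at_least(k, n, p):
    """P(X >= k) for a Binomial(n, p): 1 minus the sum of P(X = 0..k-1)."""
    return 1 - sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k))

# 151 independent gauges x 14 years = 2114 trials, each with a 1% chance
print(prob_at_least(5, 2114, 0.01))      # ~0.999993, i.e., over 99.9%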

***

Fifth, assume large storms cluster across several gauges, so that gauges do not operate independently of each other for extreme events - say they observe 100 Year storms in groups of 5. What is the chance of getting one 100 Year event across 151/5 = 30.2 rain gauge clusters in 14 years?

The number of trials/samples/observations = (151 x 14) / 5 = 2114 / 5 = 422.8

Probability = 1 - (1 - 1/100)^422.8 = over 98.5% chance of at least one 100 Year storm across 30 independent gauge clusters. Near certainty. Not rare at all!

Let's consider over five 100 Year storms again. A keen reader has shown that the probability is 41.6%, as shown in cell L22 in the spreadsheet image above. Again, a pretty high chance of getting 5 or more events when gauges do not observe extremes independently, but rather in clusters.

For more on this analysis: the probability of 5 or more occurrences in 423 observations is derived from the following component probabilities:
  • 4 occurrences in 423 observations (P = 0.195038119)
  • 3 occurrences in 423 observations (P = 0.183893083)
  • 2 occurrences in 423 observations (P = 0.1297298)
  • 1 occurrence in 423 observations (P = 0.060868484)
  • 0 occurrences in 423 observations (P = 0.014245815)
  • Sum = 0.583775302
So P[X ≥ 5; 423] = 1 - 0.583775302 = 0.416224698, or the 41.6% noted above. This is the common approach for deriving the probability of a scenario: subtracting the probability of the scenario not occurring from 1.0 (the probability of all outcomes). In this case, the sum of the probabilities of zero to 4 occurrences is the probability of the scenario of interest (5 or more occurrences) not occurring. If you are interested in testing other scenarios and assumptions for the size of rain gauge clusters, use this helpful web site (also used to check the calculations in the spreadsheet shared above): https://stattrek.com/online-calculator/binomial.aspx. Below are checks of the probability analysis:
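Those component probabilities can also be reproduced in a few lines of Python (again a sketch of the standard binomial formula, not the original spreadsheet):

from math import comb

# Term-by-term check of the component probabilities listed above
n, p = 423, 1 / 100
terms = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(5)]
for k, t in enumerate(terms):
    print(f"P(X = {k}) = {t:.9f}")          # matches the bulleted values
print(f"P(X >= 5) = {1 - sum(terms):.9f}")  # ~0.416224698, i.e., 41.6%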

Probability of 5 or more 100 Year Storms at Independent Rain Gauges (151 gauges x 14 years = 2114 'trials')
Probability of 5 or more 100 Year Storms at Clusters of Rain Gauges With Dependent Observations (30.2 gauge clusters x 14 years = 422.8, say 423, 'trials')

There are more rain gauges in Durham Region and at other Conservation Authorities in the GTA, which means there may be more than 30 clusters observing extreme weather, and an even higher probability of observing extreme events.

So about 423 rolls of a 100-sided die may result in more than five occurrences of a single number with a relatively high probability. If the clusters are bigger, the probability is somewhat less, but as we have seen, sometimes only one gauge 'sees' the 100 Year extreme rain. If gauges observe events in clusters of 10 - an extreme end of the range, since we have examples of storms affecting only one gauge (August 2018 in Toronto) - there is still a probability of over 5% for 5 or more events (see below):

Probability of 5 or more 100 Year Storms at Large Clusters of Rain Gauges With Dependent Observations (15.1 gauge clusters x 14 years = 211.4, say 211, 'trials')
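That over-5% figure can be confirmed with the same binomial approach (a sketch assuming 211 independent trials, as in the caption above):

from math import comb

# Clusters of 10: 15.1 clusters x 14 years, say 211 trials at 1% each
n, p = 211, 1 / 100
p_ge5 = 1 - sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(5))
print(f"{p_ge5:.1%}")                    # ~6.2%, i.e., over 5%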
Past flood events in Toronto reveal that between 1 and 12 rain gauges have observed 100 Year rainfall depths, as shown in this Toronto Water presentation: https://www.slideshare.net/glennmcgillivray/iclr-friday-forum-reducing-flood-risk-in-toronto-february-2016
It shows:

  • May 12, 2000 - 1 rain gauge over 100 Year (see slide 9)
  • August 19, 2005 - 12 rain gauges over 100 Year (see slide 11)
  • July 8, 2013 - 6 rain gauges over 100 Year (see slide 19)
The August 7, 2018 flood in Toronto involved only one Toronto rain gauge in the Open Data dataset exceeding 100 Year volumes. Therefore, assuming a cluster size of 5 dependent rain gauges within independent clusters that observe extreme events seems quite reasonable.


Conclusion - it is not rare to get more than five 100 Year rainfall observations at over 151 GTA gauges over 14 years. The chances range from near certainty (over 99.9%) for independent events at each rain gauge to a relatively high probability (over 40%) if gauges observe extremes in independent clusters of 5.

***

So what else does that tell us? There is a tendency to exercise 'availability bias', in the words of Daniel Kahneman, and to ignore statistics when making quick judgments about extreme events. A description of this and other "Thinking Fast" heuristic biases surrounding flooding and extreme weather is in this paper.

Most media reports seldom 'do the math' and often echo sources without question - that was the recent finding of the CBC Ombudsman on this topic of more frequent or severe extreme rainfall - see the Ombudsman ruling.

It's one thing for a reporter to echo IBC statements on extreme weather for a news story, but Toronto should be careful in taking on a court case with limited data - it would be great to see any IBC statistics or analysis (unlike in the Telling the Weather Story communications). Toronto should also be aware that its flood problems are due mainly to its own design standards in the original six municipalities, dating back to before the 1980s. Spatial analysis shows that is where the risks are and where flood reports are being made to the City of Toronto - see slide 36 in this review of flood risk factors, which clearly do not include more extreme weather. Partially separated systems have the highest risk, and Toronto has allowed development to occur in the past without mitigating risks (hence the famous decision against municipalities for gaps in their stormwater management practices, Scarborough Golf Country Club Ltd v City of Scarborough et al). The same applies to other GTA cities - see slide 7 in this presentation to the National Research Council's national workshop on urban flooding in February 2018 for flood vulnerabilities in the City of Markham, and see where Mississauga flood calls occur in this previous post (more than half of flood calls are in pre-1980 areas designed with limited resiliency for extreme weather).

So there has always been flooding:


And the most extreme rainfall intensities in Toronto over short durations happened in the 1960's:


And now extreme rainfall statistics from Environment and Climate Change Canada show decreasing short duration intensities since 1990 in and around Toronto:


... as shown in a previous post. These 5 minute 100 Year intensities dropped by between 4.0% and 8.1% from 1990 to 2016-2017, depending on the location.

Such decreases in short duration intensities are happening across southern Ontario as well, based on the newest Engineering Climate Datasets as shown here. Toronto should be careful in preparing for a legal challenge and any claims on flood causes.

As noted in my recent Financial Post op-ed, making a big deal about irrelevant risk facts distracts us from addressing the root cause of flood problems. The City of Toronto should try not to get distracted. And Councillor Mike Layton is probably in the running for a Milli Vanilli "Blame it on the Rain" award this year :)

***

Terence Corcoran covers this all very well in today's column, referencing analysis on this blog.

Note: probabilities for 5 or more events corrected/updated April 1, 2019. Thanks to keen readers for helping define the probabilities of combination events and for the nostalgic references to University of Toronto's Professor Emeritus Dr. Barry Adams' CIV340 course notes that outline the analysis approach.

***

What are the probabilities considering the updated number of stations (i.e., more in Peel), for a total of 173 stations? That is, 2422 trials if stations are independent, and 484 trials if stations are grouped in clusters of 5.

For 5 or more 100 Year storms in 14 years, the probability is 99.9% and 53.2% for independent and clustered gauges, respectively.

For 6 or more storms, the probability is 99.9% and 35.6% for independent and clustered gauges, respectively.
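These updated figures can be checked with the same binomial sketch used earlier (assuming, as before, a 1% chance per trial; the helper function is my own):

from math import comb

def prob_at_least(k, n, p=1 / 100):
    """P(X >= k) for a Binomial(n, p)."""
    return 1 - sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k))

for n in (2422, 484):                    # independent gauges; clusters of 5
    print(n, round(prob_at_least(5, n), 4), round(prob_at_least(6, n), 4))
# 2422 trials: both probabilities round to 1.0 (over 99.9%)
# 484 trials: ~0.5319 and ~0.3560, matching 53.2% and 35.6%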

Catastrophic Losses in Canada - Have Flood Damages Increased Significantly Or Have Changing Data Sources Affected Trends?

Disaster Losses Are Up
Catastrophic loss trends have been reported regularly in Canada, often in relation to flood damages. These have often been linked to climate change effects as well as other factors, which may include aging infrastructure (not a significant factor in our view) or urbanization and intensification (the true overriding factor in many urban centres). This post looks at how trends have changed in relation to changes in data sources.

GDP Adjusted Losses are Down
A blog post by the Institute for Catastrophic Loss Reduction (ICLR) discusses loss trend reporting by the Insurance Bureau of Canada. ICLR discusses but dismisses calls for adjusting losses for growth, which is commonplace in Munich Re NatCatSERVICE analysis and reporting, and which is promoted by many others (including my paper in the Journal of Water Management Modelling, which evaluated losses adjusted for net written premiums, and Roger Pielke Jr.'s work, such as that reported in FiveThirtyEight - see charts to the right - which also calls for evaluating trends considering GDP growth).

The ICLR notes "Normalizing disaster loss data to include such factors as growth in population, economic activity and building stock is not a simple undertaking. Further, there are many problems with using simple measures like GDP or insurance premium growth as a normalizer. For these and other reasons, I don’t want to go ‘there’ at this point ...".

So ICLR is content to use the following chart, which does not include GDP adjustments:

Losses in Canada Unadjusted for GDP Growth - 1983-2007 data per IBC survey, 2008 onward per CatIQ.

The ICLR notes a change in the data source for the above graph: "Bureau data begins at 1983. From that year to 2007, IBC uses data it collected itself through various company surveys conducted immediately after significant natural disaster events. It also uses various data from Property Claim Services (PCS), Swiss Re, Munich Re and Deloitte. After 2007, the Bureau only uses data from Catastrophe Indices and Quantification Inc. (CatIQ)."

How does the change in data affect reported losses? We can look at how the increase in losses has been reported, for example by the ICLR in 2016:

Catastrophic Loss Trends in Canada. Effects of change in data source on reported losses pre 2008.
Below the ICLR chart, the timing of the change in data source is shown. This indicates that the change in reported annual losses, from a $400M average up to 2008 to a $1B average afterward, corresponds to the change in data source in 2008.

More recently the Intact Centre on Climate Adaptation (ICCA) has reported trends in losses on TVO's The Agenda as shown in the chart below:

Intact Centre on Climate Adaptation cites changes in insurable claims on TVO (chart shown), with ICLR's noted change in data sources added below (IBC data up to 2007 and CatIQ data from 2008 onward).

Again, the change in data source is added below the ICCA chart. The lower losses of $200-500M up to 2008 and the higher losses, typically over $1B, from 2009 onward correspond to this change in data source.

Adjusting for data sources or for GDP does not really change priorities for flood risk and catastrophic loss reduction. But better characterization of the GDP-adjusted trend can give us insight into the effectiveness of past mitigation efforts, such as the more-resilient design standards that are common in modern practice. Without such GDP adjustment, one would think that everything is built as disaster-prone as it was in the past. Also, understanding the cause of the trend in losses helps focus adaptation or mitigation efforts in the proper place: if increases are explained by GDP growth as opposed to changes in extreme weather (shown not to be a factor), efforts are better placed on adapting infrastructure built to old, less-resilient design standards, as opposed to mitigation (e.g., GHG reduction).

A more detailed comment has been added to the ICLR blog post.

***

The paper "Trend Analysis of Normalized Insured Damage from Natural Disasters" by Fabian Barthel and Eric Neumayer (Department of Geography and Environment and The Grantham Research Institute on Climate Change and the Environment, London School of Economics and Political Science), published in Climatic Change, 113(2), 2012, pp. 215-237, explores "normalized" / GDP adjusted damages, examining trends for different types of events.

As noted in their abstract:

"As the world becomes wealthier over time, inflation-adjusted insured damages from natural disasters go up as well. This article analyzes whether there is still a significant upward trend once insured natural disaster loss has been normalized. By scaling up loss from past disasters, normalization adjusts for the fact that a hazard event of equal strength will typically cause more damage nowadays than in past years because of wealth accumulation over time. A trend analysis of normalized insured damage from natural disasters is not only of interest to the insurance industry, but can potentially be useful for attempts at detecting whether there has been an increase in the frequency and/or intensity of natural hazards, whether caused by natural climate variability or anthropogenic climate change."

The following charts from the paper show an increase in deflated (non-normalized) damage losses over time, and virtually no change in normalized losses.

Global deflated insured losses from natural disasters
Global normalised insured losses from all disasters
Similarly, the following charts illustrate normalized trends for convective storm events (4,165 disasters), showing a decrease; for all storms including winter and other storms but excluding tropical cyclones (4,369 disasters), showing a decrease; and for tropical cyclones (874 disasters), showing an increase.

Global normalized insured losses from convective events
Global normalized insured losses from all storm events except tropical cyclones


Global normalized insured losses from tropical cyclones


***

The Government of Canada has reported that the majority of loss increases have been due to growth (more exposed people, assets and wealth), and that climate change 'may' be having an effect - this contrasts with many media and insurance industry comments. The true driver of increased losses was reiterated in the just-released Canada in a Changing Climate: National Issues Report (see post: https://www.cityfloodmap.com/2021/06/national-issues-report-identifies.html). Canadian losses have been normalized for growth and show a moderate increase over time - the report notes that earlier data may be incomplete, which would affect the normalized trend as well (more complete older data could decrease the trend).