Why Confounding Variables Matter – On that UVic study attributing the 2017 Extreme Fire Season to Climate Change

One of the downsides of investigating evidence-based environmental decision-making as a hobby is that my real life often gets in the way. This means I am not always able to comment on every interesting paper when it comes out. One such example is the paper that came out in January from the University of Victoria titled Attribution of the Influence of Human-Induced Climate Change on an Extreme Fire Season. The paper has been a topic of intense conversation but very little critique. It is repeatedly cited by activists who have not read it but feel that its conclusions:

that the event’s high fire weather/behavior metrics were made 2–4 times more likely, and that anthropogenic climate change increased the area burned by a factor of 7–11.

help their political narrative. I keep expecting to read a serious challenge of its results, because it has a really obvious flaw that essentially eliminates its usefulness in quantifying anything, but I haven't seen one to date. I am surprised because, once you see how it treats confounding variables, it is impossible to take its quantification seriously. In the rest of this blog post I will explain why.

Since it is the basis of this discussion, let's explain the concept of a "confounding variable" in research design. The simplest description I've found online is this:

A confounding variable is an “extra” variable that you didn’t account for. They can ruin an experiment and give you useless results. They can suggest there is correlation when in fact there isn’t. They can even introduce bias. That’s why it’s important to know what one is, and how to avoid getting them into your experiment in the first place.

As a practical example, imagine you were comparing death rates from car accidents between 1964 and the present, and your hypothesis was that the decline in deaths was attributable to better, modern engines. Confounding variables might include the fact that modern cars have air bags, better seat belts and more survivable designs; all features that were not available in 1960s automobiles. If you did not find a way to correct for the presence or absence of seat belts, air bags and design considerations, then any person reading the study would instantly recognize that its results were invalid.
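To make the mechanism concrete, here is a minimal simulation of my own (it is not drawn from the paper or from the quoted definition, and every variable name and coefficient is invented) showing how an unmeasured confounder can manufacture an apparent effect where none exists:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 10_000

    # Hypothetical confounder: safety features (seat belts, air bags) improving over time.
    safety = rng.normal(size=n)

    # "Engine quality" also improves over time, so it tracks the confounder...
    engine = 0.8 * safety + rng.normal(scale=0.6, size=n)

    # ...but deaths in this toy world are driven ONLY by the safety features.
    deaths = -1.5 * safety + rng.normal(size=n)

    # Naive regression of deaths on engine quality (confounder ignored):
    naive_slope = np.polyfit(engine, deaths, 1)[0]

    # Regression that also controls for the confounder:
    X = np.column_stack([engine, safety, np.ones(n)])
    controlled_slope = np.linalg.lstsq(X, deaths, rcond=None)[0][0]

    print(f"naive slope (engine only):    {naive_slope:+.2f}")   # strongly negative
    print(f"slope controlling for safety: {controlled_slope:+.2f}")  # roughly zero

The naive fit attributes the drop in deaths to engine quality simply because engine quality moves together with the real cause; once the confounder is included in the regression, the apparent effect essentially vanishes.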

So how is this relevant to the UVic forest fire study? Well let’s look at what it compares:

As the CanRCM4 ensemble includes natural and anthropogenic forcings, we use the decade 2011–2020 to represent the current climate and an earlier decade, 1961–1970, to represent an alternative climate with reduced influence of human emissions

So much has changed between 1961 and 2011 that I expected to find a lot of work to deal with all the potential confounding variables. Imagine my surprise when I came across this text deep in the report:

The result is dependent on the regression model being realistic [my emphasis throughout]. Our regression model assumes that nonclimatic variability in the natural log of area burned is stationary in time and does not account for the possible influence of human factors such as changes in forest management or human ignition sources. Humans have long had a direct influence on fire activity (Bowman et al., 2011), and trends in some regions have been strongly impacted by human intervention (Fréjaville & Curt, 2017; Parisien et al., 2016; Turco et al., 2014). Syphard et al. (2017) demonstrated that climate influence on fire activity becomes less important with a strong human presence. We also do not consider directly the impacts of repeated suppression over time, which could result in larger fires, nor do we consider the pine beetle infestation that has affected BC

Stop and re-read that section. Their hypothesis is that climate change is the driving factor, but they didn't correct their work for any of the critical confounding variables. They simply ignored some of the most important considerations when discussing forest fire size, numbers and intensity. Let's look at them one at a time.

Pine Beetles

Obviously, the first issue to consider is the pine beetle infestation. As described by Natural Resources Canada:

Over 18 million hectares of forest were impacted to some degree [by the pine beetles], resulting in a loss of approximately 723 million cubic metres (53%) of the merchantable pine volume by 2012. The epidemic peaked in 2005: total cumulative losses from the outbreak are projected to be 752 million cubic metres (58%) of the merchantable pine volume by 2017.

The pine beetles killed massive swathes of our forests and turned them into dead wood just ready to burn. How can a study compare fire activity across these decades without accounting for the pine beetles? Certainly the authors cite an American study to justify their decision, but numerous Canadian studies indicate that beetle-killed stands have "higher fire spreading potential", among other considerations. Had the authors only missed the pine beetle issue it might have been a minor omission, but they also missed changes in forest management.

Forest Management

It is well understood that forest management has raised the fire risk in BC. The province has systematically been suppressing broad-leaf trees like aspen and birch, which provide natural fire protection, to make room for more commercially valuable conifer species like pine and Douglas fir. Those broad-leaf species are critical in large stands of trees: they are less prone to burning, create shade on the forest floor, reduce temperatures and promote higher humidity. Current forest management has changed the nature of our forests; this is not a hypothesis, it has been a stated policy of our forest management regime for decades. How can a study ignore this consideration? Not only have we changed the forest make-up, we have completely modified the fire regime via fire suppression.

Fire Suppression

After the Slave Lake fire in 2011 the Alberta Government sought advice on the fire situation. The result was the Flat Top Complex Wildfire Review Committee Report which made a number of recommendations and concluded:

Before major wildfire suppression programs, boreal forests historically burned on an average cycle ranging from 50 to 200 years as a result of lightning and human-caused wildfires. Wildfire suppression has significantly reduced the area burned in Alberta’s boreal forests. However, due to reduced wildfire activity, forests of Alberta are aging, which ultimately changes ecosystems and is beginning to increase the risk of large and potentially costly catastrophic wildfires.

Essentially, the report acknowledged that fire suppression efforts are making wildfires bigger and more dangerous. While the report was written for Alberta, the conclusions are entirely transferable to BC. Humans have interfered with the natural fire regime in order to protect forests for commercial use, and we have now created a situation where bigger, and more numerous, fires are a certainty. But that is not all, because we have also allowed encroachment into interface zones and provided added access to our forests.

Human Encroachment and Access

Another factor the paper missed is human access. For those of us who lived through the 1970s, one thing I can assure you is that access to the back-country has changed significantly since then. In the 1970s the resource road network did not exist. Huge portions of the province were essentially inaccessible except by air or on foot. This protected the forests from humans and their tendency to light things on fire or drop sparks from their engines. These days we can get to the back-country much more easily, which gives more opportunity for fires. Consider this comment from UBC professor Lori Daniels:

The easiest piece of the puzzle is population. There are simply more of us, in more pockets of the province, which inevitably increases the chance of man-made fires. Varying estimates suggest anywhere between 30 to 50 per cent of the current fires are caused by people.

This result is consistent with the BC Wildfire Service, which says that 40% of fires are caused by people. The greater access to the back-country has resulted in more area at risk from human impacts.

Conclusion

To conclude, let’s look at the confounding variables that were not considered in this study:

  • Pine Beetles
  • Forest management
  • Fire suppression and
  • Human encroachment and development

and yet this paper says it can provide an accurate quantification of the increase in forest fire activity between the 1960s and the 2010s due to climate change?

As with our hypothetical study that ignored seat belts, air bags and vehicle design, the confounding variables have to have had an effect on the two signature numbers: "2–4 times more likely" and "increased the area burned by a factor of 7–11". Absent controls for confounding variables, any quantification of the effect of climate change alone cannot be taken seriously. Certainly, it is entirely likely that climate change will eventually increase the likelihood of fire and even increase the area burned, but those 2–4 times and factor of 7–11 numbers are simply not credible given what we know about the disclosed confounding variables.
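To show how large a distortion ignored confounders can create, here is a deliberately artificial sketch (every number below is invented purely to illustrate the mechanism; none of it is an estimate of anything in the actual study). If the natural log of area burned responds partly to a climate trend and partly to a co-trending non-climatic factor such as fuel build-up, a regression on climate alone folds the second trend into the climate coefficient and inflates the implied decade-to-decade factor:

    import numpy as np

    rng = np.random.default_rng(1)
    years = np.arange(1961, 2021)

    # Invented generative model: log(area burned) responds to a modest climate
    # trend AND to a larger, co-trending non-climatic trend (e.g. fuel build-up).
    climate = 0.01 * (years - 1961)
    fuel = 0.03 * (years - 1961)
    log_area = climate + fuel + rng.normal(scale=0.3, size=years.size)

    # A climate-only regression cannot separate two co-trending drivers,
    # so the fuel trend is silently absorbed into the climate coefficient.
    beta_naive = np.polyfit(climate, log_area, 1)[0]

    # Change in the climate covariate between the two decades being compared:
    delta = climate[-10:].mean() - climate[:10].mean()

    print(f"factor actually due to climate:       {np.exp(1.0 * delta):.1f}")
    print(f"factor naively attributed to climate: {np.exp(beta_naive * delta):.1f}")

In this toy world the two drivers trend together, so the climate-only fit attributes essentially the entire increase to climate. Whether the real-world confounders behave this way is exactly what the study cannot tell us without controlling for them.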

17 Responses to Why Confounding Variables Matter – On that UVic study attributing the 2017 Extreme Fire Season to Climate Change

  1. prcyyc2 says:

    When "common sense" is mentioned in the world today it should be labeled "an uncommon occurrence". The articles written by "a chemist in langley" are filled to the brim with facts, thoughtful dialogue, well reasoned and researched discussion and COMMON SENSE. Bravo and please keep it up. With some luck perhaps it will be contagious!

  2. Chris Morris says:

    Another well-written article Blair but unfortunately, I think you are preaching to just the converted. It is very unlikely that any of the authors will see, or even address, your criticisms, and the paper will be cited by the IPCC or the like as proof of the deleterious effects of global warming.

    • Gerry says:

      Like you say, if it's only read by logical people, I suppose it's our duty to enter the echo chambers of the delusional and spread the knowledge as respectfully as possible.

  3. Bradley J Stokes-Bennett says:

    Keep writing Blair, you help take away fear from reasonable people and get them thinking about real solutions and strategy that has a chance of working.

  4. They cherry-picked BC forestry, but if you look at all of Canada here:
    https://cwfis.cfs.nrcan.gc.ca/ha/nfdb
    you will find their extreme ratios are nonsense.

  5. mdander says:

    Your discussion of confounding variables is not a “really obvious flaw”.

    As clearly discussed on page 7 of the report, pine beetle kill, forest management, human ignition sources and fire suppression are identified limitations of the study. The authors are not suggesting that these variables are extraneous, but they are making the case that, despite the fact that they are not able to quantify these effects, the very hot and dry conditions in the summer of 2017 made both the increased number and severity of the forest fires that year much more likely.

    A credible scientific critique of this study does not consist of putting in italics and bold letters what the study already clearly identifies. It requires quantified estimates of the impacts of the omitted variables.

    Climate change attribution studies are being done and they are increasing the body of evidence that the effects of anthropogenic climate change are happening now. As an advocate for evidence-based environmental decision making, you must understand that these numbers, as they get better, are needed to better understand what the economic impact of continuing to burn fossil fuels is vs. the economic impact of aggressively attenuating our production.

    When someone asks, “Should we build TMX?”, that comparison of numbers is what you need to hold up to support your case. Instead you attempt to discredit the science. When that science is not your specific scientific specialty, you are engaged in ideology, not scientific critique.

    As an environmental chemist, you have the credentials to read this paper and explain the limitations to non-scientists, but calling it flawed is overstepping your expertise. The same is true with your critiquing of whale habitat papers. You are also not qualified to have your own opinion about climate sensitivity — that’s not how science works. Your argument about the “Insects are Vanishing Paper” is much better — i.e. it is an explanation of the misuse of the science by the following news cycle, rather than a direct attack on the science.

    This study does not deserve your critique, it deserves follow-on studies that attempt to quantify the effects of the supposedly confounding variables.

    If you still don’t get what I am saying, take the sentence from your conclusion: “Certainly, it is entirely likely that climate change will eventually increase the likelihood of fire and even increase the area burned…”. You, a scientist who has done no published, quantified research in this field, directed this patronizing sentence at the authors of the study — living, breathing experts in climate change attribution.

    That is arrogance, not science.

    • Blair says:

      Your argument makes my case entirely. They don't say they can't quantify it; the entire point of my critique is that they present a detailed quantification. Perhaps you should read my post again because you appear to have completely missed the point.

      As for my expertise, this is exactly what I am trained to do. This type of analysis is how I cut my teeth.

      • mdander says:

        As usual, your article contains good research and valid analysis, but it doesn’t matter how well you have cut your teeth if you are chewing on the wrong thing.

        The study quantifies the effects of climate change on the 2017 BC forest fires, and they clearly identify the limitations of their results. Based on your arguments, you are correct to instruct anyone interpreting their results regarding the limitations. You could even criticize the study authors for not referencing the limitations in the abstract of their paper.

        However, your argument is that the science is flawed because they don’t have the means to quantify the effect of all variables that may have a confounding effect. In fact, the authors have provided useful analysis, quantified the results and they have appropriately qualified their results with their understanding of the limitations.

        As with pretty much all the scientific commentary available these days, it is not the science that is flawed, but how it is communicated.

        You may think that your job is to cut your teeth by finding ways to criticize scientific studies that don’t align with your ideological stance. With respect, I think that clear, unbiased science communication would be a better use of your considerable skills.

      • Blair says:

        When the identified limitations of the study all involve features that control the amount of area burned and the likelihood of larger or more intense fires (you know, the features being quantified in the report), then any conclusion that involves quantification becomes moot. The authors could say risk was increased, but their quantification of the increase is suspect due to the confounding variables.

        On your other point, apparently you are not familiar with the term “cut my teeth on” so let’s be absolutely clear. My graduate studies and training involved the formal deconstruction of peer-reviewed work to identify flaws in data analysis and presentation. I was specifically trained to do this type of analysis in a formal setting and while I did it primarily with environmental chemistry and toxicological work during my graduate studies, this is literally what I was trained to do.

      • mdander says:

        Your statement ending in "then any conclusion that involves quantification becomes moot" is incorrect. This study uses established mathematical means to analyse available data to produce quantified results. It also clearly identifies the limiting variables we are discussing.

        Your mistake is clear: Your statement that the quantification is moot is equivalent to saying that the confounding effect of the limiting variables is obviously large enough to invalidate the study. That is a qualitative statement. As such, it is sufficient to raise a red flag to policy makers regarding the use of this result as factual. It is insufficient to identify a flaw in a quantitative study.

        In a study such as this, you can only work with the data, tools and resources that you have at your disposal. As a consequence, there will always be limitations and, as long as the authors are up front about them, there is no flaw.

        As for how you cut your teeth, thank you for being clear. I now understand your use of the phrase as a simple appeal to authority.

        For what it is worth, I don’t believe that the quantified results of this study represent “the truth” any more than you do. I do however believe that the effects of anthropogenic climate change are being experienced now and I applaud anyone working to improve our ability to quantify how much. For my own appeal to authority, I will leave you with a link to another study (https://iopscience.iop.org/article/10.1088/1748-9326/aafc1b). Maybe you can find some qualitative reasons to dismiss it as well.

      • Blair says:

        Of course, if they are unable to quantify the variables they deliberately exclude, then their quantification becomes invalid. If the uncertainty bounds in the confounding variables exceed the observed signal, then the quantification is invalid. This is basic theory and I am uncertain why you don't understand it. Simply put, Dan, you appear not to understand how this type of analysis is conducted and what it entails.

      • mdander says:

        I couldn't make sense of your first sentence, so let's look at your second: "If the uncertainty bounds in the confounding variables exceed the observed signal, then the quantification is invalid."

        Correct! We agree and maybe I pass your test of understanding basic theory (or was that just a gentle ad hominem for good measure?).

        However, to state that the quantification is invalid, the uncertainty bounds would have to be quantitative, not qualitative as they are in your post.

        Again, to be super clear: having qualitative uncertainty bounds (i.e. your informed opinion) means that it is appropriate to communicate the science with a red flag about the limitations. Having quantitative uncertainty bounds (absent in this case) means that you can say definitively whether or not the limitations introduce excessive confounding and hence invalidate the study.

      • Blair says:

        And once again you demonstrate that you don't get it. The fact that they can't quantify their confounding variables, and that we know they are significant, invalidates any quantification. The noise could (and likely does) completely overwhelm the signal.

      • mdander says:

        You were very earnest in this post to provide a teaching moment to your readers regarding the nature of confounding variables. You keep saying I “don’t get it”. Maybe you should provide a teaching moment about qualitative vs. quantitative statements.

        I am sure that most of your readers think that the “2-4 times” and “factor of 7-11” of the study are like guesses that are largely affected by the bias of the author. In fact, in such a study, those numbers are the product of a very strict mathematical process that starts with a very specific set of assumptions (the limitations we are discussing being some).

        Near real-time climate change attribution studies are new. Sure, the numbers will not be right on the money right away, but it is a rapidly advancing field that will only get more accurate. Scientific evidence and policy are both iterative. The evidence available to our policy makers is getting more accurate all the time and this is a good thing. As an advocate for evidence based policy making, you should be excitedly applauding the efforts of the authors of this study (while clearly reporting the current limitations of their work).

      • Blair says:

        The problem is that the “very strict mathematical process” begins with an assumption that outside variables are not an issue. Once it is confirmed that outside variables are a significant issue the quantification goes out the window.

      • mdander says:

        No. The authors did not begin with “an assumption that outside variables are not an issue”.

        From the study: “While significant uncertainties remain in the attribution of area burned relating to the influence of nonclimatic factors, as discussed above, our finding of a significant anthropogenic contribution to the risk of extreme fire weather, based on multiple metrics and event definitions, increases the robustness of our overall result.”

        The authors have qualitatively justified that their identified limitations do not undermine the robustness of the result, just as you have qualitatively justified why they do.

        You disagree with them and the fact remains that this is their field of expertise, not yours. Arguably, both you and they have an interest (beyond simple scientific curiosity) in being proved correct.

        I don’t. Sitting here on the fence, it is clear that the degree to which the results are mitigated by potentially confounding variables can only be accurately ascertained by adjusting the models and adding that data in. That is clearly not something the authors can do in a timely manner — yet.

        Try believing for a moment that these scientists aren’t trying to pull the wool over our eyes and that their primary motivations aren’t money and prestige. There is no conspiracy. I, like you, believe strongly in evidence-based policy making (though I don’t blog about it, and respect the effort that you put into it) and the best evidence that climate change attribution scientists can provide is getting better, and that’s a good thing.

    • Rob Mceachran says:

      Although your appeal to authority is compelling, it does nothing to discredit the critique offered in this article.
      I see how the design of this study results in confounding and other forms of bias; inferring causation in any meaningful way (as in a numeric value) is impossible. So the suggestion that a fire is 2-4 times more likely is much closer to pure speculation than it ever will be to evidentiary-based science.
      Just my own unqualified opinion, which I realize I am not entitled to hold, but there it is anyway.
