Why political demands that we radically speed up decarbonization represent wishful thinking

This blog post started as a potential Twitter thread that got out of hand. It grew out of recent demands by major political organizations that Canada increase its pace of decarbonization. First it was the Canadian pact for a Green New Deal which demanded we:

cut Canada’s emissions in half in 11 years while protecting cultural and biological diversity.

Then the Green Party’s Mission Possible, which is looking to establish our new target of:

60 per cent GHG reductions against 2005 levels by 2030; zero emissions by 2050.

Most recently we have the Assembly of First Nations calling on the other levels of government to:

reduce emissions in Canada by 60% below 2010 levels by 2030 and reach net-zero emissions by 2050.

It is like each organization is attempting to claim the moral high ground and trying to outbid their rivals to prove their environmental plan is the Greenest.

The problem with these demands is that they betray a lack of understanding of where greenhouse gas emissions come from and what it will take to achieve our decarbonization goals. It is unclear whether this reflects a political ploy or a true misunderstanding of the scope of the problem we face. In either case, it appears necessary to explain what achieving our decarbonization goals will involve. In doing so I hope to explain why the unrealistic goals of these organizations reflect an unhelpful form of wishful thinking.

The first thing to understand about decarbonization is that it is not just about giving up high-carbon energy sources but about replacing them with lower- or zero-carbon energy sources. We can’t simply give up producing food; we have to decarbonize the food production system. We can’t simply give up transporting food to communities; we have to decarbonize the means by which food reaches them. We cannot simply give up heating our homes in winter; we must switch from higher-carbon heating (coal, fuel oil, natural gas) to lower-carbon alternatives like electric heating or heat pumps.

Switching over modes of energy generation, and consumption, means replacing existing infrastructure with different infrastructure. In some cases, it means replacing existing technologies with still undeveloped technologies or technologies not currently available in the mass market.

These replacement technologies, and this new infrastructure, won’t simply materialize overnight. They need to be designed, tested and built. Each step in that process consumes time and resources. Moreover, since these technologies often depend on similar supply chains, accelerating the development of one may limit our ability to develop another. As an example, there is not enough lithium available to create all the batteries needed for a complete transition to electric vehicles and for battery back-ups for electric homes.

Also recognize that Canada does not operate in a vacuum. Other jurisdictions are also seeking to reduce their carbon footprints and so are also laying claim to limited resources to achieve their goals. Every electric automobile built in North America, and sold in the United States, is one less North American electric automobile available for purchase in Canada.

It is also important to understand that supply chains are limited. Trains moving one commodity are not available to move another, which is why we have seen massive backlogs in grain transportation over the last decade.

Going back to our initial challenge: building infrastructure takes time. Right now, Metro Vancouver is planning an upgrade to the transit system. Given the limitations of our planning processes, they anticipate the newest major transit infrastructure won’t be completed for over a decade.

Yet here we have political groups demanding that we completely upend our national energy system within a decade.

Understand, achieving a 50% reduction in GHG emissions means replacing all that fossil fuel energy with some other form of energy, likely electricity.

Before you can replace that energy with electricity you must build facilities to generate that electricity. That means building thousands of individual solar, wind, tidal, wave or hydro units, and each one of those units involves planning and financing. You can’t just say “I am going to build a wind facility” and then do it the next day. You must identify appropriate sites; you must get the appropriate permits; you must carry out environmental assessments and adjust your plan to reflect the results; you must secure financing; you must undertake First Nations consultation and you must incorporate the results of that consultation into your project.

Each “must” step above takes time and that list is just the steps before you start construction.

Now let’s look at the scope of the problem. As I described previously, a 50% reduction in our greenhouse gas emissions would require that we:

  • Essentially eliminate the personal vehicle
  • Eliminate our oil sands and natural gas industries
  • Retrofit every household in Canada that uses natural gas for heat and/or hot water
  • Eliminate all our fossil fuel electricity capacity
  • Build the electrical capacity to provide the power for all those EVs, hot water heaters and home heating systems and
  • Build an entire electricity transmission system to move all that power around.

Moreover, we need to do this while

  • Dealing with the massive recession that comes from destroying our oil & gas industry
  • Paying for a massive upgrade to our public transportation infrastructure to deal with the fact we virtually eliminated personal automobiles
  • Paying for massive retrofits for virtually every household in the country that uses natural gas or fuel oil for heating and hot water
  • Paying for a massive increase in renewable electricity capacity to deal with the sudden jump in demands and the loss of fossil fuel electricity infrastructure
  • Paying for the massively upgraded transmission capacity to move all that new renewable electricity from where it was generated to where it is needed.

To understand the complexity, let’s briefly look at one single step: upgrading our electrical grid.

From a planning perspective, building an upgraded grid would involve identifying a route. That route needs to be surveyed which takes time. An environmental assessment would need to be carried out on the new route to identify the potential ecological effects of the project. Since it is a massive project that assessment would have to include seasonal information. Once an initial route has been identified, consultation will have to be undertaken with any affected communities and First Nations. These consultations must be carried out in the spirit of understanding and will likely require re-routing portions of the project. Any re-routing would require subsequent environmental studies. Given all this pre-planning, for a single linear development we are already 2+ years into the process and haven’t put up a single meter of line.

When it comes to construction, we must consider seasonality. You can’t cut trees during the nesting season and you can’t build river crossings during the fisheries runs. Work will also have to slow down or stop during the heart of winter. This adds more time. Ultimately, to achieve our goal we need to build a backbone of high-power transmission lines which will then connect to a series of laterals and we haven’t even started the process on these laterals. This is not the work of a decade; this is the work of multiple decades.

Moreover, that is just the transmission lines; we haven’t even started on all the solar facilities, wind farms, and tidal and wave plants.

Do these appear to be a series of steps that are even vaguely possible to complete before 2030? We are talking about completely remaking our economy on the fly. All the while respecting the needs and desires of legitimate interests including our natural environment, our First Nations partners and our global neighbours. This in a country where a motivated local government can’t get a transit line built in under a decade. Don’t even get me started on the costs. If you imagine medical wait times are long today, imagine what they will be after we completely ignore any investment in our medical system for a decade so we can dedicate ourselves to the hopeless task of getting that energy system built.

To conclude, my understanding is that these groups often see their goals as aspirational rather than literal; at least, that is what I hope is true. Admittedly, the Green Party claims their plan is “possible”, which is why they named it “Mission Possible”. Looking at the steps involved, however, there is simply no way any reasonable group of policy experts could honestly believe we could achieve these goals in a decade. Anyone who claims otherwise is either lying to you or is ignorant, and neither of those choices looks good on a political party. But even if we imagine these demands are merely aspirational, I don’t see the point. What point is there in demanding the impossible? All it does is cause the hesitant to plant their feet more firmly while feeding red meat to opponents.

If we are going to achieve our climate goals it will be through incremental change. Set tough goals and then work like the dickens to meet those goals. Certainly, we need to set long-term goals and clarify our aspirations but demanding the sun, the moon and the stars is not how you get things accomplished. As for the people saying “it is a climate emergency we have to get this done” my response is: How? We live in a world of linear time and finite resources, simply demanding the impossible contributes nothing.


On the Channel 4 News video – Uncovered: Canada’s Dirty Oil Secret – An embarrassing hit piece full of errors and falsehoods

Recently, I was directed to a report prepared for viewing on Facebook called Uncovered: Canada’s Dirty Oil Secret by Channel 4 News, which is reportedly a news program. After watching the report I wasn’t entirely sure what to say. My first response was to write:

It is factually wrong and uses strategic interviews with recognized opponents who say things that are demonstrably untrue and then presents their words uncritically on the screen

While my statement is true, it doesn’t make for much of a blog post so I suppose I will have to provide a bit more detail. The following is my attempt to highlight some of the incredibly bad journalism in this execrable video.

The report is only 10 minutes 20 seconds long, but it packs a lot of misinformation, errors and falsehoods into that short span. Let’s start with the introductory statement:

Canada, as much of the world sees it. A progressive country. A land of unspoiled wilderness. But in Alberta’s devastated oil sands an image of a very different Canada

It takes only four sentences to get to what can gently be called misinformation. The Athabasca oil sands lie beneath 142,200 km² of land. The disturbed area encompasses less than 1% of that. It is very hard to understand how surface impacts occupying less than 1% of an area can fairly be called “Alberta’s devastated oil sands”.

Fourteen seconds into the video and we have another falsehood.

Canada is warming twice as fast as the rest of the world, driven by intensive oil and gas extraction

While it is true that Canada is warming twice as fast as the rest of the world; that extra warming is not driven “by intensive oil and gas extraction“. As explained by the global climate models, warming will occur first in the extreme north and extreme south of the planet. Canada occupies more of the extreme north than any country other than Russia and that is why we are heating faster than countries near the equator.

Now I have to be clear here: if I tried to conduct a second-by-second debunking of this video I would be here all week, so I will stick to the highlights hereafter. Let’s go to 34 seconds into the piece:

[the pipeline] will carve through sacred indigenous lands….two different cultures dollar versus the wind, the water…raise carbon emissions while risking devastating oil spills…there will be no cleanup whatsoever.

This section is a load of hokum followed by a direct mistruth.

The vast majority of the pipeline will go along an existing right-of-way that has undergone intense archaeological study. Pretending that the lands under the majority of the pipeline are “sacred” is a false narrative. Arguing that First Nations have a different culture ignores that First Nations are bidding to own the pipeline. As for the mistruth, the claim that there would be no clean-up of spills is calumny that the editors must know is categorically false. That they chose to present it in their video immediately indicts their motives.

I know the argument the editors will make: we are only presenting what these people have to say. The problem is that presenting information that you know, or should reasonably know, is false and doing so under the guise of providing an “opinion” simply doesn’t cut it for a reputable news organization.

We are less than a minute into this report and we have encountered enough falsehoods that I need a break. Unfortunately, the video doesn’t give us a break; it just continues. At minute 1:

I’ve come to ground zero of Canada’s environmental devastation. Fort McMurray, Alberta. It’s hard to imagine that the ravaged land below was once carpeted with forest

[Inigo Gilmour] Canada’s tar sands, Canada’s most shameful environmental secret. Below me is what’s been called the largest and most destructive industrial project in human history.

Let’s start with the visuals. The camera work is careful to provide incredibly narrow shots, likely because wide shots would have shown that the impacts, while significant, are not widespread. You can actually see unaffected forest at the top of the frame in numerous shots.

Now I want to note a curious feature of this report (featured in the title). The video presents a bizarre narrative that the oil sands are simultaneously a single, massive industrial project and yet they are also some massive secret. This “secret” theme repeats throughout the piece. Channel 4 apparently believes the oil sands are Canada’s Manhattan Project that we keep hidden from the world. If only the North Korean nuclear program was kept as well hidden as the oil sands.

As for the claims by Mr. Gilmour, once again they choose to use quotations because what he says is far from the truth.

The oil sands aren’t one big project but a number of projects separated from each other by massive swathes of forest. They are not one of the largest and most destructive projects in human history. They don’t come close. They pale in comparison to the Soviet destruction of the Aral Sea and if we are talking ecological destruction, I would argue the City of London and its boroughs have less biodiversity and natural habitat than the city of Fort McMurray and its environs.

This is a funny thing about the ecological hypocrites from urban European cities. They look at Canada, which has protected massive areas while leaving others unaffected by development, and complain we aren’t doing enough. What percentage of England has been set aside as permanent ecological reserves? How much of England has been dedicated to parks? What does England’s natural biodiversity look like?

Let’s jump now to 1:35.

The new Trans Mountain Pipeline will span over a 1000 [kilometres] through western Canada. Its export serving the Asian market. It will treble oil production from 300,000 to 890,000 barrels per day.

Apparently our narrator did not get the memo about there being no Asian market for Alberta crude. As for the next line, it is simply wrong. The pipeline will not increase production to 890,000 barrels per day; it will provide transport for existing production. Any reasonable editorial fact-checking should have caught this error.

The next couple of minutes involve the narrator interviewing Chief Alan Adam, who lives downriver of Fort McKay. He claims that the oil sands have affected the river, including claims that oil sands developments have added heavy metals to it. This is simply not the case. The Athabasca River and its tributaries run directly through the oil sands, and natural hydrocarbon seeps have been contributing impacts to this river system for tens of thousands of years.

As described in the academic literature, studies of the region’s rivers and lakes often find higher concentrations of mercury, trace metals, methylmercury, naphthenic acids and other dissolved organics upriver of the oil sands. Put simply, the independent academic research indicates that the river impacts are mostly natural in origin, with only minor anthropogenic contributions.

As for the “state of the health of the community”: Alberta brought in teams of professional epidemiologists to assess conditions in the Athabasca region. They did a comprehensive analysis and established that the rates of various illnesses were consistent with what would be expected in these communities. No cancer clusters existed. The final report was completed in 2014 (presented in full at the Alberta Health Cancer page) and debunked the claims of activists (as described in the CBC follow-up report: “Higher cancer rates not found in oil sands community, study shows”).

Now comes a realization. I am over 1,100 words into this post and only one quarter of the way through the video. Having demonstrated that virtually everything is presented through an anti-oil sands (or “tar sands”, in their case) lens, I will stick to debunking the apparently deliberate misinformation and outright errors for the rest of the video.

At 4:30 the narrator claims the pipeline will produce “8.8 metric tonnes of carbon dioxide” which the narrator claims is equivalent to 2.2 million additional cars on the road every year.

At this point it is eminently clear that no one involved in this project has a clue what they are talking about, because even a well-educated child would recognize the error in this sentence. They are pulling from the City of Vancouver presentation on the Trans Mountain, and the correct number was 8.8 million tonnes. Apparently the gang that couldn’t shoot straight doesn’t understand the topic well enough to recognize that MT means million tonnes rather than metric tonnes. This error actually made me laugh because it so clearly displays the ignorance of the team that they would get such a basic piece of information completely wrong. Remember, this report supposedly went through a fact-check and they still published a number off by a factor of a MILLION.
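A quick back-of-the-envelope check shows why the video’s own cars comparison only works with the million-tonnes reading. (The ~4 tonnes of CO2 per car per year implied here is my inference from the two quoted figures, not a number stated in the video.)

```python
# The video pairs "8.8 MT of CO2" with "2.2 million cars on the road".
# That pairing only makes sense if MT is read as million tonnes:
implied_tonnes_per_car = 8.8e6 / 2.2e6
print(implied_tonnes_per_car)  # 4.0 tonnes CO2 per car per year -- plausible

# Read literally as 8.8 metric tonnes, each "car" would emit mere grams:
literal_grams_per_car = (8.8 / 2.2e6) * 1e6  # tonnes -> grams
print(literal_grams_per_car)   # ~4 grams CO2 per car per year -- absurd
```

Four tonnes per car per year is in the right ballpark for an average passenger vehicle; four grams per year is obviously nonsense, which is exactly the point.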

Of course what is an anti-pipeline video without Kanahus Manuel who makes her appearance around 6:20 into the video. She makes a number of false statements including saying that in case of spills “there will be no clean-up whatsoever” that “this is a bitumen pipeline” that bitumen from the pipeline “will actually sink“, that “there’s no exact cleaning methods” and that there were pipeline spills in their territory “where clean-up never happened“.

Every one of these statements is untrue. I’m not going to sugar-coat this because Ms. Manuel cannot possibly be this ignorant this long into her struggle. She can’t possibly not know the truth. She appears to be lying, and the producers of the video have failed to ensure that the report presents a true picture of the situation.

What is the truth? Every spill on the Trans Mountain has been carefully documented and has involved a clean-up. The details of every spill are preserved by the National Energy Board. The pipeline will carry diluted bitumen which is chemically different from bitumen. Diluted bitumen floats on water and the science is clear there are numerous means of cleaning a dilbit spill. It is hard to imagine that the editors of the video could allow this many errors in a row by accident.

At 7:45 we get to the Stoney First Nation and the narrator repeats a claim that the Stoney were not consulted on the pipeline, except that is not true.

At 9 minutes comes another incredible claim: that the pipeline will result in 250 oil tankers a month in Burrard Inlet. Try as I might, I can’t figure out where they got that number; it appears to have been created from thin air. The project is expected to generate about 400 tankers a year [okay, I seem to remember the actual number from the NEB was 408, but I could be off by a couple]. Once again it is hard to attribute this bad number to ignorance, but given the serious errors made prior to this it is entirely possible that the video producers are so completely uninformed that they chose not to look at any of the documentation on the project and simply repeated a number presented by others. In any case, this is the sort of thing a competent fact-checker should have caught and fixed. It doesn’t just make Channel 4 look bad, it makes it look incompetent.
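The scale of the discrepancy is easy to check; a two-line sketch using the ~408-per-year NEB figure recalled above (treat it as approximate):

```python
claimed_per_year = 250 * 12  # the video's 250 tankers/month, annualized
neb_estimate = 408           # approximate NEB projection recalled above

print(claimed_per_year)                            # 3000 tankers per year
print(round(claimed_per_year / neb_estimate, 1))   # 7.4 -- roughly 7x the estimate
```

Even allowing for uncertainty in the NEB number, the video’s figure overstates tanker traffic by something like a factor of seven.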

The next line is Reuben George suggesting there is an 87% chance of a spill, with a million people getting sick within hours. These numbers are nowhere near the truth. According to expert studies, the increase in major spill risk is negligible, and the suggestion that a million people would get sick is the result of Mr. George’s consultant making an embarrassing mistake: confusing a pseudo-surrogate with an actual compound in oil. Mr. George then repeats long-debunked economic arguments and ends with a threat.

I finally, and thankfully, come to the end of this painfully bad propaganda piece. After watching this video I can only express disgust and dismay. It is full of misinformation, half-truths, easily identifiable errors and what some would call outright lies. That so much bad information got through the “fact-checking” and so many falsehoods made it through the editing process erases any doubt I had about the motives of the producers. Channel 4 needs to pull this video and fix it. The errors are an embarrassment to any organization that claims to present the news.

Author’s Note:

In an earlier version of this post I mistakenly indicated that Channel 4 was associated with the BBC. That was incorrect. This post has been revised to address this error. My apologies to the BBC for associating them with this drek.


The Green Party’s “Mission Possible”: a cool name for a policy proposal that is not ready for prime time.

On May 16th Elizabeth May unveiled the Green Party’s Mission Possible, their 20-step “Green Climate Action Plan“. While I have to admit “Mission Possible” is a very cool name, the plan repeats what we saw with their Canadian “Green New Deal“. It is simply not ready for prime time. When you start looking at the details it becomes clear that the Green Party needs to assemble a policy team that understands energy issues, infrastructure development and logistics, because this plan demonstrates a woeful lack of expertise on these topics and more. As this is only a blog post I won’t address all 20 steps here. Instead, I will address a handful of the steps on topics with which I am familiar.

To begin let’s start with the biggest challenge: modernizing the grid. This one is particularly important because many of their subsequent steps rely on easy access to copious amounts of low-carbon electricity.

9 – And modernize the grid

By 2030, rebuild and revamp the east-west electricity grid to ensure that renewable energy can be transmitted from one province to another.

While this is a necessary goal if we are going to achieve our long-term climate ambitions, Canada’s vast and challenging geography has limited our ability to create a nationally integrated power grid. Even the most optimistic view has a new grid costing $25 billion and taking a couple of decades to build. A more realistic appraisal puts the cost of a national backbone of 735 kV transmission lines at around $104 billion and the time to complete it at 20 years.

While a reasonable observer would note that $104 billion, while expensive, is doable, my biggest concern with “Mission Possible” is the time it allocates to achieve these goals. Put simply, building infrastructure takes time. Even war-time mobilizations can’t eliminate Canadian winters or the breeding/nesting seasons, so unless we decide to ignore every environmental law on the books we will be limited to clearing and building during limited portions of the year.

One other thing is for certain. Building this grid will require a massive, permanent transfer of land rights. As we know from watching the pipeline debates, linear developments affect every community they go through and I can’t see affected First Nations voluntarily giving up rights to tens of thousands of hectares of land without consultation. Given recent history, I can’t see major work starting until years after the projects are proposed and, as demonstrated by the Trans Mountain pipeline expansion project, federal ownership of the project does not mean it will get a free ride through the courts. I have no doubt that any costs to build the transmission system will need to be supplemented with large sums to compensate individuals and First Nations affected by the grid upgrades.

As for the pace of the work? Well consultations take time and the court has made it abundantly clear you can’t rush consultations.

After the national backbone has been built, we will still need to work on all the feeder lines that will have to go to every city, town and hamlet. Building transmission lines in Canada can be intensely expensive; consider that the Northwest Transmission Line project in BC cost over $2 million a kilometer to build. If the single main line is $104 billion, what will tens of thousands of kilometers of feeder lines cost? Even taking into account the existing infrastructure, we are talking stupendous sums to complete this task. It is simply not possible that we could achieve this goal by the year 2030.

This leads to an obvious problem: if the electricity isn’t there, then where are all the electric vehicles going to get their electricity?

10- Plug in to EVs

By 2030 ensure all new cars are electric. By 2040, replace all internal combustion engine vehicles with electric vehicles, working with car makers to develop EVs that can replace working vehicles for Canadians in rural areas. Build a cross-country electric vehicle charging system so that drivers can cruise from St. John’s, NL to Prince Rupert, B.C. – with seamless ease.

Many others have written about the challenges of decreasing the number of internal combustion engine (ICE) vehicles on our roads so I won’t repeat their criticisms here. Instead, let’s consider the load forecasts. As I previously calculated, simply replacing the gasoline burned in BC (accounting for the increased efficiency of electric vehicles over ICE vehicles) would require approximately 15,800 GWh of electricity, or about 3 Site C dam equivalents [5,100 GWh each]. Want to replace all those diesel vehicles as well? That is the energy equivalent of about 11,400 GWh (2.2 Site C dams). These are not trivial numbers, and they represent British Columbia’s demand only. In combination with other steps (discussed below) the increase in electricity demand will require us to essentially double our national electricity generating capacity. Renewables are great, but the scale of this problem seems not to have been noticed by the policy folks at the Green Party. We are talking about absolutely massive increases in our electricity generation system, with all the associated costs and time limitations built into those upgrades.
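The Site C arithmetic above is easy to reproduce. A minimal sketch using the post’s own figures (15,800 GWh for gasoline, 11,400 GWh for diesel, 5,100 GWh per Site C dam):

```python
SITE_C_GWH = 5_100  # annual output of one Site C dam, GWh (figure from the post)

# Electricity needed to replace BC road fuels with EVs (author's estimates)
demand_gwh = {"gasoline": 15_800, "diesel": 11_400}

for fuel, gwh in demand_gwh.items():
    print(f"{fuel}: {gwh:,} GWh ≈ {gwh / SITE_C_GWH:.1f} Site C equivalents")
```

That works out to about 3.1 Site C dams for gasoline and 2.2 for diesel, and again, that is one province’s road transport alone.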

12- Complete a national building retrofit

Create millions of new, well-paying jobs in the trades by retrofitting every building in Canada – residential, commercial, and institutional – to be carbon neutral by 2030.

I live in a relatively efficient 25-year-old house. Like most of my neighbours, I rely on natural gas for heat and hot water, and my house was not built to passive housing standards. To retrofit my house to be “carbon neutral” would require removing and replacing the heating and hot water systems (plus a lot of insulation upgrades which I won’t go into in this post), and I am not alone.

According to the National Energy Board, 58% of households in British Columbia rely on natural gas for heating; in Ontario it is 67% and in Alberta it is 79%. To achieve step 12 we would need to retrofit all those houses by 2030. Consider that, according to StatsCan, Ontario had 5,169,175 households in 2016. 67% of that number represents around 3,500,000 houses needing retrofitting, or about 350,000 a year by 2030. That is essentially 1,000 a day, every day of the year, between now and 2030, so it will certainly create jobs.
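The retrofit pace works out as follows (assuming roughly ten years of work remaining and ignoring new construction entirely):

```python
ontario_households = 5_169_175   # StatsCan, 2016 census
natural_gas_share = 0.67         # NEB figure for Ontario
years_remaining = 10             # roughly "by 2030" at time of writing

houses_to_retrofit = ontario_households * natural_gas_share
per_year = houses_to_retrofit / years_remaining
per_day = per_year / 365

print(f"{houses_to_retrofit:,.0f} houses")  # ~3.46 million
print(f"{per_year:,.0f} per year")          # ~346,000 per year
print(f"{per_day:,.0f} per day")            # ~950 per day, every day
```

Roughly 950 gas-heated houses would need to be converted every single day, in Ontario alone, with no pause for winters, labour shortages or supply constraints.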

I would also note that any requirement to retrofit will require some sort of compensation to home-owners required to expend thousands of dollars to replace perfectly functional hot water heaters and furnaces. Sure one might argue that the government could simply refuse to provide compensation, but a program that alienates 67% of households in Ontario would never pass political muster. No sane government would try it and so the only way it happens is if the government pours billions of dollars into the program.

Since this blog likes to consider energy, let’s also consider what this means for load forecasts. According to our natural gas supplier, natural gas for household use represents about 64 petajoules (PJ) of energy in British Columbia alone. Put another way, 64 PJ is equivalent to about 17,750 GWh, or more than 3 Site C dams’ worth of additional power in BC alone. These numbers are starting to add up pretty fast now, aren’t they? And we haven’t even considered the commercial or institutional retrofits.
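The unit conversion behind that figure is straightforward (1 PJ = 10^15 J and 1 GWh = 3.6 × 10^12 J):

```python
GWH_PER_PJ = 1e15 / 3.6e12   # ≈ 277.8 GWh per petajoule
SITE_C_GWH = 5_100           # annual Site C output, GWh (figure from the post)

bc_household_gas_pj = 64     # BC household natural gas use, PJ
gwh = bc_household_gas_pj * GWH_PER_PJ

print(f"{gwh:,.0f} GWh ≈ {gwh / SITE_C_GWH:.1f} Site C dams")  # ~17,778 GWh ≈ 3.5 dams
```

That is household heating and hot water only; commercial and institutional gas loads would add further Site C equivalents on top.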

13- Turn off the tap to oil imports

End all imports of foreign oil. As fossil fuel use declines, use only Canadian fossil fuels and allow investment in upgraders to turn Canadian solid bitumen into gas, diesel, propane and other products for the Canadian market, providing jobs in Alberta. By 2050, shift all Canadian bitumen from fuel to feedstock for the petrochemical industry.

A lot of people agree that it is desirable for Canada to be self-sufficient in oil. While a positive idea, it ignores the geographic and infrastructure realities of Canada. In 2018, Canada produced about 4.6 million barrels per day (MMb/d) of crude oil. The problem is that western Canada produced about 95% of that oil while the vast majority of consumption takes place in Eastern Canada. In 2017, the Hibernia oil field produced about 220,800 barrels per day (b/d). The Irving refinery in Saint John, meanwhile, consumes 320,000 b/d all on its own. Put simply, all of Newfoundland and Labrador’s oil production is insufficient to supply that one refinery in the Maritimes; it doesn’t come close to replacing the oil imported to supply the Maritimes, Quebec and Ontario.
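The mismatch can be made concrete with the two figures quoted above:

```python
hibernia_bd = 220_800   # Hibernia production, barrels/day (2017)
irving_bd = 320_000     # Irving Saint John refinery intake, barrels/day

shortfall = irving_bd - hibernia_bd
print(f"{shortfall:,} b/d short at one refinery alone")  # 99,200 b/d
# ...before counting any demand from Quebec or Ontario refineries
```

East-coast production falls nearly 100,000 b/d short of one refinery’s intake, before a single barrel reaches Quebec or Ontario.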

The Green Party has spent the better part of the last two decades blocking the mass movement of oil from western Canada to Eastern Canada. We simply cannot get the oil from where it is produced to where it is used absent a massive investment in infrastructure (like, say, an Energy East pipeline). Absent that investment we cannot end oil imports in Eastern Canada.

As for the idea of developing a 4 MMb/d petrochemical industry in Alberta? That is simply magical thinking. Due to their volatile nature, petrochemicals are generally produced close to where they are consumed, so good luck finding foreign investors willing to cover the costs. This would be another multi-billion dollar government investment in fossil fuel infrastructure… you know, the type of subsidy the Green Party loudly decries every day. Since this post is getting long, I won’t delve further into that topic.

14 – Switch to bio-diesel

Promote the development of local, small scale bio-diesel production, primarily relying on used vegetable fat from restaurants. Mandate the switch to bio-diesel for agricultural, fishing and forestry equipment.

This represents another case of the Greens identifying a technology that sounds good on paper but raises significant concerns when you look more deeply. It ignores the challenges of scale. Specifically, how many restaurants do the Greens think exist in Saskatchewan to replace all of that province’s agricultural diesel?

Additionally, switching over to pure bio-diesel poses significant challenges for modern engines. Bio-diesel produces less energy per litre and has significant issues with filter plugging and engine compatibility when it represents more than about 20% of the blend. It is another case of the Green Party saying something that sounds clever until you take a close look under the hood.


I think I can stop here. I have looked at only 5 of the 20 steps and shown each one to be impossible or impracticable in the time-frame provided. I haven’t even mentioned that step 7 – “Ban Fracking” – would make it impossible to develop geothermal energy resources, or that the “ban fracking” statement is inconsistent with the most recent science on the topic (the Scientific Review of Hydraulic Fracturing).

Rather let’s just recognize that from my brief review it is clear that the Green Party either lacks the internal expertise to create reasonable policy or it has chosen to ignore that internal expertise when producing its policy proposals. I say this because I am not providing particularly earth-shattering insight here. The information I have noted is understood by literally hundreds, if not thousands of informed analysts across the country and any one of them could provide a detailed analysis of the flaws in this proposal to build on what I have presented here.

If the Green Party wants to be taken seriously in October, it has to start recognizing that its signature policies are going to be looked at more closely than they were in the past. This “Mission Possible” document makes it clear they are not yet ready for such scrutiny.

Posted in Climate Change, Climate Change Politics, Renewable Energy, Site C, Uncategorized | 13 Comments

Why Confounding Variables Matter – On that UVic study attributing the 2017 Extreme Fire Season to Climate Change

One of the downsides of my investigation of evidence-based environmental decision-making being a hobby is that my real life often gets in the way. This means I am not always able to comment on every interesting paper when it comes out. One such example is the paper that came out in January from the University of Victoria titled Attribution of the Influence of Human-Induced Climate Change on an Extreme Fire Season. The paper has been a topic of intense conversation but very little critique. It is repeatedly cited by activists who have not read it but feel the conclusions:

that the event’s high fire weather/behavior metrics were made 2–4 times more likely, and that anthropogenic climate change increased the area burned by a factor of 7–11.

help their political narrative. I keep expecting to read a serious challenge of its results, because it has a really obvious flaw that essentially eliminates its usefulness in quantifying anything; but I haven’t seen one to date. I am surprised because once you see how it treats confounding variables it is impossible to take its quantification seriously. In the rest of this blog post I will provide an explanation for this statement.

Since it is the basis of this discussion, let’s explain the concept of a “confounding variable” in research design. The simplest description I’ve found online is this:

A confounding variable is an “extra” variable that you didn’t account for. They can ruin an experiment and give you useless results. They can suggest there is correlation when in fact there isn’t. They can even introduce bias. That’s why it’s important to know what one is, and how to avoid getting them into your experiment in the first place.

As a practical example, imagine you were comparing death rates from car accidents between 1964 and the present and your hypothesis was that the change in deaths was attributable to better, modern engines. Confounding variables might include the fact that modern cars have air-bags, better seat-belts and more survivable designs; all features that were not available in 1960s automobiles. If you did not find a way to correct for the presence or absence of seat-belts, air bags and design considerations, then any person reading the study would instantly recognize that its results were invalid.
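To see just how badly an ignored confounder can distort an estimate, here is a minimal toy simulation. Every variable name and magnitude in it is invented purely for illustration; nothing here is drawn from the UVic study itself:

```python
import random
import statistics

random.seed(1)

# Toy illustration of omitted-variable (confounding) bias.
# y = some outcome (think "log area burned"); x = a climate signal;
# z = an unmodelled confounder (think "fuel build-up") that also
# trends upward over the same period.
n = 200
t = [i / n for i in range(n)]
x = [ti + random.gauss(0, 0.05) for ti in t]   # climate trends with time
z = [ti + random.gauss(0, 0.05) for ti in t]   # confounder trends with time too
TRUE_CLIMATE_EFFECT = 1.0
CONFOUNDER_EFFECT = 2.0
y = [TRUE_CLIMATE_EFFECT * xi + CONFOUNDER_EFFECT * zi + random.gauss(0, 0.05)
     for xi, zi in zip(x, z)]

# Naive simple regression of y on x alone (slope = cov(x, y) / var(x)).
mx, my = statistics.fmean(x), statistics.fmean(y)
slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))

print(f"true climate effect: {TRUE_CLIMATE_EFFECT}")
print(f"estimated effect ignoring the confounder: {slope:.2f}")
```

Because the confounder moves in lockstep with the climate signal, the naive regression attributes nearly all of its effect to climate, roughly tripling the apparent climate coefficient in this toy setup.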

So how is this relevant to the UVic forest fire study? Well let’s look at what it compares:

As the CanRCM4 ensemble includes natural and anthropogenic forcings, we use the decade 2011–2020 to represent the current climate and an earlier decade, 1961–1970, to represent an alternative climate with reduced influence of human emissions

So much has changed between 1961 and 2011 that I expected to find a lot of work to deal with all the potential confounding variables. Imagine my surprise when I came across this text deep in the report:

The result is dependent on the regression model being realistic [my emphasis throughout]. Our regression model assumes that nonclimatic variability in the natural log of area burned is stationary in time and does not account for the possible influence of human factors such as changes in forest management or human ignition sources. Humans have long had a direct influence on fire activity (Bowman et al., 2011), and trends in some regions have been strongly impacted by human intervention (Fréjaville & Curt, 2017; Parisien et al., 2016; Turco et al., 2014). Syphard et al. (2017) demonstrated that climate influence on fire activity becomes less important with a strong human presence. We also do not consider directly the impacts of repeated suppression over time, which could result in larger fires, nor do we consider the pine beetle infestation that has affected BC

Stop and re-read that section. Their hypothesis is that climate change is the driving factor, but they didn’t correct their work for any of the critical confounding variables. They simply ignored some of the most important considerations when discussing forest fire size, numbers and intensity. Let’s look at them one at a time.

Pine Beetles

Obviously the first issue to consider is the Pine beetle infestation. As described by Natural Resources Canada:

Over 18 million hectares of forest were impacted to some degree [by the pine beetles], resulting in a loss of approximately 723 million cubic metres (53%) of the merchantable pine volume by 2012. The epidemic peaked in 2005: total cumulative losses from the outbreak are projected to be 752 million cubic metres (58%) of the merchantable pine volume by 2017,

The pine beetles killed massive swathes of our forests and turned them into dead wood just ready to burn. How can a study compare fire seasons across decades without accounting for the pine beetles? Certainly they cite an American study to justify their decision, but numerous Canadian studies indicate that beetle-killed stands have “higher fire spreading potential” among other considerations. Now if the authors had only missed the pine beetles it might have been a minor oversight, but they also missed changes in forest management.

Forest Management

It is well understood that BC’s forest management has raised the fire risk in BC. BC has systematically been suppressing broad-leaf trees like aspen and birch, which provide natural fire protection, to make room for more commercially valuable conifer species like pine and Douglas fir. Those broad-leaf species are critical to large stands of trees: they’re less prone to burning, create shade on the forest floor, reduce temperatures and promote more humidity. Current forest management has changed the nature of our forests; this is not a hypothesis, it has been a stated policy of our forest management regime for decades. How can a study ignore this consideration? Not only have we changed the forest make-up, we have completely modified the fire regime via fire suppression.

Fire Suppression:

After the Slave Lake fire in 2011 the Alberta Government sought advice on the fire situation. The result was the Flat Top Complex Wildfire Review Committee Report which made a number of recommendations and concluded:

Before major wildfire suppression programs, boreal forests historically burned on an average cycle ranging from 50 to 200 years as a result of lightning and human-caused wildfires. Wildfire suppression has significantly reduced the area burned in Alberta’s boreal forests. However, due to reduced wildfire activity, forests of Alberta are aging, which ultimately changes ecosystems and is beginning to increase the risk of large and potentially costly catastrophic wildfires.

Essentially the report acknowledged that fire suppression efforts are making wildfires bigger and more dangerous. While the report was written for Alberta, the conclusions are entirely transferable to BC. Humans have interfered with the natural fire regime in order to protect forests for commercial use, and we have now created a situation where bigger, and more numerous, fires are a certainty. But that is not all, because we have also allowed encroachment into interface zones and provided added access to our forests.

Human Encroachment and Access

Another feature the paper missed is human access. For those of us who lived through the 1970s, one thing I can assure you is that access to the back-country has changed significantly since then. In the 1970s the resource road network did not exist. Huge portions of the province were essentially inaccessible except by air or on foot. This protected the forests from humans and their tendency to light things on fire or drop sparks from their engines. These days we can get to the back-country much more easily, which creates more opportunities for fires. Consider this comment from UBC professor Lori Daniels:

The easiest piece of the puzzle is population. There are simply more of us, in more pockets of the province, which inevitably increases the chance of man-made fires. Varying estimates suggest anywhere between 30 to 50 per cent of the current fires are caused by people.

This result is consistent with the BC Wildfire Service which says that 40% of fires are caused by people. The greater access to the back country has resulted in more area under risk from human impacts.


To conclude, let’s look at the confounding variables that were not considered in this study:

  • Pine Beetles
  • Forest management
  • Fire suppression and
  • Human encroachment and development

and yet this paper says it can provide an accurate quantification of the increase in forest fire activity between the 1960s and the 2010s due to climate change?

Like our hypothetical study that ignored seat-belts, air bags and vehicle design, the confounding variables have to have had an effect on the two signature numbers “2–4 times more likely” and “increased the area burned by a factor of 7–11“. Absent controls for confounding variables any quantification of the effect of climate change alone cannot be taken seriously. Certainly, it is entirely likely that climate change will eventually increase the likelihood of fire and even increase the area burned, but those 2-4 times and factor of 7-11 numbers are simply not credible given what we know about the disclosed confounding variables.

Posted in Canadian Politics, Climate Change, Risk, Uncategorized | 17 Comments

The New Gas Boom – A Bust for anyone interested in an informed discussion about Canadian LNG

Anyone who follows news about the Canadian Liquefied Natural Gas (LNG) industry (and many who don’t) will have heard about the new report prepared by the folks at Global Energy Monitor (GEM) called The New Gas Boom: Tracking Global LNG Infrastructure (The New Gas Boom). The New Gas Boom is the latest effort by GEM to generate earned media in its fight against new fossil fuel infrastructure. In the last week I have seen and heard lead author Ted Nace all over local and national radio and television. What is most troubling is that this report (if you can call it that) consists mostly of tables and figures with little-to-no supporting information. Worse still, what information they do present about Canadian LNG developments appears incorrect… and consequently the conclusions the authors have proclaimed across our media landscape are likely equally incorrect. The rest of this blog will expand on this topic.

Where are they getting their Project Numbers from?

According to their website, GEM documents fossil fuel infrastructure developments. Given that mandate, one would expect the one thing they would do well is keep track of fossil fuel infrastructure developments. If you expected that, you would be disappointed. Between The New Gas Boom report, their fossil fuel tracker application and the associated data tables [look for the tab in the application], they can’t seem to keep their numbers straight. The numbers in the tables of The New Gas Boom report don’t appear to reflect those in the application, and facilities identified in the data tables differ from those that appear when you open the mapping program. More problematically, their numbers don’t correlate with other, authoritative resources available online.

In The New Gas Boom the authors report that Canada currently has 281.6 million tonnes per annum (MTA) of LNG Export Terminals in “pre-construction”. Now let’s excuse the fact that the term “pre-construction” is never defined and imagine it means projects that someone intends to construct.

I spent several hours trying to figure out where their Canadian “LNG Export Terminals” numbers came from and simply could not reconstruct their 281.6 MTA figure. More important, Natural Resources Canada (NRC) keeps a detailed list of Canadian LNG export projects, and the NRC list differs significantly from the list on the GEM website.

According to NRC there are 216 MTA of projects on the books, not the 281.6 MTA reported as under “pre-construction” in The New Gas Boom report. Moreover, that 216 MTA simply represents projects that have obtained export licences, which is only the first step in a long process towards construction.

Going to the Sourcewatch page for BC LNG Terminals (Sourcewatch serves as the feeder for the GEM mapping program), they list 21 LNG terminal projects on the BC coast, but the NRC identifies only 13. That is a major discrepancy. The next task would be to compare the list from Sourcewatch to the National Energy Board list of LNG export licence applications [excel file], and there one discovers more discrepancies.

Ultimately the thing to understand about the list of BC LNG projects is that many are mutually exclusive, in that they rely on the same pipeline capacity, while others have no associated pipeline capacity at all. Put simply, most of these projects have no funding, no gas supply and no chance of being completed. Anyone aware of the BC LNG situation knows that only a handful of projects are close to being considered viable, likely in the 30-40 MTA range rather than the hundreds described by GEM.
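The size of the discrepancy can be summarized with the figures above. The 35 MTA value below is simply the midpoint of my 30-40 MTA estimate of plausibly viable capacity:

```python
# Comparing the capacity figures discussed in this post, all in
# million tonnes per annum (MTA).
GEM_PRECONSTRUCTION = 281.6   # GEM's "pre-construction" claim for Canada
NRC_LICENSED = 216.0          # projects holding export licences, per NRC
VIABLE_ESTIMATE = 35.0        # midpoint of the 30-40 MTA viable range

licence_gap = GEM_PRECONSTRUCTION - NRC_LICENSED
overstatement = GEM_PRECONSTRUCTION / VIABLE_ESTIMATE

print(f"GEM exceeds even the licensed-project total by {licence_gap:.1f} MTA")
print(f"GEM claim vs. plausibly viable capacity: ~{overstatement:.0f}x")
```

By this arithmetic GEM’s “pre-construction” figure overstates plausibly viable BC capacity by roughly a factor of eight.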

The question one might ask is: why does it matter that GEM’s numbers are off by so much? Well because the entire point of The New Gas Boom report is that there is too much capacity under construction; that the massive amount of construction to come will overload the global system; and this will result in financial and ecological ruin. Their headline statements include:

  • $1.3 trillion being invested in global gas expansion.
  • The scale of the LNG expansion under development is as large or greater than the expansion of coal-fired power plants,
  • If built, LNG terminals in pre-construction and construction would increase current global export capacity threefold

The problem, as we have discovered, is that their numbers are flawed, and so are their dire warnings. Their narrative is all wrong. BC is not going to build hundreds of MTA of export capacity, and BC is not going to flood the global market with LNG. Rather, BC appears set to build about 40 MTA of export capacity over the next decade or two, which will fit right in with demand expectations in Asia. Moreover, now that we have established how far off this report is for BC LNG, how can we seriously believe anything it has to say about Australian or American developments?

What’s Up With Their Fugitive Emissions numbers?

Besides claiming financial catastrophe, the authors of The New Gas Boom also want to convince readers that natural gas is worse than coal. They appear to do so by undertaking a careful, and some might say biased, reading of the academic literature which allows them to massively exaggerate fugitive emissions. To understand, you have to go to page 13 where they explain their methodology. Under the section “Updated leakage estimates alter the assessment” we see the following text:

Updated leakage estimates alter the assessment. The 2014 DOE report was based on the assumption that methane leakage was 1.3% for conventional onshore gas and 1.4% for fracked gas. In 2018, a comprehensive reassessment of methane emissions in the U.S. oil and gas supply chain, based on facility-scale measurements and validated with aircraft observations in areas accounting for about 30% of U.S. gas production, concluded that the overall leakage rate for natural gas was 2.3% of gross U.S. gas production, a figure 60% higher than the U.S. Environmental Protection Agency inventory estimate (Alvarez 2018). At the higher leakage rate, the advantage to using coal disappears. Multiple studies estimate the overall leakage rates even higher than the 2.3% Alvarez estimate, due to the fact that the Alvarez study did not include “downstream” leaks in the distribution of gas. Such leaks account for an additional 2.7 ± 0.6%, according to a study of Boston (McKain 2015).

There are two topics to be addressed in this section. The 2.3% number for overall leakage and the 2.7% number for downstream leakage. Both are useless in the Canadian context.

To address the 2.3% overall leakage number, you can go to my previous blog post. There I noted that the 2.3% figure is derived from a paper by Alvarez et al. in Science, where they calculate that 2.3% of US natural gas production was lost in the form of fugitive emissions. They argue this wipes out the emission savings from using LNG for power. The problem with the Alvarez results is that they aren’t applicable to Canadian LNG.

The Alvarez paper relies on top-down surveys (airplane surveys) in selected US gas fields and extrapolates those results to the US (and Canada). The problem with this extrapolation is that geology and regulations matter in the LNG field. BC LNG is from deeper formations; Canadian infrastructure is much newer; much of our BC natural gas is sour; and BC has enhanced regulatory controls compared to the US fields studied by Alvarez.

To explain why this all matters consider our stricter regulatory structures which have essentially eliminated flaring and strongly encourage green completions (which prevent release of gas when the well is being completed) and encourage the use of electricity in on-site equipment (to prevent the use of gas which may then be released). These regulations massively reduce the amount of fugitive emissions in the upstream industry. They do this not only because it makes environmental sense but because much of our gas is sour (read poisonous). The levels of releases considered common in Texas or Pennsylvania could result in mass casualty events in BC.

Moreover, as I noted, the Alvarez paper relies on airplane surveys to measure fugitive emissions. These types of surveys have well-understood issues that I address in detail in this blog post. The most important is temporal variability.

Recent research shows that the time when the planes fly really affects their results. The new research makes the observation that the flights used by researchers (like Alvarez et al.) to measure methane only happen during daytime hours (usually in the middle of the day), during the spring/summer, on clear days. This coincides with when maintenance is typically scheduled on natural gas facilities (which requires that they flush their systems). As such, the research concludes that top-down surveys will almost always significantly overestimate total emissions.

For an analogy, it would be like traffic counters only working during rush hour and then extrapolating those rush hour conditions over the entire 24 hour day including the middle of the night.
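The traffic-counter analogy can be made concrete with a toy calculation. The hourly rates and the survey window below are invented for illustration only:

```python
# Toy model of the sampling bias described above: emissions (or traffic)
# are elevated during a midday "maintenance window", and surveys only
# sample during that window, then extrapolate to the full day.
# All rates are invented for illustration.
BASELINE_RATE = 1.0        # units/hour, typical operation
MAINTENANCE_RATE = 5.0     # units/hour while systems are being flushed
WINDOW = range(10, 15)     # surveys only fly ~10:00-15:00

hourly = [MAINTENANCE_RATE if h in WINDOW else BASELINE_RATE for h in range(24)]
true_daily = sum(hourly)

sampled = [hourly[h] for h in WINDOW]
extrapolated_daily = (sum(sampled) / len(sampled)) * 24

print(f"true daily total:          {true_daily:.0f}")
print(f"survey-extrapolated total: {extrapolated_daily:.0f}")
print(f"overestimate factor:       {extrapolated_daily / true_daily:.1f}x")
```

In this toy setup, sampling only the busy window inflates the daily total by nearly a factor of three, which is the flavour of bias the research attributes to midday-only flights.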

As I noted in my previous blog post, under a 20-year time horizon a leakage rate greater than 2.6% is necessary for natural gas to approach the emission levels of coal, and that is when comparing low-efficiency natural gas compression systems against high-efficiency coal. Most estimates for upstream BC fugitive emissions run between 0.6% and 1.1%, including the transportation component. Given our high use of electricity in our LNG streams, the result is that BC LNG is much cleaner than coal.
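For readers who want to see the shape of this climate math, here is a rough sketch. The heating value, plant efficiency and GWP horizon below are round illustrative assumptions of mine (not figures from the report or my earlier posts), and the exact break-even leakage rate shifts substantially with the plant efficiencies and coal baseline you assume:

```python
# Sketch of the 20-year-horizon leakage arithmetic for gas-fired power.
# Every constant below is a round illustrative assumption.
GWP20 = 84                     # IPCC 20-year warming potential of methane
CO2_PER_KG_CH4_BURNED = 2.75   # kg CO2 per kg CH4 combusted (44/16 molar ratio)
KWH_THERMAL_PER_KG_CH4 = 15.4  # ~55.5 MJ/kg heating value
GAS_PLANT_EFFICIENCY = 0.45    # assumed combined-cycle gas plant

def gas_intensity(leak_rate):
    """kg CO2e per kWh of electricity, counting leaked methane at GWP20."""
    kwh_elec_per_kg = KWH_THERMAL_PER_KG_CH4 * GAS_PLANT_EFFICIENCY
    combustion = CO2_PER_KG_CH4_BURNED / kwh_elec_per_kg
    # leak_rate = fraction of produced gas lost before delivery
    leaked = (leak_rate / (1 - leak_rate)) * GWP20 / kwh_elec_per_kg
    return combustion + leaked

# BC's estimated range (0.6-1.1%) vs Alvarez's 2.3% and the 2.6% threshold
for leak in (0.006, 0.011, 0.023, 0.026):
    print(f"leakage {leak:.1%}: gas = {gas_intensity(leak):.2f} kg CO2e/kWh")
```

Because the leak term scales with GWP20, small changes in the assumed leakage rate move the total quickly, which is exactly why the quoted break-even figure is so sensitive to which gas and coal plant efficiencies are assumed.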

Now let’s be generous and imagine that fugitive emissions are equivalent to the entire 2.3% identified by Alvarez. Thanks to our electrification, BC LNG would still be much cleaner than coal. To close this gap the authors had more work to do. They did so by adding an additional “downstream” fudge factor, and to get one they went to the McKain paper for their 2.7% downstream leakage rate.

You might wonder why I highlight their dependence on the McKain paper for downstream leakage. The reason for my disdain is that this study represents a particularly egregious case of cherry-picking.

The McKain paper does indeed document substantial leaks in downstream transportation but there is a HUGE PROVISO. The McKain research addresses the “Urban Region of Boston” which has some of the oldest and leakiest natural gas infrastructure on the planet. It has been the topic of major news stories and has its own page on the Environmental Defense Fund’s website. According to Environmental Defense over half of Boston’s natural gas pipes are over 50 years old and “nearly 45% of the pipes are made from cast iron or other corrosive [sic] and leak-prone materials“.

The authors of The New Gas Boom chose the absolute worst-case scenario in North America and used it as their signature value for their assessment. Does this sound like the typical case that should be used to extrapolate to Canadian natural gas infrastructure?

To make an analogy people might better understand: it would be like assuming that the lead issues in the Flint, Michigan water system are typical of the entire North American water system, and calculating lead ingestion levels for all North Americans using numbers from Flint alone.

If I didn’t know better I would have guessed that the authors of the report were counting on readers not being aware of the background of the McKain paper and so were counting on being able to slide this information by us all….and given the media reporting (and other reporting I have read) they almost did.


Ultimately The New Gas Boom report appears to have been intended to serve a political, rather than a scientific purpose. The numbers presented in the report appear to be massively inflated to allow for them to make catastrophic predictions and present terrifying numbers ($1.3 Trillion) in order to generate lots of free media hits. Sadly, the report did just that. In our local market I saw, or heard, the lead author Ted Nace on each of the major news and information networks. He was able to spread his apocalyptic conclusions free from any significant criticism or push-back. This happened because he was able to take advantage of the fact that 99.9% of the population would not have the knowledge of the academic literature or Canadian LNG industry to call him out.

Unfortunately, it is only now after he has enjoyed his 15 minutes of fame and generated millions of dollars of free publicity, that the limited number of individuals with the knowledge to challenge his claims have been able to come forward to challenge them. It is a sad indication of the state of discourse in our current media landscape that the numerous pieces, like this one, identifying the significant flaws in the story, will mostly serve as footnotes as the press moves on to the next Donald Trump mis-step. This is another case of an NGO promoting a false narrative and no one being available to knowledgeably push-back.

Posted in LNG, Uncategorized | 2 Comments

Debunking another CCPA anti-LNG article, this time in the Globe and Mail – now with Marc Lee response

I have to admit something. Every time I read an article by the Canadian Centre for Policy Alternatives (CCPA), I hope that it will present an evidence-based analysis consistent with the quality of the individuals who I know work there. Sadly, more often than not I am disappointed. I could probably fill an entire section of my blog with pieces debunking analyses by the CCPA. Thus, it was with trepidation that I approached an “Opinion” piece in The Globe and Mail called LNG’s big lie by my regular foil, economist Marc Lee. In this blog post I will go over some of the more egregious issues I had with this article.

The article starts with three introductory paragraphs.

The federal government is seeking to use a clause in the Paris Agreement on climate change to get emissions credits for exports of liquefied natural gas (LNG) to Asian countries.

This plan is nonsensical for a number of reasons, but at its heart is the big lie that LNG will help to reduce global emissions. No one should take such claims seriously.

The grain of truth upon which this claim is made is simple: at the point of combustion, gas is about half as emissions-intensive as coal to deliver the same amount of energy.

These paragraphs demonstrate that the author disagrees with the federal government on its interpretation of Article 6 of the Paris Agreement (which provides mechanisms for the trading of emission credits) and then follow that up with a demonstration that he does not understand the climate math underlying the BC LNG industry… but that becomes clear as the article continues.

The next three paragraphs provide a simple guide to the LNG industry. They mostly emphasize how challenging it is to get LNG. Presumably this filler was intended to imply that these efforts cause excessive greenhouse gas emissions. The problem is that Life Cycle Analysis (LCA) is an actual field of study and real LCAs have been done on LNG in both Canada and the US. I can only presume the author is hoping that Globe readers are unaware of the academic literature on this topic.

The next paragraph is where the interesting stuff starts to happen:

Taken together, one-fifth of the gas must be consumed in the liquefaction, transport and regasification processes. These processes all lead to greenhouse gas (GHG) emissions and thus substantially reduce the emissions advantage relative to coal.

This “one-fifth” number is derived from an older CCPA report that I have previously debunked. In that case the author of the CCPA report took the results of a US National Energy Technology Laboratory (NETL) study, Life Cycle Greenhouse Gas Perspective on Exporting Liquefied Natural Gas from the United States, for an export facility running from New Orleans to Shanghai, and applied them to a Canadian LNG project exporting from Prince Rupert to Shanghai (adjusting only for tanker shipping distance). The problem with using the NETL data is that the numbers are simply irrelevant in the BC context.

The NETL study, written in 2014, assumes lower-efficiency compressors and leaky pipelines in very hot climates and, as I discuss in my earlier blog post, is simply not relevant to the Canadian experience. The compressors to be used in Canada are more efficient and our regulatory system is stricter. This results in substantial efficiencies which essentially halve the number presented in the Opinion piece. So no, the one-fifth number isn’t close to our current technological state.

On the topic of coal the next paragraph is even more egregious.

Coal, in contrast, may be dirty in terms of emissions, but getting it to market is relatively easy compared with gas. Coal can be dug up, put on rail cars and shipped to its final destination.

Where to start on this one? Around 10% of the entire life cycle emissions of coal (including its eventual combustion) come from digging it up and shipping it to market. That 10% may sound small but when you look at the numbers it represents almost 25% of the emissions generated by the combustion of natural gas. This is not a rounding error and for the CCPA to treat it as such is simply ridiculous.
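Using the post’s own ratios (gas combustion roughly half as emissions-intensive as coal, and ~10% of coal’s life cycle spent getting it to market), the arithmetic works out like this:

```python
# Checking the upstream-coal arithmetic using the ratios in this post:
# gas combustion is "about half as emissions-intensive as coal" and
# ~10% of coal's life cycle emissions come from mining and transport.
coal_lifecycle = 1.0                   # normalize coal's total emissions to 1
gas_combustion = 0.5 * coal_lifecycle  # the "about half" combustion ratio
coal_upstream = 0.10 * coal_lifecycle  # the dig-it-up-and-ship-it share

share_vs_gas = coal_upstream / gas_combustion
print(f"Coal's 'easy' upstream step = {share_vs_gas:.0%} of gas combustion emissions")
```

With the round one-half ratio this lands at 20%; comparing delivered electricity rather than raw heat (gas plants run at higher efficiency) lowers the gas-to-coal ratio and pushes the share toward the “almost 25%” quoted above. Either way, it is no rounding error.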

Moreover, the statement also ignores the methane emissions associated with mining coal. As described in a recent study of the Marcellus Shale play “a significant portion (~70%) of the emitted CH4 [in the region] was found to originate likely from coalbeds.” From an upstream perspective, coal is far worse than natural gas when the fugitive methane emissions are incorporated into the climate math.

Now for the next paragraph

The other emissions problem with gas is that leaks occur at various points along the supply chain from wellhead to final combustion. Recent studies have found that these leaks are much larger than have been reported by industry and governments.

The study the CCPA is talking about is by Alvarez et al. in Science, which calculated that 2.3% of US natural gas production was lost in the form of fugitive emissions, a loss the authors argue wipes out the savings from using LNG for power. The problem with the Alvarez results is that they aren’t applicable to the Canadian context.

The Alvarez paper relies on top-down surveys (airplane surveys) in selected US gas fields and extrapolates those results to the US (and Canada). The problem with this extrapolation is that geology and regulations matter. BC gas comes from deeper formations, with newer infrastructure, more sour gas and different regulatory standards than the US fields studied by Alvarez. All four of these factors matter in this debate. As an example, our stricter regulatory structures have essentially eliminated flaring, strongly encourage green completions and encourage the use of electricity in on-site equipment. All of which significantly reduce our fugitive emissions.

Moreover, since much of our gas is sour (read poisonous), the type of release considered common in Texas or Pennsylvania would result in mass casualties in BC.

As I noted, the Alvarez paper relies on airplane surveys to measure fugitive emissions. These types of surveys have well-understood issues that I address in detail in this blog post. The most important is temporal variability.

Recent research shows that the time when the planes fly really affects their results. The paper makes the observation that the flights used by researchers (like Alvarez et al.) to measure methane only happen during daytime hours (usually in the middle of the day), during the spring/summer, on clear days. This coincides with when maintenance on natural gas facilities (which requires that they flush their systems) typically occurs. As such, the research concludes that top-down surveys will almost always significantly overestimate total emissions.

For an analogy, it would be like traffic counters only working during rush hour and then extrapolating those rush hour conditions over the entire 24 hour day including the middle of the night.

Now you would think at this point it wouldn’t get worse, and yet it does. The next paragraph goes:

Even very small leaks of methane can wipe out any remaining advantage for gas relative to coal. Methane is short lived, breaking down in about 12 years into carbon dioxide and water, but while it is in the atmosphere it is 100 times more heat-trapping than carbon dioxide.

What the author is trying to do is introduce the concept of global warming potential (GWP). GWP is important because methane has a shorter atmospheric lifetime than carbon dioxide (it breaks down more quickly) but traps far more heat while it remains in the atmosphere. Specialists disagree whether one should consider the 20-year or 100-year potential of methane, since the IPCC has established that GWP can vary from 28 times (100-year) to 84 times (20-year). The EPA uses numbers that include feedbacks to give ranges of 28-36 times for the 100-year GWP and 84-86 times for the 20-year GWP. Now looking at these numbers, the one number you do not see is the “100 times” cited in the opinion piece. I simply can’t figure out where that figure comes from, but it is certainly not from the field of climate studies.
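To see how the time horizon changes the arithmetic, here is a minimal sketch of the CO2-equivalent conversion. The GWP values are the IPCC figures quoted above; the one-tonne leak is purely hypothetical:

```python
# Convert a mass of methane into CO2-equivalent under different
# global warming potentials (GWPs). The GWP values are the IPCC
# figures cited in the text; the 1-tonne leak is hypothetical.
GWP_100_YEAR = 28  # methane, 100-year horizon
GWP_20_YEAR = 84   # methane, 20-year horizon

def co2_equivalent(methane_tonnes, gwp):
    """Return tonnes of CO2-equivalent for a given methane mass."""
    return methane_tonnes * gwp

leak = 1.0  # tonnes of methane (illustrative)
print(co2_equivalent(leak, GWP_100_YEAR))  # 28.0 t CO2e
print(co2_equivalent(leak, GWP_20_YEAR))   # 84.0 t CO2e
```

Whichever horizon one prefers, nothing in this range gets anywhere near the “100 times” in the opinion piece.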

Herein lies the challenge in debunking bad opinion pieces. The original article is only 680 words and I am already at twice that number and only halfway through the piece. So I will speed this up.

In this discussion of leaks the author is careful to avoid using actual numbers. I can only guess this is because he doesn’t want you to know that his argument doesn’t hold water against what we know about fugitive emissions. The International Energy Agency has debunked his argument and even given us a nice graphic for typical natural gas facilities. As it makes clear, at typical fugitive emission levels natural gas beats coal on greenhouse gas intensity.

Remember, this graphic is for typical American facilities. As we know from our past analyses, BC LNG can produce the same product with 80% of the emissions of our competitors. Our LNG is cleaner and greener. Even a typical US facility makes climate sense when leakage is less than 2.6% (from a 20-year perspective), yet even the worst number provided by Alvarez is 2.3%, and the Canadian numbers are estimated to be in the 1% range. The climate math says LNG is a lot cleaner than coal.
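To show where a breakeven like that 2.6% comes from, here is a back-of-envelope sketch. The combustion factors and heating value below are rough generic assumptions of mine, chosen only to illustrate the arithmetic (they are not the IEA’s inputs):

```python
# Back-of-envelope breakeven leakage rate for natural gas vs coal
# at a 20-year GWP. The emission factors and heating value are rough
# generic assumptions used only to illustrate the arithmetic; they
# are NOT the IEA's inputs.
GAS_COMBUSTION = 56.0    # kg CO2 per GJ of natural gas burned (approx.)
COAL_COMBUSTION = 95.0   # kg CO2 per GJ of coal burned (approx.)
CH4_ENERGY = 0.0555      # GJ of energy per kg of methane (approx.)
GWP_20 = 84              # 20-year GWP of methane (IPCC)

def gas_intensity(leak_fraction):
    """kg CO2e per GJ delivered, counting leaked upstream methane."""
    ch4_burned = 1.0 / CH4_ENERGY                        # kg CH4 per GJ
    ch4_leaked = ch4_burned * leak_fraction / (1.0 - leak_fraction)
    return GAS_COMBUSTION + ch4_leaked * GWP_20

for leak in (0.01, 0.023, 0.026, 0.03):
    print(f"leak {leak:.1%}: {gas_intensity(leak):.0f} kg CO2e/GJ, "
          f"beats coal: {gas_intensity(leak) < COAL_COMBUSTION}")
```

Even with these toy inputs the crossover lands near 2.5%, the same neighbourhood as the 2.6% figure, and at Alvarez’s worst-case 2.3% leakage gas still beats coal.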

Now that we are in the home stretch, let’s look at the next three paragraphs.

Finally, we need to think about where Canadian gas is being exported. While it’s plausible our natural gas could displace coal use in China, it could also simply contribute to higher overall energy demand, adding to emissions on top of coal. Or LNG could displace renewables in China’s evolving energy mix.

If exports go to Japan or Korea, the two biggest LNG importers, they would most likely displace cleaner energy sources and therefore increase global GHG emissions.

Even to the limited extent that China may be able to reduce its emissions by switching from coal to gas, it is not suddenly going to hand over the emissions credit to Canada. That’s not how emissions accounting works.

These paragraphs appear to consist of wishful thinking by the author. He imagines that Japan is not building new coal capacity at this very moment, but that is simply not true. Look at their coal plant tracker or the EIA analysis of the country. Japan is building coal facilities because it can’t get enough natural gas and needs back-up for all the renewables it is installing now that the NGOs have scared it off nuclear.

As for China, well I have a blog post showing how China is building synthetic natural gas plants to convert coal to natural gas, so the suggestion that China doesn’t need natural gas simply doesn’t hold water either.

I think I am going to stop here. Having looked at 10 paragraphs and found significant issues with virtually every one, I have simply run out of gas. The question I have to ask is: where were the editors with this piece? When I was a younger lad, an Opinion piece in the local paper was proofed and fact-checked by the paper to ensure its contents were fact-based. Back then it was believed that the best “Opinions” were those supported by facts. Editors didn’t let things like “100 times more heat-trapping” get through the editing process.

Reading this piece I am reminded of the Daniel Patrick Moynihan quote: “Everyone is entitled to their own opinions, but they are not entitled to their own facts.” I only wish the Editors of the Globe would ensure that Opinion writers didn’t come with their own facts.

Addendum: the Author responds

While I was on vacation, the author of the Globe piece (Marc Lee) responded in the comments section. I have pulled his comment up to the text so everyone could see it. My original instinct was to provide a detailed reply but instead I will simply provide links and highlights debunking his responses. Below I have his comment indented in italics and my reply thereafter.

I’ve been on vacation but it was fun to see what you made of my article. Sadly, you don’t do a very good job of rebutting my core arguments. You don’t address the central argument that Canada cannot get credit for its LNG exports, and most of what you write is an ad hominem attack on me and the CCPA. Tip: Writing in a condescending tone does not win an argument.

What is particularly funny about this response is that it makes clear Marc doesn’t even understand what an ad hominem attack entails. I don’t attack him or the CCPA; I attack his argument throughout.

As for this argument, the federal government has made clear that Article 6 of the Paris Agreement provides a means by which Canada could earn credits for our LNG exports. All it requires is that Canada choose to provide an inducement to the receiving country (likely in the form of a rate cut) to earn the credit.

Your main challenge is around the differences in lifecycle emissions between LNG and coal, and the IEA figure you show highlights some of the trade-offs wrt leakage. But if you read the original (https://www.iea.org/newsroom/news/2017/october/commentary-the-environmental-case-for-natural-gas.html) you would see that they don’t consider LNG at all, and a key point of my article was the energy required for liquefaction, which reduces the advantage of LNG relative to coal.

This claim is simply a red herring since Marc specifically states in the piece “This plan is nonsensical for a number of reasons, but at its heart is the big lie that LNG will help to reduce global emissions. No one should take such claims seriously.” As I have shown in my previous blog post, the climate math makes it abundantly clear that Canadian LNG can reduce global emissions. Nowhere does Marc provide any actual numbers to support his argument, because every legitimate source supports my argument, not his.

Your comments on GWP are highly misleading. If 100-year GWP is 34 and 20-year GWP is 86, then what is a 12-year GWP? That is how long methane stays in the atmosphere before breaking down into carbon dioxide and water. Here’s a reference that backs my statement of 100 times over 12 years: https://www.eeb.cornell.edu/howarth/summaries_CH4.php

This response says more about the author than I ever could myself. No legitimate organization uses a 12-year GWP. The standard GWP used by the IPCC is the 100-year GWP. Recently some organizations have chosen to use the 20-year GWP, but when they do so they preface it by clearly stating they are using the 20-year GWP. To use a 12-year GWP without declaring it as such appears to represent an attempt to deliberately deceive an uninformed public. Nothing I have written to date discredits him more than his admission that he deliberately chose to cite a 12-year GWP without declaring that fact up front.

Your comment that US results on methane leaks are not applicable in BC is a misdirection. Part of the problem is that we are taking industry’s word for it and not doing independent measurement. But studies that have find conventional estimates are an under-estimation:



This is another case of Marc choosing the road less traveled, and it is less traveled because the sources he provides are not legitimate and have been utterly debunked. I go into the debunking of the Atherton paper here and the Suzuki paper here. The Scientific Review of Hydraulic Fracturing (SRHF) singled out the Atherton report because follow-up work by the regulator demonstrated its results were not valid. The fact that after the SRHF utterly discredited the work, Marc still chose to rely on it, speaks volumes.

I also note that in a previous critique of me you cite Kasumu et al as debunking my and David Hughes argument. Hmmm, the actual article is much more nuanced and does not back that claim. They state: “Results show that while the ultimate magnitude of the greenhouse gas emissions associated with natural gas production systems is still unknown, life cycle greenhouse gas emissions depend on country-level infrastructure (specifically, the efficiency of the generation fleet, transmission and distribution losses and LNG ocean transport distances) as well as the assumptions on what is displaced in the domestic electricity generation mix. Exogenous events such as the Fukushima nuclear disaster have unanticipated effects on the emissions displacement results. We highlight national regulations, environmental policies, and multilateral agreements that could play a role in mitigating emissions.”

Marc claims that the Kasumu article is nuanced, and yes it is…the problem is that the nuance doesn’t erase the numbers it presents, or the numbers presented in my piece and in numerous supporting documents. When compared to the existing and in-progress facilities in China and India, Canadian LNG will reduce global emissions. There is a reason Marc doesn’t provide any real numbers in his piece: every real number shoots down his argument.

At best, you can argue there is a plausible range of impacts, from LNG slightly better than coal to worse that coal, and those depend on what assumptions one makes about which export markets, what fuels are displaced, methance leakages, and plant performance, including what the actual performance of LNG Canada will be once constructed (as opposed to the claims they make before hand).

Which is basically what I say in the article

This comment is simply not true. I can’t say this enough: this claim is simply not supported by the literature. The “plausible range” for BC exports to Asia goes from BC LNG being almost 2 times cleaner than coal (with electrification of the compression step and China using SNG) to BC LNG being slightly better than coal (using natural gas for every step and assuming an excessive methane leakage rate, compared against the highest-efficiency coal). Every legitimate life cycle analysis supports my position on this. The one exception is the CCPA LCA, which is fatally flawed, and even then it has to struggle to make LNG look equivalent to coal.

In re-reading Marc’s reply I can only say that it leaves him looking even worse than if he had not replied in the first place. Before his reply you could reasonably have been left with the opinion that he simply made a few mistakes…after the reply…


Debunking more misinformation about the Trans Mountain Pipeline Expansion project. Some simple facts about bitumen, heavy oil, and Asian Markets.

As someone interested in evidence-based decision-making there are few topics as frustrating to discuss as the Trans Mountain Pipeline Expansion (TMX) project. The reason for this is that the media landscape is so completely full of misinformation and bad information that evidence-based decision making is almost impossible. This weekend I had an extended discussion with an independent podcaster about the project after listening to him in a radio segment with Lynda Steele on CKNW.

The segment had so many errors that I sent out a number of intemperate tweets in his direction. His response was to present a number of media stories that served as the basis for his opinions. The problem was that most of these stories were full of errors. Therein lies the dilemma: there is so much bad information out there that even well-meaning observers are going to get it wrong. Between the bad information he was being fed in media stories and the deliberate misinformation being spread by opponents of the project, it is virtually impossible for the regular observer (i.e., this podcaster) to know what is right and what is wrong. This post will look at a few more of these myths.


As I discussed in my previous post, the TMX has two major components:

  • Line 1 – existing pipeline segments (with pump upgrades) able to transport 350,000 barrels/day (bbl/d) of refined petroleum products and light crude. It has the capability to carry bitumen but at a much reduced volume per day.
  • Line 2 – a new pipeline with a capacity of 540,000 bbl/d. It is intended to transport heavy crude oil.

Line 2 is about moving heavy oils including diluted bitumen and synthetic crude. Line 1 is intended to help mitigate the supply bottleneck that has Vancouver drivers paying such high prices for gasoline and diesel while supplying the light crude needed by the Parkland Refinery in Vancouver and US refineries in the Puget Sound.

Admittedly, Line 1 could be used for heavy crude, but even a little heavy oil in Line 1 would eliminate the benefits of the upgrade. Meanwhile, Alberta recently completed the Sturgeon refinery and now has a glut of diesel. As such, it makes logistical and financial sense to operate the pipeline in the manner consistent with the NEB proposal, which means that Line 1 will almost certainly be used for what it was intended: light crude and refined fuels. Now let’s deal with the misinformation.

Diluted Bitumen – what is it?

The name: “diluted bitumen”, when enunciated syllable-by-syllable by Dr. Andrew Weaver, sounds a lot like a chemical warfare agent. The truth is entirely the opposite. Diluted bitumen (dilbit) is pretty boring stuff that consists of a mixture of 20% to 30% diluent and 70% to 80% bitumen.

Bitumen is a type of heavy oil. It is characterised by high viscosity, high density (low API gravity), and high concentrations of nitrogen, oxygen, sulphur, and heavy metals.

The diluent is typically a light-hydrocarbon mixture (like naphtha) called “condensate”. The condensate has a specific gravity in the 0.6 g/mL to 0.8 g/mL range.

The resultant dilbit has an API gravity of 20-22 (medium crude is API 22.3-31.1, so dilbit is almost a medium crude by API) and a sulphur content in the 3.7%-3.9% range. Dilbit has a density/specific gravity that ranges from around 0.92 g/mL to about 0.94 g/mL. Since freshwater has a density of 1 g/mL and seawater density ranges from 1.025 g/mL to 1.033 g/mL, any spilled dilbit will initially float.
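The float-or-sink conclusion is just a density comparison, which can be sketched as:

```python
# Will spilled dilbit initially float? A simple density comparison
# using the density ranges quoted above (all in g/mL).
DILBIT_DENSITY = (0.92, 0.94)     # typical dilbit range
SEAWATER_DENSITY = (1.025, 1.033)
FRESHWATER_DENSITY = 1.0

def initially_floats(oil_density, water_density):
    """An oil less dense than the receiving water initially floats."""
    return oil_density < water_density

# Even the densest dilbit is lighter than fresh water, let alone the
# least dense seawater:
print(initially_floats(max(DILBIT_DENSITY), FRESHWATER_DENSITY))     # True
print(initially_floats(max(DILBIT_DENSITY), min(SEAWATER_DENSITY)))  # True
```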

Chemically, dilbit acts and behaves just like any other heavy, sour oil. Maya is the crude most comparable to typical Alberta dilbit (called WCS). Maya is a Mexican heavy crude that ships out of the ports of Cayo Arcas (on the Gulf of Mexico) and Salina Cruz (on the Pacific). It has an API of 22 and sulphur of 3.5%. Thus WCS and Maya are both low-API blends with less than 5% sulphur.

The thing to take from this section is that dilbit is not some strange creation or unusual mixture. From a chemical perspective it is unspectacular. It looks like a heavy crude oil, it reacts in a refinery like a heavy crude oil and when shipped or spilled behaves almost exactly like a heavy crude oil.

How is bitumen extracted?

Bitumen can be extracted using two methods depending on how deep the deposits are below the surface: in-situ production or open pit mining.

As described in Natural Resources Canada’s Crude oil facts, open pit mining represents 45% of current production and 20% of oil sands reserves. In 2017, seven mining projects in Alberta produced approximately 1.25 million barrels a day.

In situ methods represent 55% of current production and 80% of the total resource. There are about 20 in situ projects in Alberta. In in situ extraction the bitumen is treated in a manner that allows it to flow so it can be collected. Generally, the viscosity of the bitumen is reduced using one of three methods: the addition of steam, solvents, or thermal energy. The biggest benefit of in situ extraction is the lack of above-ground impacts: there are no tailings ponds and the sands are all left underground.

As described by Natural Resources Canada, water management is a key challenge of the oil sands extraction process. The mining method uses 2.5 barrels of fresh water per barrel of bitumen and the in situ method uses an average of 0.21 barrels of fresh water per barrel of bitumen. Oil sands producers recycle around 80-95% of the water used in established mines and approximately 85-95% for in situ production.
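As a rough illustration, combining those per-barrel water figures with the production split above gives a blended freshwater intensity. This weighting is my own back-of-envelope calculation, not an NRCan number:

```python
# Blended freshwater intensity of oil sands production: the NRCan
# per-barrel water figures weighted by the current production split.
# This weighting is a back-of-envelope illustration, not an NRCan figure.
MINING_SHARE, MINING_WATER = 0.45, 2.5      # bbl water per bbl bitumen
IN_SITU_SHARE, IN_SITU_WATER = 0.55, 0.21   # bbl water per bbl bitumen

blended = MINING_SHARE * MINING_WATER + IN_SITU_SHARE * IN_SITU_WATER
print(f"Blended freshwater use: {blended:.2f} bbl per bbl of bitumen")
```

That works out to roughly 1.24 barrels of fresh water per barrel of bitumen across the industry, before accounting for the 80-95% recycling rates.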

Is Dilbit particularly dirty?

Let’s be clear here, heavy crude oils are not something you use as a comfort food for a toddler or to bathe puppies. That being said, heavy oils are an essential commodity and bitumen is not a particularly dirty form of heavy oil. Recent studies by the California Environmental Protection Agency’s Air Resources Board for their Low Carbon Fuel Standard made the following findings:

  • There are 13 oil fields in California, plus crude oil blends originating in at least six other countries, that generate a higher level of upstream greenhouse gas emissions than Canadian dilbit blends;
  • Crude oil from Alaska’s North Slope, which makes up about 12 per cent of California’s total crude slate, is actually “dirtier” than the Canadian dilbit known as “Access Western Blend”;
  • The “dirtiest oil in North America” is not produced in Canada, but just outside Los Angeles, where the Placerita oil field generates about twice the level of upstream emissions as Canadian oil sands production; and
  • The title of “world’s dirtiest oil” goes to Brass crude blend from Nigeria, where the uncontrolled release of methane during the oil extraction process generates upstream GHG emissions that are over four times higher than Canadian dilbit.

As for the claim that the oil sands are the most expensive oil, that dubious title likely goes to the Kashagan oil field in Kazakhstan but it certainly doesn’t go to oil sands oil most of which can be produced at very reasonable costs.

Spills – We know what to expect

Contrary to claims by critics, we know a lot about how to handle diluted bitumen spills. During the original NEB hearings a lot of organizations made hay over the lack of specific knowledge about dilbit spills. As a consequence, the federal government spent almost $50 million to study the topic. Transport Canada prepared a summary of the latest research, as did Fisheries and Oceans Canada.

Their conclusions were that dilbit behaves almost exactly the same as other heavy crude oils in spills and that the technologies that we currently rely on to address heavy oil spills would work equally well on diluted bitumen. So when Dr. Weaver tells a reporter about an old Royal Society of Canada Report, the correct response should be to point out that a lot of much newer information now exists and that the report is no longer a particularly useful resource on this topic.

Refining Heavy oils

It is true that heavy oils can’t be effectively refined in a lot of refineries. Rather, heavy oil needs to be refined in specially designed and built high-conversion refineries.

Heavy crude oil refineries include very expensive cracking and coking units designed to break down the long-chain hydrocarbons into the smaller hydrocarbons used in gasoline, kerosene, and diesel. Unfortunately, the simpler light crude refineries don’t typically have these cracking and coking units. Ironically, this can mean that light crude refineries can’t handle the heavier components even in light crude oils, so they end up producing more undesirable byproducts (like petroleum coke) per barrel of input.

What this means is that the heavy oil refineries produce more gasoline/diesel/kerosene per barrel of heavy crude oil than the light refineries do per barrel of light crude oil and the heavy refineries produce a lot less waste petroleum coke per barrel as well.

In financial terms, the heavier crudes produce much higher margins per barrel of input than their lighter cousins and generate fewer waste byproducts that have to be disposed of.

Because of these factors the owners of heavy oil refineries will pay a premium to get heavy oil to use in their very expensive high-conversion refineries.

Asian Refining Capacity

One of the most bizarre recent narratives is that there is no market for diluted bitumen in Asia and that Asian refineries can’t refine dilbit. This is entirely untrue. As Reuters recently reported:

Many of the region’s refineries are new and are optimized to process heavy and sour crudes.

They were designed this way to take advantage of the historical discount these grades were priced at relative to light, sweet crudes, such as global benchmarks Brent and West Texas Intermediate (WTI), and oil from West African producers such as Nigeria and Angola.

The recent developments in the crude oil market have all but eliminated the discount enjoyed by heavy crudes, and in some cases, physical cargoes of some heavy grades have traded at premiums to light crudes.

So, contrary to what the folks at the Canadian Press or David Anderson have to say, Asia has a lot of refineries that can refine heavy oil. Want some numbers? According to GlobalData’s report on Chinese refining capacity:

The country’s total coking capacity, catalytic cracker capacity and the hydrocracking capacity is expected to increase during the outlook period. The total coking capacity is expected to increase from 1,991 mbd [thousand barrels per day] in 2018 to 2,371 mbd in 2023. China’s total catalytic cracker unit capacity is expected to increase from 4,359 mbd in 2018 to 5,532 mbd in 2023. Over the five year period, the hydrocracking unit capacity of the country is set to increase to 2,922 mbd from 1,846 mbd.

Look at those numbers. Chinese refineries can refine all the bitumen Alberta currently produces and can handle over 8 times what Line 2 of the TMX can send to Westridge Marine Terminal for export. This is why Asian refineries are buying up all the heavy crude they can get, often at a premium over lighter crudes. Consider the prices on June 13th (when I last looked them up):

  • Maya (the chemical twin to land-locked Alberta WCS) for export to Far East was selling at $51.16/bbl.
  • WCS (the Canadian heavy oil used to represent Alberta heavy) was $39.19/bbl and
  • West Texas Intermediate $48.96/bbl.

The high-sulphur heavy oil was selling at a premium over the lighter crude, and Alberta was losing almost $12/bbl of value because its oil was land-locked. It doesn’t take a PhD in Economics to know that if the market is paying a premium for a product then clearly someone wants that product.
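The arithmetic behind that “almost $12/bbl” is straightforward; here is the June 13 snapshot run through a quick sketch:

```python
# The June 13 price snapshot from the bullets above (US$/bbl).
MAYA_FAR_EAST = 51.16   # Maya heavy crude, export to the Far East
WCS = 39.19             # Western Canadian Select (land-locked)
WTI = 48.96             # West Texas Intermediate (light)

heavy_premium = MAYA_FAR_EAST - WTI        # heavy selling over light
landlocked_discount = MAYA_FAR_EAST - WCS  # value lost to being land-locked

print(f"Heavy premium over WTI: ${heavy_premium:.2f}/bbl")
print(f"WCS discount to its chemical twin: ${landlocked_discount:.2f}/bbl")
```

Maya, chemically WCS’s twin, fetched $2.20/bbl more than WTI, while land-locked WCS left $11.97/bbl on the table.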

Asian Demand for Heavy oil

The most ridiculous recent story coming from the activist community is that there is no demand for heavy oil in Asia. Why do I say ridiculous? Because, according to data supplied to Business in Vancouver by Statistics Canada:

7.5 million barrels of Alberta crude shipped to Asia via Westridge Marine Terminal in 2018, with a total value of $539 million.

China took 6.3 million barrels, at a value of $442 million. Another 648,000 barrels went to South Korea ($51 million) and 508,000 barrels went to Hong Kong ($46 million). A small amount also went to Thailand.

Coincidentally last week Reuters reported:

The tanker New Dream, chartered by commodities trader Mercuria Energy Group, departed on June 16 from Galveston loaded with more than 1 million barrels of heavy Canadian crude, and is headed to Asia, according to vessel tracking data from Refinitiv Eikon and ClipperData.

Another 3 million barrels of Canadian crude are due to be exported from the Gulf Coast by June 30, according to an oil trader familiar with the matter. Their destinations could not immediately be learned.

Ironically, on the same day the activists were claiming that no Asian economies want our heavy oil, a tanker from Korea tied up at Westridge Marine Terminal to take on a load of heavy oil destined for one of their refineries.

Asian refineries are doing everything in their power to get Alberta heavy crude. They are even buying material that has been shipped all the way to the Gulf Coast and then shipping it half-way around the world. The activist narrative is simply false.


So here we have it. Less than a week after I wrote a 2000-word post on Trans Mountain myths, I have another 2000 words debunking more myths. It is almost impossible to keep up with the false narratives. What is worse is that many of the newer ones represent journalists repeating misinformation that was fed to them and that they were unable or unwilling to confirm via other sources. The problem is that statements like the claim that there is no refining capacity for heavy oil in Asia can be debunked with five minutes of research on Google.
