The Green Party’s “Mission Possible”: a cool name for a policy proposal that is not ready for prime time.

On May 16th Elizabeth May unveiled the Green Party’s Mission Possible, their 20-step “Green Climate Action Plan“. While I have to admit “Mission Possible” is a very cool name, the plan repeats what we saw with their Canadian “Green New Deal“: it is simply not ready for prime time. When you start looking at the details it becomes clear that the Green Party needs to assemble a policy team that understands energy issues, infrastructure development and logistics, because this plan demonstrates a woeful lack of specific expertise on these topics and more. As this is only a blog post I won’t address all 20 steps here. Instead, I will address a handful of the steps on topics with which I am familiar.

To begin let’s start with the biggest challenge: modernizing the grid. This one is particularly important because many of their subsequent steps rely on easy access to copious amounts of low-carbon electricity.

9 – And modernize the grid

By 2030, rebuild and revamp the east-west electricity grid to ensure that renewable energy can be transmitted from one province to another.

While a necessary goal if we are going to achieve our long-term climate ambitions, a nationally integrated power grid has always been constrained by Canada’s vast and challenging geography. Even the most optimistic view has a new grid costing $25 billion and taking a couple of decades to build. A more realistic appraisal puts the cost of a national backbone of 735 kV transmission lines at around $104 billion, with 20 years needed to complete it.

A reasonable observer would note that $104 billion, while expensive, is doable; my biggest concern with “Mission Possible” is the time it allocates to achieve these goals. Put simply, building infrastructure takes time. Even war-time mobilizations can’t eliminate Canadian winters or the breeding/nesting seasons, so unless we decide to ignore every environmental law on the books we will be limited to clearing and building during limited portions of the year.

One other thing is for certain. Building this grid will require a massive, permanent transfer of land rights. As we know from watching the pipeline debates, linear developments affect every community they go through and I can’t see affected First Nations voluntarily giving up rights to tens of thousands of hectares of land without consultation. Given recent history, I can’t see major work starting until years after the projects are proposed and, as demonstrated by the Trans Mountain pipeline expansion project, federal ownership of the project does not mean it will get a free ride through the courts. I have no doubt that any costs to build the transmission system will need to be supplemented with large sums to compensate individuals and First Nations affected by the grid upgrades.

As for the pace of the work? Well consultations take time and the court has made it abundantly clear you can’t rush consultations.

After the national backbone has been built we will still need to work on all the feeder lines that will have to go to every city, town and hamlet. Building transmission lines in Canada can be extremely expensive. Consider that the Northwest Transmission Line project in BC cost over $2 million a kilometre to build. If the single main line is $104 billion, what will be the cost of tens of thousands of kilometres of feeder lines? Even taking into account the existing infrastructure, we are talking stupendous sums to complete this task. It is simply not possible that we could achieve this goal by the year 2030.
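To give a sense of scale, here is a rough sketch of what the feeder lines alone could cost at the per-kilometre rate cited above. The 30,000 km total is my own assumption for illustration, not a figure from any study:

```python
# Rough feeder-line cost sketch (the 30,000 km total is a hypothetical
# assumption; the per-km cost comes from the Northwest Transmission Line).
COST_PER_KM = 2_000_000          # dollars per km, per the NTL project in BC
ASSUMED_FEEDER_KM = 30_000       # assumed total feeder-line length

feeder_cost = COST_PER_KM * ASSUMED_FEEDER_KM
print(f"Feeder lines alone: ${feeder_cost / 1e9:.0f} billion")  # $60 billion
```

Even at a fraction of that assumed length, the feeder network adds tens of billions of dollars on top of the $104 billion backbone.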

This leads to an obvious problem: if the electricity isn’t there, then where are all the electric vehicles going to get their electricity?

10- Plug in to EVs

By 2030 ensure all new cars are electric. By 2040, replace all internal combustion engine vehicles with electric vehicles, working with car makers to develop EVs that can replace working vehicles for Canadians in rural areas. Build a cross-country electric vehicle charging system so that drivers can cruise from St. John’s, NL to Prince Rupert, B.C. – with seamless ease.

Many others have written about the challenges of decreasing the number of internal combustion engine (ICE) vehicles on our roads so I won’t repeat their criticisms here. Instead, let’s consider the load forecasts. As I previously calculated, simply replacing the gasoline burned in BC (and accounting for the increased efficiency of electric vehicles over ICE vehicles) would require approximately 15,800 GWh of electricity (or about 3 Site C dam equivalents [5,100 GWh each]). Want to replace all those diesel vehicles? That is energy equivalent to about 11,400 GWh (2.2 Site C dams). These are not trivial numbers and they represent British Columbia’s demand only. In combination with other steps (discussed below) the increase in electricity demand would require us to essentially double our national electricity generating capacity. Renewables are great, but the scale of this problem seems not to have been noticed by the policy folks at the Green Party. We are talking about absolutely massive increases in our electricity generation system, with all the associated costs and time limitations built into those upgrades.
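The Site C arithmetic above can be verified with a quick back-of-envelope calculation, using the GWh figures cited in the text:

```python
# Expressing the BC vehicle-electrification load in "Site C equivalents"
# (Site C is rated at roughly 5,100 GWh/yr, per the figures in the text).
SITE_C_GWH = 5_100

gasoline_replacement_gwh = 15_800   # GWh/yr to replace BC gasoline use
diesel_replacement_gwh = 11_400     # GWh/yr to replace BC diesel use

print(f"Gasoline: {gasoline_replacement_gwh / SITE_C_GWH:.1f} Site C dams")  # ~3.1
print(f"Diesel:   {diesel_replacement_gwh / SITE_C_GWH:.1f} Site C dams")    # ~2.2
```

Five-plus Site C dams of new generation, for one province's road transport alone, before touching heating or industry.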

12- Complete a national building retrofit

Create millions of new, well-paying jobs in the trades by retrofitting every building in Canada – residential, commercial, and institutional – to be carbon neutral by 2030.

I live in a relatively efficient 25-year-old house. Like most of my neighbours, my heat and hot water are natural gas and my house was not built to passive housing standards. To retrofit my house to become “carbon neutral” would require removing and replacing the heating and hot water systems (and a lot of insulation upgrades, which I won’t go into in this post), and I am not alone.

According to the National Energy Board, 58% of households in British Columbia rely on natural gas for heating; in Ontario it is 67% and in Alberta it is 79%. In order to achieve step 12 we would need to retrofit all those houses by 2030. Consider that according to StatsCan, Ontario had 5,169,175 households in 2016. 67% of that number represents around 3,500,000 houses needing retrofitting, or about 350,000 per year through 2030. That is essentially 1,000 a day, every day of the year, between now and 2030, so it will certainly create jobs.
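The Ontario arithmetic can be checked directly (the household count and gas share are the figures cited above; the ten-year runway to 2030 is my assumption):

```python
# Checking the Ontario retrofit pace implied by step 12.
households_on = 5_169_175        # StatsCan households in Ontario, 2016
gas_share = 0.67                 # NEB share of households heating with gas
years_to_2030 = 10               # assumed runway (roughly 2020 to 2030)

to_retrofit = households_on * gas_share
per_year = to_retrofit / years_to_2030
per_day = per_year / 365

print(f"{to_retrofit:,.0f} households to retrofit")  # ~3,463,000
print(f"{per_year:,.0f} per year")                   # ~346,000
print(f"{per_day:,.0f} per day")                     # ~950
```

Roughly 950 household retrofits a day, every day for a decade, in one province, before counting commercial and institutional buildings.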

I would also note that any requirement to retrofit will require some sort of compensation to homeowners forced to expend thousands of dollars replacing perfectly functional hot water heaters and furnaces. Sure, one might argue that the government could simply refuse to provide compensation, but a program that alienates 67% of households in Ontario would never pass political muster. No sane government would try it, and so the only way it happens is if the government pours billions of dollars into the program.

Since this blog likes to consider energy, let’s also consider what this means for load forecasts. According to our natural gas supplier, natural gas for household use represents about 64 petajoules (PJ) of energy in British Columbia alone. Put another way, 64 PJ is equivalent to about 17,750 GWh, or more than 3 Site C dams worth of additional power in BC alone. These numbers are starting to add up pretty fast now, aren’t they? And we haven’t even considered the commercial or institutional retrofits.
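For readers who want to check the unit conversion behind the "64 PJ ≈ 17,750 GWh" figure, it follows straight from the definitions of the units:

```python
# 1 GWh = 3.6 TJ = 3.6e12 J, and 1 PJ = 1e15 J, so 1 PJ ≈ 277.8 GWh.
PJ_TO_GWH = 1e15 / 3.6e12

bc_household_gas_pj = 64                      # BC household natural gas use
gwh = bc_household_gas_pj * PJ_TO_GWH
site_c_equivalents = gwh / 5_100              # Site C ≈ 5,100 GWh/yr

print(f"{gwh:,.0f} GWh")                       # ~17,778 GWh
print(f"{site_c_equivalents:.1f} Site C dams")  # ~3.5
```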

13- Turn off the tap to oil imports

End all imports of foreign oil. As fossil fuel use declines, use only Canadian fossil fuels and allow investment in upgraders to turn Canadian solid bitumen into gas, diesel, propane and other products for the Canadian market, providing jobs in Alberta. By 2050, shift all Canadian bitumen from fuel to feedstock for the petrochemical industry.

A lot of people agree that it is desirable for Canada to be self-sufficient in oil. While a positive idea, it ignores the geographic/infrastructure realities of Canada. In 2018, Canada produced about 4.6 million barrels per day (MMb/d) of crude oil. The problem is that western Canada produced about 95% of that oil, while the vast majority of the consumption takes place in Eastern Canada. In 2017, the Hibernia oil field generated about 220,800 barrels per day (b/d). The Irving refinery in Saint John, meanwhile, consumes 320,000 b/d all on its own. Put simply, all of the Newfoundland and Labrador oil production is insufficient to supply that one refinery in the Maritimes; it doesn’t come close to replacing the oil imported to supply the Maritimes, Quebec and Ontario.
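The mismatch is stark even in the simplest comparison, using the two figures cited above:

```python
# East Coast production vs. the demand of a single East Coast refinery.
hibernia_bd = 220_800    # Hibernia oil field output, 2017 (barrels/day)
irving_bd = 320_000      # Irving Saint John refinery throughput (barrels/day)

shortfall = irving_bd - hibernia_bd
print(f"Shortfall at Irving alone: {shortfall:,} b/d")  # 99,200 b/d
```

Before Quebec or Ontario consume a single barrel, eastern production already falls roughly 100,000 b/d short of one refinery.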

The Green Party has spent the better part of the last two decades blocking the mass movement of oil from western Canada to Eastern Canada. We simply cannot get the oil from where it is produced to where it is used absent a massive investment in infrastructure (like, say, an Energy East pipeline). Absent that investment we cannot end oil imports in Eastern Canada.

As for the idea of developing a 4 MMb/d petrochemical industry in Alberta? That is simply magical thinking. Due to their volatile nature, petrochemicals are generally produced close to where they are consumed, so good luck finding foreign investors willing to cover the costs. This would be another multi-billion dollar government investment in fossil fuel infrastructure...you know, the type of subsidy the Green Party loudly decries every day. Since this blog is getting long, I won’t delve further into that topic.

14 – Switch to bio-diesel

Promote the development of local, small scale bio-diesel production, primarily relying on used vegetable fat from restaurants. Mandate the switch to bio-diesel for agricultural, fishing and forestry equipment.

This represents another case of the Greens identifying a technology that sounds good on paper but raises significant concerns when you look more deeply. It ignores the challenges of scale. Specifically, how many restaurants do the Greens think exist in Saskatchewan to replace all the agricultural diesel?

Additionally, switching over to pure bio-diesel poses significant challenges for modern engines. Bio-diesel produces less energy per litre and has significant issues with filter plugging and engine compatibility when it represents more than about 20% of the blend. It is another case of the Green Party saying something that sounds clever until you take a close look under the hood.


I think I can stop here. I have looked at only 5 of the 20 steps and shown each one to be impossible or impracticable in the time-frame provided. I haven’t even mentioned that step 7 (“Ban Fracking”) would make it impossible to develop geothermal energy resources, or that the “ban fracking” statement is inconsistent with the most recent science on the topic (the Scientific Review of Hydraulic Fracturing).

Rather, let’s just recognize that from my brief review it is clear that the Green Party either lacks the internal expertise to create reasonable policy or has chosen to ignore that expertise when producing its policy proposals. I say this because I am not providing particularly earth-shattering insight here. The information I have noted is understood by literally hundreds, if not thousands, of informed analysts across the country, and any one of them could provide a detailed analysis of the flaws in this proposal to build on what I have presented here.

If the Green Party wants to be taken seriously in October, it has to start recognizing that its signature policies are going to be looked at more closely than they were in the past. This “Mission Possible” document makes it clear they are not yet ready for such scrutiny.


Why Confounding Variables Matter – On that UVic study attributing the 2017 Extreme Fire Season to Climate Change

One of the downsides of my investigation of evidence-based environmental decision-making being a hobby is that my real life often gets in the way. This means I am not always able to comment on every interesting paper when it comes out. One such example is the paper that came out in January from the University of Victoria titled Attribution of the Influence of Human-Induced Climate Change on an Extreme Fire Season. The paper has been a topic of intense conversation but very little critique. It is repeatedly cited by activists who have not read it but feel that its conclusions:

that the event’s high fire weather/behavior metrics were made 2–4 times more likely, and that anthropogenic climate change increased the area burned by a factor of 7–11.

help their political narrative. I keep expecting to read a serious challenge of its results, because it has a really obvious flaw that essentially eliminates its usefulness in quantifying anything, but I haven’t seen one to date. I am surprised because once you see how it treats confounding variables it is impossible to take its quantification seriously. In the rest of this blog post I will explain why.

Since it is the basis of this discussion, let’s explain the concept of a “confounding variable” in research design. The simplest description I’ve found online is this:

A confounding variable is an “extra” variable that you didn’t account for. They can ruin an experiment and give you useless results. They can suggest there is correlation when in fact there isn’t. They can even introduce bias. That’s why it’s important to know what one is, and how to avoid getting them into your experiment in the first place.

As a practical example, imagine you were comparing death rates from car accidents between 1964 and the present and your hypothesis was that the decline in deaths was attributable to better, modern engines. Confounding variables might include the fact that modern cars have air bags, better seat-belts and more survivable designs: all features that were not available in 1960s automobiles. If you did not find a way to correct for the presence/absence of seat-belts, air bags and design considerations, then any person reading the study would instantly recognize that its results were invalid.
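The seat-belt example can be made concrete with a toy calculation. All the numbers here are invented purely to illustrate the mechanism, not drawn from any real accident data:

```python
# Toy model: deaths fall mostly because of the confounder (seat-belt
# adoption), yet a naive before/after comparison attributes the whole
# drop to engines. All effect sizes below are invented for illustration.
deaths_1964 = 100            # hypothetical deaths per 100,000
seatbelt_effect = 0.5        # confounder: seat-belts halve deaths
engine_effect = 0.9          # engines give a modest 10% reduction

deaths_now = deaths_1964 * seatbelt_effect * engine_effect  # 45.0

naive_attribution = 1 - deaths_now / deaths_1964   # credits engines with 55%
true_attribution = 1 - engine_effect               # engines actually give 10%
print(naive_attribution, true_attribution)
```

The naive comparison overstates the engine effect more than five-fold, which is exactly the kind of inflation an uncontrolled confounder produces.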

So how is this relevant to the UVic forest fire study? Well let’s look at what it compares:

As the CanRCM4 ensemble includes natural and anthropogenic forcings, we use the decade 2011–2020 to represent the current climate and an earlier decade, 1961–1970, to represent an alternative climate with reduced influence of human emissions

So much has changed between 1961 and 2011 that I expected to find a lot of work to deal with all the potential confounding variables. Imagine my surprise when I came across this text deep in the report:

The result is dependent on the regression model being realistic [my emphasis throughout]. Our regression model assumes that nonclimatic variability in the natural log of area burned is stationary in time and does not account for the possible influence of human factors such as changes in forest management or human ignition sources. Humans have long had a direct influence on fire activity (Bowman et al., 2011), and trends in some regions have been strongly impacted by human intervention (Fréjaville & Curt, 2017; Parisien et al., 2016; Turco et al., 2014). Syphard et al. (2017) demonstrated that climate influence on fire activity becomes less important with a strong human presence. We also do not consider directly the impacts of repeated suppression over time, which could result in larger fires, nor do we consider the pine beetle infestation that has affected BC

Stop and re-read that section again. Their hypothesis is that climate change is the driving factor, but they didn’t correct their work for any of the critical confounding variables. They simply ignored some of the most important considerations when discussing forest fire size, numbers and intensity. Let’s look at them one at a time.

Pine Beetles

Obviously the first issue to consider is the Pine beetle infestation. As described by Natural Resources Canada:

Over 18 million hectares of forest were impacted to some degree [by the pine beetles], resulting in a loss of approximately 723 million cubic metres (53%) of the merchantable pine volume by 2012. The epidemic peaked in 2005: total cumulative losses from the outbreak are projected to be 752 million cubic metres (58%) of the merchantable pine volume by 2017,

The pine beetles killed massive swathes of our forests and turned them into dead wood just ready to burn. How can a study quantify climate influence on fires without accounting for the pine beetles? Certainly they cite an American study to justify their decision, but numerous Canadian studies indicate that beetle-killed stands have “higher fire spreading potential”, among other considerations. Now if the authors had only missed the pine beetles it might have been a minor issue, but they also missed changes in forest management.

Forest Management

It is well understood that BC’s forest management has raised the fire risk in BC. BC has systematically been suppressing broad-leaf trees like aspen and birch, which provide natural fire protection, to make room for more commercially valuable conifer species like pine and Douglas fir. Those broad-leaf species are critical to large stands of trees: they’re less prone to burning, create shade on the forest floor, reduce temperatures and promote more humidity. Current forest management has changed the nature of our forests; this is not a hypothesis, it has been a stated policy of our forest management regime for decades. How can a study ignore this consideration? Not only have we changed the forest make-up, we have completely modified the fire regime via fire suppression.

Fire Suppression:

After the Slave Lake fire in 2011 the Alberta Government sought advice on the fire situation. The result was the Flat Top Complex Wildfire Review Committee Report which made a number of recommendations and concluded:

Before major wildfire suppression programs, boreal forests historically burned on an average cycle ranging from 50 to 200 years as a result of lightning and human-caused wildfires. Wildfire suppression has significantly reduced the area burned in Alberta’s boreal forests. However, due to reduced wildfire activity, forests of Alberta are aging, which ultimately changes ecosystems and is beginning to increase the risk of large and potentially costly catastrophic wildfires.

Essentially the report acknowledged that fire suppression efforts are making wildfires bigger and more dangerous. While the report was written for Alberta, the conclusions are entirely transferable to BC. Humans have interfered with the natural regime of fire in order to protect forests for commercial use, and we have now created a situation where bigger, and more numerous, fires are a certainty. And that is still not all, because we have also allowed encroachment into interface zones and provided added access to our forests.

Human Encroachment and Access

Another factor the paper missed is human access. For those of us who lived through the 1970s, one thing I can assure you of is that access to the back-country has changed significantly since then. In the 1970s the resource road network did not exist. Huge portions of the province were essentially inaccessible except by air or on foot. This protected the forests from humans and their tendency to light things on fire or drop sparks from their engines. These days we can get to the back-country much more easily, which gives more opportunity for fires. Consider this comment from UBC professor Lori Daniels:

The easiest piece of the puzzle is population. There are simply more of us, in more pockets of the province, which inevitably increases the chance of man-made fires. Varying estimates suggest anywhere between 30 to 50 per cent of the current fires are caused by people.

This result is consistent with the BC Wildfire Service, which says that 40% of fires are caused by people. The greater access to the back-country has put more area at risk from human impacts.


To conclude, let’s look at the confounding variables that were not considered in this study:

  • Pine Beetles
  • Forest management
  • Fire suppression and
  • Human encroachment and development

and yet this paper says it can provide accurate quantification of the increase in forest fire activity between the 1960s and the 2010s due to climate change?

Like our hypothetical study that ignored seat-belts, air bags and vehicle design, the confounding variables have to have had an effect on the two signature numbers: “2–4 times more likely” and “increased the area burned by a factor of 7–11“. Absent controls for confounding variables, any quantification of the effect of climate change alone cannot be taken seriously. Certainly, it is entirely likely that climate change will eventually increase the likelihood of fire and even increase the area burned, but those 2–4 times and factor of 7–11 numbers are simply not credible given what we know about the disclosed confounding variables.


The New Gas Boom – A Bust for anyone interested in an informed discussion about Canadian LNG

Anyone who follows news about the Canadian liquefied natural gas (LNG) industry (and many who don’t) will have heard about the new report prepared by the folks at Global Energy Monitor (GEM) called The New Gas Boom: Tracking Global LNG Infrastructure (The New Gas Boom). The New Gas Boom is the latest effort by GEM to generate earned media in its fight against new fossil fuel infrastructure. In the last week I have seen and heard lead author Ted Nace all over local and national radio and television. What is troubling is that this report (if you can call it that) consists mostly of tables and figures with little-to-no supporting information. What is most problematic is that the information they do present about Canadian LNG developments appears incorrect...and consequently the conclusions the authors have proclaimed across our media landscape are likely equally incorrect. The rest of this blog will expand on this topic.

Where are they getting their Project Numbers from?

According to their website, GEM documents fossil fuel infrastructure developments. Given that mandate, one would expect the one thing they would try to do well is keep track of fossil fuel infrastructure developments. If you expected that, you would be disappointed. Between the New Gas Boom report, their fossil fuel tracker application and the associated data tables [look for the tab in the application], they can’t seem to keep their numbers straight. The numbers in the tables of The New Gas Boom report don’t appear to reflect those in the application, and facilities identified in the data tables differ from those that appear when you open the mapping program. More problematically, their numbers don’t correlate with other, authenticated resources available online.

In The New Gas Boom the authors report that Canada currently has 281.6 million tonnes per annum (MTA) of LNG Export Terminals in “pre-construction”. Now let’s excuse the fact that the term “pre-construction” is never defined and imagine it means projects that someone intends to construct.

I spent several hours trying to figure out what they claimed was in store on the topic of Canadian “LNG Export Terminals” and simply could not reconstruct their 281.6 MTA number. What is more important is that Natural Resources Canada (NRC) keeps a detailed list of Canadian LNG export projects and the NRC list differs significantly with the list from the GEM website.

According to NRC there are 216 MTA of projects on the books, not the 281.6 MTA reported as under “pre-construction” in the New Gas Boom report. Moreover, that 216 MTA of projects simply represent projects that have obtained export licenses, which represents the first step in a long process to construction.

Going to the Sourcewatch page for BC LNG Terminals (Sourcewatch serves as the feeder for the GEM mapping program), they list 21 LNG terminal projects on the BC coast but the NRC identifies only 13. That is a major discrepancy. The next task would be to compare the list from Sourcewatch to the National Energy Board list of LNG export licence applications [Excel file], and there one discovers more discrepancies.

Ultimately the thing to understand about the list of BC LNG projects is that many are mutually exclusive in that they rely on the same pipeline capacity and others have no associated pipeline capacity. Put simply, most of these projects have no funding, no gas supply and no chance of being completed. Anyone aware of the BC LNG situation would know that currently there are only a handful of projects close to being considered viable, likely in the 30-40 MTA range rather than the hundreds described by GEM.

The question one might ask is: why does it matter that GEM’s numbers are off by so much? Well because the entire point of The New Gas Boom report is that there is too much capacity under construction; that the massive amount of construction to come will overload the global system; and this will result in financial and ecological ruin. Their headline statements include:

  • $1.3 trillion being invested in global gas expansion.
  • The scale of the LNG expansion under development is as large or greater than the expansion of coal-fired power plants,
  • If built, LNG terminals in pre-construction and construction would increase current global export capacity threefold

The problem, as we have discovered, is that since their numbers are flawed, so are their dire warnings. Their narrative is all wrong. BC is not going to build hundreds of MTA of export capacity and BC is not going to flood the global market with LNG. Rather, BC appears set to produce about 40 MTA of export capacity in the next decades which will fit right in with demand expectations in Asia. Moreover, now that we have established how far off this report is for BC LNG how can we seriously believe anything it has to say for Australian or American developments?

What’s Up With Their Fugitive Emissions numbers?

Besides claiming financial catastrophe, the authors of The New Gas Boom also want to convince readers that natural gas is worse than coal. They appear to do so by undertaking a careful, and some might say biased, reading of the academic literature which allows them to massively exaggerate fugitive emissions. To understand, you have to go to page 13 where they explain their methodology. Under the section “Updated leakage estimates alter the assessment” we see the following text:

Updated leakage estimates alter the assessment. The 2014 DOE report was based on the assumption that methane leakage was 1.3% for conventional onshore gas and 1.4% for fracked gas. In 2018, a comprehensive reassessment of methane emissions in the U.S. oil and gas supply chain, based on facility-scale measurements and validated with aircraft observations in areas accounting for about 30% of U.S. gas production, concluded that the overall leakage rate for natural gas was 2.3% of gross U.S. gas production, a figure 60% higher than the U.S. Environmental Protection Agency inventory estimate (Alvarez 2018). At the higher leakage rate, the advantage to using coal disappears. Multiple studies estimate the overall leakage rates even higher than the 2.3% Alvarez estimate, due to the fact that the Alvarez study did not include “downstream” leaks in the distribution of gas. Such leaks account for an additional 2.7 ± 0.6%, according to a study of Boston (McKain 2015).

There are two topics to be addressed in this section. The 2.3% number for overall leakage and the 2.7% number for downstream leakage. Both are useless in the Canadian context.

To address the 2.3% overall leakage you could go to my previous blog post, where I noted that the 2.3% number is derived from a paper by Alvarez et al. in Science in which they calculate that 2.3% of US natural gas production was lost in the form of fugitive emissions. They argue this wipes out the emission savings from using natural gas for power. The problem with the Alvarez results is that they aren’t applicable to Canadian LNG.

The Alvarez paper relies on top-down surveys (airplane surveys) in selected US gas fields and extrapolates those results to the US (and Canada). The problem with this extrapolation is that geology and regulations matter in the LNG field. BC LNG is from deeper formations; Canadian infrastructure is much newer; much of our BC natural gas is sour; and BC has enhanced regulatory controls compared to the US fields studied by Alvarez.

To explain why this all matters consider our stricter regulatory structures which have essentially eliminated flaring and strongly encourage green completions (which prevent release of gas when the well is being completed) and encourage the use of electricity in on-site equipment (to prevent the use of gas which may then be released). These regulations massively reduce the amount of fugitive emissions in the upstream industry. They do this not only because it makes environmental sense but because much of our gas is sour (read poisonous). The levels of releases considered common in Texas or Pennsylvania could result in mass casualty events in BC.

Moreover, as I noted, the Alvarez paper relies on airplane surveys to measure fugitive emissions. These types of surveys have well-understood issues that I address in detail in this blog post. The most important is temporal variability.

Recent research shows that the time when the planes fly really affects their results. The new research makes the observation that the flights used by researchers (like Alvarez et al.) to measure methane only happen during daytime hours (usually in the middle of the day), during the spring/summer on clear days. This coincides with when maintenance is typically scheduled at natural gas facilities (which requires that they flush their systems). As such, the research concludes that top-down surveys will almost always significantly overestimate total emissions.

For an analogy, it would be like traffic counters only working during rush hour and then extrapolating those rush hour conditions over the entire 24 hour day including the middle of the night.
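The traffic-counter analogy can be turned into a toy model. The hourly emission rates below are invented solely to show how midday-only sampling inflates an extrapolated total:

```python
# Toy model of daytime-sampling bias: emissions spike during midday
# maintenance venting, but flights only sample midday. All rates are
# invented for illustration.
hourly_rate = [10] * 24          # baseline emissions, kg/hr
for h in range(10, 14):          # maintenance venting, 10:00-14:00
    hourly_rate[h] = 60

true_daily = sum(hourly_rate)            # actual daily total: 440 kg
midday_sample = hourly_rate[12]          # what a midday flight sees: 60 kg/hr
extrapolated = midday_sample * 24        # naive daily estimate: 1,440 kg

print(f"Overestimate factor: {extrapolated / true_daily:.1f}x")  # ~3.3x
```

Sampling only the busiest window and multiplying by 24 triples the estimate in this sketch; the real-world bias depends on how concentrated maintenance emissions are.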

As I noted in my previous blog post, under a 20-year time period a leakage rate greater than 2.6% is necessary for natural gas to approach an emission level for coal when dealing with low efficiency natural gas compression systems and high-efficiency coal. Most estimates for upstream BC fugitive emissions run between 0.6% and 1.1% including the transportation component. Given our high use of electricity in our LNG streams the result is that BC LNG is much cleaner than coal.
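The gap between BC estimates and the breakeven rate is worth putting in numbers, using the figures cited above:

```python
# Comparing BC fugitive-emission estimates against the ~2.6% breakeven
# leakage rate (20-year horizon, low-efficiency gas compression vs.
# high-efficiency coal, per the figures in the text).
BREAKEVEN_LEAKAGE = 0.026
bc_estimates = [0.006, 0.011]    # upstream BC range, incl. transportation

for rate in bc_estimates:
    margin = BREAKEVEN_LEAKAGE / rate
    print(f"At {rate:.1%} leakage, breakeven is {margin:.1f}x higher")
```

Even at the top of the BC range, leakage would have to more than double before natural gas even approaches coal's emissions profile.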

Now let’s be generous and imagine that fugitive emissions are equivalent to the entire 2.3% identified by Alvarez. Thanks to our electrification, BC LNG would still be much cleaner than coal. To address this shortfall the authors had more work to do, which they did by adding an additional “downstream” fudge factor; to get it they went to the McKain paper and its 2.7% downstream leakage rate.

You might wonder why I highlight their dependence on the McKain paper for downstream leakage. The reason for my disdain is that this study represents a particularly egregious case of cherry-picking.

The McKain paper does indeed document substantial leaks in downstream transportation, but there is a HUGE PROVISO. The McKain research addresses the “Urban Region of Boston”, which has some of the oldest and leakiest natural gas infrastructure on the planet. It has been the topic of major news stories and has its own page on the Environmental Defense Fund’s website. According to Environmental Defense, over half of Boston’s natural gas pipes are over 50 years old and “nearly 45% of the pipes are made from cast iron or other corrosive [sic] and leak-prone materials“.

The authors of The New Gas Boom chose the absolute worst-case scenario in North America and used it as their signature value for their assessment. Does this sound like the typical case that should be used to extrapolate to Canadian natural gas infrastructure?

To make an analogy people might better understand: it would be like assuming that the lead issues in the Flint, Michigan water system are typical of the entire North American water system, and calculating lead ingestion levels for all North Americans using numbers from Flint alone.

If I didn’t know better I would have guessed that the authors of the report were counting on readers not being aware of the background of the McKain paper, hoping to slide this information by us all…and given the media reporting (and other reporting I have read) they almost did.


Ultimately, The New Gas Boom report appears to have been intended to serve a political, rather than a scientific, purpose. The numbers presented in the report appear to be massively inflated to allow the authors to make catastrophic predictions and present terrifying figures ($1.3 trillion) in order to generate free media hits. Sadly, the report did just that. In our local market I saw, or heard, the lead author Ted Nace on each of the major news and information networks. He was able to spread his apocalyptic conclusions free from any significant criticism or push-back. This happened because he was able to take advantage of the fact that 99.9% of the population would not have the knowledge of the academic literature or the Canadian LNG industry to call him out.

Unfortunately, it is only now, after he has enjoyed his 15 minutes of fame and generated millions of dollars of free publicity, that the limited number of individuals with the knowledge to challenge his claims have been able to come forward. It is a sad indication of the state of discourse in our current media landscape that the numerous pieces, like this one, identifying the significant flaws in the story will mostly serve as footnotes as the press moves on to the next Donald Trump misstep. This is another case of an NGO promoting a false narrative and no one being available to knowledgeably push back.


Debunking another CCPA anti-LNG article, this time in the Globe and Mail – now with Marc Lee response

I have to admit something. Every time I read an article by the Canadian Centre for Policy Alternatives (CCPA), I hope that it will present an evidence-based analysis consistent with the quality of the individuals whom I know work there. Sadly, more often than not I am disappointed. I could probably fill an entire section of my blog with pieces debunking analyses by the CCPA. Thus, it was with trepidation that I approached an “Opinion” piece in The Globe and Mail called LNG’s big lie by my regular foil, economist Marc Lee. In this blog post I will go over some of the more egregious issues I had with this article.

The article starts with three introductory paragraphs.

The federal government is seeking to use a clause in the Paris Agreement on climate change to get emissions credits for exports of liquefied natural gas (LNG) to Asian countries.

This plan is nonsensical for a number of reasons, but at its heart is the big lie that LNG will help to reduce global emissions. No one should take such claims seriously.

The grain of truth upon which this claim is made is simple: at the point of combustion, gas is about half as emissions-intensive as coal to deliver the same amount of energy.

These paragraphs demonstrate that the author disagrees with the federal government on its interpretation of Article 6 of the Paris Agreement (which provides mechanisms for the trading of emission credits) and then follows that up with a demonstration that he does not understand the climate math underlying the BC LNG industry…but that becomes clear as the article continues.

The next three paragraphs provide a simple guide to the LNG industry. They mostly emphasize how challenging it is to get LNG. Presumably this filler was intended to imply that these efforts cause excessive greenhouse gas emissions. The problem is that Life Cycle Analysis (LCA) is an actual field of study and real LCAs have been done on LNG in both Canada and the US. I can only presume the author is hoping that Globe readers are unaware of the academic literature on this topic.

The next paragraph is where the interesting stuff starts to happen:

Taken together, one-fifth of the gas must be consumed in the liquefaction, transport and regasification processes. These processes all lead to greenhouse gas (GHG) emissions and thus substantially reduce the emissions advantage relative to coal.

This “one-fifth” number is derived from an older CCPA report that I have previously debunked. In that case the author of the CCPA report took the results of a US National Energy Technology Laboratory (NETL) study, Life Cycle Greenhouse Gas Perspective on Exporting Liquefied Natural Gas from the United States, for an export facility shipping from New Orleans to Shanghai, and applied them to a Canadian LNG project exporting LNG from Prince Rupert to Shanghai (only adjusting for tanker shipping distance). The problem with using the NETL data is that the numbers are simply irrelevant in the BC context.

The NETL study, written in 2014, assumes lower-efficiency compressors and leaky pipelines in very hot climates and, as I discuss in my earlier blog post, is simply not relevant to the Canadian experience. The compressors to be used in Canada are more efficient and our regulatory system is stricter. The resulting efficiencies essentially halve the number presented in the Opinion piece. So no, the one-fifth number isn’t close to our current technological state.

On the topic of coal the next paragraph is even more egregious.

Coal, in contrast, may be dirty in terms of emissions, but getting it to market is relatively easy compared with gas. Coal can be dug up, put on rail cars and shipped to its final destination.

Where to start on this one? Around 10% of the entire life cycle emissions of coal (including its eventual combustion) come from digging it up and shipping it to market. That 10% may sound small but when you look at the numbers it represents almost 25% of the emissions generated by the combustion of natural gas. This is not a rounding error and for the CCPA to treat it as such is simply ridiculous.
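A back-of-the-envelope sketch shows where a figure like "almost 25%" can come from on a per-kilowatt-hour basis. The emission factors and plant efficiencies below are assumed round numbers for illustration, not values from any specific study:

```python
# Illustrative check: coal's ~10% upstream share of lifecycle emissions,
# expressed as a fraction of gas combustion emissions per kWh generated.
# Emission factors and plant efficiencies are assumptions for the sketch.

CO2_COAL, CO2_GAS = 95.0, 56.0   # kg CO2 per GJ of fuel burned (typical factors)
EFF_COAL, EFF_GAS = 0.38, 0.50   # assumed plant efficiencies

coal_combustion = CO2_COAL * (3.6 / EFF_COAL) / 1000  # kg CO2 per kWh
coal_lifecycle  = coal_combustion / 0.9               # upstream = 10% of lifecycle
coal_upstream   = 0.1 * coal_lifecycle

gas_combustion  = CO2_GAS * (3.6 / EFF_GAS) / 1000    # kg CO2 per kWh

print(f"coal upstream / gas combustion = {coal_upstream / gas_combustion:.0%}")
```

With these assumptions the ratio comes out around a quarter: far too large to wave away as "relatively easy".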

Moreover, the statement also ignores the methane emissions associated with mining coal. As described in a recent study of the Marcellus Shale play “a significant portion (~70%) of the emitted CH4 [in the region] was found to originate likely from coalbeds.” From an upstream perspective, coal is far worse than natural gas when the fugitive methane emissions are incorporated into the climate math.

Now for the next paragraph

The other emissions problem with gas is that leaks occur at various points along the supply chain from wellhead to final combustion. Recent studies have found that these leaks are much larger than have been reported by industry and governments.

The study the CCPA is talking about is by Alvarez et al. in Science, which calculated that 2.3% of US natural gas production was lost in the form of fugitive emissions, a rate the authors argue wipes out the savings from using natural gas for power. The problem with the Alvarez results is that they aren’t applicable to the Canadian context.

The Alvarez paper relies on top-down surveys (airplane surveys) in selected US gas fields and extrapolates those results to the US (and Canada). The problem with this extrapolation is that geology and regulations matter. BC gas is deeper, with newer infrastructure, more sour gas and different regulatory standards than the US fields studied by Alvarez. All four of these factors matter in this debate. As an example, our stricter regulatory structures have essentially eliminated flaring, strongly encourage green completions and encourage the use of electricity in on-site equipment. All of these significantly reduce our fugitive emissions.

Moreover, since much of our gas is sour (read poisonous), the type of release considered common in Texas or Pennsylvania would result in mass casualties in BC.

As I noted, the Alvarez paper relies on airplane surveys to measure fugitive emissions. These types of surveys have well-understood issues that I address in detail in this blog post. The most important is temporal variability.

Recent research shows that the time of day when the planes fly strongly affects their results. The paper observes that the flights used by researchers (like Alvarez et al.) to measure methane happen only during daytime hours (usually in the middle of the day), during the spring/summer, on clear days. This coincides with when maintenance on natural gas facilities (which requires that they flush their systems) typically occurs. As such, the research concludes that top-down surveys will almost always significantly overestimate total emissions.

For an analogy, it would be like traffic counters only working during rush hour and then extrapolating those rush hour conditions over the entire 24 hour day including the middle of the night.

Now you would think at this point it wouldn’t get worse, and yet it does. The next paragraph goes:

Even very small leaks of methane can wipe out any remaining advantage for gas relative to coal. Methane is short lived, breaking down in about 12 years into carbon dioxide and water, but while it is in the atmosphere it is 100 times more heat-trapping than carbon dioxide.

What the author is trying to do is introduce the concept of global warming potential (GWP). GWP is important because methane has a shorter atmospheric lifetime (it breaks down more quickly) than carbon dioxide but traps more heat during its short life. Specialists disagree whether one should consider the 20-year or 100-year potential of methane, since the IPCC has established that GWP can vary from 28 times (100-year) to 84 times (20-year). The EPA uses numbers that include feedbacks to give ranges of 28-36 times for the 100-year horizon and 84-86 times for the 20-year horizon. Looking at these numbers, the one number you do not see is the “100 times” cited in the opinion piece. I simply can’t figure out where that figure comes from, but it is certainly not from the field of climate studies.
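To see how much the choice of horizon matters, and why it must always be declared, here is a trivial sketch converting a hypothetical methane release to CO2-equivalent under the two standard IPCC horizons (AR5 values, without feedbacks):

```python
# Sketch: the same hypothetical methane release expressed as CO2-equivalent
# under the two recognized IPCC time horizons. There is no standard 12-year GWP.

GWP = {"100-year": 28, "20-year": 84}  # IPCC AR5 GWP values for CH4

methane_tonnes = 1000                  # hypothetical release, for illustration
for horizon, factor in GWP.items():
    print(f"{horizon}: {methane_tonnes * factor:,} tonnes CO2e")
```

The two horizons differ by a factor of three, which is exactly why any honest analysis states up front which one it is using.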

Herein lies the challenge in debunking bad opinion pieces. The original article is only 680 words and I am already at twice that number and only halfway through the piece. So I will speed this up.

In this discussion of leaks the author is careful to avoid using actual numbers. I can only guess this is because he doesn’t want you to know that his argument doesn’t hold water against what we know about fugitive emissions. The International Energy Agency has debunked his argument and even given us a nice graphic for typical natural gas facilities. As it makes clear, at typical fugitive emission levels natural gas is better than coal in greenhouse gas intensity.

Remember this graph is for typical American facilities. As we know from our past analyses BC LNG can produce the same product with 80% of the emissions of our competitors. Our LNG is cleaner and greener. Even a typical US facility makes climate sense when leakage is less than 2.6% (from a 20-year perspective). Yet even the worst number provided by Alvarez is 2.3% and the Canadian numbers are estimated to be in the 1% range. The climate math says LNG is a lot cleaner than coal.

Now that we are in the home stretch, let’s look at the next three paragraphs.

Finally, we need to think about where Canadian gas is being exported. While it’s plausible our natural gas could displace coal use in China, it could also simply contribute to higher overall energy demand, adding to emissions on top of coal. Or LNG could displace renewables in China’s evolving energy mix.

If exports go to Japan or Korea, the two biggest LNG importers, they would most likely displace cleaner energy sources and therefore increase global GHG emissions.

Even to the limited extent that China may be able to reduce its emissions by switching from coal to gas, it is not suddenly going to hand over the emissions credit to Canada. That’s not how emissions accounting works.

These paragraphs appear to consist of wishful thinking by the author. He imagines that Japan is not building new coal capacity at this very moment. The problem is that this is not true. Look at their coal plant tracker or the EIA analysis of the country. Japan is building coal facilities because it can’t get enough natural gas and needs back-up for all the renewables it is installing now that the NGOs have scared it off nuclear.

As for China, well I have a blog post showing how China is building synthetic natural gas plants to convert coal to natural gas, so the suggestion that China doesn’t need natural gas simply doesn’t hold water either.

I think I am going to stop here. Having looked at 10 paragraphs and found significant issues with virtually every one, I have simply run out of gas. The question I have to ask is: where were the editors on this piece? When I was a younger lad, an Opinion piece in the local paper was proofed and fact-checked by the paper to ensure its contents were fact-based. Back then it was believed that the best “Opinions” were those supported by facts. Editors didn’t let things like “100 times more heat-trapping” get through the editing process.

Reading this piece I am reminded of the Daniel Patrick Moynihan quote: “Everyone is entitled to their own opinions, but they are not entitled to their own facts.” I only wish the Editors of the Globe would ensure that Opinion writers didn’t come with their own facts.

Addendum: the Author responds

While I was on vacation, the author of the Globe piece (Marc Lee) responded in the comments section. I have pulled his comment up to the text so everyone could see it. My original instinct was to provide a detailed reply but instead I will simply provide links and highlights debunking his responses. Below I have his comment indented in italics and my reply thereafter.

I’ve been on vacation but it was fun to see what you made of my article. Sadly, you don’t do a very good job of rebutting my core arguments. You don’t address the central argument that Canada cannot get credit for its LNG exports, and most of what you write is an ad hominem attack on me and the CCPA. Tip: Writing in a condescending tone does not win an argument.

What is particularly funny about this response is that it makes clear that Marc doesn’t understand what an ad hominem attack entails. I don’t attack him or the CCPA; I attack his argument throughout.

As for this argument, the federal government made clear Article 6 of the Paris Agreement provides a means by which Canada could earn credits for our LNG exports. All it requires is that Canada choose to provide an inducement to the receiving country (likely in the form of a rate cut) to earn the credit.

Your main challenge is around the differences in lifecycle emissions between LNG and coal, and the IEA figure you show highlights some of the trade-offs wrt leakage. But if you read the original you would see that they don’t consider LNG at all, and a key point of my article was the energy required for liquefaction, which reduces the advantage of LNG relative to coal.

This claim is simply a red herring, since Marc specifically states in the piece: “This plan is nonsensical for a number of reasons, but at its heart is the big lie that LNG will help to reduce global emissions. No one should take such claims seriously.” As I have shown in my previous blog post, the climate math makes it abundantly clear that Canadian LNG can reduce global emissions. Nowhere does Marc provide any actual numbers to support his argument, because every legitimate source supports my argument, not his.

Your comments on GWP are highly misleading. If 100-year GWP is 34 and 20-year GWP is 86, then what is a 12-year GWP? That is how long methane stays in the atmosphere before breaking down into carbon dioxide and water. Here’s a reference that backs my statement of 100 times over 12 years:

This response says more about the author than I ever could. No legitimate organization uses a 12-year GWP. The standard GWP used by the IPCC is the 100-year GWP. Recently some organizations have chosen to use the 20-year GWP, but when they do so they preface it by clearly stating they are using the 20-year figure. To use a 12-year GWP without identifying it as such appears to represent an attempt to deliberately deceive an uninformed public. Nothing I have written to date discredits him more than his admission that he deliberately chose to cite a 12-year GWP without declaring that fact up front.

Your comment that US results on methane leaks are not applicable in BC is a misdirection. Part of the problem is that we are taking industry’s word for it and not doing independent measurement. But studies that have find conventional estimates are an under-estimation:

This is another case of Marc choosing the road less traveled, and it is less traveled because the sources he provides are not legitimate and have been utterly debunked. I go into the debunking of the Atherton paper here and the Suzuki paper here. The Scientific Review of Hydraulic Fracturing (SRHF) singled out the Atherton report because follow-up work by the regulator demonstrated its results were not valid. The fact that Marc still chose to rely on the Atherton report after the SRHF utterly discredited it speaks volumes.

I also note that in a previous critique of me you cite Kasumu et al as debunking my and David Hughes argument. Hmmm, the actual article is much more nuanced and does not back that claim. They state: “Results show that while the ultimate magnitude of the greenhouse gas emissions associated with natural gas production systems is still unknown, life cycle greenhouse gas emissions depend on country-level infrastructure (specifically, the efficiency of the generation fleet, transmission and distribution losses and LNG ocean transport distances) as well as the assumptions on what is displaced in the domestic electricity generation mix. Exogenous events such as the Fukushima nuclear disaster have unanticipated effects on the emissions displacement results. We highlight national regulations, environmental policies, and multilateral agreements that could play a role in mitigating emissions.”

Marc claims that the Kasumu article is nuanced, and yes it is…the problem is the nuance doesn’t erase the numbers it presents or the numbers presented both in my piece and in numerous supporting documents. When compared to the existing and in-progress facilities in China and India Canadian LNG will reduce global emissions. There is a reason Marc doesn’t provide any real numbers in his piece, because every real number shoots down his argument.

At best, you can argue there is a plausible range of impacts, from LNG slightly better than coal to worse that coal, and those depend on what assumptions one makes about which export markets, what fuels are displaced, methance leakages, and plant performance, including what the actual performance of LNG Canada will be once constructed (as opposed to the claims they make before hand).

Which is basically what I say in the article

This comment is simply not true. I can’t say this enough: this claim is simply not supported by the literature. The “plausible range” for BC exports to Asia goes from BC LNG being almost 2 times cleaner than coal (with electrification of the compression step and China using SNG) to BC LNG being slightly better than coal (using natural gas for every step and an excessive methane leakage rate compared to the highest-efficiency coal). Every legitimate life cycle analysis supports my position on this. The one exception is the CCPA LCA, which is fatally flawed, and even it has to struggle to make LNG look equivalent to coal.

In re-reading Marc’s reply I can only say that it leaves him looking even worse than if he had not replied in the first place. Before his reply you could reasonably be left with the opinion he simply made a few mistakes….after the reply….


Debunking more misinformation about the Trans Mountain Pipeline Expansion project. Some simple facts about bitumen, heavy oil, and Asian Markets.

As someone interested in evidence-based decision-making there are few topics as frustrating to discuss as the Trans Mountain Pipeline Expansion (TMX) project. The reason for this is that the media landscape is so completely full of misinformation and bad information that evidence-based decision making is almost impossible. This weekend I had an extended discussion with an independent podcaster about the project after listening to him in a radio segment with Lynda Steele on CKNW.

The segment had so many errors that I sent out a number of intemperate tweets in his direction. His response was to present a number of media stories that served as the basis for his opinions. The problem was that most of these stories were full of errors. Therein lies the dilemma. There is so much bad information out there that even well-meaning observers are going to get it wrong. Between the bad information he was being fed in media stories and the deliberate misinformation being spread by opponents of the project, it is virtually impossible for the regular observer (i.e., this podcaster) to know what is right and what is wrong. This post will look at a few more of these myths.


As I discussed in my previous post, the TMX has two major components:

  • Line 1 – existing pipeline segments (with pump upgrades) able to transport 350,000 barrels/day (bbl/d) of refined petroleum products and light crude. It has the capability to carry bitumen but at a much reduced volume per day.
  • Line 2 – a new pipeline with a capacity of 540,000 bbl/d. It is intended to transport heavy crude oil.

Line 2 is about moving heavy oils including diluted bitumen and synthetic crude. Line 1 is intended to help mitigate the supply bottleneck that has Vancouver drivers paying such high prices for gasoline and diesel while supplying the light crude needed by the Parkland Refinery in Vancouver and US refineries in the Puget Sound.

Admittedly Line 1 could be used for heavy crude but even a little bit of heavy oil in Line 1 eliminates the benefits of the upgraded Line 1. Meanwhile Alberta recently completed the Sturgeon refinery and now has a glut of diesel. As such, it makes logistical and financial sense to operate the pipeline in the manner consistent with the NEB proposal which means that Line 1 will almost certainly be used for what it was intended: light crude and refined fuels. Now let’s deal with the misinformation.

Diluted Bitumen – what is it?

The name “diluted bitumen”, when enunciated syllable-by-syllable by Dr. Andrew Weaver, sounds a lot like a chemical warfare agent. The truth is entirely the opposite. Diluted bitumen (dilbit) is pretty boring stuff that consists of a mixture of 20% to 30% diluent and 70% to 80% bitumen.

Bitumen is a type of heavy oil. It is characterised by high viscosity, high density (low API gravity), and high concentrations of nitrogen, oxygen, sulphur, and heavy metals.

The diluent is typically a light-hydrocarbon mixture (like naphtha) called “condensate”. The condensate has a specific gravity in the 0.6 g/mL to 0.8 g/mL range.

The resultant dilbit has an API gravity of 20-22 (medium crude is API 22.3-31.1, so dilbit is almost a medium crude) and a sulphur content in the 3.7%-3.9% range. Dilbit has a density/specific gravity that ranges from around 0.92 g/mL to about 0.94 g/mL. Since freshwater has a density of 1 g/mL and seawater density ranges from 1.025 g/mL to 1.033 g/mL, any spilled dilbit will initially float.
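These numbers hang together arithmetically. A minimal sketch, assuming an illustrative bitumen specific gravity of 1.01 g/mL and condensate of 0.70 g/mL (both hypothetical values picked for the example, and treating volume blending as linear, which is an approximation), reproduces the quoted dilbit ranges using the standard API gravity formula:

```python
# Sketch: blend a hypothetical 25% condensate / 75% bitumen mix by volume and
# convert the result to API gravity. The component specific gravities are
# illustrative assumptions, not measured values for any particular blend.

def api_gravity(sg):
    """Standard API gravity formula (specific gravity at 60 degrees F)."""
    return 141.5 / sg - 131.5

SG_BITUMEN, SG_CONDENSATE = 1.01, 0.70  # assumed g/mL
frac_diluent = 0.25                     # 25% diluent by volume

# linear volume blending is an approximation; real blends deviate slightly
sg_dilbit = frac_diluent * SG_CONDENSATE + (1 - frac_diluent) * SG_BITUMEN

print(f"blend SG  = {sg_dilbit:.3f} g/mL")            # inside the 0.92-0.94 range
print(f"blend API = {api_gravity(sg_dilbit):.1f}")    # near the quoted API 20-22
print("floats in seawater:", sg_dilbit < 1.025)
```

The blended specific gravity lands inside the quoted 0.92-0.94 g/mL window and below the density of seawater, which is why freshly spilled dilbit initially floats.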

Chemically, dilbit acts and behaves just like any other heavy, sour oil. The most comparable crude to typical Alberta dilbit (called Western Canadian Select, or WCS) is Maya, a Mexican heavy crude that ships out of the ports of Cayo Arcas and Salina Cruz. It has an API of 22 and sulphur of 3.5%. Thus WCS and Maya are both low-API blends with less than 5% sulphur.

The thing to take from this section is that dilbit is not some strange creation or unusual mixture. From a chemical perspective it is unspectacular. It looks like a heavy crude oil, it reacts in a refinery like a heavy crude oil and when shipped or spilled behaves almost exactly like a heavy crude oil.

How is bitumen extracted?

Bitumen can be extracted using two methods depending on how deep the deposits are below the surface: in-situ production or open pit mining.

As described in Natural Resources Canada’s Crude Oil Facts, open pit mining represents 45% of current production and 20% of oil sands reserves. In 2017, seven mining projects in Alberta produced approximately 1.25 million barrels a day.

In situ methods represent 55% of current production and 80% of the total resource. There are about 20 in situ projects in Alberta. In in situ extraction the bitumen is treated in a manner that allows it to flow so it can be collected. Generally, three methods can be used to reduce the viscosity of the bitumen: the addition of steam, solvents, or thermal energy. The biggest benefit of in situ extraction is the lack of above-ground impacts. There are no tailings ponds and the sands are all left underground.

As described by Natural Resources Canada, water management is a key challenge of the oil sands extraction process. The mining method uses 2.5 barrels of fresh water per barrel of bitumen, while the in situ method uses an average of 0.21 barrels of fresh water per barrel of bitumen. Oil sands producers recycle around 80-95% of the water used in established mines and approximately 85-95% for in situ production.

Is Dilbit particularly dirty?

Let’s be clear here: heavy crude oils are not something you use as comfort food for a toddler or to bathe puppies. That being said, heavy oils are an essential commodity and bitumen is not a particularly dirty form of heavy oil. Recent studies by the California Environmental Protection Agency’s Air Resources Board for their Low Carbon Fuel Standard made the following findings:

  • There are 13 oil fields in California, plus crude oil blends originating in at least six other countries, that generate a higher level of upstream greenhouse gas emissions than Canadian dilbit blends;
  • Crude oil from Alaska’s North Slope, which makes up about 12 per cent of California’s total crude slate, is actually “dirtier” than the Canadian dilbit known as “Access Western Blend”;
  • The “dirtiest oil in North America” is not produced in Canada, but just outside Los Angeles, where the Placerita oil field generates about twice the level of upstream emissions as Canadian oil sands production; and
  • The title of “world’s dirtiest oil” goes to Brass crude blend from Nigeria, where the uncontrolled release of methane during the oil extraction process generates upstream GHG emissions that are over four times higher than Canadian dilbit.

As for the claim that the oil sands are the most expensive oil, that dubious title likely goes to the Kashagan oil field in Kazakhstan but it certainly doesn’t go to oil sands oil most of which can be produced at very reasonable costs.

Spills – We know what to expect

Contrary to claims by critics, we know a lot about how to handle diluted bitumen spills. During the original NEB hearings a lot of organizations made hay over the lack of specific knowledge about dilbit spills. As a consequence the federal government spent almost $50 million to study the topic. Transport Canada prepared a summary of the latest research, as did Fisheries and Oceans Canada.

Their conclusions were that dilbit behaves almost exactly the same as other heavy crude oils in spills and that the technologies that we currently rely on to address heavy oil spills would work equally well on diluted bitumen. So when Dr. Weaver tells a reporter about an old Royal Society of Canada Report, the correct response should be to point out that a lot of much newer information now exists and that the report is no longer a particularly useful resource on this topic.

Refining Heavy oils

It is true that heavy oils can’t be effectively refined in many refineries. Rather, heavy oil needs to be refined in specially designed and built high-conversion refineries.

Heavy crude oil refineries will include very expensive cracking and coking units, designed to break down the long chain hydrocarbons into the smaller hydrocarbons used in gasoline, kerosene and diesel. Unfortunately, the simpler light crude refineries don’t typically have these cracking and coking units. Ironically, this can mean that the light crude refineries can’t handle the heavier components in the light crude oils and so the refineries end up producing more undesirable byproducts (like petroleum coke) per barrel of input.

What this means is that heavy oil refineries produce more gasoline/diesel/kerosene per barrel of heavy crude than light refineries do per barrel of light crude, and they produce a lot less waste petroleum coke per barrel as well.

In financial terms, the heavier crudes produce much higher margins per barrel of input than their lighter cousins and generate less waste byproduct that has to be disposed of.

Because of these factors the owners of heavy oil refineries will pay a premium to get heavy oil to use in their very expensive high-conversion refineries.

Asian Refining Capacity

One of the most bizarre recent narratives is that there is no market for diluted bitumen in Asia and that Asian refineries can’t refine dilbit. This is entirely untrue. As Reuters recently reported:

Many of the region’s refineries are new and are optimized to process heavy and sour crudes.

They were designed this way to take advantage of the historical discount these grades were priced at relative to light, sweet crudes, such as global benchmarks Brent and West Texas Intermediate (WTI), and oil from West African producers such as Nigeria and Angola.

The recent developments in the crude oil market have all but eliminated the discount enjoyed by heavy crudes, and in some cases, physical cargoes of some heavy grades have traded at premiums to light crudes.

So, contrary to what the folks at the Canadian Press or David Anderson have to say, Asia has a lot of refineries that can refine heavy oil. Want some numbers? According to GlobalData’s report on Chinese refining capacity:

The country’s total coking capacity, catalytic cracker capacity and the hydrocracking capacity is expected to increase during the outlook period. The total coking capacity is expected to increase from 1,991 mbd [thousand barrels per day] in 2018 to 2,371 mbd in 2023. China’s total catalytic cracker unit capacity is expected to increase from 4,359 mbd in 2018 to 5,532 mbd in 2023. Over the five year period, the hydrocracking unit capacity of the country is set to increase to 2,922 mbd from 1,846 mbd.

Look at those numbers. The Chinese refineries can refine all the bitumen Alberta currently produces and can handle over 8 times what Line 2 of the TMX can send to Westridge Marine Terminal for export. This is why Asian refineries are buying up all the heavy crude they can get, often at a premium over lighter crudes. Consider the prices on June 13th (when I last looked them up):

  • Maya (the chemical twin to land-locked Alberta WCS) for export to Far East was selling at $51.16/bbl.
  • WCS (the Canadian heavy oil used to represent Alberta heavy) was $39.19/bbl and
  • West Texas Intermediate was $48.96/bbl.

The high-sulphur heavy oil was selling at a premium over the lighter crude, and Alberta was losing almost $12/bbl of value because its oil was land-locked. It doesn’t take a PhD in Economics to know that if the market is paying a premium for a product then clearly someone wants that product.
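The arithmetic behind the "almost $12/bbl" figure is straightforward, using only the June 13th prices quoted above:

```python
# Sketch: the land-locked discount implied by the June 13 price snapshot.
maya, wcs, wti = 51.16, 39.19, 48.96  # US$/bbl, as quoted above

print(f"WCS discount to Maya (its chemical near-twin): ${maya - wcs:.2f}/bbl")
print(f"WCS discount to WTI (a light crude benchmark): ${wti - wcs:.2f}/bbl")
```

Two chemically comparable heavy crudes, nearly $12 apart, with the only meaningful difference being tidewater access.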

Asian Demand for Heavy Oil

The most ridiculous recent story coming from the activist community is that there is no demand for heavy oil in Asia. Why do I say ridiculous? Because according to data supplied to Business in Vancouver by Statistics Canada:

7.5 million barrels of Alberta crude shipped to Asia via Westridge Marine Terminal in 2018, with a total value of $539 million.

China took 6.3 million barrels, at a value of $442 million. Another 648,000 barrels went to South Korea ($51 million) and 508,000 barrels went to Hong Kong ($46 million). A small amount also went to Thailand.

Coincidentally last week Reuters reported:

The tanker New Dream, chartered by commodities trader Mercuria Energy Group, departed on June 16 from Galveston loaded with more than 1 million barrels of heavy Canadian crude, and is headed to Asia, according to vessel tracking data from Refinitiv Eikon and ClipperData.

Another 3 million barrels of Canadian crude are due to be exported from the Gulf Coast by June 30, according to an oil trader familiar with the matter. Their destinations could not immediately be learned.

Ironically, on the same day the activists were claiming that no Asian economies want our heavy oil, a tanker from Korea tied up at Westridge Marine Terminal to take on a load of heavy oil for one of its refineries.

Asian refineries are doing everything in their power to get Alberta heavy crude. They are even buying material that has been shipped all the way to the Gulf Coast and then shipping it halfway around the world. The activist narrative is simply false.


So here we have it. Less than a week after I wrote a 2000 word post on Trans Mountain myths I have another 2000 words debunking more myths. It is almost impossible to keep up with the false narratives. What is worse is that many of the newer ones represent journalists repeating misinformation that was fed to them and that they were unable or unwilling to confirm via other sources. The problem is that claims like “there is no refining capacity for heavy oil in Asia” can be debunked with five minutes of research on Google.

Posted in Pipelines, Trans Mountain, Uncategorized

Another look at the Trans Mountain Pipeline Expansion Project from the lens of a pragmatic environmentalist

This week the Federal cabinet will decide whether to proceed with the Trans Mountain Pipeline Expansion Project (TMX). In response, opponents of the TMX have been out in force making ridiculous claims about the project. This has prompted me to summarize some older posts explaining why a pragmatic environmentalist supports the project. What follows is a series of short takes addressing the major arguments used by activists fighting the pipeline and a conclusion explaining why I support the project.

More capacity for light crude and refined fuels to the west coast

A common myth about the TMX is that it is all about moving bitumen. That is not true. The TMX has two major components:

  • Line 1 – existing pipeline segments (with pump upgrades) able to transport 350,000 barrels/day (bbl/d) of refined petroleum products and light crude. It has the capability to carry bitumen but at a much reduced volume per day.
  • Line 2 – a new pipeline with a capacity of 540,000 bbl/d. It is intended to transport heavy crude oil.

So while Line 2 is about bitumen, Line 1 is intended to help mitigate the supply bottleneck that has Vancouver drivers paying such high prices for gasoline and diesel while supplying the light crude needed by the Parkland Refinery in Vancouver and US refineries in the Puget Sound.

Admittedly, Line 1 could be used for heavy crude, but even a little bit of heavy oil in Line 1 eliminates the benefits of the upgrade. Meanwhile, Alberta recently completed the Sturgeon refinery and now has a glut of diesel. As such, it makes logistical and financial sense to operate the pipeline in the manner consistent with the NEB proposal, which means that Line 1 will almost certainly be used for what it was intended: light crude and refined fuels.

Asian Markets

This weekend Mr. Anderson repeated a common claim: that there is no market for Alberta heavy crude in Asia. That claim is bogus. As pointed out in BiV, more than $1 billion worth of Alberta crude was exported by oil tanker via Vancouver in 2018, with China accounting for about one-third of the sales. Remember, this is from a constrained pipeline that often didn’t get a full marine allotment to Westridge.

As for Mr. Anderson’s argument that the world is awash in light oil and that heavy oil is somehow inferior to light oil: that is simply false. Heavy oil and light oil are different liquids with different chemical properties and different markets. The argument is akin to claiming that diesel is inferior to gasoline: if you own a diesel truck you need diesel, not gasoline. As for refining capacity, Asia has lots of refining capacity for heavy oil and not enough supply, as does California.

According to GlobalData’s report on Chinese refining capacity:

The country’s total coking capacity, catalytic cracker capacity and the hydrocracking capacity is expected to increase during the outlook period. The total coking capacity is expected to increase from 1,991 mbd [thousand barrels per day] in 2018 to 2,371 mbd in 2023. China’s total catalytic cracker unit capacity is expected to increase from 4,359 mbd in 2018 to 5,532 mbd in 2023. Over the five year period, the hydrocracking unit capacity of the country is set to increase to 2,922 mbd from 1,846 mbd.

This is why Asian refineries are buying up all the heavy crude they can get, often at a premium over lighter crudes. Consider these prices from June 13th (when I started this blog post):

  • Maya (the chemical twin to land-locked Alberta WCS) for export to the Far East was selling at $51.16/bbl,
  • WCS (the Canadian heavy oil benchmark used to represent Alberta heavy) was $39.19/bbl, and
  • West Texas Intermediate was $48.96/bbl.

Look at those numbers. The high-sulfur heavy oil was selling at a premium over the lighter crude and Alberta was losing almost $12/bbl of value because its oil was land-locked. It doesn’t take a PhD in Economics to know that if the market is paying a premium for a product then clearly someone wants that product.

As for that $12/bbl discount: it means less royalty money for the Alberta government, less income for Alberta and less money going towards paying for Canadian social programs.

The world market

This brings me to a topic I’ve been told I shouldn’t touch: our competitors. The reality is that you can’t have a legitimate discussion about oil without considering the ethics underlying our oil supply.

Some commentators say we should get out of the oil business and cede the field to the despots, the tyrants and the murderers. I disagree. I see a need to supply the Canadian market with Canadian oil, produced by Canadian workers who pay into the Canadian tax system and thus underwrite the costs of Canadian civil services, the Canadian way of life and the Canadian move away from fossil fuels. Do you know who else is making this same argument? Green Party Leader Elizabeth May.

Our primary competitors in the heavy oil market (besides Mexico) are Venezuela, Russia and Iran. These are not liberal democracies where individuals are free to marry the people they want; protest their governments without fear of persecution; or are safe from arbitrary arrest and punishment. These are dictatorships propped up by oil money. Their oil is extracted with little or no concern for environmental protection and the profits are used to fund oppression. This is a painful truth that has to be faced by any activist who says we shouldn’t be producing oil in Canada.

Tankers in the Salish Sea

As described in a previous post, if the TMX doesn’t get completed, the refineries in the Puget Sound will still need over 645,000 bbl/d of crude oil. Currently the Cherry Point refinery alone sees 500+ tankers a year, and Tesoro (a committed shipper on the TMX) has said it wants to add 120 tankers a year to its Andeavor facility to make up for an absence of supply. Meanwhile Westridge will still be sending out a few tankers a month. So in the end we will still see 700+ tankers a year coming in and out of the Salish Sea, with 620+ of them running the narrower and much more dangerous Rosario Strait.

Spill risks

I have written in detail about the relative risks the project poses to the Salish Sea. Any cold-eyed analysis of the relative risks shows that the TMX reduces our regional risks of oil spills. Blocking the TMX will increase the likelihood of a disastrous rail spill that could spell the end of a major fishery or result in the deaths of dozens of innocents, and it will put more tankers in narrower waters with less support from escort tugs. That is a formula for increased risk.

Spill response

The BC west coast has been chronically under-served for spill response. One of the big gets for BC in the TMX project was a toll on the new fuel transportation to pay for improved spill response. However, if there is no expansion that toll will not be paid and that money disappears. The result is a loss of spill response capability: right now we are looking at losing $150 million and several spill response bases. Since the funds for spill response were coming from the private sector there is no obvious way to replace them. When a spill occurs the equipment will not be there to address it, so the damage will be greater.

The threat to the Southern Resident Killer Whales

I wrote about this in a previous post. If you look at the entire Salish Sea, and not simply the Canadian side of the border, then you realize that the TMX will likely decrease the risks to the southern resident killer whales (SRKWs), not increase them. If the TMX fails, foreign-flagged ships with lower safety standards will be coming into the same waters, running through narrower straits while not following the slower speeds recommended by DFO to reduce ship noise. It will be more dangerous and louder for the SRKWs. Meanwhile, more oil-by-rail along the Columbia Gorge puts the whales’ winter feeding grounds at risk. All it takes is one spill in the Columbia River to destroy the SRKWs’ winter feeding grounds.

Oil-by-rail volumes

As reported by Global News, the Paris-based IEA forecasts that Canadian crude-by-rail exports will grow from 150,000 bbl/d in late 2017 to 390,000 bbl/d in 2019. In October 2018, rail exports hit a record high of 327,229 bbl/d, a 138.5% volume increase over 2017.

On the American side of the border, just three of the region’s six refineries (Tacoma, Anacortes and Ferndale) moved over 156,800 bbl/d by rail in 2017, and every indicator is that the volume will increase absent the TMX. These trains are carrying explosive Bakken crude through some of the most densely populated parts of the Pacific Northwest and along some of the West Coast’s most important salmon rivers.

Rail spill risk

We all know that the risk of an incident is 4.5 times higher for transportation by rail than by pipeline, and more of the rail route runs along riversides than does the pipeline route. Many activists complain about the sourcing of the 4.5 times stat, so let’s go to Citylab and the Sightline Institute, both of which warn about the increased risk of oil spills associated with this increase in oil volumes moving by rail. There will be more oil-by-rail spills, and because our rail lines run along riversides we will face far more risk to salmon habitat and the SRKWs.


I am a pragmatic environmentalist. My area of professional expertise is the investigation and remediation of former industrial and commercial sites with a specialty in the assessment of petroleum hydrocarbon contamination and its effects on human and ecological health. Working in this field I have come to understand that all industrial activities have environmental consequences.

We live in a society that, like it or not, remains dependent on oil (petroleum hydrocarbons) and petroleum hydrocarbon-based products. Our food is produced on farms that need heavy equipment to operate. That food is shipped around the world by air, water and rail, all of which rely on petroleum hydrocarbons to operate. Contrary to claims from activists, this reality is not going to change anytime soon. While many alternative means of transportation are in the development pipeline, none are in a position to significantly change our industrial or commercial dependence on liquid fuels, even as we move to electric vehicles for personal use.

In 2015 world leaders adopted the Paris Agreement. As part of the process Canada agreed to cut our greenhouse gas emissions to 30 per cent below 2005 levels by 2030. Irrespective of what many activists may claim, Canada did not commit to an economic suicide pact, nor did we agree to abandon all fossil fuels.

Canada certainly did not commit to achieving fossil fuel-free status in less than two decades. I have read many recent articles by activists about a Canadian Green New Deal that repeat ridiculous claims, like the idea that we can cut our emissions by 50% by 2030. As I have demonstrated at this blog, the claim that we could achieve this goal in the next 11 years does not even rise to the level of laughable; it is simply magical thinking. If we undertake herculean efforts and dedicate a historically unprecedented percentage of our national gross domestic product to the task, we have a reasonable chance of weaning ourselves off fossil fuels in 30 years. What this means is that Canada has, and will have, an ongoing need for fossil fuels for the foreseeable future.

Funding that transition requires a healthy Canadian economy. I want the funds generated by Canadian oil to help fund our Canadian transition away from fossil fuels. The first step in that process is getting that oil to market in the safest, least environmentally harmful manner, and that means via pipeline. Most importantly, blocking the pipeline is not going to reduce our dependence on fossil fuels; rather, it will simply redirect the crude to less safe means of transport while simultaneously reducing our economic ability to fight climate change. One might say we will end up with the worst of both worlds: a greater risk to the environment and less ability to finance the fight against climate change.

Author’s note: conflict of interest declaration

Contrary to claims by my detractors, I have no connection, financial or otherwise, to the Trans Mountain pipeline project. My employer holds no contracts (nor anticipates any contracts) with the project, nor do I have any financial stake in it. I am interested in this project because I am an environmentalist who has spent over 30 years becoming an expert in this field, and I worry that the narrative on this topic is being driven by activists who do not really understand the topic and so keep making impossible claims and demands.

Posted in Canadian Politics, Pipelines, Trans Mountain

On the Energy Innumeracy of the supporters of Canada’s Green New Deal

In the last week a group of Canadian activists have decided to mimic their American cousins by trying to advance a Canadian pact for a Green New Deal (GND Can hereafter). This is not the American Green New Deal you might have heard about; it is an entirely Canadian project, and like the Leap Manifesto (its political cousin) the GND Can represents a sort of aspirational thinking best relegated to fairy tales and not worthy of consideration in serious climate change discussions.

I’m sure a lot of people reading this post will think that I am being a bit unkind by describing the GND Can as a grand delusion, but as I intend to show in this blog post, it is clear that the people who created this project are innumerate when it comes to energy policy. The demands being forwarded are so ridiculous that it is unclear how any informed individual/organization could sign on to this deal.

Once again I have presented some pretty strong words, so let’s start with the obvious question: what are the core demands for the GND Can? From their Q&A we get this:

The Pact for a Green New Deal rests on two fundamental principles:

a. It must meet the demands of Indigenous knowledge and science and cut Canada’s emissions in half in 11 years while protecting cultural and biological diversity.

b. It must leave no one behind and create a better present and future for all of us. That means ensuring that solutions are universal and far reaching.

Now I can hear you all asking: isn’t eleven years to cut Canadian emissions in half a bit ambitious? Well the GND Can web site answers that question as well. Here is their response:

We must remember we are living a climate crisis. If we would have started acting decades ago we wouldn’t need such a wide-spread and rapid transition. But because of government delays and fossil fuel funded disinformation campaigns our window has become very short and our timelines are no longer negotiable. We have a 303 MT gap* to make up and we need to get started. We are talking about survival now. We either choose to act and avoid catastrophe or we don’t. We’d prefer the former.

*The IPCC 1.5 degree report uses a 2010 baseline. Canadian GHG emissions were 694 MT in 2010, so to meet the science Canada’s emissions would need to fall to 347 MT. The most recent data is on 2017 emissions (716 MT), so that’s a ~369 MT reduction from 2017 levels. Canada’s current climate plan gets us to 616 MT (excluding LULUCF), so there is still a 303 MT gap to get to 369 MT.

As a numerate individual interested in energy policy I find this argument somewhat less than compelling. Crisis or not, demanding the impossible will get you nowhere. I say this not because I don’t want to see it happen; I say it because I want us to fight climate change, and when we spend time arguing for the impossible we are not working on accomplishing the possible.

To understand the extent of the problem we have to remember that evidence-based environmental decision-making relies on using real data in the process. So let’s look at Canadian greenhouse gas (GHG) emissions to see what I mean (here is the Env Can summary doc and the document the GND Can web site cites in their footnote).

Let’s start with the big picture (all figures from Env Can). According to the government of Canada, in 2017 Canada emitted 715 megatonnes (MT) of carbon dioxide equivalent. Sectorally this breaks down to:

  • Oil and gas – 195 MT
  • Transportation – 174 MT
  • Buildings – 85 MT
  • Electricity – 74 MT
  • Heavy industry – 73 MT
  • Agriculture – 72 MT
  • Waste and others (light industry etc) – 42 MT

To achieve the GND Can demand of a 50% decrease would bring us from 715 MT/year to 357.5 MT/yr. This means we have to find 357.5 MT to reduce from our yearly emissions. Moreover, if the demands of the GND Can are to be met we need to achieve this goal in 11 years. So let’s start chopping to see what it will take.
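The arithmetic above is easy to verify; here is a short Python sketch using only the Env Can sector figures already listed:

```python
# Canada's 2017 GHG emissions by sector (MT CO2e), per the Env Can figures above.
sectors = {
    "Oil and gas": 195,
    "Transportation": 174,
    "Buildings": 85,
    "Electricity": 74,
    "Heavy industry": 73,
    "Agriculture": 72,
    "Waste and others": 42,
}

total = sum(sectors.values())  # the sectors sum to the 715 MT national total
target = total / 2             # 357.5 MT allowed under a 50% cut
print(f"Total: {total} MT; 50% target: {target} MT; must cut {total - target} MT/yr")
```

The sectors sum exactly to the 715 MT national total, and halving it gives the 357.5 MT/yr of reductions the GND Can demands.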

Let’s first look at the oil and gas sector since it represents the low-hanging fruit.

From the supporting documentation we get these yearly numbers:

  • Natural gas – 49.5 MT
  • Conventional oil – 31.3 MT
  • Oil Sands Mining and extraction – 16.4 MT
  • Oil sands in situ – 41.7 MT
  • Oil sands upgrading – 22.4 MT
  • Other – 33.2 MT

Let’s assume we will need conventional oil if we want to keep our planes, trains and buses operating. That leaves about 160 MT to cut, which gets us almost halfway (44%) there.
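Checking that tally with the sub-sector figures above (note that the exact sum comes to a bit over 163 MT, which rounds to the "about 160 MT" used in the text):

```python
# Oil and gas sub-sector emissions (MT CO2e, 2017), from the list above.
oil_gas = {
    "Natural gas": 49.5,
    "Conventional oil": 31.3,
    "Oil sands mining and extraction": 16.4,
    "Oil sands in situ": 41.7,
    "Oil sands upgrading": 22.4,
    "Other": 33.2,
}

# Keep conventional oil (planes, trains, buses); cut everything else.
cut = sum(v for k, v in oil_gas.items() if k != "Conventional oil")
required = 357.5  # total yearly reduction needed for a 50% cut
print(f"Cut: {cut:.1f} MT, or {cut / required:.0%} of the required reduction")
```

Even taking this most aggressive option, eliminating nearly the whole sector only gets us a bit under half of the required reduction.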

Cutting the oil and gas sector brings about some pretty serious problems from a public policy perspective. We are talking about a significant percentage of Canada’s economy, hundreds of thousands of jobs and most importantly the tax dollars that will be needed to pay for this massive program. By killing the industry we are starting from behind the eight ball with a massive Canadian recession and an even bigger hole in the national budget.

Killing the oil and gas industry doesn’t just damage our economy; it also leaves an even bigger hole in Canada’s energy picture. Natural gas is used in almost every sector of our economy but is particularly important for housing: it is used over much of the country to heat our homes and businesses. According to the Canadian Association of Petroleum Producers:

Over six million Canadians use natural gas to light, heat and cool their houses, heat their water, and cook their food. Natural gas is increasingly used in energy-efficient furnaces and appliances such as dryers. 

Natural gas currently provides 13 per cent of Canada’s electricity generation, and because it can be delivered quickly and affordably it is an excellent partner for intermittent renewable power sources such as wind and solar.

According to the NEB, in British Columbia 58% of households rely on natural gas for heating, in Ontario it is 67% and in Alberta it is 79%.

To go without natural gas means we will need to upgrade EVERY SINGLE home or business that uses natural gas for heating or hot water, all in 11 years. Think about those numbers: 58% of British Columbian households, 67% of Ontarian households and 79% of Albertan households would need to be upgraded in 11 years. That is millions and millions of households.

Having addressed the biggest source, let’s move to the next biggest: transportation.

Transportation emitted 174 MT in 2017, broken down into:

  • Passenger Cars – 34.6 MT
  • Passenger light trucks – 50.5 MT
  • Passenger aviation, bus, rail and motorcycle – 8.6 MT
  • Freight trucks – 59.9 MT
  • Freight aviation, rail and marine – 11.9 MT
  • Others (recreational, commercial and residential) – 8.9 MT

Now looking at these numbers, the first thing to understand is that there are currently no technologically viable alternatives for the two freight categories. You can’t expect us to go without food or water, so we are stuck using liquid fuels for freight. The only problem is that freight represents just 41% of the pie. To achieve a 50% cut we need to eliminate 85% of emissions from the other four categories. If we assume that mass transit should be protected (bus, rail and domestic airlines represent about 7 MT), that leaves a paltry 8.2 MT for all remaining passenger vehicles (a reduction of over 90%). Reducing your personal vehicle use by 90% shouldn’t be a problem in 11 years, should it?
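The transportation budget works out the same way; a quick sketch using the sub-sector figures above (the ~7 MT mass transit figure is the approximation used in the text):

```python
# Transportation arithmetic from the figures above (MT CO2e, 2017).
total = 174.0
freight = 59.9 + 11.9  # freight trucks + freight aviation/rail/marine: no alternatives
mass_transit = 7.0     # bus, rail and domestic aviation, protected (approx.)

allowed = total / 2                          # 87.0 MT budget after a 50% sector cut
leftover = allowed - freight - mass_transit  # what remains for personal vehicles
print(f"Budget left for all personal vehicles: {leftover:.1f} MT")
```

Protecting freight and mass transit eats up nearly the whole 87 MT budget, leaving only about 8.2 MT for every car and light truck in the country.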

A lot of people would likely be able to switch to EVs, but then we have an increase in electricity demand, challenges with our power grid and the problem of simply getting all those EVs built and sold.

A minor consideration not previously mentioned: most municipal electrical systems are not easily upgradeable, and all those EVs, electric heaters and electric hot water heaters will overload the local grids.

Completely upending our transportation system still only earns us 87 MT (roughly 24%). We are far from where we need to be.

Now for the electricity sector.

Since the aim is to get off coal and natural gas, this seems like an easy one. We can simply eliminate all 74 MT. This helps a lot (20%) but leaves us in a pickle, because every step to date has assumed we will have more electricity. Since building new electricity infrastructure is both expensive and time-consuming, we will just have to go without for the time being. After all, hospitals don’t really need ventilators.

We also need to massively upgrade our transmission system to move all that new power to where it is needed. As I’ve written previously, even the most optimistic view has a new grid costing $25 billion and taking a couple decades to build. A more realistic appraisal puts the cost of a national backbone of 735 kV transmission lines at around $104 billion and taking 20 years to complete. I’m not sure how we will do that in 11 years but at this point it is becoming clear how awesome this task really is.

We are now reasonably close. We have cut 321 MT and need only 36.5 MT more to reach our goal. Considering that a lot of the buildings total (85 MT) involves burning natural gas for heat and hot water, that should easily get us below the magic number of 357.5 MT. So let’s consider what this entire project will cost us:
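The running tally, for anyone keeping score at home, is simple addition using the cuts sketched so far:

```python
# Running tally of the cuts sketched in this post (MT CO2e/yr).
cuts = {
    "Oil and gas (keeping conventional oil)": 160,
    "Transportation (50% sector cut)": 87,
    "Electricity (all fossil generation)": 74,
}
required = 357.5  # half of 2017's 715 MT total

found = sum(cuts.values())    # 321 MT identified so far
remaining = required - found  # what still has to come from buildings etc.
print(f"Cut so far: {found} MT; still need {remaining} MT")
```

That leaves 36.5 MT to squeeze out of buildings and the remaining sectors.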

By 2030 we need to

  • Essentially eliminate the personal vehicle
  • Eliminate our oil sands and natural gas industries
  • Retrofit every household in Canada that uses natural gas for heat and/or hot water
  • Eliminate all our fossil fuel electricity capacity
  • Build the electrical capacity to provide the power for all those EVs, hot water heaters and heaters and
  • Build an entire electricity transmission system to move all that power around.

Moreover we need to do this in 11 years while

  • Dealing with the massive recession that comes from destroying our oil & gas industry
  • Paying for a massive upgrade to our public transportation infrastructure to deal with the fact we virtually eliminated personal automobiles
  • Paying for massive retrofits for virtually every household in the country that uses natural gas or fuel oil for heating and hot water
  • Paying for a massive increase in renewable electricity capacity to deal with the sudden jump in demands and the loss of fossil fuel electricity infrastructure
  • Paying for the massively upgraded transmission capacity to move all that new renewable electricity from where it was generated to where it is needed.

Admittedly the GNDers have one thing right: they will create a massive number of new jobs. The only hitch will be how to pay for them all.

Now the funny part of this whole blog post is going to be the replies. I’ve already been asked “so what would you do instead?” and “so you think we should stand back and do nothing?”.

My response is simply to point out that the first step in evidence-based environmental decision-making is to assemble the data needed to make an informed decision. Having done so, it becomes clear that whatever we choose to do, the Canadian GND shouldn’t be part of the discussion. Looking at what it would take to achieve their goals, it is clear that even with an infinite amount of money at their disposal they could not get there. 2030 is only 11 years away, there are too many tasks on their plate with too many rate-limiting steps, and that doesn’t even consider that they want to run each step through the lens of indigenous knowledge and equity.

Until the authors of this Green New Deal actually put pen to paper and show how they will achieve their goals, I think they should simply be ignored. They are peddling a fantasy. Arguing that we should attempt the impossible is a ridiculous approach that is guaranteed to end in failure, and failure is something we can’t afford. If you want to see what we should do, look at the City of Vancouver’s Climate Emergency Response document. It is a great start, and even it sets more realistic timelines.

Put simply, the Canadian Green New Deal is another fantasy project that will distract from the real work that needs to be done. It is time activists stopped demanding the impossible and started working towards the possible.

Posted in Climate Change, Climate Change Politics, Leap Manifesto, Uncategorized