Why a Pragmatic Environmentalist supports the Trans Mountain Pipeline

I am a pragmatic environmentalist. I have worked in the environmental field for over twenty-five years. My area of professional expertise is the investigation and remediation of former industrial and commercial sites with a specialty in the assessment of petroleum hydrocarbon contamination and its effects on human and ecological health.

In my professional capacity I serve as a technical specialist in: industrial chemistry; the biodegradation of contaminants; the effects of contaminants on natural systems; and ecosystem restoration. I have no connection, financial or otherwise, to Kinder Morgan or the Trans Mountain pipeline but I have some strong opinions on the project which are based on my personal experience and specialized knowledge of this field. I have spent the last 16 years cleaning up the messes made by the generations before me. I have seen the consequences of oil spills and industrial activities first-hand and I recognize that all industrial activities have environmental consequences.

We live in a society that, like it or not, is dependent on oil (petroleum hydrocarbons) and petroleum hydrocarbon-based products. Our food is produced on farms that need heavy equipment to operate. That food is shipped around the world by air, water and rail, all of which rely on petroleum hydrocarbons to operate. Petroleum hydrocarbons also serve as the feedstock of the petrochemical industry, which forms the basis of all the things that make our modern world work. They are the building blocks of our plastics, our computers, the tools we need to keep us healthy and the drugs we take when we are sick.

In 2015 world leaders adopted the Paris Agreement. As part of the process Canada agreed to drop our greenhouse gas emissions to 30 per cent below 2005 levels by 2030. Irrespective of what many activists may claim, Canada did not commit to trashing our economy, nor did we agree to abandon all fossil fuels. Canada certainly did not commit to achieving fossil fuel-free status in less than two decades. I have read many recent articles written by activists who repeat ridiculous claims like: “new research shows that the fossil-fuel era could be over in as little as 10 years.” As I have demonstrated at this blog, the claim that we could eliminate our reliance on fossil fuels in the next 10 years does not even rise to the level of laughable. It is simply magical thinking. If we undertake herculean efforts and dedicate a historically unprecedented percentage of our national gross domestic product to the task, we have a reasonable chance of weaning ourselves off fossil fuels in 30 to 50 years. Even then it is likely closer to the 50-year than the 30-year timeline. What this means is that Canada has, and will have, an ongoing need for fossil fuels for the foreseeable future.

A point seldom discussed by activists is cost. As I noted, the effort to wean ourselves off fossil fuels is going to be incredibly expensive. That money has to come from somewhere. That somewhere is the Canadian tax base, and the way to build that tax base is to take advantage of Canadian natural resources, not to undercut them.

I know that the term “ethical oil” has some blemishes because of issues surrounding its origin, but I believe in the concept behind the term. As a Canadian I want my personal gasoline purchases to go towards subsidizing medicare and not subsidizing a despot or paying for a tyrant to bomb his neighbour. I want to know that the oil used in my car was not generated using slave labour in a country without a free press, and where environmental regulations are noted by their absence rather than their application. I want my oil being produced by well-paid Canadians in a country with a demonstrably free press, strong government oversight and a strong tradition of NGOs to watch over the regulator’s shoulder.

As a Canadian I will point out again that Canadian oil helps support Canadian jobs and Canadian institutions, and provides the funds to pay for our education and medical systems while subsidizing transfer payments. This brings us to the Trans Mountain Expansion proposal (TMX). The TMX has two major components:

  • Line 1 would consist of existing pipeline segments (with pump upgrades) and could transport 350,000 b/d of refined petroleum products and light crude. It has the capability to carry bitumen, but at a much reduced volume per day. Notice that absent the heavier bitumen it can carry an extra 50,000 b/d over the current system’s capacity. Line 1 is intended to help mitigate the supply bottleneck that has Vancouver drivers paying such high prices for gasoline and diesel.
  • The proposed Line 2 would have a capacity of 540,000 b/d and is allocated to the transportation of heavy crude oil. This new pipeline and configuration would add 590,000 b/d to the existing system for a total capacity of 890,000 b/d.

Freeing up Line 1 will allow the West Coast to become less reliant on foreign imports and provide a means for the Sturgeon refinery to get its production to BC. Meanwhile, a big complaint is that much of the increased pipeline capacity is for “export”, but “export” can mean a lot of things. It is likely that a major “export” location for Trans Mountain oil will be the Puget Sound, with most of that increase traveling along the existing upgraded pipeline. Much of the remaining export will be to California, which is also suffering from a heavy oil shortage. Due to its proximity, tankers from Vancouver to California will be the cheapest way for California to get heavy fuel, which means Albertans will get the best price for that oil (as there will not be a transportation premium).

As I wrote in my previous post, the current pipeline capacity to the West Coast is inadequate to supply demand. The volume in excess of pipeline capacity still needs to get here, so where is it going to come from?

  • Absent the TMX we will be seeing more foreign tankers in Washington waters. Those tankers will not meet the stringent safety requirements that the NEB has imposed on the TMX ships but those tankers will be still sailing through the same “treacherous” waters. So we see a significantly higher risk from tanker spills.
  • Absent the TMX upgrade we will see a significant increase in oil-by-rail to the Puget Sound (Bakken oil transported along the Columbia River Valley).
  • Absent the TMX we will see continued movement of oil-by-rail to the Lower Mainland down the Thompson and Fraser River valleys. A spill on any of the rivers is more likely by rail than by pipeline and would cause untold damage to endangered fisheries.

So what are we looking at if the activists manage to stop the TMX? Certainly not a decrease in ecological risk. Rather we will see an increase in risk to our rivers and the marine environment…and at what cost? Any rent-seeker who thinks that blocking the pipeline will somehow help us fight climate change is barking up the wrong tree, because the countries that will serve as the replacement for Canadian oil (the Saudis, Nigerians and Algerians) are not paying into our federation; they are siphoning money out of it. If you want your bridges, roads and sewage plants built/repaired, then you are going to need money and blocking the Trans Mountain is exactly the wrong way to obtain those funds.

As I have written numerous times at my personal blog, we need to wean Canada off fossil fuels as our primary energy source. If we are to avoid the serious consequences of climate change, we will need to eliminate fossil fuels from our energy mix. However, contrary to what many say, the process of doing so will take decades, and in the meantime we will still need petroleum hydrocarbons.

So, the question that must be asked is: from whom do we want to source our needs? From Canadian provinces that pay into equalization or from foreign despots who use the money generated to fund wars and underwrite totalitarian regimes?

The reality is that you can’t have a legitimate discussion about the topic of oil without considering the ethics underlying our oil supply. Regardless of branding, ethical sourcing has to be part of the discussion. As a pragmatic environmentalist seeking only to ensure a healthy economy on a healthy planet, I would be remiss if I ignored this topic.

Some commentators say we should get out of the oil business and cede the field to the despots, the tyrants and the murderers. I disagree. I see a need to supply the Canadian market with Canadian oil, produced by Canadian workers who pay into the Canadian tax system and thus underwrite the costs of Canadian civil services, the Canadian way of life and the Canadian move away from fossil fuels.

Put simply, I want the funds generated by Canadian oil to help fund our Canadian transition away from our dependence on fossil fuels. The first step in that process is getting that oil to market in the safest, least environmentally harmful manner, and that means via pipeline. Most importantly, blocking the pipeline is not going to reduce our dependence on fossil fuels; rather, it will simply redirect the crude to less safe means of transport while simultaneously reducing our economic ability to fight climate change. One might say we will end up with the worst of both worlds: a greater risk to the environment and less capacity to finance the fight against climate change.

Posted in Canadian Politics, Climate Change, Pipelines, Trans Mountain, Uncategorized | 42 Comments

The question anti-Trans Mountain Pipeline Expansion activists refuse to answer

This weekend both pro- and anti-Trans Mountain Pipeline Expansion Project (TMX) rallies were held. Sadly, I couldn’t attend either. I did take advantage of the interest to try to figure out what was going on in the heads of the people fighting the TMX, so I asked them a question:

My challenge to #StopKM protestors: Show me a safer way than the @TransMtn to get the fossil fuels we need to run our society to the West Coast What is your alternative to #KinderMorgan?

Given the accounts I tagged, I figured I would get some informed discussion. Some people tried to change the topic, but not one addressed the question at hand. At last count the tweet had over 16,000 views with no one actually addressing the question posed. Later in the day I sent out a follow-up which said:

This morning I asked the question: Show me a safer way than @TransMtn to get the fossil fuels we need to run our society to the West Coast What is your alternative to #KinderMorgan? 10,000 views and not 1 response

It got 6,000+ views. I got more non-responses with Tzeporah Berman providing the prototypical answer:

The alternative is don’t expand production.

My reply thread is here 

This blog post will provide the details and the references that I could not fit into a Twitter thread. I hope it will show just how hollow the arguments of the anti-TMX protesters really are.

There is a common misconception about the role of pipelines in our daily lives. We live in a society that is dependent on oil and oil products. These products aren’t just refined into gasoline and diesel to run our vehicles; they also serve as the feedstocks for the petrochemical industry, which provides the building blocks of our plastics, cell phones and many of the drugs we take when we are sick. Without fossil fuels our economy would simply stop. Every calorie eaten by David Suzuki has fossil-fuel carbon incorporated into it. There is literally no food a Vancouverite can eat that wasn’t, in some direct way, obtained using fossil fuels.

As I have written previously, British Columbia is nowhere close to reaching fossil fuel-free status. So let’s acknowledge the reality: we need gasoline, diesel and oil to run our society. Where do these oil products come from? In coastal B.C. most of them come via the Trans Mountain pipeline. So let’s look at what the pipeline currently does:

The Trans Mountain currently has a capacity of about 300,000 barrels a day (b/d).

Now you will notice that I said that only half of Parkland’s raw crude comes via the pipeline. You might ask: why? The reason so much of the Parkland refinery’s crude comes by rail is that the current pipeline is typically oversubscribed, with nominations cut back (apportioned) by roughly 30% to 40% on a month-to-month basis. This means that only about 60% of the product that shippers want to send on the pipeline actually ends up on the pipeline. There is simply not enough room on the existing pipeline to move all the crude and refined fuel the West Coast needs. So when activists say we have enough capacity, they are simply wrong: the pipeline is already unable to ship all the product that we need to move.
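For readers unfamiliar with how an oversubscribed pipeline works, here is a minimal sketch of pro-rata apportionment. The 300,000 b/d capacity is the figure quoted above; the nomination volume is purely illustrative, chosen to reproduce the roughly 60% shipped fraction described in the text:

```python
# Pro-rata apportionment on an oversubscribed pipeline: when shippers
# nominate more volume than the line can carry, every nomination is cut
# back by the same fraction so total shipped volume equals capacity.
def shipped_fraction(capacity_bpd: float, nominated_bpd: float) -> float:
    """Fraction of nominated volume that actually gets on the pipeline."""
    return min(1.0, capacity_bpd / nominated_bpd)

capacity = 300_000       # current Trans Mountain capacity, b/d
nominations = 500_000    # hypothetical: shippers request far more space
frac = shipped_fraction(capacity, nominations)
print(f"{frac:.0%} of nominated volume ships")  # 60% of nominated volume ships
```

The point is only that apportionment cuts every shipper, Parkland included, which is why rail ends up carrying the balance.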

Because of the shortage of volume on the pipeline, Vancouver Island is supplied with almost all of its refined products via barges from Vancouver and the Puget Sound.

So let’s talk about the TMX, because this is another case of the activists getting their facts wrong. I cannot count the number of people who claimed on my timeline this weekend that the pipeline exists only to export bitumen to Asia. That is what they have been told, and heaven help the person who directs them to the National Energy Board documents that say otherwise. Well, here is what the NEB submission actually says:

The TMX has two major components:

  • Line 1 would consist of existing pipeline segments (with pump upgrades) and could transport 350,000 b/d of refined petroleum products and light crude. It has the capability to carry bitumen, but at a much reduced volume per day. Notice that absent the heavier bitumen it can carry an extra 50,000 b/d over the current system’s capacity. Line 1 is intended to help mitigate the supply bottleneck that has Vancouver drivers paying such high prices for gasoline and diesel (as I will explain later).
  • The proposed Line 2 would have a capacity of 540,000 b/d and is allocated to the transportation of heavy crude oil. This new pipeline and configuration would add 590,000 b/d to the existing system for a total capacity of 890,000 b/d.
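For readers keeping track, the capacity figures in the two bullets can be sanity-checked with a few lines of arithmetic (a sketch only; the 300,000 b/d figure for the existing system is the one quoted earlier in this post):

```python
# Capacity arithmetic for the Trans Mountain Expansion (all figures in b/d)
existing_capacity = 300_000   # current Trans Mountain system
line1_capacity = 350_000      # existing segments with pump upgrades
line2_capacity = 540_000      # proposed new heavy-crude line

total_capacity = line1_capacity + line2_capacity
added_capacity = total_capacity - existing_capacity

print(f"Total: {total_capacity:,} b/d")   # Total: 890,000 b/d
print(f"Added: {added_capacity:,} b/d")   # Added: 590,000 b/d
```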

A big complaint is that much of the increased pipeline capacity is for “export”, but “export” can mean a lot of things. Thanks to the lack of refining capacity in the Vancouver region, we actually “export” oil and almost immediately need to re-import it as aviation and jet fuel from the Cherry Point refinery in Washington (or as refined fuels on Vancouver Island).

Most Canadians don’t know that there are five major refineries in the Puget Sound with a combined capacity of 647,000 b/d. Why is that important? For the last 20 years, up to 600,000 b/d of Alaskan crude has traveled down the coast of B.C., in tankers, and into the Puget Sound. Now let’s talk about some hypocrisy. US NGOs bragged about sending busloads of protesters to the Saturday rally. One of their complaints was the increased tanker traffic. In response, Stewart Muir of ResourceWorks tweeted this:

[Image: Stewart Muir’s tweet showing a TankerTracker map]

It is a map from the TankerTracker app showing that, at the same time as Seattle protesters were driving up the coast to complain, eight tankers were in US waters. Unlike the tankers under the TMX regime, none of these tankers carry local pilots or sail escorted by two rescue tugs. These American protesters came up to Vancouver to fight against seven tankers a week while eight tankers sat in their own waters at the same time. Could these activists be any more hypocritical?

A little-known fact is that the Alaskan oil fields are drying up and new sources are needed to keep the Pacific Northwest in fuel. As a result, new railway capacity is being built to supply up to 725,000 b/d of Bakken crude to the West Coast and the Puget Sound refineries. The route will travel over any number of rivers, including the headwaters of the Kootenay River, and alongside the Columbia River to the Puget Sound.

Transporting oil and gas by pipeline or rail is, in general, quite safe. But when comparing the two, rail is over 4.5 times more likely than pipelines to experience an occurrence, and when it does, we get more Gogamas, Galenas and Lac-Méganticс. The big incidents we hear about aren’t the only ones; let’s not forget the dozen or so other rail incidents that didn’t make our local press. As for the protectors of our coast, they never mention the Mosier derailment that came within feet of hitting the Columbia River. We share the Columbia with our American cousins, and if they need to transport crude along it, it is only a matter of time before a big spill happens there. And then what will the activists be saying?

In a previous blog post I did the math and realized that, because of where the rail lines run (along the river valleys), the 4.5-times number actually understates the risk in BC, Washington and Oregon. The number is actually much higher. The environmentalists claim to want to protect our fragile ecosystem, but instead they will greatly increase the likelihood of a spill, and when that spill happens it could wipe out the Columbia or Fraser River fisheries…something the activists choose not to talk about while simultaneously complaining that the TMX will put those fisheries at risk.

To put my number into perspective: according to industry statistics, in 2014 about 185,000 b/d of Western Canadian crude oil was transported to market by rail. In 2018, rail volumes are estimated at around 500,000 b/d to 600,000 b/d if Keystone XL is not available.
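Those two figures can be combined into a back-of-envelope comparison (a sketch; it assumes, as the cited statistic does, that the 4.5× occurrence ratio applies per unit of product moved, and it uses the midpoint of the 2018 estimate):

```python
# Back-of-envelope scaling of crude-by-rail exposure, using the figures
# quoted above. Each rail barrel is taken to carry ~4.5x the occurrence
# risk of the same barrel moved by pipeline.
RAIL_TO_PIPELINE_RISK = 4.5

rail_2014 = 185_000    # b/d of Western Canadian crude by rail, 2014
rail_2018 = 550_000    # b/d, midpoint of the 500,000-600,000 estimate

growth = rail_2018 / rail_2014
print(f"Rail volume grows ~{growth:.1f}x")  # Rail volume grows ~3.0x
print(f"Each rail barrel carries ~{RAIL_TO_PIPELINE_RISK}x the pipeline risk")
```

In other words, the exposure grows both because the rail volume roughly triples and because every barrel moved by rail is riskier than the same barrel moved by pipeline.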

Now before I finish up I have to point out another way in which the anti-TMX activists are talking out of both sides of their mouths. I can’t count the number of activists who declare that Alberta should refine bitumen in Alberta. Had any of them bothered to do their research, they would have discovered that Alberta is about to open its first new refinery in decades: the Sturgeon refinery. It is the first refinery built exclusively to refine bitumen and has cost billions to build. Now that the refinery is almost ready to go online, guess what one of its biggest headaches will be? That’s right: getting its production to market. The Trans Mountain is oversubscribed and the rail lines are full. One of the big benefits of the TMX will be the space opened up in Line 1 for refined fuels. British Columbia could finally have a steady supply of clean, Canadian diesel. But only if the pipeline is upgraded.

So I’ve thrown a lot of numbers around, but let’s go back to my original question to the activists. We have now clearly demonstrated that the current pipeline capacity to the West Coast is inadequate to supply demand. The volume in excess of pipeline capacity still needs to get here, so what is their solution?

  • Absent the TMX we will be seeing more foreign tankers in Washington waters. Those tankers will not meet the stringent safety requirements that the NEB has imposed on the TMX ships but those tankers will be still sailing through the same “treacherous” waters. So we see a significant increase in risk from tanker spills.
  • Absent the TMX upgrade we will see a significant increase in oil-by-rail to the Puget Sound (Bakken oil transported along the Columbia River Valley).
  • Absent the TMX we will see continued movement of oil-by-rail to the Lower Mainland down the Thompson and Fraser River valleys. A spill on any of the rivers is more likely by rail than by pipeline and would cause untold damage to endangered fisheries.

Remember the complaint about all the oil being exported? Well, it is likely that a major “export” location for Trans Mountain oil will be the Puget Sound, with most of that increase traveling along the existing upgraded pipeline. Much of the remaining export will be to California, which is also suffering from a heavy oil shortage. Due to its proximity, tankers from Vancouver to California will be the cheapest way for California to get heavy fuel, which means Albertans will get the best price for that oil (as there will not be a transportation premium).

We need to move towards a society where oil products are not used for power or fuel, but that is not going to happen in the next decade or even two. Until that day comes, we need these products, and the safest, most environmentally responsible way to get them to us over land is via pipelines. While we transition away from fossil fuels, let’s ensure that we use the safest modes of transport in order to protect our joint ecological heritage. The argument that we can do without simply doesn’t hold water. Currently (and for the next 20+ years) our transportation and food systems will remain utterly dependent on fossil fuels to keep our communities and economies alive. Given those real needs, a pragmatic environmentalist looks for the safest way to move those fossil fuels, and in this case that means pipelines like the Trans Mountain.

Addendum:

Dr. Andrew Leach has used NEB data to graph what has gone through the pipeline since 2006. This is presented below:

[Graph: Dr. Leach’s plot of Trans Mountain throughput since 2006, from NEB data]

Note that in the last few years the Westridge terminal has not been receiving its full allotment for export, as the domestic light going to Burnaby (and presumably Parkland – see below) has displaced some of the marine exports.

Correction:

The best thing about a blog is when someone reads it and can provide help to make it better. In this case I received information from Parkland that updates the information that I had relied on from a previous source. In this blog post I reported that the Parkland refinery in Burnaby gets about half of its 55,000 b/d from the Trans Mountain and half by rail due to the lack of space on the existing pipeline. That reference is now out of date. I have been contacted by a representative from Parkland who informs me that, unlike Chevron, Parkland now gets all its supply from the Trans Mountain. I will be editing my blog posts accordingly.

Posted in Canadian Politics, Pipelines, Trans Mountain, Uncategorized | 102 Comments

On Jason Kenney’s threat to shut off Vancouver’s gasoline supply

In the last week Jason Kenney was on a West Coast swing as part of his continued attempt to replace Rachel Notley as Premier of Alberta. During the trip Mr. Kenney repeatedly threatened to stop the flow of oil to B.C. in response to the BC government’s use of regulatory processes to delay, and possibly convince investors to give up on, the construction of the Trans Mountain Pipeline Expansion (TMX). While many have argued that Mr. Kenney might lack the regulatory authority to block the movement of fuels, Mr. Kenney has indicated that he can return control of the authorization of fuel exports to his government and then turn down/off the flow of refined fuels to the West Coast. The environmentalists have argued that this might be a good thing. Much to my surprise, one of the top minds in the field, Dr. Andrew Leach of the University of Alberta, echoed their argument, and this started a rather odd discussion between the two of us. It all centered around this comment:

If you’re going to be left short fuel, being so with a deepwater port and a refining complex around the corner isn’t really the worst circumstance.

While we had a somewhat less than cordial discussion, I was left with the need to fill in the gaps left open during the discussion. Specifically, I want to point out some critical facts that individuals like Dr. Leach have overlooked when discussing the potential outcome of a partial shut-down of refined fuel shipments through the Trans Mountain Pipeline system.

Let’s start with the biggest myth being advanced by those who are belittling the risks associated with Mr. Kenney’s threatened shutdown. This was summed up by Dr. Leach’s comment in the National Post: “It’s pretty hard to hold someone hostage … when they have a port”. The problem with that suggestion is that it completely ignores the facilities, layout and limitations of the Port of Vancouver. The loading and unloading of fossil fuels is a dangerous process that requires specialized facilities, and when you off-load that fuel you need somewhere to actually put it. The Port of Vancouver lacks any large tank farms for off-loading the fuel needed to keep Southwestern British Columbia supplied with refined fuel. That facility already exists on Burnaby Mountain.

Nor can ships simply get fuel to the current Trans Mountain tank farm, which is located on a mountainside (180 m to 190 m above sea level), 2.5 km east of Burrard Inlet. The pumps at the tank farm are designed to take advantage of that drop in altitude; the facilities at the Westridge terminal are not built to fight gravity in the other direction. They can’t pump fuel up that mountain.

Dr. Leach’s reply was: “you’ve got 400kbbl of storage at Westridge which is 2x interior storage. You’ve got docks. There are tankers. Storage is the easy part.” This completely misrepresents the situation. The Westridge marine terminal houses three storage tanks and can handle volumes of approximately 395,000 barrels (63,000 m³) of fuels. Currently those tanks are being used mostly for jet and specialty fuels. To suggest that the terminal would be able to off-load and store all the jet fuel, diesel, gasoline and bunker fuel needed to run Southwestern BC simply ignores the laws of physics. Three tanks can’t simultaneously hold four types of fuel, and the three tanks aren’t nearly big enough to keep us supplied. Not only that, but I remain amazed that the people fighting the facility would expect Kinder Morgan to simply turn over their facility to help the government trying to put them out of business. Mr. Kenney has never suggested that he would block all exports, just refined fuels. Kinder Morgan would likely want to use their facility for export, which would be consistent with its existing license and all its existing permits.
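To put the Westridge storage figures in perspective, here is a rough calculation using only numbers that appear in these posts. The Parkland refinery’s 55,000 b/d throughput stands in for regional demand, which understates the real figure, so the true buffer is even smaller:

```python
# Rough buffer calculation for the Westridge terminal's three tanks.
# Parkland's 55,000 b/d throughput is a stand-in for regional demand.
BBL_TO_M3 = 0.158987           # one barrel in cubic metres

storage_bbl = 395_000          # three tanks at Westridge
demand_bpd = 55_000            # Parkland refinery throughput (proxy)

storage_m3 = storage_bbl * BBL_TO_M3
days_of_supply = storage_bbl / demand_bpd
print(f"~{storage_m3:,.0f} m^3 of storage")      # ~62,800 m^3
print(f"~{days_of_supply:.1f} days of supply")   # ~7.2 days
```

A week or so of tankage, shared across multiple fuel grades, is nobody’s idea of a strategic reserve.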

Even more challenging would be the supply network in the Okanagan. Virtually all the fuel used in the Interior of BC funnels through Kamloops. Some have argued that this supply can simply be replaced by refined fuel-by-rail. That, of course, ignores the critical shortage of rail cars in Western Canada. Just look at what happened during the summer of 2016, when a minor hiccup in oil-by-rail left West Kelowna dry. Shut down the Kamloops terminal and the Interior goes dry, and with it goes the tourist industry.

So the question arises: do we have an indication of what to expect if Alberta decided to reduce the movement of refined fuels down the Trans Mountain? This week gives us a pretty good indication of what would happen. As reported in Business in Vancouver, the Parkland refinery processes 55,000 barrels of oil per day to produce gasoline, diesel and jet fuel. The refinery supplies 20% to 25% of B.C.’s gasoline and diesel. As part of an upgrade, the refinery has reduced its throughput, and the result has been a spike in gasoline prices, with prices already over $1.50/L and the possibility of hitting $1.60/L in the near future.

We are talking about a minor reduction at the facility that produces less than a quarter of the fuel used on the West Coast. As reported by Natural Resources Canada, Edmonton refineries provide about 50%-60% of the petroleum product needs in the Vancouver market. If a minor cut from the producer that supplies 20% to 25% of B.C.’s gasoline and diesel results in gasoline priced at $1.60/L, imagine what shutting down 60% of our supply for a week or two would do. As I have pointed out previously, there is simply not an excess of supply in PADD 5, so our trading partners don’t have the excess to sell us. Instead we would be looking at gas in the $2/L – $3/L zone.
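The scale of the two supply shares being compared is simple arithmetic (midpoints of the ranges quoted above; the actual price response would of course be nonlinear and depend on how long any cutoff lasted):

```python
# Compare the supply share behind the recent price spike with the share
# Mr. Kenney threatens. Shares are midpoints of the ranges quoted above.
parkland_share = 0.225   # Parkland: 20-25% of B.C. gasoline and diesel
edmonton_share = 0.55    # Edmonton refineries: 50-60% of Vancouver's needs

ratio = edmonton_share / parkland_share
print(f"An Alberta cutoff puts ~{ratio:.1f}x the supply share at risk")  # ~2.4x
```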

Ultimately the challenge faced by British Columbia is that our geography conspires against us in the transportation of fossil fuels. With mountains and water bodies in the way, getting fossil fuels to market is particularly challenging. This is aggravated by the fact that we have allowed ourselves to become completely dependent on outside sources to keep our economy running. When those other constituencies choose to direct their output elsewhere this can leave us in a bind. It can be argued that we have allowed the market to become too tight such that even the smallest fluctuations in supply can result in large price spikes. Expanding the Trans Mountain to provide more space for refined fuels can only help improve the market conditions on the coast. In particular adding capacity to get access to the output from the new Sturgeon refinery would go a long way to reducing our fuel shortages on the coast. Alternatively, we need to accept the risks associated with replacing that supply via rail. Given our anticipated population growth and our still relatively slow transition off fossil fuels for transportation, British Columbia will remain dependent on Alberta for our fuels and will thus remain potentially at risk from unfriendly governments in Edmonton who may choose to rattle their swords in our direction.

Addendum

An earlier version of this post had a discussion of BC politics that likely represented a bridge too far, and given the individuals who have made that suggestion (including Keith Baldrey, who knows this stuff), I was probably being a bit too Machiavellian in my thinking. Admittedly, few people have gone wrong guessing that Jason Kenney would take things a step further than the consensus opinion. That being said, my expertise is definitely not provincial politics, so I have removed the last paragraph from the main text and left it below for the political plotters.

Dr. Leach has also pointed out the difficulties Jason Kenney would have given that the Trans Mountain is a federally regulated pipeline. This also may be true but for the purposes of this thought experiment I will take Mr. Kenney at his word that he has the power to manipulate what goes down the pipe. Alternatively he could simply act and then use the delays inherent in NEB processes to his advantage.

To conclude I would point out that I have followed the back-and-forths between Dr. Leach and Mr. Kenney and by my count Dr. Leach is ahead by a landslide if not a TKO. Given a discussion between the two I would place my bets on Dr. Leach when it comes to this topic.

Removed paragraph discussed above

Politically, $3/L gasoline could well end the tenure of the BC NDP/Green government. Currently the BC NDP/Green government has a very slim majority in the Legislature. BC has recall legislation that allows recall petitions beginning 18 months after the last election. Several swing ridings in Surrey were won by the NDP, reportedly on the strength of the promise to reduce bridge tolls (making commuting cheaper). Meanwhile Diane Watts, who still has a powerful political machine in Surrey, is sitting on the sidelines biding her time. Consider what would happen if, 18 months after the election, gasoline was at $3/L because a new Jason Kenney government had been true to its word. Do you not expect 3-4 “spontaneous” recall petitions starting up in key swing ridings in Surrey? That is the political game being played here. Anyone who analyzes Jason Kenney’s plans while ignoring that he is a career politician who thinks this way is completely missing the point. The best way for Jason Kenney to get the pipeline built is to change the government of BC, and doing that would simply involve reducing the flow of gasoline to the coast and watching the political dominoes fall.

Posted in Canadian Politics, General Politics, Pipelines, Trans Mountain, Uncategorized | 6 Comments

On questionable science about fugitive emissions in the BC natural gas industry

I was mostly off-line earlier in the year when two publications came out on fugitive emissions in the British Columbia natural gas industry. These two publications were:

Mobile measurement of methane emissions from natural gas developments in northeastern British Columbia, Canada by Atherton et al., 2017 and

Fugitives in Our Midst: Investigating fugitive emissions from abandoned, suspended and active oil and gas wells in the Montney Basin in northeastern British Columbia by the Suzuki Foundation (primary author Dr. John Werring).

These two publications were hailed by opponents of the BC LNG industry as destroying the “myth” that BC LNG could be used to help reduce global carbon emissions. I discussed why I believe BC LNG is good for the planet in my previous blog post On the global climate change math supporting BC LNG. Coincidentally, these two publications came out just when a different article (Country-Level Life Cycle Assessment of Greenhouse Gas Emissions from Liquefied Natural Gas Trade for Electricity Generation by Kasumu et al., 2018) appeared in the peer-reviewed press supporting the thesis that BC LNG can help reduce global greenhouse gas (GHG) emissions. Since that paper is behind a paywall, you can get a taste of its contents in this Business in Vancouver article: BC LNG exports would have net GHG reductions: study, or this Intelligence Memo from the C.D. Howe Institute.

Given the importance of this topic I have decided to look more deeply into these two fugitive emissions studies. While Dr. Weaver (leader of our BC Green Party) likes to refer to the Atherton paper as the St. Francis Xavier Study, it is important to recognize that both publications were funded, in whole or in part, by the Suzuki Foundation, which is a strong opponent of BC’s LNG industry. Unsurprisingly, the results of these studies were seized on by politicians like Dr. Weaver and by activists fighting against the BC LNG industry. My conclusion from reading these publications is that both have significant flaws and neither represents, in my mind, science that should be considered in any serious policy discussion.

Let’s start with why I called them “publications”. Both were, arguably, “peer-reviewed”. The Atherton paper was peer-reviewed as an open-access article, so its peer reviewers were 1) self-selected, 2) not necessarily experts in the area under consideration, and 3) unable to exert any authority over the process (i.e., the authors could choose to ignore them). Dr. Werring’s work doesn’t even rise to that level. It was published independently by the Suzuki Foundation and doesn’t pretend to represent a balanced report; rather, it appears to be part of an anti-LNG campaign by the organization and has to be viewed in that light.

Since it is the more serious of the two, let’s start with the Atherton paper. It presents a novel approach to estimating fugitive methane emissions: the authors installed an “ultraportable greenhouse gas analyzer” (UGGA) in a vehicle, drove it around northeastern BC, and then used their results to extrapolate an emission volume for the entire Montney formation. Now there are some incredibly significant issues with this paper that would have had any editor at a conventional journal pulling out their hair, but given the limitations of this blog post I will only highlight the three most critical points.

We need to begin by considering how the authors generated their minimum detection limit (MDL). This is an incredibly important number for this assessment because it serves as a multiplier in almost all their total emission volume calculations. In this study they took their MDL, multiplied it by the percentage of facilities where they detected emissions, and then multiplied that number by the total number of facilities in the Montney. Thus, if the MDL was flawed, the resulting conclusion would be equally flawed.

The challenge with the instrument used (the UGGA) is that it is intended to be used in a stationary position when taking a reading; the amount of air it takes in matters for calibration purposes. In this study, the instrument was operated while driving around the countryside. To address the fact that the instrument was not being used according to its design specs, the authors reportedly did a bench experiment and came up with a correction factor for their instrument readings (a multiplier of 3.3 times). The problem is that they don’t present any details about how they derived that multiplier or the range of values generated in the process. They simply report that their “mean level of dilution was about 70%“. They include no error bars or any consideration of uncertainty; they just multiply their observed values by 3.3. This number really matters: it is critical to establishing the total emission volume estimates, and later in the publication the authors report numbers to six significant figures derived from an initial value of “about 70%”. Put another way, their headline volume estimate could be 3.3 times smaller and we, the readers, would have no way of knowing. Since the conclusion of the report was that total emitted volumes are 2.5 times higher than officially reported, this factor of 3.3 is pretty darn significant. By omitting the critical calibration data, the authors leave us to simply trust that they got the number right. Most confusingly, this is passed off as a minor issue barely worthy of consideration.
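To make the dependence concrete, here is a minimal sketch of an extrapolation with this shape. Only the 3.3x correction factor comes from the paper; every other input is a placeholder I invented for illustration. The point is structural: the estimate is linear in the correction factor, so any error in the 3.3 multiplier flows straight through to the headline number at the same ratio.

```python
# Sketch of the extrapolation structure described above. Only the 3.3x
# correction factor is from the paper; all other numbers are hypothetical.

CORRECTION_FACTOR = 3.3  # the paper's mobile-sampling correction multiplier

def total_emission_estimate(raw_mdl_g_per_s, detection_rate, n_facilities,
                            correction=CORRECTION_FACTOR):
    """Corrected MDL x fraction of facilities detected x total facility count."""
    return raw_mdl_g_per_s * correction * detection_rate * n_facilities

# Hypothetical inputs for illustration only:
raw_mdl = 0.6        # g/s, uncorrected minimum detection limit (invented)
detected = 0.47      # fraction of facilities with detections (invented)
facilities = 1000    # facilities in the formation (invented)

with_correction = total_emission_estimate(raw_mdl, detected, facilities)
without_correction = total_emission_estimate(raw_mdl, detected, facilities,
                                             correction=1.0)

# The estimate scales linearly with the correction factor, so an error in
# the 3.3x multiplier changes the headline number by exactly that ratio.
print(round(with_correction / without_correction, 1))  # -> 3.3
```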

Taken alone, the absence of calibration information should inject a serious level of doubt about the publication, but then you have to consider how the authors addressed “surveyed facilities”, which make up almost a third of all the emissions reported in the publication. In the study they use modelling to justify why we should consider their numbers valid. In doing so they acknowledge uncertainty about their readings from “surveyed facilities” (collection facilities etc.) since those facilities don’t match the conditions of their model. As a consequence, they decided to ditch their established methodology altogether and use a number essentially picked out of thin air:

For this reason, instead of actual measured MDLs we used previously published natural gas facility emission volumes of 2.2 g/s (Omara et al., 2016),

Now this is a pretty big deal because, if you look at the study, almost 35% of the reported Montney emissions are based on this assumption. This caused me to go look at the Omara paper and I admit I am confused. I have read the Omara paper forwards and backwards a half-dozen times and do not see where they get that number. The paper doesn’t even look at production facilities; it uses that term to describe multi-well pads. The paper is also studious about not providing individual numbers; it provides ranges with confidence intervals. Finally, the paper addresses facilities in the Marcellus Basin. Natural gas in the Marcellus formation is renowned for being low-sulfur (sweet), which means that leaks from its facilities are bad but not life-threatening. BC and Alberta natural gases are known for being high-sulfur (sour), and as such the facilities have to be much tighter because leaks can be fatal to the employees who visit them. This is a huge deal: the facilities simply aren’t comparable. To add insult to injury, this issue was raised by one of the “peer reviewers” and the response by the authors was that:

“We will seek to verify the definition of “facility” with Omara and perhaps a corrigendum can be issued that clarifies.”

They were told that one third of their total was based on an incorrect figure and instead of pulling back the paper to confirm their estimate they simply said they might have to offer a correction sometime later?

The third issue with this paper was raised by one of the peer reviewers (Tony Wakelin from the BC Oil and Gas Commission (BCOGC)) and was essentially ignored by the authors. Yes, that is right: the BCOGC decided to have one of its people review the article. Needless to say, the BCOGC, as the regulatory body that controls the database used by the authors, could tell the authors a lot about what the coding used in the database means. For instance, this is what the BCOGC told the authors the term “cancelled” facilities meant in the database:

“Cancelled” means the well permit expired without drilling commencing. So these wells do not physically exist in the field and can not be attributed to the release of methane.

Why is this important? Well, the authors attributed pretty significant emissions to those cancelled facilities, which is a physical impossibility.

The BCOGC also raised the calibration issue (the 3.3 times multiplier I mentioned above) and the authors provided this reply:

The primary purpose of the paper was to determine emission frequencies, not to create a highly accurate volumetric inventory

Read that again. The intention of the paper was not to create a “highly accurate volumetric inventory”…except that the HEADLINE CONCLUSION of this publication is a volumetric inventory. The only reason people are talking about this paper is that it provides a number (a volumetric inventory) for fugitive emissions in the Montney formation. How can the authors say that was not the purpose when it is literally the only reason anyone talks about the paper?

Finally let’s look at the final comment from the BCOGC:

The fact significant quantities of emissions were attributed to wells that do not exist (i.e. 25 per cent of cancelled wells were reportedly emitting) calls into question the accuracy and validity of the discussion paper. Also, the basis for determining emission factors used in this discussion paper is highly questionable – therefore, this study should not infer that the estimates constitute an emission inventory that could be compared with what is reported under the Greenhouse Gas Emission Reporting Regulation.

Yes, you read that right: in the final table (Table 2), 11.5% of the total emissions reported in the survey were assigned to wells that WERE NEVER DRILLED! These numbers were then extrapolated so that, magically, 1989 of these imaginary non-wells supposedly emit 12,948 tonnes of methane a year into the atmosphere. Can you imagine a professional editor letting a paper go into their journal when 11.5% of the volumetric assessment is simply imaginary? The response of the authors…crickets.
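As a quick sanity check on those figures (my own arithmetic, not anything from the publication), the attribution works out to a substantial continuous leak from every one of those nonexistent wells:

```python
# Per-well implication of the numbers quoted above: 1,989 never-drilled
# wells assigned a combined 12,948 tonnes of methane per year.

n_cancelled_wells = 1989
tonnes_per_year = 12_948

per_well_tonnes = tonnes_per_year / n_cancelled_wells
print(round(per_well_tonnes, 2))   # -> 6.51 tonnes/year per imaginary well

# Expressed as a continuous leak rate in grams per second:
seconds_per_year = 365 * 24 * 3600
per_well_g_per_s = per_well_tonnes * 1_000_000 / seconds_per_year
print(round(per_well_g_per_s, 2))  # -> 0.21 g/s, around the clock
```

In other words, each well that was never drilled is credited with leaking roughly 0.2 grams of methane per second, continuously, all year.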

To summarize, in this publication:

  • 53.5% of the reported emission volume calculations could be inflated by as much as 3.3 times;
  • 35% of the reported emission volume calculations result from an extrapolation based on a number (2.2 g/s) that never appears in the referenced document and, even if it did, would represent results from a geologically different formation in a different jurisdiction that produces sweet gas and thus has very different safety requirements; and
  • 11.5% of the reported emissions volume calculation is from wells that were never drilled in the first place.
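Putting those three criticisms together in a back-of-envelope calculation (my own worst-case arithmetic, not the paper's): if the MDL-based share were deflated by the unverified 3.3 factor and the never-drilled wells were dropped entirely, only about half of the headline volume would remain, and that is while leaving the questionable 2.2 g/s share untouched.

```python
# Back-of-envelope discounting of the headline volume using the shares
# summarized above. This is my own worst-case arithmetic, not the paper's.

share_mdl_based = 0.535      # could be inflated by up to 3.3x
share_surveyed = 0.35        # rests on the unsupported 2.2 g/s figure
share_never_drilled = 0.115  # wells that do not physically exist

lower_bound = (
    share_mdl_based / 3.3    # deflate by the unverified correction factor
    + share_surveyed         # left unchanged for lack of a better number
    + 0.0                    # imaginary wells contribute nothing
)

print(round(lower_bound, 3))  # -> 0.512
```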

I think it is pretty clear that this volume estimate represents problematic science to say the least and any expert reading the work would be shocked that a former scientist turned politician like Dr. Weaver would use these results to develop policy.

Now this post is already too long, but I still want to touch on the Suzuki follow-up study by Dr. Werring. I really don’t have much to say about this report because it starts off so wrong that it is hardly worth digging deeper. Why do I make such a bold statement? Here is how they describe their methodology:

On each survey day, our field investigators would drive to the chosen area for investigation and access any or all oil and gas facilities (wells, batteries, compressor stations), depending solely on whether they were accessible (whether or not the well site or facility was flagged as a possible emitter was only one determining factor). Accessibility was key. If a facility was behind a locked gate or if the property housing the facility was marked “No trespassing”, the facility was passed by, unless it could be reasonably and safely inspected using the FLIR camera from public roadways.

Now re-read that paragraph. Do you see the issue? The problem is called “sampling bias“: your sampling program over-samples a particular subset of the population relative to the population as a whole. To explain, here is a simple analogy. Suppose you decided to study the effect of drinking on incarceration. One Sunday morning you send a researcher to stand outside the municipal drunk tank just as the police are releasing the revelers from Saturday night. Your researcher dutifully interviews each reprobate about their drinking habits and their overnight incarceration. At the same time you send a different researcher to ask the same battery of questions outside the Mormon temple in Langley. If the two researchers interviewed the same number of people, your combined results might read something like: 50% of the people interviewed in our study spent the night in jail; 100% of the people who spent the night in jail were drinkers; and 100% of non-drinkers had spent no time in jail in the last year. These would be some pretty inflammatory results if you were trying to argue a connection between alcohol and incarceration, wouldn’t they? Almost like there was a bias in your sampling methodology.
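The drunk-tank analogy is easy to demonstrate numerically. Here is a toy simulation, with all rates invented for illustration: even when only about 12% of the whole population of sites is leaking, a survey restricted to the unsecured minority reports a leak rate several times higher.

```python
# Toy simulation of sampling bias (all numbers invented for illustration).
# Secured sites leak at 5%; the unsecured sites a roadside survey can reach
# leak at 40%. Surveying only the accessible sites badly overestimates the
# population-wide leak rate.

import random

random.seed(42)  # deterministic toy run

sites = []
for _ in range(10_000):
    secured = random.random() < 0.8       # 80% of sites have some security
    leak_p = 0.05 if secured else 0.40    # unsecured sites leak far more
    sites.append((secured, random.random() < leak_p))

true_rate = sum(leaking for _, leaking in sites) / len(sites)
accessible = [leaking for secured, leaking in sites if not secured]
surveyed_rate = sum(accessible) / len(accessible)

print(f"true rate: {true_rate:.2f}, surveyed rate: {surveyed_rate:.2f}")
```

The survey of accessible sites is a perfectly accurate measurement of the wrong population, which is exactly the complaint about the methodology quoted above.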

Now let’s look at the sampling program Dr. Werring used in his study. He only went to facilities that had zero security (not even a “No Trespassing” sign); his team did not enter facilities that had any security whatsoever. Do you honestly think that investigating only those facilities that are completely unsecured can be used to extrapolate to all facilities in the region? Do the folks from the drunk tank represent all the people of Langley? Of course not. The bias introduced by the sampling plan makes the results essentially meaningless. Certainly Dr. Werring can argue that the natural gas facilities of the cheapest producers (the producers too cheap to even put up a “No Trespassing” sign) leak at a high rate…but then you might expect that result: if you are too cheap to put out a “No Trespassing” sign, you are likely too cheap to maintain your wells effectively. Admittedly, Dr. Werring encountered a number of bad facilities that deserve to be inspected by the BCOGC, but branding an entire industry using results from a subset of its worst members represents bad science and should not be the basis of a policy decision.

To conclude, fugitive emissions are an important concern in establishing the effectiveness of BC LNG in reducing global GHG emissions, but anyone who relies on these two publications to establish an emissions inventory has made a bad choice. Both publications have easily recognizable flaws and neither should serve as a critical reference in any further policy assessment.



What are the real marine risks of the Trans Mountain Pipeline Expansion?

Every morning, starting around 7 am, the Spirit of Vancouver Island leaves its berth in Swartz Bay for its first run to Tsawwassen. On-board the Spirit are tens of thousands of liters of diesel fuel to run the ship for the day. On her car decks the Spirit carries around 400 cars and a dozen or more transport trucks, each carrying tanks of gasoline or diesel fuel. Starting in Swartz Bay, the Spirit sails through the incredibly tight shipping lanes of the Gulf Islands, through Active Pass (a notoriously treacherous passage), through the active shipping lanes of the Strait of Georgia (all home to the endangered J-pod of resident BC Orcas) to the Tsawwassen Ferry terminal situated near environmentally fragile eelgrass beds that provide a habitat for countless small fish and the protected Pacific flyway. The ferry carrying hundreds of cars and trucks and thousands of people makes this trip numerous times daily without the support of any rescue tugs. Even scarier are the hazardous goods runs they do late at night where, in the dark, through this treacherous route, the ferries transport tens of thousands of liters of goods too dangerous to transport with civilians on-board.

You might ask why I am talking about ferries. The answer is that, from a marine risk assessment perspective, this route is a nightmare. The potential risks to human health and the environment are almost countless: spills, collisions, narrow passages, charted and uncharted rocks, and engine loss are all potential outcomes of each trip, and yet, given this tremendous risk to human health and the environment, our government has not cancelled the run to evaluate its continued safety to the coastal marine ecosystem. Just look at this link a colleague provided me about the August 1970 splicing of the Queen of Victoria by the Soviet freighter Sergey Yesenin in Active Pass. Yet this week our provincial government announced that it is proposing a freeze on increases in the transportation of diluted bitumen (dilbit), partially based on the risks associated with the project. This caused me to think about risk and marine transport.

As I mentioned in my previous post, my job involves investigating and remediating contaminated sites. As part of my job I also carry out due diligence risk assessments to evaluate the risks posed by contaminated sites to human and ecological health. I evaluate risks every day, but not the way most people look at risk; I am responsible for putting a number on risk, or more specifically, putting a number on the hazard a chemical poses to ecological health in order to determine whether the risk is acceptable or unacceptable. There is an entire science to this task and I have spent a lot of time at this blog explaining how we do it. At the bottom of this post is a summary I have prepared that gives readers a chance to go through those posts at their leisure.

One of the first things you learn in studying risk assessment is that there is no such thing as an activity with zero risk. In everything we do we encounter risks. When we get in the car we put on our seat-belts; before our kids get on the ice they put on their helmets; before my daughters play soccer they put on their shin-pads. All of these are tools used to reduce the risks of typical day-to-day activities. Industrial activities are no exception. Pipelines run the risk of leaks, tankers run the risk of spills, and that is something we have to accept as part of living in a modern industrialized country. Using safety processes and procedures we work hard to minimize those risks, but we can’t eliminate them entirely. Unless our government has a plan to eliminate the use of fossil fuels virtually overnight, we will need to transport them, and pipelines are the safest way to do so overland. If the government succeeds in stopping the pipeline, all it will have done is increase our risk of a major fossil fuel spill. As for the absolute safest way to transport fossil fuels, that would be modern, double-hulled tankers.

Going back to the BC coast, while the BC Ferries pose a pretty significant risk, far more frightening, from a marine spill perspective, are the daily barge runs that move fuels from Vancouver and the refineries in the Puget Sound to keep Vancouver Island supplied with the diesel and gasoline necessary to keep its communities alive. These barges run on odd schedules, through good weather and bad and are never accompanied by two marine rescue tugs. Has our provincial government blocked the movement of these barges? Of course not! Even worse, look at those fuel barges going up the coast. Does the Nathan E Stewart ring a bell? As I have written previously, the provincial government has essentially ignored this risk for decades and failed to put in the money necessary to ensure a reasonable spill response. So when our current government says it wants to investigate expanded dilbit transportation (a hypothetical future risk) while ignoring a real, pressing and much more significant existing risk, you are left to wonder if it is really politics rather than a concern for the environment that is causing them to make this decision.

On another front, as I write this blog post the port of Vancouver is engaged in a public consultation process about plans to increase the size of Delta Port. This at a port that currently has approximately 23,000 ship movements a year and is looking to add an estimated 5000+ more ship movements if all the future upgrades are included. This dwarfs the 720 additional ship movements associated with the Trans Mountain expansion (TMX).

Now unlike the Port, the fuel barges or the BC Ferries, the NEB required a detailed risk analysis of the TMX. The critical document on this topic is the report Termpol 3.15 – General Risk Analysis and intended methods of reducing risk which evaluated the risks of the project. It concluded that “with effective implementation of risk reducing measures most of the incremental risk resulting from the project can be eliminated“. To put a number on it:

  • Without the project, the risk of a credible worst-case oil spill is estimated at 1 in every 3093 years…. If all the risk-reducing measures discussed in this report are implemented the frequency will be one in every 2366 years.
  • This means that after the Project is implemented, provided all current and future proposed risk control measures are implemented, the increased risk of a credible worst case oil spill in the study area from the Trans Mountain tanker traffic will be only 30% higher than the risk of such an occurrence if the Project did not take place.

By increasing the number of tankers seven-fold, but also implementing the changes that were ultimately mandated by the NEB, the risk of a spill remains less than one event every 2000 years. So no, the risk does not increase by 7 times; it increases by barely 30%, and 30% more of almost zero remains almost zero. Essentially, they are saying that the project provides no significant increase in risk over the risks we accept every day (what I refer to as a de minimis risk below). In exchange for a negligible increase in risk we get economic prosperity and the economic health and goodwill of our neighbouring provinces. The dollars generated by this project are what pay for our health care and social services.
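The "barely 30%" figure follows directly from the two Termpol frequencies quoted above:

```python
# Relative risk increase implied by the Termpol 3.15 figures quoted above:
# a credible worst-case spill every 3093 years without the project versus
# every 2366 years with the project and all mitigation measures in place.

freq_without = 1 / 3093   # expected spills per year, no project
freq_with = 1 / 2366      # expected spills per year, project + mitigation

relative_increase = freq_with / freq_without - 1
print(f"{relative_increase:.1%}")  # -> 30.7%
```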

Certainly the government could try to make a case that the risks posed by the TMX (one accident every 2000+ years) are too high for the benefits incurred. But that is not the argument they, or the opponents of the pipeline, have been making. They argue that BC should not incur any risk to sustain our current level of prosperity. The problem is that our current level of prosperity is a direct result of our national union. To suggest that we accept no risk, in a world where we balance every other risk out there, is simply not a legitimate argument to make. Arguing that the TMX poses too much risk while simultaneously refusing to fund improved spill response on the central BC coast is the epitome of hypocrisy. It shows that the ban is not risk-based but simply political in nature. The opponents of the pipeline need to enumerate the risks and explain why the de minimis increase in risk associated with the pipeline is not worth the improvement in quality of life it provides to British Columbians and Albertans alike.

Addendum on Risk and Toxicity

I have written a lot at this blog about how risk is communicated to the public and I have prepared a series of posts to help me out in situations like this. The posts start with “Risk Assessment Methodologies Part 1: Understanding de minimis risk”, which explains how the science of risk assessment establishes whether a compound is “toxic” and explains the importance of understanding dose/response relationships. It introduces the concept of a de minimis risk: a risk that is negligible and too small to be of societal concern (ref). The series continues with “Risk Assessment Methodologies Part 2: Understanding “Acceptable” Risk” which, as the title suggests, explains how to determine whether a risk is “acceptable”. I then go on to cover how a risk assessment is actually carried out in “Risk Assessment Methodologies Part 3: the Risk Assessment Process”. I finish off the series by pointing out the danger of relying on anecdotes in a post titled: Risk Assessment Epilogue: Have a bad case of Anecdotes? Better call an Epidemiologist.


On dilbit, oil spill response and political gamesmanship

As many of my readers know, my day job involves investigating and remediating contaminated sites. My particular specialty is the investigation and remediation of petroleum hydrocarbon impacts [and before anyone asks, no I have never worked for Kinder Morgan nor do I have any conflicts of interest associated with the Trans Mountain file]. I have a PhD in Chemistry and Environmental Studies and have spent the last 18 years learning how hydrocarbons behave when spilled into the natural environment. This post comes out of my surprise at the provincial government’s announcement that it is proposing a freeze on increases in the transportation of diluted bitumen (dilbit) until it can create an independent scientific advisory panel to help address the scientific uncertainties outlined in the Royal Society of Canada (RSC) Expert Panel report The Behaviour and Environmental Impacts of Crude Oil Released into Aqueous Environments. The reason for my surprise is that, unlike most British Columbians, I have read the RSC report, and the uncertainties it expresses are not going to substantially change how spill response is planned or carried out on the west coast of BC.

Before we go further I am going to make a blanket statement. It is my belief, informed by my years of study and practical experience in the field, that we know enough about how diluted bitumen (dilbit) will behave when spilled to design a world-class spill response regime. Why do I make such a statement? Let’s start by dispelling some myths. The first thing to understand is that virtually everything the activists (and certain politicians) tell you about dilbit is wrong. I have previously described the state of the research on dilbit and its behaviour in marine environments. To summarize, the research shows that dilbit behaves pretty much like other heavy oils in a spill scenario. In the marine environment dilbit floats, which is understandable given that it is a non-polar liquid with a density less than that of seawater. In freshwater environments, the blend of dilbit and the length of time after the spill event will define how it behaves. Sometimes it will float for days on end and sometimes it will float for only a few days and then sink, as this graph from the National Academies of Sciences (NAS) report on the subject shows:

NAS study Fig 3-2

Whether it sinks or floats is something you can predict once you know the blend of the dilbit (see the difference between the two blends in the figure) and the conditions in the freshwater environment. If the dilbit spills into really silty water (either fresh or seawater) it will form oil-particle aggregates (OPAs) which, under certain conditions, will sink to the bottom. In other cases it gets entrained in the water column (and thus can become harder to clean). Once again this is almost exactly what all heavy crude oils do in the same conditions. To summarize, dilbit is not some existential threat to humankind, it is like many other heavy crude oils out there. When spilled it will cause a mess, but no bigger mess than a similar heavy oil. The world community has lots of practice and knowledge about how to handle heavy oil spills. This is a topic about which a huge amount of time, money and intellectual energy has been directed. The expertise of the world community can be used to inform our spill response.

At this point I can hear the activists saying: but what about the RSC report? To them I say: try actually reading it. You see, the report addresses all crude oil spills, so very little of its content refers to dilbit per se. This is an important distinction because most of the recommendations for further research proposed by the RSC are general in nature and reflect all oils, not just dilbit. Let’s look at what the Executive Summary says are the “high-priority research needs”:

High-Priority Research Needs Identified by the Expert Panel
1. Research is needed to better understand the environmental impact of spilled crude oil in high-risk and poorly understood areas, such as Arctic waters, the deep ocean and shores or inland rivers and wetlands.
2. Research is needed to increase the understanding of effects of oil spills on aquatic life and wildlife at the population, community and ecosystem levels.
3. A national, priority-directed program of baseline research and monitoring is needed to develop an understanding of the environmental and ecological characteristics of areas that may be affected by oil spills in the future and to identify any unique sensitivity to oil effects.
4. A program of controlled field research is needed to better understand spill behaviour and effects across a spectrum of crude oil types in different ecosystems and conditions.
5. Research is needed to investigate the efficacy of spill responses and to take full advantage of ‘spills of opportunity’.
6. Research is needed to improve spill prevention and develop/apply response decision support systems to ensure sound response decisions and effectiveness.
7. Research is needed to update and refine risk assessment protocols for oil spills in Canada.

Notice the terms used: “better understand”, “increase the understanding”, “improve”, “update”. These aren’t the terms of a group that knows nothing about a topic; these are the terms of a group that wants to see incremental gains to our knowledge base. Why is this important? Because our provincial government has not banned the movement of ALL oil products, even though the RSC report expresses concerns about all oil products. The provincial government has only expressed concern about one type of oil: dilbit. There is a serious disconnect here. It is almost as if the ban is not based on science but is instead a bit of political gamesmanship.

That being said, research has advanced since the RSC report was published. What does the current research say? As Natural Resources Canada research scientist Dr. Heather Dettman points out:

light crude, a low-viscosity oil, may actually be more hazardous [than dilbit]. When light crude hits water, it’s “like adding cream to coffee. That’s it. It’s all mixed in, it gets stuck in the sediment.”

Dr. Dettman pointed to the 2010 dilbit spill in Michigan’s Kalamazoo River.

“It looks ugly and it’s not good for the fish. But because it’s there you can see it, you can pick it up, and then it’s gone,” she said. “We get a very high recovery rate.”

Here is a longer discussion with her on the radio. As Dr. Dettman points out, we have practical experience handling a dilbit spill in the Burrard Inlet, and the results were heartening. Virtually no OPAs were formed and most of the dilbit was recovered. This was a lucky spill, however, as it occurred in an area with limited wave action and no storms. Now let’s go back to Dr. Dettman.

Dr. Dettman said she and her team have built substantially on the body of dilbit research since the Royal Society report was released three years ago. Their experiments – performed in an open tank filled with fresh North Saskatchewan River water – show that various blends of diluted bitumen won’t sink until the sludge has been left alone for at least 21 days, she said.

Even then, she added, only one type of bitumen found its way to the tank floor, even in warm conditions.

She said the data seem to indicate diluted bitumen tends to form a hardy slick on the water’s surface – a spill that can be somewhat contained – rather than dissolve into the water and end up coating riverbeds and marine life.

“The misinformation is that diluted bitumen will sink,” Dr. Dettman said. “But it’s not sinking.”

The reality of the situation is that any oil spill, be it crude oil or diluted bitumen, represents a tragedy and a catastrophe. It will harm the natural environment, will kill some marine organisms, and will be very hard to clean up. The point of this blog post is that a diluted bitumen spill would not be a uniquely catastrophic situation. It would be comparable to a spill of any other heavy crude…you know, the products that have been safely shipped in and through the Salish Sea for the last 50+ years. Banning the transport of dilbit until we have done more research has no basis in science; it is a political game. Any “independent scientific advisory panel” will end up concluding that we have the information needed to design a world-class spill regime. Anyone who says otherwise is either not aware of the state of research in the field of spill response or has a political axe to grind. You can decide where our current government and the anti-pipeline activists stand on this topic.


Posted in Oil Sands, Pipelines, Trans Mountain

On that UBC Report comparing job numbers and the Site C Dam

My Twitter feed has been alive with news of a “new UBC Report” that, according to its author, Dr. Karen Bakker from the UBC Program on Water Governance, “concludes stopping Site C will create a larger number of sustainable jobs in the province“. This report has been cited in multiple locations in the last week, so I have been hoping to get a chance to dig into it. Having done so, I am, once again, surprised at how easily questionable research can dominate the conversation on the Site C file. It was even cited in the Legislature. The selling point of the report (at least from what I have seen on Twitter) was that it showed that the BCUC Alternative portfolio produced 5 times more jobs in renewable energy than are expected to be produced by Site C. What I found through my research was that the majority of the jobs “created” by the BCUC Alternative portfolio would likely never actually appear and that the costs for the renewables portfolio promoted by the anti-Site C activists would likely be much higher than previously suggested. The rest of this blog post will examine why, in my opinion, the results from Dr. Bakker’s study are unreliable.

The report itself is actually an unsigned briefing note and an accompanying spreadsheet analysis (link opens an Excel file). In the briefing note we are presented with an employment table with three columns of employment numbers: one for the BCUC Alternative portfolio, one for the BC Hydro Alternative portfolio and one for Site C Continued. The take-home message is that the cumulative person-years or “jobs” modeled for the BCUC Alternative portfolio come to 208,498 person-years by 2094 while Site C only generates 40,578 person-years by 2094. This is where the headline “5 times the jobs” claim comes from (actually 5.13, but who is counting).

Looking at the spreadsheet we find the basis for this table in the tab “Comparison-LLF”. The BCUC numbers are derived from the tab “BCUC-LFF”. On that tab we find the “Employment Summary-All Resources (by year)” table starting at column T. Looking at this spreadsheet we can see where all the jobs are coming from. Adding up the columns we discover that the 208,498 person-years for the BCUC Alternative portfolio is made up of the following subtotals:

  • 10,296 person-years from decommissioning Site C (5% of the total)
  • 183,600 person-years from demand-side management (DSM) programs (88% of the total)
  • 14,602 person-years from the various wind projects (7% of the total)

Looking at this you immediately recognize that, contrary to what the people on Twitter have been claiming, renewables, on their own, do not generate more jobs than Site C. Rather, according to the briefing note, Site C generates 40,578 person-years, almost three times as many as the actual wind projects.
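
For readers who want to check the arithmetic, here is a minimal sketch using only the person-year figures quoted above from the briefing note:

```python
# Person-year subtotals for the BCUC Alternative portfolio,
# as summed from the "BCUC-LFF" tab (figures quoted above).
decommission_site_c = 10_296   # decommissioning Site C
dsm = 183_600                  # demand-side management programs
wind = 14_602                  # wind projects

bcuc_total = decommission_site_c + dsm + wind
site_c_total = 40_578          # Site C Continued, per the briefing note

print(bcuc_total)                           # 208498, matching the headline total
print(round(bcuc_total / site_c_total, 1))  # 5.1 -- the "5 times" claim
print(round(site_c_total / wind, 1))        # 2.8 -- Site C vs. the wind projects alone
```

The headline ratio only works when 88% of the portfolio's jobs come from DSM rather than from renewables.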

Looking at this table one recognizes an obvious initial flaw of the model with respect to the wind jobs. According to the model the wind facilities will generate a lot of jobs during their initial construction phase (for the Wind-PC18 column, Column X, that would be 310 person-years per year from 2034 through 2038). The authors then, confusingly, have those facilities providing a constant number of maintenance jobs (52 person-years per year) between 2039 and 2094 (over 55 years).

Anyone who has studied wind energy projects understands that wind facilities are not designed to operate for 55+ years. The typical wind project has a 20-25 year lifespan, at which point the turbines have to be decommissioned and new turbines installed if the power is still needed. From a jobs perspective this would significantly increase the number of jobs generated by the wind projects. Tearing down old turbines and building new ones would generate a lot of person-years of work, and yet these jobs are completely missing from the analysis. While this necessary component of a wind project life cycle would represent a good thing for the people touting renewables as a job builder, it poses a bit of a quandary for the anti-Site C activists.

The problem is that many of their cost calculations omit the need to decommission and rebuild facilities three or more times over the period Site C will be in service. To effectively replace Site C you have to triple the number of turbines AND include the cost to decommission each generation of obsolete turbines at the end of each life cycle. By tripling the number of turbines and incorporating decommissioning costs, the cost of those replacement renewables roughly triples. This is a fatal flaw in the cost calculations produced by the anti-Site C activists. I have been banging this drum for a while but, confusingly, have heard little from the supporters of Site C on this obvious error in price calculations. A single generation of wind turbines cannot replace a dam intended to operate for 70-120 years, yet this is what the modelers assume in their analysis.
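
To illustrate the scale of the omission, here is a rough sketch. The 25-year turbine lifespan comes from the discussion above; the 75-year comparison horizon and the decommissioning fraction are my assumptions for illustration (the dam is intended to operate 70-120 years):

```python
import math

# Illustrative only: how many turbine generations are needed to span a
# dam-length horizon. The 25-year lifespan is discussed above; the 75-year
# horizon and the decommissioning fraction are assumptions, not report figures.
dam_horizon_years = 75
turbine_life_years = 25
decommission_fraction = 0.1   # assumed tear-down cost as a share of build cost

generations = math.ceil(dam_horizon_years / turbine_life_years)
print(generations)  # 3 build/decommission cycles over the horizon

# Like-for-like capital cost relative to a single build of cost C:
cost_multiplier = generations * (1 + decommission_fraction)
print(cost_multiplier)  # roughly triple the single-generation cost
```

Under these assumptions a wind-for-Site-C swap costs roughly three times the single-generation price, which is the tripling described above.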

The BCUC’s actual Alternative Portfolio spreadsheet does partially account for this cost but makes a number of odd assumptions, including reducing the costs for refurbishing facilities (by 30% of original costs); omitting any decommissioning costs; and, most strangely, assuming that operations and maintenance costs go down over the lifespan of the facilities. I’m not sure about you, but I’ve found that as systems age they need more maintenance, not less.

Going back to the briefing note, I have no issue with the person-years associated with decommissioning Site C in the BCUC Alternative portfolio, but I have a very serious issue with those 183,000+ jobs attributed to demand-side management (DSM).

You might ask where Dr. Bakker and her colleagues got that huge number. Well according to the Briefing Note:

According to a study carried out for BC Hydro, spending on conservation or demand-side management (DSM) programs creates 30 jobs per $1M spent. 1

That “1” identifies Footnote 1: the Power Smart Employment Impacts DSM Programs, Rates and Codes and Standards, F2008 to F2037 (citing p. iv). Any time a footnote references a Roman numeral, the authors are sending you to the executive summary of the cited report. A good scientist never relies on the executive summary of a report because the executive summary typically does not contain any of the provisos from the body of the work itself. Looking at the Power Smart report, we do see in the executive summary that “employment intensity” for Power Smart DSM was an estimated 34.4 person-years per million dollars spent. The question is: where does that number come from?

Reading the report, the 34.4 person-years of employment per million dollars spent on DSM includes two components: Investment Effects and Re-Spending Effects. Investment Effects are described as:

These expenditures are required to implement the energy saving measures and are comparable to the construction expenditures and employment from supply-side projects. As depicted in Figure 4-1, investment employment ramps up over the initial years of the DSM Plan and achieves a plateau until F2028. Thereafter, it falls fairly rapidly with expenditures, but the decline is mitigated because some projects will be completed and paid out after F2028. Overall, the pattern of investment employment directly follows the expenditure pattern.

That is not to say these are all direct jobs associated with spending all those millions. As described in the glossary, Investment Effects include:

Direct, indirect and induced employment estimated from the initial DSM investment expenditures in programs, rates, codes and standard measures.

Thus the original employment boost involves direct, indirect and induced employment. The other half is the “Re-Spending Effects”, which are described as:

The employment impacts from re-spending activity are estimate at 50,900 PYs [between 2008 and 2037] and are created as a byproduct of the economic benefits associated with the DSM expenditures (see Figure 4-1). Since these employment benefits continue, driven by the ongoing energy bill savings, they can be likened to the operation and maintenance employment from supply-side projects.

Did you get that second type? If not, let’s look at the glossary [Definitions], which describes “Re-Spending Effects” as:

Direct, indirect and induced employment estimated from consumers’ re-spent electricity bill savings in the economy.

These two definitions should be raising red flags all over the place. First and foremost, the Site C numbers are direct jobs, not “induced or indirect employment”. Thus the comparison is apples to oranges. Moreover, Re-Spending Effects can only happen if you have substantial reductions in consumers’ hydro bills associated with the DSM program, and that is a huge assumption on the Site C file. As has been repeatedly reported, if the project is scrapped then the sunk costs have to be recouped, reportedly via a 10% surtax on everybody’s hydro bills, which would also help cover the decommissioning costs. That surtax will eat up any initial savings the consumer might see from shutting down the project. It is unclear how people will re-spend savings that they never receive in the first place.

Moreover, as I described in my previous blog post, the only way for DSM to get us where we need to be [dramatically reduced electricity demand] is by substantially increasing hydro bills. That is how DSM works: you make electricity more expensive to encourage consumers to use less. This brings up a rule-of-thumb I was taught as a student about DSM programs. Consumers generally know how much they are willing to spend on household bills like hydro. If you increase the price of a household budget line item (electricity in this case), consumers will work to drop demand until they are spending about the same amount as they were previously. If the price rises substantially, they are often willing to spend a bit more to maintain their quality of life but will not make massive changes unless there is a big upside in savings. What this means is that the price increase will likely drop demand but will not likely have a major effect on consumer hydro bills. Consequently, the DSM measures will not result in reduced hydro bills that can be used to generate Re-Spending Effects.

The absence of re-spending of non-existent savings becomes a serious consideration in the BCUC Alternative model because, according to the Power Smart document, the direct jobs (the Investment Effects) associated with continued investment in DSM quickly disappear. In the Power Smart reference, Figure 2-1 shows that under a steady investment state the employment increases associated with Investment Effects essentially disappear about ten years into the spending cycle. By their measure only 50,900 person-years are generated regardless of how long the money is spent, because after that time the monies go to established programs rather than to the many people needed to set those programs up. For the UBC modelers this is a problem: they assume that 183,600 person-years will be generated by DSM between now and 2094, but if the Re-Spending Effects don’t appear then about 132,700 of those jobs simply vanish from the equation. Take those 132,700 person-years out and all those talking points go away. Instead of the BCUC Alternative portfolio generating 208,498 person-years it only produces 75,798, which is barely double what Site C generates. Given the mushiness of the assumptions used in this report, those two numbers may as well be the same.
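
The effect of removing the Re-Spending Effects can be checked directly from the figures above:

```python
# Re-running the briefing note's totals with the Re-Spending Effects removed.
bcuc_total = 208_498        # BCUC Alternative portfolio, person-years to 2094
dsm_total = 183_600         # of which DSM accounts for this much
investment_effect = 50_900  # Power Smart estimate of Investment-Effect person-years

respending = dsm_total - investment_effect  # jobs that depend on re-spent bill savings
adjusted = bcuc_total - respending

print(respending)                   # 132700 person-years hinge on savings never received
print(adjusted)                     # 75798 person-years once they are removed
print(round(adjusted / 40_578, 1))  # 1.9 -- barely double the Site C figure
```
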

Looking at the briefing note I remain amazed at how easily the public can be swayed by someone with a fancy title and a complicated spreadsheet. Like the Swain model, once you dig into the numbers it becomes increasingly clear that the output of the model is entirely dependent on the input assumptions and that, in this case, the input assumptions are demonstrably faulty. Wind farms don’t last for over 55 years; they last closer to 25, and if wind is going to replace Site C then you need to account for the equivalent time frame for a fair comparison. On the DSM front, a modeling exercise that assumes 64% of your total jobs will be derived from people spending savings they never obtained is not a good thing. Finally, for the anti-Site C folks who keep proclaiming that renewables will make up for the jobs from Site C, the briefing note makes it clear that the actual renewable job numbers don’t come close to the number of jobs generated by Site C. On the positive front, as I have pointed out numerous times at this blog, Site C will not come close to supplying the energy we need to electrify our transportation sector. As such we will need DSM (and all the ensuing jobs) in addition to Site C if we are to meet our Paris Agreement energy commitments.


Posted in Site C, Uncategorized

Reviewing the demand estimates used by the opponents of Site C

This week Business in Vancouver (BiV) printed an article about the Site C project titled B.C. might not need any additional wind power either, which included a number of quotations from Dr. Harry Swain, a gent with whom I have disagreed on the topic of Site C. In the article Dr. Swain stated that BC has all the power-generating capacity it needs for the next 20 years and therefore does not need Site C. He indicated that the basis of his claim was his modelling on the topic. This led me to wonder what was in that model and how the opponents of the Site C dam were able to generate numbers that run completely contrary to both my findings and those of BC Hydro. This blog post examines that question and demonstrates, once again, the importance of looking at the underlying data used in environmental decision-making. By the time you finish reading this blog post I think you will agree with me that the modelling used by the opponents of Site C is flawed and not worthy of consideration in the Site C debate.

To begin I had to get the model described in the article. As many of you know Dr. Swain et al. presented a forecast model to the BCUC (link downloads an Excel model from BCUC website). I downloaded that model a while back and discovered that the critical inputs were not included in the spreadsheet but rather referred back to a secondary spreadsheet called 2016-2036-Forecast-w-Revised-Trade-BCUC-RB-Eoin-Finn-Oct18.xlsx. At that point I was stymied, since I didn’t think that Dr. Swain or his colleagues would give me a copy of their model. However, when the BiV article came out I asked the article’s author if he had been given any information to support Dr. Swain’s claims. The response was a copy of Dr. Swain’s updated model titled BC-Hydro pro-forma 2017-37Rev5SiteC.xls. Moreover, much to my surprise I was informed that Dr. Swain had agreed to the release of the file to me. This represents a level of professional courtesy that was much appreciated and hopefully represents a step towards working together to meet BC’s continued energy challenges.

Now as a first note, I will point out that the 2017-37 model differs in output from the 2016-36 model used in the BCUC submission. As I do not have the earlier model I cannot see the differences between the two, but I can point out what I view as limitations of the 2017-2037 model (called the Model hereafter) and explain why I believe it does not present a reasonable estimate of future demand in BC.

In the BiV article Dr. Swain stated:

With the modelling that I did, I assumed – as BC Hydro did – that the population is going to increase, that GDP will increase

While that is strictly true (the Model has a correction for inflation), it does not tell the whole story. According to BC Statistics, British Columbia’s population is expected to rise from 4.8 million in 2017 to 5.9 million in 2037, an increase of 23% over the time covered by the Model. The problem is that the Model does not address that population growth directly. Rather than looking at per capita demand, it simply assumes that demand will grow or decline using the average of historic residential demand growth between 2006 and 2016. The choice of dates is very significant since it includes the market crash of 2008-2009, which caused a contraction in our economy and associated energy use. The inclusion of the recession in the input number results in lower growth in residential demand for the entire time frame covered by the Model.

Moreover, looking at the residential demand estimates I identified a number of further critical flaws. The time period under consideration (2006-2016) was one in which BC Hydro carried out intense demand-side management activities that temporarily de-coupled residential energy growth from population growth. To further lower future residential demand the Model includes an elasticity factor (addressed later) which is sufficiently high to essentially eliminate demand increases associated with population growth over the last 10 years of the modelling period (2027-2037). According to projections the population of BC is supposed to grow by 550,000 souls between 2027 and 2037, but under the Model residential energy use stagnates during that period. In total, the Model projects residential demand to increase by 7% over the entire 20 years. Not 7% per year, but 7% over the period from 2017 to 2037, even as the population increases by 23%.

On the commercial demand side the Model has similar flaws. As I noted in my previous blog post on electricity demand, commercial demand in BC pretty much mirrors GDP growth. The Model has commercial demand increasing by only 3% between 2017 and 2037. In a province with 23% population growth, the service and commercial parts of our economy are only going to grow by 3%? This is simply not a reasonable assumption.

On the industrial side it gets even worse. I don’t look forward to living in the British Columbia the Model projects for 2037, as we will have no industrial base to pay taxes to fund our services. The Model has industrial use dropping by 66% over the 20-year period, projecting total industrial demand at 4,431 GWh in 2037. According to BC Hydro statistics, mining alone used 3,800 GWh in 2017. The forestry sector used around 6,800 GWh, with pulp and paper accounting for about 4,400 GWh of that. Think of it: under the Model, in 2037 BC will use the same amount of power across its entire industrial base as it currently uses for pulp and paper alone. Is this a reasonable number? If I told you that BC’s mining industry would be completely gone by 2037 and its forestry sector would be cut by more than half, would you believe me? Funny thing: Swain et al. said that very thing to the BCUC and no one called them on it. Looking at how BC Hydro generated its projections you discover that BC Hydro looked at each of its large industrial customers individually to project future industrial demand. I think I will trust BC Hydro on this topic.

Continuing our look at demand we have electric vehicles (EVs). On this topic the Model once again ignores population changes and assumes that the personal vehicle fleet will remain static, with the same number of passenger vehicles on the road in 2037 as in 2016. This decreases the number of EVs needing electricity. The Model also assumes that there will be no attempt to electrify commercial or transport trucks, so there is no demand there. Moreover, the increase in EV uptake is extremely back-loaded, with a net total increase of only 29,310 EVs between 2017 and 2024. [The Model compounds growth rates in its rows, so the big increases on the demand end are located at the later end of the model run.] This allows the Model to minimize electricity demand in the early years while still claiming higher cumulative numbers at the end of the run. Interestingly, based on the numbers from FleetCarma.com, by early Q3-2017 EV sales in BC had already surpassed the Model’s projections for the entire year. Right now EV uptake is running at about twice the rate projected by the Model, which will bring a commensurate increase in electricity demand.

Looking at what we have found so far, the Model presents unreasonably low demand numbers for every significant column on the demand side of the ledger. Given this, it is no wonder their numbers came out so low. Now I could stop here, but there are two other points about the Model that should be exposed. The first is the price elasticity component.

As we know, price elasticity addresses how demand goes down as price goes up. BC Hydro was criticized for using a relatively low elasticity value. BC Hydro’s research indicates that price elasticity should range between -0.08 and -0.13 (even as they used a lower number). The Model uses a residential elasticity of -0.15. This results in a larger-than-typically-observed reduction in demand in response to hydro rate increases.

Coincidentally, the Model assumes rate increases in the residential, commercial and industrial sectors of 3.8% per year, every single year between 2017 and 2037 (no rate freezes here). The result is an unseemly residential power rate of $234/MWh and a commercial rate of $201/MWh in 2037. Needless to say, those huge numbers drive down residential and commercial demand via price elasticity. Admittedly, they make that $88/MWh Site C power look pretty good. Combining the extremely high power rates with the extremely high elasticity produces massively suppressed demand numbers in 2037. These basic assumptions in the Model are the reason demand is so low in 2037.
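
To see how those two assumptions compound, here is a minimal sketch using a constant-elasticity demand response. The 3.8% annual escalation, the -0.15 elasticity and the $234/MWh 2037 rate are from the Model; the implied 2017 starting rate is back-calculated rather than quoted, and the Model may apply its elasticity year-by-year rather than in one step:

```python
# How the Model's rate escalation and elasticity assumptions interact.
# Escalation, elasticity and the 2037 rate are the Model's figures;
# the 2017 rate is inferred, not quoted.
years = 20
escalation = 0.038
elasticity = -0.15

growth = (1 + escalation) ** years        # rate multiplier over 2017-2037
implied_2017_rate = 234 / growth          # back-solved starting rate, $/MWh
demand_multiplier = growth ** elasticity  # constant-elasticity demand response

print(round(growth, 2))             # 2.11 -- rates more than double
print(round(implied_2017_rate))     # 111 -- implied 2017 rate in $/MWh
print(round(demand_multiplier, 2))  # 0.89 -- ~11% demand cut from price alone
```

Stacking a doubling of rates on an elasticity above BC Hydro’s own research range is what manufactures the Model’s suppressed 2037 demand.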

Amusingly, while the Model projects incredibly expensive power for residential and commercial customers, it assumes virtually no increase in the price received for electricity sales. The Model has the trade price for electricity rising to $40/MWh by 2025, with no further increases thereafter. So in 2037 it has Hydro selling electricity to California at $40/MWh while simultaneously selling it to residential customers at $234/MWh. Why would the Model do such a thing? That way it can minimize the income generated by the dam: the incongruous combination of stratospheric residential rates and negligible export rates results in both a decrease in demand and a minimization of the dam’s income. The best of both worlds if you don’t want the dam built, but absolutely unsupportable if you care about reliable data being used in decision-making.

Ultimately what this blog post shows is that the model used by Dr. Swain and his colleagues is so completely flawed that it is simply not a reasonable tool for any decision-making process. What I find most confusing is why I haven’t read about this from anyone else. The assumptions for this model were all out there and yet no one went through the effort of examining them. The people pushing for the dam need to shake their heads. So much misinformation is coming out about this project and the people supporting it simply shrug and move on. Wouldn’t it be nice if the people supporting the project put in the sweat equity that the opponents of the dam have been contributing? Then, maybe, we might have enough useful information on the table to make a good decision as to whether we complete or scrap the dam.


Posted in Site C, Uncategorized

Why efforts to fight Climate Change will change the conclusions of the BCUC Site C Inquiry Report

I have been incredibly busy at work over the last few weeks and so was not able to get involved in the public consultation portion of the BCUC Site C Inquiry process. Such are the downsides of not being a paid activist; when my work calls, my activism (a hobby for which I receive no compensation) suffers. Happily my work deadlines have passed, leaving me time to read the BCUC Site C Inquiry Final Report (the Report). What I read left me completely flabbergasted. The conclusions of the Report depend entirely on its load forecasts, and the load forecast upon which all the major assessments are based completely ignores the overriding environmental issue of our age: fighting climate change. What also astounds me is that this incredibly important fact has not been highlighted in any of the analyses of the Report that I have read to date.

Since it is such an important point let’s evaluate it immediately. In the conclusion of the Report (on page 187) the Summary states:

We take no position on which of the termination or completion scenarios has the greatest cost to ratepayers. The Illustrative Alternative Portfolio we have analyzed, in the low-load forecast case, has a similar cost to ratepayers as Site C. If Site C finishes further over budget, it will tend to be more costly than the Illustrative Alternative Portfolio is for ratepayers. If a higher load forecast materializes, the cost to ratepayers for Site C will be less than the Illustrative Alternative Portfolio.

Let’s unpack that statement. Throughout its submissions BC Hydro suggested that the Panel consider a mid load forecast in carrying out subsequent cost assessments. The Panel reports that it found the mid load forecast “excessively optimistic” and chose to use the low load forecast in its subsequent analyses. Now this is the critical point. Under the low load forecast the alternative renewables and demand-side management (DSM) portfolio is comparable in price to completing Site C, so the decision whether to cancel or complete Site C is not clear. This is important because, as the Panel points out, under the mid load and high load forecasts building Site C is clearly the better decision for ratepayers. Thus the entire conclusion of the Report depends on which forecast the Panel chose.

The question arises, therefore: why did the Panel decide to rely on the low load forecast (which made the decision a toss-up) rather than the mid load or high load forecasts (which make Site C a slam dunk)? The answer is simply mind-blowing to me. As detailed on page 81:

Given the uncertainty, the Panel finds additional load requirements from potential electrification initiatives should not be included in BC Hydro’s load forecast for the purpose of resource planning. Although available information indicates that the effects of electrification on BC Hydro’s load forecast could potentially be significant, the timing and extent of those increases remain highly uncertain.

As someone who has been active on the climate change file, this almost knocked me off my chair. The Panel decided that one of the primary tools for fighting climate change (reducing reliance on fossil fuels via electrification) should be completely omitted from consideration in assessing future electricity demand in BC. My regular readers will note that the entire basis of my submission to the BCUC Inquiry was the need to consider the electricity needs associated with reducing our dependence on fossil fuels. When the preliminary report came out I was a bit surprised that the Panel had omitted any discussion of the Paris Agreement and our climate change goals. I saw that as an oversight and commented on it in a follow-up submission. Now I realize that it was a deliberate decision. It is as if the Panel lives in a world where Canada never signed the Paris Agreement.

That left me to wonder: why would the Panel make such an extraordinary decision? Mr. Morton, the Chair of the Panel, explained it this way in a radio interview on CKNW:

We can’t make any predictions about what government policy would be in the future so our analysis did not include potential changes of government policy. They included what government policy is today and we pointed out that government policy would certainly change things…..if government electrification policy changed that would change demand. Again we couldn’t really make assumptions about what policy may or may not be in the future.

Reading and re-reading that quote, I cannot believe that the Chair of the Panel (a regulator) could make that statement in light of what we already know about climate change. What is even more disconcerting is that page 129 of the Report includes text from Section 2 of the Clean Energy Act, which defines British Columbia’s energy objectives and enumerates the requirements to reduce our emissions by 2050, referring back to the Greenhouse Gas Reduction Targets Act. To clarify, the province has a whole slew of climate action legislation on the books. One of the primary ways of decarbonizing in a manner consistent with the Clean Energy Act (and the other applicable Acts) is via electrification, and yet the BCUC suggests it cannot foresee policy implications that include electrification?

You might ask how the Panel came to this conclusion. The answer goes back to a critical weakness of this process: the rush to complete it and the absence of time for the Panel to effectively weigh the evidence presented against the body of scientific research in the public realm. Benjamin Disraeli is credited with the quotation: “History is made by those who show up“. In this case the people who showed up to talk to the Panel were the activists who want the project cancelled, and they brought their paid consultants with them. The people who did not show up (with some limited exceptions) were the scientific community of British Columbia. The result was that the Panel was flooded with misinformation and anecdotes and lacked the time (and possibly the expertise) to effectively weed out the bad information.

Since electrification represents the key difference between the low load and high load forecasts, let’s consider the Panel’s analysis and findings on electrification. On page 81, under “Potential disrupting trends”, the Panel indicates that it leans heavily on the work of Hendriks et al. (Hendriks). This raises the question: who is this Hendriks fellow? According to his online CV, Richard M. (Rick) Hendriks is the Director of Camerado Energy Consulting Inc., which has been working for the Treaty 8 Nations against the Site C project since at least 2010. He is, or until recently was (I really don’t know), being paid to oppose the project.

Hendriks’ submission includes at its core the details from a paper that I have previously addressed at this blog. In my original blog post I note that the previous work by Hendriks and Dr. Karen Bakker of UBC attempts, and in my opinion fails, to discount the research from the Deep Decarbonization Pathways Project (DDPP) and the Trottier Energy Futures Project (TEFP). That work was produced for, and ultimately reviewed and published by, Environment and Climate Change Canada (ECCC) in their assessment report on Canadian energy needs under various climate change scenarios.

In a practical sense what we have is a difference of professional/scientific opinion. On one side we have research groups from leading research institutions in 16 of the world’s largest greenhouse-gas-emitting countries and a team of more than a dozen energy experts from the Canadian Academy of Engineering, all overseen by subject-matter experts from the federal government. On the other side we have a consultant “trained in engineering, science and social science” who has spent the better part of a decade working for a group opposed to the dam, and a water governance expert. Would anyone care to guess which side the Panel believed? It was the gent who showed up and talked to them in person (Mr. Hendriks). Thousands of hours of analysis by dozens of the world’s top subject-matter experts were dismissed by the Panel, which chose instead to rely on the guy who showed up for a presentation and to answer questions.

The Panel also mysteriously chose to trust Mr. Hendriks over the far more qualified Dr. Jaccard (and his research group) when it comes to the electrification of British Columbia’s vehicle fleet. Once again the choice is hard to explain. Dr. Jaccard and Associates prepared an Electrification Potential Review that included estimates of electricity demand under a number of scenarios and assumptions. The report concluded that electric vehicles would add terawatt-hours of demand, which would have, once again, driven us from the low load to the mid or high load forecasts. Hendriks dismissed that detailed analysis by going back to a truly horrendous BC Hydro load forecast suggesting that by 2030 a little over 8% of British Columbia’s automobile fleet would be electric vehicles. [The link is to my analysis demonstrating why that load forecast is so poor.] Now recognize, some analysts are claiming that we won’t even be able to buy internal combustion engines in 2030, yet the BC Hydro forecast used by Hendriks (and thus the Panel) assumes that electric vehicles will still be little more than a novelty at that point. To put it another way, we will surely have failed in our fight against climate change if that is the case. So once again, on one side we have a respected expert who provides a detailed analysis, supported by references to the peer-reviewed research, showing high demand for electricity in 2030, and on the other we have Mr. Hendriks, who cites a back-of-the-envelope calculation from BC Hydro that pre-dates our signing of the Paris Agreement. Anyone want to guess who the Panel chose to believe? The guy who showed up to the meeting, of course.

I can’t repeat this enough because the point is so important: the entire basis of the Panel’s conclusion that building or not building Site C is a toss-up is the assumption that the BC and federal governments will do nothing to fight climate change. This in a province and country where both governments have dedicated massive resources to fighting climate change. Were the efforts to fight climate change through electrification included in the analysis then, in the Panel’s own words, “the cost to ratepayers for Site C will be less than the Illustrative Alternative Portfolio.” Looking at the Panel’s own report, the basis for discounting electrification is a couple of paper-thin analyses that run exactly opposite to the massive consensus of scientific and regulatory opinion in Canada. Essentially we are making a $10 billion bet that Canada will do nothing significant to fight climate change, and the sole basis for that bet is an analysis done by a consultant working against the project and a low-quality BC Hydro analysis completed before Canada signed the Paris Agreement.

Posted in Canadian Politics, Climate Change, Site C, Uncategorized | 12 Comments

Agriculture near Site C: confronting mythology with facts

This blog is about evidence-based environmental decision making. I strive to present facts supported by references and emphasize the importance of using reliable data in decision-making. This is why I have spent so much time on the Site C Dam project, as many of the arguments against the dam have been built on a structure of anecdotes, exaggeration and bad information. Nowhere has this been more evident than in the discussions around the agricultural potential of the Peace Valley and the spurious arguments about food security.

My last dive into this topic dealt with the now thoroughly debunked claim that the area to be flooded by Site C could feed 1 million people. This claim originated with a retired Professional Agrologist named Wendy Holm and was repeated by her supporters and anti-Site C activists. Happily, thanks to this blog and others like it, Ms. Holm has backed down from her ridiculous claim and become much more circumspect in her language. She no longer claims that the area flooded by Site C will feed 1 million people; instead she has adjusted her claim to:

this land is capable of producing sufficient vegetables to meet the nutritional needs of more than one million people a year, in perpetuity

The claim is based on a single, uncorroborated study written by a consultant in the 1980s. Yes, you read that right: a consultant wrote a non-peer-reviewed report in the 1980s, and since that time no other researcher or source has presented any details to support the claim. In real science a claim is made and then examined, studied and compared against real data, but that is not how Ms. Holm works. She has taken a historic report that supported her general world-view and treated it like the word of god. She then used this uncorroborated assessment to extrapolate wildly (as I will discuss below) and come out with a fantastically inflated number that every anti-Site C activist seems to repeat like gospel. So let’s look at her claims a bit more closely.

Let’s start with her extrapolations. Ms. Holm insists:

All 3,816 hectares of alluvial soils to be flooded are extremely high capability land (Class 1-3, improved ratings).

Juxtapose this with a previous article in the Times Colonist, Reports of lost Site C farmland simply not true, which states:

 the loss of valley bottom land with agricultural capability is closer to 3,800 ha, of which only 1,600 ha has actual potential. I would also point out that little of the land — less than 400 ha being flooded — was actually being cropped, and then mainly for forage, not food crops.

So the question arises: how can they both be right? Well, the answer is simple. To help you visualize the area, a kind researcher has posted a map of the land to be flooded. As you can see, much of it consists of islands in the middle of the river that are inaccessible to industrial farming equipment. Yes, the land is Class 1-3, but if you can’t access it (because it is in the middle of the river) then it doesn’t represent useful farmland. Mr. Anderson (author of the first article) only includes land that can actually be accessed with farm machinery, which makes sense if you plan on intensively farming an area. Ms. Holm used the results of a GIS exercise that counted every square centimetre of land on every isolated little island in the middle of the river. This allows her to extrapolate wildly, and few have taken her to task on the subject.

So who should you trust on the topic? Ms. Holm’s supporters are quick to point to her expertise (she is a retired Agrologist); however, it would appear Mr. Anderson has a wee bit of expertise in this area as well. Specifically:

James D. Anderson was director of farmland resources for the Ministry of Agriculture, Forests and Fisheries from 1980 to 1985 and involved in the first environmental review and agricultural assessment done of Site C in 1982.

On the face of it I would argue that Mr. Anderson has a strong claim to be a credible voice on this topic. He was, after all, the gent in charge of the whole shebang the last time this assessment was carried out. It is funny how the activists fighting the dam continue to highlight Dr. Swain’s expertise as Chair of the Joint Review Panel while giving short shrift to the man who was actually in charge of the department when Ms. Holm’s famous vegetable study was written.

The next question arises: who is right about the potential of the land? Well, the proof of the pudding is in the eating. In 1980 a claim was made that the Peace Valley could serve as a vegetable mecca; let’s see how that prediction has turned out. This being a blog that relies on data, let’s look at some data. Every five years Statistics Canada conducts a Census of Agriculture, the results of which are posted online. While the most recent census results are not yet reported, the results from 2006 and 2011 are online for the Peace River Regional District. Let’s see how the actual facts line up with the mythology being portrayed by Ms. Holm.

According to the Census of Agriculture, in 2006 there were 26 hectares (in the entire Peace Valley) dedicated to the commercial growing of vegetables. By 2011 that number had dropped to zero. Yes, you read that right: in this mecca of vegetables there were no hectares dedicated to commercial vegetables in 2011 in the entire Peace River Regional District. Not just the Site C-affected valley bottom, but the entire Peace River Regional District. Almost 824,000 hectares of farmland and none, nada, nil dedicated to the commercial growing of vegetables. Sure, some backyard gardens grew carrots and lettuce, but no agricultural land was dedicated to vegetables. That represents a pretty reasonable debunking of Ms. Holm’s hypothesis.

Ms. Holm argues that much of the best land in the valley has been held in the legal flood reserve since the 1950s. What this fails to note is that the valley has been farmed since the 1920s, and no one bothered to set up a commercial vegetable patch in the first 30-plus years it was farmed. Moreover, as Ms. Holm and Mr. Anderson point out, 400 hectares of this prime land are currently being farmed, and yet in 2011 exactly none of that land was used for vegetables; the 26 hectares that grew vegetables in 2006 had stopped being used for that purpose.

Let’s be absolutely clear here: Ms. Holm insists that Site C is so important because it represents most of the Class 1-2 land in the valley. Yet according to the people who actually track this information, the Peace River Regional District has over 5,000 hectares of Class 1 soils and almost 121,000 hectares of Class 2 soils. This means that literally thousands of hectares of Class 1 soils exist outside the legal flood reserve and would, under Ms. Holm’s hypothesis, represent ideal locations for vegetables and fruits. Yet, as the statistics demonstrate, none of those thousands of hectares are being used to grow fruits or vegetables. Moreover, over 100,000 hectares of Class 2 soils exist in the Peace River Regional District, which would supply ample space for the needed growing area under a climate change scenario.
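For readers who like to see the arithmetic, here is a quick sketch using the figures quoted in this post (Ms. Holm’s 3,816 hectares and the district soil inventory). The rounding is mine; the inputs are simply the numbers cited above.

```python
# Compare the Class 1-3 land Ms. Holm says will be flooded against the
# Peace River Regional District's stock of Class 1 and 2 soils, using
# the hectare figures quoted in this post.

flooded_class_1_3 = 3_816   # ha of alluvial Class 1-3 soils flooded (Holm's figure)
flooded_arable = 1_600      # ha with actual farming potential (Anderson's figure)
district_class_1 = 5_000    # ha of Class 1 soils in the district (approx.)
district_class_2 = 121_000  # ha of Class 2 soils in the district (approx.)

share = flooded_class_1_3 / (district_class_1 + district_class_2)
print(f"Flooded Class 1-3 land is {share:.1%} of the district's Class 1-2 stock")
```

Even taking Ms. Holm’s most generous hectare count at face value, the flooded area amounts to only a few per cent of the district’s Class 1-2 soils, and less than half of it is land a farmer could actually work.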

Now this seems a bit strange. Ms. Holm claims that the Peace Valley is the ideal location to grow vegetables, and the entire farming community of the Peace Valley disagrees with her. In science we call that testing a hypothesis. A hypothesis was proposed in 1980 that the Peace Valley would be an excellent location for vegetables. Farmers tried it out and ultimately stopped growing vegetables, sticking instead with grains and forage. That is a pretty definitive debunking of that hypothesis.

As a further note, in her most recent letter to the editor in the Times Colonist Ms. Holm expanded her repertoire of crops to include fruits, which she mentioned three times. Anyone care to guess how many hectares in the entire Peace River Regional District were dedicated to commercial fruit production? If you guessed zero in both 2006 and 2011 you would be right. It is almost as if farmers have more sense than to risk their livelihoods on tree fruit and vegetable crops that are susceptible to frost. That far north, an early or late frost can destroy an entire crop, so farmers have decided to avoid those crops.

As this post is getting long (and it is getting late) I want to briefly touch on a final piece of mythology being put forward by the anti-Site C activists: that we need to preserve the flood plain affected by the Site C dam for food security purposes. According to the official numbers the Site C dam will flood approximately 0.4% of the agricultural land in the Peace District, or 0.2% of the agricultural land in BC. Doesn’t this put the food security arguments into perspective? It is ridiculous to claim that flooding the land required for Site C will put our food security at risk. We currently have almost 2 million hectares of ALR that we aren’t even bothering to farm (including 426,000 hectares in the Peace District), and the activists claim we will go hungry if we flood around 5,000 hectares of it in the Peace?

Moreover, when it comes to the production of fruit and vegetables we don’t need to depend on Class 1-3 lands in the north, because I have a secret to tell you: our future food security in BC for fruit and vegetables is actually going to come from greenhouses. Anyone who has been to my neck of the woods has seen the greenhouse industry springing up left, right and centre. Greenhouses produce incredible quality produce on lands of all classes (even commercial and former industrial lands). As for the question of where the soil for the greenhouses will come from? That would be municipal organics management and composting. Composting facilities in the Lower Mainland are producing more high-quality organic soils than we know what to do with. Access to good soil will not be the limiting factor in the growth of the greenhouse vegetable industry.

Now let’s look at how greenhousing has flourished. Going back to the Census of Agriculture, let’s look at the Metro Vancouver statistics: greenhouse space for vegetable production almost quadrupled from 1996 to 2011, from 500,000 m2 to 1.8 million m2. Our food security for vegetables in British Columbia does not depend on a small portion of a northern valley prone to unexpected frosts but rather on using the resources we have at hand (agricultural land and greenhousing technologies) far closer to the vast majority of consumers in the Lower Mainland. The Peace Valley, meanwhile, will retain its character as our bread basket, and can do that with Site C in place.
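For those keeping score, the greenhouse growth figure works out as follows, using the census numbers cited above:

```python
# Growth of Metro Vancouver greenhouse vegetable area between the
# 1996 and 2011 Census of Agriculture figures quoted in this post.

area_1996_m2 = 500_000
area_2011_m2 = 1_800_000

growth_factor = area_2011_m2 / area_1996_m2
print(f"Greenhouse vegetable area grew {growth_factor:.1f}x from 1996 to 2011")
```

A 3.6-fold increase in fifteen years, which is why "almost quadrupled" is the fair description.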

To conclude, our food security is not at stake in building Site C; rather, the energy produced by Site C will help provide clean power to greenhouses that can produce higher yields closer to the population base of our province. Contrary to the oft-repeated claims from Ms. Holm, the Peace Valley is not a fruit and vegetable mecca; commercial fruit growers have avoided the area for the last 100 years, while the few farms that tried vegetables have abandoned the effort. Put simply, just because Ms. Holm and the anti-Site C activists keep repeating a myth doesn’t make it any more real. The data make it clear that her fabulous report from 1980 was simply wishful thinking which, combined with political activism, has created a mythos that desperately needs to be exposed to the light of evidence-based decision-making.


Posted in Canadian Politics, Site C | 15 Comments