An Ecomodernist-based approach to fighting climate change while protecting our shared global ecosystem

I am a pragmatic environmentalist and an Ecomodernist, and in celebration of the fourth anniversary of An Ecomodernist Manifesto I have prepared this post to present an Ecomodernist-based approach to fighting climate change while simultaneously protecting our shared global ecosystem.

Let's start with the obvious, but apparently necessary, declaration: I believe that climate change is both real and a significant threat to the long-term ecological health of our planet. That being said, I think we need to take a long and careful look at how we have approached the fight against climate change and how we should be carrying it out in the future. My concern is that we have made some poor choices in the fight to date, and if we are not careful we risk being led down a path that could have consequences almost as bad as if we simply listened to those who deny climate change is real.

I write this blog post as someone who has spent almost three decades evaluating policy options to fight climate change from both an ecological and humanist perspective. As a humanist I see the need to reduce human suffering by pulling as many humans as possible out of poverty and giving them the resources to live good and fulfilling lives. This means making more energy available for more people since quality of life is strongly correlated with easy access to energy. From an ecological perspective, I see a planet where the mass of humanity has inexorably squeezed out nature. I see a need to turn back that tide. As humans we need to reduce our global ecological impact. We need to restore nature not just for human use but because nature has a legitimate right to thrive absent human intervention. We need to make more space for nature and give it a chance to exist outside direct human influence.

Returning to the issue of climate change: we currently live in a world where climate policies appear to be more and more dictated by high-school students, political operatives and unaccountable international NGOs rather than the scientists and policy experts who have spent decades studying these problems. In that context, I can state that I have never been more afraid. I'm afraid because it is becoming increasingly clear that the idea that we should proceed using defensible and rational policy has effectively become passé. We have let ourselves be convinced that any action is better than our current path and in doing so are embarking on policies that could pose a significant threat to the long-term ecological health of our planet.

Now I know this last paragraph sounds harsh, but the truth of the matter is that the climate change debate is becoming one of harsh language, overblown rhetoric and a lot of really bad science. Political activists and unaccountable NGOs are pushing their political agendas under the guise of "fighting climate change" and the result has the potential to derail the real fight against climate change. Consider the "Green New Deal" (GND). It represents a tremendous aspirational document and yet somehow its supporters simultaneously argue that the GND requires that all work be done by unions. The reality is that renewable energy projects do not produce fewer megawatt-hours if they are constructed by non-unionized labour. Placing these kinds of irrelevant restrictions in the way of projects only detracts from achieving our goals.

In addition to activists adding unnecessary requirements to projects, others are demanding unrealistic and overly expensive approaches, such as expanding our use of biofuels even as we have come to recognize that biofuels often make climate change worse. When policy decisions are driven by students, political operatives and activist NGOs (often staffed by individuals with no real understanding of the underlying science that forms the basis of our current society) the results can be changes with massive negative consequences.

To understand what I mean by well-meaning activists, look at the proposals of the 100% wind, water and sunlight team at the Solutions Project. Their idea of a solution to our climate challenges is to industrialize our fragile marine foreshores with low-efficiency wave and tidal facilities while massively increasing the amount of space dedicated to harvesting low-density energy sources (i.e. wind farms and solar facilities). Marine foreshores represent significant and highly-restricted ecological niches and filling them with disruptive human technologies places an unnecessary burden on those niches.

It is hard to talk about bad approaches to energy policy without discussing the German experience with Energiewende. It has shown that you can spend almost $500 Billion and still see almost no decrease in carbon emissions if you make the wrong choices.

We need to stop listening to dreamers who don’t understand physics or ecology. The people who demand we depend solely on low-density, diffuse power sources or revert to low-tech, high-input agricultural practices are wrong. Instead let’s look at how an Ecomodernist approaches the problem.

An Ecomodernist Take

The core of the Ecomodernist approach is to decouple human development from environmental impacts. How do they suggest humanity do that? By increasing urbanization; intensifying agriculture; expanding the use of renewable power within the context of our urban environments; and supplementing low energy-density renewables with higher energy-density renewables like geothermal and nuclear power.

Serious environmental scholars understand that the best way to preserve nature is to enhance urbanization. Urbanization means putting more people into cities where they require fewer ecological inputs, per capita, to enjoy a healthy and fulfilling existence. In urban communities we can reduce per-capita energy costs through mass transit, shorter travel distances for supplies and shared heating/cooling in energy-efficient, high-density housing. The more spread out a community, the less feasible centralized services like sewer, water and gas become and the more expensive they are to maintain. More people in cities means fewer people in suburbs and more space for nature and non-human species.

Similarly, our aim shouldn't be to "go back to nature" to grow our food; rather, we should intensify our agriculture while limiting our farming footprint. There is simply not enough land for humans to return to subsistence agriculture. While the 100 Mile Diet sounds intuitively like it should be better for the environment, that is far from the truth. We need to grow our food where it grows most efficiently and in doing so we can use less space, leaving more room for ecosystems to thrive outside of direct human influence.

Marine aquaculture is a critical consideration on this topic. Our heavy reliance on the natural bounty of the oceans is quickly depleting their ecological diversity. We need to eliminate open-ocean and drift-net fishing. If we must fish then we should rely on terminal fisheries rather than indiscriminate fisheries and we need to use everything that we take. The ecologically criminal practice of discarding bycatch simply has to stop. Moreover, we need to create more marine nature reserves where species can rebuild biodiversity. This means setting aside large expanses of ocean where we simply don't fish or otherwise exploit the resource, while identifying smaller areas where we can take advantage of the bounty of the sea through aquaculture.

Part of the de-coupling involves changing the way we generate energy. We need to wean ourselves off fossil fuels and move toward electricity-based transportation and home-heating technologies. We also need to re-think how we look at renewable energy technologies.

Our current approach to wind technology has to be re-considered. Wind turbines fall into two types based on the orientation of their axis: Horizontal Axis Wind Turbines (HAWT) and Vertical Axis Wind Turbines (VAWT). To date almost all our wind energy has been generated using HAWTs but we need to consider the advantages of VAWTs. While VAWTs tend to be smaller and individually less efficient than HAWTs, they have advantages when deployed in arrays, as described at Phys.org:

While a single VAWT is not as energy-producing as an individual HAWT, the wind flow synergies created in a closely-spaced array of VAWTs can potentially generate up to 10 times more power per unit of land area than an array of widely-spaced HAWTs.

Moreover, because they are smaller in size VAWTs can be placed in locations where HAWTs cannot, like the medians of highways. Anyone who has walked near a roadway knows how much wind is generated by a large truck driving by. Now imagine hundreds of small VAWTs harvesting that otherwise wasted energy and pumping it back into the grid. Tests are being done around the world, and the results have been very promising. Our urban environments create massive wind corridors and VAWTs can take advantage of those conditions to generate energy.

As for solar energy, we need to re-think how we generate that as well. Our current approach of stripping huge swathes of nature to install solar panels has to change. We need to make more policy decisions like the California rooftop solar mandate and ensure that new buildings don't just use energy but generate energy. This doesn't just mean solar panels on the roofs but fully integrating photovoltaics into our building designs. If we do decide to develop stand-alone solar power facilities, we should do so in combination with agricultural uses.

We shouldn't stop with making our buildings energy producers; we also need to incorporate ideas like the Vancouver Green Building plan that reduces the amount of energy needed to keep a building warm in winter and cool in summer.

Now the topic where Ecomodernists always get attacked is the recognition that de-coupling human development from environmental impacts means incorporating our most effective low-carbon energy technology: nuclear power. Opponents of nuclear claim it is too expensive, will encourage nuclear weapons development and has waste issues. We all know that the reason nuclear has been so expensive is that traditionally nuclear plants were designed and built as one-off projects. Well, the Koreans and Chinese have demonstrated that by simplifying and standardizing nuclear design we can avoid most of the cost challenges in building nuclear reactors. As for nuclear proliferation, if Canada wanted the bomb we would already have it. As for China, the US, India, the UK, France and Russia? They already have the bomb. Ultimately, I'm pretty sure a new nuclear plant in Alberta won't be the deciding factor as to whether Canada decides to become a nuclear weapons state. Finally, the waste argument is simply a red herring. Nuclear energy produces much less waste than solar or wind facilities (per MWh produced) and as for the spent uranium, the Generation IV reactors will be turning that "waste" into the next generation's electricity.

To conclude let me take a couple lines directly from the Manifesto:

Urbanization, agricultural intensification, nuclear power, aquaculture, and desalination are all processes with a demonstrated potential to reduce human demands on the environment, allowing more room for non-human species. Suburbanization, low-yield farming, and many forms of renewable energy production, in contrast, generally require more land and resources and leave less room for nature… A good Anthropocene demands that humans use their growing social, economic, and technological powers to make life better for people, stabilize the climate, and protect the natural world.


A primer on the BC refined fuel market, lower mainland gasoline prices and how they can be affected by a change in mix in the Trans Mountain Pipeline

In the last couple of weeks I have read a lot about gas prices and the threat by Jason Kenney to shut down the Trans Mountain pipeline. Since a lot of what I have read is incomplete and/or incorrect, I figure it is time to prepare a quick primer to help understand the refined fuel market in BC and what Jason Kenney may, legally, be able to do to mess that market up.

Understanding the BC Refined Fuel Market

Let's start with what we know about the BC refined fuel market, beginning with this from an article in Business in Vancouver:

Provincially, B.C. lacks refining capacity. B.C.’s two refineries produce only 67,000 barrels per day (bpd) of gasoline and diesel, whereas B.C. consumed 192,000 bpd in 2015, according to the CFA [Canadian Fuels Association]. The Parkland Fuel Corp. refinery in Burnaby produces 55,000 bpd and supplies about 25% to 30% of Vancouver International Airport’s jet fuel supply. Alberta’s refineries supply about 100,000 bpd to B.C., and about 30,000 bpd is imported from Washington state refineries, according to the CFA.

To our south, the United States has broken its petroleum market up into five Petroleum Administration for Defense Districts (PADDs). The West Coast of the US, including California, Oregon and Washington, makes up PADD 5. Geography defines PADD 5: it is mostly bordered on the east by mountains, and the only (non-rail) major east-west connection on the west coast is the Trans Mountain pipeline. As the US Energy Information Administration (EIA) puts it:

Because PADD 5 is isolated, in-region refineries are the primary source of transportation fuels for PADD 5. In 2013, PADD 5 refinery production was sufficient to cover about 91% of in-region motor gasoline demand, 96% of jet demand, and 113% of distillate demand. Heavy reliance on in-region production further complicates the supply chain when disruptions occur. When disruptions occur, all of these factors noted above combine to limit short-term supply options, lengthen the duration of supply disruptions, and cause prices to increase and remain higher for a longer period than would be typical in markets outside PADD 5.

In a nutshell, even after BC's own refineries and Alberta's shipments, BC is still short roughly 30,000 bpd of refined fuel and relies on Washington State refineries, which sell into a PADD 5 market that is itself significantly short on supply.
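For readers who like to see the arithmetic, here is a quick sketch (in Python, purely illustrative) that works through the CFA figures quoted above:

```python
# Supply balance for BC refined fuel, using the CFA figures quoted above (all in bpd)
consumption   = 192_000   # BC consumption in 2015
bc_refineries =  67_000   # combined output of BC's two refineries
from_alberta  = 100_000   # supplied from Alberta refineries

shortfall = consumption - bc_refineries - from_alberta
print(f"Left to be covered by Washington State refineries: ~{shortfall:,} bpd")
# ~25,000 bpd, in line with the "about 30,000 bpd" imported from Washington in the quote
```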

Even more problematically, most of the big refineries are owned by oil companies that have long-term contracts for most of their production. We can only buy our 30,000 bpd out of the left-overs and we are competing with Oregon and California (that are also under-supplied) for whatever the Washington refineries have to sell. What is worse is that at this time of year the refineries have to temporarily lower capacity to allow the transition from winter gasoline to summer gasoline. The reason for this is that gasoline is affected by temperature and the winter blends have more volatile components needed to help cars run in the cold.

As for the suggestion that the Parkland refinery can somehow fill in the gap: the truth is that, as a very small refinery, Parkland has needed to specialize to survive in the international market. Parkland has specialized by tuning its refinery to make jet fuel and the more expensive high-octane, premium fuels. It actually exports some of this premium gas to the US market. Making regular gasoline in any reasonable quantity would require the refinery to shut down and would take time and money.

The only other significant facility in the Lower Mainland is the Suncor Burrard Products terminal. As I will point out below, the Trans Mountain is a batched pipeline: it carries refined fuels, light crude and heavy crude. One issue with batching is that the refined fuels can pick up impurities left over from the heavy oil along the way. Before gasoline can be sold on the market it has to go to the Suncor facility for clean-up.

To conclude, the take-away from this section is that BC has an ongoing shortage of about 30,000 bpd in refined fuel supply, with no large marine import facilities. We are buyers in a seller’s market (PADD 5) and have no ability to change that equation. Should we choose not to buy the fuel from the Washington refineries, they have lots of other options. So we are price takers not price makers.

Local Gas Prices

The next topic to discuss is our local gasoline prices. Our local gas price has a lot of factors built into it. Let's start with taxes. In the Lower Mainland we pay 34.39 cents per litre (c/L) in provincial taxes on every litre of gas. This can be broken down to:

  • 17 c/L in TransLink Tax,
  • 6.75 c/L in British Columbia Transportation Financing Authority Tax,
  • 1.75 c/L in Provincial Motor Fuel Tax, and
  • 8.89 c/L in Carbon tax

The federal government also gets its pound of flesh. Federally we pay:

  • 10 c/L federal excise tax and
  • 5% GST on our total purchase price (or 7.5 c/L on $1.55 gas)

Adding all the taxes together we get 51.89 c/L in taxes on $1.55 gas. Now the problem with the gas business is that it is very opaque. The internal prices are kept private but one thing we are privy to is the rack price. The rack price is defined as:

the cost of the gas itself, as well as transportation, overhead, and profit costs. The price can vary from terminal to terminal and depends on the cost of crude oil and related refining costs. The rack price also depends upon the distance between the fuel retailer and wholesale terminal. A gas station located far from a terminal is going to pay a higher fuel rack price than one located just down the street.

That would be all the costs, exclusive of the dealer’s mark-up which pays for the retail facility and all its staff. Most oil companies publish their rack price somewhere. Here is a link to the Petro-Canada daily rack price for Canadian cities. In Edmonton, on April 14, 2019 it was 83.30 c/L while in Vancouver it was 105.2 c/L. There is a 21.9 c/L difference in the rack rate. This difference is the scarcity premium we pay because it is expensive to send gas to the west coast, then clean up the gas at the Suncor facility and then sell it to the big distributors. The only way to decrease that scarcity premium is by eliminating scarcity.

Assuming the rack price is pretty comparable between retailers (to simplify this discussion) then all we need to add is the dealer’s mark-up. This is normally around 12 c/L. So our gas ends up with:

  • $1.052 rack price
  • $0.3439 provincial taxes
  • $0.10 federal excise tax
  • $0.075 GST
  • $0.12 dealer’s mark-up

This adds up to a pump price of roughly $1.69/L.
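For anyone who wants to check that arithmetic, here is a minimal sketch adding up the components listed above. The figures are the per-litre values quoted in this post; as above, the 5% GST is approximated as a flat 7.5 c/L on roughly $1.55 gas rather than recomputed on the exact total.

```python
# A minimal sketch of the Lower Mainland pump-price build-up described above (all $/L)
rack_price       = 1.052    # Vancouver rack price, April 14, 2019
provincial_taxes = 0.3439   # TransLink + BCTFA + motor fuel + carbon tax
federal_excise   = 0.10     # federal excise tax
gst              = 0.075    # ~5% GST, approximated as a flat amount
dealer_markup    = 0.12     # typical retail mark-up

pump_price = rack_price + provincial_taxes + federal_excise + gst + dealer_markup
print(f"Pump price: ${pump_price:.2f}/L")   # roughly $1.69/L
```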

Green fuel requirements

One thing we seldom hear discussed represents another big challenge with importing gasoline to BC: our fuel regulations. The Greenhouse Gas Reduction (Renewable & Low Carbon Fuel Requirements) Act and the Renewable & Low Carbon Fuel Requirements Regulation define what our gasoline looks like. Part 2 of the Act establishes the renewable fuel content requirements for gasoline and diesel sold in British Columbia:

  • Fuel suppliers must ensure that they have a minimum renewable fuel content of five percent (5%) for gasoline and four percent (4%) for diesel, on a provincial annual average basis.
  • Fuel suppliers have the flexibility to vary their blend percentages and can choose where in the province they supply renewable fuel blends, as long as they meet the provincial annual average requirement for renewable fuel content.

What does this mean for consumers? Well it means we can’t simply buy gas from Asia or California and sell it off the boat to retailers. Instead gasoline from other suppliers would need to be imported and mixed with enough ethanol to meet the BC regulations before it can be sold. This is another way we have, through our desire to fight climate change, made it harder (and more expensive) to get gas in BC.

Trans Mountain Capacity and allocations

Going back to the Trans Mountain Pipeline, we are all told that the Trans Mountain has a nominal capacity of 300,000 bpd. But that is not the entire story. You see the Trans Mountain is a batched pipeline. It carries refined fuels, light crude and heavy crude and the relative amount of each defines the actual capacity of the pipeline. Here is how those volumes have looked over the last few years.

The thing to understand about the pipeline is that the 300,000 bpd assumes we are moving 20% heavy crude. As the table below shows, the actual capacity of the pipeline would change dramatically if we changed the mix running down the line. The Trans Mountain can carry 395,000 bpd of light/refined, or 300,000 bpd if 20% is heavy or 269,000 bpd if 40% is heavy.

This means that if we can get all of the heavy fuel out of the pipeline (say moving by rail via CanaPux) then that would be almost like getting a brand new pipeline. Running only light crude and refined fuels, the pipeline could supply the Puget Sound with 260,000 bpd while still leaving lots of room for Parkland and the West Coast market. The flip side of this equation is that if Alberta were to require that the pipeline carry 30% – 40% heavy it would push the capacity of the pipeline down to 269,000 bpd which could squeeze the amount of refined fuel running down the pipe.
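To illustrate how the mix drives capacity, here is a rough sketch that simply interpolates between the three capacity figures quoted above (395,000 bpd at 0% heavy, 300,000 bpd at 20% heavy and 269,000 bpd at 40% heavy). Treating the relationship as linear between those points is my own simplifying assumption, not a published Trans Mountain formula.

```python
# Rough sketch: Trans Mountain effective capacity as a function of the heavy-crude share,
# interpolating linearly between the capacity points quoted in the post (an assumption).
import numpy as np

heavy_share = [0.0, 0.20, 0.40]            # fraction of throughput that is heavy crude
capacity    = [395_000, 300_000, 269_000]  # corresponding total capacity, bpd

for share in (0.0, 0.10, 0.20, 0.30, 0.40):
    bpd = np.interp(share, heavy_share, capacity)
    print(f"{share:>4.0%} heavy -> ~{bpd:,.0f} bpd total capacity")
```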

One additional fact that is not well known is that in 2010 the NEB provided the Westridge marine facility with two allocations: a firm allocation of 54,000 bpd and an uncommitted allocation of 24,000 bpd. As you can see from the graph above, over the last few years the marine terminal has been getting the short end of the stick but that doesn't have to be the case. Essentially, the NEB said that the suppliers can demand up to 79,000 bpd be sent to the marine terminal. That 79,000 bpd represents almost 30% of the pipeline capacity if it were all heavy crude.

Now this is where it becomes interesting. Given the NEB's decision, it is possible that Alberta could require that the entire marine allocation be put in the pipe as heavy fuel (say by nominating a whole lot of the in-kind bitumen it gets in lieu of royalties). If that were the case it would, as discussed above, put the squeeze on the pipeline and reduce the amount of refined fuel running to BC. This would represent an entirely legal way for an Alberta government to constrain BC fuel supplies, and there would be nothing a court could do about it. Simply filling existing allocations with heavy crude could shrink the amount of anything else running down the line.

Conclusion

To conclude, we in the Lower Mainland are buyers in a seller's market. We have no ability to dictate to the market and if the provincial government decided to regulate gasoline prices it would have to set the regulated price high, because Canadian governments can't force American refiners to sell them gasoline.

The Lower Mainland also lacks marine facilities to significantly increase the amount of fuel that comes here by ship and BC gasoline regulations would require that any imported gasoline be adapted to meet our local requirements. Both would raise the price of gasoline even more.

If Alberta wants to squeeze BC it can shut down the pipe. But if it did the courts would deal with that in hours/days. But the truth of the matter is that Alberta is full of smart people and they know that the Trans Mountain can be gamed to reduce the amount of refined gasoline coming down the pipe. This will have price consequences.

I'm going to end with the environmentally interesting part of this story. If you were an environmentalist wanting to reduce BC's carbon emissions and our dependence on fossil fuels you should be cheering Jason Kenney on. Thanks to the Law of Unintended Consequences, any effect he has on prices will force people to find alternatives. The reason we have carbon taxes is to increase the price of carbon-intensive fuels to discourage their use and encourage users to seek alternatives, be they EVs, transit, walking or avoided trips. The reality is that if environmentalists are serious about fighting climate change, Jason Kenney and high gas prices are something they should be cheering for, not fighting.


More bad epidemiology about BC LNG from the MDs at CAPE

I have written a lot about the BC liquefied natural gas (LNG) export industry. I have done so because my examination of the climate math says BC LNG will help reduce global greenhouse gas emissions and will help in the fight against climate change. Many activists disagree with me, with one of the most vocal anti-LNG groups being the good MDs at the Canadian Association of Physicians for the Environment BC (CAPE). CAPE has a history of fighting the natural gas industry. Their most recent effort involves a travelling roadshow called "Voices from the Sacrifice Zone: Fracking in BC's North". Their presentation is summarized in a Narwhal article written by CAPE BC Board Member Melissa Lem. As I have written in the past, the good MDs at CAPE often struggle when trying to translate their professional experience in private practice through the lenses of toxicology and epidemiology. As I will discuss in this blog post, this is again the case with their travelling anti-LNG roadshow. Ultimately, what I intend to show is that the anecdotal experiences of the people in the roadshow are trumped by the epidemiological work done to date in BC's northeast and that any decision on the future of BC LNG should be based on science and not anecdotes.

To explain what I mean by anecdotal experience let me give you an example using one of my favorite hobbies: birding. I love birds and have a backyard feeding station for birds. Before we got our puppy I had black oil sunflower seeds in our station. As a result, I would see dozens of finches daily. Unfortunately, our puppy took to eating the discarded sunflower seed shells so we had to stop including the seeds in our station…and the finches disappeared from my backyard. As a birder I know that all the finches in the neighbourhood didn’t just die off. I am quite sure they simply moved to a feeding station down the street. If we did a regional survey and sent those results to a statistician that statistician would likely find that the number of finches in the region hadn’t changed. In the field of human health risk assessment that statistician is called an epidemiologist.

Going back to the CAPE roadshow: it consists of a small number of practitioners presenting their anecdotal observations to the public. The problem is the anecdotal information they present is not backed up with a look at the bigger picture. This is unfortunate because, as noted in the Narwhal article, a detailed epidemiological study has already been carried out for BC's northeast and the results were very reassuring. The assessment, called Cancer Incidence in the Peace River South Local Health Area, looked at the rate of cancer incidence in the region where the natural gas is produced and found that cancer incidence was normal, or as they put it:

the number of total cancers diagnosed in this region are consistent with Northern Health regional rates; the number expected based on regional rates is almost identical (1193 vs 1201)….overall cancer incidence over the past 10 years in this region is consistent with average cancer rates in Northern BC.

So what are the CAPE doctors using to refute this epidemiology? Why, anecdotes of course. In the Narwhal article they note that there were ten cases of glioblastoma in the area when the average for that population should have been five. The article also notes that this is an "unofficial" count and the numbers presented are not reflected in the cancer incidence study. Not a surprise, since the doctor had observed fewer than two cases a year. Also interesting is that the link provided in the piece did not indicate that VOCs were a particular risk factor for this cancer but rather:

The vast majority of glioblastomas occur randomly, without inherited genetic factors. The only confirmed risk factor [my emphasis] is ionizing radiation to the head and neck region. Studies of environmental and genetic factors contributing to glioblastomas have so far been inconclusive or negative. 

Looking at the literature it is understood that radon is a significant risk factor for this cancer while VOCs are not. So when the author points to this cancer she fails to note that the more likely reason for this potentially increased incidence is that Dawson Creek is a hot spot for radon.
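To see why a count of ten cases against an expected five is weak evidence on its own, here is a small sketch of the first calculation an epidemiologist would run: if the long-run average for that population really were five cases, how often would chance alone produce ten or more? Modelling rare case counts as Poisson is a standard, but simplifying, assumption on my part.

```python
# How surprising are 10 observed cases when 5 are expected, assuming case counts
# follow a Poisson distribution (a standard assumption for rare diseases)?
from math import exp, factorial

expected = 5
observed = 10

# P(X >= observed) = 1 - P(X <= observed - 1) for X ~ Poisson(expected)
p_at_least = 1 - sum(exp(-expected) * expected**k / factorial(k)
                     for k in range(observed))
print(f"P(>= {observed} cases | mean of {expected}) = {p_at_least:.3f}")  # ~0.032
```

Chance alone produces a count that high only about 3% of the time, which sounds notable until you remember that the count is unofficial, that many regions and many cancer types could have been examined (the multiple-comparisons problem), and that confounders like radon have not been ruled out. That is exactly why the formal incidence study, and not a single anecdotal tally, has to be the reference point.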

They also present the experience of "an internist who diagnosed ten cases of idiopathic pulmonary fibrosis (IPF) in the two short years he'd worked in Dawson Creek". What they omit is that genetics represents a major risk factor for IPF, so in a close-knit community (especially one with First Nations communities that have strong genetic ties) this factor becomes important and must be accounted for in the math. This demonstrates why it is so important to use the tools of epidemiology in trying to come to an evidence-based conclusion.

The most telling part of the presentation is CAPE’s go-to study in their fight against LNG; this is the study that CAPE has used in virtually every one of its presentations: Gestational exposure to volatile organic compounds (VOCs) in Northeastern British Columbia, Canada: A pilot study. This study is used because it involves pregnant women and benzene and so pulls at virtually every observer’s heart-strings. The problem is the study is explicitly described as a pilot study meaning it has an exceedingly small sample size and is not able to effectively address critical confounding variables. As such it is absolutely useless for telling us whether LNG is having an effect on the population. Because of its importance to CAPE, I will spend a bit of time explaining where it goes wrong from a decision-making perspective.

This pilot study involved testing for benzene and benzene metabolites in the blood of 29 pregnant women in northeastern BC. Now for the first problem: sample size. Twenty-nine participants is a tiny population in which to search for trends in any study, but this one is particularly notable because, of the 29 women, 2 were regular smokers and 4 more were regularly exposed to second-hand smoke. Smoking is a serious confounding variable in a study of this kind because smokers are expected to have higher benzene and benzene metabolite concentrations in their blood as a by-product of exposure to smoking.

In this study, the elevated benzene and benzene metabolite concentrations were only observed in a very small number of participants. Unfortunately, the authors did not confirm whether the elevated participants were among those regularly exposed to cigarette smoke; they only indicate that the participants with the highest results were not smokers. Step 1 in any more detailed epidemiological assessment would typically have been to exclude smokers from the study.

The study also does not discuss another major confounding factor: whether the women lived in houses with attached garages. Attached garages, you ask? Yes, since the presence of an attached garage is one of the strongest indicators (after smoking) of potential human exposure to volatile organic compounds (VOCs) like benzene. To explain, see this article, Automobile proximity and indoor residential concentrations of BTEX and MTBE, where they point out:

Residing in a home with an attached garage could lead to benzene exposures that are an order of magnitude higher than exposures from commuting in a car in heavy traffic, with a risk of 17 excess cancers in a population of a million

Another article (Migration of volatile organic compounds from attached garages to residences: A major exposure source) points out:

A total of 39 VOC species were detected indoors, 36 in the garage, and 20 in ambient air. Garages showed high levels of gasoline-related VOCs, e.g., benzene averaged 37±39 μg m−3. Garage/indoor ratios and multizone IAQ [indoor air quality] models show that nearly all of the benzene and most of the fuel-related aromatics in the houses resulted from garage sources, confirming earlier reports that suggested the importance of attached garages. Moreover, doses of VOCs such as benzene experienced by non-smoking individuals living in houses with attached garages are dominated by emissions in garages, a result of exposures occurring in both garage and house microenvironments.

Put simply, the benzene metabolite study has challenges associated with sample size and confounding variables. These are the sorts of things a more detailed epidemiological examination would be designed to address. Absent those considerations, we have a study that tells us that more study is needed but should not be relied upon for decision-making purposes.

To continue my critique of the Narwhal article, I would also note that in northeastern BC, where winters are extremely cold, indoor air issues are exacerbated because ventilation is minimized to prevent heat loss. In the Narwhal article the author asks why a study comparing the results to southern BC has not been carried out. The issue is that in southwestern BC we don't see the same cold as they do in the northeast, so people here are more likely to open their windows in winter and thus have fewer indoor air issues. This is why no epidemiologist would blindly compare northern and southern communities in the manner the Narwhal author suggests. It would be bad science.

Ultimately, the “Voices from the Sacrifice Zone” series consists of a collection of anecdotes with the common feature that all the anecdotes involve people who live or work in Northeastern BC. There are no controls, there is no evaluation of the big picture. It is a handful of activists telling their personal stories. As such it makes for a compelling presentation from a human-interest standpoint, and is useless from an environmental decision-making perspective.

The thing we have to keep reminding ourselves is that a presentation built on compelling stories is of little use in determining whether the LNG industry is safe. For that you need epidemiological research, and the epidemiological work to date identifies no cancer hot spots in need of more detailed assessment. There are no increases in VOC-related cancers in the data; rather, the cancer rates are consistent with what is expected in this community. There are simply not enough cancer cases to raise any concerns. The authors of the report note that Acute Myeloid Leukemia (AML, the cancer CAPE associates with benzene) is so rare that the Peace River South region sees an average of "less than 1 case per year" of AML. In the case of AML, in some years, there is literally no data to crunch.

To end this piece I want to reiterate a simple truth. Decisions about energy policy shouldn't be made based on anecdotes and first-person narratives, no matter how compelling they may sound. First-person narratives can inform further research, but decision-makers need to consider real evidence assembled by people experienced in ensuring that the data is not the result of unexamined confounding variables. Epidemiologists have compiled those results and the current output from those experts indicates that the northeast is not a "sacrifice zone" but rather has absolutely typical disease incidence rates. As such, the anti-LNG roadshow really doesn't inform evidence-based decision-making; rather, it muddies the waters by implying a negative trend exists when the data says no such trend exists.

Author’s Note

An earlier version of this blog post had an error with respect to the incidence of glioblastomas. This was the result of my mis-reading the Narwhal article. The error had no significant effect on the piece but has been corrected in the current version.

The text has also been cleaned up with respect to the maternal benzene metabolites study as the original critique could be viewed as too imprecise. I hope the new language addresses those concerns.


On spherical cows, an idealized China and the futility of arguing with activists on BC LNG

It seems like every week we get another announcement from the BC government about the BC LNG industry. While there are clearly issues with how our government is handling the financial end of the BC LNG industry (including taxation policy and subsidies) the one area where the case is solid is on the emissions side of the ledger. As I describe in my blog post on the topic the peer reviewed academic literature shows quite categorically that BC LNG will lower global emissions for electricity production. BC LNG is not “dirtier than coal” as some activists falsely claim, rather it is much cleaner than coal. So how can the activists fight this science? Well the answer is simple: they use the “consider a spherical cow” defense, by imagining an idealized China. As I will show in this post, ultimately it is futile to argue against the professional activists if they insist on ignoring the real world and “assume we have a can opener“.

To start let’s explain what I mean by “consider a spherical cow”. This expression derives from a book: “Consider a Spherical Cow: A Course in Environmental Problem Solving“. The term is shorthand for relying on a too-simple model for a complex problem. Here is a good description by Timothy Lee:

There’s a famous joke about a dairy farmer who, hoping to increase milk production, seeks the help of a theoretical physicist at the local university. After carefully studying the problem, the physicist tells the farmer, “I have a solution, but it only works if we assume a spherical cow.”

There is a similar joke called “assume we have a can opener” which Wikipedia describes as:

there is a story that has been going around about a physicist, a chemist, and an economist who were stranded on a desert island with no implements and a can of food. The physicist and the chemist each devised an ingenious mechanism for getting the can open; the economist merely said, “Assume we have a can opener”!

You are probably asking what I mean by this. Well, here is an example from today:

I presented data that showed that:

in the current market, BC LNG would not be replacing coal when exported to China. Instead BC LNG would mostly be replacing Chinese SNG and the climate math is even more categorical on that topic. BC LNG is much, much cleaner than Chinese SNG. Replacing Chinese SNG with BC LNG will help the planet and even if exporting BC LNG causes Canada to miss our Paris Agreement targets, the sacrifice is worth it. We have to keep reminding ourselves GHGs are global and it is more important to address global emissions than local ones. If a minor increase in Canadian emissions can result in a major decrease globally then that is well worth missing our Paris Agreement commitments.

The Wilderness Committee response was a link to a model that completely ignores the real world. The link provided by the folks at the Wilderness Committee concentrates on how there is no room for new natural gas. In the context of our discussion this can only mean that we are supposed to simply ignore what the Chinese are doing because there is “no room” for their emissions. In essence they are insisting we imagine some idealized China completely unrelated to the existing China. They argue that if we are to meet the goal of 1.5 degrees Celsius then China can’t be allowed to use the fossil fuels that they have declared they will use so the activists will simply pretend those emissions won’t happen.

Now here is a secret I need to share: China doesn’t care what a climate think tank or a bunch of North American activists say they should do. We do not have some idealized China, we have a very proud nation state that currently occupies a large hunk of the planet on a continent called Asia. The Chinese government has declared their intentions and backed those declarations with huge financial investments.

When activists argue that we shouldn't lock in the emissions associated with BC LNG they ignore that the Chinese are investing billions in building synthetic natural gas (SNG) projects:

As of June 2018, five pilot synthetic natural gas projects are operating in China, with a total capacity of just under 6 billion cubic meters per year (bcm/year). Roughly 80 SNG projects with a cumulative capacity of more than 300 bcm are in different stages of the development pipeline. (This is 10 more than were in the pipeline in June 2017.)… China currently emits 25.5 MT (or one third of the entire oil sands in 2017) just to create the SNG they will later burn. To put this into perspective, the LNG Canada project in total is estimated to emit 10 MT/yr while Woodfibre would generate about 1 MT/yr.

These projects lock in fuels that produce around 1.4–2 times the emissions of BC LNG for the same energy output. Our real-world China is expanding its coal industry, not reducing it. It is locking in emissions that BC LNG can help reduce.

As a pragmatic environmentalist I keep insisting the climate change energy debate has to rely on real-world numbers. We need to look at what real countries are planning to do and figure out how we can reduce those emissions. We need to recognize that there are no spherical cows out there. We can't simply "assume we have a can opener" and we can't plan based on an "idealized China". We live in a world where spherical cows don't exist, where can openers don't simply materialize and where the Chinese government exists and its current energy-use trajectories cannot be dismissed. In that real world, BC LNG will lower global emissions from electricity production and is something reasonable observers should support, not fight.


How understanding Type I and Type II errors and p-values helps in assessing the conclusions of the Ramazzini Institute 13-week pilot study on Glyphosate

As regular followers of this blog know, my graduate research involved developing systems to allow data collected by researchers to be evaluated for reliability and made available for subsequent re-use by other researchers. I carried out my research in an era before the wide availability of computerized statistics programs. As a consequence, all my statistical calculations were done with a calculator and statistical look-up tables. Due to the difficulty in completing these analyses, we were taught some very important lessons about evaluating research studies. In this blog post I want to share some of those lessons while examining the most recent paper from the Ramazzini Institute 13-week pilot study on glyphosate.

To do this, I am going to introduce a couple important concepts in science (Type I and II errors and p-values) and then use that information to explain why the study, as designed, could not come to any definitive conclusions. Or put another way, because of the study’s design it is virtually impossible to derive any definitive conclusions from the research and any activist attempting to do so needs to spend more time learning about the scientific method.

Type I and Type II Errors

In science we identify two primary types of errors (okay some argue there are at least three types but I will not go into that today):

  • A Type I error, which represents a false positive, involves claiming that an observed hypothesis is correct, when in reality it is false.
  • A Type II error, which represents a false negative, involves claiming that an observed hypothesis is incorrect when it is actually correct.

In my opinion, the best visualization of the difference between the two is this graphic: (which I have seen in numerous locations and whose origin I have been unable to confirm although I believe it comes from the “Effect Size FAQ“).

The tools used to avoid Type I errors mostly involve better understanding the nature and characteristics of the populations under study. It is generally accepted by the scientific community that an acceptable risk of making a Type I error is the 95% confidence level (a p-value of 0.05). In order to derive an acceptable p-value, certain characteristics of the population must be understood, primarily its distribution (or the lack of an understood distribution). The details of how statisticians evaluate populations for this purpose involve mathematics that I won't go into today.

The tools used to avoid Type II errors are less well-refined (but are getting better every day). Most depend on improving our understanding of the nature of the distribution that is being tested. Lacking that understanding an increase in sample size will increase the power of an analysis and reduce the likelihood of a Type II error.

P-Values: what do they mean?

P-Values are one of the most misunderstood features in research. A p-value helps you establish the likelihood of a Type I error. It does nothing to help avoid Type II errors and has absolutely no information about whether your results are “right” or “wrong”. Remember in science all results are right since they represent observations. It is just that some observations may help support a hypothesis while others may not.

As described in this article in Nature, when Ronald Fisher introduced the p-value in the 1920s he did not mean for it to be a definitive test but rather one tool in the researcher's tool belt. Nowadays there is an entire edifice in science built on the importance of achieving a p-value less than 0.05 (see an xkcd comic which makes fun of that idea). The problem is that a low p-value is not proof of anything. A p-value simply provides the probability of getting results at least as extreme as the ones you observed. A really clear write-up on the topic is provided in this link. Unfortunately, even practicing scientists have a really hard time explaining what a p-value represents.

As I mentioned above, a p-value only tells you how likely it is that results at least as extreme as yours would arise by chance. At a p-value of 0.05 (95% confidence) we still have a 1 in 20 chance of a false positive (a Type I error). My son loves playing Dungeons & Dragons and in that game 20-sided dice are used for all battles. Roll a natural 20 (a critical hit) and your level 1 barbarian actually hits the other guy's level 18 druid. Roll a 1 and your level 12 barbarian can't even hit a wall.

Now the truth every D&D player knows is if you roll the dice enough times eventually everyone rolls a 20 and everyone rolls a 1. That is how statistics works. It is also true that speaking from a purely statistical stand-point it is incredibly unlikely that any one person will win the lottery, but eventually every lottery prize gets won. So when someone gets a “significant” result in a study you need to examine how many times they rolled the dice and how well the researcher understood the relationship between the observations and the conclusion. False correlations happen all the time in science and are so common there is a great web site dedicated to the more entertaining examples.
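To make the dice-rolling point concrete, here is a toy simulation (not a re-analysis of any real data) in which two groups are drawn from the identical population and then compared on dozens of endpoints at the usual p < 0.05 threshold.

```python
# A toy simulation of the multiple-comparisons problem: two groups drawn from
# the SAME distribution (no real effect), compared on many endpoints at p < 0.05.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_endpoints, n_per_group = 60, 10   # dozens of comparisons, small groups

false_positives = 0
for _ in range(n_endpoints):
    control = rng.normal(0, 1, n_per_group)
    treated = rng.normal(0, 1, n_per_group)   # drawn from the identical population
    result = stats.ttest_ind(control, treated)
    if result.pvalue < 0.05:
        false_positives += 1

print(f"'Significant' findings with no real effect: {false_positives} of {n_endpoints}")
# Expect roughly 3 of 60 (about 1 in 20) purely by chance.
```

Even though there is no real effect anywhere in this simulation, a handful of comparisons still come out "significant" simply because so many comparisons were made.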

The Ramazzini Institute 13-week pilot study on Glyphosate

This brings us to the "Ramazzini Institute 13-week pilot study on Glyphosate". This is a project intended to determine whether glyphosate has a potential effect on various human health endpoints. This research project appears to have started with the assumption that these impacts exist and so is using a shotgun approach to try to find topics for further study. So what have they done in this study?

In the study cited they exposed a relatively small number of rats to glyphosate in two forms, as pure glyphosate and as the Roundup formulation, and then ran all sorts of measurements and assessments to see if any significant effects were observed.

Now, remembering what we read earlier, look at the number of times the paper rolled the dice. The authors generated eleven tables with dozens and dozens of comparisons between the treated rats and the controls. As you can expect, in the end they found a number of "significant" differences, but should we be convinced by this fact? Once again I direct you to an apt comic from xkcd.

Well, let's start with the obvious question: since both glyphosate and Roundup have the same critical active ingredient you would expect the two treatments to have the same effect. So any case where only one of the two treatments had an effect should raise some red flags. From the Type I error perspective, given the number of comparisons that were made, this outcome would not represent anything different than you would expect by simply rolling 20-sided dice. There was one single occurrence where both the glyphosate and the Roundup groups significantly differed from the control, but once again, given the number of comparisons, that is not an altogether unexpected result.

How about false negatives? Well, from a Type II error perspective, the number of rats in the study is so low as to not really prove anything either. Look at this study, "Pesticide use and risk of non-Hodgkin lymphoid malignancies in agricultural cohorts from France, Norway and the USA: a pooled analysis from the AGRICOH consortium". It looked at 316,270 farmers accruing 3,574,815 person-years of exposure to glyphosate and found no effect. That puts the inclusion of 99 rat pups in this study into perspective, doesn't it?

So what is the point?

By now a lot of you are likely asking: if the study had little chance of coming to a solid conclusion, what is the point of this type of study? The simple answer is that a pilot study is exactly that: a pilot study. You throw stuff at a wall to see if anything sticks. If something does stick then that is a good direction for further research.

Due to its small sample size, massive number of potential comparisons, and lack of refinement, all a pilot study of this type does is set the project up for a more directed study later. Another research group can use the information from this study to design an experiment to see if the initial observed correlations can be repeated. Until those more-detailed studies are done, the results from this pilot study really tell us nothing useful about whether glyphosate is the cause of the observed changes or whether those numbers are simply the result of a random roll of the dice.

Conclusion

As I finish this blog post I know the question I am going to be asked is: well, did glyphosate have an effect or not? My response is: I don't know and neither do the authors of this study. The study was not designed to answer that question and as such is unable to do so, but rather opens up directions for future investigations. The only people who are going to express certainty about the outcome of this study are activists who will trumpet it as proof that glyphosate is a danger. Because of the study's design it is virtually impossible to derive any positive or negative conclusions from the research, but that won't stop the activists or their journalist friends who love a scary headline. All I can hope is that if enough observers understand the limitations of sample design and statistics they will call out the activists when they make these inaccurate statements.


Understanding the difference between a “hazard” and a “risk” or why scare stories about glyphosate and pesticides in your food shouldn’t frighten you

I have written a lot at this blog about how chemical risks are communicated to the public and so I am often asked about news stories depicting the latest science scare story. Sometimes they are handled badly, like the CTV National News report about glyphosate with the chilling title: Weed-killing chemical found in pasta, cereal and cookies sold in Canada: study. Sometimes it is done much better, like the Global News take on a similar topic: 'Dirty Dozen': Do these fruits and veggies really have harmful amounts of pesticide? As I will explain in this blog post, ultimately it comes down to understanding that we have to stop asking the question "can this compound cause cancer?" and instead ask "is this compound expected to cause cancer at the concentrations encountered in that study?" because while the answer to the first question may be "yes", the answer to the second will almost always be "no". In asking those questions we can understand the fundamental difference between a hazard and a risk.

As many of my readers know, one of my areas of professional practice is risk assessment. In my practice I often hear people interchange the words "hazard" and "risk". These are not interchangeable terms. I can't repeat this enough: the words "hazard" and "risk" mean very different things and it is important to understand that something can be a hazard without posing a serious risk.

  • A hazard is anything that has the potential to cause harm.
  • A risk is the likelihood that a hazard will cause harm.

Notice the difference? An unfenced swimming pool is a hazard to toddlers. But if that unfenced swimming pool is on a fenced estate where toddlers are not allowed then it poses no risk to your toddler.

So let’s bring this back to the idea of pesticides like glyphosate.

Well by now we all know that the International Agency for Research on Cancer (IARC) has designated glyphosate as a Group 2A carcinogen meaning they believe it is “probably carcinogenic to humans”.

There are very strong arguments that the IARC conclusion was incorrect and that glyphosate is likely not a carcinogen. The UN Food and Agriculture Organization and World Health Organization, the European Food Safety Authority (EFSA), Health Canada and the US EPA all agree that the IARC is wrong. Strong articles have even been written suggesting the IARC decision was fundamentally flawed.

That being said, let's assume that every other major health agency is wrong and the IARC is right. In their Monographs (research studies) the IARC makes a very important point:

The IARC Monographs Programme evaluates cancer hazards but not the risks associated with exposure. The distinction between hazard and risk is important. An agent is considered a cancer hazard if it is capable of causing cancer under some circumstances. Risk measures the probability that cancer will occur, taking into account the level of exposure to the agent. The Monographs Programme may identify cancer hazards even when risks are very low with known patterns of use or exposure [my bold].

Even in their own document the IARC explains that a pesticide can pose a hazard and not be a risk to human health.

The other thing to understand is that analytical chemists are really, really good at finding very small amounts of compounds in mixtures. As I pointed out in a previous post, analytical chemistry has become so precise that a modern mass spectrometer can distinguish down to the parts per trillion range. That would be 1 second in 30,000 years. So when an activist says they found "detectable" concentrations of a pesticide in a sample you should probably take that with a grain of salt. Reading the two studies presented at the top of this blog post, you will see they found pesticides in the parts per billion range. A part per billion would be a drop of water in an Olympic-sized swimming pool.

In toxicology and risk assessment the way we determine whether an exposure to a chemical poses a risk is to calculate the reference dose (RfD). An RfD is a concentration or dose of the compound in question to which a receptor may be exposed without adverse health effects (i.e. a dose that is considered "safe" or "acceptable"). For pesticides, Health Canada calculates maximum residue limits (MRLs) that represent concentrations of a compound that are not considered to pose a significant risk to the public.

Health Canada has established MRLs for glyphosate for all sorts of foodstuffs. The entire list is here. This list establishes concentrations that are considered to be entirely safe (i.e. pose no significant threat to the public). This is where journalists, like those in the CTV story, can get it wrong. To explain, let's look at a post from Joe Schwarcz who looked more deeply into that study:

the highest amount of glyphosate found was 760 ppb which is way, way below Health Canada’s standard for oat products at 15,000 ppb. A small child eating 100 grams of the cereal would consume 0.076 milligrams of glyphosate. Most regulatory agencies have concluded that consumption up to 0.5 mg/kg body weight per day presents no problem, so that a 10 kg child could consume 5 mg per day. The 0.076 mg consumed is 1/66th of this.

That is, a 10 kg child (a baby) would have to eat 66 bowls of Cheerios a day to experience a detectable risk to their health.
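Laying that arithmetic out explicitly, using only the figures from the quote above:

```python
# Re-doing the arithmetic from the Schwarcz quote: how many 100 g servings per day
# would a 10 kg child have to eat to reach the commonly cited tolerable daily dose?
glyphosate_per_serving_mg = 0.076   # 760 ppb in a 100 g serving = 0.076 mg
tolerable_dose_mg_per_kg  = 0.5     # consumption figure cited above (mg/kg body weight per day)
child_weight_kg           = 10

daily_limit_mg = tolerable_dose_mg_per_kg * child_weight_kg        # 5 mg/day
servings = daily_limit_mg / glyphosate_per_serving_mg
print(f"Servings per day to reach the limit: {servings:.0f}")      # ~66
```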

This brings us back to our story about fear-mongering and glyphosate. I can’t count the number of people on my social media feed who pointed out that a jury found Monsanto liable for causing a man’s cancer. Well, I am not the first person, nor will I be the last, to point out that US juries are not known for their ability to understand science. Another jury once believed the story about an infamous glove (“if it doesn’t fit, you must acquit”) and we now all agree they were out to lunch.

Instead of trusting US juries for our science, I think we should stick with the professionals, and they all agree (even the IARC) that the concentrations of glyphosate you encounter in your daily life, in your breakfast cereals and in your nutritious fruits and vegetables, are not high enough to cause any harm. In other words, while glyphosate may represent a theoretical hazard to human health, it does not pose a real risk to you or your children.

To conclude, let me reiterate. When you see a study like those presented above do not ask: “can this compound cause cancer?” Instead ask: “is this compound expected to cause cancer at the concentrations encountered in that study?” By asking the right question you will get the correct answer. Then you can ignore the fear-mongers and go back to eating those healthy fruits and vegetables in peace.


Author’s note:

Because I deal with risk all the time in this blog I have prepared a series of posts to help explain the risk assessment process. The series starts with “Risk Assessment Methodologies Part 1: Understanding de minimis risk”, which explains how the science of risk assessment establishes whether a compound is “toxic” and why dose/response relationships matter. It introduces the concept of a de minimis risk: a risk that is negligible and too small to be of societal concern (ref). The series continues with “Risk Assessment Methodologies Part 2: Understanding “Acceptable” Risk” which, as the title suggests, explains how to determine whether a risk is “acceptable”. I then cover how a risk assessment is actually carried out in “Risk Assessment Methodologies Part 3: the Risk Assessment Process”. I finish off the series by pointing out the danger of relying on anecdotes in a post titled “Risk Assessment Epilogue: Have a bad case of Anecdotes? Better call an Epidemiologist”.


Some thoughts from a Pragmatic Environmentalist for the Climate Strikers

As a pragmatic environmentalist who has been working to advance environmental causes for the last 30+ years I would like to take a moment to provide some advice to the youth of today on your 2019 Climate Strike.

First let me start with congratulations. You have taken the first step by starting a movement, hopefully one that will go on to do great things. But movements can easily get sidetracked. This is a particular concern in a movement like yours. You have deeply held and sincere beliefs, but little experience. You need to understand that knowing a problem needs to be solved is very different from knowing how the problem should be solved.

How energy is generated and used is not a topic you can learn quickly or easily. It has complexities that experts who have spent their lifetimes studying the topic still struggle with. One of my big concerns with the climate strike is that it is being driven by people who really aren’t aware of the complexity of the energy debate and instead talk about simple answers to complex problems. This is not a field that lends itself to simple answers.

As an example, I keep listening to the demands of the Climate Strikers and it is all about stopping all fossil fuel use now and blaming previous generations for the conditions of the present. That approach ignores the realities of our era. I read that:

What Thunberg and her fellow protesters want from their governments is to “keep fossil fuels in the ground, phase out subsidies for dirty energy production, seriously invest in renewables and start asking difficult questions about how we structure our economies and who is set to win and who is set to lose,” 

What I don’t hear from your speakers is a recognition that we currently have a transportation (and thus food supply) system that is utterly dependent on fossil fuels and will be for the next 20+ years. We simply don’t have widely available fossil fuel-free options for transport trucks, container ships, cube vans or airplanes. Were we to “keep fossil fuels in the ground” our food supplies would quickly dry up and people would starve. This means to fight climate change we need to figure out how to address non-transportation uses while we innovate in the transportation field.

Part of this fight means getting electricity from fossil fuel-free sources and getting off fossil fuels for household uses. The City of Vancouver model is a great start on that front: moving away from natural gas for residential uses and using less electricity in those residences. In addition, we need carbon taxes to provide the funds to pay for the research that will fuel innovation. This will also help get more people into electric vehicles.

As activists we need to understand that it is a bad idea to undercut governments that are actively trying to enhance our fossil fuel-free energy alternatives (like Site C and run-of-river) and that we should not knee-cap sympathetic governments (like Rachel Notley’s government in Alberta) when they try to get incremental change implemented. While many complained that Alberta’s Climate Leadership Plan didn’t include everything they wanted, it did involve spending hundreds of millions of dollars getting Alberta off coal while investing in renewables. The activists who fought the program are helping elect a government that will be massively antithetical to the cause. This will ensure Alberta ceases to be a climate leader and instead has to be dragged into the fight.

I know you believe that you need to inhabit the moral high ground, but holding the moral high ground as the planet burns around you will get you nowhere. You need allies and fellow travelers to achieve your goals. This brings me to my most important piece of advice: don’t let outsiders with political motivations corrupt your movement. If you want it to grow and thrive it has to be non-partisan. You have to avoid being drawn into historic political battles because the only way we can get global change is to build a big tent. Excluding fellow travelers because they have different political views will ensure that your movement fails.

Historically, environmental movements have allied themselves with socially progressive groups. As someone who has studied environmental history I can assure you that this approach has failed every time it was tried. In the cases when the progressives finally won political power, with the aid of environmentalists, the environmental goals of the resulting government were mostly ignored while the progressives concentrated on their social goals. Your big tent movement has to include free-enterprise conservatives and political moderates to succeed.

You have to detach yourselves from the political activists who have latched onto your movement, especially the “watermelons” who insist that the only way forward is to destroy the current system and start again. We don’t have the time, and the public does not have the appetite, to follow that road. Look at the environmental performance of every strictly socialist country to date: the results have been abject failures because, as described in the “Tragedy of the Commons”, when individuals don’t have a stake in the protection of a resource they don’t protect that resource. Look around the world: the countries that have done the best on the environment are the ones that combine the best of all systems, like Canada and the Scandinavian countries.

Now I am going to say something you won’t want to hear: it is time to stop demanding impossible changes and to start looking at what is possible. If you insist on virtually impossible goals, like the purveyors of 100% wind, water and sunlight, then you are going to fail. You need to consider a pragmatic approach to energy, one that includes regionally appropriate renewables combined with low-carbon, high-density supporting power like nuclear.

The movement also has to eject the anti-scientific core of people who refuse to accept that real alternatives (like nuclear) are a necessary part of the solution. We also have to take a global look at emissions. Ask yourself: what can we do regionally to help reduce global emissions? If that means developing the BC LNG industry so we can reduce the amount of carbon emitted by China, then that is something we should do, because we live in a global environment and we can’t ignore what is happening in Asia and Africa.

Put another way: if you are in a sinking ship with a massive hole in the side, you don’t wait until you can fully fix the hole before you act. You first try to stem the flow of water into the ship to buy more time for a permanent fix. BC LNG is a tool to reduce the amount of water flowing into the ship so we have more time to fix it permanently.

We live in a world where 1.1 billion people live in energy poverty and 4.3 million people a year die from preventable indoor air pollution directly resulting from that energy poverty. Their governments are going to prioritize the health of today’s people over those of tomorrow. It is easy for climate strikers and activists who will go to bed well-fed and warm in Canada and Europe to tell the world to use less energy, but the governments of China and India still have deep poverty and hardship to fight and will ignore your cries.

So we need to do what we can at home. We need to reduce our personal emissions while working to reduce our community emissions and pushing for the policy changes that will let us develop technologies we can share around the globe to reduce global emissions. It is time to take the power of this movement and use it to implement real, incremental change, because your current demand that we burn it all down and start again is surely doomed to fail: it leaves out the 5 billion people who are just trying to survive. Ultimately, it doesn’t matter what we do in Europe and North America if we can’t bring Asia and Africa along with us; the more than 5 billion Africans and Asians will ultimately decide whether we have a chance to beat climate change.
