By | archive, textbook

Animal research is a controversial topic that has generated heated argument and debate around the world over the past few decades. Recently, animal rights groups have been growing rapidly worldwide. Most of these groups don’t just support animal welfare; they demand a ban on any use of animals in research. At the same time, more and more pro-testing groups that support the humane use of animals in research have been forming – these proponents recognize the essential role that animal research has played in medical advancements and breakthroughs throughout the years.

Imagine, for a second, a world where animal research is banned. First of all, medical progress would come to a standstill, with major setbacks in developing treatments for devastating diseases, including neurodegenerative disorders such as multiple sclerosis, Parkinson’s disease, and Huntington’s disease. Yes, we would still be able to use non-invasive techniques on humans (e.g. Magnetic Resonance Imaging – MRI) and in vitro “test tube” experiments, but would that be adequate? At least in my field of research, neuroscience, in vitro (Latin for “in glass”) studies are not very representative of the complexity of the human brain. This is a general disadvantage of in vitro studies; in most cases, the natural environment of the cells or tissue cannot be easily replicated in a petri dish. Of course, in vitro experiments are useful in certain cases and have various applications, but they don’t emulate the environment and complexity of living “in vivo” tissue (no surrounding tissues, no blood supply, no nutrients, etc.).

The truth of the matter is that most, if not all, scientists and researchers acknowledge the difficult ethical issues that arise from animal research. To my knowledge, no scientist enjoys using animals in research, as has been insinuated by various animal extremists. This type of research is demanding, time-consuming, laborious and very expensive. Animals need to be housed, fed, constantly monitored and cared for by specially trained animal welfare technicians and veterinarians, especially since stressed animals tend not to provide the best experimental results. Nevertheless, it is necessary, at least for the time being, to rely on such studies to understand how our body functions and to develop new, effective drugs for diseases.

The most important question we need to address as researchers and scientists is the following: why use animal models in research? Animal models provide a great tool for learning about certain diseases, especially with regard to how they progress over time and how they can be diagnosed. They also allow us to find new ways to treat diseases without endangering human lives in the process.

What animal activists fail to understand, in my opinion, is that we researchers appreciate the past and present contribution of animals to improving human health, helping cure diseases and saving lives. We take animal welfare very seriously, and we are committed to the humane treatment of the animals under our care. Investigators are required to carefully design research projects and protocols while taking into account the Three Rs (Replacement, Reduction and Refinement) [1]. The Three Rs are widely accepted ethical principles applied when designing a research project involving animals. They promote the replacement of animals where possible, the reduction of the overall number of animals used where replacement is not possible, and the refinement of the techniques and procedures employed to minimize suffering. These principles also ensure that established protocols are continuously refined in order to provide better conditions and care for the animals.

One of the main arguments used by animal activists against the use of animal models in research is that, in many cases, treatments that worked in animals were not successful in clinical trials with humans, and vice versa [2]. The fact of the matter is that no animal model can completely reproduce a human disease or a human organ, simply because of the biological differences between species. However, by choosing the appropriate model you can get a relatively good representation and therefore more accurate and relevant results. This does not mean that we have never learned anything or provided safe and successful treatments based on animal research. On the contrary, there are countless examples that demonstrate the merit of such practices. The development of several vaccines we use today is largely based on animal research, such as the human papillomavirus (HPV) vaccine [3] and the whooping cough and polio vaccines, as well as the discovery of insulin, the development of organ transplant techniques [4] and anti-transplant-rejection medication [5].

Another important argument made by animal activists is that there are several alternative methods for conducting research and gathering the results necessary for medical advancements. These alternative methods, however, cannot replace animal research just yet. Some of them include imaging techniques, such as MRI and functional MRI (fMRI) scanning, in vitro testing, micro-dosing and computer models.

Imaging techniques like MRI and fMRI scanning allow us to see areas of the brain “light up” under different conditions, giving us important information about how the brain works on a large scale. Nevertheless, imaging techniques have their limitations. The resolution is quite low, which means that you cannot see individual brain cells, but rather whole areas that could contain thousands of different types of brain cells. To study brain disorders, it is important to know which cell types are affected and how, which is not possible with current non-invasive imaging technology. Imaging techniques can provide invaluable information, but once again animal research is needed to understand diseases at the cellular and molecular level.

In vitro testing is based on the use of tissues and cells, a major source of which was and continues to be animals. Immortalized cell lines (that are preserved for many years and used over and over) are not always representative of the physiological functions of cells that are in their normal environment, the body. Even though there are applications for these immortalized cells, using “fresh” cells will yield results that will more closely represent what really happens in the body. We could possibly use human cells to address the need for these “fresh” cells, however, there are tissues and cells that are much more difficult to obtain from humans. One good example is brain cells. Would anyone be willing to go through unnecessary and extremely risky procedures to donate brain cells to science? Probably not. This brings us back to the use of animals to obtain the necessary cells to even conduct in vitro testing.

Micro-dosing (Phase 0 microdosing trials) is a new technique used to study the effect of drugs in humans by administering very small doses, as the name suggests. The idea behind micro-dosing is to administer doses so low that they are unlikely to cause a large-scale response throughout the body, instead causing a small, localized response that can be observed and studied. However, this technique has a few limitations. Since it only studies small doses of a drug, it cannot effectively predict the consequences of administering a higher pharmacological dose [6]. Future studies may be able to clarify whether the body responds the same way to “micro-doses” and to pharmacological “therapeutic” doses of a particular compound. Generally, micro-dosing seems to be a very promising tool that might replace the use of some animals in drug testing trials in the upcoming years.

To quote Professor Stephen Hawking: “Computers can do amazing things. But even the most powerful computers can’t replace animal experiments in medical research.” (Quoted by Seriously Ill for Medical Research in 1996.) Computers might not yet be fast or powerful enough to simulate all aspects of human physiology, but they are closer than ever. New advancements in the fields of computer science and engineering are making projects like the “Human Brain Project” [7] (a human brain simulation) possible. It is important to note that computers can’t possibly replace the study of a live brain, since we don’t understand its complexity to the point where we could write programs that represent brain function effectively. Computer simulations could, however, contribute to the optimization of experimental protocols, and thus reduce the number of animals required for research.

Research is always evolving, improving and progressing. It is possible that in the future we might not need to use animals for research purposes, as better, less expensive and less time-consuming methods may become available. Unfortunately, we are not yet at the stage where animal research is obsolete. I do believe that scientists can and should devote the time and effort to adopt such techniques when possible, as well as help develop and refine new techniques and procedures that will minimize unnecessary use of animals. Finally, it is important to communicate to the general public how animal research is conducted and how carefully animal welfare is addressed by scientists, as well as the measures that are in place to protect and care for the animals.


1. Russell, W.M.S. and Burch, R.L. The Principles of Humane Experimental Technique. 1959.

2. Perel, P., et al., Comparison of treatment effects between animal experiments and clinical trials: systematic review. BMJ, 2007. 334(7586): p. 197.

3. Peng, S., et al., Development of a DNA vaccine targeting human papillomavirus type 16 oncoprotein E6. J Virol, 2004. 78(16): p. 8468-76.

4. Moore, F.D. Give and Take: The Development of Tissue Transplantation. 1964, New York: Saunders.

5. Discoveries in Pharmacology, ed. Parnum, M.J. and Bruinvels, J. Vol. 3. 1986, Amsterdam: Elsevier.

6. Garner, R.C. and G. Lappin, The phase 0 microdosing concept. Br J Clin Pharmacol, 2006. 61(4): p. 367-70.

7. Human Brain Project. 2013; Available from: https://www.humanbrainproject.eu/

About Katerina Othonos

Katerina is a Neuroscience PhD student at UBC. When she isn't in the lab obsessing about making little neurons fat and happy, she enjoys reading sci-fi books and listening to heavy metal.



It is seven in the morning and your alarm is chirping away – or maybe it’s the morning sun, or the screech of a parent pressed for time. Regardless of how you awake from your slumber, you crawl out of bed and feel that familiar growl emanating from your hungry gut. Some might grab a snack and quickly rush out the door, others might ruminate over an elaborate breakfast, and some are content with a simple bowl of oatmeal. This first and necessary act of the day is repeated by billions of humans around the globe and should remind us all that we are nothing without food. However, implicit in this realization is another: that there can be no food without that life-giving stuff beneath our feet – the soil.

Soil silently performs a multitude of services critical for life-sustaining ecosystem functions. It controls the cycling of energy (carbon) and nutrients (nitrogen, phosphorus, potassium, sulphur, etc.) that sustain the food and fibre we depend on for nourishment and materials. Soils filter our drinking water and, with careful management, can fix carbon dioxide and reduce other greenhouse gases to mitigate the effects of climate change. Soil is teeming with life – just one teaspoon of fertile soil can contain 9 billion microbes, more than the total human population of the planet (Doran et al. 1998). These microbes are largely responsible for the biochemical transformations that provide the above ecosystem functions. In a sense, we can’t live without these microbes, and for this reason, understanding who they are and what they are doing can help us in nearly limitless ways, from increasing crop yields to finding novel enzymes to better understanding climate change.

The challenge, however, remains: how exactly do we study and characterize who these microbes are and what they are doing? Most soil microbes are invisible to the naked eye, and less than 1% of microbial life has been cultured in the lab (Torsvik and Øvreås, 2002). For many years soil scientists referred to the soil microbial community as a ‘black box’: reliable measures of soil processes, such as soil nutrient concentrations, were easily obtained, but how the microbial community adapts to and influences these processes remained a mystery. The soil scientist’s toolbox was not equipped to measure who is out there, much less what they are doing. Today, that black box is beginning to be pried open, largely due to advances in human genetics that paved the way for microbiologists of all stripes and colours.

The driving force behind the current revolution in understanding soil microbial communities has its roots in the Human Genome Project. A genome is the total complement of DNA from a single organism; it contains all of the genes that produce all of the proteins that make an organism what it is (Clark et al. 2009). The Human Genome Project was an unprecedented investment in genetic research – it cost U.S. taxpayers $2.7 billion, but enabled scientists to sequence the human genetic code in its entirety by 2003 (NHGRI 2010). A sequenced genome is an invaluable resource, especially for human health, as mutations within the genetic code that lead to disease can be better understood, potentially leading to alternative medical treatments. The technology used to sequence the human genome is known as Sanger sequencing, a method that became the rate-limiting step in sequencing the 3 billion base pairs of the human genome (Mardis 2011). The massive investment in the Human Genome Project, coupled with the drive to increase the speed of DNA sequencing, led to new developments in sequencing technology, known today as ‘next-generation sequencing.’

Decoding soil microbial genomes is an important step in understanding the microbial ‘black box.’ From knowledge of the genetic code of a community of soil organisms comes inference of the capabilities of that community. For example, scientists may expect soil samples with a diverse complement of genes coding for enzymes involved in decomposition to more quickly decompose inputs of organic matter such as fallen leaves. The problem, though, is that DNA extracted from soil does not come from only one type of microbe; one estimate pegs the diversity of prokaryotes (unicellular microbes without a nucleus) at 52,000 unique species in one gram of soil (Roesch et al. 2007). The average prokaryotic genome is 2 million base pairs, which would require that 104 billion base pairs be sequenced for a single gram of soil (Gilbert and Dupont 2011). In contrast, the well-funded Human Genome Project took 13 years to complete, with only 3 billion base pairs sequenced. It is precisely for this reason that next-generation sequencing technology has enabled such advances in the study of soil microbial communities.
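The scale of the problem is easy to verify with back-of-the-envelope arithmetic; the figures below are simply the estimates quoted above, not new data:

```python
# Back-of-the-envelope check of the sequencing burden for one gram of soil,
# using the estimates quoted in the text.
species_per_gram = 52_000        # prokaryotic species (Roesch et al. 2007)
avg_genome_bp = 2_000_000        # average prokaryotic genome, in base pairs
soil_bp = species_per_gram * avg_genome_bp
print(soil_bp)                   # 104000000000 -> 104 billion base pairs

human_genome_bp = 3_000_000_000  # the Human Genome Project's target
print(soil_bp / human_genome_bp) # roughly 35 human genomes' worth per gram
```

In other words, fully sequencing one gram of soil would mean tackling a dataset dozens of times larger than the one that occupied the Human Genome Project for 13 years.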

The first high-throughput second-generation sequencing platform was introduced in 2005 as the Roche 454 pyrosequencer (MacLean et al. 2009). Other platforms are available and in some ways more widely used today (the Illumina platform, for example), but a simplified outline of the 454 sequencing method is given here:

DNA extracted from a soil sample is first ligated (attached) to universal adapters. The DNA-adapter complex is then immobilized onto reaction beads before sequencing begins. Each bead now carries a unique fragment of extracted DNA, and thousands of beads, each with a different DNA fragment, are loaded onto a plate with thousands of wells in which the individual beads reside (MacLean et al. 2009, Mardis 2008). In this way, each well houses a bead that itself carries a unique piece of extracted DNA. The soil DNA attached to the beads is single stranded, meaning its nucleotide bases (G, A, T and C) remain unpaired and will readily bind with their mates to form base pairs. Sequencing begins by flooding the plate with a single nucleotide. If a fragment in any well has an unpaired adenine (A) that is next in sequence, and thymine (T) is flooded into the plate, an A-T bond will form. The nucleotides flooded onto the plate are modified such that, when bound to their mates, the reaction triggers the activity of luciferase, the same enzyme responsible for the light emitted by fireflies (Mardis 2008). The activity of luciferase emits light that is detected by a camera. The plate is then washed of nucleotides, a different nucleotide is introduced, light is emitted when the added nucleotide finds its mate, and the camera snaps another picture. Every well holding a DNA fragment with an unpaired nucleotide that is the mate of the nucleotide being flooded will light up, and the light will be captured by the camera from multiple wells at the same time. In this way, thousands of DNA fragments can be simultaneously sequenced, driving down both the time and cost of sequencing. This is a powerful technique – had it been around when the human genome was being sequenced, it could have sequenced the entire human genome in 10 days or less (Nyrén 2007).
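The flooding-and-detection cycle described above can be sketched in code for a single well. This is purely illustrative – the function name and flood order are invented for the sketch, and real 454 chemistry reads homopolymer runs from the light intensity of a single flood, which is glossed over here:

```python
# Toy sketch of one well in a 454-style pyrosequencing run.
COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

def pyrosequence(template):
    """Recover the strand synthesized against `template` by flooding one
    nucleotide at a time and recording which floods emit light."""
    read = []                       # light events, in order
    pos = 0                         # next unpaired base on the template
    flood_order = ["T", "A", "C", "G"]
    while pos < len(template):      # one loop pass = one full flood cycle
        for nt in flood_order:
            # a flood pairs with every consecutive matching base at once
            while pos < len(template) and COMPLEMENT[template[pos]] == nt:
                read.append(nt)     # luciferase fires; camera records light
                pos += 1
            # plate is washed before the next nucleotide is flooded
    return "".join(read)

read = pyrosequence("GATTC")        # complementary strand: "CTAAG"
```

The light events, taken in order, spell out the strand complementary to the template; with thousands of wells imaged at once, thousands of such reads are produced per run.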

With advanced sequencing technologies in their toolkit, soil scientists (who might now call themselves soil microbiologists) can get down to the business of figuring out who is beneath our feet and what they are doing. Easier said than done. The reality is that genomic DNA extracted from soils can come from hundreds of thousands of uncultured, never-characterized, mystery microbes. The DNA extraction procedure produces millions of DNA fragments in a ‘shotgun’ approach, such that bits and pieces of the entire soil microbial community are sequenced to produce a profile of all genes distributed throughout the community (Gilbert and Dupont, 2011). This method of study is in contrast to genomic studies, in which the genome of only one organism is considered. Handelsman et al. (1998), in their study of soil microbes from environmental DNA, coined the term ‘metagenomics’ to describe this new branch of genomic research.

Metagenomic studies of soil DNA produce vast quantities of discontinuous sequence data. Post-sequencing data processing involves stitching together the fragments of DNA into something that resembles partial microbial genes. This work is computationally difficult – it is as if you fired a shotgun close-range at a painting and then attempted to reconstruct the painting by piecing together all the obliterated bits of canvas, without knowing what the painting looked like to begin with. In fact, an entire field – bioinformatics – is devoted to developing algorithms for processing metagenomic sequence data. The first step in data processing is to attempt to stitch together (assemble) the sequence data to form longer DNA fragments (Wooley et al. 2010). Longer fragments are easier to characterize, and these fragments are ‘binned’ to assign them to a known microbial species, usually by comparing sample DNA to reference databases using tools such as the Basic Local Alignment Search Tool (BLAST). One downfall is that most of the sequences in a metagenomic dataset will remain unassigned, because most of the reference database is derived from well-characterized, cultured organisms (Simon and Daniel, 2011). Overall, processing sequence data is the final step in piecing together the structure and function of a soil microbial community. Although reference databases are incomplete, they do assign some collected fragments to known microbial species with well-characterized genes of known function. Less than 10 years ago this was a largely impossible, or at least prohibitively expensive, task. Advances in sequencing technology, bioinformatics and reference databases will only improve our resolution over time.
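To make the stitching step concrete, here is a minimal greedy overlap-assembly sketch. It is a hypothetical toy, not how production assemblers work: real tools use graph-based methods and must cope with sequencing errors, repeats and reverse complements, all ignored here.

```python
# Greedy overlap assembly: repeatedly merge the two fragments that
# share the longest suffix-prefix overlap.

def overlap(a, b, min_len=3):
    """Length of the longest suffix of `a` that matches a prefix of `b`."""
    for n in range(min(len(a), len(b)), min_len - 1, -1):
        if a.endswith(b[:n]):
            return n
    return 0

def assemble(reads, min_len=3):
    """Merge fragments until no pair overlaps by at least `min_len` bases."""
    reads = list(reads)
    while len(reads) > 1:
        best, pair = 0, None
        for a in reads:
            for b in reads:
                if a is not b:
                    n = overlap(a, b, min_len)
                    if n > best:
                        best, pair = n, (a, b)
        if pair is None:
            break                    # no overlaps left; fragments stay apart
        a, b = pair
        reads.remove(a)
        reads.remove(b)
        reads.append(a + b[best:])   # merge, keeping the overlap once
    return reads

fragments = ["GATTACA", "TACAGGT", "AGGTCCA"]
contigs = assemble(fragments)        # one contig: "GATTACAGGTCCA"
```

With millions of fragments instead of three, and overlaps obscured by errors, the combinatorics of this "find the best pair" search are exactly why the problem demands dedicated bioinformatics algorithms.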

With the advances afforded to the study of soil microbial communities through second-generation sequencing technology, soil scientists have only just begun to pry open the microbial black box. Currently, research is largely focused on microbial decomposition of plant inputs into soil, first because of the search for novel enzymes for the biofuels industry, and second because microbial decomposition has important consequences for global CO2 emissions (Baldrian 2014). The field is also rapidly changing, with emphasis increasingly placed on characterizing gene products (RNA, proteins) over the metagenome, so as to capture the genes and proteins that are active in the soil under various environmental conditions. The rapid advances in sequencing technology over the last 10 years make this an exciting time for soil science research. These methods may one day reveal the secrets held within one of the last frontiers of modern science – the soil beneath our feet.


Baldrian, P., & López-Mondéjar, R. (2014). Microbial genomics, transcriptomics and proteomics: new discoveries in decomposition research using complementary methods. Applied Microbiology and Biotechnology, 1-7.

Clark, D. P., Dunlap, P. V., Madigan, M. T., & Martinko, J. M. (2009). Brock Biology of Microorganisms.

Doran, J. W., Jones, A. J., Arshad, M. A., & Gilley, J. E. (1998). Determinants of soil quality and health. Soil Quality and Soil Erosion, 17.

Gilbert, J. A., & Dupont, C. L. (2011). Microbial metagenomics: beyond the genome. Annual Review of Marine Science, 3, 347-371.

Handelsman, J., Rondon, M. R., Brady, S. F., Clardy, J., & Goodman, R. M. (1998). Molecular biological access to the chemistry of unknown soil microbes: a new frontier for natural products. Chemistry & Biology, 5(10), R245-R249.

MacLean, D., Jones, J. D., & Studholme, D. J. (2009). Application of ‘next-generation’ sequencing technologies to microbial genetics. Nature Reviews Microbiology, 7(4), 287-296.

Mardis, E. R. (2008). Next-generation DNA sequencing methods. Annu. Rev. Genomics Hum. Genet., 9, 387-402.

Mardis, E. R. (2011). A decade’s perspective on DNA sequencing technology. Nature, 470(7333), 198-203.

National Human Genome Research Institute. “The Human Genome Project Frequently Asked Questions.” Genome.gov. http://www.genome.gov/11006943 (accessed Feb 12, 2014).

Nyrén, P. (2007). The History of Pyrosequencing®. In Pyrosequencing® Protocols (pp. 1-13). Humana Press.

Roesch, L. F. W., Fulthorpe, R. R., Riva, A., Casella, G., Hadwin, A. K. M., Kent, A. D., Daroub, S. H., Camargo, F. A. O., Farmerie, W. G., & Triplett, E. W. (2007). Pyrosequencing enumerates and contrasts soil microbial diversity. The ISME Journal, 1(4), 283-290.

Simon, C., & Daniel, R. (2011). Metagenomic analyses: past and future trends. Applied and Environmental Microbiology, 77(4), 1153-1161.

Torsvik, V., & Øvreås, L. (2002). Microbial diversity and function in soil: from genes to ecosystems. Current Opinion in Microbiology, 5(3), 240-245.

Wooley, J. C., Godzik, A., & Friedberg, I. (2010). A primer on metagenomics. PLoS Computational Biology, 6(2), e1000667.

About Tim Philpott

Tim Philpott is a PhD student in the Faculty of Forestry's Belowground Ecosystem Group at UBC. He is trying to get a handle on the brave new world of metatranscriptomics and apply it to soil ecology, hopefully before a new sequencing technology makes his new skills obsolete.



Gravitational waves – ripples in space and time predicted by Einstein’s theory of general relativity – have yet to be directly detected, and many scientists are working hard to confirm their existence. Emmanuel Fonseca talks about how the detection of these waves would uncover previously invisible objects such as black holes and cosmic strings, and may have further important implications.

Emmanuel, a second-year PhD student in Physics and Astronomy, gives an informative introduction to gravity, gravitational waves, and what research into them could mean for the world. In his talk he also speaks to the importance of communicating scientific research and of science education.

About Emmanuel Fonseca

Emmanuel Fonseca is a PhD student in UBC Physics and Astronomy, and currently on the hunt for gravitational waves.



Note that this was written in April 2013, with an emphasis on the activities around the Northern Gateway pipeline. In late December 2013, the joint review panel recommended that the federal government approve the pipeline, “with conditions.”


Q: What is a pipeline?

A pipeline is a means of transporting products from one point to another via a piping system. Oil pipelines carry crude oil and petroleum products from the oil source to the communities that require them, or to an alternate point from which the oil can be shipped or otherwise transported. Oil pipelines generally run underground or underwater, but certain sections can sometimes be found running above ground. An example is the Mississippi Canyon Pipeline, which carries gas from an offshore drilling site in the Gulf of Mexico into Louisiana, connecting there with two interstate pipelines onshore. The Mississippi Canyon Pipeline uses 30” diameter piping, runs 45 miles, and has a capacity of around 800 million cubic feet of gas per day [1]. It is owned by Enbridge Offshore Gas Transmission L.L.C.

Q: How does a pipeline leak or spill?

No oil pipeline can guarantee 100% leak- or spill-free operation. Even outspoken pipeline advocacy groups admit, “While every pipeline company is working to achieve incident-free operations, accidents do happen.” [2] A major factor in the bursting of pipelines is corrosion. Pipelines are usually constructed out of carbon steel coated with various protectants, instead of expensive but structurally sound stainless steel [3]. There is also recent indication that the life span of the pipes may be overestimated by some companies, leaving questionable lines subject to corrosion [4]. Other factors also cause pipeline leaks, like the failed gasket that led to the Elk Point, Alberta leak in the summer of 2012 [5], and excavation projects that disturb pipelines. When oil is barreled and transported by tankers at the coastal end of an underground pipeline, other causes of spills, like shipwrecks, must be considered as potential threats too.

Q: How much oil do they spill?

For the year 2009, Enbridge reported 69 spills in Canada and 20 in the U.S.A. (America has different standards for reporting spills), amounting to a total of 8,353 barrels of oil [6] (one barrel equals about 159 litres of oil). British Columbian online environmental news publisher The Watershed Sentinel totaled a decade’s worth of Enbridge’s spills from 2000 – 2010: 132,715 barrels [7]. In those ten years, Enbridge was responsible for leaking roughly 20 million litres of oil into North American fields, forests, streams, rivers and seas. Of course, Enbridge is only one of the hundreds of pipeline companies in North America.
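As a quick sanity check on these totals, using the standard conversion of roughly 159 litres per oil barrel:

```python
# Sanity-check the spill totals quoted above.
LITRES_PER_BARREL = 158.987    # one standard oil barrel, in litres

decade_barrels = 132_715       # Enbridge spills, 2000-2010 [7]
decade_litres = decade_barrels * LITRES_PER_BARREL
print(round(decade_litres / 1e6, 1))  # about 21.1 million litres

year_2009_barrels = 8_353      # reported spills in 2009 [6]
year_2009_litres = year_2009_barrels * LITRES_PER_BARREL
# about 1.3 million litres for 2009 alone
```

So the “roughly 20 million litres over the decade” figure is in the right ballpark, if slightly conservative.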

Q: Who is responsible for cleanup?

Pipeline companies like Enbridge rely on their own response crews when spills occur. They use trained teams to execute response plans in efforts to mitigate environmental damage. Enbridge uses equipment like skimmers, containment booms and vacuum trucks, and techniques like dredging, in its responses to spills, and even has specialized equipment that supposedly keeps wildlife away and out of harm [8]. However, the effectiveness of such equipment and techniques must be called into question in light of insufficient cleanup efforts such as the response to the July 2010 Kalamazoo River spill, which is still being contained and cleaned today.

Q: What are the dangers of a spill?

Oil spills obviously have serious environmental and health consequences. Pollution, fires, property and land damage, water poisoning, and the destruction of habitats of birds, fish and many other animals are a few of the environmental risks inherent in pipelining oil. For humans, oil spills carry the potential for cancer and leukemia, along with the knock-on effects of the environmental damage, not to mention serious economic loss [9].

– – –


Q: Who is Enbridge?

Enbridge is a North American pipelining company (with many subsidiaries) that claims to be the “largest single conduit of crude oil into the United States.” [10] It operates many inland and offshore, liquid and natural gas pipelines throughout the U.S.A. and Canada, and employs around 6,900 workers [11]. It is important to remember that Enbridge is a corporation, concerned with providing financial return for investors. Though preventative measures to avoid spills are taken, Enbridge is not an organization with expressed environmental or health concerns beyond how such matters might affect investment return.

Q: What is the proposed pipeline?

Enbridge’s proposed Northern Gateway pipeline would run from the Alberta tar sands through British Columbia to the Pacific coast. A 36” diameter pipeline would be installed over a 1,177 km route from outside Edmonton to Kitimat, B.C., carrying about half a million barrels of petroleum daily to the ocean [12]. It is estimated to cost $5 billion CAD [13]. Construction of the pipeline is currently slated to begin in mid-2014, with oil proposed to flow in 2017 [14]. The economic benefits of petroleum trade with Asia are the clear reason for Northern Gateway’s inception.

Q: What are the areas vulnerable to spillage?

The proposed pipeline runs very near or through populated areas of British Columbia such as Vanderhoof, Fort St. James, Bear Lake, Burns Lake and Prince George on its way to Kitimat. Inhabitants of these towns and their properties would be at high risk in the event of a leak or spill. The Northern Gateway pipeline would cross more than 1,000 natural waterways, including the Fraser and Skeena watersheds [15], and allow oil tankers constant access to the ecologically rich Great Bear Rainforest region of the British Columbian coast [16].

Q: What happens if it leaks or spills?

If a spill or leak were to occur along the Northern Gateway pipeline, the environmental consequences would be severe. Streams, lakes, rivers and groundwater could be contaminated, and local human and animal populations would risk poisoning and disease. Ecosystems could be damaged beyond repair. In addition to the harsh damage to marine and forest life, one report claims that a tanker spill near Kitimat has the potential for local and provincial economic damage that greatly outweighs Northern Gateway’s potential economic benefits [17].

– – –


Q: What is the government’s stance?

B.C. Premier Christy Clark is outspokenly ambivalent on the Northern Gateway project. She acknowledges the economic benefit it would bring, but has laid out five criteria that Enbridge must meet. These criteria include second-to-none response plans for ocean or land spills, the addressing of First Nations’ claims, and the honouring of treaty rights. The National Energy Board and the Canadian Environmental Assessment Agency have formed a joint review panel on the Enbridge Northern Gateway program. The panel is in the process of assessing the environmental consequences of the Northern Gateway pipeline. It considers all relevant material from those concerned and involved, and will release a recommendation to the Governor in Council on whether or not the project should proceed and, if so, what terms and conditions Enbridge must adhere to [18]. As of now the panel is requesting more information from Enbridge Northern Gateway regarding potential hydrocarbon releases and the geographic effects of the proposed route [19]. The panel’s delay and the Premier’s ambivalence could be seen as signs of a reluctant province.

Q: What is Enbridge’s assurance?

Enbridge claims, “Pipelines are the safest, most efficient and most reliable way to transport liquid and natural gas energy resources.” [20] One must ask how such a safe and reliable means of transportation can yield so much leakage and spillage with such consistency. As mentioned earlier, no pipeline can be guaranteed to be leak or spill free. The Northern Gateway project promises that its “goal is to have zero spills” [21] and that it plans on taking extra precautions, highlighted by 24/7 monitoring and safety valves near water crossings [22]. Enbridge’s Northern Gateway advocates repeatedly assure British Columbians that there is no safer way to transport natural gas than pipelines [23], an assurance somewhat undermined by the damage done in areas like the Kalamazoo River.

Q: Why is this not enough?

Strict monitoring and strategically placed release valves only highlight the imminence of a leak or spill along the Northern Gateway pipeline. While these measures are effective responses to leaks and spills, they aren’t magic wands. If oil spills into the ocean, irreparable damage will be done to marine life. If oil leaks into B.C.’s forests, irreparable damage will be done to animals and plants. And when oil leaks into your backyard, you will be very unhappy.

- – -


It seems there are two alternatives to proceeding with Enbridge’s Northern Gateway project as planned. The first would be to impose stricter regulations regarding leak and spill prevention that would bring the operation nearer to a 100% guaranteed spill-free standard. The other would be to wave goodbye to the Enbridge Northern Gateway project entirely and assess where else $5 billion might be spent to stimulate British Columbia’s economy and generate jobs.

At enormous cost, pipelines could be made of stainless steel to prevent corrosion. While it seems unlikely that pipeline companies would spring for this expense, other precautions can be taken to reduce corrosion. Stronger steel can be used, along with more durable protective coatings on the outside of the pipes [24]. The pipelines could also be cased in outer pipes to prevent damage to the inner ones that carry the oil. As a rule, pipelines should be clearly marked so as to avoid accidental disruption by construction or excavation projects. Of course, marking the route of the pipeline comes at the expense of marring the natural, rural or urban landscape with such markers.

Of course, alternative investments that have less potential for environmental and human harm should be strongly considered. In the Canadian Centre for Policy Alternatives’ publication, “Enbridge Pipedreams and Nightmares,” Marc Lee outlines alternative investment options in green-oriented fields. Lee reveals how investments in such fields could translate into 3 to 34 times as many jobs as the Northern Gateway project would [25]. He offers alternative jobs in fields like construction, education, and manufacturing that would contribute to four specific areas of alternative investment: a switch from fossil fuels to clean energy sources (of which Lee suggests that Enbridge be a participant), implementing transportation alternatives, improvements in energy efficiency for homes and buildings, and reduction of GHG emissions via advancements in the recycling system [26].

Considering the imminent threat a pipeline, however fortified, would bring to the province and its local communities, and the potential economic benefits of alternative investments, the more appealing option seems to be scrapping Enbridge’s plan for the Northern Gateway project altogether. Perhaps an alternate pipeline could be constructed that would avoid risks to British Columbian forests and coasts, like a pipeline that runs eastward from the tar sands. Better yet would be the realization of Marc Lee’s proposal: investment in clean energy sources and a movement away from natural gas.

- – -


The Enbridge Northern Gateway Joint Review Panel formed by the National Energy Board and the Canadian Environmental Assessment Agency has been holding forums in B.C. and Alberta in order to gather information from the public about concerns over Enbridge Northern Gateway. However, access to these public forums is limited, so members of the public must submit questions in advance and hope to be included in the panel’s process [27]. As an individual, it can be frustrating to try to navigate the system of deliberation on Northern Gateway. Fortunately, there are several organizations intent on halting the project, and one can get involved with these organizations to aid the cause.

The Canadian Centre for Policy Alternatives is a non-profit organization that conducts in-depth research on matters that might otherwise be taken for granted. The CCPA has a mandate to provide balanced debates, debunk myths generated by corporations, the government and the media, and offer sound and sustainable solutions to the problems it uncovers [28]. The CCPA foregrounds relevant environmental and economic issues, and is a leading voice in the call to review alternatives to the construction of Enbridge Northern Gateway. Marc Lee’s CCPA publication is extremely valuable in the struggle against the pipeline, as it reveals that Northern Gateway would be not only environmentally hazardous but economically unsound. You can get involved as a member or donor with the CCPA via their website: www.policyalternatives.ca

Forest Ethics is another group that has decidedly aligned itself in opposition to Enbridge Northern Gateway. Its cause is expressly concerned with the protection of forests and waters in British Columbia, as well as ensuring the environmental responsibility of industries and corporations. Forest Ethics is an outspoken anti-pipeline advocate, and has criticized the Joint Review Panel for a lack of transparency in its decision-making process [29]. To be a part of Forest Ethics’ activities or to donate, visit their website: www.forestethics.org

The Blue Drop organization is a much more grassroots movement, and it is extremely accessible to environmentally minded individuals in British Columbia. Its stance is decidedly against “the destruction of land and water by resource extraction projects.” [30] Blue Drop provides up-to-date information on ongoing developments in the Enbridge Northern Gateway process, and organizes events aimed at making the voices of the opposing public heard by policy and decision makers. Blue Drop has begun a trend of people pinning blue felt water droplets to their lapels, backpacks, jackets and whatever else, creating a visible symbol of public resistance and solidarity against the Enbridge Northern Gateway project. Join the Facebook group and make your own blue drop! Just visit www.bluedrops.ca

- – -


1. “Mississippi Canyon Gas Pipeline,” Enbridge U.S. Operations, accessed March 16, 2013, LINK

2. “Operating Pipelines,” Pipeline 101, accessed March 16, 2013, LINK

3. “How Safe Are North America’s Pipelines?” Globe And Mail, last modified Aug. 23, 2012, accessed March 17, 2013, LINK

4. Ibid.

5. Nathan Vanderklippe, “Enbridge Pipeline Reopens After Spill Near Edmonton,” Financial Post, last modified June 20, 2012, accessed March 16, 2013, LINK

6. “Looking at Enbridge’s Spill Record in 2009,” Enbridge, accessed March 16, 2013, LINK

7. “Enbridge Spills,” Watershed Sentinel, accessed March 19, 2013, LINK

8. “Emergency Response,” Enbridge, accessed March 20, 2013, LINK

9. “Fuel Pipelines,” Groundwork Environmental Justice Action in Southern Africa, accessed March 19, 2013, LINK

10. “Pipeline Systems,” Enbridge U.S. Operations, accessed March 17, 2013, LINK

11. “Company Overview,” Enbridge U.S. Operations, accessed March 17, 2013, LINK

12. “Project at a Glance,” Enbridge Northern Gateway Pipelines, accessed March 18, 2013, LINK

13. Marc Lee, “Enbridge Pipedreams and Nightmares,” p. 4, Policy Alternatives, accessed March 20, 2013, LINK

14. “Timeline,” Enbridge Northern Gateway Pipelines, accessed March 18, 2013, LINK

15. “The Facts – Enbridge Northern Gateway,” Forest Ethics, accessed March 19, 2013, LINK

16. Ibid.

17. “Single spill could wipe out economic gains from Northern Gateway,” Media Release, UBC Public Affairs, accessed March 18, 2013, LINK

18. “The Joint Review Process,” Gateway Panel, accessed March 18, 2013, LINK

19. “Panel Session Results and Decision Questions and Answers,” Gateway Panel, accessed March 18, 2013, LINK

20. “Pipeline Safety,” Enbridge U.S. Operations, accessed March 17, 2013, LINK

21. “Pipeline Assessment and First Response Plan,” Enbridge Northern Gateway Pipelines, accessed March 19, 2013, LINK

22. Ibid.

23. “Pipeline Safety,” Enbridge Northern Gateway Pipelines, accessed March 19, 2013, LINK

24. “Fuel Pipelines,” Groundwork Environmental Justice Action in Southern Africa, accessed March 19, 2013, LINK

25. Marc Lee, “Enbridge Pipe Dreams and Nightmares,” pp. 17-19, Policy Alternatives, accessed March 20, 2013, LINK

26. Ibid.

27. “Panel Session Results and Decisions Questions and Answers,” Enbridge Northern Gateway Project Joint Review Panel, accessed March 19, 2013, LINK

28. “About,” Policy Alternatives, accessed March 19, 2013, LINK

29. “Canada’s Enbridge Northern Gateway pipeline and tanker hearings,” Forest Ethics, accessed March 20, 2013, LINK

30. Blue Drop Facebook page, accessed March 20, 2013, LINK

- – -


“About.” Policy Alternatives. Accessed March 19, 2013. LINK

Blue Drop Facebook page. Accessed March 20, 2013. LINK

“Canada’s Enbridge Northern Gateway pipeline and tanker hearings.” Forest Ethics. Accessed March 20, 2013. LINK

“Company Overview.” Enbridge U.S. Operations. Accessed March 17, 2013. LINK

“Emergency Response.” Enbridge. Accessed March 20, 2013. LINK

“Enbridge Spills.” Watershed Sentinel. Accessed March 19, 2013. LINK

“The Facts – Enbridge Northern Gateway.” Forest Ethics. Accessed March 19, 2013. LINK

“Fuel Pipelines.” Groundwork Environmental Justice Action in Southern Africa. Accessed March 19, 2013. LINK

“How Safe Are North America’s Pipelines?” Globe And Mail. Last modified Aug. 23, 2012. Accessed March 17, 2013. LINK

“The Joint Review Process.” Gateway Panel. Accessed March 18, 2013. LINK

Lee, Marc. “Enbridge Pipedreams and Nightmares.” pp. 4-26. Policy Alternatives. Accessed March 20, 2013. LINK

“Looking at Enbridge’s Spill Record in 2009.” Enbridge. Accessed March 16, 2013. LINK

“Mississippi Canyon Gas Pipeline.” Enbridge U.S. Operations. Accessed March 16, 2013. LINK

“Operating Pipelines.” Pipeline 101. Accessed March 16, 2013. LINK

“Panel Session Results and Decisions Questions and Answers.” Enbridge Northern Gateway Project Joint Review Panel. Accessed March 18, 2013. LINK

“Pipeline Assessment and First Response Plan.” Enbridge Northern Gateway Pipelines. Accessed March 19, 2013. LINK

“Pipeline Safety.” Enbridge Northern Gateway Pipelines. Accessed March 19, 2013. LINK

“Pipeline Safety.” Enbridge U.S. Operations. Accessed March 17, 2013. LINK

“Pipeline Systems.” Enbridge U.S. Operations. Accessed March 17, 2013. LINK

“Project at a Glance.” Enbridge Northern Gateway Pipelines. Accessed March 18, 2013. LINK

“Single spill could wipe out economic gains from Northern Gateway.” Media Release, UBC Public Affairs. Accessed March 18, 2013. LINK

“Timeline.” Enbridge Northern Gateway Pipelines. Accessed March 18, 2013. LINK

Vanderklippe, Nathan. “Enbridge Pipeline Reopens After Spill Near Edmonton.” Financial Post. Last modified June 20, 2012. Accessed March 16, 2013. LINK

About Matt Clarke

Matt is an English Lit. major from Vancouver. He loves satire and sci-fi and is passionate about environmental preservation.