A widely cited study on the amount of methane leaking from oil and gas sites, including fracked wells, shows signs of a major flaw, a newly published peer-reviewed paper concludes.
“The University of Texas reported on a campaign to measure methane emissions from United States natural gas production sites as part of an improved national inventory,” researcher Touché Howard wrote in a paper published today in the journal Energy Science & Engineering. “Unfortunately, their study appears to have systematically underestimated emissions.”
The University of Texas study, the first in a 16-part research series backed by the oil and gas industry and the Environmental Defense Fund, had been hailed as “unprecedented” when it was published in October 2013. The drilling industry and its supporters cited it as clear-cut evidence that methane leaks were lower than previously believed and falling further due to new technology.
The study’s key contribution to the science on methane leaks was that researchers were allowed access to oil and gas wells, including 27 wells where fracking was underway, and could test individual pieces of equipment. “This is actual data, and it’s the first time we’ve had the opportunity to get actual data from unconventional natural gas development,” Mark Brownstein, an Environmental Defense Fund associate vice president, told FuelFix when the UT study was published.
But the problem stems from the tool that the University of Texas study used to collect its data – which can malfunction when leaks are spewing at high rates. The “University of Texas study underestimates national methane emissions at natural gas production sites due to instrument sensor failure,” Mr. Howard, who invented the basic technology used by that instrument, wrote.
How big could the resulting problem be? “Over 98% of the [methane] inventory calculated from their own data and 41% of their compiled national inventory may be affected by this measurement failure,” the new paper concludes. Correcting that data isn’t a realistic possibility, since erroneous readings could have been off by a small amount or as much as 100-fold.
Methane is a potent greenhouse gas which warms the climate 86 times more than an equal amount of carbon dioxide (CO2) in the first two decades after it leaks – and if enough methane is leaking from the nation’s oil and gas sites, burning natural gas could be worse for the climate than burning coal. Nationwide studies of methane leaks have reached very different conclusions about leakage rates – some suggesting that gas could be a better fuel for the climate than coal, others suggesting that switching from coal to gas for electricity could quickly move the climate from the frying pan into the fire.
The $18 million, 16-part research series aims to provide a comprehensive national leakage rate that policy-makers can rely on – which is why the new evidence that a major part of that research relies on problematic data is so important.
“If Howard’s right, we’ll need to review other emission estimates used in EPA inventories,” Robert Jackson, an earth science professor at Stanford University who studies methane leaks, told Inside Climate News. “We need to sort this out as quickly as possible.”
Mr. Howard’s paper does not prove that the UT study understated the amount of methane that leaked at those wells, but it describes several red flags that strongly suggest that tool malfunctions led to problems.
“That such an obvious problem could escape notice in this high profile, landmark study highlights the need for increased vigilance in all aspects of quality assurance for all [methane] emission rate measurement programs,” his newly published paper concludes.
A Problem of Measurement
The University of Texas study used three Bacharach brand meters, which generally cost around $20,000 each, to collect its data. The meters are not only sometimes used to measure leaks reported to the EPA by drillers, but they’re also used by the industry to identify what leaks are most serious and merit the fastest repairs.
Back in March, a peer-reviewed paper published in the Journal of the Air & Waste Management Association and also co-authored by Mr. Howard concluded that the Bacharach meter has a troubling flaw. The meter relies on two different sensors – a sensitive one to sniff out small leaks and a more powerful one to measure leaks with higher amounts of methane. The meter is designed to switch from one sensor to the other when methane levels rise – but it turns out that something can go wrong in that handoff, the researchers found.
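To make the failure mode concrete, here is a toy sketch of a two-sensor meter whose range handoff fails. The threshold value and function names are hypothetical illustrations, not Bacharach's actual design or firmware logic:

```python
# Toy model of a dual-range meter (hypothetical threshold, for illustration only)
LOW_RANGE_MAX = 5.0  # assumed level where the sensitive low-range sensor saturates

def healthy_reading(true_level):
    # Normal operation: above the threshold, the meter hands off to the
    # high-range sensor, so the reading tracks the true leak level.
    return true_level

def stuck_reading(true_level):
    # Failed handoff: the meter never leaves the low range, so a large
    # leak reports as a modest one -- an erroneously low reading.
    return min(true_level, LOW_RANGE_MAX)

print(healthy_reading(60.0))  # 60.0 -- the true leak level
print(stuck_reading(60.0))    # 5.0  -- badly underestimated
```

The key point is that the failure is one-sided: small leaks read correctly, but the largest leaks – which dominate total emissions – are the ones clipped.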
That problem can cause the Bacharach meter to provide erroneously low readings. “[W]e’ve never seen it get stuck in the high range – it always comes back down to the low range and to zero,” Mr. Howard told DeSmog.
Mr. Howard, who currently works as a firefighter, holds the patent for the technology underlying the Bacharach meters but was not directly involved in developing the Bacharach meter (nor is he a direct competitor for Bacharach, as he does not market other similar meters, though he does use a custom-built version for his own sampling work).
“In the [University of Texas] pneumatic study, as part of testing their Hi-Flow, I discovered that one of the meters they were using to measure emissions from pneumatic devices was measuring a factor of three too low,” Mr. Howard said.
That discovery drove him to take an even closer look – and he found evidence that the data reported by the University of Texas was likely affected by the faulty switchover between sensors. The data itself seem to show patterns that suggest low readings came from a problem meter, not low actual leak rates – and those patterns held true even when taking into account how new technology or different state regulations could have made leak rates fall.
The University of Texas researchers had tried to catch potential errors by adding tracers to the gas – but Mr. Howard noticed inconsistencies between the tracer readings and the methane readings that corroborated the idea that the meters had malfunctioned.
The lead researcher on the University of Texas study, Dr. David Allen, has served since 2002 on the EPA’s Science Advisory Board and this year completed a two-year term as its chair. He rejected Mr. Howard’s conclusion that the data collected could have been flawed.
“There may be issues with some of these instruments, but we tested our instruments pretty thoroughly and when we went out into the field we had multiple instruments, all of which gave us information,” he told The New York Times.
After the March paper showing the Bacharach meter’s potential for erroneous readings was published, Dr. Allen authored a reply comment noting that the team did find issues with one of the three meters used, but that the issue “did not significantly impact” their conclusions.
The stakes in this debate are enormous. The natural gas industry is the largest source of methane emissions in the country – and for the past decade, a shale gas rush has swept across the U.S., creating a vast new network of wells, pipelines, compressor stations, and storage facilities.
If natural gas leaks at more than 2.7 percent overall, any climate change benefit gained by switching away from coal for electricity generation and burning natural gas instead will be lost.
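A back-of-the-envelope calculation shows why the leak rate carries such weight. The 86-fold figure is from the article; the combustion factor below is basic stoichiometry, and this simplified framing is an illustration of the sensitivity, not the full lifecycle analysis behind the 2.7 percent threshold:

```python
# Rough illustration of how leakage inflates gas's climate footprint
GWP_20YR = 86          # methane's 20-year warming potency relative to CO2
CO2_PER_KG_CH4 = 2.75  # kg of CO2 produced by burning 1 kg of methane

def co2e_per_kg_delivered(leak_rate):
    """20-year CO2-equivalent (kg) per kg of gas delivered and burned,
    counting the gas leaked upstream on the way to delivery."""
    leaked_kg = leak_rate / (1.0 - leak_rate)  # kg leaked per kg delivered
    return CO2_PER_KG_CH4 + leaked_kg * GWP_20YR

print(round(co2e_per_kg_delivered(0.0), 2))    # 2.75 -- combustion alone
print(round(co2e_per_kg_delivered(0.027), 2))  # 5.14 -- footprint nearly doubles
```

In other words, at a leak rate of only a few percent, the warming from fugitive methane over two decades rivals the CO2 from actually burning the delivered gas – which is why small differences in measured leakage swing the coal-versus-gas comparison.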
For that reason, debates over leakage rates are raging.
The University of Texas study concluded that 0.42 percent of natural gas could be expected to leak during well completions – a number that would be consistent with the EPA’s official but controversial estimate that 1.5 percent of natural gas leaks overall, after accounting for leaks from places like pipelines (the agency’s own Inspector General called for a closer review of that figure).
The University of Texas study’s finding had been cited in many publications as evidence that fears were overblown.
“Natural-gas drilling sites aren’t leaking as much methane into the atmosphere as the federal government and critics of hydraulic fracturing had believed, according to the first study of emissions at multiple drilling sites,” the Wall Street Journal reported when the University of Texas research was released.
“The industry has led efforts to reduce emissions of methane by developing new technologies and equipment, and these efforts are paying off,” Howard Feldman, director of regulatory and scientific affairs for the American Petroleum Institute, said at the time. “This latest study shows that methane emissions are a fraction of estimates from just a few years ago.”
“The reason the Environmental Defense Fund wanted this study done is precisely so that unassailable data, rather than mere estimates, could become part of the debate over fracking,” wrote New York Times columnist and shale gas supporter Joe Nocera. “You can’t have sound regulation without good data.”
But right from the start, the University of Texas conclusions came under fire, in part because the study was funded by the industry and the sites tested were chosen by drillers.
More importantly, however, flyovers of oil and gas sites have repeatedly found much higher methane levels in the atmosphere than would be predicted if readings on the ground were comprehensive – levels far higher than the 2.7 percent threshold.
A 2013 study in Utah’s Uinta basin found leak rates between 6 and 12 percent.
Last year, NASA researchers found a methane plume over New Mexico, where over 4,000 natural gas wells have been drilled, that the agency said was “more than triple the standard ground-based estimate.”
A 2014 study in northeastern Colorado similarly found roughly triple the amount of methane that on-the-ground tests would predict. “These discrepancies are substantial,” lead author Gabrielle Petron, an atmospheric scientist with NOAA’s Cooperative Institute for Research in Environmental Sciences at the University of Colorado Boulder, told GeoSpace.
As the scientific debate continues, policy-makers are facing tough calls about whether natural gas can help wean the country off of coal – or whether relying on that gas will only make things worse for the climate.
Currently, the Obama administration relies on the voluntary participation of oil and gas companies in the Natural Gas STAR program to reduce emissions. That program operates on the logic that companies can save money by capturing leaks, since the captured gas can be sold rather than wasted – but it has a poor track record. Less than one percent of oil and gas producers and operators have chosen to participate. The program prevented 24 million tons of CO2 equivalent from leaking in 2013 – out of 182 million tons that leaked that year.
On August 3, the Obama administration announced a Clean Power Plan designed to tackle greenhouse gas emissions overall. But much remains unclear about how that plan will address methane leaks.
A White House statement on that plan emphasized voluntary leak controls. “EPA and other agencies are taking actions to cut methane emissions from oil and gas systems, landfills, coal mining, and agriculture through cost-effective voluntary actions and common-sense standards,” the administration wrote.
As the scientific debate over leakage rates slowly unfolds, some argue that action on methane leaks is needed now, regardless of precisely how natural gas pollution stacks up against coal.
David Doniger, head of the climate policy program at the Natural Resources Defense Council, told the Washington Post last year that regulating methane would be “the most significant, most cost-effective thing the administration can do to tackle climate change pollution that it hasn’t already committed to do.”