In an editorial this week, The Economist argues that “if Britain wants an American-style energy boom, it should import American-style local taxation.” In short, it argues that differences in public opinion toward fracking are driven by differences in how the benefits of development are distributed. In the UK (and most other European countries), subsurface mineral rights are held by the state, and taxes on production are generally collected at the national level. Local communities therefore don’t get, or at least don’t easily see, the economic benefits of development, only its environmental and other costs. The editorial argues that distributing revenues back to communities would increase local support, enabling greater development that would (it is implied) benefit the UK in general.
This argument is not new – private ownership of subsurface rights is a frequently cited structural advantage for oil and gas development in the US. The argument is also appealing, especially for me – as one of my professors loved to say, lawyers are “institutional engineers”. If policy outcomes can be explained by relatively small differences in legal institutions like the distribution of subsurface rights, then the application of some institutional engineering can probably improve things. Even if not, just being able to argue that law drives public opinion and policy is attractive, partly because it makes me look good in front of colleagues from other disciplines, like the economists I work with here at RFF.
Nevertheless, these institutional differences can’t be the only driver of differing attitudes toward fracking in the US and UK. As the editorial points out, attitudes toward development vary greatly within the US, despite universal state-level control over regulation and taxation. Even ownership of subsurface rights isn’t a good explanation for these differences, at least by itself. New York’s moratorium on fracking continues, driven by a highly skeptical public, while development generally has greater support in the West – this despite the fact that the federal government is by far the largest landowner in that part of the country. Severance taxes (taxes on gas production) also vary greatly across the country, making it difficult to draw any connections between public opinion and public (or at least government) benefit.
Over the past 60 years, regulators have implemented market-based programs for air pollution, water pollution, land management, and other environmental policy problems at local, state, federal, and—in the case of greenhouse gas regulation—international levels. Some applications hew more closely than others to ideal market-based policy design, as defined by economic theory, and programs have met with varying degrees of success. In this symposium, drawn from discussions at RFF’s December 2012 First Wednesday Seminar, four RFF scholars draw lessons from the successes—of which there have been few—and the many failures. They also consider the desirability, feasibility, and design of market-based environmental policy in the future.
Click here to read the full article.
In a recent analysis of real estate data from Portland, OR; Austin, TX; and the Research Triangle region of North Carolina, we find, with colleague Todd Gerarden, that local “green” certifications appear to have a larger impact on home sales prices than the national Energy Star certification. We also find that Energy Star certification affects sales prices only for homes built between 1995 and 2006, not for newer homes.
Twenty-two percent of US carbon dioxide emissions can be attributed to residential buildings, and investments in the energy efficiency of these homes can help reduce both homeowners’ energy bills and emissions. But uncertainty over whether investments in energy efficiency can be recouped upon the sale of a home can lead to underinvestment—buyers may not have full information about or understanding of a home’s energy-efficient features, and sellers may not be able to reliably or accurately advertise those features in the marketplace.
The federal government’s Energy Star program has helped to address this information gap by certifying new homes that are 15 percent more energy efficient than other new homes on the market. And the US Green Building Council oversees the more rigorous LEED (Leadership in Energy & Environmental Design) certification program. Many local certification programs exist around the country as well. But how do these certifications affect home prices?
In a new RFF discussion paper, we assess the impact of Energy Star and two local “green” certifications on home sales prices. Using data from real estate multiple listing services in three independent housing markets, we find that the local certifications appear to have larger effects on sales prices than Energy Star, for newer homes as well as older ones. The local certifications encompass green attributes beyond energy efficiency, including landscaping, building materials, and water efficiency, among others. However, which of these factors matters most remains an open question. We also find that Energy Star certification only affects the sales prices of older homes, not new ones. We hypothesize that this reflects the improved energy efficiency of new homes, even uncertified ones.
Certification provides an information signal to the marketplace and can be a valuable way for homebuyers to learn about the energy efficiency and other “green” attributes of houses. However, it remains an open question how homebuyers interpret these certifications, and which of the attributes beyond energy efficiency included in local green certification schemes are of most value.
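The discussion paper’s actual specification isn’t reproduced here, but the standard approach to estimating a certification price premium is a hedonic regression: regress log sales price on home attributes plus a certification dummy. The sketch below is purely illustrative, using synthetic data with an assumed 4 percent premium; none of the variable names or magnitudes come from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Synthetic listings: square footage, age, and a green-certification dummy.
sqft = rng.uniform(1000, 3500, n)
age = rng.integers(0, 40, n).astype(float)
certified = rng.integers(0, 2, n).astype(float)

# Assumed data-generating process: certification carries a 4% price
# premium in a log-linear hedonic form (illustrative numbers only).
log_price = (
    11.0
    + 0.0004 * sqft
    - 0.003 * age
    + 0.04 * certified
    + rng.normal(0, 0.05, n)
)

# OLS via least squares: regress log price on a constant and the attributes.
X = np.column_stack([np.ones(n), sqft, age, certified])
beta, *_ = np.linalg.lstsq(X, log_price, rcond=None)

# The coefficient on the dummy is the estimated premium in log points,
# which should recover roughly the 0.04 built into the synthetic data.
premium = beta[3]
print(f"estimated certification premium: {premium:.3f}")
```

In practice the hard part is the one flagged above: with several overlapping certifications and bundled green attributes, a single dummy conflates which features buyers actually value.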
Each week, we review the papers, studies, reports, and briefings posted at the “indispensable” RFF Library Blog, curated by RFF Librarian Chris Clotworthy. Check out this week’s highlights below:
Reducing Black Carbon and Methane Emissions Provides Only Modest Benefits
Emissions reductions focused on anthropogenic climate-forcing agents with relatively short atmospheric lifetimes, such as methane (CH4) and black carbon, have been suggested as a strategy to reduce the rate of climate change over the next several decades. We find that reductions of methane and black carbon would likely have only a modest impact on near-term global climate warming. — via Proceedings of the National Academy of Sciences
Overwhelming Risk: Rethinking Flood Insurance in a World of Rising Seas
Subsidized insurance rates, the practice of passing through damage and loss costs to taxpayers, and the lack of accurate information on flood risks — all these factors have led to more coastal development, more exposure to climate risks, and less incentive to take measures that reduce these risks. To address this, UCS recommends the following… — via Union of Concerned Scientists
Saving Oil and Gas in the Gulf
The systemic waste of oil and gas in the Gulf is eroding economic resilience to shocks and increasing security risks, including to citizens’ health. Success or failure in setting and meeting sustainable energy goals in the Gulf Cooperation Council (GCC) countries will have a global impact. — via Chatham House
Wildland Fire Management: Improvements Needed in Information, Collaboration, and Planning to Enhance Federal Fire Aviation Program Success
…The Forest Service and Interior contract for aircraft to perform various firefighting functions, including airtankers that drop retardant. The Forest Service contracts for large airtankers and certain other aircraft, while Interior contracts for smaller airtankers and water scoopers. However, a decrease in the number of large airtankers, from 44 in 2002 to 8 in early 2013, due to aging planes and several fatal crashes, has led to concerns about the agencies’ ability to provide aerial firefighting support. — via U.S. Government Accountability Office
Lessons from Hurricane Isaac: Gulf Coast Coal & Petrochemical Facilities Still Not Storm Ready
Problems at coal, chemical and oil facilities, many of them preventable, resulted in extremely high levels of air and water pollution during and after Hurricane Isaac. The findings are released today in a new report – Lessons from Hurricane Isaac – researched and written by the Gulf Monitoring Consortium (GMC)… — via Gulf Monitoring Consortium Report
For more from the RFF Library blog, click here.
This is a guest post by Brian Potts, a partner at law firm Foley & Lardner, LLP in Madison. Yesterday I (Nathan) critiqued some arguments he’s made recently regarding prospects for EPA’s future carbon performance standards for power plants. I’m happy to offer Brian space here to respond to that critique. -Ed
First, I would like to thank Nathan for his comments. There has been a substantial amount of conversation in the press about what EPA might do with its climate rules, but there has been a striking lack of discussion about what EPA can do with those rules under the Clean Air Act. This is therefore an important dialogue. Let’s start with the areas where Nathan and I agree. Nathan is right to point out that there is a range of risk that EPA can take with these rules. And we obviously agree that these rules are not going to achieve the President’s target of a 17% economy-wide emissions reduction by 2020 on their own. But he gets a few facts wrong, and he mistakenly assumes that the agency’s past BACT determinations won’t affect what EPA can do now (to be fair, Nathan’s comments were based only on my op-ed in The Hill, and not on my longer article that was just released yesterday in the Yale Journal on Regulation). Here are my problems with Nathan’s analysis:
- Nathan seems to assume that BACT only applies to new power plants. However, EPA’s Tailoring Rule, which went into effect in 2011, requires CO2 BACT review for both new plants and significantly modified existing plants. Most of the CO2 BACT determinations to date have been for existing sources (one of which I cite in my article), and these determinations have led to only a few percentage points of emissions reductions. To my knowledge, they have also uniformly concluded that carbon capture and sequestration is either not commercially available or too expensive, and that fuel switching is not allowed (I’ll come back to this point below).
- Nathan argues that EPA can set NSPS/ESPS standards that are more stringent than previous BACT limits, since otherwise “EPA would have a hard time ever strengthening NSPS, or writing new ones.” But EPA has historically set the NSPS first as a floor, with BACT determinations following later on a plant-by-plant basis. The NSPS were created as part of the 1970 Clean Air Act, and BACT requirements were added to the Act in 1977. I’m not aware of a situation where the agency has first set a number of BACT limits and then tried to establish a new uniform NSPS/ESPS. And while EPA does revise the NSPS from time to time, I’m also not aware of a situation where it revised the NSPS to be more stringent than recent BACT determinations for the same types of sources.
- Perhaps more importantly, the technology tests under the Clean Air Act are the same in all material respects for the NSPS/ESPS and BACT. The chosen technology has to be both commercially available and cost-effective. So – even if the courts agree that the Act does not bar the EPA from adopting NSPS/ESPS that are more stringent than previous BACT determinations – the courts would almost certainly find that the EPA was arbitrary and capricious if the agency set an NSPS/ESPS at a significantly more stringent level than past BACT determinations that were recently set for the exact same types of sources.
- Finally, the suggestion that EPA can regulate coal plants and gas plants together is tenuous. EPA’s proposed CO2 NSPS that it released last year did just that – and after receiving comments from industry that pointed out the unlawfulness of the approach, the word on the street is that EPA is going to re-propose separate NSPS for new coal and gas plants. This is because a natural gas plant is not a carbon control technology for coal plants. It’s a completely different kind of power plant. And it has long been EPA’s policy that technology-based limits should not “redefine the source.” This position has also been adopted by the courts, and is the reason why fuel switching is not BACT in any of the prior determinations.
EPA knows it will have an uphill battle regulating power plant emissions using the Clean Air Act. Back in 2010, Amy Harder of the National Journal quoted Gina McCarthy, now EPA Administrator, as admitting at the time that “GHG permitting is not a process for reducing overall GHG emissions.” My position is that EPA should take a conservative approach with these rules, or it will end up with nothing at all.
Brian Potts says EPA existing-source performance standards (ESPS) for power plant carbon emissions won’t matter much since they can’t or won’t be very stringent. This is partly true, if a bit overstated. You’re certainly kidding yourself if you’re counting on ESPS to take care of US climate policy on their own – though I know of nobody making that claim. But if ESPS are modest it will be because of conservatism at EPA (and the states), not the legal or technological limitations Potts claims.
Potts argues that the Clean Air Act itself bars EPA from enacting stringent ESPS. EPA has already done a few case-by-case reviews of new power plants (NSR) and required those plants to reduce their emissions by only a few percent. Potts argues that the statute prevents EPA from issuing performance standards that are any tighter than this. I don’t think that’s true. When a new plant is built, the law requires it to use “best available control technology” or BACT, which must be at least as effective at reducing emissions as the prevailing performance standards for new or existing sources under §111. (This BACT review is what EPA has already done at a few new plants). But the reverse is not true – the law doesn’t require performance standards to be less stringent than all current BACT reviews. If it did, EPA would have a hard time ever strengthening new-source performance standards (NSPS), or writing new ones, as it is doing now. Once the agency does finalize NSPS, any future BACT reviews will have to be at least as strict. But I don’t see how one-off BACT analysis limits EPA’s authority to set sector-wide NSPS.
While state governments are the primary regulators of oil and gas development, landowners have the first and arguably greatest opportunity to shape drilling firms’ conduct. This is most obvious in the case of the largest landowner of all: the federal government, which controls 700 million acres of subsurface rights (plus 56 million subsurface acres of Indian mineral estate) across 24 states. Large landowners have substantial leverage in setting lease terms. With this much land, the federal Bureau of Land Management (BLM) can essentially dictate those terms – its rules on how drilling activity can take place on federal lands look just like regulation, making it the largest “regulator” of drilling activity in the country.
BLM last revised its oil and gas regulations (the Onshore Orders) in the 1980s and early 1990s, well before the recent rapid expansion of shale gas development. The agency therefore felt it needed to propose revisions and additions to its rules aimed specifically at hydraulic fracturing activity (for both shale gas and tight oil). There have been two rounds of proposed revisions so far, the most recent issued in May of this year, after 177,000 comments were received on the first-round changes. Both industry and environmental groups have criticized the proposal.
What would these proposed rules mean for shale development on federal lands and, crucially, how would they interact with existing state rules? We discuss these questions in a new paper. In the proposal, BLM is explicit that it is not “occupying the field” and thereby preempting state law (in fact, it can be argued that it is not really regulating at all). Therefore state law is not preempted and operators must continue to comply with it, at least so long as there is no direct conflict with federal law. This means that stricter state rules prevail over relatively weaker BLM rules.
Understanding the interactions between state law and BLM rules is important because of opposing views about the federal role in shale gas development. To perhaps grossly generalize, one view is that BLM regulations are unnecessary and burdensome and that the federal government should just defer to states. The other view is that federal rules are inadequate to protect the public and the land, and that new BLM rules should remedy that and also provide a model for states.
Compared with BLM’s earlier 2012 proposal, the current rules are less restrictive; industry outcry clearly persuaded BLM to relax them, particularly by removing requirements for disclosure of fracking fluids before drilling could commence.
We compared BLM’s proposed rules with regulations in the six states with shale development and the largest amounts of federal land. The story here is complex. There are some gaps in the federal regulations compared to what some states regulate. Sometimes this is because BLM lacks the authority to regulate. Other times, updates may not be needed. For example, many BLM well integrity rules were originally written as performance standards (i.e., thou shalt not pollute groundwater) and therefore do not need updating for technological change or public demands (so long as monitoring and enforcement keep up). Still other areas are simply absent, such as setback requirements. Finally, there are some areas in which BLM rules would limit activities that some states currently leave unregulated, like the use of liners on fluid storage pits and some types of fracking fluid disclosure.
But generally, the BLM rules do not impose tighter requirements than those the states already have. A caveat to these conclusions is that we cannot observe what happens during the permitting process, at either the federal or state level. So how stringent the regulations are in practice remains a very open question.
Our read of this analysis is that BLM rules should focus more on areas not adequately or consistently regulated at the state level. One candidate would be tighter restrictions on flowback/produced water containment, which is not well addressed in some states. This would be a better target than, say, setback restrictions, which are far more prevalent at the state level.
BLM could go further: some have called for rules that set a comprehensive “minimum federal floor” or, further, a “gold standard” for states. Whether this course is wise is debatable. BLM may lack the resources and expertise to regulate comprehensively. On the other hand, federal lands held in trust for the people should arguably receive the greatest protections – if anywhere should be the “gold standard,” it is probably federal lands. In any case, either a baseline or “gold standard” would be a much more modest approach than new general federal regulation. States would remain free to regulate less stringently outside of federal lands.
BLM to date has taken a minimalist approach, apparently due to industry influence. It should decide whether to continue on that path and be open about its philosophy.
Scott Horner, a farmer in western Colorado, awaits the bloom of spring lilacs with anticipation each year. Once the lilacs peak, it’s time for Horner to plant potatoes and launch the growing season.
The fragrant bunches have been appearing earlier and earlier in recent years, Horner has noticed.
Earlier lilacs, plants growing out of turn, an extended wildfire season: these are some of the observations and data shared in a new science journalism project, iSeeChange, launched at KVNF community radio in Paonia, Colorado. The initiative is designed to bring local farmers, ranchers, and residents into conversation with scientists about environmental issues.
iSeeChange producer Julia Kumari Drapkin described her project at RFF’s February 2013 First Wednesday Seminar, “The Media, Science and Cognition: How We Shape Our Understanding of Environmental Issues.” The discussion was part of RFF’s 60th anniversary event series, Resources 2020, a yearlong exploration of how economic inquiry can address future environmental challenges.
Drapkin, along with Barbara Allen, the Ada M. Harrison Distinguished Teaching Professor of the Social Sciences and chair of Women’s and Gender Studies at Carleton College in Northfield, Minnesota, led the discussion about how the public understands and shapes its opinions about global warming and other environmental issues.
Drapkin went to Paonia, which last year experienced its worst drought since 1924, to launch iSeeChange as part of Localore, a nationwide project funded primarily by the Corporation for Public Broadcasting to bring public service journalism to local media. Her vision was to turn science reporting upside down by starting with the observations of local residents on changes in climate and bringing scientists into the conversation to discuss how those observations fit and compare with broader scientific research.
Corporate Average Fuel Economy (CAFE) standards are the primary U.S. policy aimed at reducing greenhouse gas emissions (GHGs) from new cars, requiring manufacturers to achieve specific fuel economy and GHG emissions rates for their fleets. Proponents of CAFE and other energy efficiency policies argue that they correct market failures associated with the adoption of energy-saving technologies. As evidence of a market failure, advocates often point to a 2009 report by the Electric Power Research Institute and McKinsey that estimated negative costs (energy savings greater than adoption costs) for a wide range of efficiency investments. The report also suggests that huge GHG emissions reductions are possible through energy efficiency policies and at very low or even negative cost. But recent research at RFF suggests those reductions may be overstated for passenger vehicles.
Each week, we review the papers, studies, reports, and briefings posted at the “indispensable” RFF Library Blog, curated by RFF Librarian Chris Clotworthy. Check out this week’s highlights below:
Charged Up: Southern California Edison’s Key Learnings about Electric Vehicles, Customers and Grid Reliability
Southern California Edison (SCE) released a white paper summarizing learnings from its Electric Vehicle (EV) readiness program. The paper…shares information based on customer data and utility operations gathered since SCE began to prepare the distribution system and its customers for widespread electric vehicle (EV) adoption in its service territory… — via Southern California Edison (SCE)
The State of the Climate in 2012
The National Oceanic and Atmospheric Administration on Tuesday issued a peer-reviewed 260-page report, which agency chief Kathryn Sullivan calls its annual “checking on the pulse of the planet.” The report, written by 384 scientists around the world, compiles data already released, but it puts them in context of what’s been happening to Earth over decades… — via US National Oceanic and Atmospheric Administration
Censored EPA Presentation on Groundwater Pollution in Dimock, PA
DeSmogBlog has obtained a copy of an Obama Administration Environmental Protection Agency (EPA) fracking groundwater contamination PowerPoint presentation describing a then-forthcoming study’s findings in Dimock, Pennsylvania. The PowerPoint presentation reveals a clear link between hydraulic fracturing (“fracking”) for shale gas in Dimock and groundwater contamination, but was censored… — via DeSmogBlog
Electric Vehicle Grid Integration in the U.S., Europe, and China: Challenges and Choices for Electricity and Transportation Policy
A recently published paper by M.J. Bradley & Associates, commissioned by the Regulatory Assistance Project (RAP) and the International Council on Clean Transportation (ICCT), examines key drivers of EV adoption in the US, Europe and China, with an emphasis on vehicle charging scenarios and infrastructure… — via International Council on Clean Transportation (ICCT)
Land-Use Requirements for Solar Power Plants in the United States
The NREL report found that a large photovoltaic (PV) plant providing all the electricity for 1,000 homes would require 32 acres of land. That technology on average requires 3.7 acres per annual gigawatt-hour, the report said. — via US National Renewable Energy Laboratory (NREL)
For more from the RFF Library blog, click here.