What’s the Worst That Could Happen?

[This post is co-authored with my three collaborators in cap-and-trade work for the California Air Resources Board: Frank Wolak, Jim Bushnell and Matt Zaragoza-Watkins.]


California’s year-and-a-half-old cap-and-trade market for reducing greenhouse gases (GHGs) has drawn renewed interest over the last few weeks, since the EPA announced its initiative for limiting greenhouse gases from existing power plants.  California’s program is seen by many as a model of how state-level policies can lower GHGs without imposing undue costs on the economy.  This is heartening to supporters of the program, who argued that by going first California could develop a market-based system for emissions reduction that could be a model for other regions and countries.

Of course, being on the “bleeding edge” of any movement means learning from risks and mistakes as well as from successes.  So far, California’s program has operated smoothly and is mostly viewed as a success, though the price has stuck pretty close to the effective price floor, $10-$11 for an “allowance” that covers one ton of CO2 emissions.  Some stakeholders, however, have expressed concerns with how the market will perform in the future, particularly after January 1, 2015 when emissions from transportation fuels, residential natural gas combustion and some other sources are added to the market.

The oil refining industry and some lawmakers have argued that the market could see volatile allowance prices, which would feed through to the price of gasoline and to other GHG-intensive products.  Another set of stakeholders have worried that certain market participants might be able to manipulate the price of allowances to their advantage.

The final report of the Market Simulation Group to the California Air Resources Board – released today by the Energy Institute and the Air Resources Board – examines these concerns.  The members of the MSG are Severin Borenstein, Jim Bushnell, and Frank Wolak working with Energy Institute graduate student Matt Zaragoza-Watkins.  In 2012, we were enlisted by the ARB to “stress test” the market design.

The report focuses on what might still go wrong in the market, not on the many thoughtful design features that have reduced or eliminated myriad other problems that could have otherwise occurred in such a complex market.   Our group recognizes the years of careful planning that have gone into creating the foundation for an efficient and robust market.  While we find that some risks remain, we conclude by recommending a couple of straightforward adjustments that could address those risks.  We believe that with these changes, California’s course for addressing climate change with a market mechanism will indeed be a model for other regions and countries.


The price of allowances in California’s cap-and-trade market since trading began. Graphic by Climate Policy Initiative with interactivity and updates at CalCarbonDash.org.


A Sidebar on Including Transportation Fuels in Cap and Trade

Before we get too far into the details of market rules and trading strategies, we want to note that our report has direct implications for the recent debate over whether or not to include gasoline in the cap-and-trade program starting on January 1, 2015, as is now prescribed by state law.  Our estimates imply that by far the most likely effect of including transport fuels in the program next January will be to raise California gasoline prices about 10 cents per gallon.[1]  That barely registers in the context of gasoline price gyrations.

To put that in context, the average Californian uses about 40 gallons of gasoline and diesel per month (directly in their own cars and indirectly through transportation of products they purchase), so this will raise their cost of living in California by about $4 per month.  That’s not nothing, but in the long list of costs and benefits of living in California, it isn’t one of the major factors.  And unlike when crude oil price spikes drive up gasoline prices, the cap-and-trade revenue goes back to Californians through state expenditures of the funds.
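The arithmetic behind the 10-cents-per-gallon and $4-per-month figures is easy to verify using the round numbers from the post (about 0.008 tons of CO2 per gallon, an allowance price near $12/ton, and roughly 40 gallons of fuel per person per month):

```python
# Back-of-the-envelope check of the gasoline figures in the post:
# ~0.008 tons of CO2 per gallon of California gasoline, an allowance
# price near $12/ton, and ~40 gallons of fuel per person per month.
TONS_CO2_PER_GALLON = 0.008
ALLOWANCE_PRICE = 12.0       # $/ton
GALLONS_PER_MONTH = 40       # direct plus indirect per-capita use

price_per_gallon = TONS_CO2_PER_GALLON * ALLOWANCE_PRICE
monthly_cost = price_per_gallon * GALLONS_PER_MONTH

print(f"Added cost per gallon: ${price_per_gallon:.3f}")  # $0.096, i.e. ~10 cents
print(f"Added cost per month:  ${monthly_cost:.2f}")      # $3.84, i.e. ~$4
```

The same arithmetic implies that even at the $40 reserve trigger price discussed later, the per-gallon impact would be about 32 cents, which is why the concern is price spikes rather than the most likely outcome.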

Finally, the program has been designed around the 2015 expansion to include transport fuels.  Changing the program at this point to exclude fuels from cap-and-trade would be extremely disruptive to the market and could exacerbate the risk of volatile and high allowance prices.


California gasoline prices since 1995 (constant 2014 dollars).  Source: U.S. Energy Information Administration


Now Back to Our Study

Our report evaluates the possible outcomes of a competitive allowance market and the potential for non-competitive outcomes, such as market manipulation.  Importantly, we incorporate the uncertainty in economic growth, GHG intensity and other factors that would determine the “business as usual” path of emissions, which is the starting point for analyzing a GHG market.  Previous estimates have not accounted for this uncertainty.

We find that the most likely outcome is a competitive allowance market that yields a low allowance price.  But we also find real risks that allowance prices could rise to much higher levels due to combinations of low CO2 abatement, strong economic growth, and possible market manipulation.  Disruptive price spikes could create a backlash against cap-and-trade markets, much as the California electricity crisis virtually stopped electric industry restructuring in the U.S.  Along with nearly all the people we’ve spoken with about the market, we would hate to see a disruption in California’s cap-and-trade program slow the pace of adopting market mechanisms to address climate change.

In response to these risks, the report recommends two changes to the market that would greatly reduce the probability of disruptive price spikes, which we explain below.  Any regulatory change in California requires a lengthy legal process, but we think the changes we propose are relatively straightforward.


Forecasting Long Run Supply/Demand Balance in the Cap-and-Trade Market

We first examine the likely supply/demand balance through 2020, the period for which the rules of the program have been set.  The most likely outcome in the overall market, we conclude, will be allowance prices at or just slightly above the price floor.  We also find, however, that in less likely, but plausible, scenarios in which the market tightens and the price starts to rise, there would probably be relatively little additional GHG abatement available.  Thus, if the California economy were to grow strongly and boost GHG emissions – not the most likely outcome, but certainly not unimaginable — we could see allowance prices jump to much higher levels, most likely in the later years of the program.

Our median estimates suggest that without changes to the program there is an 18% chance that prices would eventually rise high enough to trigger the Allowance Price Containment Reserve (APCR).  The APCR adds some additional allowances to the supply if the price exceeds a trigger level that increases from $40 in 2013 to $56 in 2020 (all adjusted for inflation to 2013 dollars).  The same estimates suggest a 6% chance of depleting all the allowances in the APCR, which would then allow prices to go much higher.[2]
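As a side note, the trigger path is consistent with roughly 5% annual escalation in real terms. This is just the implied average rate from the two endpoints; the exact escalation rule is set in the regulation:

```python
# The APCR trigger rises from $40 (2013) to $56 (2020), in 2013 dollars.
start_price, end_price, years = 40.0, 56.0, 7

implied_growth = (end_price / start_price) ** (1 / years) - 1
print(round(implied_growth * 100, 1))        # 4.9 (% per year, real)
print(round(start_price * 1.05 ** years, 2)) # 56.28, close to the $56 trigger
```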

Severin’s previous blog posts, in May and September of last year, addressed this risk of price volatility in the cap and trade market, and the need for a credible price ceiling.  The first recommendation in the report is for ARB to adopt a firm and credible price ceiling by standing ready to make additional allowances available at the ceiling price – which they could do by borrowing from post-2020 emissions, purchasing allowances from other GHG markets, or otherwise expanding supply at the ceiling price.


Scarcity and Market Manipulation Risks in the Short Run

The final report also expands the analysis to examine the competitive supply/demand balance in the market during the earlier “compliance periods” – 2013-2014 and 2015-2017 – within which participants will accumulate allowance obligations by emitting GHGs and then have to acquire enough emissions allowances to cover them. We find that for each of the earlier compliance periods, prices are very likely – over 80% probability in nearly all scenarios — to remain at or near the price floor.

But, as with the overall eight-year program analysis, we find that there are scenarios of strong economic growth — especially when paired with only modest reductions in emissions intensity — that could push up emissions.  In those cases, the market would be unlikely to demonstrate much price-responsive abatement over the short time available for such response.  The outcome could be substantial price spikes and potential disruption in the market.

Finally, we turn to the potential for anti-competitive behavior in the market.  We find a small but significant risk that some market participants could manipulate the price upwards by buying up more allowances than they need for a given compliance period and then “banking” some of them for use in future compliance periods, thereby creating an artificial shortage of allowances for compliance in the current period.  The report walks through in detail how this might be done and the difficulties a firm would face in carrying out such manipulation.  Banking is just saving and there are legitimate and important reasons for banking; the program needs a policy that undermines use of banking as part of a manipulation strategy.

In response to these risks of short-run shortages and market manipulation, we propose a change in market protocols.  Our second recommendation is to allow “vintage conversion”:  by paying a “conversion fee,” a firm would be permitted to use a later-year “vintage” allowance to meet a compliance obligation it faces in earlier years.[3]  We demonstrate that vintage conversion would greatly reduce the size of price spikes that could occur due to real or artificial short-run allowance shortages and it would undermine the incentive for participants to manipulate the market in order to create an artificial shortage.
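The price-capping logic of vintage conversion can be sketched in a few lines. This is only an illustration of the arbitrage argument, not the report’s model, and the prices and fee below are hypothetical:

```python
# Illustrative arbitrage logic for vintage conversion (not the
# report's model). A firm short of current-vintage allowances can
# either pay the scarcity price or convert a later-vintage allowance
# for a fee, so it never pays more than (future price + fee).
# All prices and the fee below are hypothetical.
def compliance_cost(scarcity_price, future_price, conversion_fee,
                    conversion_allowed=True):
    """Per-ton cost of complying during a short-run squeeze."""
    if not conversion_allowed:
        return scarcity_price
    return min(scarcity_price, future_price + conversion_fee)

spike, future, fee = 90.0, 25.0, 10.0  # $/ton, hypothetical

print(compliance_cost(spike, future, fee, conversion_allowed=False))  # 90.0
print(compliance_cost(spike, future, fee))                            # 35.0
```

Because a squeezed firm can always convert, a would-be manipulator’s payoff from engineering a shortage is capped near the future price plus the fee, blunting the incentive to corner the current-vintage market.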

California’s cap-and-trade market is working well so far, and our analysis suggests there is a good chance that will continue without changes to the market.  But we also find real risks that the market could tighten, resulting in large increases in allowance prices.  Such increases would likely make the market politically unsustainable and, in any case, would damage the credibility of this cap-and-trade market, and probably others.  We urge the Air Resources Board to make the changes proposed in our report in order to greatly reduce or eliminate these risks.  Such changes would be very much in the spirit of pioneering a new path in environmental policy: not just celebrating the successes, but also learning from the risks and possible failures.

A press release from the Energy Institute on the report can be found here.

The final report can be found here as working paper #251.


[1] Each gallon of California gasoline is counted as about 0.008 tons of GHG, so the current price around $12/ton translates to about $0.10 per gallon.

[2] To be clear, the 6% chance of exhausting the APCR is a subset of the 18% probability of triggering the APCR.

[3] The concept is very close to what is known as “borrowing” in the cap and trade literature.


Open Sourcing Risky Business

A lot of the policy discussion, and many of our blog posts, focus on the difficult task of trying to slow climate change. It’s useful to remind ourselves of the difficulties associated with NOT slowing climate change.

Last week, Michael Bloomberg, Hank Paulson and Tom Steyer released their Risky Business report quantifying the costs of climate change for the US economy. The report relies heavily on modeling work by one of EI@Haas’ own, Solomon Hsiang. Sol is a professor at UC Berkeley’s Goldman School of Public Policy and was a visitor at the Energy Institute this past semester. He was the lead economic author on the technical report that generated all of the results for the Bloomberg et al. glossy.


Photo from Risky Business Report

The technical report is notable on several dimensions, both the results it presents and the methodologies the authors use. The press has picked up on the bi-partisan leaders of the project, their call to apply risk management thinking to climate change, and the impactful charts and figures documenting the regional effects of climate change.

I was intrigued by a component of the report that hasn’t gotten as much attention – its commitment to “open science.” There’s no hard and fast definition of that term, but the Open Science organization defines it as, “the idea that scientific knowledge of all kinds should be openly shared as early as is practical in the discovery process.”

I have been involved in lengthy discussions about open science with my colleagues. Here are some of the tradeoffs in the context of the Risky Business report.

Underlying the report’s fabulous pictures is an elaborate computer simulation model. Sol and his co-authors spent many, many hours developing their model, which they are calling SEAGLAS (Spatial Empirical Global-to-Local Assessment System). Developing a model like SEAGLAS involves everything from digging into the literature to come up with the best inputs to the model (e.g., the projections for temperature changes, impact of weather on electricity consumption) to writing lines and lines of code to combine all the inputs to generate useful outputs.

The benefits to embracing open science in this case are pretty clear. If Sol and his co-authors make the code underlying SEAGLAS accessible, other researchers can check the code for errors, stress test it by using it to generate results different from the ones in the original report, write modules to expand it, and generally improve its usefulness.

But, there is a potential cost. Academics are rewarded (e.g., promoted to tenure or given raises) almost purely on the number of original scientific publications they produce. So, the inclination of some people who develop models like SEAGLAS is to keep them under lock and key while they publish papers. After all, what’s the purpose of investing a lot of time in a well-constructed model if other researchers can use it to do cool stuff before you can? It’s like a company shielding its intellectual property, just that the currency is academic publications rather than cash. The risk is that we will get fewer valuable models created in the first place if researchers are expected to immediately make them open.

Much like Elon Musk at Tesla, Sol and his co-authors seem to be opening up the model vault. The report explicitly says, “We will be making our data and tools available online at climateprospectus.rhg.com. We hope others build on and improve upon our work in the months and years ahead.”

Sol and his co-authors are certainly not the first climate modelers to make their code public, and, in fact, how truly public it is remains to be seen. Posting undocumented code is almost useless to other researchers – it can take as much time to dissect someone else’s code as to write your own, or more. On the other hand, some modelers have made an industry out of teaching people about their models and go to great lengths to make them accessible.

I am optimistic that Sol and his co-authors will provide useful information at the link above. After all, they’ve already made an early, public commitment to do that in the Risky Business technical report.

And, in terms of the inputs to the model, Sol sent us an email explaining that, “We have developed a new Distributed Meta-Analysis System (DMAS) that continuously and dynamically integrates new empirical findings (that are crowd-sourced from researchers around the world).” This means the inputs will be kept current and should represent consensus estimates from other researchers.

We have a lot of work to do quantifying the impacts of climate change and identifying the best mitigation and adaptation strategies. With public-minded – not to mention superb – researchers like Sol, I’m optimistic about the progress we can make.


Swapping negawatts for megawatts under the EPA’s proposed Clean Power Plan

If you are embarking on a new weight loss plan, it typically makes sense to pursue a mixed strategy of diet and exercise. If you are working to get your finances under control, you should look for ways to increase your household earnings and reduce spending. And so it goes: as we work to reduce carbon emissions from the nation’s power sector, pursuing a mix of supply- and demand-side strategies makes good sense.

By Auffhammer’s Yoga Theorem (or, for you purists, Le Chatelier’s principle), the outcome of an emissions regulation that supports compliance flexibility will be less costly than a rule that limits states’ compliance options.  Under the auspices of the Clean Air Act, the EPA has rolled out its metaphorical yoga mat and demonstrated some serious regulatory flexibility with its “outside the fence” approach to compliance.


“Outside the fence” basically means that states can look beyond their existing power plants for ways to reduce emissions cost-effectively. There is evidence to suggest significant untapped potential for cost-effective energy efficiency gains on the demand side.  Allowing states to use demand-side efficiency improvements to meet their compliance obligations has the potential to significantly reduce compliance costs.  But with these potential cost savings come implementation challenges.

Factoring in the demand side

The EPA has defined a set of state-specific emissions standards based on a detailed assessment of each state’s “best system of emissions reductions” (more details can be found in the proposed rule).  The EPA chose to define these standards in terms of emissions rates (i.e., tons CO2/MWh), although states can choose to convert their standard to a mass-based (tons of CO2) target.

A state that sticks with the rate-based standard would, for compliance purposes, calculate its emissions rate as follows:

Compliance emissions rate = (tons CO2 emitted) ÷ (MWh generated + negawatts)

The numerator is simply the tons CO2 emitted from electricity generation.  The denominator is the sum of electricity generation plus “negawatts”.  In principle, negawatts represent the electricity consumption that did not happen thanks to a demand-side efficiency improvement.

Looking at this equation, a state could bring its emissions rate into compliance using supply-side strategies that reduce the emissions rate per MWh of electricity generated. Examples include supply-side operating efficiency improvements and increased reliance on less carbon-intensive generation sources.  On the demand side of the fence, the state can pursue efficiency improvements in order to reduce the total quantity of electricity generated while increasing its negawatts.

How many megawatts in a negawatt?

Measuring carbon emissions and electricity generated is relatively straightforward because state-level emissions and electricity production are directly observable.  In order to measure negawatts you need to construct a credible estimate of something you cannot observe directly: the “counterfactual” electricity consumption that would have been observed absent the efficiency intervention. Recent studies have looked into how projected savings from energy efficiency programs compare with realized energy savings in a variety of settings.  Whereas some find that savings materialize as expected, others find that realized energy savings fall short of predictions (some recent studies can be found here and here). This can happen for a variety of reasons including mis-calibration of simulation models, sub-standard installation of efficiency measures, rebound, or a failure to fully anticipate free riding behavior.

If energy efficiency savings are over-estimated, the denominator in the above equation will be inflated. If a state’s emissions are divided by an artificially large number, the stringency of the rate-based standard is effectively reduced.  In other words, too many emissions will be permitted per unit of electricity generated. Importantly, damages from these increased emissions will work to offset – or even eliminate – the relative cost advantages of demand-side compliance options.
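A toy calculation makes the denominator-inflation point concrete. All of the quantities below are hypothetical, chosen only for illustration:

```python
# Toy illustration of how over-credited negawatts loosen a rate-based
# standard. All quantities are hypothetical.
def reported_rate(tons_co2, mwh_generated, negawatt_hours):
    """Compliance rate: tons CO2 / (MWh generated + negawatts)."""
    return tons_co2 / (mwh_generated + negawatt_hours)

emissions = 50_000_000        # tons CO2
generation = 100_000_000      # MWh
true_savings = 5_000_000      # MWh actually saved
claimed_savings = 10_000_000  # MWh credited if savings are over-estimated

print(round(reported_rate(emissions, generation, true_savings), 3))    # 0.476
print(round(reported_rate(emissions, generation, claimed_savings), 3)) # 0.455
```

The reported rate falls even though actual emissions and generation are unchanged: phantom negawatts make the standard easier to meet.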

Measurement matters

The moral of this story is *not* that we should throw outside-the-fence compliance flexibility out with the bath water.  The real punchline is that measurement matters. Over-estimates of energy savings make energy efficiency look artificially cheap relative to supply-side strategies. This mis-measurement undermines cost-effectiveness. Under a rate-based standard,  it can also reduce stringency and increase emissions.

The EPA has indicated that it intends to develop guidance for evaluation, measurement, and verification (EM&V) of demand-side energy efficiency programs for the purpose of this rule. This is a daunting but important task. The good news is that there is a lot of work being done by academic researchers, government agencies, and other stakeholders to inform this process and advance the state of the art. Providing clear guidance and resources to help states tackle these EM&V challenges will be key to realizing the real potential of outside-the-fence efficiency improvements.



EPA and climate regulation: Mind the gaps

Last week, following years of anticipation, the shoe finally dropped on EPA carbon regulations. Two things are notable here. First, we’re on the road to a national climate policy!  Second, EPA is going out of its way to highlight how much flexibility it will give to states in implementing this policy. Traditionally, most regulations under the Clean Air Act have taken the form of “command and control” technology mandates for industries and equipment located inside dirty, “non-attainment” counties and regions. Economists have long complained about this and spoken of the virtues of more flexible, market-based approaches.


That’s why there has been so much excitement amongst the econ-wonk community about the EPA’s embrace of flexibility. Instead of rigid technology requirements, which really aren’t a great tool for dealing with CO2 in the electricity sector, states can find other ways to come up with equivalent reductions – like cap-and-trade!

So flexibility can be great, but there are some reasons to worry about too much flexibility when it comes to CO2 reduction strategies. While the announcement has had economists manning their whiteboards to extol cap-and-trade, it is not at all certain that “flexibility” will translate to “cap-and-trade.” The EPA is clear that it considers its regulation to be fundamentally based upon emissions rates (CO2/MWh) rather than limits on total CO2. This means an emissions rate (or emissions intensity) standard will likely be one of the flexible options. Emissions intensity standards, such as the low carbon fuel standard (LCFS) for fuels, are sort of the ugly stepsister to cap-and-trade. They attack environmental problems through rates, and have “trading” elements where one can comply with the rate standard, but they do not explicitly limit the total amount of pollution.

The complaint about intensity standards, as articulated by Stephen Holland, Jon Hughes, and Chris Knittel in their 438-part series “why we hate intensity standards,” is that while they raise costs on dirty sources that exceed the standard, they implicitly subsidize sources that are still dirty, but cleaner than the standard. Thus gas plants can receive implicit subsidies to replace the output of coal plants, even though gas plants still produce CO2.

Within a single state or region, this is dismissed by some as a lesser-of-evils, but there is a real potential for mischief if some states go one way (cap) and others go another way (rate standards). This is because an intensity standard subsidizes the output of plants cleaner than the standard.  Gas plants in Arizona could end up having their output subsidized by an Arizona standard to import power into California, displacing sources that would have otherwise been capped.  If the Arizona plants are dirtier than the California plants they displace, total emissions increase.
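The implicit-subsidy mechanics can be shown with a stylized intensity standard: with a standard s (tons CO2/MWh) and a credit price p ($/ton), each source effectively pays (r - s) × p per MWh of output, where r is its own emissions rate. The rates and prices below are made up for illustration:

```python
# Stylized intensity standard: with standard s (tons CO2/MWh) and
# credit price p ($/ton), a source with emissions rate r effectively
# pays (r - s) * p per MWh of output. Rates and prices are made up.
def per_mwh_charge(rate, standard, credit_price):
    return (rate - standard) * credit_price

standard = 0.7       # tons CO2/MWh
credit_price = 20.0  # $/ton

coal_rate = 1.0      # hypothetical coal plant
gas_rate = 0.4       # hypothetical gas plant, cleaner but still emitting

print(round(per_mwh_charge(coal_rate, standard, credit_price), 2))  # 6.0  (taxed)
print(round(per_mwh_charge(gas_rate, standard, credit_price), 2))   # -6.0 (subsidized)
```

The gas plant is effectively paid $6 for every MWh it produces, even though each of those MWh still puts CO2 into the air – which is exactly the cross-border mischief described above.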

More generally, California is going to have to come to grips with how its internal climate policy will interact with those adopted by other states. While the expansion of CO2 regulation to other states would appear to reduce concerns about negative spillovers, new compatibility issues will be created. California will face decisions about which states to partner with, and how to design its policies to better adjust to those adopted by states it chooses not to partner with.  If it joins other states under a common cap, California will have to confront the fact that its aggressive renewable electricity goals could simply create more headroom under the cap for other states to use fossil fuels (something I wish the city of Davis would appreciate).  Some issues, such as leakage, could be mitigated if neighboring states adopt programs similar to California’s. However, neighboring states could also adopt compliance pathways that are inconsistent with California’s, and even exacerbate concerns such as leakage and reshuffling.

I don’t want to sound too pessimistic. The EPA’s actions are a tremendous development in the climate policy arena. Just a year ago it was common at conferences to hear statements like “there will be no national climate policy in the US in the foreseeable future.” The Obama administration is using the tools at its disposal to push the ball forward and I am glad they are doing it. However, those who thought California’s worries about leakage will be solved by these developments need to think again. We still need to mind the gaps.



Unlocking Cost Savings with Cap-and-Trade

Much was made last week about the flexibility of the EPA’s proposed new power plant regulations. According to EPA administrator Gina McCarthy, it “gives states the flexibility to chart their own customized path. There is no one-size-fits-all solution. Each state is different so each state’s path can be different.”

Perhaps most importantly, states would be able to meet their required carbon dioxide reductions by starting a cap-and-trade program, or by joining an existing cap-and-trade program like the one that is part of California’s AB32. This approach raises legal and administrative challenges, but it also could dramatically reduce costs by incorporating emissions reductions from other sectors.

For decades, economists have emphasized the efficiency gains associated with cap-and-trade and other market-based environmental policies. The economic argument for cap-and-trade is right out of Econ 101: every textbook teaches it the same way, and I have always thought it is the best way to understand how cap-and-trade works.

In the video, I draw marginal abatement cost curves for two sectors. If you’d like, you can think of “Sector A” as fossil-fuel power plants and “Sector B” as all other sectors. I then show how incorporating this additional sector with a cap-and-trade program could reduce costs substantially. Cap-and-trade moves abatement from high-cost sectors to low-cost sectors, thus reducing the total cost of achieving a given level of reductions.

By the way, none of this hinges on exactly what these abatement cost curves look like. In fact, with cap-and-trade we don’t even have to know, ex ante, where to find the cheap abatement. By putting a price on carbon dioxide emissions you create an incentive for abatement in all sectors of the economy.  The price efficiently allocates abatement across and within sectors, and spurs innovation in new and sometimes unexpected technologies.
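The two-sector logic of the video can also be sketched numerically with linear marginal abatement cost curves. The slopes and the 100-ton target below are hypothetical, chosen to mimic a steep “Sector A” and a flat “Sector B”:

```python
# Two-sector textbook example with linear marginal abatement cost
# (MAC) curves: MAC_A = 4q (steep) and MAC_B = q (flat). Slopes and
# the 100-ton target are hypothetical.
slope_a, slope_b = 4.0, 1.0
target = 100.0

def total_cost(q_a, q_b):
    # Area under each linear MAC curve.
    return 0.5 * slope_a * q_a**2 + 0.5 * slope_b * q_b**2

# Trading equates marginal costs: slope_a*q_a = slope_b*q_b, q_a+q_b = target.
q_a = slope_b / (slope_a + slope_b) * target  # ~20 tons in sector A
q_b = target - q_a                            # ~80 tons in sector B
permit_price = slope_a * q_a                  # common marginal cost

print(round(permit_price, 2))                        # 80.0 ($/ton)
print(round(total_cost(q_a, q_b), 2))                # 4000.0 with trading
print(round(total_cost(target / 2, target / 2), 2))  # 6250.0 with an equal split
```

Trading shifts most of the abatement to the low-cost sector and, in this example, cuts total cost from 6250 to 4000 – a 36% saving from flexibility alone.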

And while the theory is compelling, it is also important to emphasize that cap-and-trade is not some unattainable ideal that only exists in the classroom. Think back to 1990, when President George H.W. Bush signed into law the EPA’s cap-and-trade program for sulfur dioxide.  Between 1990 and 2004, sulfur dioxide emissions decreased by 36% — yielding $50 billion annually in benefits at a cost of less than $2 billion per year (here).  Economists have estimated that, relative to technology standards, the program reduced costs by as much as 90% (here).

We have practical experience with nitrogen oxides too. EI@Haas faculty Meredith Fowlie and Catherine Wolfram, along with EI@Haas alumnus Chris Knittel (MIT) have a paper measuring the potential cost-savings from a cap-and-trade program for NOx (here).  It turns out that NOx abatement from power plants is relatively expensive, like the “Sector A” in the video, while abatement from cars and other mobile sources is relatively cheap, like the “Sector B” in the video.  Consequently, they find that there would be enormous cost savings from equating marginal costs across the two sectors.

As Max explained with his “Yoga Theorem” last week, the less flexible you are, the more you will suffer.  This is exactly the case with the EPA’s proposed regulations. Nobody can predict with certainty exactly how much these regulations would cost, but economic theory is extremely clear on the benefits of making these policies as multi-sector and as price-based as possible. In practice, every time we have implemented market-based environmental policies, they have ended up costing less than expected, and there is every reason to believe this would work again for carbon dioxide.


The Yoga Theorem

With yesterday’s historic release of the EPA’s new carbon emissions policy, I took an extra day to comb through and digest the news.

I have organized my intermediate microeconomics class around something called the “Yoga Theorem.” This almost universal truth states that the less flexible you are, the more you will suffer.  It holds in a very large number of settings (e.g., tax incidence, market power). Yesterday, the Obama administration – barred from implementing a national price-based (= very yogaesque) policy like a carbon tax or cap and trade – turned up the heat on existing coal-fired power plants. This is big news. Almost 40% of energy-related US CO2 emissions come from power generation, and the new rule will cut these emissions by 30%. This means the rule will result in a 12% overall reduction in emissions by 2030 relative to 2005 baseline emissions.  I hear cheering from the left and jeering from the right.

As far as standards are concerned, there is a lot to like about the new rule. Each state has a target spelled out in terms of pounds of CO2 per MWh.

Instead of prescribing what states have to do to meet these standards, the rule offers a number of flexibility mechanisms. For example, states can upgrade older plants, switch from coal to natural gas, ramp up their energy efficiency efforts or increase renewable generation off site. This strategy is designed to help states tailor approaches to their local economies and fuel mixes. States can even meet the targets by implementing their own carbon taxes or joining existing cap-and-trade schemes. This is conceptually very similar to a global climate regulation architecture that allows countries to choose how to meet a pre-specified target – except that here there is an enforcer with a big stick, which we do not have globally.

Let’s take a brief step back and look at the broader picture.

In 1990, US energy-related CO2 emissions were 5040 million metric tons (MMT). The Kyoto Protocol, had we ratified it, would have pushed us to 7% below that level by 2008-2012 – a target of 4687.2 MMT. In 2005, we emitted 5999 MMT. The new rules, if they get implemented, would get us to 5279 MMT by 2030. That is about 12.6% above the Kyoto target. If we compare the new target to today’s (2013) emissions of 5393 MMT, the new plan only reduces emissions by 2.1% by 2030, since emissions have come down drastically since 2005 thanks to the natural gas revolution. The choice of 2005 as a baseline thus flatters the advertised reductions by counting cuts that have already happened. The 30% reduction from the power sector is equivalent to a 7.5 percent reduction if you use today as a baseline, not 2005.
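The baseline arithmetic can be redone in a few lines (all figures in million metric tons):

```python
# Redoing the baseline arithmetic (all figures in million metric tons).
emissions_1990 = 5040.0
emissions_2005 = 5999.0
emissions_2013 = 5393.0
target_2030 = 5279.0

kyoto_target = emissions_1990 * (1 - 0.07)  # 7% below 1990
print(round(kyoto_target, 1))                              # 4687.2
print(round((target_2030 / kyoto_target - 1) * 100, 1))    # 12.6 (% above Kyoto)
print(round((1 - target_2030 / emissions_2013) * 100, 1))  # 2.1  (% cut vs. 2013)
```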

Another thing that has changed since 1997, the year the Kyoto Protocol was signed, is that emissions from China have skyrocketed. Negotiators from China, India, and other rapidly developing economies have always argued that they would never agree to regulation unless the EU and US adopted their own, and even then responsibility should be “common but differentiated,” meaning we should do more and they should do less.

While I applaud the Obama administration for this very smart piece of regulation in a world where the right side of the aisle is hostile to least-cost, market-based approaches, I am concerned that it will do little to push the countries that matter to act in a significant way. Still, one day after the new rule was published, China signaled that it is likely to put a total cap on its carbon emissions, not just on the carbon intensity of its GDP. We will find out soon how big that Chinese cap is, and whether the negotiating strategies of the LDCs will change at the all-important Paris meeting of the parties.

I am certain that this new rule is part of a solution, but by no means the last word in mitigation policy. We need to do much more. And very soon.




Chumps or champs? California leads on climate

Governor Jerry Brown, speaking at a Giannini Foundation event last week, summarized California’s dilemma with respect to climate change:

“We’re one percent of the (climate change) problem. We have to get other states and other nations on a similar path forward, and that is enormously difficult because it requires different jurisdictions and different political values to unite around this one challenge of making a sustainable future out of our currently unsustainable path.”

What can economics tell us about this critical collective action problem?  As is often the case with economics, we’ll need at least two hands to answer this question.

On the one hand, implementing a regional policy to combat climate change in a state that accounts for only one percent of the problem seems futile.  Pick up any resource economics textbook and you will find a dismal (even by economists’ standards) tale of the tragedy of the commons. The basic story goes something like this. A group of individuals share a limited resource (an ocean, a pasture, an aquifer, a planet). With everyone acting independently to maximize their own immediate gain, they end up depleting the resource, even though everyone can see that it is not in anyone’s long-term interest to do so.
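The textbook logic can be made concrete with a toy model (my own illustration, with made-up parameters, not anything from the post): n herders each graze cows on a shared pasture, and the value of each cow falls linearly in the total herd size. Because each herder ignores the cost imposed on everyone else, the Nash equilibrium overgrazes the pasture relative to the social optimum.

```python
# Toy tragedy-of-the-commons calculation (hypothetical parameters).
# Each cow is worth v(C) = a - b*C, where C is the total herd on the pasture.

a, b, n = 120.0, 1.0, 10   # assumed value intercept, slope, number of herders

# Symmetric Nash equilibrium: herder i maximizes c_i * (a - b*C), taking
# the others as given. The first-order conditions give c = a / (b*(n+1)).
c_nash = a / (b * (n + 1))
C_nash = n * c_nash                      # total grazing in equilibrium
welfare_nash = C_nash * (a - b * C_nash) # total value of the herd

# Social optimum: maximize total value C*(a - b*C), so C* = a / (2b).
C_opt = a / (2 * b)
welfare_opt = C_opt * (a - b * C_opt)

print(C_nash, welfare_nash)  # ~109 cows, total welfare ~1190
print(C_opt, welfare_opt)    # 60 cows, total welfare 3600
```

With ten herders, equilibrium grazing is nearly double the efficient level and total welfare is roughly a third of what cooperation would deliver. That is the dismal baseline against which Ostrom's success stories, discussed below, stand out.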

[Image: Garrett Hardin. Source: http://www.quotesdump.com/garrett-hardin-image-quotes-and-sayings-3/garrett-hardin-image-quotes-and-sayings-3-2/]

Years ago, the standard economic narrative was that all commons problems inevitably end in this kind of ecological disaster. And if this is true, regional climate change mitigation policies in California, Europe, and elsewhere will be unavailing. Any fossil fuels we refrain from burning will just get burned somewhere else.

How’s that for dismal?

On the other hand, there are some clear counterexamples: plenty of cases where large groups of people with competing interests have managed a shared and finite resource sustainably. Elinor Ostrom collected and analyzed these cases in meticulous detail, beginning with her dissertation work on groundwater management in Los Angeles.

[Image: Elinor Ostrom. Source: http://im-an-economist.blogspot.com/2012/03/evening-with-elinor-ostrom.html]

In 2009, Elinor Ostrom was awarded the Nobel Prize in Economic Sciences for her research that challenged the view that individuals cannot overcome the commons dilemmas they face. Drawing from extensive field work, creative laboratory experiments, and game theoretic modeling, she worked to understand how groups of individuals can cooperate to organize sustainable, long-term use of common pool resources.

The woman was brilliant. But if you are looking for a silver bullet in Ostrom’s work, you won’t find it.  If there is a unifying theme, it’s that there is no panacea.

Ostrom did look for broader institutional regularities in the success stories she studied. One of these: governance structures built from the bottom up. Smaller units can develop ideas, establish norms and make rules informed by local knowledge, culture, and circumstance. As larger units become involved, larger governance structures can leverage momentum and take advantage of the trial and error learning that has been going on at a smaller scale. This may sound like hippie-talk. I am from Berkeley, after all. But Ostrom has the data to back this up.

In an earlier blog post, Severin argued that the primary goal of California climate policy should be to develop the technologies that can facilitate low-carbon economic growth.  He’s right. But policy innovation is an important complement to technology innovation. California can serve as an important lynchpin in the emerging “polycentric” system of policies designed to decouple economic growth and increasing atmospheric concentrations of greenhouse gases.

A recent World Bank report notes that regional, national, and sub-national carbon pricing initiatives are proliferating. More than 40 national and 20 sub-national jurisdictions have now adopted some form of carbon pricing (either a tax or an emissions trading program). The map below (taken from the report) shows existing and planned carbon pricing programs.  Together these countries account for 22 percent of global emissions. When you add the regions that have established plans to implement programs, this share increases to almost 50 percent.

[Map: existing and planned carbon pricing initiatives, from the World Bank report]
It is impossible to precisely quantify the extent to which California is helping to accelerate the diffusion of these policies and programs.  But there is plenty of anecdotal evidence that California’s experience is serving to support and advance these collective actions. Take China, for example. Last year, California’s Governor Brown and China’s top climate change negotiator signed a Memorandum of Understanding which included pledges to work together on sharing low-carbon strategies and creating joint-ventures on clean technologies. China is currently establishing and implementing pilot cap-and-trade programs in seven of its provinces and cities covering 250 million people.

Economic theory tells us that, if we assume all users of a scarce resource act to maximize their own narrowly defined self-interest, they will inevitably meet with tragic ends. Work by Elinor Ostrom and others has shown that, if we use more nuanced models that allow for more complex motivational structures (such as trust, cooperation, optimism, a will to lead), solutions to the commons problem are complex and hard won, but not impossible.

Global climate change is the most challenging collective action problem we have ever faced. Individual states cannot tackle the problem on their own. But state-level actions can move us towards a more global solution. In the words of Elinor Ostrom:

“While we cannot solve all aspects of this (climate change) problem by cumulatively taking action at local levels, we can make a difference, and we should.” 
