The Yoga Theorem

With yesterday’s historic release of the EPA’s new carbon emissions policy, I took an extra day to comb through and digest the news.

I have organized my intermediate microeconomics class around something called the “Yoga Theorem.” This almost universal truth states that the less flexible you are, the more you will suffer. It holds in a very large number of settings (e.g., tax incidence, market power). Yesterday, the Obama administration – barred from implementing a national price-based (= very yogaesque) policy like a carbon tax or cap and trade – turned up the heat on existing coal-fired power plants. This is big news. Almost 40% of energy-related US CO2 emissions come from power generation, and the new rule will cut those emissions by 30%. That works out to a 12% overall reduction in emissions by 2030 relative to 2005 baseline emissions. I hear cheering from the left and jeering from the right.
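The tax-incidence case of the theorem can be made concrete. In the standard linear model, the share of a per-unit tax borne by consumers is e_s / (e_s + e_d), where e_s and e_d are the supply and demand elasticities in absolute value – the less elastic (less flexible) side bears more of the burden. A minimal sketch with illustrative numbers:

```python
# Tax incidence as a one-line illustration of the Yoga Theorem:
# the consumer share of a tax is e_s / (e_s + e_d), so the less
# elastic ("flexible") side of the market suffers more.
def consumer_share(e_supply, e_demand):
    """Fraction of a per-unit tax borne by consumers (elasticities > 0)."""
    return e_supply / (e_supply + e_demand)

# Flexible consumers (elastic demand) shift most of the burden away...
print(consumer_share(1.0, 3.0))   # 0.25
# ...while inflexible consumers (inelastic demand) absorb most of it.
print(consumer_share(1.0, 0.25))  # 0.8
```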

As far as standards go, there is a lot to like about the new rule. Each state has a target spelled out in pounds of CO2 per MWh.

Instead of prescribing what states have to do to meet these standards, the rule offers a number of flexibility mechanisms. For example, states can upgrade older plants, switch from coal to natural gas, ramp up their energy efficiency efforts, or increase renewable generation off site. This strategy is designed to let states tailor their approach to their local economies and fuel mixes. States can even meet the targets by implementing their own carbon taxes or joining existing cap and trade schemes. This is conceptually very similar to a global climate regulation architecture, which allows countries to choose how to meet a pre-specified target. Except that in this context there is an enforcer with a big stick, which we do not have globally.

Let’s take a brief step back and look at the broader picture.

In 1990, US energy-related CO2 emissions were 5040 million metric tons (MMT). The Kyoto Protocol, had we ratified it, would have pushed us to 7% below that level by 2008–2012 – a target of 4687.2 MMT. In 2005, we emitted 5999 MMT. The new rules, if implemented, would get us to 5279 MMT by 2030. That is about 12.6% above the Kyoto target. If we compare the new target to today’s (2013) emissions of 5393 MMT, the new plan reduces emissions by only 2.1% by 2030, since emissions have already come down drastically since 2005 thanks to the natural gas revolution. So the choice of 2005 as a baseline to advertise reductions conveniently includes what has happened already. The 30% reduction from the power sector is equivalent to a 7.5 percent reduction if you use today as a baseline, not 2005.
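These comparisons are easy to recompute (note that 7% below the 1990 level of 5040 MMT works out to 4687.2 MMT):

```python
# Sanity-checking the emissions comparisons above (all in MMT of CO2).
e_1990 = 5040   # 1990 US energy-related CO2 emissions
e_2005 = 5999   # the rule's 2005 baseline
e_2013 = 5393   # roughly today's emissions
e_2030 = 5279   # emissions implied by the new rule in 2030

kyoto_target = e_1990 * (1 - 0.07)                  # 7% below 1990
print(round(kyoto_target, 1))                       # 4687.2
print(round((e_2030 / kyoto_target - 1) * 100, 1))  # 12.6: % above Kyoto
print(round((1 - e_2030 / e_2005) * 100, 1))        # 12.0: % cut vs. 2005
print(round((1 - e_2030 / e_2013) * 100, 1))        # 2.1: % cut vs. today
```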

Another thing that has changed since 1997, the year the Kyoto Protocol was signed, is that emissions from China have skyrocketed. Negotiators from China, India and other rapidly developing economies have always argued that they would never agree to a regulation unless the EU and US had their own. And even then, there should be a “common but differentiated” responsibility – meaning we should do more and they should do less.

While I applaud the Obama administration for this very smart piece of regulation in a world where the right side of the aisle is hostile to least-cost, market-based approaches, I am concerned that it will do little to move the countries that matter to act in a significant way. China, one day after the new rule was published, signaled that it is likely to put a total cap on carbon emissions – not just on the carbon intensity of GDP. We will find out soon whether the negotiating strategies of the rapidly developing countries will change at the all-important Paris meeting of the parties, and how big that Chinese cap is.

I am certain that this new rule is part of a solution, but by no means the last word in mitigation policy. We need to do much more. And very soon.

 

 


Chumps or champs? California leads on climate

Governor Jerry Brown, speaking at a Giannini Foundation event last week, summarized California’s dilemma with respect to climate change:

“We’re one percent of the (climate change) problem. We have to get other states and other nations on a similar path forward, and that is enormously difficult because it requires different jurisdictions and different political values to unite around this one challenge of making a sustainable future out of our currently unsustainable path.”

What can economics tell us about this critical collective action problem?  As is often the case with economics, we’ll need at least two hands to answer this question.

On the one hand, implementing a regional policy to combat climate change in a state that accounts for only one percent of the problem seems futile.  Pick up any resource economics textbook and you will find a dismal (even by economists’ standards) tale of the tragedy of the commons. The basic story goes something like this. A group of individuals share a limited resource (an ocean, a pasture, an aquifer, a planet). With everyone acting independently to maximize their own immediate gain, they end up depleting the resource, even though everyone can see that it is not in anyone’s long-term interest to do so.

[Image: Garrett Hardin. Source: http://www.quotesdump.com/garrett-hardin-image-quotes-and-sayings-3/garrett-hardin-image-quotes-and-sayings-3-2/]

Years ago, the standard economic narrative was that all commons problems inevitably end in this kind of ecological disaster. And if this is true, regional climate change mitigation policies in California, Europe, and elsewhere will be unavailing. Any fossil fuels we refrain from burning will just get burned somewhere else.

How’s that for dismal?

On the other hand, there are some clear counterexamples. There are plenty of cases where large groups of people with competing interests have managed to find a way to manage a shared and finite resource sustainably. Elinor Ostrom collected and analyzed these cases in meticulous detail, beginning with her dissertation work on groundwater management in Los Angeles.

[Image: Elinor Ostrom. Source: http://im-an-economist.blogspot.com/2012/03/evening-with-elinor-ostrom.html]

In 2009, Elinor Ostrom was awarded the Nobel Prize in Economic Sciences for her research that challenged the view that individuals cannot overcome the commons dilemmas they face. Drawing from extensive field work, creative laboratory experiments, and game theoretic modeling, she worked to understand how groups of individuals can cooperate to organize sustainable, long-term use of common pool resources.

The woman was brilliant. But if you are looking for a silver bullet in Ostrom’s work, you won’t find it.  If there is a unifying theme, it’s that there is no panacea.

Ostrom did look for broader institutional regularities in the success stories she studied. One of these: governance structures built from the bottom up. Smaller units can develop ideas, establish norms and make rules informed by local knowledge, culture, and circumstance. As larger units become involved, larger governance structures can leverage momentum and take advantage of the trial and error learning that has been going on at a smaller scale. This may sound like hippie-talk. I am from Berkeley, after all. But Ostrom has the data to back this up.

In an earlier blog post, Severin argued that the primary goal of California climate policy should be to develop the technologies that can facilitate low-carbon economic growth.  He’s right. But policy innovation is an important complement to technology innovation. California can serve as an important lynchpin in the emerging “polycentric” system of policies designed to decouple economic growth and increasing atmospheric concentrations of greenhouse gases.

A recent World Bank report notes that regional, national, and sub-national carbon pricing initiatives are proliferating. More than 40 national and 20 sub-national jurisdictions have now adopted some form of carbon pricing (either a tax or an emissions trading program). The map below (taken from the report) shows existing and planned carbon pricing programs.  Together these countries account for 22 percent of global emissions. When you add the regions that have established plans to implement programs, this share increases to almost 50 percent.

[Map: existing and planned carbon pricing programs worldwide, from the World Bank report]
It is impossible to precisely quantify the extent to which California is helping to accelerate the diffusion of these policies and programs.  But there is plenty of anecdotal evidence that California’s experience is serving to support and advance these collective actions. Take China, for example. Last year, California’s Governor Brown and China’s top climate change negotiator signed a Memorandum of Understanding which included pledges to work together on sharing low-carbon strategies and creating joint-ventures on clean technologies. China is currently establishing and implementing pilot cap-and-trade programs in seven of its provinces and cities covering 250 million people.

Economic theory tells us that, if we assume all users of a scarce resource act to maximize their own narrowly defined self-interest, they will inevitably meet with tragic ends. Work by Elinor Ostrom and others has shown that, if we use more nuanced models that allow for more complex motivational structures (such as trust, cooperation, optimism, a will to lead), solutions to the commons problem are complex and hard won, but not impossible.

Global climate change is  the most challenging  collective action problem we have ever faced. Individual states cannot tackle the problem on their own. But state-level actions can move us towards a more global solution. In the words of Elinor Ostrom:

“While we cannot solve all aspects of this (climate change) problem by cumulatively taking action at local levels, we can make a difference, and we should.” 


An Energy Efficiency Parable

Here’s a story that captures a lot of the challenges we face as we try to improve energy efficiency. Spoiler alert: it doesn’t have a happy ending, but I’m holding out hope for the sequel.

Many of us leave our work computers on 24/7 – estimates suggest more than half of us. I do mainly because I like to use remote desktop from home.

In some organizations, the computer teams insist that desktops remain on overnight so they can run software updates, download the newest security patches, backup files, and do other important stuff during hours when people are less likely to be working.

The folks who pay an organization’s energy bills – let’s call them the accountants – would probably be unhappy if they knew how much they were paying to power computers between the hours of 5PM and 9AM. It’s a lot of money — at electricity prices of 11 cents per kilowatt-hour, a 75W computer left on for all 8760 hours costs almost $75 per year, while one left on for 2000 working hours only costs $16.50. Estimates suggest that nationwide, desktops consume several times more energy than servers.
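The back-of-the-envelope arithmetic works out as follows (a sketch using the 11 cents/kWh price from above):

```python
# The cost arithmetic above, as a tiny function (11 cents/kWh assumed).
def annual_cost(watts, hours_on, price_per_kwh=0.11):
    """Annual cost, in dollars, of a load running `hours_on` hours/year."""
    kwh = watts * hours_on / 1000.0
    return kwh * price_per_kwh

print(round(annual_cost(75, 8760), 2))  # 72.27: a 75W desktop on 24/7
print(round(annual_cost(75, 2000), 2))  # 16.5: on only during working hours
```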

But accountants have no visibility into the computers’ energy costs – they simply pay a utility bill at the end of every month, and few have any idea how much of it is going to computers versus air conditioning, lighting, etc.

Even if accountants knew exactly how much money they were spending on idle computers, it’s unlikely they could do anything about it. The computer folks, who operate the network, and in many cases dictate what kind of computer employees can buy, would revolt if the accountants mandated that computers be put to sleep or turned off at night.

The computer team cares much, much more about keeping the computers virus-free than about saving energy. They will get midnight calls, or get hauled in to see the boss, if anyone is infected by a virus.

Employees probably would be upset if they had to turn off their computers, too. How annoying is it to run in late for a phone call and then have to wait 3 minutes for your computer to boot so you can find the phone number? This isn’t an issue if computers are put to sleep and not turned off. But, many of us don’t know how to do that, have bad associations with early versions of sleep, or just can’t be bothered to do it.

Like other energy efficiency examples, this is a case of split incentives – the computer teams’ and employees’ versus the accountants’ – plus poor information.

No computer team is going to request computers that use less energy if it in any way makes the computers more vulnerable to viruses or harder to operate. And, no one in the organization has good information about how much money is spent on energy because of nocturnal computers.

But wait, my neoclassical economist friends might say. Isn’t there a boss who cares about BOTH lower energy bills and productive employees with virus-free computers? Shouldn’t the boss be motivated to collect the information and then get the accountants and the computer team to sit down and figure out a good solution? Workdays would need to be 108 hours long before that crossed most bosses’ minds.

What if Apple and Microsoft developed operating systems that allowed computers to receive downloads from the network even while asleep? Sleeping computers use almost no energy – 1 to 2 watts – which is almost as good as turning them off.

A colleague of mine at LBNL, Bruce Nordman, was part of a team that set out to solve just this problem. If this were a movie, they’d be wearing capes.

They worked with a standards organization and formed a committee that included Apple, Microsoft, AMD, Intel, and Sony (game consoles are networked and often stay on to receive occasional bleeps), among others. It is no easy task to get a bunch of competitors together to talk about collectively changing their products, particularly when the topic is as unsexy as energy efficiency.

They helped develop a “network proxy,” which effectively lets a computer go to sleep while the proxy tells the network it’s awake and then wakes the computer up if anything important comes down the pike. Energy Star now provides an incentive for companies to include the feature in their product, and the program’s early support for the technology was critical to initiating the standards process.
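The idea behind a network proxy can be sketched in a toy simulation: the proxy answers routine presence checks on the host’s behalf and only wakes it for traffic that actually needs it. The message names below are invented for illustration; the real protocol work happens at the packet level.

```python
# A toy version of the network-proxy idea.
class SleepingHost:
    def __init__(self):
        self.awake = False

    def wake(self):
        self.awake = True

class NetworkProxy:
    """Stands in for a sleeping host on the network."""
    def __init__(self, host):
        self.host = host

    def handle(self, message):
        if message == "are-you-there":   # routine presence check
            return "yes"                 # answered without waking the host
        self.host.wake()                 # anything else is important
        return "waking-host"

host = SleepingHost()
proxy = NetworkProxy(host)
print(proxy.handle("are-you-there"), host.awake)  # yes False
print(proxy.handle("incoming-file"), host.awake)  # waking-host True
```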

Apple bought into it, as did some chip manufacturers. Unfortunately, Microsoft didn’t buy in. And, Apple now ships its computers with the capability turned off, though you can turn yours on – see here, it’s called “Wake on Demand.”

What are the general lessons about energy efficiency? First, we should encourage more teams like Nordman’s. How? We can’t rely on the market to support them because split incentives and information asymmetries are examples of market failures.

Public research and development funding is certainly one answer. Nordman and his team were funded by state and federal grants. And, while the network proxying story has yet to make much progress, they have had other successes, like encouraging Ethernet chip providers to change the standard to use 75-90% less energy.

A lot of people are championing energy efficiency. The recent Shaheen-Portman bill on energy efficiency had bipartisan support, though it eventually failed. Most carbon mitigation scenarios assume a hefty share of the greenhouse gas reductions will come from energy efficiency. But figuring out how to unlock energy efficiency’s potential will be difficult. And it’s less exciting than developing a new solar panel or a new gizmo.

Fortunately, there are a lot of smart people, like Nordman, trying to tackle the organizational, as well as the technical, issues in order to solve the energy efficiency puzzle.


Money for Nothing?

Since the beginning of electricity grids, demand has fluctuated and supply has been made to follow along. But for decades, economists and some grid engineers have dreamed of having demand play a more active role in balancing the system.  With increasing use of intermittent renewable energy resources, now is the time to make that demand-response dream come true.  But we can only get there if we clear up a common misconception in the world of electricity policy:

Paying customers to reduce their demand – such as through the common Peak-Time Rebates (PTR) — is not the same as time-varying electricity prices.  PTR is a very inefficient alternative to charging time-varying rates that reflect the true time-varying cost of electricity.

Despite the flaws, PTR programs, and their close cousin aimed at businesses – demand bidding programs – are growing. In California, Pacific Gas & Electric, Southern California Edison, and San Diego Gas & Electric have demand bidding programs, and SCE and SDG&E have PTR programs. PTR programs are also in place in Maryland, New Orleans, Ohio, and many other parts of the country.

What’s wrong with paying for demand reduction?  Where to start.

No, I mean that literally, where to start? What’s the baseline level from which you start paying for reduction?  In nearly all programs, baselines are based on the customer’s consumption in the recent past, usually on other high-demand days.

But such “endogenous” baseline setting distorts incentives for conservation. When my baseline for peak-time reduction is based on consumption during other high-demand days, it undermines my incentive to conserve on those other days. Frank Wolak demonstrated that this effect is significant in his study of Anaheim. It can even lead to intentionally increasing consumption during baseline-setting times, as was uncovered at the Baltimore Orioles’ stadium, where the energy manager was turning on stadium lights during electricity shortages in order to be paid for then shutting them off.


Camden Yards: Home of the Baltimore Orioles and a scofflaw energy manager

The distorted incentives also undermine long-run investments in energy efficiency.  Sure, a more efficient air conditioner will cut my usage at critical times when the utility is paying rebates, but it will also cut my usage at baseline-setting times, which will lower my rebates, thus reducing my incentive to upgrade the A/C.  In contrast, time-varying pricing — where prices are reduced in most hours, but higher during high-demand times – results in bigger (and appropriate) rewards for buying the efficient A/C.

Once the baseline is set, another problem pops up: the incentive to conserve changes drastically around the baseline quantity. Rebate programs pay for reductions below the baseline, but don’t charge extra for going above it. Take a typical hot summer day when a conservation rebate day is announced. On top of facing a price of, say, $0.20/kWh, the customer now gets a rebate of $0.60/kWh for the difference between her baseline and her actual consumption, but only if she cuts consumption below the baseline quantity. That means that below the baseline quantity the incentive to save is $0.80/kWh – the electricity price she saves by consuming one less kWh plus the rebate she gets for doing so. But above the baseline, the incentive to save is still just $0.20.


The economic cost of consuming a kWh changes drastically at baseline quantity
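This kink is easy to express in code, using the illustrative $0.20 price and $0.60 rebate from the text:

```python
# The kink at the baseline, in code: below the baseline a saved kWh is
# worth the retail price plus the rebate; above it, just the price.
def marginal_incentive(consumption_kwh, baseline_kwh,
                       price=0.20, rebate=0.60):
    """Dollar value of cutting consumption by one more kWh."""
    if consumption_kwh <= baseline_kwh:
        return price + rebate   # save the price AND earn one more kWh of rebate
    return price                # only the retail price is saved

print(round(marginal_incentive(40, 50), 2))  # 0.8  (below baseline)
print(round(marginal_incentive(60, 50), 2))  # 0.2  (above baseline)
```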

 

The way PTR baselines are set, on a typical critical day nearly half of all customers will be far enough above their baseline quantity that they won’t have any shot at getting the rebate, and therefore won’t have any extra incentive to cut consumption. In addition, many of the customers who do end up below baseline and receive a rebate will be there by accident – for instance, because they just happen to have been away from home that day. In demand reduction studies, people who receive subsidies for doing something they would have done anyway are called “free riders.” Wolak’s study and others have shown that most of the demand reducers who get paid the rebate are actually free riders.

The drastic change in the effective price at the baseline quantity has another effect that goes against fairness: it rewards random variation in consumption.  Here’s a fictional example:

Catherine’s consumption during the baseline setting period has established a baseline quantity of 50 kWh during each of two conservation rebate days.  She is a reliable sort, who dutifully reduces her consumption to 45 kWh on each conservation rebate day.  The rebate is $0.60 per kWh “saved,” so for cutting her consumption to 45 kWh on each day, saving a total of 10kWh on the two days, she receives a total rebate of $6.00.

Max is the flighty, unpredictable type.  He has also established a baseline quantity of 50 kWh, but on one of the two rebate days he’s away from home anyway and his consumption drops to 20 kWh.  On the other day, he stays home and cranks up the A/C, raising his consumption to 70 kWh.  His average consumption on the two days is 45 kWh per day, just like Catherine, but his total rebate is $18.00, all received for his 20kWh day.

Max’s rebate is three times as large as Catherine’s even though his average reduction on the rebate days is the same.  That’s because when Max decreases his consumption he gets the $0.60 rebate for every kWh below the baseline, but when he goes above baseline he doesn’t have to pay the extra $0.60.  The sudden change in the effective price at the baseline quantity rewards people with unpredictable demand relative to people who have reliable consumption patterns.  If during each hour the price were the same for all kWh that Catherine and Max consume, Max would be getting the same bill savings as Catherine.  That’s what would happen if Max and Catherine were on time-varying pricing instead of PTR.
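Catherine’s and Max’s rebates can be computed directly – $0.60 for each kWh below the 50 kWh baseline, and no penalty for exceeding it:

```python
# Rebates in the Catherine-and-Max story above.
def ptr_rebate(consumption_by_day, baseline=50.0, rebate=0.60):
    """Total rebate over a list of event-day consumption figures (kWh)."""
    return sum(rebate * max(baseline - c, 0.0) for c in consumption_by_day)

catherine = ptr_rebate([45, 45])  # a steady 5 kWh cut on each day
max_ = ptr_rebate([20, 70])       # away one day, A/C cranked the other
print(catherine, max_)            # 6.0 18.0
```

Same average reduction, triple the rebate: all of Max’s payout comes from his one low-consumption day, while his high day costs him nothing extra.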

If PTRs are so bad, why are they so popular?  Because they hide the cost.  Rather than a higher price on the hottest days of the year — reflecting the truly higher cost of providing electricity on those days – PTR pays out for conservation (real or imaginary) on those hot days and raises the price a bit on all other days to cover the cost.  Thus, customers who consume a higher share of their electricity on non-peak days (e.g., those who use less, or no, air conditioning) subsidize heavy peak-time users who manage to be slightly less heavy users on a specific peak day.

Some defenders of PTR say it is the way to transition to time-varying electricity pricing.  I’m very skeptical.  Once a customer gets used to being paid for reducing consumption on peak days, it is very difficult to change to a system that just charges higher prices on those days.  There may be a utility that has managed to move from fully-implemented PTR to time-varying pricing, but I’m not aware of any example.

To integrate intermittent renewable energy sources, we really need to start taking demand-side participation seriously. PTR is an inefficient route to that end, one that will end up paying for a lot of faux “demand reduction.” Time-varying pricing is the direct route to the goal. The Sacramento Municipal Utility District recently had a very successful rollout of critical-peak pricing, one form of time-varying pricing, which the real Catherine has blogged about. My own research suggests that time-varying pricing would reduce bills for the majority of residential and industrial customers, and that it would raise bills by more than 20% for only a few percent of customers. Those are the customers who consume the most at peak times and impose the most cost on the system. Prices that reflect the cost of electricity would be a more effective way to integrate renewables and a fairer way to allocate the costs.

 

If you want an even geekier discussion of the problems with PTR, I recorded a screencast video on the subject for my MBA course on Energy & Environmental Markets.

I’m still tweeting interesting energy news articles @EnergyClippings


Driving Restrictions and Air Quality

¡Feliz Cinco de Mayo! Today we travel south of the border for an update on Mexico City’s well-known driving restrictions. What better way to celebrate Cinco de Mayo? Some people prefer margaritas, but here at the Energy Institute we like to turn up the mariachi music and analyze air pollution data. ¡Orale!

Also known as “Hoy No Circula” (HNC), Mexico City’s driving restrictions have now been in place for 25 years, and have spurred similar restrictions in Santiago, Sao Paulo, Bogota, Medellin, San Jose, Beijing, Tianjin, and Quito. The format differs across cities, but most of these programs follow Mexico City’s approach and restrict driving based on the last digit of the license plate.

The initial rationale for HNC was local air pollution. Mexico City had some of the highest ozone levels in the world in the late 1980s, and the program was a politically visible way to attempt to address the problem. Unfortunately, there is no evidence that HNC actually improved air quality (Davis, 2008; Gallego, Montero, and Salas, 2013). Drivers did not switch to the subway or bus system. Instead, they used taxis more and bought additional cars so that they could drive every day. Gasoline sales (below) increased steadily throughout the period.

[Chart: Mexico City gasoline sales over time]

Despite the lack of empirical support, HNC has remained in place. And, in the summer of 2008, the program was expanded to include Saturdays. Again, the primary rationale was air quality (details here), with the Mexico City government attempting to address Saturday air pollution levels that had increased during the 2000s to reach and often exceed typical weekday levels.

The implementation of “Hoy No Circula Sabatino” was much like the original restrictions. Vehicles with a license plate ending in “5” or “6”, for example, cannot drive on the first Saturday of each month. Certain newer vehicles are exempt, so in practice about 8% of vehicles are not allowed to drive on any given Saturday.
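The last-digit rule can be sketched as a lookup. The text confirms only the first pairing (plates ending in 5 or 6 rest on the first Saturday); the remaining weeks below mirror the weekday HNC schedule and are an assumption, and exemption handling is omitted.

```python
# A sketch of the Saturday rotation by last digit of the license plate.
SATURDAY_ROTATION = {
    1: {"5", "6"},   # first Saturday of the month (confirmed in the text)
    2: {"7", "8"},   # remaining weeks: assumed to follow the weekday order
    3: {"3", "4"},
    4: {"1", "2"},
    5: {"9", "0"},
}

def restricted_on_saturday(plate, day_of_month):
    """True if a (non-exempt) plate may not drive on this Saturday."""
    week_of_month = (day_of_month - 1) // 7 + 1
    return plate[-1] in SATURDAY_ROTATION[week_of_month]

print(restricted_on_saturday("ABC-125", 3))   # True: first Saturday
print(restricted_on_saturday("ABC-125", 10))  # False: second Saturday
```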

[Image: Hoy No Circula Sabatino program brochure]

There is more discretionary driving on Saturdays, so one might expect these restrictions to be more effective at getting drivers to substitute to public transportation or to avoid trips altogether. One might also expect Saturday restrictions to engender substitution to Sundays, which would be welfare-improving since traffic and pollution levels tend to be lower.

I am working on a full-scale analysis of the Saturday restrictions. In the meantime, here is some preliminary graphical evidence on what happened to air quality. I constructed a dataset describing hourly pollution levels from all Saturdays during 2007, 2008, and 2009. These data come from 48 monitoring stations located throughout the city that are part of Mexico City’s RAMA network.

Air pollution in Mexico City varies widely due to weather and seasonal factors, so it is important to control for this. The figures below plot the variation in air pollution that is left after controlling flexibly for these factors. I also plot a line which follows the overall pattern. This fitted line is allowed to “jump” on July 5th, 2008, the day when the Saturday restrictions started. For clarity, I also show a vertical line on that same day.
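A stripped-down version of the “line allowed to jump” idea: regress the (weather-adjusted) pollution series on a time trend plus a post-policy dummy, so the dummy’s coefficient estimates the jump at the policy date. The data here are synthetic stand-ins for the actual RAMA residuals.

```python
# Estimating a discontinuity at the policy date with a trend + dummy.
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(156.0)                    # 156 Saturdays over 2007-2009
post = (t >= 78).astype(float)          # on/after July 5, 2008
# Synthetic series: mild trend, a 0.3 jump at the policy date, noise.
y = 0.01 * t + 0.3 * post + rng.normal(0, 0.05, size=t.size)

# Regress on an intercept, a time trend, and the post-policy dummy.
X = np.column_stack([np.ones_like(t), t, post])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"estimated jump at the policy date: {beta[2]:.2f}")
```

With real data, a jump estimate near zero (or positive) is exactly the disappointing pattern described below the figures.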

More work needs to be done, but these preliminary results are pretty disappointing. If the program had a beneficial effect on air pollution, you would expect the fitted line to the left of July 5th, 2008 to be higher than the fitted line to the right. But across pollutants, there is no evidence of a decrease in pollution when the Saturday restrictions began. If anything, several of the figures actually show a small increase in pollution.

If these results hold up, they will raise further questions about the effectiveness of programs like this. Driving restrictions may seem like a reasonable approach for addressing the difficult problem of urban air pollution. However, the effectiveness of any indirect policy like this depends on the available substitutes.

Drivers everywhere have a revealed preference for fast and convenient transportation and will look for ways to circumvent rationing programs of this form. Depending on the emissions characteristics of available alternatives, these changes in behavior can seriously undermine the potential benefits.

[Figures: Saturday pollution levels around the July 5, 2008 start of the restrictions, by pollutant: CO, NOx, PM10, PM2.5, and O3]


Your tax dollars hard at work. EIA’s new data portals.

We empirical economists get very excited about finding or generating new data sets. There are big returns to splicing together different data sources to answer new and interesting questions. This is hard work and not everyone is good at it. Many datasets are hidden deep inside government vaults and some are under lock and key. You need to become a sworn Census Officer to access many of them. I have a lot of friends who took that oath. There is a reason to take privacy concerns seriously and we need to protect the identity of individuals in these microlevel datasets.

What I find frustrating, though, is how difficult it is to get access to publicly available data on the energy economy. It should not be hard to figure out where we are drilling for natural gas. It should not be difficult to figure out where pipelines run. Wouldn’t it be nice to have the majority of the public datasets the federal government collects on energy online in an easy and accessible format? Make a map of gas wells. Make a map of oil wells. Download these data in GIS format onto your computer and splice them in with your favorite dataset? For those of you who have tried to do this in the past, it was an exercise in banging your head against your 27” flatscreen. Well, a new era has dawned. I am not sure which EIA administrator (Richard Newell, Howard Gruenspecht or Adam Sieminski) deserves the credit, but a little while back the EIA started rolling out its U.S. Energy Data Mapping System.

What they have done is start putting their spatial datasets online in one easily accessible system. So how about that map of natural gas and oil wells? Thirty seconds of clicking produces this!

[Screenshot: EIA map of US natural gas and oil wells]
Or what about that story that there is a lot of drilling going on in North Dakota? Is that true? Zoom in on North Dakota and choose the satellite background layer:

[Screenshot: EIA map zoomed in on North Dakota drilling, satellite background]

A quick click on notes and sources gets you access to where the data come from. If the shapefiles are public, there is a link to download the original data. Clicking on individual points on the map gets you information about that datapoint.

This effort of EIA to make public data accessible will not only generate new papers, but also make it easy for anyone interested in the local and national energy economy to visualize aspects relevant to their inquiry.

For the nerdier folks who know what an API is, the EIA now has one of those too. If you program in R, Matlab or Stata (or use Excel), you can get updated versions of data series automatically via this interface. Currently the interface contains:

  • 408,000 electricity series organized into 29,000 categories
  • 30,000 State Energy Data System series organized into 600 categories
  • 115,052 petroleum series and associated categories
  • 11,989 natural gas series and associated categories
  • 132,331 coal series and associated categories (released Feb 25, 2014)
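Pulling one of those series takes only a few lines. The endpoint form, series id, and response layout below follow the original (v1-style) series API and should be treated as assumptions – check the EIA’s API documentation for the series you actually want, and substitute your own registered key.

```python
# A sketch of reading a series from a v1-style EIA API response.
def latest_observations(payload, n=5):
    """Return the n most recent (period, value) pairs from a v1-style
    response, which nests observations at payload["series"][0]["data"]."""
    return payload["series"][0]["data"][:n]

# A live call would look like this (requires a key from eia.gov):
#   import json, urllib.request
#   url = ("http://api.eia.gov/series/?api_key=YOUR_KEY"
#          "&series_id=ELEC.GEN.ALL-US-99.A")
#   payload = json.load(urllib.request.urlopen(url))

# Canned response with made-up numbers, purely to show the shape:
sample = {"series": [{"series_id": "ELEC.GEN.ALL-US-99.A",
                      "data": [["2013", 123.4], ["2012", 120.1]]}]}
print(latest_observations(sample, 1))  # [['2013', 123.4]]
```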

With so much data readily accessible to anyone with a computer, I anticipate that we will see more papers on, better analysis of, and certainly better maps depicting the national energy economy. For those of you who get your horoscope by email in the morning, I would also highly recommend subscribing to the daily “Today in Energy” mailing. These (unlike your daily horoscope) are well-researched short pieces discussing something important and/or timely every morning. I could not imagine having my coffee without it.

Thank you Team EIA!


Will Smog in China Spur Climate Solutions?

I have read a number of news stories about air pollution in the major Chinese cities recently. A soupy smog of particulates, ozone, sulfur and nitrogen oxides hangs over Beijing, Tianjin and other northern cities. The concentration of particulate matter (PM2.5) in Beijing recently registered at 501 μg/m3, more than 15 times the highest recorded value in Los Angeles County.

Beijing Smog

Expats are fleeing the country, while the lifespans of people who live in these cities fall. The primary culprits for much of the air pollution are the coal-fired power plants, which produce roughly 80 percent of China's electricity.

Some of my clean tech colleagues seem to be almost cheering for Chinese smog, though. They seem to believe that the Chinese will be forced to invest in renewables and clean up their energy sector to address the local pollution. Because it is visible to the naked eye, has a distinctive smell, and has immediate impacts on quality of life, smog, unlike greenhouse gases, will spur a clean energy transformation. Or so some argue.

I love the idea of killing two birds with one stone as much as the next person, but I’m skeptical of this particular application. I worry about the greenhouse gas implications of both demand- and supply-side responses to smog.

On the demand side, I worry that people will react to air pollution by consuming more energy. I was in Singapore recently and stunned to learn that 30% of the households do not have air conditioning — this in a country with the third highest average income and beastly hot (to my Minnesota-born tastes) weather. If I had to live in Singapore without air conditioning, I might never sleep.

Natural Air Conditioning in Singapore?

But, a good share of the local Singaporeans seem to think that “air conditioning” involves opening the windows wide and capturing any wisp of a breeze.

As air pollution increases, the natural, low-energy approach to air conditioning becomes less attractive. My colleague at the National University of Singapore, Alberto Salvo, is working on a study that will document just how much air conditioner purchases and electricity consumption increased during a recent episode of poor air quality.

Similarly, wealthy Chinese are investing in air conditioners and air purifiers, and more people are spending time in the miles and miles of air conditioned underground shopping centers that seamlessly connect with above ground buildings. If the air is hot, muggy and polluted, why ever go outdoors?

But, if smog encourages governments to adopt renewables for energy production, it won’t matter that city-dwellers are consuming more energy. Will that work? I have concerns about the supply side responses to smog as well.

Unfortunately, most commercial-scale technologies that remove local pollution from the energy sector create more greenhouse gases. In other words, greenhouse gases and local pollutants are typically substitutes and not complements in the production process.

Consider coal gasification, a process that transforms coal into methane. Power plants that burn natural gas emit many fewer criteria pollutants than coal plants, so turning coal into natural gas and then burning the gas to make electricity can reduce local air pollution significantly.

China currently has one operating coal gasification plant and four under construction. The government recently announced plans to produce the equivalent of more than 10% of its total gas demand using the technology by 2020. In fact, if the gas created from these five plants plus four others that are already permitted were all used to generate electricity in an efficient combined cycle natural gas plant, it would produce more electricity than China's wind turbines.

So, coal gasification will help reduce local pollution and it appears commercially viable, at least in China. Unfortunately, it’s a disaster for climate change.

This study reports that, "If all 40 or so of the projected [coal to gas] facilities are built, the GHG emissions would be an astonishing ~110 billion tonnes of CO2 over 40 years." To put this in context, all of China currently emits less than 10 billion tonnes annually. Gasifying coal to burn in a natural gas power plant can produce almost twice as much greenhouse gas as a coal power plant.
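To make the comparison concrete, here is a back-of-envelope check using the study's round numbers (not an independent estimate): annualizing 110 billion tonnes over 40 years and comparing it to China's current total.

```python
# Annualize the study's lifetime figure and compare it to China's current
# total emissions. Both inputs are the round numbers quoted above.
total_ghg_tonnes = 110e9        # ~110 billion tonnes CO2 over the projects' lifetime
lifetime_years = 40

annual_from_gasification = total_ghg_tonnes / lifetime_years  # 2.75 billion t/yr
china_annual_emissions = 10e9   # China's current total, just under 10 billion t/yr

share = annual_from_gasification / china_annual_emissions
print(round(share, 3))  # → 0.275
```

In other words, the projected facilities alone would add emissions equal to more than a quarter of everything China emits today, every year, for four decades.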

As far as I'm concerned, the only potential silver lining is that it appears much easier to sequester the CO2 emitted from coal that has first been converted to gas than to sequester the CO2 from a coal power plant.

But, this will involve convincing the Chinese government that they need to address both climate change, by investing in sequestration, and local smog, by gasifying their coal. Unfortunately, there’s no free lunch from addressing smog.

Of course, coal gasification is not the only, nor necessarily even the cheapest, means of reducing local air pollution. Other options include building more nuclear plants, accessing Chinese shale gas reserves and burning gas instead of coal, and replacing old and inefficient coal plants with newer, more efficient plants fitted with pollution control technology (scrubbers, baghouses, etc.). But, other than nuclear, these will go much further toward reducing local air pollution than toward reducing greenhouse gases.

So, we need to continue pushing for real climate solutions as we are unlikely to see a silver bullet emerge as the by-product of some other goal.
