Do Residential Energy Efficiency Investments Deliver?

Today’s post is co-authored by Michael Greenstone (University of Chicago) and Catherine Wolfram

We recently released a paper presenting the findings of a first-of-its-kind, randomized controlled evaluation of the returns to some common residential energy efficiency investments. The study’s context is the nation’s largest residential energy efficiency program, the Weatherization Assistance Program (WAP). You can read media coverage of the paper here, here, here, and here.

For those who haven’t read about the paper, between 2011 and 2014, we administered a randomized controlled trial (RCT)—considered the gold standard in evidence—on a sample of more than 30,000 WAP-eligible households in the state of Michigan in order to shed some light on a critical question: Do investments in important residential energy efficiency measures (improved insulation, air sealing, weather-stripping, window replacement, furnace replacement, etc.) deliver the energy savings they promise?

The research revealed five main results: (1) The energy efficiency measures undertaken by households in the study reduced their energy consumption by between 10 and 20 percent on average; (2) However, these savings were just 39 percent of the average savings predicted by engineering models; (3) There is no evidence that the shortfall in savings is the result of rebound—households did not turn up their thermostats after the investments were made; (4) While the investments cost roughly $4,580 on average, our best estimate of the energy savings was about half of these costs[1]; and (5) The costs also greatly exceeded the benefits when the monetary value of pollution reductions is added to the energy savings to calculate benefits. While the WAP program has a number of goals, when measured by the energy savings and emissions benefits, these efficiency upgrades were not a good investment.


The urgency of the climate challenge means that it is critical to identify cost-effective strategies that will deliver real greenhouse gas emissions reductions. Energy efficiency is a crucial component of most climate change mitigation plans, underscoring the importance of developing a body of credible evidence on the real-world—versus projected—returns on energy efficiency investments in the residential sector and beyond.

Such a process will undoubtedly uncover some gems, but in some instances it will also be necessary to update our beliefs. When seemingly inconvenient evidence comes to light that challenges our beliefs—as we have uncovered with this analysis—that evidence should not be dismissed and ignored. Instead, it should be used to inform our strategy to confront climate change. The magnitude of the climate challenge requires that we ruthlessly pursue the most cost-effective mitigation options.

Our paper has generated some strong reactions and important questions, some the result of misconceptions about what exactly we evaluated and how the evaluation was conducted. In the remainder of this blog, we respond to the most common criticisms of our study and its findings.

*              *              *

Reaction 1: This is just one study and scores of other studies have opposite findings.

Some critics have cited prior evaluations showing that residential energy efficiency programs are good investments and that our study is an anomaly. Many of these evaluations, however, are based on savings projections that, as we found, can significantly overestimate the savings achieved in the real world. Other studies use real-world data, but analyze these data using methods that can confound the effects of energy efficiency improvements with other factors that drive changes in energy consumption.

Our study is different. It represents a first-of-its-kind evaluation using a randomized controlled trial, the gold standard for rigorous evaluation. Society routinely relies on this methodology to assess the efficacy of new drugs, treatments, and other interventions. This approach is increasingly used in the social sciences, including criminology, education, development economics, and energy economics. In many instances, the application of randomized control trials has changed the conventional wisdom. Our application of this approach to residential energy efficiency measures is therefore an important departure from, and improvement upon, previous analyses.

Reaction 2: The study unfairly paints WAP as an ineffective program.

WAP has multiple goals, and improving the living standards of its recipients is clearly a central and worthy one. Our study does not claim to provide a comprehensive evaluation of WAP, nor would it be appropriate to do so.

Rather, the study’s purpose is to measure the real-world energy savings resulting from WAP-funded energy efficiency improvements. We then compare them to both the investment costs and the projected energy savings generated from detailed energy audits.

In interpreting the results, it is important to bear in mind that for a measure to be implemented under WAP, federal regulations require that it pass a cost-benefit analysis—that is, the projected cumulative energy savings must be greater than the investment costs. This cost-benefit analysis is based on an in-home energy audit conducted using an engineering model, in this case the National Energy Audit Tool (NEAT).

For the households we studied, NEAT-driven audits projected that the WAP measures would reduce annual energy consumption by 43.7 million British thermal units (MMBtu). Yet, when we observed the energy bills of households that received WAP measures, the actual energy savings were just 17.2 MMBtu. In other words, the model systematically over-predicted energy savings by a factor of 2.5.
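The comparison above reduces to simple arithmetic. As a minimal sketch using only the figures quoted in this post (the variable names are ours):

```python
# Projected vs. realized annual energy savings, using the figures quoted above.
projected_mmbtu = 43.7  # NEAT audit projection (MMBtu per household per year)
realized_mmbtu = 17.2   # savings observed in household energy bills

realization_rate = realized_mmbtu / projected_mmbtu   # share of projected savings delivered
overprediction = projected_mmbtu / realized_mmbtu     # factor by which the model over-predicted

print(f"realization rate: {realization_rate:.0%}")      # -> realization rate: 39%
print(f"over-prediction factor: {overprediction:.1f}")  # -> over-prediction factor: 2.5
```

This is where both the "39 percent of projected savings" and "factor of 2.5" figures in the post come from.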

This is an important finding. The investments in efficiency in our study underperformed relative to projected values and in a way that the program was expressly designed to avoid. Homeowners, program managers, and taxpayers only received 39 percent of the projected savings. According to the Department of Energy, the NEAT model is used by approximately 700 state and local Weatherization Assistance Program subgrantees in more than 30 states.

Broader program objectives notwithstanding, WAP is a compelling setting to learn about the returns to energy efficiency investments. WAP is the nation’s largest residential energy efficiency program. According to the Department of Energy, which administers the program, more than 7 million homes have participated in the program since its inception in 1976. If one is attempting to assess the performance of commonplace residential energy efficiency investments on a large scale, there may be no better option.

Reaction 3: The study’s calculations of costs and benefits are inaccurate.

Here again, it is important to note that we recognize WAP has benefits beyond saving energy. But our study focused solely on evaluating the energy-related (and associated emissions) costs and benefits. We never claim to evaluate the other benefits of these upgrades, as that is beyond the scope of our study. It is also important to note that, no matter how one decides to evaluate monetary costs and benefits, a central finding of our analysis remains unaffected: efficiency upgrades delivered just 39 percent of the energy savings they promised. It is therefore challenging to find a set of assumptions (e.g., about lifespans and discount rates) that would cause these efficiency investments to have energy savings and emissions benefits that exceed their costs.

To drill down a bit more, here are some of the criticisms of our calculation of the costs and benefits and our responses:


Some have argued that it is inappropriate to factor in costs that don’t directly lead to energy savings. As anyone who has done home repairs knows, once you start down the path to do something like lay new insulation, additional costs are necessarily incurred. For example, weatherization can reduce indoor air quality by tightly sealing a house, so additional costs may be required to maintain indoor air quality. Separating what’s required to lay the insulation from what’s completely separate is not easy. The average household in our sample received approximately $4,600 in energy efficiency upgrades, which includes roughly $800 in costs required to make installation of the weatherization measures safe and functional, such as wiring upgrades. Our judgment is that the most reasonable assumption is to include all of these installation and materials costs. It is worth noting, however, that if we take the polar opposite view and exclude all costs that do not directly result in energy savings, the average cost per household still significantly exceeds our central estimate of the energy savings.

Moreover, there are other costs associated with these retrofits that are not reflected in our cost-benefit comparison. For example, we do not include any program overhead or administrative costs. Nor do we account for the hassle and effort that households expend to implement a weatherization retrofit, even one with zero out-of-pocket costs. An earlier blog post makes the point that these process costs can be large (we found that it cost $1,050 per weatherized household to encourage take-up of these measures). Accounting for these additional expenses would of course widen the gap between costs and savings.


We measure benefits by calculating the net present value of annual energy savings using a range of discount rates (3, 6, and 10 percent) and investment lifespans (10, 16, and 20 years). Our estimate of the benefits also includes an estimated upper bound on the benefits households derive from increased warmth (based on our analysis of “rebound” in demand for heat in the winter). In no case does the present value of energy savings reach parity with actual costs, even if we exclude the costs of measures that do not directly produce energy savings.

In calculating program benefits, we used real 2013 residential energy prices for electricity and natural gas in Michigan and assumed that these figures would increase at the rate of inflation over the lifetime of the investments. While some have criticized this as too conservative, it is standard to use current energy prices as a predictor of future energy prices.
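The present-value calculation described above can be sketched as a standard annuity: with energy prices rising at the rate of inflation, real annual savings are constant, so the NPV has a closed form. The $300-per-year savings figure below is purely a hypothetical placeholder (the paper’s own dollar estimates are not quoted in this post); the discount rates and lifespans are those listed above.

```python
def npv_of_savings(annual_savings, discount_rate, lifespan_years):
    """Present value of a constant annual energy-savings stream.

    Because prices are assumed to rise with inflation, real annual savings
    are constant, and the NPV is a simple annuity:
        NPV = S * (1 - (1 + r)**-L) / r
    """
    return annual_savings * (1 - (1 + discount_rate) ** -lifespan_years) / discount_rate

# Illustrative only: $300/year is a hypothetical savings figure, not from the paper.
for r in (0.03, 0.06, 0.10):      # discount rates used in the study
    for life in (10, 16, 20):     # investment lifespans used in the study
        print(f"r={r:.0%}, {life} years: ${npv_of_savings(300, r, life):,.0f}")
```

Even the most generous combination here (3 percent over 20 years) multiplies annual savings by only about 14.9, which illustrates why annual savings of a few hundred dollars fall short of an average investment cost near $4,580.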

Reaction 4: The results cannot be generalized because they only relate to one part of Michigan, to one program, and to one subset of the population.

We study a subset of low income households in Michigan undertaking a particular set of residential efficiency measures recommended by NEAT. However, minimizing the significance of our findings on account of this context ignores the ubiquity of the measures we analyze and of the reliance on audit tools like NEAT.

As noted above, the households in the sample we studied were subjected to the same measurement tool that is used by residential weatherization programs throughout the country to gauge which upgrades are the most cost effective; and all implemented measures had to pass the same cost-benefit analysis. The types of upgrades installed at the WAP households in our sample (e.g., furnace upgrades, improved insulation, and weather stripping) are commonplace for home retrofits for all income groups.

Drawing implications from a study is not an all-or-nothing proposition. For example, the results of a randomized controlled trial studying the effectiveness of a given drug or treatment on middle-aged men will in some instances tell us everything we need to know about its effectiveness on young women. Of course, in other instances, less decisive conclusions are warranted until further research is conducted.

Our study tells us that a common set of efficiency measures installed in the low-income households we studied in Michigan did not deliver the expected energy savings, and that investment costs significantly exceeded these savings. Given similarities between the setting we evaluate and other efficiency applications, these findings likely generalize to a broader set of residential efficiency investments. There is logic behind this implication, while we also acknowledge the need for further experiments on the returns to energy efficiency investments in other contexts. (Indeed, we have already begun to conduct them and, in at least one case, our preliminary results are qualitatively similar.)

Reaction 5: The study period covered a time when the program experienced a significant increase in funding that led to poor results (e.g., inexperienced contractors).

The time period we studied included an unprecedented number of weatherizations because the American Recovery and Reinvestment Act (ARRA) increased the amount of money allocated to the program dramatically. As a result, some say new, inexperienced contractors were called in to do weatherizations and their work may not represent the norm.

To investigate this possibility, we compared savings at homes where the contractors were experienced to homes where the contractors weren’t experienced and found no difference in the average energy savings. Consequently, we find no evidence that inexperience during the time period played a role in explaining the lower-than-expected savings.

[1] This blog focuses on a subset of the numbers and results reported in the paper. Here we emphasize our preferred estimates from the randomized controlled trial, which estimate average impacts for the subset of households whose participation in weatherization was the result of random assignment to our experimental intervention. These households are associated with somewhat lower average costs ($4,580) as compared to the larger sample of recipient households from whom we collected data for our quasi-experimental analysis ($5,150).


35 Responses to Do Residential Energy Efficiency Investments Deliver?

  1. Jane S. Peters says:

    The vast majority of evaluations of WAP programs throughout the country, often using quasi-experimental designs, have shown that WAP programs are not cost effective. So your findings are nothing new. The problem with your posting is the claim, implied by your title, that a WAP program evaluation is relevant to any other residential weatherization program or to residential energy efficiency policy generally. Low income programs are not residential programs writ large; your implication that they are is misleading. Research that makes claims beyond the actual population studied is unbecoming of an academic institution as well respected as the Haas School and the University of California.

    • Azmat says:

      So you think low-income people are ‘different’ from non-low income people. If anything the low income people should show a greater efficiency outcome; after all savings are ‘more important’ to them.

      • ARRAgreen says:

        I don’t believe that is what Ms. Peters is conveying. She was merely pointing out that energy savings in a low income housing setting don’t necessarily equate to the standard behavior in all residences across the globe / US.

        Next research topic to be explored based on your logic: energy projects in mid- and high-income housing will ALWAYS show more energy savings than low-income housing. Retaining these energy savings is very important to mid- and high-earners because the savings represent an important part of their income stream (as evident because these are the same folks who are obviously not low income).

    • Jeff Miller says:

      I don’t believe a word of what you say

  2. The study mirrors my personal experience. Also, the greatest gains in heating and cooling efficiency were obtained from the least expensive improvements. In 2004, I bought a standard suburban Houston, 1974-era, two-story, wood- and brick-clad home. The house was dilapidated and the electric bill in the summer was enormous. Cooling is a bigger problem for Houstonians than heating.

    I made the following investments over the next ten years: Complete HVAC replacement including ducts for $8500 with a small reduction in electricity usage. Then complete roof replacement for $16,500 with another small bump down in electricity usage. I then replaced all windows and doors for $23,000 with the smallest improvement yet. Two years ago, I hired a contractor to nail down every board, caulk every gap, apply three coats of paint, and blow insulation in the attic for $7800. This improvement reduced my electricity usage by a third… a third! Subsequently, I have replaced about fifty percent of my lighting with LED’s. I have spent a couple hundred on this and the savings have been about another five to ten percent.

    Overall, I have reduced my electricity usage by half, over 10 years. I would have made all of the big investments in the house regardless, but the biggest savings and the most cost effective savings have come from paint, caulk, insulation and LED lightbulbs.

    • Azmat says:

      “but the biggest savings and the most cost effective savings have come from paint, caulk, insulation and LED lightbulbs.” that says it all — the simplest solutions are often the most effective.

      • I feel compelled to suggest that the potential savings from the HVAC and windows were not fully realized until I sealed and insulated the house. I should have sealed and insulated the house first and then made the other investments. Still, no amount of electricity savings will ever recoup the $56,000 I have spent.

      • Also, I love the LED’s. My kitchen has ten bulbs, once required five hundred watts worth of incandescents. Now there are ten LED’s which together, burn less than a hundred watts and they are significantly brighter too. You have to wear sunglasses when I turn all the lights on!

      • Paint does not save energy. Caulk generally does not save energy unless some version of it is used to seal between the occupied space and the attic or crawlspace. On the other hand, there are problems with the simulation programs used to “predict” savings.
        At the same time there is a problem even with the concept of “cost effectiveness”. How cost effective is it for us to breathe clean air or to have our grandchildren on a planet that has not been compromised by global warming? Only an economist would dare to put a monetary value on that. Within that fold are the economists who insist that “energy efficiency upgrades would have happened anyway” and that therefore building codes don’t matter. Economists, I am sure, are lovely people. However they live in their world of fiction where all things can be monetized.
        What this study provides us with is another look at the energy savings effectiveness of some package of measures applied in a particular way, AND that is important. On the other hand, people continually mix up a “program” evaluation with a “measure” evaluation.
        This was a program evaluation.
        There is much to be said about the effectiveness of residential energy efficiency measures and their delivery to different populations, in different climates, with different building characteristics. There is not room for all that here, but the discussion needs to happen. Note that I said effectiveness, not “cost effectiveness”. Cost effectiveness is a notion based on too many assumptions.
        PS: For low-income weatherization agencies who had staff trained and available for a limited number of houses, and who had a limited pipeline of clients, ARRA was like trying to take a drink of water from a fire hose.

      • ARRAgreen says:

        Interesting point, John, on viewing this as a program evaluation. I would argue that baselining is a major issue here — not that I pick on low income houses, but it would be good to have gotten a sense of conditions in these houses, i.e. were they drafty? Was the indoor temperature always whatever it was outside? Was it moist? How did pre- and post-retrofit temperatures and setpoints differ?

        On evaluating energy retrofit measures alone, I suspect that the package of measures may have left out key interactive effects. Were furnaces sized to pre- or post-weatherization loads? Was there duct cleaning? Was there repowering/replacement of furnace blowers? Were said furnaces/blowers discrete or continuous modulation? The point is there are too many data points left undisclosed.

        But what do I know, I’m neither a business person nor an economist.

    • Hello John Proctor, P.E. Paint and caulk reduced or eliminated the flow of air from the inside to outside and vice versa. I believe that sealing the house in this manner is what reduced my electric bills. If you added up the total area of all the holes and gaps, I had a gap the size of my front door continuously open. I am curious; do you think that three coats of beige latex paint would have any insulation value, especially during hot summer months?

    • Jay says:

      I work as an energy auditor / final inspector for IHWAP here in the state of Illinois. You have discovered what I try to tell my customers every day: big-ticket items like window and door replacement are not the energy savers people think they are. When I complete an energy audit I will enter all that information into a program that evaluates the cost of each item, to see if the investment is worth the cost of that item’s application. If the savings is not there, we do not install that item.
      The savings to your energy bills comes in the AIR SEALING and INSULATION.
      If you have a trained, qualified energy auditor do a thorough evaluation of your home using a blower door setup, he/she will find the air leaks in the thermal boundary of your home. Once these leaks are addressed you will see savings in your fuel bills as well as a rise in the comfort level.
      As in your case, the air sealing done through caulk, paint and closing up gaps did the best job of reducing your fuel bills. Not the high cost of replacing windows and doors.

      • mcubedecon says:

        What does sealing leaks do to indoor air quality? We’ve seen increases in asthma rates although outdoor air quality has improved. The next logical place to look is indoor air quality.

        • The Weatherization Assistance Program has implemented ASHRAE 62.2, which requires a certain amount of air circulation through the structure. Good home performance contractors also follow this standard. Prior to the standard we checked air infiltration with a post-install blower door test — and still do as a part of the final inspection. Our final inspections must be done by a BPI-certified Quality Control Inspector. Oh, and we work lead-safe (training and certification required) and follow all relevant OSHA standards.

  3. Did you find out how the temperature inside the homes responded to the weatherization? Were bedrooms warmer for a given thermostat setting, for example?

  4. I have the privilege to work for a network of 55 agencies that provide weatherization services throughout Ohio using state, federal and utility funding. Last year we served 24,000 families, providing comprehensive weatherization services to almost 12,000 (the balance were primarily baseload services). During ARRA our network weatherized 41,000 households in 27 months, far more than the 33,000 we were tasked with serving. If I were the authors of this study I would stop hyping it and instead figure out how to do a study that provides some useful insight(s) that could improve the program. This study does not. Following are a few of the reasons:

    1. The authors claim that the study is ‘a first of its kind’ and ‘the gold standard’. However, it fails to follow the protocols used by the weatherization and home performance industry to measure effectiveness. Sometimes the first time isn’t golden. To conclude that savings are lower than the costs of the measures is not an appropriate ‘randomized trial’. All good studies pull random homes to analyze, and use a randomized control group.

    2. The opening of the paper is devoted to a recruitment experiment using paid canvassers to convince low income families to enroll in the weatherization program. They were not very good at it, probably because people did not trust them. Our agencies have deep relationships in the community. People trust us. Nationwide the network weatherized over 800,000 homes during ARRA, serving 25% more families than the program goal. We found plenty of clients. Now that the ARRA money is gone, 1–2 year waiting lists are again common.

    3. The study did not include over 30,000 weatherized homes as some have reported. The savings study appears to be based on around 10% of the 1,600 homes actually weatherized during the early days of ARRA. The huge infusion of funding affected quality initially. Other states had a larger training infrastructure; Ohio operated four training centers.

    4. The study says that over 34% of the units had furnace replacements. Furnace replacements, after insulation and air sealing have been completed, are almost never cost-effective. Furnaces are replaced as a health and safety measure because they are not working – ‘no heats’ we call them – and paid for with other funds. The study authors somehow missed this and classed one of the most expensive measures as an efficiency investment, leading to the assertion that the program has a poor return on investment.

    5. Savings exceed the cost of air sealing, attic insulation, sidewall insulation, hot water measures, and electric baseload savings measures such as refrigerators and lighting. Those are the services that evaluation professionals measure.

    6. Agencies don’t do windows unless they are broken. Payback is over 35 years. It is a health and safety measure.

    7. Engineering models, including NEAT and the model developed by Berkeley for DOE’s new Home Energy Score program, regularly overstate savings and have other flaws. Everyone in the industry knows it. Audits are a tool to identify high end uses, not to predict outcomes, and do not use a 6% discount rate to determine net present value.

    8. In Ohio, weather-normalized bill analysis shows natural gas savings for low income families in the 28-34% range in independent studies funded by utilities looking at combo units – units combining federal and utility funded measures. The numbers have been consistent since 1995. Electric savings to the families are in the 8-12% range. Programs in other states are equally effective, though savings will vary by climate, housing stock, and other variables. A home that is weatherized well will provide comfort, prevent health problems, and save more in energy bills than the cost of the service.

    • ARRAgreen says:

      On #7, engineering models — I bet you 100 bucks that even the folks who developed any state-approved or peer-reviewed software packages wouldn’t stand by their models when compared to utility bills. More often than not these packages can’t capture “all” the savings correctly, let alone analyze interactive effects. And then I’ll bet you another 100 bucks that the “consultant” who used these software packages to punch out an energy savings report didn’t fully understand the software, and how they could fit a model to the real building and vice versa.

      So I’m not sure if it’s a big deal at all that savings were overestimated by a factor of 2.5 — we still captured 39% of the savings (at some confidence level, using the methods described by this paper, against a certain geographical / segmented population) anyway, on top of a whole host of non-energy benefits (IEQ, comfort, health, security).

      We’re also overlooking the fact that ARRA funds — and I forget the details — requires (mostly) American materials and parts to “help” stimulate the economy. Energy savings / carbon reduction / water savings was always a secondary objective per ARRA. So this paper potentially calls out a recipient of ARRA funding who painted too rosy of a picture and we were all better off with Solyndra, A123 and Freddie & Fannie?

      The ARRA infused $821B into the economy, of which $2.9B was allocated for weatherization of modest-income homes and, relatedly, $4.3B in home energy credits to individual taxpayers. Both allocations amount to a de minimis 1%, and there was actually a “measurable” ROI; it did mobilize people and industry when we needed it, and presumably there’s a host of non-energy benefits that our less-than-wealthy counterparts are enjoying. Without ARRA, weatherization programs, even in the realm of energy efficiency, tend to fall onto the backburner and are never as economically competitive as other measures related to lighting, HVAC, controls and industrial / process.

  5. Marty Kushler says:

    I must say this is an incomplete and selective response to the major criticisms of your study, and to the manner in which it was promoted. For readers not familiar with the critiques (since you don’t cite any here), I provide the URL for my own blog on this subject at ACEEE.

    But to cut to the chase, I am curious about the real purpose of your efforts here. Is it to cause policymakers to terminate the WAP program? If so, your failure to quantify the numerous well-known and well-documented “non-energy benefits” that are a key part of the rationale for WAP is a glaring omission, which frankly disqualifies this study from being used to pass summary judgement on WAP. Even you now acknowledge this limitation, but the damage may have been done with the plethora of media stories in response to your original promotion. (In fact, your original press release and associated “policy brief” contained no such caveat, and in fact indicated that your study had accounted for “the broader societal benefits of energy efficiency investments”.)

    Is the purpose to discredit residential energy efficiency programs in general? That certainly seems to be suggested by your initial press release and policy brief…not to mention the title of this posting itself. If so, this narrow study of one type of program, targeted solely to low-income customers, delivered in a small area of one state, during widely acknowledged adverse circumstances with conflicting policy objectives during the ARRA time period, fails completely for that purpose. Jane Peters’ succinct observation on that subject in her earlier comment here is very apt.

    And finally, please, no more rhetoric about your “gold standard” methodology. The actual methodology used to assess energy savings in the study was quasi-experimental in nature, and the actual analyzed sample was a tiny fraction of the “30,000 households” so frequently mentioned. This study is in no way superior to scores of good studies of energy efficiency programs conducted by experienced evaluation professionals over the years. For starters, I suggest you check out the archives at IEPEC.

  6. Scott Pigg says:

    As someone who has conducted third-party evaluations of these programs since the late 1980s, I’ll second the request to drop the “gold-standard” talk. You dismiss several decades of independent studies on these types of programs with a wave of the hand, but the simple fact is that while well-intentioned, your randomization approach was a failure, because so few people in the treatment group were actually weatherized in the end—despite throwing nearly half a million dollars of somebody’s money at the encouragement effort.

    How is it remotely an “important departure from, and improvement upon, previous analyses” when your estimated average savings of 20 percent has a 95% confidence interval that stretches from 4 to 36 percent savings? This study was statistically underpowered to the point that it could barely confidently detect any savings from the program at all, let alone make any conclusions about the magnitude of the savings. It sure looks to me like a case of the cure (randomization) being worse than the disease (possible bias from using quasi-experimental approaches).

  7. Thank you, Marty Kushler and Scott Pigg, for your thoughtful comments. For a thorough discussion of the flawed paper by Fowlie, Greenstone, and Wolfram, see my article on the topic here:

  8. Mike Spillman says:

    It is unfortunate that in today’s politicized world, any evaluation of an issue or study, controversial or not, needs to include an understanding of all the factors behind it. Thoroughness would include examining the funding, if only to refute any potential claims of bias (none of which is implied in this suggestion). It would perhaps be helpful to ascertain the source(s) of the funding for the authors’ study, any constraints on it, stated or implied, and any pre-publication review, peer-oriented or not.

  9. Pingback: Energy Efficiency Research Confirms: Measure What You Manage | Resynergy Systems

  10. Pingback: Are energy efficient products a rip-off? - Australia News

  11. Sam Borgeson says:

    I studied building science with a focus on building simulation and energy policy with a focus on efficiency as a climate mitigation strategy. Through my studies, I have come to know Wolfram and Fowlie professionally and deeply respect their research. I believe in the need for more empirical efficiency program evaluations and recognize that the urgency of climate mitigation requires an unsentimental re-evaluation of our efficiency program priorities. I applaud the use of energy consumption data in this study and support the use of randomized controlled trials to evaluate programs (but statistical power issues are a very real concern, not all programs lend themselves to such designs, and “gold standard” is a heavily freighted and disputed label for them). My intuition, experience studying building efficiency, and technical training all lead me to accept this study’s energy savings estimates.

    So in theory I should be very pleased by the release of this study. However, I am distressed to see that the technically competent savings estimates have been mixed with a remarkably naive interpretation of the complex and politically determined mission of the Weatherization Assistance Program (i.e., that it is reasonable to assign 100% of its costs to efficiency and that its generosity of assistance is representative of efficiency programs in general); with a straw-man dismissal of existing program evaluations and their lessons (they have been telling us for years that models consistently over-estimate savings); and with a surprising lack of interest in the incentives for over-estimating savings that cost-effectiveness criteria create (the models just simulate physics, but the users of the models get paid based on their results; remember Econ 101: incentives matter). The result has been a very misleading round of press coverage.

    In brief, the “gold standard” only applies to the energy savings measurements (there is room there for a critique of the sample quality, but the quality of this study’s execution was quite high). The cost effectiveness claims and generalization beyond low income programs that were the headline generating aspects of this project are backed only by some very heavily freighted assumptions that reasonable people could easily disagree about. For example, when you replace an old, malfunctioning furnace with a new and more efficient one, should 100% of the cost of the new furnace be allocated to the efficiency gains (what this study assumes) or should at least some of the cost be considered a necessary payment in the service of continued access to heat? Should a contractor intervening for an efficiency project in a home leave cancer-causing asbestos duct insulation in place because replacing it costs money but doesn’t save energy?

    Such questions do not have objective and precise answers, but they can lead to dramatically different views on what the underlying efficiency gains cost. In their widely publicized conclusions, the authors have taken a maximalist view: 100% of program costs are efficiency costs. What cannot be disputed is that among efficiency programs, only WAP pays the full costs of furnaces and asbestos abatement. This is the primary reason efficiency experts are tying themselves in knots to underscore that WAP programs are not representative of all efficiency programs on costs.
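To see how much the cost-allocation choice described above can move the bottom line, here is a toy calculation. Only the $4,580 average per-home cost comes from the paper; the 40% furnace share and the 50% "necessity" fraction are purely hypothetical illustrative assumptions:

```python
# Toy cost-allocation sketch; furnace_share and necessity_fraction are
# hypothetical assumptions, not figures from the study.
total_cost = 4580.0          # average per-home cost reported in the paper
furnace_share = 0.40         # hypothetical fraction spent on furnace replacement
necessity_fraction = 0.50    # hypothetical portion of furnace cost that restores
                             # access to heat rather than buying efficiency
efficiency_cost = total_cost - total_cost * furnace_share * necessity_fraction
print(f"${efficiency_cost:,.0f}")  # $3,664
```

Under these (debatable) assumptions the cost attributable to efficiency falls by 20 percent, which is exactly the kind of swing that reasonable people could disagree about.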

    It is one thing to have your research turned into a set of clickbait articles with misleading headlines questioning the legitimacy of efficiency as a public investment or climate strategy by an intellectually lazy press. It is another thing entirely to cultivate that press with a policy brief and authoritative public statements that lean very heavily on such debatable assumptions about cost allocation and generalizability. To do this based on a study that has not even been peer reviewed, and at the expense of a program that provides valuable non-energy services to low-income people, is not up to the standards of academic practice that the authors should hold themselves to.

  13. Pingback: Are energy efficient products a rip-off? | Perth Now

  14. Pingback: Why Won’t People Invest in Energy Efficiency—Even When It Saves Them Money? - ECOJUZZ

  15. Pingback: Why Won’t People Invest in Energy Efficiency—Even When It Saves Them Money? | EMS

  16. Pingback: How homes kept cool before the age of AC | gWashington1999

  17. Pingback: Are the Benefits to the Weatherization Assistance Program’s Energy Efficiency Investments Four Times the Costs? |

  18. Pingback: Energy Institute @ Haas takes on DOE weatherization study | Economics Outside the Cube

  19. mcubedecon says:

    I was struck by this passage: “While some have criticized this as too conservative, it is standard to use current energy prices as a predictor of future energy prices.” As a practicing energy economist for 30 years, I am unaware of this “standard.” It’s a simplistic approach to the problem, but the uncertainty of energy prices is one of the justifications for energy savings programs. The fixation on using single-point forecasts is at the root of this mistaken assumption. A more thoughtful approach would have used some sort of scenario or risk analysis to measure the value of the savings.
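One concrete version of the scenario analysis this comment calls for is a simple Monte Carlo over future price paths. A minimal sketch; every number in it (savings volume, price, horizon, volatility, discount rate) is a hypothetical placeholder, not a figure from the study:

```python
import random

random.seed(42)

# All inputs below are illustrative assumptions, not values from the study.
annual_savings_mmbtu = 30.0   # hypothetical energy saved per year
base_price = 10.0             # hypothetical current price, $/MMBtu
years = 16
discount_rate = 0.03

def pv(prices):
    """Present value of the savings stream at the given yearly prices."""
    return sum(annual_savings_mmbtu * p / (1 + discount_rate) ** t
               for t, p in enumerate(prices, start=1))

# Single-point approach: hold today's price flat for the whole horizon.
flat = pv([base_price] * years)

# Scenario approach: let prices follow a random walk with 10% annual volatility.
def random_walk():
    p, path = base_price, []
    for _ in range(years):
        p *= 1 + random.gauss(0.0, 0.10)
        path.append(p)
    return path

draws = sorted(pv(random_walk()) for _ in range(10_000))
mean = sum(draws) / len(draws)
print(f"flat-price PV: ${flat:,.0f}")
print(f"scenario mean ${mean:,.0f}, 5th-95th percentile "
      f"${draws[500]:,.0f}-${draws[9500]:,.0f}")
```

Even this crude random-walk model shows what a single-point forecast hides: a distribution of outcomes whose spread is itself relevant to valuing the price hedge that efficiency provides.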

  20. Pingback: Do Energy Efficiency Investments Deliver During Crunch Time? |

  21. Pingback: I’m Not Really Down with Most Top Down Evaluation |
