A lot of the policy discussion, and many of our blog posts, focus on the difficult task of trying to slow climate change. It’s useful to remind ourselves of the difficulties associated with NOT slowing climate change.
Last week, Michael Bloomberg, Hank Paulson and Tom Steyer released their Risky Business report quantifying the costs of climate change for the US economy. The report relies heavily on modeling work by one of EI@Haas’ own, Solomon Hsiang. Sol is a professor at UC Berkeley’s Goldman School of Public Policy and was a visitor at the Energy Institute this past semester. He was the lead economic author on the technical report that generated all of the results for the Bloomberg et al. glossy.
The technical report is notable on several dimensions, both for the results it presents and the methodologies the authors use. The press has picked up on the bipartisan leaders of the project, their call to apply risk management thinking to climate change, and the striking charts and figures documenting the regional effects of climate change.
I was intrigued by a component of the report that hasn’t gotten as much attention – its commitment to “open science.” There’s no hard and fast definition of that term, but the Open Science organization defines it as, “the idea that scientific knowledge of all kinds should be openly shared as early as is practical in the discovery process.”
I have been involved in lengthy discussions about open science with my colleagues. Here are some of the tradeoffs in the context of the Risky Business report.
Underlying the report’s fabulous pictures is an elaborate computer simulation model. Sol and his co-authors spent many, many hours developing their model, which they are calling SEAGLAS (Spatial Empirical Global-to-Local Assessment System). Developing a model like SEAGLAS involves everything from digging into the literature to come up with the best inputs to the model (e.g., the projections for temperature changes, impact of weather on electricity consumption) to writing lines and lines of code to combine all the inputs to generate useful outputs.
The benefits to embracing open science in this case are pretty clear. If Sol and his co-authors make the code underlying SEAGLAS accessible, other researchers can check the code for errors, stress test it by using it to generate results different from the ones in the original report, write modules to expand it, and generally improve its usefulness.
But, there is a potential cost. Academics are rewarded (e.g., promoted to tenure or given raises) almost purely for the number of original scientific publications they produce. So, the inclination of some people who develop models like SEAGLAS is to keep them under lock and key while they publish papers. After all, what’s the point of investing a lot of time in a well-constructed model if other researchers can use it to do cool stuff before you can? It’s like a company shielding its intellectual property, except that the currency is academic publications rather than cash. The risk is that fewer valuable models will be created in the first place if researchers are expected to make them open immediately.
Much like Elon Musk at Tesla, Sol and his co-authors seem to be opening up the model vault. The report explicitly says, “We will be making our data and tools available online at climateprospectus.rhg.com. We hope others build on and improve upon our work in the months and years ahead.”
Sol and his co-authors are certainly not the first climate modelers to make their code public, and, in fact, how truly public it is remains to be seen. Posting undocumented code is almost useless to other researchers – it can take as much or more time to dissect someone else’s code as to write your own. On the other hand, some modelers have made an industry out of teaching people about their models and go to great lengths to make them accessible.
I am optimistic that Sol and his co-authors will provide useful information at the link above. After all, they’ve already made an early, public commitment to do that in the Risky Business technical report.
And, in terms of the inputs to the model, Sol sent us an email explaining that, “We have developed a new Distributed Meta-Analysis System (DMAS) that continuously and dynamically integrates new empirical findings (that are crowd-sourced from researchers around the world).” This means the inputs will be kept current and should represent consensus estimates from other researchers.
We have a lot of work to do quantifying the impacts of climate change and identifying the best mitigation and adaptation strategies. With public-minded – not to mention superb – researchers like Sol, I’m optimistic about the progress we can make.
Catherine Wolfram is Associate Dean for Academic Affairs and the Cora Jane Flood Professor of Business Administration at the Haas School of Business, University of California, Berkeley. She is the Program Director of the National Bureau of Economic Research's Environment and Energy Economics Program, Faculty Director of The E2e Project, a research organization focused on energy efficiency, and a research affiliate at the Energy Institute at Haas. She is also an affiliated faculty member in the Agriculture and Resource Economics department and the Energy and Resources Group at Berkeley.
Wolfram has published extensively on the economics of energy markets. Her work has analyzed rural electrification programs in the developing world, energy efficiency programs in the US, the effects of environmental regulation on energy markets and the impact of privatization and restructuring in the US and UK. She is currently implementing several randomized controlled trials to evaluate energy programs in the U.S., Ghana, and Kenya.
She received a PhD in Economics from MIT in 1996 and an AB from Harvard in 1989. Before joining the faculty at UC Berkeley, she was an Assistant Professor of Economics at Harvard.