* It’s pronounced like “juggernaut” without the “jug.”
Gernot writes the Risky Climate column for Bloomberg Green and has written two books: Climate Shock, written jointly with Harvard’s Martin Weitzman and published by Princeton (2015), a Top 15 Financial Times McKinsey Business Book of the Year 2015 and Austria’s Natural Science Book of the Year 2017, among other honors; and But Will the Planet Notice?, published by Hill & Wang/Farrar, Straus and Giroux (2011).
He teaches climate economics and policy at NYU, where he is a clinical associate professor in the Department of Environmental Studies and an associated clinical professor at the NYU Wagner School of Public Service.
Prior to joining NYU, Gernot was the founding executive director of Harvard’s Solar Geoengineering Research Program (2016 – 2019), a research associate at the Harvard John A. Paulson School of Engineering and Applied Sciences, and a lecturer on Environmental Science and Public Policy. Before Harvard, Gernot served as an economist at the Environmental Defense Fund (2008 – 2016), most recently as lead senior economist (2014 – 2016) and a member of its Leadership Council (2015 – 2016). He has taught at Columbia, Harvard, and NYU, and has been a term member of the Council on Foreign Relations.
Born and raised in Amstetten, Austria, Gernot graduated from high school in his hometown before moving to the U.S. for college. He holds a joint bachelor’s magna cum laude with highest honors in environmental science, public policy, and economics, and a master’s and Ph.D. in political economy and government from Harvard, as well as a master’s in economics from Stanford.
Gernot lives in New York City with his wife, Siri Nippita, a gynecologist at NYU Langone Health and the division director of Reproductive Choice at Bellevue Hospital, and their two young children.
To get a better sense of Gernot’s work, read his bi-weekly column, follow him on Twitter, sign up for his email newsletter, or find him at: Google Scholar | Leigh Speakers Bureau | NYU Wagner | LinkedIn | Amazon | Bloomberg Green | Project Syndicate | and the always trustworthy Wikipedia.
Economics—misguided market forces—is at the core of most environmental problems. Economics—guiding market forces in the right direction—is also fundamental to the solution.
In this course we develop some of the fundamental economic tools for environmental policy analysis and management: Economics 101 applied to environmental problems—often, though not exclusively, focused on climate change.
We will also go well beyond that initial Econ 101 take, narrowly defined. In fact, focusing exclusively on Econ 101 may sometimes be positively misleading.
For example, Econ 101 traditionally tells us to price each ton of carbon dioxide (CO2) emitted into the atmosphere, and to get out of the way. Markets will take care of the rest.
Not so fast.
Econ 102 tells us that not only is there a negative carbon spillover of economic activity, but also a positive learning-by-doing one. Installing the first rooftop solar panel is costly. The one hundredth is already cheaper. The millionth is a breeze. That goes for any individual roofer. It also goes for entire countries, and it is at the heart of policies from California’s Solar Initiative (formerly, its Million Solar Roofs Initiative) to Germany’s Energiewende (energy transition).
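The learning-by-doing dynamic is often summarized as Wright’s law: unit costs fall by a roughly constant fraction with every doubling of cumulative output. A minimal sketch, with a hypothetical first-unit cost and a hypothetical 20% learning rate, both invented for illustration:

```python
from math import log

def unit_cost(n: int, c1: float = 10_000.0, learning_rate: float = 0.20) -> float:
    """Cost of the n-th unit under Wright's law, given the cost c1 of the
    first unit and a learning rate (the fractional cost drop per doubling
    of cumulative output). Numbers here are purely illustrative."""
    b = -log(1 - learning_rate, 2)  # experience exponent
    return c1 * n ** (-b)

for n in (1, 100, 1_000_000):
    print(f"unit {n:>9,}: ${unit_cost(n):,.0f}")
```

Under these assumed parameters, the hundredth unit costs roughly a quarter of the first, and the millionth only about a hundredth of it: the first rooftop is costly, the millionth is a breeze.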
Then there’s Political Economy 101. Shouting “carbon tax” all day long will not make it so. In fact, subsidizing clean technologies may even be a necessary step to get a price on CO2 passed in the first place.
We will discuss this and similar examples, applying Econ 101 (and 102) to the real world, keeping Political Economy 101—and real-world politics—in mind every step along the way.
How to make decisions in light of pervasive uncertainties? How to think about incentive structures faced by decision-makers, and think through unintended consequences of one’s decisions?
Economics, for better or worse, is organized common sense. No more, also no less. This class makes use of the toolkit given to us by economics and applies it to real-world policy problems.
Given my own background, the class will focus on questions around climate, energy, and the environment, though not exclusively. In the end, we will pick examples based on how well they help us expand our toolkit and answer specific policy questions.
What to make of the precautionary principle? What can economics teach us that engineering can’t? How to deal with constant learning, experimentation, and streams of new information?
Some of the questions we will be asking have clear answers. Many don’t. The biggest question for us, then, is often how far the tools economics gives us can provide objective policy advice, and at what point normative judgments—politics—take over.
We will develop our toolkit around these and many other questions, looking to the policy world—and the news—for ideas. In doing so, we apply economic insights, some basic mathematical tools, statistical thinking, and econometrics, and borrow fundamental ideas from various other disciplines—all in the service of turning ourselves into better policy analysts and, ultimately, more astute decision makers.
Moral hazards are ubiquitous. Green ones typically involve technological fixes: Environmentalists often see ‘technofixes’ as morally fraught because they absolve actors from taking more difficult steps towards systemic solutions. Carbon removal and especially solar geoengineering are only the latest examples of such technologies. Here we explore green moral hazards throughout American history. We argue that dismissing (solar) geoengineering on moral hazard grounds is often unproductive. Instead, especially those vehemently opposed to the technology should use it as an opportunity to expand the attention paid to the underlying environmental problem in the first place, actively invoking its opposite: ‘inverse moral hazards’.
Climate change has myriad physical and economic impacts. Even those that can be easily quantified indicate the need for ambitious climate action. Other climate impacts have yet to be quantified. We argue here that uncertainties in climate and weather extremes only further increase the social cost of carbon emissions.
Some countries prefer high to low mitigation (H ≻ L). Some prefer low to high (L ≻ H). That fundamental disagreement is at the heart of the seeming intractability of negotiating a climate mitigation agreement. Modelling global climate negotiations as a weakest-link game brings this to the fore: Unless everyone prefers H to L, L wins. Enter geoengineering (G). Its risky and imperfect nature makes it arguably inferior to any country’s preferred mitigation outcome. However, absent a global high-mitigation agreement, countries facing disastrous climate damages might indeed wish to undertake it, effectively ranking H ≻ G ≻ L. Meanwhile, those least affected by climate damages and, thus, least inclined to agree to an ambitious mitigation agreement, might be unwilling to engage in risky geoengineering, resulting in L ≻ H ≻ G. With these rankings, all players prefer H to G, and the mere availability of a credible geoengineering threat might help induce an ambitious climate mitigation agreement (H). The analysis here introduces the simplest possible model of global climate negotiations and derives the conditions for this outcome. These conditions may indeed be likely, as long as geoengineering is viewed as a credible albeit risky emergency response given the danger of low mitigation levels.
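The game-theoretic logic can be made concrete in a few lines. This is a toy illustration with invented ordinal payoffs, not the paper’s actual model:

```python
# Ordinal payoffs (higher = preferred), invented for illustration.
# The climate-vulnerable country ranks H > G > L; the reluctant one L > H > G.
payoffs = {
    "vulnerable": {"H": 3, "G": 2, "L": 1},
    "reluctant":  {"L": 3, "H": 2, "G": 1},
}

def outcome(geoengineering_available: bool) -> str:
    """Weakest-link mitigation: H requires unanimity, otherwise L prevails.
    If G is available, the vulnerable player credibly deploys it whenever
    it prefers G to the low-mitigation fallback L."""
    fallback = "L"
    if geoengineering_available and payoffs["vulnerable"]["G"] > payoffs["vulnerable"]["L"]:
        fallback = "G"
    # The reluctant player agrees to H only if H beats its fallback.
    if payoffs["reluctant"]["H"] > payoffs["reluctant"][fallback]:
        return "H"
    return fallback

print(outcome(False))  # "L": unanimity fails, low mitigation prevails
print(outcome(True))   # "H": the credible G threat induces agreement
```

With these rankings the mere availability of G flips the equilibrium from L to H, which is the mechanism the abstract describes.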
China and the United States are the two largest emitters of greenhouse gases, making them pivotal players in global climate negotiations. Within the coming decade, however, India is set to become the most important counterpart to the United States, as it overtakes China as the country with the most at stake depending on the type of global burden-sharing agreements reached, thus becoming a member of the ‘Climate G2’. We create a hypothetical global carbon market based on modelling emissions reduction commitments across countries and regions relative to their marginal abatement costs. We then analyse net financial flows across a wide range of burden-sharing agreements, from pure ‘grandfathering’ based on current emissions to equal-per-capita allocation. Among the four largest players – the United States, the EU-27, China, and India – it is China that would currently be the largest net seller of emissions allowances in all but the grandfathered scenario. The United States would be the largest net buyer. However, India is poised to take China’s position by around 2030. That leaves the United States and India as the two major countries with most to gain and lose, depending on the type of climate deal reached.
Pricing greenhouse-gas (GHG) emissions involves making trade-offs between consumption today and unknown damages in the (distant) future. While decision making under risk and uncertainty is the forte of financial economics, important insights from pricing financial assets do not typically inform standard climate–economy models. Here, we introduce EZ-Climate, a simple recursive dynamic asset pricing model that allows for a calibration of the carbon dioxide (CO2) price path based on probabilistic assumptions around climate damages. Atmospheric CO2 is the “asset” with a negative expected return. The economic model focuses on society’s willingness to substitute consumption across time and across uncertain states of nature, enabled by an Epstein–Zin (EZ) specification that delinks preferences over risk from intertemporal substitution. In contrast to most modeled CO2 price paths, EZ-Climate suggests a high price today that is expected to decline over time as the “insurance” value of mitigation declines and technological change makes emissions cuts cheaper. Second, higher risk aversion increases both the CO2 price and the risk premium relative to expected damages. Lastly, our model suggests large costs associated with delays in pricing CO2 emissions. In our base case, delaying implementation by 1 y leads to annual consumption losses of over 2%, a cost that roughly increases with the square of time per additional year of delay. The model also makes clear how sensitive results are to key inputs.
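The Epstein–Zin recursion that enables this delinking takes, in its standard textbook form (the paper’s exact calibration may differ):

```latex
U_t = \left[ (1-\beta)\, c_t^{1-\rho} + \beta \left( \mathbb{E}_t\!\left[ U_{t+1}^{1-\alpha} \right] \right)^{\frac{1-\rho}{1-\alpha}} \right]^{\frac{1}{1-\rho}}
```

Here $\alpha$ governs risk aversion and $1/\rho$ the elasticity of intertemporal substitution; setting $\alpha = \rho$ recovers standard time-separable expected utility, in which the two attitudes are forced to coincide.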
Nonstate actors appear to have increasing power, in part due to new technologies that alter actors’ capacities and incentives. Although solar geoengineering is typically conceived of as centralized and state-deployed, we explore highly decentralized solar geoengineering. Done perhaps through numerous small high-altitude balloons, it could be provided by nonstate actors such as environmentally motivated nongovernmental organizations or individuals. Conceivably tolerated or even covertly sponsored by states, highly decentralized solar geoengineering could move presumed action from the state arena to that of direct intervention by nonstate actors, which could in turn, disrupt international politics and pose novel challenges for technology and environmental policy. We conclude that this method appears technically possible, economically feasible, and potentially politically disruptive. Decentralization could, in principle, make control by states difficult, perhaps even rendering such control prohibitively costly and complex.
The Ramsey equation ties the utility discount rate and the elasticity of marginal utility of consumption together with per capita consumption growth rates to calculate consumption discount rates. For many applications, per capita consumption growth rates can be approximated with per capita output growth rates. That approximation does not work for climate change, which drives an ever-increasing and increasingly uncertain wedge between output and consumption growth. NAS (2017), in a central recommendation and illustrative example, conflates the two. The correct, consumption-based discounting method generally decreases consumption discount rates and, thus, increases the resulting Social Cost of Carbon Dioxide (SC-CO2).
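For reference, the Ramsey equation in its standard form is:

```latex
r = \delta + \eta\, g
```

where $r$ is the consumption discount rate, $\delta$ the utility discount rate, $\eta$ the elasticity of marginal utility of consumption, and $g$ the per capita consumption growth rate. Substituting output growth for $g$ overstates $r$ once climate damages push consumption growth below output growth, which is why the consumption-based method yields lower discount rates and a higher SC-CO2.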
The question of how to discount the distant future has long been at the core of climate economics. It has also divided economists. Some argue for prescriptivist approaches to discounting, often calling for social discount rates of as low as 1% per year. Others argue strongly for descriptivist approaches and rates as high as 5% or more. A look to financial economics has since added another wrinkle, by pointing to the need to separate risk aversion from intertemporal substitution to calibrate real-world behavior, at times lowering effective descriptivist rates close to prescriptivist ones.
We attempt to reconcile some of these methodological differences by identifying three types of prescriptivism. Economists are frequently uncomfortable with what we term parameter prescriptivism, while being comfortable with both axiom and policy prescriptivism. That combination of stances faces theoretical challenges. We argue that if a priori moral reasoning is not allowed to influence parameter values, then the results of one's analysis should not be framed as a prescriptive policy ‘recommendation’. While descriptivist analysis is relevant to policy, we must be clear that it can only inform policy choices, not determine them. We use our framework to evaluate recent proposals in climate economics to replace the standard isoelastic utility function with Epstein-Zin preferences to allow for the separate treatment of risk aversion and intertemporal substitution.
We review the capabilities and costs of various lofting methods intended to deliver sulfates into the lower stratosphere. We lay out a future solar geoengineering deployment scenario of halving the increase in anthropogenic radiative forcing beginning 15 years hence, by deploying material to altitudes as high as ~20 km. After surveying an exhaustive list of potential deployment techniques, we settle upon an aircraft-based delivery system. Unlike the one prior comprehensive study on the topic (McClellan et al 2012 Environ. Res. Lett. 7 034019), we conclude that no existing aircraft design—even with extensive modifications—can reasonably fulfill this mission. However, we also conclude that developing a new, purpose-built high-altitude tanker with substantial payload capabilities would neither be technologically difficult nor prohibitively expensive. We calculate early-year costs of ~$1500 ton−1 of material deployed, resulting in average costs of ~$2.25 billion yr−1 over the first 15 years of deployment. We further calculate the number of flights at ~4000 in year one, linearly increasing by ~4000 yr−1. We conclude by arguing that, while cheap, such an aircraft-based program would be unlikely to remain secret, given the need for thousands of flights annually by airliner-sized aircraft operating from an international array of bases.
Equilibrium climate sensitivity (ECS), the link between concentrations of greenhouse gases in the atmosphere and eventual global average temperatures, has been persistently and perhaps deeply uncertain. Its ‘likely’ range has been approximately between 1.5 and 4.5 degrees Centigrade for almost 40 years (Wagner and Weitzman, 2015). Moreover, Roe and Baker (2007), Weitzman (2009), and others have argued that its right-hand tail may be long, ‘fat’ even. Enter Cox et al. (2018), who use an ‘emergent constraint’ approach to characterize the probability distribution of ECS as having a central or best estimate of 2.8 °C with a 66% confidence interval of 2.2–3.4 °C. This implies, by their calculations, that the probability of ECS exceeding 4.5 °C is less than 1%. They characterize this kind of result as “renewing hope that we may yet be able to avoid global warming exceeding 2[°C]”. We share the desire for less uncertainty around ECS (Weitzman, 2011; Wagner and Weitzman, 2015). However, we are afraid that the upper-tail emergent constraint on ECS is largely a function of the assumed normal error terms in the regression analysis. We do not attempt to evaluate Cox et al. (2018)’s physical modeling (aside from the normality assumption), leaving that task to physical scientists. We take Cox et al. (2018)’s 66% confidence interval as given and explore the implications of applying alternative probability distributions. We find, for example, that moving from a normal to a log-normal distribution, while giving identical probabilities for being in the 2.2–3.4 °C range, increases the probability of exceeding 4.5 °C by over five times. Using instead a fat-tailed Pareto distribution, an admittedly extreme case, increases the probability by over forty times.
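The distributional point can be checked with a quick back-of-the-envelope calculation that matches each distribution to the reported 66% interval. This simple matching is not Cox et al.’s fitting procedure, and the exact multiple depends on how the interval is matched:

```python
from math import log
from statistics import NormalDist

# Reconstruct candidate distributions from the reported 66% interval
# (2.2-3.4 degrees C); a rough sanity check, not the papers' actual fits.
lo, hi, tail = 2.2, 3.4, 4.5
z = NormalDist().inv_cdf(0.5 + 0.66 / 2)  # half-width of a 66% interval, in sigmas

# Normal: mean and sigma matched to the interval endpoints.
normal = NormalDist(mu=(lo + hi) / 2, sigma=(hi - lo) / (2 * z))
p_normal = 1 - normal.cdf(tail)

# Log-normal: same interval endpoints, matched on the log scale.
log_normal = NormalDist(mu=(log(lo) + log(hi)) / 2,
                        sigma=(log(hi) - log(lo)) / (2 * z))
p_lognormal = 1 - log_normal.cdf(log(tail))

print(f"P(ECS > 4.5) normal:     {p_normal:.4f}")
print(f"P(ECS > 4.5) log-normal: {p_lognormal:.4f}")
print(f"ratio: {p_lognormal / p_normal:.1f}x")
```

Both distributions assign the same 66% probability to the 2.2–3.4 °C range, yet the log-normal puts several times more mass above 4.5 °C, which is the crux of the argument.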
Solar geoengineering, which seeks to cool the planet by reflecting a small fraction of sunlight back into space, has drawn the attention of scientists and policymakers as climate change remains unabated. Unlike mitigation, solar geoengineering could quickly and cheaply lower global temperatures. It is also imperfect. Its environmental impacts remain unpredictable, and its low cost and immediate effects may result in ‘moral hazard,’ potentially crowding out costly mitigation efforts. There is little understanding about how the public will respond to such tradeoffs. To address this, a 1000-subject nationally representative poll focused on solar geoengineering was conducted as part of the Cooperative Congressional Election Study (CCES) of the US electorate in October–November 2016. The importance that individuals place on solar geoengineering’s speed and cost predicts their support for it, but there is little to no relationship between their concerns about its shortcomings and support for its research and use. Acquiescence bias appears to be an important factor for attitudes around solar geoengineering and moral hazard.
This paper introduces an approach for separately quantifying the contributions from renewables in decomposition analysis. So far, decomposition analyses of the drivers of national CO2 emissions have typically considered the combined energy mix as an explanatory factor without an explicit consideration or separation of renewables. As the cost of renewables continues to decrease, it becomes increasingly relevant to track their role in CO2 emission trends. Index decomposition analysis, in particular, provides a simple approach for doing so using publicly available data. We look to the U.S. as a case study, highlighting differences with the more detailed but also more complex structural decomposition analysis. Between 2007 and 2013, U.S. CO2 emissions decreased by around 10%—a decline not seen since the oil crisis of 1979. Prior analyses have identified the shale gas boom and the economic recession as the main explanatory factors. However, by decomposing the fuel mix effect, we conclude that renewables played an equally important role as natural gas in reducing CO2 emissions between 2007 and 2013: renewables decreased total emissions by 2.3–3.3%, roughly matching the 2.5–3.6% contribution from the shift to natural gas, compared with 0.6–1.5% for nuclear energy.
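The index decomposition method (LMDI) at the core of such analyses attributes an emissions change across multiplicative factors using logarithmic-mean weights. A minimal sketch with invented Kaya-style numbers, not the paper’s data:

```python
from math import log

def logmean(a: float, b: float) -> float:
    """Logarithmic mean, the weighting at the core of LMDI."""
    return a if a == b else (a - b) / (log(a) - log(b))

def lmdi(factors_0: dict, factors_T: dict) -> dict:
    """Additive LMDI-I: attribute the change in C = product of factors
    to each factor. Contributions sum exactly to C_T - C_0."""
    c0, cT = 1.0, 1.0
    for k in factors_0:
        c0 *= factors_0[k]
        cT *= factors_T[k]
    w = logmean(cT, c0)
    return {k: w * log(factors_T[k] / factors_0[k]) for k in factors_0}

# Invented numbers in the spirit of a Kaya-style identity:
# emissions = GDP x (energy/GDP) x (fossil share) x (CO2/fossil energy).
base = {"gdp": 100.0, "energy_intensity": 0.50, "fossil_share": 0.90, "carbon_intensity": 1.00}
end  = {"gdp": 105.0, "energy_intensity": 0.46, "fossil_share": 0.84, "carbon_intensity": 0.97}

effects = lmdi(base, end)
for k, v in effects.items():
    print(f"{k:>16}: {v:+.2f}")
```

Separating the fuel-mix factor into a fossil share and its complement is what lets renewables’ contribution be tracked on its own; the contributions always sum exactly to the total emissions change, which makes LMDI attractive for this kind of accounting with public data.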
Delivering emission reductions consistent with a 1.5°C trajectory will require innovative public financial instruments designed to mobilize trillions of dollars of low-carbon private investment. Traditional public subsidy instruments such as grants and concessional loans, while critical to supporting nascent technologies or high-capital-cost projects, do not provide the price signals required to shift private investments towards low-carbon alternatives at scale. Programmes that underwrite the value of emission reductions using auctioned price floors provide price certainty over long time horizons, thus improving the cost-effectiveness of limited public funds while also catalysing private investment.
Taking lessons from the World Bank’s Pilot Auction Facility, which supports methane and nitrous oxide mitigation projects, and the United Kingdom’s Contracts for Difference programme, which supports renewable energy deployment, we show that auctioned price floors can be applied to a variety of sectors with greater efficiency and scalability than traditional subsidy instruments. We explore how this new class of instrument can enhance the cost-effectiveness of carbon pricing and complementary policies needed to achieve a 1.5°C outcome, including through large-scale adoption by the Green Climate Fund and other international and domestic climate finance vehicles.
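Mechanically, an auctioned price floor behaves like a put option on the carbon (or electricity) price. A stylized sketch with invented numbers, not the actual payout rules of the Pilot Auction Facility or the Contracts for Difference programme:

```python
def price_floor_payout(strike: float, market_price: float, volume_tons: float) -> float:
    """Put-option-like payout under an auctioned price floor: public funds
    top the market price up to the guaranteed strike per ton delivered,
    and pay nothing once the market clears at or above the strike."""
    return max(strike - market_price, 0.0) * volume_tons

# With an $8/ton strike won at auction and a $3/ton market price,
# delivering 10,000 tons of reductions draws $50,000 of public money.
print(price_floor_payout(8.0, 3.0, 10_000))
print(price_floor_payout(8.0, 9.0, 10_000))  # market above strike: no subsidy paid
```

Because bidders compete down the strike (or the option premium) at auction, public money is only spent when the market price falls short, which is the source of the cost-effectiveness claim above.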