Suppose the experts are wrong about climate change. Suppose they’ve underestimated the impact of global warming. What would happen if temperatures were to rise much higher and much sooner than predicted? What if the worst-case scenario were to come true?
That is a question utilities must face today, even as the industry races to find solutions to reduce carbon emissions.
This predicament is not entirely fair. Even a modest rise in temperatures, standing alone, would force utilities to build more capacity to meet the expected rise in cooling demand, which would only add more fuel to the fire, assuming the added capacity would include a fair share of fossil resources.
But the alternative could be worse. The 2003 European heat wave killed at least 35,000 people: 20,000 in Italy and 15,000 in France. Utilities, whether in Europe or America or even India or China, don’t want to be caught short of power when the mercury climbs. They’ll be held accountable if it happens.
Many of those who died in the European heat wave were seniors without air conditioning, as average temperatures during that period exceeded 104 degrees Fahrenheit during the day and settled around 86 degrees Fahrenheit at night. In Alsace, where at one point temperatures soared to 118 degrees Fahrenheit, the electric company had to train water cannons on the roof of the nuclear-power reactor to keep it cool. Even that was not enough to keep the grid from failing; widespread blackouts swept France and Italy as electricity demand spiked. Moreover, experts say 2003 was Europe’s hottest summer in at least half a millennium. Oxford University climate scientist and statistician Myles Allen has said that while the heat wave was caused by anticyclones over Europe, which always increase temperatures, “climate change made the background temperatures within which the anti-cyclones operated that much higher.”
Of course, to longtime readers of Public Utilities Fortnightly, the idea that a warming climate might force adjustments in utility resource plans is nothing new. Nearly 20 years ago, in this very column (then known as “Pages With the Editor”), Editor Bruce W. Radford covered some of the first industry discussions on global warming.
His April 14, 1988, column was titled, “Are We Planning for the Greenhouse Effect?” And back then, the industry’s first focus seemingly was on steel in the ground. In his article, Radford reported on a draft study by ICF on the effects of global warming on electric utilities, commissioned by the Electric Power Research Institute, the Edison Electric Institute, the New York State Energy Research and Development Authority, and the U.S. Environmental Protection Agency. That report sought to predict the fate of the New York state utility system and a southeastern utility in a world of uncertain climate.
Focusing on the year 2015 (then 27 years in the future), the report had predicted an increase in generating requirements of between 10 percent and 20 percent, attributable to moderately rapid climate changes.
Citing other findings in that report, Radford wrote that a “business-as-usual” approach to greenhouse gases would mean warming of between 1.6 degrees Celsius and 4.7 degrees Celsius by the year 2030.
This prediction generally holds today. However, a report from the U.N. Intergovernmental Panel on Climate Change (IPCC) released earlier this year predicts that average global temperatures could rise between 1.1 degrees and 6.4 degrees Celsius (or between 2 degrees and 11.5 degrees Fahrenheit) by the year 2100. How many more power plants would America need to keep buildings, schools, and people cool if temperatures jumped by 11.5 degrees Fahrenheit? That question should be part of the climate-change discussion.
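The Fahrenheit figures follow directly from the Celsius range. As a quick arithmetic check: because these are temperature *changes*, the conversion uses only the 9/5 factor, with no 32-degree offset.

```python
# Converting a temperature CHANGE (delta) from Celsius to Fahrenheit:
# multiply by 9/5 only. The +32 offset applies to absolute temperatures,
# not to differences between two temperatures.

def delta_c_to_f(delta_c: float) -> float:
    """Temperature change in degrees C -> temperature change in degrees F."""
    return delta_c * 9.0 / 5.0

low, high = delta_c_to_f(1.1), delta_c_to_f(6.4)
print(f"{low:.1f} to {high:.1f} degrees Fahrenheit")  # 2.0 to 11.5
```

The result matches the 2-to-11.5-degree Fahrenheit range quoted from the IPCC report.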
These days, the focus seems to have shifted from resource adequacy to carbon reduction—a laudable goal that figures significantly in the minds of today’s utility executives, as revealed in this year’s CEO forum, “Greenhouse Gauntlet.”
The message is clear, whether in Al Gore’s movie, An Inconvenient Truth, or in any number of U.N. reports: Significant reductions in carbon emissions can very well lead to a Hollywood ending, a reversal of the global-warming trend. And it may be cheap, too. The IPCC Working Group III study released in early May concluded that preventing global temperature increases of 3.6 degrees Fahrenheit (the level at which some scientists believe severe climate change could be triggered) would cost only 0.12 percent of world economic growth over the next 20 years. But make no mistake: The greatest burden would fall on the energy industry, one of the largest contributors of greenhouse gases.
The IPCC has stated that a range of mitigation strategies already is available to policy-makers to reduce emissions. Such strategies include improvements in supply and distribution efficiency, fuel switching from coal to gas, nuclear power, renewables, and early applications of carbon capture and storage (CCS). But what happens if we have passed the point of no return? What should the strategy be then?
Even if emissions were to peak in 2015 (an unlikely event), and thereafter fall by about 50 to 80 percent over the next several decades, global warming would be limited to about 2 degrees Celsius above pre-industrial levels, the recent IPCC report found. The world already has warmed by about 0.7 degrees Celsius in the past century. But if emissions continue to grow until 2030 (a much more likely scenario), temperatures probably would rise by 3 degrees Celsius above pre-industrial levels, according to a Financial Times analysis. This scenario corresponds to a level of greenhouse gases in the atmosphere equivalent to about 535 to 590 parts per million of carbon dioxide, according to the report. Scientists fear that at levels above that, “feedback” effects that amplify temperature rises could trigger runaway climate change: a rapid acceleration in temperature whose effects could include more violent storms, desertification, and a sharp reduction in agricultural productivity.
However, even 535 to 590 parts per million sounds frighteningly high if you think in terms of tons of carbon. According to Fred Pearce’s book, With Speed and Violence: Why Scientists Fear Tipping Points in Climate Change, during the depths of the last ice age, the amount of carbon dioxide in the atmosphere hovered around 440 billion tons. Then, as the ice age closed, some 220 billion tons rose back out of the oceans and into the atmosphere, raising the level there to about 660 billion tons.
That’s where things remained at the start of the Industrial Revolution, when humans began large-scale burning of carbon fuels. Today, after a couple of centuries of rising emissions, we have added another 220 billion tons to the atmospheric burden, making it about 880 billion tons—twice what it was during the last ice age and a third more than recent interglacial eras.
Scientists interviewed in Pearce’s book conclude that a more conservative, safety-first concentration should be below 450 parts per million, or below 935 billion tons. But given current emissions, we could hit that level in as little as 10 years, which is consistent with U.N. warnings that the world has until 2020 to reverse the effects of climate change.
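The relationship between concentration and mass implied by these figures can be checked with back-of-the-envelope arithmetic. A minimal sketch: the roughly 2.08-billion-tons-per-ppm factor below is simply inferred from the 450 ppm / 935-billion-ton pairing quoted above, not an authoritative scientific constant.

```python
# Implied conversion between atmospheric concentration (ppm) and carbon
# mass (billions of tons), inferred from the 450 ppm / 935-billion-ton
# safety ceiling quoted from Pearce's book. The factor is an inference
# for illustration only, not an official constant.

SAFETY_PPM = 450.0       # Pearce's safety-first ceiling, in ppm
SAFETY_TONS_BN = 935.0   # the same ceiling expressed in billions of tons

TONS_BN_PER_PPM = SAFETY_TONS_BN / SAFETY_PPM   # roughly 2.08

def ppm_to_tons_bn(ppm: float) -> float:
    """Concentration in ppm -> atmospheric burden in billions of tons."""
    return ppm * TONS_BN_PER_PPM

def tons_bn_to_ppm(tons_bn: float) -> float:
    """Atmospheric burden in billions of tons -> concentration in ppm."""
    return tons_bn / TONS_BN_PER_PPM

# Today's ~880-billion-ton burden works out to roughly 424 ppm
# under this implied factor.
print(round(tons_bn_to_ppm(880)))
```

Note that this is only the conversion the column’s own figures imply; published carbon-cycle accounting distinguishes tons of carbon from tons of carbon dioxide, and the exact factor depends on which is meant.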
The IPCC report’s safety level is 590 parts per million, or 1.38 trillion tons, a level never before seen on this planet. Most scientists don’t foresee us reaching those levels, expecting instead a return to the historical pattern of climatic instability: alternating glacial and interglacial periods.
During the last ice age, Starbucks didn’t exist, but fur coats were all the rage. Let’s hope electricity is around in the future, so we at least can brew some java to stay warm before heading outside to forage for food.