A gas industry leader says Bush got it right, yet acknowledges the value of carbon abatement.
By now, the press has had its day with the saga of Christine Todd Whitman, the new Administrator of the U.S. Environmental Protection Agency (EPA) - how she sought to treat carbon dioxide (CO2) as a pollutant and to cap CO2 emissions, despite lacking any regulatory authority to do so under the Clean Air Act, its 1990 Amendments, or its legislative history.
Supposedly, EPA's action stemmed from a campaign promise by George W. Bush. But the President wisely withdrew it in view of the spreading California power crisis. As I see it, the President's new policy stands up well, not only in terms of regulatory law, but also as a matter of common sense for U.S. energy and economic policy.
All the same, however, I continue to favor a move toward minimizing CO2 emissions, both for the U.S. and for the global energy system. Such a policy can be shown to be prudent and feasible. Yet, there is much misinformation. Much of it can be blamed on those responsible for interpreting the findings of the U.N. Intergovernmental Panel on Climate Change. Any worthwhile discussion of the Kyoto Protocol, fossil fuels, CO2, and electric generation cannot begin without a full grasp of the costs and capabilities of gas exploration and production, and the consequences of any increased reliance on gas-fired, combined-cycle turbines. And it ought to start with the clear understanding that carbon dioxide is not a pollutant.
Carbon and Climate: The Latest Data
Obviously, the difference between a pollutant and a greenhouse gas is more than semantic. Air pollutants must either be toxic outright, as are carbon monoxide and mercury, or threaten major detrimental health and environmental impacts, as do sulfur and nitrogen oxides, particulate matter, and reactive (non-methane) hydrocarbons. I shall deal with the supposedly dire consequences of anthropogenic greenhouse gas emissions later on, but CO2 is a special case. Not only is it used as the surrogate for all of these emissions arising from the combustion of fossil fuels (coal, oil and natural gas) and from land use practices, but it is also the essential source of life on our planet, because photosynthesis is the beginning of the food chain. In fact, one can cite many beneficial impacts of atmospheric CO2 enrichment in both the recent and distant past. (The atmospheric concentration of CO2 grew from about 280 parts per million by volume (ppmv) around 1800 to 367 ppmv today. It also varied from roughly 180 ppmv to 300 ppmv during the preceding 160,000 years.) For example, the higher CO2 concentration has stimulated vegetation growth, such as afforestation of the Northern Hemisphere. In the most recent assessment of total annual anthropogenic emissions, some 2 billion of 8 billion metric tons (gigatonnes) of carbon (GtC) are now sequestered in a "missing sink," in addition to the 2 GtC/year permanently sequestered by the oceans.1
Also, of the 200-gigatonne annual cycle of carbon circulation that occurs in the form of CO2, only 4 percent stems from human activities - the rest is all natural circulation between the atmosphere, the terrestrial biomass, soil and detritus, and the oceans.1 Roughly 100 GtC per year are fixed by photosynthesis; 50 GtC each are released by plant respiration and by decay of organic residues; another 100 GtC circulate between the atmosphere and the oceans, except for the total of 4 GtC/year of net uptake. Therefore, classifying CO2 as a "pollutant" goes far beyond semantics. It is grossly misleading.
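The 4 percent figure follows directly from the fluxes cited above. A minimal arithmetic sketch, using only the rounded GtC values stated in the text:

```python
# Annual carbon fluxes in gigatonnes of carbon (GtC), rounded values from the text
photosynthesis_uptake = 100   # fixed by land plants each year
ocean_exchange = 100          # gross atmosphere-ocean circulation
anthropogenic = 8             # fossil fuel combustion and land use practices

# The article's "200-gigatonne annual cycle" of natural circulation
total_cycle = photosynthesis_uptake + ocean_exchange  # 200 GtC/year

human_share = anthropogenic / total_cycle
print(f"Human share of the carbon cycle: {human_share:.0%}")  # 4%
```

The same numbers also reproduce the 4 GtC/year net uptake: of the 8 GtC emitted, roughly 2 GtC goes into the "missing sink" and 2 GtC into the oceans, leaving about 4 GtC/year accumulating in the atmosphere.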
The issue of the supposedly disastrous impacts of human activities on global climate is also far from settled. Most of the 0.7°C surface temperature increase between 1860 and 1999, determined by seriously flawed thermometer measurements, occurred prior to 1940, when anthropogenic greenhouse gas emissions were relatively small. In fact, the increase to date is primarily a recovery from the 0.6°C negative peak deviation from the global average of 15°C during the "Little Ice Age," which plagued our planet during much of the second half of the last millennium2. It followed the Medieval Warm Period, when it was as much as 0.5°C warmer. From the late 10th to early 13th Century, vineyards flourished in England and the coast of Greenland could be farmed. Earlier during the current 10,000-year-old interglacial period (the Holocene), about 5000-6000 years ago, temperatures were as much as 2°C warmer, which coincided with the flowering of many civilizations. The melting of glaciers and some polar ice, which is now blamed on human activities, has in fact been going on for well over 100 years, ever since the recovery from the "Little Ice Age" began.
By contrast, the expected impact of roughly doubling the CO2 concentration has been a settled issue for more than 100 years. Based on simple radiation physics, such a doubling should cause an increase in effective emission temperature of 1.2°C. The problem, however, is that this value can be grossly inflated (or deflated) by the assumption of a whole range of feedback effects - stemming primarily from the role of water vapor in various forms (gaseous or as clouds of various reflective properties)3.
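The roughly 1°C zero-feedback figure can be sketched from first principles. In the calculation below, the logarithmic forcing formula with its 5.35 W/m2 coefficient is the standard textbook approximation, and the 255 K effective emission temperature is the conventional value; neither appears in the text, so this is an illustrative estimate, not the author's calculation. It lands near 1°C, the same order as the 1.2°C cited:

```python
import math

SIGMA = 5.67e-8      # Stefan-Boltzmann constant, W/(m^2 K^4)
T_EFF = 255.0        # Earth's effective emission temperature, K (about -18 C)

# Standard logarithmic approximation for CO2 radiative forcing:
# delta_F = 5.35 * ln(C/C0) W/m^2; for a doubling, ln(C/C0) = ln(2)
delta_F = 5.35 * math.log(2.0)          # ~3.7 W/m^2

# Zero-feedback (Planck) response: differentiate F = sigma * T^4,
# giving dF/dT = 4*sigma*T^3, so dT = dF / (4*sigma*T^3)
delta_T = delta_F / (4.0 * SIGMA * T_EFF ** 3)

print(f"Forcing from CO2 doubling: {delta_F:.2f} W/m^2")
print(f"Zero-feedback temperature response: {delta_T:.2f} C")  # ~1 C
```

As the article notes, everything beyond this simple radiation-physics baseline - the 1.5°C to 4.5°C sensitivity range - comes from assumed feedbacks, chiefly involving water vapor and clouds.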
Why some fault the UN Panel
This winter, the United Nations Intergovernmental Panel on Climate Change (IPCC) raised its forecast of possible temperature increases, relying on a combination of questionable scenarios.
IPCC in 1996*
- 0.8°C to 4.5°C - forecasted run-up by 2100
- 2.5°C - most likely rise from doubling of CO2
- 450 to 650 ppmv - CO2 concentrations stabilize in late 21st century
IPCC in 2001#
- 1.4°C to 5.8°C - forecasted run-up by 2100
- 1.5°C to 4.5°C - sensitivity to CO2 doubling
- 770 to 2190 GtC - cumulative carbon emissions, 1991-2100
* Second Assessment Report
# "Policymakers Summary" (summarizing the IPCC's yet-unpublished Third Assessment Report)
In reality, 97 percent of the greenhouse effect is caused by these various forms of water vapor in the atmosphere over which we have no control. The greenhouse gases whose concentrations are affected by human activities cause only about 3 percent of the greenhouse effect. Of this, roughly 2 percent is due to CO2 and a total of 1 percent is due to methane, nitrous oxide, and various halogenated carbon compounds. Even much of this remaining 3 percent comes from natural sources. However, the literature and media reports about the supposedly disastrous enhancement of the greenhouse effect by human activities seldom note this dominant role of water vapor. Moreover, the alarmists also fail to mention that the greenhouse effect is benign, because it keeps the average global surface temperature at +15°C instead of -18°C. Without the greenhouse effect, our planet could support at most very primitive forms of life.
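The -18°C figure cited above follows from a simple planetary energy balance. A sketch, assuming a solar constant of 1368 W/m2 and a planetary albedo of 0.30 - conventional textbook values not given in the text:

```python
SIGMA = 5.67e-8     # Stefan-Boltzmann constant, W/(m^2 K^4)
S0 = 1368.0         # solar constant, W/m^2 (assumed conventional value)
ALBEDO = 0.30       # planetary albedo (assumed conventional value)

# Energy balance with no greenhouse effect: absorbed sunlight equals
# emitted thermal radiation, S0*(1 - albedo)/4 = sigma * T^4
# (the factor of 4 is the ratio of Earth's surface area to its cross-section)
T_no_greenhouse = (S0 * (1 - ALBEDO) / (4 * SIGMA)) ** 0.25

print(f"Surface temperature without greenhouse effect: "
      f"{T_no_greenhouse - 273.15:.0f} C")  # about -18 C
```

The 33°C difference between this radiative equilibrium value and the observed +15°C average is the benign greenhouse effect the article describes, the large majority of which is due to water vapor.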
Global Warming: The Alarmist Predictions
Up to now, the modest rise in global surface temperature has caused none of the predicted disasters (floods, droughts, hurricanes, etc.). To heighten these concerns, the alarmist predictions deliberately inflate future temperature increases. These predictions emanate primarily from the Intergovernmental Panel on Climate Change (IPCC), a U.N. body of scientists responsible for assessing the extent and impacts of anthropogenic climate change under the 1992 U.N. Framework Convention on Climate Change (the "Rio Treaty"). A total of 160 countries (including the United States) endorsed the treaty as signatories or "Parties." Since 1992, these countries have held six "Conferences of Parties," including the last one in The Hague, in November 2000, and the important one in Kyoto, in December 1997. That meeting produced the highly controversial "Kyoto Protocol" - the agreement that would force industrial countries to make deep cuts in greenhouse gas emissions, while exempting developing countries.
In anticipation of the Sixth Conference of Parties (COP-6) in The Hague, the IPCC released a "Policymakers Summary" of its yet-unpublished "Third Assessment Report." That report widened and raised the earlier range of projections given in the IPCC's "Second Assessment Report," published in 1996. The third report predicts an increase in average global surface temperature between 1990 and 2100 ranging from 1.4°C to 5.8°C - up from the increase given in the second report, of 0.8 to 4.5°C1,4. The IPCC based this revision on questionable scenarios of energy consumption, economic and population growth, energy mix, and energy technology change, as well as the previously mentioned faulty data on actual temperature increases that occurred between 1860 and 1999.
Meanwhile, the press continues its efforts to create alarm over global warming caused by human activities. For its most widely reported figure of the increase in average global surface temperature from 1990 to 2100, the press has settled on the upper bound of the new IPCC projections, based on assumptions listed in the IPCC's Special Report on Emission Scenarios. To heighten the intended alarm over this new assessment, the press has generally reported this rise in temperature as 11 degrees Fahrenheit (actually 10.4°F). What was not reported, however, is that the entire range of increase in surface temperature projected from 1990 to 2100 and offered in the IPCC's third report (a range lying between 1.4°C and 5.8°C) was based on a simple model "tuned" to yield responses similar to several complex models. The range narrows considerably, to somewhere between 2.0°C and 4.6°C, when one calculates the average of these various models over all of the scenarios. Also, the media reports fail to note that the larger range stated in the third report includes climate sensitivities to doubling of CO2 concentrations of anywhere from 1.5°C to 4.5°C. The "Policymakers Summary" for the Third Assessment Report does not acknowledge that the most likely climate sensitivity for a CO2 doubling, as per the Second Assessment Report, is only 2.5°C. Reliance on that figure would reduce the range of forecasted temperature increases for all scenarios to between 1.3°C and 2.5°C for the period 1990 to 2100.1
In a similar fashion, the scenarios in the third report appear to encourage alarm by again citing a range of 770 to 2190 GtC of cumulative carbon emissions between 1991 and 2100. As was pointed out in the 1995 report, the most probable scenario is that CO2 concentrations will stabilize between 450 and 650 ppmv between 2100 and 2200. At the midpoint of 550 ppmv, this would correspond to less than 1000 GtC of cumulative emissions between 1991 and 2100. That scenario assumed a gradual phase-out of fossil fuels during the second half of the 21st Century and their replacement with renewable or essentially inexhaustible energy sources. In fact, it is the most carbon-intensive of these fuels - coal - that is expected to continue to lose energy market share most rapidly. Next to lose share would be oil, and then eventually natural gas, the least carbon-intensive and polluting of all the fossil fuels. (However, global gas use and market share are expected to continue to grow well into the 21st Century.) Therefore, at the most likely climate sensitivity of 2.5°C (to a doubling of CO2), the further temperature increase between 1990 and 2100 will probably be nearer to 1.5°C than 5.8°C - even on the basis of the flawed surface temperature measurements used by the IPCC.
Consider also the question of recent temporary anomalies in weather patterns.
One reason for criticism of the "Summary for Policymakers" for the new IPCC Third Assessment Report is that it again fails to give greater weight to the large disparity between its surface temperature measurements and NASA satellite measurements. For example, it doesn't discount the "fly-up" in mean temperatures seen recently by these satellites in the lower troposphere (from the surface up to about 15,000 ft.) in both the Northern (1°C) and Southern Hemispheres (0.6°C). This fly-up was due to the large El Niño anomaly in 1998. It quickly abated during 1999. By 2000, NASA satellite measurements had dropped below the 1979-98 average in both hemispheres5. NASA satellites uniformly and accurately measure lower tropospheric temperatures for the entire globe, unlike the statistically much less uniform and highly unreliable surface temperature measurements which, among other uncertainties, are greatly affected by the "urban heat island" phenomenon (i.e., the ever-growing and spreading heat release from human activities in urban and suburban areas). This problem is exacerbated by the concentration of surface temperature measurements over land. The NASA satellite measurements are considered accurate to 0.1°C and closely match weather balloon measurements in the atmospheric layer between 5000 and 28,000 ft. from 1958 to the present.5,6 Moreover, the NASA satellite and weather balloon measurements fail to show the generally upward trend of temperature data used by IPCC of about 0.5°C to 0.6°C between 1979 and 2000. Also note that other temperature records differ sharply from those cited in the "Policymakers Summary." In fact, in the United States, the highest temperatures during the past century were recorded during the 1930s7. 
Three factors together invalidate the extreme projections for the upper bound of temperature increases from 1990 to 2100: the failure to discount the sharp but temporary, El Niño-driven rise in lower-troposphere temperatures and the closely related surface temperatures; the general reliance of the models developed for the IPCC Third Assessment Report on basically flawed surface temperature measurements; and the unrealistic assumptions in several of the cases in the IPCC Special Report on Emission Scenarios.
Kyoto Compliance: The Terrible Cost
None of this criticism affects my firm belief that it will prove beneficial to continue and accelerate the ongoing decarbonization of the global energy system. It should be feasible, both technically and economically, to limit additional anthropogenic carbon emissions between 1991 and 2100 to less than 1000 GtC, rather than the 2190 GtC in the IPCC's "business as usual" scenario1. The voluntary efforts by leading industrial firms to reduce their emissions of the most important greenhouse gases - carbon dioxide and methane - should also help to reduce the threat of global warming.
What it would take.
The industrial countries cannot possibly meet the Kyoto target of reducing greenhouse gas emissions by the required amounts without liberal provisions for emissions trading with low emitters or "clean development" projects in developing countries.
- 5.2 percent - cut in average greenhouse gas emissions (CO2 equivalent) below 1990 levels, by 2008-12, on average for all Annex 1 industrial countries.
- 7 percent - reduction required in United States, below 1990 level.
- 31 percent - reduction required in greenhouse gas emissions in U.S., below level now projected for 2010.
- 306 gigawatts - U.S. fleet of coal-fired generation, much of which would require replacement by gas turbines.
- 11.3 Tcf/year - added demand for gas to replace 306-gigawatt fleet of coal-fired generation. (Assume gas turbines operate at heat rate of 6,300 Btu/kWh, and at same 68 percent average load factor as coal-fired plants they replace.)
- 7.6 Tcf/year - added gas demand already projected for new gas-fired generation between 1999 and 2020.
- 19 Tcf/year - Total new gas demand (11.3 + 7.6) roughly equals total annual U.S. gas production in 2000.
It now seems highly unlikely that the 1997 Kyoto Protocol will be ratified by a sufficient number of industrial countries covered by the Protocol (the 35 Annex I countries) to put it into effect. U.S. ratification was already in doubt as early as the summer of 1997. That is when the U.S. Senate passed a resolution 95 to 0 that warned President Clinton not to sign any treaty limiting American emissions unless similar limits were placed on the developing countries (which will be responsible for most of the future increase in CO2 emissions). Then, in March 2001, President George W. Bush announced his opposition to the Kyoto Protocol.
The industrial countries cannot possibly meet the Kyoto target of reducing CO2-equivalent greenhouse gas emissions by an average of 5.2 percent below 1990 levels by 2008 to 2012 without liberal provisions for emissions trading with low emitters (such as Russia) and/or projects to reduce emissions in developing countries (the so-called "Clean Development Mechanism"). The U.S. quota calls for a 7 percent reduction below the 1990 level. According to recent estimates, this would require a 31 percent reduction of carbon emissions below the level currently projected for 2010.8,9 At this stage, any attempt to meet this target primarily through domestic measures could prove devastating. For example, it might require the shutdown of most of the 306 gigawatts (GW) of coal-fired power generation capacity in the United States. Policymakers would face a dilemma, as coal-fired plants still supply more than one-half of all U.S. power requirements but produce nearly one-third of all U.S. carbon emissions from fossil fuels in the form of CO2.9
As the author discussed in a December 1999 article,10 one option would be to replace this coal-fired capacity with very efficient gas-fired combined-cycle turbine units, which emit only about one-third the CO2 emitted by coal-fired steam-electric plants. Gas turbines also produce only negligible amounts of the major air pollutants (except for nitrogen oxides, which are readily controllable). As was noted in that article, such a replacement by years 2008 to 2012 would set up an impossible target, both in terms of the required construction schedules and the availability of natural gas at reasonable prices to fuel the new turbines. And today, 18 months later, the challenge is even greater. At current estimates, replacing that coal capacity with combined-cycle turbines would require about 11.3 trillion cubic feet (Tcf) per year of natural gas, assuming that this 306 GW of new combined-cycle capacity operates at the same 68 percent average load factor as the existing coal-fired steam-electric capacity9. That estimate also assumes that the heat rate of the combined-cycle systems is 6,300 Btu/kWh, which corresponds to a lower heating value efficiency of 60 percent. And note further: this 11.3 Tcf/year in new demand for natural gas would come on top of the net increase of 7.6 Tcf/year in gas demand already projected for gas-fired power generation needs between 1999 and 2020.9 Thus, the combined increase in new gas demand for power generation would climb to 19 Tcf/year, a figure roughly equal to total U.S. natural gas production in 2000. Current projections already call for a substantial increase in gas demand - from 21.4 Tcf in 1999 to 34.7 Tcf in 2020 - and that's after including an increase in coal-fired generation capacity from 306 GW to 316 GW.9
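The 11.3 Tcf/year figure can be reproduced from the stated assumptions. The capacity, load factor, and heat rate below are the article's numbers; the 1,000 Btu per cubic foot gas heating value is my assumption (a common rule of thumb), which is why the sketch lands slightly above the article's estimate:

```python
# Inputs stated in the article
CAPACITY_GW = 306              # coal-fired capacity to be replaced
LOAD_FACTOR = 0.68             # same average load factor as the coal fleet
HEAT_RATE_BTU_PER_KWH = 6300   # combined-cycle heat rate

# Assumed conversion (not in the article): typical gas heating value
BTU_PER_CUBIC_FOOT = 1000
HOURS_PER_YEAR = 8760

# Annual generation in kWh (1 GW = 1e6 kW)
generation_kwh = CAPACITY_GW * 1e6 * LOAD_FACTOR * HOURS_PER_YEAR

# Fuel required, converted from Btu to trillion cubic feet (Tcf)
gas_btu = generation_kwh * HEAT_RATE_BTU_PER_KWH
gas_tcf = gas_btu / BTU_PER_CUBIC_FOOT / 1e12

print(f"Gas required: {gas_tcf:.1f} Tcf/year")  # ~11.5 Tcf/year
```

The result, about 11.5 Tcf/year, matches the article's 11.3 Tcf/year to within the uncertainty in the assumed heating value; a value nearer 1,030 Btu per cubic foot reproduces the article's figure almost exactly.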
Carbon Abatement: A Sustainable Scenario
Looking forward, replacement of the existing 306 GW of coal-fired steam-electric generation capacity with highly efficient, gas-fired, combined-cycle turbine units would reduce annual U.S. carbon emissions from this source from 491 million metric tons to an estimated 163 million metric tons. This reduction would meet more than half of the 558 million metric tons of carbon emissions abatement by 2010 required by the Kyoto Protocol.8,9 Over a much longer term, this plan would offer a promising scenario.
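The "more than half" claim checks out arithmetically, using only the figures in the paragraph above. Note also that the 163 million metric ton estimate is consistent with the one-third CO2 emissions ratio of gas-fired combined-cycle units cited earlier (491/3 is about 164):

```python
# Figures from the text, in million metric tons of carbon per year
coal_emissions_mt = 491       # current emissions from coal-fired generation
gas_emissions_mt = 163        # estimated emissions after combined-cycle switch
required_abatement_mt = 558   # Kyoto-required U.S. abatement by 2010

reduction = coal_emissions_mt - gas_emissions_mt
share = reduction / required_abatement_mt

print(f"Reduction: {reduction} Mt/year, "
      f"{share:.0%} of the Kyoto requirement")  # 328 Mt, ~59%
```

So even a full fleet replacement, which the article argues is infeasible on the Kyoto timetable, would deliver only about three-fifths of the required U.S. abatement.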
The transportation sector, which is another major source of greenhouse gas and air pollutant emissions, is already set on a course to increase automotive efficiencies by two- to three-fold by switching from mechanical to electric drive. Fuel cells seem destined to become the power source, using hydrogen as the fuel. Every major global automobile manufacturer is spending hundreds of millions of dollars to achieve this goal. Although the transitional sources of this hydrogen will be petroleum products, natural gas, or water electrolysis with commercial power, eventually this hydrogen will come from renewable energy sources. This will first sharply reduce and, finally, eliminate greenhouse gas and pollutant emissions from the transportation sector.
In fact, as was clearly shown in the 1995 IPCC report, global anthropogenic carbon emissions could be allowed to increase to nearly double their 1990 level by 2040. At that point, however, carbon emissions would have to drop sharply to achieve an eventual stabilization of atmospheric concentrations of greenhouse gas equivalent to 550 ppmv of CO2. That figure would roughly double the pre-industrial level and cause only moderate further temperature increases.1 This scenario would allow sufficient lead time to move the global energy system closer to sustainability by aggressive development of such renewable energy sources as photovoltaic, solar thermal and wind power, while using natural gas as the transition fuel. As mentioned above, the Kyoto Protocol is not a practical approach to achieve this goal.
The industrial (Annex I) countries covered by the Protocol will be responsible for only 30 percent of the projected increase in carbon emissions in the form of CO2 between 1997 and 2020.8 The major problem lies with the rapidly expanding economies of such populous, coal-rich developing countries as China and India; even if implemented, therefore, the Protocol would be of very limited effectiveness in reducing global warming. It will take large investments of capital to assist the developing world in changing its course of economic development away from increased use of fossil fuels - especially coal.
Nuclear power, though it emits no air pollutants or greenhouse gases, unfortunately cannot provide a sustainable energy source when utilizing "burner" reactor technologies. It cannot reach its full potential until the completion of the development and commercialization of inherently safe, proliferation-proof breeder reactors (such as the integral fast reactor) that increase the energy output per unit mass of uranium 60-80 times and overcome much of the waste problem by the use of metallic fuel and on-site pyroprocessing.11
Nor is biomass a practical replacement for fossil fuels, because of its huge land requirements, high labor intensity, and the serious environmental impacts of mono-culture energy crops.11 Moreover, the best use of biomass is as a sink for CO2 emissions from the combustion of fossil fuels. As noted before, the afforestation of the Northern Hemisphere and other effects of anthropogenic greenhouse gas emissions in stimulating plant growth already more than offset the still ongoing loss of tropical forests.
Instead, a great deal more effort must be devoted to make such high-tech renewable technologies as photovoltaic power more economically competitive. This option offers the additional advantage of being a distributed generation technology, especially suited for high-insolation areas in continental Asia and in Africa that lack electric power grids. Again, abundant natural gas, delivered by international pipelines or in liquefied form by tanker, can serve as an environmentally benign transition fuel until the further development and deployment of high-tech renewable energy sources.
In spite of the remaining challenges to achievement of a sustainable global energy system no later than 2100, there are realistic options for pathways to sustainability that avoid unacceptable environmental impacts. This is why so many analysts disagree strongly with the alarmist data on climate change disseminated recently by the IPCC.
- Houghton, et al., eds., "Climate Change 1995 - The Science of Climate Change," in Second Assessment Report of the Intergovernmental Panel on Climate Change, Working Group I, Cambridge University Press (1996).
- Bette Hileman, "Web of Interactions Makes It Difficult to Untangle Global Warming Data," , Vol. 70, No. 17, pp. 7-14, 16, 18-19 (April 27, 1992).
- Richard S. Lindzen, "Some Coolness Concerning Global Warming," , Vol. 71, No. 3, pp. 288-299 (March 1990). Also Richard S. Lindzen, "Climate Forecasting - When Models are Qualitatively Wrong," Washington Roundtable on Science and Public Policy, May 17, 2000, George C. Marshall Institute, Washington, DC.
- Intergovernmental Panel on Climate Change, "Summary For Policymakers" for the "Third Assessment Report" of Working Group I made available on the Internet on October 22, 2000 with the notation "Do not Cite. Do not Quote." In view of the wide distribution of this "Summary" and its key findings to the media and the technical and trade press and the numerous published references and articles dealing with its content, it seems permissible to comment on it here.
- , Vol. 4, No. 1, p. 7 (January 2001).
- Roy W. Spencer, "1996: A Preview of Cooler Days Ahead," pp. 14-17, , Patrick J. Michaels, Chief Editor. New Hope Environmental Services, Inc. (1997).
- "Temperature Histories in Perspective," , Vol. 2, No. 4, pp. 5-11 (Summer 1994). Also National Climate Data Center U.S. Temperature History as reported in Greening Earth Society "Virtual Climate Alert," Vol. 1, No. 46 (December 21, 2000).
- "International Energy Outlook 2000 With Projections to 2020," Energy Information Administration, Document No. DOE/EIA-0484(2000), March 2000.
- "Annual Energy Outlook 2001 With Projections to 2020," Energy Information Administration, Document No. DOE/EIA-0383(2001), December 2000. (Reference Case Forecasts.)
- Henry R. Linden, "Fuel for Thought: Some Questions on the Future of Gas-fired Generation," , Vol. 137, No. 22, pp. 26-35 (December 1999).
- Henry R. Linden, "Let's Focus on Sustainability, Not Kyoto," , Vol. 12, No. 2, pp. 56-67 (March 1999).