Encore for Negawatts?
Congress renews PURPA’s call for conservation and load management, but the world has changed since the 1970s.
The “N-word” in the title first appeared in this journal more than 20 years ago, courtesy of the celebrated environmentalist Amory Lovins and his widely quoted piece, “Saving Gigabucks with Negawatts.” (Public Utilities Fortnightly, March 21, 1985, p. 19.)
The concept was simple: You satisfy power needs and ease strains on generation resources by curtailing consumption. The benefits—measured in terms of “avoided costs”—would outweigh the price of building new power plants.
In the United States, Congress already had lent support to the concept. In section 111(d) of the Public Utility Regulatory Policies Act of 1978 (PURPA), Congress had set a federal “standard” asking state regulators to study the benefits of energy conservation. It told public utility commissions (PUCs) to consider the idea of forcing electric utilities to offer load management to retail consumers, and interruptible rates to commercial and industrial (C&I) customers.
So, when Lovins cleverly coined his one-word sound bite, the game was on. All sorts of conservation programs soon began to appear in public sectors all over the world. The academic world joined in as well, investigating how to incorporate the new negawatt theory into solicitations for new capacity resources. (For example, see the analysis of Charles Cicchetti and William Hogan: “Including Unbundled Demand-Side Options in Electric Utility Bidding Programs,” Public Utilities Fortnightly, June 8, 1989, p. 9.)
But now scroll forward a few decades. With the restructuring of wholesale electric markets at the Federal Energy Regulatory Commission (FERC), plus the formation of regional transmission organizations (RTOs) and independent system operators (ISOs), the game has changed. Many utilities sold off their power plants to become wires companies; afterward, they saw little need to continue with load management. And with the advent of regional day-ahead spot markets, incremental costs have come to be defined more by spot wholesale energy prices than by dispatch costs calculated internally at individual utility companies. That makes load-management programs problematic if they depend on utility-specific avoided costs for their justification.
As California Attorney General Bill Lockyer has noted, “It is impossible to provide demand response to lower one’s cost, when one does not cause such costs in the first place.”
The data, in fact, show a marked decline in traditional utility load management, according to ISO New England (ISO-NE).
Recently, ISO-NE gathered statistics on load-management programs from past “summer assessments” conducted by the North American Electric Reliability Council (NERC). The analysis shows that, from 1998 to 2003, traditional load-management programs (interruptible and direct load control) declined markedly as a percentage of peak load in nearly every region of the country.
The declines appear most dramatic in the Electric Reliability Council of Texas, the Northeast Power Coordinating Council, and the Mid-Atlantic Area Council, but show up also in the Southeastern Electric Reliability Council and the Mid-Continent Area Power Pool, where regional spot markets were not in play.
(See Charles Goldman, May 13, 2005, “The Future of Demand Response,” presented at the