Congress renews PURPA’s call for conservation and load management, but the world has changed since the 1970s.
Bruce W. Radford is editor-in-chief for Public Utilities Fortnightly.
The “N-word” in the title first appeared in this journal more than 20 years ago, courtesy of the celebrated environmentalist Amory Lovins and his widely quoted piece, “Saving Gigabucks with Negawatts.” (Public Utilities Fortnightly, March 21, 1985, p. 19.)
The concept was simple: You satisfy power needs and ease strains on generation resources by curtailing consumption. The benefits—measured in terms of “avoided costs”—would outweigh the price of building new power plants.
In the United States, Congress already had lent support to the concept. In section 111(d) of the Public Utility Regulatory Policies Act of 1978 (PURPA), Congress had set a federal “standard” asking state regulators to study the benefits of energy conservation. It told public utility commissions (PUCs) to consider the idea of forcing electric utilities to offer load management to retail consumers, and interruptible rates to commercial and industrial (C&I) customers.
So, when Lovins cleverly coined his one-word sound bite, the game was on. All sorts of conservation programs soon began to appear in public sectors all over the world. The academic world joined in as well, investigating how to incorporate the new negawatt theory into solicitations for new capacity resources. (For example, see the analysis of Charles Cicchetti and William Hogan: “Including Unbundled Demand-Side Options in Electric Utility Bidding Programs,” Public Utilities Fortnightly, June 8, 1989, p. 9.)
But now scroll forward a few decades. With restructuring of wholesale electric markets at the Federal Energy Regulatory Commission (FERC), plus the formation of regional transmission organizations (RTOs) and independent system operators (ISOs), the game changed. Many utilities sold off their power plants to become wires companies; afterward, they saw little need to continue with load management. And with the advent of regional day-ahead spot markets, incremental costs have come to be defined more by spot wholesale energy prices than by dispatch costs calculated internally at individual utility companies. That makes load-management programs problematic if they depend on utility-specific avoided costs for their justification.
As California Attorney General Bill Lockyer has noted, “It is impossible to provide demand response to lower one’s cost, when one does not cause such costs in the first place.”
The data, in fact, show a marked decline in traditional utility load management, according to ISO New England (ISO-NE).
Recently, ISO-NE gathered statistics on load-management programs from past “summer assessments” conducted by the North American Electric Reliability Council (NERC). The analysis shows that, from 1998 to 2003, as a percentage of peak load, traditional load-management programs (interruptible and direct load control) have declined remarkably in nearly every region of the country.
The declines appear most dramatic in the Electric Reliability Council of Texas, the Northeast Power Coordinating Council, and the Mid-Atlantic Area Council, but show up also in the Southeastern Electric Reliability Council and the Mid-Continent Area Power Pool, where regional spot markets were not in play.
(See, Charles Goldman, "The Future of Demand Response," May 13, 2005, presented at the ISO-NE Demand Response Summit. For graphic summary data, see Comments of ISO New England, p. 16, filed Dec. 19, 2005, FERC Docket No. AD06-2.)
The lack of interest extends also to time-of-use and real-time pricing (RTP) plans. For example, many cite a key study by the Environmental Energy Technologies Division of Lawrence Berkeley National Laboratory, prepared for the Department of Energy, to show a surprising number of RTP programs with little or no customer participation. (See, Barbose, Goldman, and Neenan, "A Survey of Utility Experience with Real-Time Pricing," December 2004.) The exception, however, seems to be in the Desert Southwest, where Salt River Project and Arizona Public Service Co. compete with each other, and together show a one-in-three participation rate in RTP plans among retail electric customers.
The rise of regional spot markets and the consequent irrelevancy of internal utility avoided-cost calculations also may have dampened interest in demand-response (DR) plans for other reasons. Many observers believe that the resulting disconnect between bid-based wholesale power prices and static retail-rate designs based on cost averaging has depressed participation in demand-response programs, as very few real price signals ever filter their way down to the consumer.
Perhaps also, as the Steel Manufacturers Association claims, this declining customer participation marks “a rational response” to the price increases, volatility, uncertainty, and general lack of confidence (the Enron debacle, “wash trades,” etc.) that have attended some regional spot markets. (See, Discussion Paper, “Getting Serious About Demand Response,” March 22, 2005, p. 9, reproduced in Comments of SMA, filed Dec. 19, 2005, FERC Docket No. AD06-2.)
Into this climate steps Congress, which amended PURPA to add a new federal initiative on demand response in the Energy Policy Act of 2005 (EPACT) enacted last fall. In particular, EPACT sec. 1252(d)(3) requires FERC within 180 days of enactment to report to Congress on the market penetration, saturation, past performance, and future potential of both electric utility advanced metering infrastructure (AMI—interval meters and associated software and systems) and DR programs. Also, FERC must report on the role and potential of DR in resource planning, in terms of peak energy demands and transmission capabilities.
Thus, on Nov. 3, FERC opened an investigation and launched a “voluntary” industry survey of technology, practices, policies, experience, and opinions, with a technical conference set for Jan. 25, 2006, in Washington, D.C. (See, FERC Docket No. AD06-2, issued Nov. 3, 2005.)
In all likelihood, Congress added Sec. 1252 to EPACT in part to appease the environmental crowd and to get legislation passed. Also, it may have seen Sec. 1252 as a carrot to state PUCs in states that have opposed the RTO revolution, ensuring a continuing niche for states and old-style utilities in the resource planning process—a process increasingly being taken over by the regional system operators, under FERC’s purview.
Nevertheless, the move may produce unintended results. Judging by the tenor of comments filed so far in FERC's DR survey docket, the EPACT initiative, ironically enough, may well end up having the opposite effect: It may give an added boost to the RTO revolution and regional spot-energy markets, rather than limiting their expansion.
In fact, experts and policymakers from across the country seem to hint that FERC's most radical step also might be the most logical. That step would bring the granularity of RTO wholesale generation markets down to the retail level. Such a change would introduce fully nodal locational marginal pricing (LMP) to individual loads, instead of prices averaged over zones, as is done today in New York, New England, and PJM.
There are exceptions, of course. Several years ago, FERC ordered PJM to comply with tariff provisions allowing any wholesale customer with hourly interval meter data to elect full nodal LMP pricing at retail. (See Occidental Power Services, Inc. v. PJM Interconnection, LLC, Sept. 15, 2003, 104 FERC ¶61,289.) Also, ISO New England is moving ahead with "special case nodal pricing," whereby certain individual customer loads can qualify to receive non-averaged LMP prices at retail. (See, ISO-NE Compliance Filing, June 30, 2005, FERC Docket No. ER02-2330-37.)
The most radical idea, however, comes from the Midwest Independent System Operator (MISO), which plans a so-called energy-only market (no capacity market or credits) where customers eventually would pay scarcity prices without caps. MISO argues that if you want customers to temper their demand in response to price, you must first offer a price in a range high enough that the slope of the demand curve moves off the vertical and acquires a real, measurable price elasticity. With prices in that range, a further increase will compel a reduction in electric use—not out of homage to an administrative program, but out of natural consumer behavior.
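MISO's elasticity argument lends itself to a back-of-the-envelope illustration. The sketch below uses a constant-elasticity demand curve with invented numbers (a 10,000-MW base load, a $50/MWh reference price, an elasticity of -0.1); none of these figures comes from MISO.

```python
# Hypothetical illustration of MISO's argument: demand response emerges only
# when consumers actually face scarcity-level prices. All numbers are invented.

def demand(price, base_load=10_000.0, ref_price=50.0, elasticity=-0.1):
    """Constant-elasticity demand curve: Q = Q0 * (P/P0)**elasticity.
    base_load in MW, prices in $/MWh."""
    return base_load * (price / ref_price) ** elasticity

# A customer on a flat averaged retail rate sees $50/MWh no matter what the
# spot market does, so consumption never moves off the base load.
flat_rate_load = demand(50.0)

# A customer exposed to an uncapped scarcity price of $2,000/MWh cuts back:
scarcity_load = demand(2000.0)

reduction_mw = flat_rate_load - scarcity_load
print(f"Load at $50/MWh:    {flat_rate_load:,.0f} MW")
print(f"Load at $2,000/MWh: {scarcity_load:,.0f} MW")
print(f"Reduction:          {reduction_mw:,.0f} MW")
```

The point of the contrast: the customer on the averaged rate never sees the scarcity price, so the measured demand curve looks vertical no matter what the underlying elasticity may be.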
Are Congress and FERC ready for that?
Programs and Potential
Lacking any standard terminology—questioning even FERC’s use of terms within its survey questionnaire—the power industry has found it difficult to provide FERC with a clear and concise characterization of existing DR plans.
At the regional level, there are some programs, such as the New York ISO's EDRP (Emergency Demand-Response Program), that are clearly keyed to reliability and system capability, even though curtailments may be voluntary, and thus might well qualify as a plan deserving of federal regulatory oversight. The EDRP kicks in when reserves fall low, and pays the higher of $500/MWh or the zonal real-time LMP for demand reductions. (Compensation is paid to LSEs—load-serving entities—but more likely to aggregators known as "curtailment service providers," or CSPs.)
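The EDRP payment rule just described reduces to a few lines of arithmetic. The sketch below assumes only the terms summarized here (a $500/MWh floor tested against the zonal real-time LMP); the function and variable names are our own, not NY-ISO's.

```python
def edrp_payment(curtailed_mwh, zonal_rt_lmp, floor=500.0):
    """Payment for a verified EDRP curtailment: the higher of the $500/MWh
    floor or the zonal real-time LMP, times the energy curtailed."""
    return max(floor, zonal_rt_lmp) * curtailed_mwh

# A CSP delivering a 5-MWh reduction is paid the floor when the LMP is low...
assert edrp_payment(5.0, 120.0) == 2500.0   # 5 MWh * $500/MWh
# ...and the market price when scarcity pushes the LMP above the floor.
assert edrp_payment(5.0, 850.0) == 4250.0   # 5 MWh * $850/MWh
```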
Others, such as NY-ISO's DADRP (Day-Ahead Demand Response Program), which allow LSEs or customers to submit decremental bids in the regional day-ahead spot market, are seen as "price-responsive" or "economic" programs. Such programs, arguably, are ones that Congress and FERC should leave alone.
Lastly, consider programs such as PJM's ALM (Active Load Management), or NY-ISO's ICAP/SCR (Installed Capacity Special Case Resources). These plans are designed not to reduce peak demand, but instead to procure generating capacity of a quality that might otherwise qualify as a resource under a regional ICAP, UCAP, or LICAP plan. An example might be a large C&I customer with self-generating capability, who agrees to curtail consumption so that its plant capacity (lying behind the meter) can be released to the market. Thus, these programs are linked to resource adequacy, but the payments appear indistinguishable from the return on capital that a merchant generator earns for market-based power sales.
Detailed counts of program enrollments, curtailments (number, size, dollar value), and DR contributions to peak-demand reductions and resource requirements for the regional grids can be found in the most recent of the regular reports on demand response that RTOs and ISOs submit periodically to FERC:
- New York ISO. See Sixth Biannual Compliance Report on Demand Response, filed Dec. 15, 2005, FERC Docket No. ER01-3001.
- ISO New England. See Semi-Annual Status Report of Load Response Programs, filed Dec. 29, 2005, FERC Docket No. ER03-345-006. See also, Independent Assessment of Demand Response Programs (by RLW Analytics, LLC, and Neenan Associates, LLC), filed Dec. 30, 2005, FERC Docket No. ER02-2330-40.
The Silicon Valley Leadership Group (SVLG), founded in 1978 by David Packard of Hewlett-Packard, claims that demand response typically can provide demand reductions of 3 to 5 percent of annual peak load, for periods of up to 100 hours or so per year. Thus, as SVLG points out (as does Dan Delurey, of the Demand Response and Advanced Metering Coalition, or DRAM), demand response typically produces only a small reduction in total energy usage, because demand interruptions usually account for less than 1 percent of total annual hours. (The PJM market monitoring unit says DR for the region throughout 2003-04 produced an overall price impact of only about $1/MWh.)
Data from the U.S. Energy Information Administration (Form 861), as cited by the American Public Power Association, shows a total peak load reduction in the United States of 9,300 MW in 2003, versus 979,586 MW in total nameplate capacity. MISO estimates a total regional DR potential of about 5,000 to 10,000 MW. PJM says its ALM program (giving capacity credit against the ICAP obligation) could reliably provide up to 7.5 percent of the region’s summer peak.
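A quick arithmetic check puts the figures cited above in proportion; the inputs are simply the numbers reported in the text.

```python
# Quick proportion check on the figures cited above (EIA Form 861 / APPA).
total_peak_reduction_mw = 9_300    # U.S. peak load reduction, 2003
total_nameplate_mw = 979_586       # total U.S. nameplate capacity, 2003

share = total_peak_reduction_mw / total_nameplate_mw
print(f"DR as a share of nameplate capacity: {share:.1%}")   # roughly 1 percent

# SVLG's point about energy (as opposed to demand): even 100 hours of
# curtailment per year is a sliver of the 8,760 hours in a year.
hours_share = 100 / 8760
print(f"Curtailment hours as a share of the year: {hours_share:.1%}")
```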
Planning and Certification
What role should DR and load-reduction programs play in regional capacity planning, or in meeting state-sponsored resource adequacy requirements (RAR), especially since many DR programs envision interruptions that are purely voluntary on the part of load?
Nearly three years ago, the California PUC set target goals for demand reduction from DR programs and instructed electric utilities on how to integrate those DR goals into their resource-procurement plans. The goal for 2005 was 3 percent of annual peak system demand, increasing to 4 percent for 2006, and 5 percent in 2007. (See CPUC Decision 03-06-032, June 5, 2003.)
As of a year ago, however, the PUC had noted that the state’s three major electric utilities, Southern California Edison, Pacific Gas and Electric Co., and San Diego Gas & Electric, each appeared to be running short of achieving those DR goals. In the PUC’s words, all three “question the achievability and cost-effectiveness of the DR [megawatt] goals, noting that there may be more cost-effective alternatives to meet their loads.”
The PUC acknowledged that it was "too early to judge" whether its DR goals could be met, but it decided nevertheless to retain the targets as defined. (See, CPUC Decision 04-12-048, Dec. 16, 2004, 238 PUR4th 1.)
Also, while the California ISO would prefer that only those demand reductions produced by emergency DR programs directly mandated and dispatched by the ISO should count in meeting each utility's RAR figure, the CPUC appears more lenient. It appears to say that non-dispatchable DR programs also can qualify (price-responsive DR bidding programs, for example, and even demand reductions from real-time pricing tariffs). It concedes, of course, that these soft DR programs qualify only as debits from load forecasts, instead of positive capacity resources—but isn't that just a distinction without a difference? (See, CPUC Decision 04-035, Oct. 28, 2004, pp. 20-21; CPUC Decision 05-10-042, Oct. 27, 2005, 244 PUR4th 341, p. 366.)
Three months ago, the CPUC directed the three utilities in their next comprehensive rate-design proceedings to make proposals for C&I customers for real-time pricing tariffs and critical peak pricing. CPP programs generally provide lower rates throughout the year in exchange for steeply higher prices during a limited number of short periods of super-peak demand (usually about a dozen events per summer season, of four hours or so each). (See, CPUC Decision 05-11-009, Nov. 18, 2005.)
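The CPP trade-off just described, a year-round discount in exchange for steep prices during a dozen or so four-hour events, can be sketched numerically. All rates and usage figures below are invented for illustration; actual tariffs differ by utility and customer class.

```python
# Illustrative critical-peak-pricing (CPP) bill comparison. All numbers are
# hypothetical placeholders.

annual_kwh = 1_000_000          # a mid-size C&I customer's annual usage
flat_rate = 0.12                # $/kWh under a standard flat tariff

# CPP: discounted rate all year, except ~12 events of 4 hours each at a
# super-peak price.
cpp_base_rate = 0.10            # $/kWh outside events
cpp_event_rate = 0.75           # $/kWh during critical-peak events
events, hours_per_event = 12, 4
event_kwh = 500 * hours_per_event * events   # 500 kW held through every event

flat_bill = annual_kwh * flat_rate
cpp_bill = (annual_kwh - event_kwh) * cpp_base_rate + event_kwh * cpp_event_rate

print(f"Flat-rate bill:             ${flat_bill:,.0f}")
print(f"CPP bill (no curtailment):  ${cpp_bill:,.0f}")

# The customer comes out clearly ahead under CPP only by cutting event-hour
# load, which is exactly the response the tariff is designed to elicit.
curtailed_cpp_bill = (annual_kwh - event_kwh) * cpp_base_rate
print(f"CPP bill (full curtailment): ${curtailed_cpp_bill:,.0f}")
```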
And, just four weeks ago, the CPUC announced that it would retain its ambitious DR goals for 2006-2007, and offered further guidance on how to integrate those programs with resource adequacy requirements. (See, CPUC Decision 05-01-056, Jan. 27, 2006.)
The Cost-Benefit Equation
In restructured regions, with RTO, ISO, and spot wholesale energy markets, where utilities have divested generation and their internal avoided-cost calculations no longer offer much guidance, how do regulators and policymakers evaluate the costs and benefits of load management?
The answer, it seems, may come down to hopeful guesswork.
Consider ISO New England, which has attempted to devise a mathematical model based not on resource costs, but on customer perceptions and an assumed monetary value representing the dollar cost that consumers would suffer in the event of an outage. For this analysis, ISO-NE has pegged that value, known as VOLL—value of lost load—at $5,000/MWh ($5/kWh, or about 50 times higher than a typical, all-in retail electric rate).
According to ISO-NE, one calculates the benefits of a DR program by first estimating the expected improvement in LOLP (the loss-of-load probability), and then multiplying that figure by the share of load that was otherwise at risk. This step, it says, yields a product that represents the incremental reduction in unserved energy. Multiplying that last figure by VOLL produces a dollar figure of expected reliability benefits.
But the ISO does not stop there. It then calculates a "market impact ratio," based on: (1) power bill savings from lower LMPs seen in the spot market; and (2) a much greater savings figure that it calls "hedge savings." These so-called hedge savings represent the product of the change in average monthly LMPs times the energy quantity and load served through bilateral trading contracts. (For the full explanation, see Comments of ISO New England, pp. 10-14.)
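ISO-NE's arithmetic, as described in its comments, reduces to a chain of multiplications. The sketch below follows the steps summarized above; every numerical input is an invented placeholder, not an ISO-NE figure (except the $5,000/MWh VOLL, which the ISO does use).

```python
# Sketch of ISO-NE's reliability-benefit calculation as summarized above.
# All example inputs are invented placeholders.

VOLL = 5_000.0                 # value of lost load, $/MWh (ISO-NE's figure)

def reliability_benefit(delta_lolp, load_at_risk_mwh, voll=VOLL):
    """Expected reliability benefit of a DR program:
    (improvement in loss-of-load probability) x (load otherwise at risk)
    x (value of lost load). The first product is the incremental
    reduction in expected unserved energy, in MWh."""
    unserved_energy_reduction = delta_lolp * load_at_risk_mwh
    return unserved_energy_reduction * voll

# Hypothetical example: DR improves LOLP by 0.002 over a period in which
# 400,000 MWh of load is exposed to curtailment risk.
benefit = reliability_benefit(delta_lolp=0.002, load_at_risk_mwh=400_000)
print(f"Expected reliability benefit: ${benefit:,.0f}")

# The "hedge savings" half of the market-impact calculation: change in
# average monthly LMP times the energy served under bilateral contracts.
def hedge_savings(delta_avg_lmp, bilateral_mwh):
    return delta_avg_lmp * bilateral_mwh

savings = hedge_savings(delta_avg_lmp=1.50, bilateral_mwh=2_000_000)
print(f"Hedge savings: ${savings:,.0f}")
```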
ISO-NE admits that its model is still a work-in-progress, but says it can use this method even to calculate the benefits of extending full nodal LMP pricing (“dynamic pricing”) to individual loads.
In fact, it estimates that over a five-year period, dynamic pricing for loads of 1 MW or greater (a pricing treatment now enjoyed only by power producers on the supply side) could bring benefits of $340 million, at a cost of only $6.7 million a year, "at most."