Encore for Negawatts?
Congress renews PURPA’s call for conservation and load management, but the world has changed since the 1970s.
it was “too early to judge” whether its DR goals could be met, but it decided nevertheless to retain the targets as defined. (See, CPUC Decision 04-12-048, Dec. 16, 2004, 238 PUR4th 1.)
Also, while the California ISO would prefer that only those demand reductions produced by emergency DR programs directly mandated and dispatched by the ISO should count in meeting each utility’s RAR figure, the CPUC appears more lenient, suggesting that non-dispatchable DR programs also can qualify (price-responsive DR bidding programs, for example, and even demand reductions from real-time pricing tariffs). It concedes, of course, that these soft DR programs qualify only as debits from load forecasts, rather than as positive capacity resources—but isn’t that a distinction without a difference? (See, CPUC Decision 04-035, Oct. 28, 2004, pp. 20-21; CPUC Decision 05-10-042, Oct. 27, 2005, 244 PUR4th 341, p. 366.)
Three months ago, the CPUC directed the three utilities, in their next comprehensive rate-design proceedings, to propose real-time pricing tariffs and critical peak pricing (CPP) for C&I customers. CPP programs generally provide lower rates throughout the year in exchange for steeply higher prices during a limited number of short super-peak periods (usually about a dozen events per summer season, each lasting four hours or so). (See, CPUC Decision 05-11-009, Nov. 18, 2005.)
And, just four weeks ago, the CPUC announced that it would retain its ambitious DR goals for 2006-2007, and offered further guidance on how to integrate those programs with resource adequacy requirements. (See, CPUC Decision 05-01-056, Jan. 27, 2005.)
The Cost-Benefit Equation
In restructured regions—with RTOs, ISOs, and spot wholesale energy markets, and where utilities have divested generation so that their internal avoided-cost calculations no longer offer much guidance—how do regulators and policymakers evaluate the costs and benefits of load management?
The answer, it seems, may come down to hopeful guesswork.
Consider ISO New England, which has attempted to devise a mathematical model based not on resource costs, but on customer perceptions—specifically, an assumed monetary value representing the dollar cost that consumers would suffer in the event of an outage. For this analysis, ISO-NE has pegged that value, known as VOLL—value of lost load—at $5,000/MWh ($5/kWh, or about 50 times higher than a typical, all-in retail electric rate).
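The arithmetic behind that "50 times" comparison is straightforward. A minimal sketch, assuming a typical all-in retail rate of roughly $0.10/kWh (an illustrative figure, not one taken from ISO-NE):

```python
# Unit check on the VOLL figure cited above.
voll_per_mwh = 5000.0                 # $/MWh, per ISO-NE's assumption
voll_per_kwh = voll_per_mwh / 1000.0  # 1 MWh = 1,000 kWh

# Hypothetical typical all-in retail rate, for comparison only:
typical_retail_rate = 0.10            # $/kWh (illustrative assumption)

multiple = voll_per_kwh / typical_retail_rate

print(f"VOLL: ${voll_per_kwh:.2f}/kWh")
print(f"Multiple of typical retail rate: about {multiple:.0f}x")
```

At a dime per kilowatt-hour, $5/kWh works out to the "about 50 times" the article describes; a higher assumed retail rate would shrink the multiple proportionally.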
According to ISO-NE, one calculates the benefits of a DR program by first estimating the expected improvement in LOLP (the loss-of-load probability), and then multiplying that figure by the share of load that was otherwise at risk. This step, it says, yields a product that represents the benefit achieved through an incremental reduction in unserved energy. It then multiplies that figure by VOLL to produce a dollar estimate of expected reliability benefits.
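The chain of multiplications described above can be sketched in a few lines. All of the inputs below are hypothetical placeholders chosen for illustration—only the $5,000/MWh VOLL comes from the article; the LOLP improvement and at-risk load are not ISO-NE figures:

```python
# Sketch of the ISO-NE reliability-benefit arithmetic described above.
VOLL = 5000.0              # value of lost load, $/MWh (per the article)

# Hypothetical inputs, for illustration only:
lolp_improvement = 0.001   # assumed reduction in loss-of-load probability
load_at_risk_mwh = 20000.0 # assumed load otherwise at risk, MWh

# Step 1: incremental reduction in expected unserved energy (MWh)
unserved_energy_reduction = lolp_improvement * load_at_risk_mwh

# Step 2: monetize that reduction at VOLL to get expected reliability benefits
reliability_benefit = unserved_energy_reduction * VOLL

print(f"Unserved energy avoided: {unserved_energy_reduction:.1f} MWh")
print(f"Expected reliability benefit: ${reliability_benefit:,.0f}")
```

Note how sensitive the result is to the assumed VOLL: because the dollar figure is a straight multiple of that $5,000/MWh estimate, halving or doubling VOLL halves or doubles the claimed benefit—which is why the article calls the exercise hopeful guesswork.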
But the ISO does not stop there. It then calculates a “market impact ratio,” based on: (1) power bill savings from lower LMPs seen in the spot market; and (2) a much greater savings figure that it calls “hedge savings.” These so-called hedge savings represent the product of the change in average monthly LMPs times the energy quantity and load served