Reconsidering Resource Adequacy, Part 1
Has the one-day-in-10-years criterion outlived its usefulness?
Electric utilities and regional transmission organizations (RTOs) in the United States aim to have enough electric generating capacity to meet anticipated peak loads with a reserve margin for reliability. The reserve margins usually are set to meet the widely accepted “one day in 10 years” (1-in-10) resource adequacy criterion, under which the expected frequency of having to curtail firm load due to inadequate capacity should be no greater than once every 10 years.
The 1-in-10 criterion has always been highly conservative—perhaps an order of magnitude more stringent than the marginal benefits of incremental capacity can justify—and capacity planning has been even more conservative in practice. Indeed, economists have questioned the 1-in-10 criterion for many decades.1
Resource adequacy practices based on the 1-in-10 criterion perhaps make more sense for utility planners and regulatory authorities, who would have to answer for any curtailments that occur, than for consumers, who are directly affected if reliability isn’t maintained but who also bear the cost of the additional capacity.
Marginal Costs and Benefits
The 1-in-10 resource adequacy criterion is economically efficient only if it calls for an amount of capacity that reasonably balances the incremental costs and benefits of additional capacity. Under this principle, more capacity should be built as long as the anticipated incremental benefit exceeds the incremental cost.
The cost of incremental capacity is the annualized cost to build and maintain the most economical type of capacity, less the amount of those costs the plant can be expected to offset through sales of energy and ancillary services in the wholesale markets. Gas-fired combustion turbines are generally considered the cheapest type of capacity, and the type that would be built to meet an incremental need for capacity for reliability.
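This net-cost calculation can be sketched in a few lines. The figures below are illustrative assumptions chosen for round numbers, not values from this article or any specific cost study:

```python
# Illustrative "net cost of new entry" calculation for a gas-fired combustion
# turbine (CT). All dollar figures are assumptions for illustration only.

def net_capacity_cost(annualized_cost, energy_revenue, ancillary_revenue):
    """Annualized cost of new capacity net of expected market revenues, $/kW-year."""
    return annualized_cost - energy_revenue - ancillary_revenue

annualized_cost = 90.0   # assumed $/kW-year to build and maintain a new CT
energy_revenue = 15.0    # assumed $/kW-year net revenue from energy sales
ancillary_revenue = 5.0  # assumed $/kW-year revenue from ancillary services

net_cost = net_capacity_cost(annualized_cost, energy_revenue, ancillary_revenue)
print(f"Net cost of incremental capacity: ${net_cost:.0f}/kW-year")
```

The same arithmetic applies whatever technology is cheapest; only the assumed inputs change.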
The incremental benefit of holding more capacity for reliability results from reducing curtailment due to shortages. This potential benefit depends upon the anticipated frequency of such outages (the loss-of-load expectation, or LOLE) and the cost of outages to the electricity consumers who are curtailed (often called the value of lost load, or VOLL). Additional capacity also can contribute to lower market prices for energy and ancillary services, potentially an added benefit from the consumer’s perspective. However, the last increments of capacity built to satisfy the 1-in-10 criterion (or any criterion leading to a low frequency of outages) will run very infrequently and have little, if any, impact on these prices.
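A back-of-the-envelope comparison makes the cost-benefit gap concrete. The LOLE of 0.1 events per year follows from the 1-in-10 criterion; the event duration, VOLL, and net capacity cost below are illustrative assumptions, not figures from this article:

```python
# Rough comparison of the incremental benefit and cost of capacity at the
# 1-in-10 criterion. All inputs other than the LOLE are assumed values.

lole_events = 0.1      # expected shortage events per year (1-in-10 criterion)
hours_per_event = 4.0  # assumed average duration of a shortage event, hours
voll = 10.0            # assumed value of lost load, $/kWh

# One additional kW of capacity provides value only during the shortage
# hours it helps avoid, so its expected annual benefit is:
benefit = lole_events * hours_per_event * voll  # $/kW-year

net_cost = 70.0        # assumed net cost of new capacity, $/kW-year

print(f"Incremental benefit: ${benefit:.1f}/kW-year")
print(f"Incremental cost:    ${net_cost:.1f}/kW-year")
```

Under these assumptions the benefit is a few dollars per kW-year against a cost more than an order of magnitude larger, which is the imbalance the next paragraph describes.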
When comparing incremental costs and benefits in this manner, the 1-in-10 criterion appears to be extremely conservative, calling for a much higher level of capacity than is justified by the economics. With an LOLE of only 0.1 outages per year (as implied by the 1-in-10 criterion), the incremental cost of capacity exceeds the incremental benefits by a wide margin across a range of reasonable assumptions. Estimates of the