Reconsidering Resource Adequacy, Part 1
Has the one-day-in-10-years criterion outlived its usefulness?
10 percent of a system’s customers. If it’s further assumed that 50 percent of the customers are, or share circuits with, essential-use customers,8 and are therefore exempt from curtailment, the curtailment must be imposed on the remaining 50 percent of customers. With these assumptions, each event curtails 10 percent of all customers, which is 20 percent of the exposed half, so the exposed customers would be curtailed once every five outage events on average. Thus, 1-in-10 for a system translates into roughly one hour of outage every 50 years for the average customer exposed to such outages.
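The arithmetic behind that translation can be checked directly. The sketch below restates the article's illustrative assumptions (these are round numbers for exposition, not measured data):

```python
# Check of the 1-in-10 translation, using the article's illustrative assumptions.

years_per_event = 10              # one curtailment event every ten years (1-in-10)
share_curtailed_per_event = 0.10  # each event sheds load from 10% of all customers
share_exposed = 0.50              # 50% of customers are exempt; 50% are exposed

# All of each event's curtailment falls on the exposed half, so the chance a
# given exposed customer is hit in any one event is 0.10 / 0.50 = 0.2.
p_hit_per_event = share_curtailed_per_event / share_exposed

events_between_curtailments = 1 / p_hit_per_event                    # 5 events
years_between_curtailments = events_between_curtailments * years_per_event

print(p_hit_per_event)             # 0.2 -> once every five events
print(years_between_curtailments)  # 50.0 -> once every 50 years
```

Under these assumptions an exposed customer sees a curtailment in one event out of five, and with one event per decade that is once every 50 years.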
While the 1-in-10 criterion might result in a risk of curtailment to the average customer of once every several decades, most electricity customers experience a much higher frequency of outages due to disturbances in the electric distribution systems that serve them—roughly two orders of magnitude (100x) higher. The comparison of the 1-in-10 criterion to distribution system outage rates also suggests that the 1-in-10 standard is extremely conservative.
Utilities summarize the number of minutes of interruption the average customer experiences with the System Average Interruption Duration Index, or SAIDI, usually expressed in minutes of outage per year. Two values usually are provided: one including all events, and a somewhat lower value excluding major events, with the latter measuring the more localized events that originate in utility distribution systems. A recent LBL report summarized utility-reported SAIDI values by census division, with major events excluded, showing a range from 107 to 212 minutes a year and a national average of 146 minutes a year.9 The SAIDI values will of course vary across each portion of each electric distribution company’s service area.
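SAIDI itself is a simple ratio: total customer-minutes of sustained interruption divided by total customers served. A minimal sketch, with invented outage records purely for illustration:

```python
# Minimal SAIDI calculation: customer-minutes interrupted over customers served.
# The outage records and customer count below are invented for illustration.

outages = [
    # (customers_interrupted, duration_minutes)
    (1_200, 90),
    (300, 45),
    (5_000, 120),
]
total_customers = 50_000

customer_minutes = sum(n * minutes for n, minutes in outages)
saidi = customer_minutes / total_customers  # minutes per customer per year

print(saidi)  # 14.43 minutes per year for this made-up system
```

Reported values like the 146-minute national average are this quantity aggregated across a utility's (or region's) full set of sustained-interruption records for the year.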
The 1-in-10 resource adequacy criterion can be expressed in minutes per year for comparison to SAIDI values, based on the rough estimates regarding curtailment quantity and duration used above. Those assumptions suggested that under 1-in-10 the average exposed customer would be curtailed for one hour every 50 years, or 1.2 minutes per year on average. Thus, distribution system outages appear to impose roughly two orders of magnitude more minutes of outage on customers than does resource adequacy under the 1-in-10 criterion—i.e., 146 compared to 1.2 minutes a year.
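The conversion and the two-orders-of-magnitude comparison work out as follows, again using the article's figures:

```python
# Express the 1-in-10 exposure in minutes per year and compare with SAIDI.

curtailment_minutes = 60   # one hour of curtailment per hit...
years_between = 50         # ...once every 50 years for an exposed customer
saidi_national = 146       # LBL national average, major events excluded (min/yr)

adequacy_minutes_per_year = curtailment_minutes / years_between
ratio = saidi_national / adequacy_minutes_per_year

print(adequacy_minutes_per_year)  # 1.2 minutes per year
print(round(ratio))               # 122, i.e., roughly two orders of magnitude
```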
Distribution system reliability, excluding major events, has averaged about 3.6 nines (see Table 2). By comparison, the 1-in-10 criterion corresponds to 4.2 nines at the system level, or more than five nines for the customer, under the above assumptions.
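The "nines" figures follow from the same minutes-per-year numbers, using the usual availability convention (nines = negative log of the unavailable fraction of the year). A small sketch:

```python
import math

MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

def nines(outage_minutes_per_year):
    """Availability expressed as 'nines': -log10 of the unavailable fraction."""
    unavailability = outage_minutes_per_year / MINUTES_PER_YEAR
    return -math.log10(unavailability)

print(round(nines(146), 1))  # 3.6 -> distribution SAIDI, major events excluded
print(round(nines(1.2), 1))  # 5.6 -> exposed customer under 1-in-10
```

The customer-level result (about 5.6 nines) matches the article's "more than five nines." The 4.2 system-level nines would correspond to roughly 33 minutes a year of system-level curtailment, which depends on the system-wide event duration assumed rather than the per-customer figure computed here.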
Is 1-in-10 Justified?
In practice, capacity planning approaches result in resource adequacy that usually exceeds the 1-in-10 criterion. While 1-in-10 has been accepted in principle, planners and regulators understandably have as a goal that curtailments never occur, and there might be thumbs on the scale as resource adequacy is implemented.
There are a number of ways resource adequacy in practice often is more conservative than the 1-in-10 criterion requires. Because the criterion is probabilistic, probabilistic modeling is required to determine the reserve margin required to satisfy it. Such models rely on assumptions about future load growth and its variability; capacity resources and their outage rates and availability during peak periods; the amount of interruptible load available; the assistance that may be available from neighboring systems