The electric utility industry is undergoing its most profound change since Thomas Edison and George Westinghouse battled over whether the American power system should be AC or DC. In essence, that technological choice shaped the industry we know today. Edison's low-voltage, DC system would have required many small generating stations and short distribution lines. The high-voltage Westinghouse AC system promoted development
of long-distance transmission networks that deliver electricity efficiently from large, remote power plants. The economies of scale
involved led directly to the emergence of today's vertically integrated utilities.
Now the technological revolution based on silicon electronics is again forcing fundamental choices and threatening to reshape electric utilities. Through computerized control, the U.S. transmission system can handle a greatly
increased volume of bulk-power transactions. Some 40 percent of the electricity generated each year in this country is sold wholesale. At the same time, customers with sensitive electronic loads are demanding higher-quality, more reliable power at the distribution level. An outage of less than one cycle of AC power, or a voltage sag of 25 percent lasting just two cycles, can cause a microprocessor to malfunction. The cost of a two-cycle outage at a large computer center can run as high as $600,000.
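To put those disturbance figures in perspective, a short calculation (assuming the 60 Hz North American grid) shows just how brief these events are:

```python
# Illustrative arithmetic: duration of the power disturbances described
# above, assuming a 60 Hz AC system.

FREQ_HZ = 60.0
cycle_ms = 1000.0 / FREQ_HZ          # one AC cycle, in milliseconds

one_cycle_outage_ms = 1 * cycle_ms   # an outage shorter than this can trip a microprocessor
two_cycle_sag_ms = 2 * cycle_ms      # a 25% voltage sag this long can do the same

print(f"One AC cycle at {FREQ_HZ:.0f} Hz lasts {cycle_ms:.1f} ms")
print(f"A two-cycle disturbance lasts only {two_cycle_sag_ms:.1f} ms")
```

A two-cycle event is over in about a thirtieth of a second, far too fast for a human operator to notice, yet long enough to corrupt a computation.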
As a result of these technological developments and the regulatory changes accompanying them, competition is increasing rapidly throughout the industry. Success in this era of accelerating change will require taking advantage of new technologies and using them to turn competitive challenges into strategic opportunities.
Technology to Reduce Costs
Operation and maintenance (O&M) costs remain a major item in every utility's cost structure. But unless cuts are made carefully, reductions in O&M spending can come at the expense of reliability. Fortunately, new technologies offer a means of reducing O&M costs while simultaneously improving reliability.
Reliability-centered maintenance (RCM) is a method of establishing maintenance intervals based on actual equipment performance data, rather than relying on manufacturers' specifications or past company practices. When properly applied, it balances the often-competing goals of cost containment and reliability enhancement.
RCM practices first evolved in the aircraft industry and were later adapted for use in nuclear and fossil power plants. Implementing RCM usually shifts maintenance schedules: some intervals become longer, others shorter. The principal savings come from reduced unplanned maintenance following forced outages and from improved overall system performance. In the nuclear industry, for example, RCM practices have produced cost savings of 25 to 49 percent, depending on specific plant circumstances. A pilot RCM program has just been completed for electric substation equipment, and the results promise similar savings.
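The core RCM idea, deriving maintenance intervals from actual equipment performance data rather than fixed manufacturer schedules, can be sketched in miniature. The failure data, exponential-lifetime model, and 95 percent reliability target below are all illustrative assumptions, not figures from the programs described above:

```python
import math

# Hypothetical RCM-style calculation: set a maintenance interval from
# observed field data instead of a fixed manufacturer schedule.
# All numbers here are illustrative placeholders.

times_to_failure_hours = [8200, 9100, 7600, 10400, 8800]  # hypothetical field data

# Estimate mean time between failures from the observed lifetimes.
mtbf = sum(times_to_failure_hours) / len(times_to_failure_hours)

# Under an exponential lifetime model, the probability of surviving an
# interval t without failure is exp(-t / MTBF); choose t so that the
# survival probability meets a target reliability level.
target_reliability = 0.95
interval = -mtbf * math.log(target_reliability)

print(f"Estimated MTBF: {mtbf:.0f} h")
print(f"Interval for {target_reliability:.0%} reliability: {interval:.0f} h")
```

A real RCM study would also weigh failure consequences and inspection costs, and would typically use richer lifetime models than the simple exponential assumed here.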
Even greater O&M cost reductions can be obtained with new sensor technologies that enable predictive maintenance. Recently, there has been an explosion of new, low-cost sensors that can be used to implement "just in time" maintenance on equipment that would otherwise soon fail. Currently entering the demonstration phase is a sensor technology for online monitoring of transformer oil that warns of abnormal conditions as they begin. A microelectronic fault-gas analyzer places a small sensor at the tip of a probe that is inserted directly into the insulating oil.
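In principle, such a monitor works by comparing measured fault-gas concentrations in the oil against alarm thresholds. The sketch below illustrates that logic only; the gases, readings, and limits are hypothetical placeholders, not the actual device's values:

```python
# Hypothetical sketch of online fault-gas monitoring: flag any gas whose
# measured concentration in the insulating oil exceeds its alarm limit.
# All gas names, readings, and thresholds below are illustrative.

alarm_limits_ppm = {     # illustrative per-gas alarm thresholds
    "hydrogen": 100,
    "acetylene": 2,
    "ethylene": 50,
}

readings_ppm = {         # hypothetical sensor readings from the oil
    "hydrogen": 140,
    "acetylene": 1,
    "ethylene": 30,
}

alarms = [gas for gas, ppm in readings_ppm.items()
          if ppm > alarm_limits_ppm[gas]]

for gas in alarms:
    print(f"ALARM: {gas} at {readings_ppm[gas]} ppm "
          f"exceeds {alarm_limits_ppm[gas]} ppm limit")
```

The value of such monitoring is that a slowly rising gas concentration can reveal an incipient fault weeks before it would force the transformer out of service.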