Can NERC Juggle All Three En Route to Open Access?
At the year's start, the North American Electric Reliability Council decided to leave its "peer pressure" policy behind and require...
Some believe that small-scale, distributed generation will usher in a new era of magically inexpensive power: Industrial users will run their own cogeneration units. Many residential customers will use some sort of portable, perhaps exotic, power equipment in their homes. Existing, utility-owned, large-scale generating stations will be cast off on the path to ultimate efficiency.
Meanwhile, New England is running out of power this summer. At this writing, three nuclear plants have come off line for various reasons, and a fourth could follow by press time. The situation already threatens widespread brownouts.
Let's revisit the question of dynamic system efficiency over extended periods of time, with particular emphasis on the essential role of the transmission grid and associated computer, control, and communications (3C) systems.
Research at MIT
Industry deregulation, while introducing welcome competition and perhaps lower prices, also harbors some less palatable side effects. First, a power-distribution system with multiple suppliers and customers is much harder to control than a system in which a single utility decides whether to take its own units on and off line. Second, competition among suppliers tightens profit margins because new, smaller, gas-fired units can produce electricity more cheaply than nuclear power plants and older coal plants.
Researchers at MIT have been studying the complex control aspects of a deregulated power system. Such a system is quite different from the traditional control systems developed under regulation: complicated operating strategies are necessary to provide the reliability that customers demand. My colleagues and I at MIT are examining the performance of the existing New England power system under peak summer load scenarios. We have developed some techniques that may help power grid managers identify the best short-term strategies for shedding loads and selectively choosing brownout areas, so as to minimize the extent of compromised service to consumers, as well as the best long-term options for improving future system performance.
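To give a flavor of the brownout-selection problem described above, here is a minimal sketch of one simple approach: greedily shedding the areas that free the most load per unit of service impact until a capacity shortfall is covered. The area names, load figures, and impact weights are invented for illustration, and this greedy heuristic is an assumption on my part, not the actual MIT technique.

```python
def plan_brownouts(areas, shortfall_mw):
    """Greedily shed the lowest-impact areas until the shortfall is covered.

    areas: list of (name, load_mw, impact_weight) tuples, where a higher
    impact_weight means browning out that area hurts consumers more.
    Returns the list of area names selected for brownout.
    """
    # Rank areas by impact per megawatt shed: cheapest service impact first.
    ranked = sorted(areas, key=lambda a: a[2] / a[1])
    shed, covered = [], 0.0
    for name, load_mw, _weight in ranked:
        if covered >= shortfall_mw:
            break
        shed.append(name)
        covered += load_mw
    return shed

# Hypothetical service areas (names, loads, and weights are made up).
areas = [
    ("industrial_park", 120.0, 1.0),  # interruptible contracts: low impact
    ("suburb_a",         80.0, 3.0),
    ("downtown",        150.0, 6.0),  # hospitals, transit: high impact
]
print(plan_brownouts(areas, 150.0))
```

A real grid-management model would of course add transmission constraints, time coupling, and fairness considerations; the point of the sketch is only that ranking candidate brownout areas by impact per megawatt captures the basic trade-off.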
MIT researchers are also examining the value of providing operating reserves for reliable service on hot days or when a large generating unit has to come off line. Those studies may help the industry decide whether to recover some of its stranded costs through appropriate peak-power pricing that will encourage utilities to keep backup power plants in running condition, or whether there are better alternatives for achieving reliability.
If brownouts occur during hot spells this summer, we need to think about the compromises we make when we want cheaper electricity but forget to pay for the reliable service we also want. The MIT studies will play a role in deciding how to keep the advantages of competition, but also how to charge consumers in an equitable way for the reliable service they count on to keep their computers running and to avoid burning out their air-conditioner motors.
How much capacity is adequate for reliability?
Several arbitrary assessments of excess generation capacity in this country are often cited to point out possible inefficiencies in providing generation capacity. However, I do not find these analyses applicable to the eastern United States since they do not consider the locational and temporal dependencies