The Blackout of 2003: Why We Fell Into the Heart of Darkness
to rise faster than economies of scale could reduce them. During the economically stable 1940s, '50s, and early '60s, economies of scale drove down the cost of electric power in both real and absolute terms. With the onset of stagflation, real costs may still have been falling, but in absolute dollars they were not. The energy crisis of the 1970s made matters worse: petroleum prices quadrupled, with consequent impacts on other fuels.
In response, utilities went to regulators for rate increases to cover rising costs and retain their credit ratings. Regulators, influenced as much by politics as by utility financial needs, granted some, but not all, of the rate increases requested. As economic conditions worsened and the public's tolerance for absorbing sharply higher energy costs decreased, the increases granted grew smaller and smaller, while utility bond ratings steadily declined. The relationship between utilities and rate regulators became adversarial.
Exacerbating the situation, utilities were in the midst of major construction programs to upgrade their systems, a response to the 1965 Northeast blackout and to rapid increases in electricity demand. Few reading this article will remember the 1973 National Power Survey prepared by the Federal Power Commission (FERC's predecessor). The survey, completed in response to the 1965 blackout, called on utilities to make significant investments in new generation and transmission plant to ensure adequate electricity supplies and reliable service.
Utilities were in effect complying with the wishes of the FPC, while at the same time meeting resistance from state commissions that had the final say on the rates utilities could charge customers, what they could earn, and thus what they could invest and pay shareholders and creditors. Much of the construction driving utility capital requirements took the form of large nuclear and coal-fired power plants, two very capital-intensive technologies. Rising inflation and high interest rates increased the cost of building these plants. In addition, the cost of these plants would increase as schedules lengthened in response to capital rationing and the technical complexity of introducing new technologies (nuclear in particular) on a massive scale.
As these large projects were completed, regulators grew less willing to incorporate their full costs into the rate base. The 1970s and 1980s consequently saw a plethora of prudency reviews to determine whether new plants were in fact needed, and if so, whether utilities had mismanaged construction, resulting in higher costs than justified. In either case, electric utilities were the losers. Rate increases were granted in amounts smaller than requested, phased in over time, or disallowed altogether.
Consequently, the creditworthiness of electric utilities declined precipitously:
- Bond ratings declined;
- Common stock dropped to below book value;
- Earnings dilution occurred as utilities sold stock to complete construction projects that were beyond the point of no return;
- Utilities began unloading assets to remain liquid. Con Edison's sale of generating plant to the New York Power Authority in 1974 was the first of these sales; and
- Con Edison cut its dividend, introducing turmoil to valuation of utility common stocks where previously stability had been the watchword.
As political and regulatory resistance