Dwindling economic competitiveness has plagued the nuclear power industry for
some years. In the industry's early years, some reactors were completed for less than $100 million. Experience gained overseas (often in projects with American partners) provides sobering evidence that nuclear reactors can still be built at low cost in a short time. But not here in the United States, where rising construction and operating and maintenance (O&M) costs have become industry bywords.
What lies behind this decline?
For the last 20 years, the U.S. nuclear industry has devoted the bulk of its energies to regulatory issues. We have not developed the types of economic analyses that would be taken for granted in the oil, securities, or automobile industries. True, certain notable power-plant improvements in recent years have trimmed fuel costs and boosted plant output, and other moves are now underway to reduce O&M costs. But to prioritize and optimize these cost reductions, we need a better understanding of what has happened to costs in the U.S. nuclear industry.
In this article we present some cost findings drawn largely from publicly available data for 109 U.S. nuclear power reactors. The data sources include such organizations as the Nuclear Regulatory Commission, Energy Information Administration, Federal Energy Regulatory Commission (FERC), Utility Data Institute, Electric Utility Cost Group, and Institute of Nuclear Power Operations. The study applies multivariate, nonlinear regression analysis to O&M costs from 1975 to the present, and to construction costs from 1966 through 1989.
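To illustrate the kind of regression analysis described above, the sketch below fits a log-linear cost model to synthetic data. The model form, variable names (`capacity`, `year`), and coefficient values are all hypothetical assumptions for illustration; the article does not specify its actual model or explanatory variables.

```python
import numpy as np

# Hypothetical sketch: fit a model of the form
#   ln(cost) = b0 + b1*ln(capacity_MW) + b2*(year - 1975)
# to synthetic data for 109 notional reactors. The nonlinearity in
# capacity is handled by log-transforming, which makes the fit an
# ordinary least-squares problem in the transformed variables.
rng = np.random.default_rng(0)
n = 109
capacity = rng.uniform(500, 1300, n)      # assumed plant size, MW
year = rng.integers(1975, 1994, n)        # assumed observation year
true_b = np.array([2.0, 0.6, 0.05])       # assumed "true" coefficients
log_cost = (true_b[0]
            + true_b[1] * np.log(capacity)
            + true_b[2] * (year - 1975)
            + rng.normal(0, 0.1, n))      # noise term

# Design matrix: intercept, ln(capacity), years since 1975
X = np.column_stack([np.ones(n), np.log(capacity), year - 1975])
beta, *_ = np.linalg.lstsq(X, log_cost, rcond=None)
print(np.round(beta, 2))  # estimates should lie close to true_b
```

With 109 observations and modest noise, the recovered coefficients track the assumed ones closely; a real study would of course also report standard errors and goodness-of-fit diagnostics.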