Infrastructure Development: Avoiding the Next Debacle


How to ensure another Chunnel, WPPS, or Big Dig doesn’t happen to you.

Fortnightly Magazine - July 2007

In the 1970s and ’80s, the South Texas Nuclear Project (STNP) and Washington Public Power Supply System (WPPSS) epitomized large utility projects spinning out of control. In the ’80s and ’90s, the Channel Tunnel underwater rail project connecting France and the UK, known as the Chunnel, ran 80 percent over budget plus 140 percent over forecasted financing costs. With revenues that were less than half of projections, the Chunnel was described by Business Week as “a financially disastrous pipe dream.” In the United States, Boston’s Big Dig, the Central Artery/Tunnel Project re-routing downtown traffic into a 3.5-mile tunnel, became America’s most expensive highway project as costs quintupled to more than $14.6 billion in 11 years.

Many large projects, particularly multibillion-dollar, mega-infrastructure public projects, “have strikingly poor performance records in terms of economy, environment, and public support,” concluded Flyvbjerg, Bruzelius, and Rothengatter in their 2003 study Megaprojects and Risk.

Other “spectacular examples of cost overruns” cited by the authors include the Sydney Opera House, with actual costs 15 times higher than projected; the Concorde supersonic airplane, at 12 times original forecasts; and the Suez Canal, where actual construction costs upon completion in 1869 were 20 times higher than the earliest estimates and three times higher than the projection one year prior to construction start-up.

Bankruptcies, bond defaults, arrests, and lawsuits litter the landscape. “Cost overruns and lower-than-predicted revenues frequently place project viability at risk and redefine projects that were initially promoted as effective vehicles to economic growth as possible obstacles to such growth,” Flyvbjerg and his colleagues conclude.

In the electric-power industry during the 1970s and ’80s, time spans from project initiation to commercial operation—or cancellation—extended over decades, and costs topped $5 billion, more than quadruple original estimates. Skepticism by association spread across much of the rest of the electric-power industry, increasing siting difficulties and regulatory pressures on fossil-fuel plants as well as nuclear facilities, as project difficulties opened the door for complaints by opponents of power development.

Present-Day Questions

What may appear to be ancient history has increasing relevance today, as the electric-power industry prepares to meet an expected 40-percent increase in electricity consumption by 2030 by building some 258 GW of new generating capacity—the equivalent of between 250 and 500 new baseload plants, at a cost of more than $400 billion (in 2005 dollars). In addition, with more than half of North America’s electricity generated at coal-fired plants, a huge environmental retrofit program is underway, requiring investment of up to $30 billion for sulfur dioxide (SO2) controls alone, and a total of $50 billion to reduce SO2, nitrogen oxide, and mercury emissions during the next 15 years, according to Cambridge Energy Research Associates (CERA). Including transmission and distribution as well as generation, CERA forecasts that the industry needs at least $900 billion in direct infrastructure investment over the next 15 years, more than the current net book value of all the nation’s power assets.

Additionally, attention needs to be paid to the multibillion-dollar investments utilities have made and will continue to make in upgrading or replacing customer information systems (CIS) and implementing automated meter reading (AMR) systems and infrastructure. These expenditures have been treated as capital, which entitled the companies to include these system costs in the rate base and to receive an agreed-upon rate of return on their investment, subject to some form of prudence or reasonableness review by state public utility commissions or other regulators. To reduce the likelihood or significance of unreasonableness findings in future project reviews mandated by the commissions or driven by legal actions, it will be critical to surface risks in these projects and mitigate them periodically throughout the design and implementation phases.

Lehman Brothers Vice President Melanie Crader, in a recent research note, wrote: “We believe risks associated with a significant capital expenditure cycle are likely to be priced into the (regulated utility) group in coming years. These risks include heightened regulatory risk as companies seek to increase rates for new infrastructure adds, financing risk associated with capital outlays and related free cash-flow deficits, and execution risk associated with construction of the first major cap-ex cycle for regulated utilities since the 1970s.”

And nuclear-power opponents like Paul Gunter, a founder of the anti-Seabrook Clamshell Alliance, signal their continuing opposition: “To ante up for another generation of nuclear power would be a colossal mistake that would really trivialize the Seabrook debacle.”

The bottom line is that companies that take on new generation projects are likely to be under intense financial, regulatory, legal, and political scrutiny. To meet their communities’ power-supply needs with environmental sensitivity and fiscal prudence will require a new level of excellence in managing these projects—a level that some may not be able to achieve.

Important lessons in project management are available from the experience of the 1970s, from major public projects around the world, and from our hands-on experience over the past several decades. However, before examining these lessons, it’s valuable to begin with a review of the actual results produced by the U.S. electric power industry since the 1970s.

Diagnosing the 1970s

It is best to start with an accurate analysis of the actual cost overrun and schedule delays experienced in the 1970s, rather than relying on headlines and project opponents. To document this, the Energy Research and Development Administration commissioned a study in 1976 by the Mitre Corp. of projected versus actual costs for both nuclear and coal-fired power plants.

The analysis compared costs for plants with common start-of-operation dates from 1967 through 1975 in order to evaluate factors like inflation, project-cost escalation, longer construction times, and tightening environmental standards that changed during that period, and to allow for differences in construction time for coal versus nuclear plants.

On a “looking ahead” basis comparing the projected and actual costs-per-kilowatt of plants completed in the same years, the cost overrun ratio for coal plants ranged from 1.43:1 to 1.69:1, averaged 1.6:1, and generally declined over five years. Nuclear plant construction times ranged from 5.8 to 7.8 years and, unlike examples of costs quadrupling, experienced actual-to-projected cost ratios ranging from 1.37:1 to 2.15:1, with an average of 1.78:1. The nuclear plants’ average construction costs-per-kilowatt of capacity increased 57 percent to $340/kW during the five years, while the coal plants’ average cost rose 31 percent to $222/kW.
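The ratio arithmetic reported in the Mitre study can be sketched in a few lines. The per-plant dollar figures below are hypothetical, invented for illustration; only the definition of the metric (actual cost-per-kilowatt divided by projected cost-per-kilowatt) comes from the discussion above.

```python
def overrun_ratio(projected_per_kw: float, actual_per_kw: float) -> float:
    """Actual-to-projected cost ratio; e.g., 1.6 means a 60 percent overrun."""
    return actual_per_kw / projected_per_kw

# Hypothetical sample of coal plants as (projected, actual) in $/kW.
coal_plants = [(130, 190), (150, 225), (170, 250)]

ratios = [overrun_ratio(p, a) for p, a in coal_plants]
average = sum(ratios) / len(ratios)

print([round(r, 2) for r in ratios])  # per-plant overrun ratios
print(round(average, 2))              # fleet-average overrun ratio
```

Averaging the simple per-plant ratios, as here, weights each plant equally; a capacity-weighted average would differ, and the Mitre write-up does not specify which was used.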

The Mitre analysis identified several factors among the reasons for increases in the ratios of actual to projected costs during that period, including:

• Inadequate database and insufficient engineering studies for early nuclear power-plant cost estimates;
• Inflationary pressures not present in the ’60s and, hence, not accounted for in early projections;
• Underestimation of construction manpower costs, especially for nuclear plants;
• Construction delays and additional equipment introduced by environmental and safety regulations; and
• Unforeseen problems in going to larger plant sizes.

Three years later, a RAND Corp. study put energy plants into the context of the challenges facing all major projects at that time. It found that actual costs for energy projects typically were more than 250 percent of estimates. Major construction projects followed at 218 percent of estimates; 1960s weapons systems were delivered at 140 percent of projected costs, and highway projects came closest to budget at 126 percent.

The RAND analysis found that the reliability of the estimates studied primarily was influenced by:

• How well the project was defined;
• Whether new techniques were to be employed; and
• The purpose of the estimate—cost-plus defense contract bids often were 20 percent to 50 percent below the bidder’s own best internal estimate of costs.

It also identified four factors as having the largest influence on actual cost outcomes:

• Externally imposed scope changes and repeated design changes;
• Deviations from an appropriate construction schedule;
• Management and organization deficiencies, magnified by fragmented or unclear management responsibility; and
• Outside factors such as regulation and unanticipated inflation.

The lessons of the ’70s point in the right direction for the coming electric power-plant build. However, despite several decades of development in project management tools and strategies, the majority of projects initiated by organizations of all kinds are completed late or over budget, are not completed at all, or are completed but fail to fulfill the original business requirements.

As Public Utilities Fortnightly reported in its May 2007 issue, cost estimates for individual non-nuclear projects have doubled or more between initial estimate and construction approval.

But projects are not missing targets only in the electric-power industry. In the Construction Management Association of America’s 2005 survey, project owners reported that construction projects were taking between 27 percent and 57 percent longer than planned in design, documentation, programming, construction, and commissioning.

Employing Effective Risk Management

By their nature, projects are “one of a kind.” Lessons from the ’70s and experience since then clearly show that the root causes of late or over-budget project deliveries or catastrophes are in how an organization performs project management.

To address the entire range of factors that can affect a project, this is best thought of in terms of project-risk management: identifying, addressing, and mitigating risks related to the management, execution, and control of projects.

The organizational shortcomings in project-risk management that we see across much of our daily practice lead us to estimate that as many as 40 percent to 50 percent of major construction projects will exceed their original schedules and incur significant cost overruns. The resulting risk events (financial, reputational, legal, or regulatory) will challenge even the most seasoned management teams and boards as they address the project-related issues.

The other side of that coin, however, is that a methodology based on sound project risk-management practices will focus on the processes, procedures, and controls used to manage large capital projects, and as a result create true value for their companies. Following this approach, an organization will have the means to identify process weaknesses and to strengthen them proactively, with the goal of preventing the occurrence of schedule slippage, cost overruns, and defective quality. For companies that are innovators of best project execution practices, the results are significant and lasting:

• Assurance that contractual obligations are appropriately fulfilled, including any joint-venture arrangement;
• Assurance of the adequacy and effectiveness of performance measures to properly control (track and monitor) costs, quality, schedule (timeliness of completion), and safety standards;
• Assurance that construction processes are conducted in adherence to established policies and procedures; and
• Assurance that adequate anti-fraud measures are in place to reduce the risk of fraud, waste or abuse.

Key Risk Areas

The organization or owner must analyze the risks posed by a specific project at two levels—project implementation and project infrastructure design. Process-classification tools of all sorts exist, many based on a generic American Productivity and Quality Center model. Protiviti’s customized databases for the energy and utility industries include more than 1,000 processes and sub-processes, associated risks, and the necessary controls that occur during project execution.

A rundown of key project implementation risks from the owner’s perspective should include elements listed above.

Also critical to the success of a project is its infrastructure design—the deployment of the essential components that need to be in place and linked together to pave the way for continuously improving project process performance. Without alignment, comprehensive and value-driven project risk management capabilities are difficult to attain.

If a component is deficient, the process will not achieve the strategic objective, people may be unable to perform, reports will not provide information for effective management, analysis will be inadequate, and sufficient information will not be available for reporting and analysis.

Energy & Utilities Industry Application

Protiviti has distilled a list of the 10 most commonly addressed tactical focus areas, along with concrete steps that can be taken in each area (see Table 1, “Top 10 Big-Build Mistakes”).

These proven methodologies can underpin an organization’s focus on excellence in project-execution performance, while driving best-in-class objectives and continuous process improvement. Information and analysis on the effectiveness of construction management performance is used as a basis for cost-benefit analyses and for proposed changes to the organization’s project-management processes and construction management execution. With a thorough understanding of the key processes driving projects, the associated risks and controls, and reliable measurement and monitoring of progress, management will be in a position to avoid the fates of a WPPSS, Chunnel, STNP, or the Big Dig and to make a notable contribution to meeting the power-demand, environmental, and investment needs of the coming decades.