The U.S. government has focused on energy and electricity as part of the solution to some of the toughest challenges facing the United States, including climate change, energy security, infrastructure investment and the need to invigorate the American job market. The American Recovery and Reinvestment Act is helping to fund these efforts and includes more than $16 billion for the DOE Office of Energy Efficiency and Renewable Energy’s (EERE) programs and initiatives. The Obama administration also has called for reducing CO2 emissions to 1990 levels by 2020, with a further 80-percent reduction in these emissions by 2050.
Other federal and state legislative initiatives call for an increased focus on energy efficiency and peak demand management. The American Clean Energy and Security Act (ACES), introduced in the House in June 2009, would have established a standard requiring utilities to meet an increasing percentage of their power supply through a combination of energy-efficiency savings and renewable energy initiatives (e.g., 6 percent in 2012, 9.5 percent in 2014, 13 percent in 2016, 16.5 percent in 2018, and 20 percent in 2021 through 2039).
Efficiency measures can be applied throughout the electricity value chain, from generation through power delivery, as well as during the time electricity is being used. The industry has a good understanding of efficiency options, costs and impacts in power plants through heat-rate improvements, and at customer sites through the efficiency programs utilities conduct. However, there is no clear understanding of the cost, service, power-quality or reliability impacts of applying efficiency measures to transmission and distribution (T&D).
A kilowatt-hour saved in generation, power delivery or through end-use efficiency measures is a kilowatt-hour that does not have to be generated; it helps reduce fuel costs and carbon emissions and, if measured and verified, might count toward expected efficiency mandates. The electric industry is itself the largest single user of electric power, consuming approximately 12 to 15 percent of the electricity produced through auxiliary loads in generation plants and substations and through losses occurring in T&D.
The cost of saving a kilowatt-hour of energy through transmission or distribution efficiency measures might be lower than saving the same kilowatt-hour through end-use efficiency measures. There is a need for measuring, managing, comparing and understanding all efficiency opportunities across the end-to-end electricity value chain, including T&D efficiency.
The losses across transmission and distribution equate to approximately 300 million MWh based on a U.S. annual generation total of 4,157 million MWh. That is roughly equivalent to the electricity needed to power approximately 29 to 35 million homes (Electric Power Industry 2007: Year in Review, EIA report released Jan. 21, 2009). Reducing the losses by 20 percent would result in a reduction equivalent to the electricity needed to power 6 million to 7 million homes.
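The homes-equivalence figures above can be sanity-checked with simple arithmetic. The household consumption range used here (roughly 8,500 to 10,500 kWh per home per year, in line with EIA residential averages for the period) is an assumption for illustration:

```python
# Back-of-the-envelope check of the T&D loss figures and homes equivalence.

total_generation_mwh = 4_157_000_000   # U.S. annual generation, MWh
td_losses_mwh = 300_000_000            # approximate annual T&D losses, MWh

loss_fraction = td_losses_mwh / total_generation_mwh
print(f"T&D losses: {loss_fraction:.1%} of generation")

# Assumed average household use (kWh/year) -- illustrative, not from the article.
for kwh_per_home in (8_500, 10_500):
    homes = td_losses_mwh * 1_000 / kwh_per_home
    print(f"At {kwh_per_home} kWh/home-year: ~{homes/1e6:.0f} million homes")

# A 20-percent reduction in those losses:
saved_mwh = 0.20 * td_losses_mwh
print(f"20% loss reduction saves {saved_mwh/1e6:.0f} million MWh "
      f"(~{saved_mwh*1000/10_000/1e6:.0f} million homes at 10,000 kWh/yr)")
```

The result lands at roughly 7 percent of generation and 29 to 35 million homes, consistent with the ranges cited above.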
Ultimately, electric energy is lost in the form of heat. The current-carrying components get hot as they deliver energy to the consumer and the thermal energy is dissipated to the surroundings as losses. To offset the losses, utilities have to generate more power, which usually results in burning additional fuel and increasing the carbon footprint. Since these losses are inherent in the process, they can’t be reduced to zero, but might be reduced significantly with present technology (see Figure 1).
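The resistive heating described above follows the familiar I²R relationship, which is why operating voltage matters so much for losses. The line parameters below (power level, resistance, power factor) are illustrative assumptions, not figures from the article:

```python
import math

# Resistive (I^2 * R) losses in a conductor: the heating described above.
# All numbers are illustrative assumptions.

def line_losses_kw(power_mw, voltage_kv, resistance_ohm, power_factor=0.95):
    """Three-phase line loss (kW) for a given delivered power."""
    # Line current (A) in a three-phase system: I = P / (sqrt(3) * V * pf)
    current_a = power_mw * 1e6 / (math.sqrt(3) * voltage_kv * 1e3 * power_factor)
    # Losses across all three phases: 3 * I^2 * R, converted to kW
    return 3 * current_a**2 * resistance_ohm / 1e3

# The same 100 MW delivered at two voltages over a line with 5 ohms per phase:
for kv in (115, 230):
    print(f"{kv} kV: {line_losses_kw(100, kv, 5.0):,.0f} kW lost")
# Doubling the voltage halves the current -- and quarters the I^2*R losses.
```

This is why items such as voltage upgrades appear later in the article as loss-reduction options: for the same delivered power, losses fall with the square of the operating voltage.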
Additionally, efficiency is more than simply minimizing losses. An efficient system should reduce losses, increase utilization of existing T&D assets and enable smart integration of renewable and storage technologies. While we might not be able to achieve optimal efficiency of the T&D system overnight, we must understand the technology and operational options available today to increase overall efficiency. Whenever attempting to change or upgrade the system, efficiency must be assessed together with all other expected benefits of the upgrade such as reliability, capacity and growth.
A clear understanding of the magnitude of T&D losses is the first step in improving system efficiency. This can be achieved by putting in place a system for accurate energy efficiency accounting.
The industry lacks standard methods and a protocol to identify and quantify T&D losses or evaluate the numerous methods for reducing those losses. Having a consistent and uniform protocol would help utilities identify and compare cost-effective loss-reduction options. It also would allow utilities to document energy savings resulting from power delivery efficiency improvements so they can be assessed easily and credited toward possible energy-saving quotas.
As a highly efficient T&D system is built out, sensors, communications, data management, visualization, and control are key enablers to achieving and improving efficiency. These technologies will help provide data that can be used to identify where the losses are, mitigate the losses and increase the overall efficiency of the electrical system. Establishing a baseline of the present system efficiency is a prerequisite for efficiency improvements to be applied to possible renewable and efficiency standards.
The implementation of technology-based solutions for reducing losses and improving overall system efficiency requires utilities to study and assess not only the technologies, but their T&D systems. Utilities need a comprehensive evaluation methodology and strategic planning framework to accomplish this.
In 2008, the Electric Power Research Institute (EPRI) took a step towards demonstrating T&D efficiency and launched a distribution efficiency initiative, Green Circuits, to analyze and apply efficiency measures on numerous distribution circuits, with the goal of assessing real-life costs and benefits and identifying technical feasibility and obstacles. Preliminary results show that distribution losses can be reduced by 5 to 10 percent over the next 10 to 15 years by using highly efficient components when performing system upgrades. Overall efficiency also can be improved by changing operating standards and controlling voltages within a more optimal band to lower both overall losses and end use of energy. Other options include reconductoring, phase balancing and capacitor placement, to name a few.
The Green Circuits distribution project has identified two major focus areas for potential improvement: more efficient transformers and voltage optimization.
Studies have shown that one of the most significant contributors to distribution losses is no-load transformer core loss. To address this situation, the U.S. Department of Energy (DOE) issued a ruling on minimum efficiency for distribution transformers, which became effective Jan. 1, 2010. Before finalizing the ruling, DOE conducted an extensive analysis of the tradeoffs between energy savings and transformer costs, covering all practical material options, including amorphous metal.
DOE estimates the standards will save approximately 2.74 quads [quadrillion British thermal units (Btu)] of energy over 29 years (2010 through 2038). This is equivalent to all energy consumed by 27 million American households in a single year.
Amorphous metal transformers (AMT) were developed in the United States under an EPRI program with General Electric (GE) in the early 1980s. Amorphous metals are a class of material with a random atomic structure, unlike regular metals, which are crystalline. Transformers built with these materials have about one-third of the core losses of regular silicon-iron core transformers, resulting in highly energy-efficient units. AMTs are slightly more expensive but have significantly lower operating costs than conventional units do, resulting in lower life-cycle (LC) or total ownership costs (TOC). Over the following 15 years, more than 500,000 units were installed in the United States with very satisfactory field experience. In the late 1990s, the demand for this product disappeared as restructuring (i.e., deregulation) set in, and all manufacturers abandoned production of these units in the United States. However, the product has been very popular in India, China, Japan (in descending order of installations) and other countries.
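The life-cycle comparison behind the AMT case can be sketched with the loss-evaluation approach utilities commonly use, in which a transformer's total ownership cost is the purchase price plus capitalized loss costs (A dollars per watt of no-load loss, B dollars per watt of load loss). The prices, loss figures and A/B factors below are hypothetical illustrations, not figures from the article:

```python
# Total ownership cost (TOC) comparison: conventional silicon-steel unit vs.
# an amorphous metal transformer (AMT). TOC = price + A * no-load loss + B * load loss,
# where A and B ($/W) capitalize the lifetime cost of core and winding losses.
# All figures below are illustrative assumptions.

def total_ownership_cost(price, no_load_loss_w, load_loss_w, a=5.0, b=1.0):
    """TOC in dollars; a and b are capitalized $/W loss-evaluation factors."""
    return price + a * no_load_loss_w + b * load_loss_w

# Hypothetical 25-kVA pole-top units; the AMT has ~1/3 the core (no-load) loss:
conventional = total_ownership_cost(price=900, no_load_loss_w=75, load_loss_w=300)
amt = total_ownership_cost(price=1100, no_load_loss_w=25, load_loss_w=300)

print(f"Conventional TOC: ${conventional:,.0f}")
print(f"AMT TOC:          ${amt:,.0f}")
```

With these assumed inputs the AMT's higher purchase price is roughly offset by its lower capitalized core losses, which mirrors the article's characterization of the technology as marginally cost effective at today's fuel prices; a higher A factor (dearer energy or a carbon price) tips the comparison further toward the AMT.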
As a result of the DOE ruling, there’s a renewed interest in the United States for AMTs. They’re a solution for U.S. utilities working to improve distribution system efficiencies, reduce their carbon footprints, and meet or exceed other environmental goals. This technology might be marginally cost effective at today’s fuel price without monetizing carbon. However, future scenarios that might include higher fuel costs and a carbon price would increase the effectiveness of this technology.
By 2038, DOE expects the energy savings from the standards to eliminate the need for six 400-MW power plants (2,400 MW) and to avoid 238 million tons of carbon dioxide (CO2) emissions. Using a 3-percent discount rate, the cost of the standards is $460 million a year in increased equipment and installation costs, while annualized benefits are $904 million a year in reduced operating costs.
However, had AMT been the standard, energy savings would have been 7.37 quads, the CO2 reduction 674 million tons, and the generation eliminated 7,200 MW—roughly triple the benefit of the current ruling.
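The annualized figures above imply a net benefit stream that can be converted to a present value with the standard annuity formula, using the 3-percent rate and 29-year horizon DOE states. The treatment of the figures as level annual streams is a simplifying assumption for illustration:

```python
# Net benefit implied by the DOE figures, and its present value over the
# 2010-2038 analysis period at the stated 3% discount rate.
# Assumes level annual streams (a simplification).

annual_cost = 460e6      # $/yr, increased equipment and installation costs
annual_benefit = 904e6   # $/yr, reduced operating costs
years = 29
rate = 0.03

net_annual = annual_benefit - annual_cost
# Present value of an annuity: PV = A * (1 - (1 + r)^-n) / r
pv = net_annual * (1 - (1 + rate) ** -years) / rate
print(f"Net benefit: ${net_annual/1e6:.0f}M/yr; "
      f"PV over {years} years ~ ${pv/1e9:.1f}B")
```

Under these assumptions the standards return roughly two dollars of operating-cost savings for every dollar of added equipment cost.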
Optimizing voltage also will yield substantial energy savings. Losses and end-use consumption are reduced when the voltage along the feeder is managed to be within the lower end of the standard supply service voltage band. This practice is called voltage optimization, or conservation voltage regulation (CVR). For many years, utilities have used voltage regulation to reduce demand during periods of peak consumption. While not a new concept, the practice of voltage regulation is now seeing renewed interest from utilities for both peak reduction and energy savings during non-peak periods.
The results of the Distribution Efficiency Initiative Study validated by EPRI’s Green Circuits distribution project indicate 1- to 3-percent energy savings, a 2- to 5-percent demand reduction, and a 5- to 10-percent VAR reduction can be achieved through voltage optimization. Assuming an adoption rate in the range of 25 to 50 percent of residential distribution substations, voltage optimization could achieve an approximate annual savings range of 4 million to 28 million MWh by 2030. Voltage optimization can be fine-tuned further with the deployment of smart-grid systems using the data from advanced metering infrastructure (AMI) systems to measure and determine the minimum service voltages, eliminating assumptions and potentially producing an additional 1- to 2-percent energy savings.
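The CVR effect is commonly quantified with a CVR factor: the percent change in energy use per percent change in voltage. The factor and feeder figures below are illustrative assumptions chosen to be consistent with the savings ranges reported above:

```python
# Conservation voltage regulation (CVR) savings, expressed via the commonly
# used CVR factor: percent energy change per percent voltage change.
# The factor (0.8) and feeder size are illustrative assumptions.

def cvr_energy_savings(annual_mwh, voltage_reduction_pct, cvr_factor=0.8):
    """Energy saved (MWh/yr) for a sustained voltage reduction on a feeder."""
    return annual_mwh * (voltage_reduction_pct / 100) * cvr_factor

# A feeder serving 50,000 MWh/yr, with voltage lowered 2.5% while staying
# within the standard service-voltage band:
saved = cvr_energy_savings(50_000, 2.5)
print(f"Estimated savings: {saved:,.0f} MWh/yr "
      f"({saved/50_000:.1%} of feeder energy)")
```

A 2.5-percent reduction at a CVR factor of 0.8 yields 2-percent energy savings, squarely within the 1- to 3-percent range the study reports; AMI data that confirms actual minimum service voltages lets operators push deeper into the band with confidence.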
During 2009, a series of domestic and international workshops was held to discuss the state of transmission efficiency and options for improvement. The workshops brought together about 320 industry stakeholders, including transmission owners and independent system operators, vertically integrated utilities, the vendor community, trade organizations such as the Edison Electric Institute (EEI), regulatory bodies such as the Federal Energy Regulatory Commission (FERC), the North American Electric Reliability Corporation (NERC) and various state public utility commissions, and members of academia, research organizations and the media.
Transmission presently is operated first to meet reliability standards and second to meet market needs. As such, loss reduction is typically a secondary consideration rather than a primary optimization objective. However, as a result of the series of workshops and in conjunction with an executive council, EPRI is developing a framework to include efficiency in the operational strategies of transmission owners, independent system operators and fully integrated utilities. This framework is based on the following principles:
• Efficiency is more than simply reducing losses: An efficient system is low in losses, but it also increases utilization of existing transmission assets and enables smarter integration of renewable and storage technologies.
• Efficiency initiatives require that reliability remains a primary focus: There are technologies and practices available that increase the efficiency of the transmission system while maintaining or enhancing reliability.
• Efficient transmission will be built on the shoulders of new and upgraded systems: More transmission is essential for enabling renewable resource integration, improving reliability and achieving optimum efficiency. Sensors, communications, and using data to achieve greater control are key enablers for achieving and improving efficiency.
• Efficiency must be included in future business cases: Proposed transmission improvement projects for capacity and voltage stability improvements—as well as transmission improvements to connect to such clean and innovative energy technologies as renewable resources and storage—must include efficiency considerations as part of a comprehensive energy delivery resource plan.
• A regulatory framework with incentives can lead to efficiency improvements: To incentivize transmission efficiency, revisions to the regulatory framework might be required.
Using this framework, EPRI will facilitate information and experience sharing for evaluating methods and strategies for improving efficiency and reducing transmission losses. It’s the foundation for an emerging demonstration initiative focused on transmission efficiency and provides a consistent methodology for quantifying criteria such as measurement and verification (M&V), cost-benefit analysis, total losses reduced and CO2 reduced.
The aim is to create the same level of understanding that already exists for potential efficiency improvements in generation and end use. These demonstration projects can help verify and establish M&V protocols and metrics for quantifying the improvements, and serve as the basis for efficiency accounting.
In August 2009, an executive committee of industry stakeholders recommended moving forward with the development of an industry-wide transmission efficiency demonstration that will identify technologies leading to greater efficiency in the bulk power system, an increase in system utilization and a reduction in line and equipment losses. The initiative’s goal is to understand efficiency improvements and apply that knowledge when operating the grid and when upgrading existing transmission lines or building new lines.
The workshops identified three focus areas for improving efficiency:
1. Reduce System Losses: Efficiency and loss reduction must be viewed from an overall system impact perspective. There are numerous approaches to reducing system losses. These include increasing nominal voltage (e.g., new lines or voltage upgrades), dispatch considerations to relieve flows from overloaded or higher loss lines to less congested and lower-loss lines, coordinated voltage control across the system to reduce VAR flow, and other means of power flow control.
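The dispatch idea in item 1 can be illustrated with a minimal example: splitting flow between two parallel paths to minimize total I²R losses. For path resistances R1 and R2, total loss R1·I1² + R2·(I−I1)² is minimized at I1 = I·R2/(R1+R2). The numbers are illustrative; in a purely parallel circuit the current divides this way naturally, and the point of power flow control in a meshed grid is to steer flows toward such a split when network impedances and constraints would otherwise force a lossier division:

```python
# Loss-minimizing split of a flow between two parallel paths.
# Illustrative numbers; r1 and r2 are path resistances in ohms.

def optimal_split(total_current, r1, r2):
    """Current split minimizing r1*i1^2 + r2*i2^2 subject to i1 + i2 = total."""
    i1 = total_current * r2 / (r1 + r2)
    return i1, total_current - i1

def losses(i1, i2, r1, r2):
    return r1 * i1**2 + r2 * i2**2

total, r1, r2 = 1000.0, 2.0, 6.0   # amps, ohms

even = losses(500, 500, r1, r2)          # naive even split
i1, i2 = optimal_split(total, r1, r2)
opt = losses(i1, i2, r1, r2)             # loss-minimizing split
print(f"Even split loss: {even/1e6:.2f} MW; optimal split loss: {opt/1e6:.2f} MW")
```

In this example, shifting flow toward the lower-resistance path cuts losses by 25 percent for the same delivered current, which is the system-level effect redispatch and coordinated power flow control aim for.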
2. Reduce Line and Equipment Losses: Electricity providers are studying numerous methods to reduce losses from lines and equipment components as a way to improve overall transmission efficiency. The key contributors to transmission losses are the lines and the substation equipment. The transformer is the principal loss contributor within the substation. Electricity providers are investigating low-loss lines and configurations, low-loss transformers and auxiliary equipment. Superconductivity also might be applicable in some cases.
3. Increase Line and System Utilization: Efficient transmission seeks to optimize utilization of assets and resources including right-of-way, materials, labor, time and dollars. As the industry retires older, less-efficient assets and builds new higher-voltage, more-efficient systems, adding storage and control technologies will allow greater throughput on existing corridors and enable integration of higher levels of renewable resources.
“This is a timely initiative in that it comes as the government and industry work together to improve the efficiency of the transmission system,” said FERC Chairman Jon Wellinghoff. “Clearly, implementing new technologies on the bulk power system would benefit both the industry and consumers, while at the same time, reducing the sector’s carbon footprint.”
Technology isn’t the only limiting factor in improving the efficiency of our electric system.
Utilities need incentives to implement efficiency measures, whether for end use or delivery. Just as society has recognized the importance of end-use efficiency, it might find power delivery efficiency gains equally valuable, which could lead to similar incentive mechanisms.
Efficiency can either offset or defer new generation and delivery assets investments. There are regulatory instruments in place to recoup generation or transmission investments. The same holds true for many end-use efficiency programs. However, these instruments might be missing for T&D efficiency investments.
These industry demonstrations are intended to increase awareness, identify technology options, develop comprehensive measurement and verification protocols and validate the potential savings, which might lead to valuation of efficiency improvements in power delivery.
The bottom line is that the industry needs to look at all opportunities to use electricity more efficiently. There’s significant opportunity to improve the energy efficiency of the electricity sector, including the transmission and distribution system. To unlock that potential, we need enabling technologies, industry demonstrations and a supporting regulatory framework, which together can help realize a lower-carbon future.