The smart grid might not be a done deal, but the U.S. electric utility industry seems to have reached a conclusion about it. In short, we have seen the future of the electric system, and it works—intelligently.
In just a few years, the smart grid has advanced from a pie-in-the-sky notion to an industry bandwagon. Across the country, utilities large and small are touting investments in transmission and distribution (T&D) automation, and calling them part of a smart-grid strategy.
Of course, this investment trend isn’t really new. For a couple of decades, utilities have gradually built more automation into their T&D systems. But the trend has evolved into something more. The smart grid has become a vision for the industry’s future—a vision that both executives and lawmakers have recognized as a politically neutral, forward-looking technology trend. As a result, it’s gaining momentum not just in terms of investment plans, but also in public-policy processes.
Most notably, the Energy Policy Act of 2005 called for the development of smart-grid standards, and directed state regulators and utilities to consider, among other things, time-based rate schedules. Then, Title XIII of the Energy Independence and Security Act, signed into law last December, moved the ball further by authorizing up to $100 million per year for smart grid initiatives, and establishing matching federal funds for up to 20 percent of qualifying smart-grid investments.
Spurred in part by the U.S. Congress, there’s movement at the state level as well. Utility commissions across the country have been examining smart-grid-related filings, from basic cap-ex plans to innovative rate treatments for advanced metering programs. For example, in May 2008 state legislators in Ohio endorsed rate-base treatment for smart-grid investments, and supported electric utility revenue decoupling in the name of energy efficiency. In several states, decoupling allows utilities to recover not only the cost of implementing energy efficiency and peak demand reduction measures, but also the revenues they forfeit once the measures go into effect (see Commission Watch table, “Revenue Decoupling in the States”).
For many smart-grid proponents, these developments are great news, because they provide solid policy support for many of the central benefits of system automation. But for all this progress, the smart-grid vision still remains fuzzy for many executives and regulators. Every investor-owned utility—not to mention every cooperative or public power company—faces a different operational and strategic situation, so the smart grid means something different for each utility.
Also, no application to date has provided a complete, working example of the smart grid in operation, and the technical, business and regulatory implications of the smart grid are unfolding too quickly for a single project to adequately demonstrate. Thus executives and regulators remain uncertain when they’re asked to invest in a smart-grid strategy. In short, the industry likes to think the future grid will work intelligently, but we’re not really sure what that means.
In this special report, Public Utilities Fortnightly highlights five recent initiatives, each of which clearly demonstrates one or more aspects of this quickly developing thing called the smart grid.
Two studies conducted in 2007 by the Department of Energy’s Pacific Northwest National Laboratory (PNNL) demonstrated that smart-home appliances will go a long way in helping utilities improve grid operations.
The first, known as the Olympic Peninsula project, garnered most of the early headlines because it was one of the first in the country to demonstrate that individual homeowners will readjust their energy-use patterns when given real-time price signals via in-home information technology.
Some 112 homeowners on the state of Washington’s Olympic Peninsula received new, two-way meters and thermostats, along with water heaters and dryers outfitted with special software that allows the homeowner to customize the devices to the desired level of comfort or economy.
Once the settings were established, the devices automatically responded to electricity price signals that were updated every five minutes. A commercial building, municipal water pumping system and a small amount of distributed generation capacity also responded to the same pricing signals.
During peak periods when electricity was most expensive, the software automatically adjusted thermostats or appliances to pre-set response limits established by each homeowner. Combining the demand response with distributed generation reduced peak distribution loads by 50 percent. Further, participants who responded to the real-time prices reduced their peak-power use by 15 percent.
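The control logic in the pilot can be sketched in rough terms: every five minutes, compare the incoming price to a baseline and nudge the appliance toward economy, but never past the homeowner’s pre-set limit. The function below is a minimal illustration of that idea; the one-degree-per-50-percent-premium rule and all the numbers are assumptions for illustration, not the project’s actual algorithm.

```python
# Hypothetical sketch of price-responsive setpoint control, in the
# spirit of the Olympic Peninsula pilot. All names and numbers are
# illustrative assumptions, not the project's actual logic.

def adjusted_setpoint(base_setpoint_f: float,
                      comfort_limit_f: float,
                      price_per_kwh: float,
                      normal_price: float) -> float:
    """Raise the cooling setpoint in proportion to how far the current
    price exceeds the normal price, capped at the homeowner's pre-set
    comfort limit."""
    if price_per_kwh <= normal_price:
        return base_setpoint_f  # power is cheap: no adjustment
    # One degree of setback per 50-percent price premium (illustrative).
    premium = (price_per_kwh - normal_price) / normal_price
    setback = 2.0 * premium
    return min(base_setpoint_f + setback, comfort_limit_f)
```

At the normal price the setpoint is unchanged; at double the normal price a 72-degree setpoint drifts to 74; at extreme prices the homeowner’s comfort limit caps the response.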
The Olympic Peninsula project yielded remarkable results, but operations professionals might be more intrigued by the second pilot—which received less public attention.
Rather than focusing on price signals, the Grid Friendly Appliance (GFA) Project demonstrated that everyday household appliances fitted with electronic controllers capable of sensing under-frequency stress on the power grid could have an equally profound effect on network operations.
PNNL developed a small electronic controller and embedded it in 150 new Whirlpool clothes dryers and 50 existing residential water heaters in 150 Washington and Oregon homes, to detect and respond to low-frequency stress on the grid.
The controller’s frequency threshold was set at 59.95 Hertz—high enough to recognize frequent, shallow frequency excursions of a 60-Hz AC voltage signal available at any residential outlet. The controller automatically shut off the dryer or water heater’s heating element within one-quarter of a second after a sudden drop in frequency.
The hypothesis was that an army of controllers collectively could contribute to frequency protection on the grid—typically a substation function, where under-frequency relays are pre-set to shed feeder loads when low-frequency thresholds are crossed.
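In rough terms, the controller’s job reduces to a threshold comparison with hysteresis. The sketch below uses the pilot’s published 59.95-Hz shed threshold and models the rest; the restore threshold and class structure are illustrative assumptions, not PNNL’s actual firmware.

```python
# Minimal sketch of a Grid Friendly Appliance-style controller: shed
# the heating element when measured grid frequency drops below the
# threshold, restore it once frequency recovers. The 59.95-Hz shed
# threshold comes from the pilot; the restore band is an assumption.

SHED_THRESHOLD_HZ = 59.95     # threshold used in the GFA pilot
RESTORE_THRESHOLD_HZ = 59.97  # assumed hysteresis band (illustrative)

class GridFriendlyController:
    def __init__(self):
        self.heating_enabled = True

    def on_frequency_sample(self, hz: float) -> bool:
        """Update state from one frequency measurement taken at the
        wall outlet; return whether the heating element may run."""
        if hz < SHED_THRESHOLD_HZ:
            self.heating_enabled = False  # shed within a quarter second
        elif hz >= RESTORE_THRESHOLD_HZ:
            self.heating_enabled = True   # excursion over: restore load
        return self.heating_enabled
```

The hysteresis band keeps the appliance from chattering on and off while the frequency hovers near the threshold.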
“We turned an everyday household appliance into an ally,” says Rob Pratt, PNNL program manager for both projects. “Utilities typically maintain a 5-percent spinning reserve, and white goods like dryers and water heaters account for 20 percent of electrical demand at any point in time. If every appliance in a service territory had a controller built in, you would have what amounts to a 20-percent safety cushion. That, in theory, could help reduce a utility’s spinning-reserve requirement.”
A total of 358 under-frequency events lasting from several seconds to 10 minutes were observed. The interruptions, typically momentary, went unnoticed by consumers, and the study found that the controllers have the technical capacity to act as a shock absorber for the grid, preventing or reducing the impact of power outages.
Addition Through Subtraction
The GFA results are interesting from a number of perspectives. For example, the controller works autonomously, so all that’s needed is a wall outlet. Because its under-frequency threshold is higher than the threshold at a substation, the controller theoretically could reduce the chance of a feeder blackout by anticipating and responding to an under-frequency situation first, before it reaches and triggers the substation relays.
The protection results in little or no inconvenience for the appliance owner, whereas a substation relay action creates outages for many customers on the downstream feeder. And such an arrangement certainly would be preferable to curtailing designated commercial and industrial loads at the utility’s expense.
“In a sense, you’re buying electricity from a demand-response network, not a peaking reserve unit,” Pratt says. “This could reduce the need for load-following power to regulate the frequency of the grid, which fluctuates daily. Why burn a lot of fuel to follow that load during what is usually a 10- to 60-second event?”
Finally, such an arrangement could make it easier for utilities to rely on renewable energy sources like wind and solar farms. As an example, Pratt points to a February drop in frequency on Texas’s transmission grid related to a shortfall in output from the state’s wind projects.
“They couldn’t bring up their spinning reserve fast enough,” Pratt says. “They had a sledgehammer approach in that they had to interrupt targeted customers. Whether it’s a price dispatch approach, or a circuit board built into a home appliance, we’re proposing to do the same thing, but with finesse.”
A major utility in the Midwest has been experimenting with smart-grid technologies since the mid-1990s, and today it uses those technologies to get the most out of its old-tech infrastructure. (Editor’s note: As this issue was going to press, the utility’s lawyers intervened and requested we not name the company in this story because of pending litigation and ratemaking issues. As the company’s spokesman explained, they’re facing disputes over such basic things as the definition of “smart grid,” and whether the company’s grid already is smart enough to do what’s necessary to serve its customers.) The company uses Cellnet+Hunt UtiliNet communications devices to help monitor and operate the 34-kV sub-transmission system that feeds its distribution substations.
A web of roughly 3,500 Cellnet wireless devices is programmed to carry out a variety of functions and transmit real-time operations data back to the utility’s SCADA system. Each device sends its signals via a mesh network of Cellnet transmitters strategically located throughout the utility’s service territory.
The utility has 900 automated switches on the 34-kV system, and each has the ability to reconfigure itself to restore service during an outage, and report its actions back to its SCADA system. If, for example, there’s an outage due to a fallen limb, the switches can talk to each other, lock out the damaged section of wire, keep the lights on in most areas, and send a report back to the utility’s dispatch center. That minimizes the outage until a crew arrives to make the repair.
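The reconfiguration sequence can be illustrated schematically. The sketch below models a feeder as numbered line sections separated by automated switches, with an assumed normally-open tie switch past the last section that can backfeed from a neighboring circuit; the data model is hypothetical, not the utility’s actual scheme.

```python
# Hypothetical sketch of self-healing feeder reconfiguration: open the
# switches that bracket the faulted section, keep every other section
# energized (downstream sections via an assumed tie switch), and report
# back to dispatch. The model is illustrative, not the utility's scheme.

def isolate_fault(num_sections: int, faulted: int):
    """Feeder model: switch i sits immediately upstream of section i,
    and a normally-open tie switch past the last section can backfeed
    the downstream sections (an assumed arrangement).
    Returns (switches_to_open, sections_still_energized, report)."""
    switches_to_open = {faulted, faulted + 1}  # bracket the damaged span
    energized = [s for s in range(num_sections) if s != faulted]
    report = (f"section {faulted} locked out; "
              f"{len(energized)} of {num_sections} sections in service")
    return switches_to_open, energized, report
```

For a five-section feeder with a fault in section 2, the scheme opens switches 2 and 3, leaves four sections in service, and sends the lockout report to the dispatch center while a crew travels to the damage.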
“The UtiliNet radios are more than a radio. They are a computer that can transmit and receive,” says a consulting engineer with the utility. “We can write programs that run on the radio’s memory and turn switches and other utility equipment into intelligent devices. We’re leveraging technology to create real-time grid monitoring.”
The technology also is helping the utility meet the needs of its growing service territory. The company has developed a modular sub-transmission substation outfitted with the Cellnet transmitter.
The green pad-mounted cookie-cutter design is easier to site and install because it’s similar in style to the smaller pad-mounted transformers found in newer residential neighborhoods. The Cellnet technology is embedded in the substation and transmits real-time operating data back to the SCADA system.
“Our Midwestern cornfields are turning into housing developments and there’s a corresponding need for new substations to service those loads,” the engineer says. “The problem is, today’s property owners are not very receptive to substations. They tell us right up front, no fences, no barbed wire, and no utility poles.”
More importantly, the company is looking to leverage the communications network further, extending it to a variety of sub-transmission and distribution system components upstream and downstream of the substation.
“We’ve got this transmitter that’s the size of a stick of gum. We can write code and embed it in a component and turn it into an intelligent device,” he says. “So we’re meeting with vendors and examining all the devices on the network to determine how we can increase their ability to send and receive operating data and work in concert with other components.”
Advanced grid monitoring now also is taking center stage in Pennsylvania at PECO Energy. Pennsylvania’s legislature is considering a bill that would provide incentives to consumers and businesses to institute energy-saving measures such as replacement lighting and air conditioning upgrades. The bill also would require utilities like PECO to provide consumers and businesses with smart meters and time-of-use pricing plans.
Against that backdrop, PECO has instituted two pilot programs to help it determine the best way to use advanced metering data to improve overall distribution-system operations.
The first, which began in 2007, uses meter data gathered from Philadelphia’s Old City neighborhood to determine whether upstream devices like transformers, fuses and cables are adequately sized to handle the increasing loads. Using the Cellnet+Hunt AMR system, data from 4,500 meters is collected and monitored on an hourly basis to establish the loads being placed on individual system devices throughout the day.
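The underlying analysis amounts to rolling hourly meter reads up to the upstream device that serves them and comparing the peak aggregate against the device’s rating. The sketch below is a minimal illustration with assumed names, ratings and power factor; PECO’s actual data model is not described here.

```python
# Hypothetical sketch of the Old City-style sizing check: sum each
# hour's meter loads per serving transformer, flag transformers whose
# peak hour exceeds the rating. Names and thresholds are illustrative.

from collections import defaultdict

def flag_overloaded(hourly_kw, meter_to_xfmr, xfmr_rating_kva, pf=0.95):
    """hourly_kw: {meter_id: [24 hourly kW readings]}.
    meter_to_xfmr: {meter_id: transformer_id}.
    Returns {transformer_id: peak kW} for transformers whose peak
    aggregate load exceeds rating_kVA * assumed power factor."""
    xfmr_hourly = defaultdict(lambda: [0.0] * 24)
    for meter, readings in hourly_kw.items():
        xfmr = meter_to_xfmr[meter]
        for hour, kw in enumerate(readings):
            xfmr_hourly[xfmr][hour] += kw
    return {x: max(hours) for x, hours in xfmr_hourly.items()
            if max(hours) > xfmr_rating_kva[x] * pf}
```

The same roll-up works one level higher for fuses and cable segments, given a mapping from meters to those devices.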
The second pilot is being conducted in Jenkintown, Pa., a small suburb just north of Philadelphia. In that pilot, the UtiliNet devices are collecting data every half-hour from 15,000 existing one-way meters and two dozen new two-way meters, four reclosers and a unit substation.
The idea is to use the metering data to establish a clear picture of how one part of the utility’s distribution network is working and leverage the data to enhance overall load management.
“We believe AMI technology can support communications to our distribution automation devices like transformers, switches and re-closers,” explains Glenn Pritchard, principal engineer. “Further, we think the data will help us optimize our distribution network. We may, for example, be able to uncover an overloaded circuit and then reconfigure the circuits in that vicinity to reduce power losses and better handle peak loads.”
The long-term goal, he says, is to move away from the utility’s old approach of modeling system behavior with historical data, and instead optimize the network with real-time metering data delivered every half hour.
“The Old City pilot demonstrates what we can do with AMR data. The Jenkintown pilot takes load management one step further by bringing us data from other grid devices too,” Pritchard says. “It’s what I call the convergence of AMI with distribution automation, which is arguably one of the fundamentals of the smart grid. Right now we’re validating smart grid principles by making sure the devices communicate as expected and support our data acquisition requirements.”
While PECO’s experiences demonstrate ways to leverage remote access to metering data in a high-density area, the same benefits can be even more valuable in a more widely dispersed service territory—say, fewer than 100,000 customers spread over nearly 11,000 square miles.
That’s the case at Lake Country Power, which spent nearly five years deploying a full two-way advanced metering infrastructure—including 63,000 new residential and commercial meters—in northern Minnesota.
With an average of only six customers per mile along roughly 8,000 miles of power lines, the AMI system makes it easier to conduct a range of services—including collecting meter data; verifying outages and power restoration in real time; checking time-of-use readings for an important off-peak electrical heating program; and monitoring distribution-system performance back to each of 39 substations.
Lake Country, which buys its power from generation and transmission cooperative Great River Energy, offers discounted overnight rates to some 19,000 customers for off-peak electrical heating. These systems use thermal-mass storage systems that charge overnight and release heat throughout the day.
Most of the units are operated by a separate radio and meter combination that activates the heating unit at the appointed time. When a radio relay malfunctions—as roughly two percent do each year—it fails in the on position.
In the past, co-op technicians frequently went from house to house to find and repair faulty receivers, a huge task in a large territory with very few customers per mile. Now the co-op remotely can check the meter’s historical usage data and send a technician directly to homes with unusual load patterns.
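The screening logic is straightforward: an off-peak heating meter should show little load during on-peak hours, so sustained heating-scale draw at mid-day points to a receiver stuck in the on position. The sketch below illustrates the idea; the on-peak window and kW threshold are assumptions, not Lake Country’s actual criteria.

```python
# Hypothetical sketch of stuck-receiver screening from AMI history:
# flag off-peak heating meters whose average on-peak load looks like
# heating. The window and threshold are illustrative assumptions.

ON_PEAK_HOURS = range(8, 20)  # assumed on-peak window (8 a.m.-8 p.m.)
SUSPICIOUS_KW = 2.0           # heating-scale draw (illustrative)

def suspect_receivers(hourly_kw_by_meter):
    """hourly_kw_by_meter: {meter_id: [24 hourly kW readings]}.
    Return meters whose average on-peak load suggests the heating
    relay failed in the on position."""
    suspects = []
    for meter, readings in hourly_kw_by_meter.items():
        on_peak = [readings[h] for h in ON_PEAK_HOURS]
        if sum(on_peak) / len(on_peak) >= SUSPICIOUS_KW:
            suspects.append(meter)
    return suspects
```

Instead of house-to-house checks, a technician gets a short list of addresses with abnormal daytime load and goes straight to them.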
“Finding a faulty receiver really was like looking for a needle in a haystack,” says Mike Birkeland, Lake Country’s director of member service. “But it’s important to ensure those devices are working properly. We expect to save $500,000 per year because now we can query the meters to check the load by time of day. That allows us to identify problems and improve control over our load.”
The meters also provide a new way to monitor distribution-system performance from the home all the way back to the substation. By combining real-time meter data with the data provided by its SCADA system, the co-op can more accurately measure voltage and current levels at key junctures along a feeder path.
“For example, say the voltage is low at one home. The cause may be a new residential development nearby with a lot of electric heating,” says Line Foreman Jack Metso. “First we remotely check the meters at nearby residences to ensure it’s not an isolated incident. If it’s not, we might install a new voltage regulator to boost the voltage level in that particular area. We can make the change and quickly measure its effectiveness through the meters.”
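That troubleshooting step amounts to a simple spatial check: poll the nearby meters and decide whether the low reading is isolated to one service or affects the whole area, which would point toward a feeder-level fix such as a voltage regulator. In the sketch below, the low-voltage limit is the ANSI C84.1 Range A service band, used as an assumption about the co-op’s criteria.

```python
# Hypothetical sketch of the neighborhood voltage check: classify a
# low-voltage complaint as isolated, area-wide, or normal based on
# remote meter reads. The limit is the ANSI C84.1 Range A lower bound
# for 120-V service, assumed here as the screening criterion.

LOW_LIMIT_V = 114.0

def diagnose_low_voltage(neighborhood_volts):
    """neighborhood_volts: {meter_id: measured service voltage}.
    Returns 'isolated', 'area-wide', or 'normal'."""
    low = [m for m, v in neighborhood_volts.items() if v < LOW_LIMIT_V]
    if not low:
        return "normal"
    if len(low) > len(neighborhood_volts) // 2:
        return "area-wide"  # most meters low: a feeder-level problem
    return "isolated"       # likely a single-service issue
```

An "area-wide" result suggests installing or adjusting a voltage regulator; an "isolated" result sends a crew to one home instead.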
Though the system already is providing significant cost savings, Birkeland says he anticipates further benefits.
“Right now we’re still in the exploratory stage,” he says. “There are still technical hurdles, such as bandwidth. It takes about seven seconds per meter to obtain readings, which is what we expected. But we’d like to see fewer technical barriers when we try to pull the much larger historical data files for each meter. We’ve taken a big step forward, but we want to squeeze out every possible benefit.”
Most technology gurus will tell you the devices needed to develop a true smart grid—an intelligent, auto-balancing, self-monitoring grid that integrates a variety of energy sources with minimal human intervention—already are available.
Xcel Energy intends to prove it.
In March, the utility announced that Boulder, Colo., will serve as the nation’s first fully integrated “smart-grid city.” As such, Boulder residents and businesses soon will get an up-close look at technologies designed to prove the often discussed—but as yet unproven—environmental, financial and operational benefits of an automated smart grid.
If the Boulder project proves successful, Xcel expects to use it as a springboard to similar but much larger smart-grid deployments throughout its eight-state service territory.
“Our hypothesis is that we can change consumer behavior,” says Mike Carlson, chief information officer and vice president of business systems. “If the system is in an overloaded condition, can an automated-response system adjust consumption and avoid a blackout? Can we create real-time interaction between consumption and generation, cut Boulder’s spinning-reserve requirement, and reduce its future environmental, reliability and infrastructure costs? That’s what we intend to find out.”
The project includes a consortium of partners: Global consulting firm Accenture will oversee diagnostic software, intelligent distribution assets and outage-management systems; Current Group is supplying advanced sensing technology, two-way high-speed communications, and round-the-clock monitoring and enterprise analysis software; Schweitzer Engineering Laboratories is providing “smart substation” expertise; and Ventyx is providing work management and price and load forecast solutions.
The first phase, which Xcel expects will be completed in August, includes deploying 15,000 smart meters and placing data-gathering devices in two substations and on five feeders. Behind-the-meter technologies, including thermostats and plug-based devices that interact with network signaling, are being installed as well.
“There’s not a single solution for all the homes,” Carlson says. “We’re testing different vendor solutions with different capabilities across the city.”
Phase II will be completed in April 2009, with meters installed at another 10,000 homes and businesses, and communications devices installed on another three substations and 17 feeders. At that point, testing, evaluation, and fine-tuning will commence for a yet-to-be-determined period of time.
In June, the consortium will assemble its substation and distribution automation network, which includes connecting to each customer and integrating the first set of meters. By September, a web portal interface will be established to let customers view their consumption. That will be followed by providing automated device control and management capabilities.
“There are obviously a lot of parallel paths,” Carlson says. “The first job is to install and test the new meters. Then we have the time-of-use rate proceedings, which begin in August. At the same time we have to examine the grid algorithms, document the key data points and identify potential gaps in the distribution system to determine the best way to establish a real-time data collection system. By August we should have the basics in place.”
Xcel estimates the project will cost roughly $100 million, with much of that borne by a combination of government grants, partners, and vendors looking to test and prove their software and hardware products.
“We’ve run our real-time data architecture requirements by companies like Oracle, SAP, Teradata and OSIsoft, and they’re all interested in doing a trial,” Carlson says. “Everybody says they have technology that can be applied to this project. How much really exists and how much of it still needs to be developed? Right now we think 60 percent of the data architecture technology is already there, while the other 40 percent will probably need tweaking. Then we’ll determine what is or isn’t scalable.”
Xcel’s smart-grid-city project is intended to address a number of important smart-grid theories and questions. For example, the company believes monitoring a distribution system in real time to optimize power factor performance and system balancing could reduce distribution losses by as much as 30 percent. Therefore, it will retrofit existing substations with remote, near real-time data monitoring equipment to optimize performance.
“Say the grid is overstressed and we can see that a substation transformer is overheated to its breaking point. And because of that, we know we have a 15 percent power factor loss at that point in the system,” Carlson explains. “Our hypothesis is that if you have a fully integrated grid, you can make distribution alterations to relieve that stress point and bring the loss down to something like six to ten percent. We believe we can do that system-wide.”
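Carlson’s figures track the textbook relationship between power factor and resistive line loss: for a fixed real power delivered at fixed voltage, current scales as 1/pf, so I²R losses scale as 1/pf². A worked sketch, with illustrative numbers rather than Xcel’s data:

```python
# Illustrative arithmetic, not Xcel's model: how much of the original
# I^2*R line loss remains after power-factor correction, assuming the
# same real power is delivered at the same voltage.

def relative_loss(pf_before: float, pf_after: float) -> float:
    """Fraction of the original resistive loss remaining after
    correcting power factor from pf_before to pf_after.
    Current scales as 1/pf, so loss scales as (1/pf)^2."""
    return (pf_before / pf_after) ** 2
```

Correcting from 0.80 to 0.95, for instance, leaves about 71 percent of the original loss, consistent in rough magnitude with the company’s up-to-30-percent loss-reduction estimate.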
The goal is to create an intuitive electrical network, one that can incorporate and automate existing load-shedding processes to balance the system. Some of the necessary capabilities already are in place, such as remotely opening or closing certain switches, or interrupting service to large customers who have agreed to such measures in advance. But in this case, the processes would be automated and include other options, such as cutting back on residential central air conditioning via programmable thermostats.
In the future, the smart-grid city could include additional demand-response measures that go beyond operational benefits to address environmental or other societal values. For example, pool pumps or other non-essential residential appliances could be identified in advance and shut down temporarily to reduce peak loads.
“The point is, you have to prove all these concepts before you can present the idea of a truly intuitive smart grid system to regulators,” Carlson says. “As an industry we haven’t really demonstrated the benefit of combining all these technologies. Until we do, there will be skepticism. That’s the real value of this project.”