Reinventing the Grid

Deck: How to find a future that works.

Fortnightly Magazine - March 2014

The phrases "utility of the future" and "utility 2.0" aren't new,1 but they've entered the lexicon in a big way in recent months. In various public and private forums, utility leaders across the country have been trying to visualize the industry's future - and to clarify the utility's role in that mix.

Different people see different things when they gaze into the metaphorical crystal ball. Some see catastrophe, the so-called "death spiral." Others see a renaissance, a new age of innovation in energy services.

Most everyone sees big changes ahead.

In the death-spiral vision, renewable energy and distributed energy resources (DER) will continue to get cheaper - just as traditional central utility services become more expensive. As customers become their own generators, they drop their former share of the utility's fixed costs onto the backs of other ratepayers, driving more users to seek alternatives to utility power. At some point, the utility ceases to be a viable business - and the grid falls apart. Only the wealthy will be able to afford first-class electricity service, and the rest will be relegated to price spikes and rolling brownouts.

Some foresee less drama, and more subtle changes. In their view, things like renewables and DERs might just be a fad, perpetually unable to compete without subsidies and mandates. This scenario sees the pendulum swinging back toward least-cost planning as the industry's primary guiding principle.

But even this view includes some important changes coming as new technologies mature - most notably grid modernization, distributed intelligence, fast communications, and real-time analytics. Automation offers new ways of operating the system to deliver reliability and safety at the least cost. And even traditionalists foresee continued advances in technologies to deliver and manage things like demand response (DR) and microgrids.

Others see a future in which the utility realigns itself to serve a new set of mandates. Instead of building out a system to electrify the nation - a task that was mostly accomplished last century - the industry will prioritize making that system as flexible and adaptive as technology will allow. Reliability, safety, and reasonable cost will remain pillars of the regulatory compact, but the utility 2.0 mission will be to integrate all energy resources - including DERs, variable renewables, and more traditional assets - in a way that ensures optimal deployment, nondiscriminatory treatment, and maximum value for customers.

Where utility 1.0 was required to provide undifferentiated service for all ratepayers within a given class, utility 2.0 will provide a platform for innovation and customer choice.

A common theme unifies these three visions - namely, a more decentralized grid architecture, with more resources dedicated to serving localized customer needs.

"If you look into a real crystal ball, what you'll see is a view of exactly what's in front of you - except it's upside-down," says John Jimison, executive director of the Energy Future Coalition, a Washington, D.C., nonprofit affiliated with the UN Foundation. "That's a great analogy for the electric industry's future. The world will be very different in the year 2050, and the industry already has launched irrevocably onto the process of change."

To better understand these trends, Fortnightly interviewed 11 industry leaders to get their views on the future. They include:

• Doug Houseman, v.p. of technical innovation, EnerNex

• Mani Vadari, president, Modern Grid Solutions

• Rana Mukerji, s.v.p., market structures, New York ISO

• Bill Grant, deputy commissioner, Minnesota Department of Commerce

• Charles Bayless, former CEO, UniSource and Tucson Electric Power

• James Gallagher, executive director, New York State Smart Grid Consortium

• Ralph Masiello, s.v.p., energy advisory, DNV GL

• Mark Feasel, v.p., sales and marketing, Schneider Electric

• Michael Edmonds, v.p., U.S. business unit, S&C Electric

• John Jimison, executive director, Energy Future Coalition

• Ken Geisler, v.p. of strategy, Siemens Smart Grid

Their comments suggest that a radically different future lies ahead - and utilities see it taking shape today. The biggest question now is how to prepare for it in a way that minimizes the risk and maximizes the rewards that new technologies will bring.

Killing the Central Model

FORTNIGHTLY What does the utility system of 2050 look like? What are the implications for utilities and their role?

Mani Vadari, Modern Grid Solutions: In the next five to 10 years, both storage and distributed generation (DG) will be widely available technologies, and they'll be competitively priced. That will dramatically change the future outlook for utilities.

Today the guy putting PV on his roof is the energy geek who wants to try new things. When prices come down for things like solar shingles and windows capable of generating electricity, solar generation will be more prevalent across the neighborhood, and that will be the first real threat to the centralized utility system that we have today. As prices come down, we'll see an explosion in microgrids and minigrids.

The more DER deployment we see, the more we'll need someone to make the system work in an integrated fashion. The utility will be the best equipped to manage, operate, and maintain microgrids and DERs. Doing so will allow utilities to justify a smaller investment in T&D and generation assets.

John Jimison, Energy Future Coalition: Before long, manufacturers will stop building dumb appliances. Every dishwasher or refrigerator will have a chip that can be programmed to minimize cost and optimize efficiency, and customers will want to use that. Not only that, the chip will make the appliance available for grid control, and customers might expect a reward for that.
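
To make the idea concrete, here is a minimal sketch - in Python, with invented hourly prices - of the kind of cost-minimizing logic such a chip might run: shift a deferrable load into the cheapest window of the day. It illustrates the concept only, not any manufacturer's implementation.

```python
# Minimal sketch of price-responsive appliance scheduling: pick the cheapest
# contiguous window for a deferrable load (e.g., a dishwasher run).
# Hourly prices below are hypothetical.

hourly_prices = [0.22, 0.20, 0.09, 0.08, 0.08, 0.10, 0.14, 0.25,  # midnight-8 a.m.
                 0.30, 0.28, 0.26, 0.24, 0.27, 0.29, 0.31, 0.35,  # 8 a.m.-4 p.m.
                 0.38, 0.40, 0.36, 0.30, 0.26, 0.24, 0.23, 0.22]  # 4 p.m.-midnight

def cheapest_window(prices, run_hours):
    """Return the start hour that minimizes energy cost for a fixed-length run."""
    costs = [sum(prices[h:h + run_hours]) for h in range(len(prices) - run_hours + 1)]
    return min(range(len(costs)), key=costs.__getitem__)

start = cheapest_window(hourly_prices, run_hours=2)
print(f"Run the dishwasher starting at hour {start}")  # lands in the overnight low-price window
```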

Doug Houseman, EnerNex: When I was involved in IEEE's Grid Vision 2050 Roadmap project, we had to change the way we think about the grid. Instead of thinking in terms of electricity generation, transmission, distribution, and consumption, we started thinking in terms of making, moving, and using electricity. Make, move, use: In the future, I might make electricity on the roof, move it down the wall, and use it in the kitchen. If I can't use it all, then I'll move it to my neighborhood, then up into the grid, down through a different transformer, and to a consumer somewhere down the road.

In that kind of system, electricity would move both ways in concentric and linked circles throughout the system. Thinking about it that way, you begin to see how complex the grid will be.

Charles Bayless: The industry will disaggregate. Innovators today have no respect for the concept of a service territory, natural monopoly, or stranded assets. They just want to make a buck. Over time thousands of people will be competing to provide services. The natural monopoly model is dead.

But somebody has to be responsible for keeping the system up. Now it's done by edict; regulation requires power companies to do it. However, no customer is buying grid stability. Some customers buy their own backup power, but there's no market for grid stability. In 2050 when you have 2,000 separate providers on the system, to whom does the edict go? We can't just have disjointed groups saying "it's not our problem."

I also worry about the future as we get cold waves like we did this year. We came very close to having rolling brownouts just when customers need power the most. Gas-fired power units aren't viable capacity, and it's not their fault; because of the gas market structure, no rational generator can lock in firm gas. Pipeline capacity already is insufficient, and when push comes to shove the compression will be too low to keep gas-fired plants running.

You can't build the system for a one-day-in-10-years reliability standard if the fuel supply system isn't built to meet that standard. In order to be counted as firm capacity, gas-fired generation will need to have two weeks of fuel supply in storage, or some other way to guarantee availability.

Ensuring grid stability and reliability costs money. We can introduce a grid stability charge to pay for it, and someone has to be accountable. By 2050 the ISOs could be doing it.

Rana Mukerji, New York ISO: If we reached the perfect end state in 2050, computing and communications technology would provide the visibility and control to dispatch any resource, whether it's supply or demand, based on economic signals. That would be the ultimate: a refrigerator communicating with signals in the same plane as a power plant. Every piece of demand and supply would be elastic, based on price vs. quantity signals.

New York Public Service Commission Chair Audrey Zibelman has categorized what New York is doing as the second restructuring of the utility industry. Just as we had restructuring in the wholesale market, she envisions restructuring at the distribution level. There would be an entity, the [independent] distribution system operator (DSO), which would coordinate the distribution players.

The DSO would have a purely regulated, goal-driven paradigm. It would offer a nondiscriminatory, open access structure for both supply and demand resources to compete in the distribution space. The DSO's other role would be to aggregate distribution resources and feed them into the ISO market.

That might happen in several forms; the market could allow load to bid or offer at a certain price and quantity. It could be a supply or DR offer. The DSO would manage the load shape, hour by hour and minute by minute, and provide a framework for different players to compete, aggregate, and feed resources into the wholesale market.
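
As a rough illustration of the price-vs.-quantity mechanics Mukerji describes, the sketch below clears a set of hypothetical supply offers against elastic demand bids by merit order. The resources, prices, and quantities are invented, and actual ISO or DSO market software is far more involved; this simply shows how supply and demand could both clear on economic signals.

```python
# Minimal market-clearing sketch (not NYISO or DSO software): elastic supply
# offers and demand bids cleared by merit order. All names and numbers are
# hypothetical.

supply_offers = [  # (price $/MWh, quantity MW)
    (20.0, 50.0),  # e.g., a low-cost DG block
    (35.0, 40.0),  # aggregated rooftop PV plus storage
    (60.0, 60.0),  # peaking unit
]
demand_bids = [    # (price $/MWh the load will pay, quantity MW)
    (90.0, 70.0),  # inelastic core load
    (35.0, 30.0),  # flexible load (smart appliances, EV charging)
    (35.0, 25.0),  # more flexible load
]

def clear_market(supply, demand):
    """Walk the supply and demand curves until they cross."""
    supply = sorted(supply)                # cheapest offers first
    demand = sorted(demand, reverse=True)  # highest-value bids first
    cleared, price = 0.0, None
    s_i = d_i = 0
    s_left, d_left = supply[0][1], demand[0][1]
    while s_i < len(supply) and d_i < len(demand):
        s_price, d_price = supply[s_i][0], demand[d_i][0]
        if d_price < s_price:              # bid no longer covers the offer
            break
        traded = min(s_left, d_left)
        cleared += traded
        price = s_price                    # clearing price set by marginal offer
        s_left -= traded
        d_left -= traded
        if s_left == 0:
            s_i += 1
            s_left = supply[s_i][1] if s_i < len(supply) else 0.0
        if d_left == 0:
            d_i += 1
            d_left = demand[d_i][1] if d_i < len(demand) else 0.0
    return cleared, price

mw, lmp = clear_market(supply_offers, demand_bids)
print(f"Cleared {mw:.0f} MW at ${lmp:.2f}/MWh")  # 90 MW at $35.00/MWh with these numbers
```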

James Gallagher, New York State Smart Grid Consortium: The idea of a DSO is a very viable and real direction that we need to explore. It will be important to coordinate control between distribution utilities, DSOs, and ISOs to make sure we aren't duplicating responsibilities. Dispatch for DR, for example, could in many cases be handled by any of these entities. The important part is thinking through the responsibilities of the DSO vs. the ISO.

As we reinvent the distribution utility, we'll be moving toward active network management on a local level, much like the larger transmission system, but taking the concept down to distribution. That will call for much greater sophistication by utilities and more sophistication within microgrids. The utility will be the best suited to have a macro view of the system and ensure that independent microgrids and DERs are coordinated.

Flexi-Grid Emerges

FORTNIGHTLY How confident should we be that technology is capable of implementing the utility 2.0 vision? What new products or design approaches will be needed to ensure we maintain the level of service quality and reliability customers expect?

Mark Feasel, Schneider Electric: Looking back to the progress we've made on very complex electric systems over the past 20 years, I see very few technological barriers to achieving [a sophisticated and fully integrated utility 2.0 system].

Empowered and situationally aware stakeholders will be prevalent - and critical - in the utility of the future. In the past, only the most energy-intensive consumers had any situational awareness. With miniaturization of components, wireless communications, and standardization, we're reaching the point now where even relatively unsophisticated energy stakeholders can understand what variables are correlated to their energy consumption. They can better quantify the balanced outcome they're looking for, and increasingly they're augmenting their own facilities or procuring services from independent providers. Technology absolutely is empowering them to do that.

I see big data and analytics as one of the most important aspects of technology development over the next decade or so. Big data analysis capabilities are emerging today that provide insight into flexibility needs in a pretty concise and verifiable way.

We already are designing microgrid controllers and automatic dispatch systems to coordinate supply and demand seamlessly in the grid.

It's certainly not simple, but I think the physical layer is the least of our worries. The really difficult questions involve the business, regulatory, and financial layers. We need to make progress in quantifying the benefits of distributed, autonomous solutions, including their scalability and how they can be monetized. That will allow more people to change their minds about investing in systems that allow [utility 2.0] to happen.

Ken Geisler, Siemens: A lot of the technology is there now. Good products are available today. But the other part of technology is network design. The distribution grid was designed as a radial system, and now we're getting uncontrolled sources popping up and creating problems for that central design. There are two ways to solve those problems. One is with quick fixes - some form of local upgrade, or controllable storage or DG. But those are Band-Aids to accommodate uncontrolled DG. The other approach is to redo the system design. That might include dropping in some controllable storage or DG, but it will happen as part of an overall design solution.

DERs are coming on faster than utilities thought they would. Some utilities are doing a very good job, working with things like automated controls and adaptive relay schemes to account for changes on feeders and different levels of load and generation, and working with customers on things like battery storage and microgrids. We did a project with Hawaiian Electric Co. in East Oahu to improve reliability and outage recovery by automating a 46-kV section of the Oahu network. The engineers at first were leery about it, but once it was in, they saw that it worked flawlessly, and HECO decided to install automation throughout its 12-kV network as well.

Utility engineers have worked with radial designs for years. It will take time for them to become comfortable in a space that isn't as well known.

Michael Edmonds, S&C Electric: As power systems engineers and control engineers, we were trained to design systems and leave them alone, maybe checking protection equipment every five or 10 years. That whole paradigm will change. If something happens on the grid, devices will automatically isolate the fault. We'll have multiple ways of serving power in different areas. Technologies are changing how we design the power distribution grid and how it operates.

Control is moving from the centralized control room into the grid. The central control room won't disappear; it will continue to serve an overall command structure, as opposed to a control structure. It's like in the U.S. Army; if a soldier had to ask the Pentagon every time he needed to fire a gun, the army would never get anything done. It's the same from a grid standpoint.

Big data is important, but I don't buy into the data-warehousing model, with everything going back to central decision-making. Data will be used in algorithms that improve how things work. The system will operate in a way that's almost like crowdsourcing. Devices on the grid will talk to each other. Consumers will be participants in the grid.

For that you need a system with low latency and fast response. Storage, voltage control, and grid stability will be a big play. The distribution grid of the future will be built around controlling energy flows and managing transactions. The role of huge generating plants will become more specialized and focused. As storage gets cheaper, peaking power plants will be challenged.

Houseman: Under the current regulatory regime, renewables are must-run assets. But nowhere in North America do renewables make enough power to support seasonal peak load, even if we scale everything to net-zero energy.

Anyone who thinks DR will solve the problem doesn't realize that the biggest challenges aren't during the day; they're seasonal. It's not just about shifting your washing from noon to 2 a.m.; it's about shifting from July to October. Policymakers haven't done enough thinking about the mismatch on the physical side. Do we build three times as many wind turbines and shut down two-thirds of them for 10 months? Or build massive pumped storage facilities? Or produce ammonia and use it in fuel cells?
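
Houseman's overbuild question can be framed with back-of-envelope numbers. The figures in the sketch below are purely illustrative assumptions, not data from the interview, but they show how a seasonal mismatch between demand and renewable output translates into an overbuild factor.

```python
# Back-of-envelope sketch of the seasonal mismatch described above.
# All figures are hypothetical, for illustration only.

annual_demand_twh = 100.0             # system energy over a year
peak_month_demand_share = 0.14        # the peak month carries 14% of annual demand
wind_peak_month_output_share = 0.05   # wind delivers only 5% of its annual energy
                                      # in that month (seasonal lull)

# Suppose the wind fleet is sized to match annual energy exactly:
wind_fleet_annual_twh = annual_demand_twh

demand_peak_month = annual_demand_twh * peak_month_demand_share          # 14 TWh
wind_peak_month = wind_fleet_annual_twh * wind_peak_month_output_share   # 5 TWh

# Overbuild needed to cover the peak month on wind energy alone:
overbuild = demand_peak_month / wind_peak_month
print(f"Overbuild factor: {overbuild:.1f}x")  # ~2.8x with these assumptions,
                                              # close to "three times as many turbines"
```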

We need a silver bullet in storage, and we don't have it. Storage hasn't gotten much better in the last 40 years. It's somewhat cheaper but hasn't improved much in terms of energy density. Nanotechnology researchers talk about orders of magnitude of improvement, but when they try to scale it up, it doesn't work.

Ralph Masiello, DNV GL: Storage will change the electricity industry tremendously. Will it happen in three years? No. Deployment will be slower than some people think today. But on the other hand, people tend to underestimate the potential in the long run. Storage will change the game in 20 or 30 years. There are so many drivers to make storage technology cheaper, more reliable, and more effective, like electric vehicles and things like Google Glass and smart watches. Companies are investing billions of dollars in storage research because they're impatient with the pace of development.

Also, we need to realize that reliability levels actually have gone down. Reliability to the end user has been degrading in many geographic areas. SAIDI and SAIFI stats typically exclude events like Superstorm Sandy. We've had four major hurricanes in the last three years, taking out millions of customers each time. Just last week, 600,000 customers were without power in Philadelphia after a major snowstorm.

From the customer's point of view, reliability is getting worse, and at the same time electricity has become more essential to our lives. Not having Internet connectivity for several days will drive people nuts. Electricity is more vital than ever before, so reliability should be improving - but it isn't.
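
For reference, the SAIDI and SAIFI statistics Masiello mentions reduce to simple ratios over outage records. The sketch below uses hypothetical outage data to show how excluding a major-event day changes the reported picture, which is his point about the statistics.

```python
# Minimal sketch of the reliability indices mentioned above (IEEE 1366 style).
# Outage records are hypothetical; "major_event" flags storms that standard
# reporting often excludes.

outages = [
    # (customers interrupted, minutes out, major_event?)
    (1_200, 90, False),
    (800, 45, False),
    (600_000, 2_880, True),   # e.g., a multi-day storm
]
customers_served = 1_000_000

def saidi_saifi(records, served, include_major_events=True):
    recs = [r for r in records if include_major_events or not r[2]]
    saidi = sum(c * m for c, m, _ in recs) / served  # outage minutes per customer
    saifi = sum(c for c, _, _ in recs) / served      # interruptions per customer
    return saidi, saifi

print(saidi_saifi(outages, customers_served))                           # with the storm
print(saidi_saifi(outages, customers_served, include_major_events=False))  # storm excluded
```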

Breaking Eggs

FORTNIGHTLY Utilities understandably oppose changes that threaten the traditional business model. Assuming utility 2.0 is a realistic vision, what will convince utilities to support it?

Geisler: Some utilities have embraced it and are looking at how they can address it outside the regulated model. Others are pushing the drive toward new business models, working hard with regulators to make adjustments that account for the effects [of decentralization], plan for the costs, and consider different avenues. It's great to see that happening.

Masiello: We need to develop an operating system that will allow the transition to happen. The OS consists of the rules for how things work on the transmission and distribution system - tariff structures and control systems. We're not moving at a faster pace today because we don't have that OS. It changes the role of utilities and how they invest.

Bill Grant, Minnesota Department of Commerce: We need to start thinking about the way that utilities are compensated for the investments they're making in transmission and distribution. The models in place basically say you make a capital investment and sell electricity as a commodity, and we'll compensate you on the basis of how well you market that commodity. That's in obvious conflict with the fact that we're disaggregating generation sources, who invests in them, and how utilities are compensated for operating the grid.

It's a problem that needs to be fixed at the same time that we're easing the path to greater DG penetration. And it's one of the reasons that we've been looking for alternatives to the traditional net metering regime that's been in place for some 40 years. It's not a problem if 0.1 percent of customers are net metering, but it's a significant problem if 5 or 10 percent of customers are net metering.

Similar issues first arose with energy efficiency policies. Utilities were required to invest in incentives that would help un-sell their product. We went through numerous approaches and settled on a system of financial incentives and, in some cases, decoupling. But those are like Band-Aids that don't solve the bigger problem.

Minnesota has now innovated with the value-of-solar approach - an effort to figure out the value of solar energy to the utility, taking into account a range of factors. For utilities that adopt the value-of-solar tariff as an alternative to net metering, the customer would continue to pay for grid services at the same time they're being compensated for the electricity they're providing.
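
A stylized monthly bill comparison shows the distinction Grant draws between net metering and a value-of-solar tariff. The rates and volumes below are invented, and Minnesota's actual value-of-solar methodology weighs many more factors than this sketch includes.

```python
# Stylized bill comparison: net metering vs. a value-of-solar tariff.
# Rates and volumes are hypothetical.

consumption_kwh = 900    # what the household consumes in a month
solar_output_kwh = 400   # what the rooftop array produces
retail_rate = 0.12       # $/kWh, bundling energy plus grid (fixed) costs
value_of_solar = 0.10    # $/kWh credit set by the tariff calculation

# Net metering: solar output offsets consumption at the full retail rate,
# so the customer also avoids the grid-cost portion of those kWh.
net_metering_bill = (consumption_kwh - solar_output_kwh) * retail_rate

# Value of solar: the customer pays retail on all consumption (keeping the
# grid-cost contribution intact) and receives a separate credit for output.
value_of_solar_bill = consumption_kwh * retail_rate - solar_output_kwh * value_of_solar

print(f"Net metering bill:   ${net_metering_bill:.2f}")    # $60.00
print(f"Value-of-solar bill: ${value_of_solar_bill:.2f}")  # $68.00
```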

Jimison: There's unquestionably a need for a change in the utility business model, away from volumetric rates. The traditional adversarial regulatory context has to change if we're going to keep the utility industry healthy and able to provide critical functions in an era of declining load and increasing options.

There's a cost to be paid for the services utilities will provide in the future.

We will need markets for new and as-yet-unpriced performance metrics, such as flexibility and quick response. Fast-acting plants should get paid more because there's a value for what they provide when they ramp quickly. They should be compensated for that capability.

One factor that hasn't received a lot of attention is that utilities provide a one-size-fits-all service. Aside from differences between classes, it's a flat service offering, which arises from the regulatory notion that utilities are a natural monopoly that must provide nondiscriminatory service and pricing. But like telephone customers or TV customers, electric customers will start differentiating the service they want. Some will want a flat monthly bill, others will want to keep the service they've had, and some will want to integrate a PV array, an electric vehicle, smart appliances, and the ability to provide DR through a third-party aggregator. Those are different sets of services that warrant different rates and arrangements between utility and customer.

There will be fussing and feuding over how you provide differentiated service for power quality and reliability. But it will happen because we'll move beyond volumetric rates, and when you do that the only criteria that make sense are performance-based rates, with incentives or disincentives for performance. Regulators will be confronted with decisions about how to accommodate customers who have the ability to pay for better reliability, smarter grid integration, DR, and other things. We will get into questions of social equity. It will be complicated and there will be push-back. But as a society we may be better off allowing utilities at least the option of offering gold-plated service for customers who need it and can pay for it, rather than pushing those customers toward options other than the utility.

The question will be whether we can protect the basic rights of all customers in the process, or whether they'll suffer as a function of allowing utilities to provide better service to customers who can pay for it.

Houseman: If we keep running in the direction of microgrids the same way we're doing renewables, then we'll get to a future of haves and have-nots; 20 percent of the customers will have their own generation and ultimately will disconnect from the grid, and 80 percent won't be able to disconnect. When the 20 percent disconnect, the reliability will be horrible. With variable renewables, we won't have enough power.

Just wait until there's a blackout in the inner city in August, and it's 100 degrees with 100 percent humidity. Talk about social inequity; the direction we seem to be driving toward is for greater social inequity than we've had since before the Civil War.

Bayless: It's easy to get utilities to do something if you have the right regulations. Utilities won't budge until regulators say they can recover the costs.

The problem is that utility regulatory policy moves in a series of lurches from one direction to another. A state commission might favor renewable energy, but it might be replaced by one that says "No way." Inconsistent policies over time are less than optimal, to put it mildly.

Until supportive policies are in place, I think a lot of utilities will try to slow down the transition. Look at the early days of the independent power industry and all the roadblocks that utilities and states put up. There was a law in Indiana that said only a utility could own a power line that crossed a railroad track. An IPP couldn't serve a customer next door because of a railroad track.

However, customers eventually get what they want. If utilities won't provide the systems that customers need, then customers will bypass them and go directly to providers who will.

Keep in mind, this is one of the few remaining end-to-end integrated industries. We'll continue to disaggregate. In the end, utilities will be pure discos - distribution-only companies. They might not even sell electricity. They'll run distribution, and that's a great business to be in.

Vadari: I don't believe there are 100-percent winners and losers, but there will be those who like the status quo and want to maintain it, vs. everyone else.

The day is coming when DG and storage will stand alone without subsidies. It might be five, 10, or 15 years, but on the 30-year horizon I can almost guarantee that both storage and DG will be competitive. That means the regulated utility rate base is going to shrink, and utility companies will get smaller unless they start changing now and begin offering these services.

If a customer wants to build a microgrid, then why shouldn't the utility company do it? It should. But if a utility develops a business to manage DERs and microgrids, it would need to become an unregulated service. They'd be competing with local Mom and Pop electricians and other firms that will want to offer those services. I don't think utilities can do it in the regulated rate base.

Polishing the Crystal Ball

FORTNIGHTLY Utility 2.0 planning is happening in various forums. What approach will ensure these planning efforts produce constructive results?

Vadari: Utilities and policymakers should be spending more time on the planning horizon - five to 30 years out. Assume the horizon will bring new cost-effective technologies, and look at alternative future scenarios. Assign a probability to each scenario, calculate a probability-weighted 30-year end state, and plan for it.

As an engineer, that's an approach I can defend. If something isn't scientifically or mathematically defensible, then you're just making stuff up.
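
Vadari's weighted-scenario approach amounts to an expected-value calculation. The scenarios, metrics, and probabilities in the sketch below are hypothetical; they simply show the mechanics he describes.

```python
# Sketch of probability-weighted scenario planning as described above.
# Scenario names, metrics, and probabilities are hypothetical.

scenarios = {
    # name: (probability, DG share of energy in 2050, peak load growth %/yr)
    "status quo":             (0.2, 0.05, 1.0),
    "steady DER growth":      (0.5, 0.25, 0.5),
    "rapid decentralization": (0.3, 0.50, 0.0),
}

# Probabilities should sum to one.
assert abs(sum(p for p, _, _ in scenarios.values()) - 1.0) < 1e-9

# Probability-weighted "likely end state" to plan against:
expected_dg_share = sum(p * dg for p, dg, _ in scenarios.values())
expected_load_growth = sum(p * g for p, _, g in scenarios.values())

print(f"Expected 2050 DG share:    {expected_dg_share:.0%}")
print(f"Expected peak load growth: {expected_load_growth:.2f} %/yr")
```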

Grant: The planning issues are extremely important.

We've been doing integrated resource planning (IRP) in Minnesota for a long time, focused almost exclusively on generation and transmission decisions. We were almost completely ignoring distribution-level planning and the role it plays in the outcome of an advanced utility system. We need to move toward a planning environment where we're not looking in isolation at the distribution grid, but at how it interacts with transmission and generation - both central plants and distribution nodes. That's a significant component if we're going to get the best value from distribution resources generally.

Distribution-level planning is a new frontier on the utility horizon.

Houseman: We have to get past the idea that anyone who wants to can put in DG wherever they want. It's like taking a bucket of marbles and throwing them onto the map and then requiring the utility to connect the dots. We're being very inefficient about planning and therefore we're wasting more power than we should be. And if we continue to lean on the grid in this way, it will become underfunded, shabby, and less reliable. It will take longer to fix problems.

I'm a fan of DG and renewables, but we've gone about this in the worst possible way, from a regulatory standpoint. It's literally the Wild West; do what you want to do, claim your stake, you don't have to pay for it.

Because of asset life cycles, if we don't start today to make a generational transition in the grid, it will be 2050 before we get it right.

Feasel: We see some amazing thought leadership right now, and some great ideas. It's happening in bits and pieces - and consumers are pushing it.

If you view DERs as a problem, then you're not going to find an efficient solution. The challenge is for regulators to unshackle utilities so they can serve customers the way they need to be served.

The people who best understand specific requirements with respect to the electric system are the customers - stakeholders with facilities. They understand the cost of an outage or a power quality anomaly. They understand how they value or don't value renewables, and the value of electric service for the third shift vs. the first shift. In an efficient grid, they and their assets will become fully empowered, collaborating with the grid to achieve the outcomes they need.

Gallagher: We're at a very early stage, where utilities are open to considering what needs to be done. At the same time, they're looking for a dialogue about the kinds of changes that might be appropriate in the way utilities are compensated and the way utility incentives are structured, to make sure there are symmetrical incentives.

This dialogue is beginning. Parties are debating, proceedings have been initiated, and over the next couple of years we will gain a much clearer vision of where we're going with distribution utilities, at least in the near term.

Utilities won't embrace models that haven't been fully tested. This means much of the innovation must come from outside of utilities and the halls of regulation. Some of the most creative ideas are coming from communities, vendors, and other stakeholders. It's critical that we have open processes, with open dialogue that includes outside stakeholders, so that where competitive markets can provide services, they have an opportunity to do that.

Endnote:

1. Fortnightly might be credited with coining the phrase "utility 2.0" when we used it on the cover of our August 2008 issue. See "Utility 2.0: Web technologies are transforming the utility-customer relationship," August 2008.