Amory Lovins on negawatts, renewables, and neoclassical markets.
The name “Amory Lovins” means different things to different people in the energy industry.
To some executives, the name means “tree hugger.” For about three decades, the co-founder of the Rocky Mountain Institute (RMI) has acted as a champion for green energy development and a critic of Big Oil—and big utilities. Since the energy crisis of 1973, he’s advocated what he calls the “soft” energy path—less reliance on Big Energy with its centralized power plants, landscape-spanning transmission lines, and hard-won fossil fuels, and more reliance on smaller, locally sourced renewables, conservation, and efficiencies driven by competitive market forces. And in the 1970s and ’80s, he argued against nuclear energy, largely on the basis that its development would spur weapons proliferation.
To many outside the industry, however, the name Lovins means “visionary.” In 2009, for example, Time magazine named him one of the world’s most influential people, saying that he “had the solution to the energy problem in 1976,” and that his ideas about conservation and renewable energy “have become accepted wisdom.”
Granted, those words in Time were written by Carl Pope, executive director of the Sierra Club. But as a general matter, many of the facts seem to bear him out. And as it happens, Public Utilities Fortnightly’s archive includes a record of those facts.
In March 1985—28 years ago this month—this magazine published Lovins’s article titled “Saving Gigabucks with Negawatts.”1 That article represents the earliest known publication of the term “negawatt”—a word that’s entered the industry lexicon.2 More importantly, it described a future in which both supply- and demand-side resources would be bid into competitive electricity markets; buildings would produce as much energy as they consume—or more; and advancing technologies and changing economics would weaken and ultimately break the longstanding correlation between economic growth and energy consumption.
This month we’re re-publishing Lovins’s 1985 article at Fortnightly.com. The article makes remarkable reading today, in part because it was so prescient; much of what Lovins predicted has indeed come to pass. However, what’s even more remarkable is that the article actually still applies today, almost three decades later. With some editing—changing some details—it still presents a solid analysis of industry trends that are playing out right now.
Lovins spoke with Fortnightly in February about those trends. That conversation, in edited form, follows here. And in an accompanying article, Lovins offers his retrospective view on “Saving Gigabucks with Negawatts,” in the context of RMI’s current focus.
Fortnightly: In your 1985 article, you proposed a “neoclassical competitive marketplace for energy services,” versus what was then an “imperfect fuel bazaar, satisfying no condition of the ideal free market.” Since then the utility industry has developed competitive electricity markets in fits and starts, and only in some locations, with varying degrees of success. It would seem the imperfect bazaar still prevails. Why are we still here?
Lovins: Where we are is messy but improving. Competitive markets cover about 60 percent of the nation’s electricity demand—almost everything but the Southeast and much of the intermountain West. Nearly all the rest is covered by RTOs or equivalents. In the Northeast, PJM, and MISO—30-odd states—negawatts can now be bid into what formerly were only supply-side auctions. And in 15 states, with six more pending, utilities are no longer rewarded for selling you more electricity.
We’ve made progress, and the results in states that have adopted decoupling and shared savings speak for themselves. Those utilities create better value at lower risk, and they can provide better service, often at lower cost, because they’re rewarded—not penalized—for doing the cheapest things first.
Arguably, the building codes recently entering into force in half the states would alone suffice to flatten national electricity demand growth to about zero. And indeed electricity use seems to be drifting down, not up, even during an economic recovery, because electric intensity (kWh used per dollar of real GDP) is dropping steadily—not evenly across the country, but in total. This pressure should gradually cause the electric industry, like the gas industry earlier, to push harder for decoupling. And decoupling, combined with shared savings, should then produce such dramatic financial, cultural, and behavioral change in the industry that modern, least-cost, efficiency-centric efforts would become the new center of gravity.
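Electric intensity, as parenthetically defined above, is a simple ratio, and its behavior explains how demand can fall while the economy grows. A minimal sketch—using made-up illustrative figures, not actual EIA or BEA data—makes the point:

```python
# Illustrative only: the national figures below are hypothetical,
# not actual EIA consumption or BEA GDP statistics.
def electric_intensity(kwh_used: float, real_gdp_dollars: float) -> float:
    """Electric intensity as defined in the text: kWh per dollar of real GDP."""
    return kwh_used / real_gdp_dollars

# If GDP grows while total kWh stays flat, intensity falls
# even as the economy expands.
intensity_year1 = electric_intensity(3.7e12, 15.0e12)   # flat consumption
intensity_year2 = electric_intensity(3.7e12, 15.6e12)   # larger economy
assert intensity_year2 < intensity_year1
```

When intensity falls faster than GDP grows, total electricity use declines outright—the drift Lovins describes.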
John C. Fox, my predecessor as chairman of RMI, led PG&E’s demand-side efforts when they were the biggest and best in the world. In 1992, PG&E invested $170-odd million to help customers save electricity cheaper than it could be produced even by existing plants. This yielded nearly $400 million in present-value benefits, of which the CPUC allocated 89 percent to customers as lower bills and 11 percent to the utility. The initial investment was amortized to all customers over many years, just as it would’ve been for a new power station, but since efficiency was cheaper, it was bought first. The 11 percent reward went straight to the utility’s bottom line. It totaled more than $40 million that year—PG&E’s second-biggest source of profit after the Diablo Canyon deal.
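The shared-savings arithmetic can be checked directly. This quick sketch uses the article’s rounded figures (the precise CPUC numbers may differ):

```python
# Rough check of the PG&E shared-savings split described above,
# using the article's rounded figures.
pv_benefits = 400e6          # ~$400 million in present-value benefits
customer_share = 0.89        # allocated to customers as lower bills
utility_share = 0.11         # allocated to the utility as a reward

customer_benefit = pv_benefits * customer_share
utility_reward = pv_benefits * utility_share    # goes to the bottom line

print(f"Customers: ${customer_benefit/1e6:.0f}M, Utility: ${utility_reward/1e6:.0f}M")
# An ~$44M reward is consistent with the article's "more than $40 million."
```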
Fox reported that if you add $40 million to the bottom line at no risk or cost to the company, your CEO will call you every week to ask if there’s anything you need, and all the smartest people in the company will want to come and work in your division to advance their careers. Aligning customer interests with company interests profoundly affects culture and behavior.
That story has a sharper point today. The industry faces big catch-up investments on old infrastructure, along with reliability and security issues, environmental cleanup, and other investments that are more necessary than productive. But upward rate pressure plus stagnant or falling sales could shrink the domain of financial stability. Some financial analysts [draw] a disturbing analogy to how quickly the business model of newspapers went upside-down, and to the large sensitivities of many electric utilities’ EBITDA and net earnings to the decline in revenue observed in the Great Recession a few years back. From this perspective, buying small, fast, granular resources—rather than big, slow, lumpy ones—could be an important tool for financial risk management.
Fortnightly: By “small, fast, granular resources,” are you talking about distributed generation and demand response?
Lovins: Yes, and energy efficiency, though modern renewables like windpower and photovoltaics (PV) get more press. In 2011 alone, non-hydro renewables got $225 billion of private investment worldwide and added 84 GW of capacity. Since 2008, renewables including big hydro have supplied half of the world’s new capacity, the majority lately in developing countries, and now make a fifth of the world’s electricity.
But hidden in those numbers is a game changer like we haven’t seen since Edison’s day. Recently I visited a Chinese factory that was producing two and a half gigawatts of solar cells every year. The means of making electricity has already shifted to a scalable, mass-produced, manufactured product with a steep learning curve. Creating this industry’s basic asset is now much more like the way we make microchips, cell phones, and computers than it is like the specialized, massive, cathedral-like edifices with decade-long lead-times that have dominated our investments and balance sheets, yet now lack a business case.
Thus core technologies have been transformed in type, scale, speed, and accessibility to many market actors. Close behind those technologies is a swarm of new business models, revenue models, and regulatory models. The electricity sector—the most capital-intensive, complex, and vital part of the economy—is facing more numerous, diverse, and profound disruptions than any other sector, as 21st-century speed collides with 20th- and even 19th-century rules, institutions, and cultures. That’s why my colleagues at RMI have launched the Electricity Innovation Lab (e-Lab)—a multi-year, multi-stakeholder effort in rapid mutual learning to figure out together the contours and many of the details of the next electricity industry.
Fortnightly: History shows that change happens fastest in this industry when it serves utilities’ interests, rather than threatens them. How can utilities be convinced they should support this change, rather than fight it?
Lovins: First they need to understand its speed and scale. For example, a utility might consider PV its least-plausible competitor among modern renewables, because it’s the costliest major renewable technology. Yet in April 2012, more than 4 GW of PV generation cleared the California auction at a busbar cost of $89/MWh in levelized 2012 dollars—cheaper than power from a new gas-fired combined-cycle plant. To be sure, that’s with the benefit of a 30-percent solar tax credit, which expires in 2016. But by that time you’ll get the same result without the tax credits—as Germany proved last year by cutting its average cost for installed PV systems to half the U.S. average. Or you can get competitive, unsubsidized PV today by counting some of the distributed benefits that astute market actors are starting to exploit, but that most market structures fail to recognize. For more on this, see RMI’s 2002 book, Small Is Profitable.3
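For readers curious about what goes into a levelized busbar cost like the $89/MWh auction figure, here is a minimal LCOE sketch. Every parameter below is hypothetical, chosen only to show the structure of the comparison, not to reproduce the California result:

```python
# Minimal levelized-cost-of-energy (LCOE) sketch. All parameters are
# hypothetical, for intuition only; the $89/MWh figure in the text is
# an actual auction clearing price, not an output of this model.
def lcoe(capex_per_kw, fixed_om_per_kw_yr, fuel_per_mwh,
         capacity_factor, lifetime_yr, discount_rate):
    """Levelized cost in $/MWh for 1 kW of installed capacity."""
    # Capital recovery factor annualizes the up-front capital cost.
    crf = (discount_rate * (1 + discount_rate) ** lifetime_yr
           / ((1 + discount_rate) ** lifetime_yr - 1))
    annual_mwh = 8760 * capacity_factor / 1000      # MWh/yr per kW
    annual_cost = capex_per_kw * crf + fixed_om_per_kw_yr
    return annual_cost / annual_mwh + fuel_per_mwh

# Hypothetical PV (no fuel cost) vs. gas combined cycle (fuel-dominated):
pv = lcoe(capex_per_kw=1800, fixed_om_per_kw_yr=20, fuel_per_mwh=0,
          capacity_factor=0.25, lifetime_yr=25, discount_rate=0.07)
ccgt = lcoe(capex_per_kw=1000, fixed_om_per_kw_yr=15, fuel_per_mwh=35,
            capacity_factor=0.60, lifetime_yr=30, discount_rate=0.07)
```

Note how PV’s cost is almost entirely capital and thus falls with its manufacturing learning curve, while the gas plant’s cost rides on fuel prices—the asymmetry underlying Lovins’s argument.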
The second observation is that in about 20 states, companies like SunEdison, Sun Run, Sungevity, SolarCity, and their rivals will happily come to your house, install solar power on your roof with no money down, and beat your utility bill. There are at least a half-dozen ways an incumbent can respond to such insurgents. It can ignore them; fight them; try to tax or block them; finance them; buy them; incorporate their products as its own branded offering; become an open-source integrator for all qualified offerings; or several other possibilities. But among all responses, playing ostrich isn’t a good one.
This sort of basic challenge to the traditional utility model does focus the mind wonderfully, and calls for an unprecedented level of thoughtfulness and creativity in figuring out both competitive and coopetition models—as well as the new regulatory models that come with them.
But the challenge is more fundamental than it might appear, on at least two levels. First, the old revenue model is broken. California and the U.S. Army, among others, have set goals for net-zero-energy homebuilding, so that new homes will produce at least as much electricity as they use during the course of each year. Well, if you’re charging customers for kilowatt-hours, but their net use of kWh over the year is zero, then they pay you zero net revenue, even though both you and the home are swapping valuable services for which you should both be properly compensated. (See “The Law of Unintended Consequences,” this issue, page 44).
That’s the kind of trouble you get into if, as Walt Patterson reminds us, you treat electricity as a commodity when it’s really an infrastructure.
Second, utilities as we know them are the folks we’ve always hired to keep the lights on and the motors humming. But customers have expanding choices. Utilities that annoy their customers, by price or behavior, risk the sort of bypass that drove many phone customers to abandon the landline and use only cell phones, sometimes with different service providers. The difference is that the cell phone business still relies on a lot of the old copper and fiber assets, whereas electricity customers might bypass the grid entirely. The entrepreneurs who put that competitive solar power on your roof with no money down can provide a portfolio of other equally unregulated products, like efficiency, demand response, storage, and so on, that could ultimately add up to a virtual utility providing the same services that utilities now provide—quite possibly with lower cost and greater reliability and resilience.
Fortnightly: I’ve been hearing a lot of people use that word “resilience” lately, especially in the wake of Superstorm Sandy.
Lovins: That’s right. “Resilience” is another hot trend moving us toward a more efficient, diverse, distributed, and renewable power system. Thirty-one years ago, RMI published a book titled Brittle Power: Energy Strategy for National Security,4 with a foreword by former CIA Director Jim Woolsey and Admiral Tom Moorer, who was chairman of the joint chiefs under President Nixon. That book remains the definitive unclassified work on domestic energy critical infrastructure. It found that power systems were under attack daily around the world and that a handful of people could turn off three-fourths of the oil and gas supply to the Eastern states without leaving Louisiana—then keep it down for a year. Electricity systems were even more vulnerable—and that was before the Internet was deployed in an extraordinarily insecure way to control power systems, creating new and even scarier types and levels of vulnerability, whose exploitation could black out many parts of the country and prevent recovery.
Having served on the Defense Science Board Task Force that surfaced this issue in 2006 through 2008, I’m a little surprised every morning that the lights are still on, because much of the industry still hasn’t taken precautions to secure the electric grid. Yet this vulnerability to grave and potentially economy-shattering disruption is unnecessary and correctable. The most fundamental way to correct it is to shift the architecture of the grid. We can take advantage of small, fast, granular resources on both the supply and demand sides by reorganizing the grid into netted, islandable microgrids. These normally exchange power, but can stand alone at need, disconnecting fractally and reconnecting seamlessly so that critical loads are served by local resources, until wider interconnections are restored. And the more distributed the generation, the greater the reliability, since 98 to 99 percent of U.S. power failures originate in the grid.
Fortnightly: We’ve seen a lot of interest in microgrids lately. But so far I’m only aware of some fairly small pilot projects. Is this technology pie in the sky, or is it real today?
Lovins: Denmark has been piloting such a cellular grid over significant areas. Professor Abe at Tokyo University and a Japanese industry consortium have even developed a “digital grid,” connected by smart, asynchronous inverters that act as routers for electricity—delivered in packets labeled by their price, origin, and impact. Cuba—which we can learn from even if we wouldn’t want to live there—combined netted, islandable microgrids with efficiency and distributed generation to go from 224 serious blackout days in 2005 to zero in 2007. Then in 2008, those microgrids sustained vital services while two hurricanes in two weeks shredded the eastern grid. Some remaining dependence on geriatric, Soviet-era oil plants blacked out 5 million people in greater Havana last September. But in general the change has had a stunning benefit for reliability.
Resilient design can make cascading, large-scale, long-term grid failures impossible by design—rather than inevitable by design, as they are now. The Pentagon has adopted this resilient approach for its own power supply at its bases to ensure its own mission continuity, because they need their stuff to work. Of course so do the rest of us, the people they’re defending. As we rebuild broken infrastructure, Hurricane Sandy should inform us that designing for resilience is no longer a luxury. The security it brings, not just to our country but to each customer, is valuable and highly marketable.
Fortnightly: How is it marketable?
Lovins: I’ve talked to homebuilders about the notion of putting in every new home certain specially colored sockets—say, orange—that don’t go off in a power outage, and to which you could connect things like your refrigerator, freezer, computer, and communication systems. The number of orange sockets could later rise as, for example, PV gets cheaper and the array on your roof expands. Homebuilders agree that even initially this form of household security and insurance would be a very marketable feature. Many of the big merchant homebuilders already offer integrated rooftop solar power systems as an option, and sometimes even as standard equipment. Had that already been widely installed in the Sandy-damaged area, it could’ve saved a great deal of misery and disruption.
The inherent vulnerability of today’s electric grid to natural disaster and solar storms, and its physical and cyber vulnerability to attack, are moving grid security and resilience rapidly up many customers’ agenda. Microgrids offer a vital opportunity for the electricity industry to anticipate customers’ needs while ensuring its own technical, political, and financial resilience.
Fortnightly: The state of Connecticut is actively promoting microgrid development in the wake of recent extended outages. Do you see a need for regulatory or legislative mandates to drive the industry toward building microgrids?
Lovins: My colleagues and I proposed microgrids in Connecticut before Sandy. Especially after Sandy, state regulators and FERC are paying careful attention. Cybersecurity and resilience were a major theme at NARUC’s February national meeting under Phil Jones’s leadership. Policies might spread that, for example, make islandability the default design for distributed generators, rather than, as now, often being prohibited by local practice. Utilities can change that local practice themselves, because often it’s not the result of regulation, but of outmoded assumptions.
Recently I toured a major public building that was just completing installation of a large rooftop PV array. This building was also the principal local tornado shelter. I asked the building operator whether the PV array could work without the grid—powering the building, at least in the daytime, right through a disaster. The operator said, “We have modern inverters with IEEE-1547-compliant islandability features, but we can’t activate them because our utility forbids the practice.” It turned out the utility wasn’t conversant with the industry-consensus standard, and still clung to the old belief that to protect its linemen, inverters must be solely grid-excited, so if the grid goes down they go too.
IEEE 1547 standards provide automatic isolation and protection for linemen. It’s entirely within the purview of utilities to change their rules; I don’t think it would even require commission approval, but I’m sure commissions would recognize the merit and public-safety case if they were asked.
This is a time for utility leadership to get ahead of the curve on this increasing issue of public safety, customer continuity, and national security. We needn’t wait for Congress or any regulatory body to tell us this approach can make sense and make money.
Fortnightly: Generally state regulators don’t tell utilities what to do. Rather they respond to utilities’ filings. So it seems likely the impetus will have to come from customers and utilities themselves, rather than regulators. What factors will drive that demand? Is it simply a matter of customers’ losing patience over outages? Or is it more complex than that?
Lovins: Leadership can come from many directions. SDG&E found the microgrid at the University of California San Diego campus to be extremely useful when a wildfire took down a power line, and the campus was able to switch quickly from being a large net importer of power to a net exporter. The design can come from a customer or a utility, just as a better way of doing business—like the fine work by Susan Story and her team at Southern Company Services in creating great customer value, including radically better outage response, through smart meters and IT integration. In fact, such grid intelligence is a logical step toward microgrids.
There’s an even bigger strategic opportunity here. John C. Fox, whom I mentioned earlier, led the Delta Project at PG&E and two later experiments at Ontario Hydro, illustrating what I’d call the “inside-out utility.” Those utilities traditionally were dominated by their generating side. They’d extrapolate demand, then build generators and wires to meet it. But these experiments instead focused first on the part of the company that was investing the most—distribution. They picked a distribution area where they planned to expand, upgrade, or modernize substations, feeders, or other costly assets, and they examined the end-use structure in that neighborhood. How much of the peak load was coming from commercial lighting, from residential water heating, and so on? With that information, demand-side investments could be aimed like a rifle, not a shotgun, specifically to reduce those loads in that neighborhood, and thus defer or avoid building costly distribution infrastructure. If that didn’t suffice, then the utility could refurbish distribution or even consider distributed generation. The surprise was that by starting at the distribution planning area and working back toward generation, they found that customer needs could be reliably met without expanding T&D—with about an order of magnitude lower capital investment.
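The “inside-out” planning comparison Fox’s projects ran can be caricatured in a few lines. All numbers below are hypothetical, purely to illustrate the logic of buying targeted negawatts instead of wires:

```python
# Toy sketch of "inside-out" distribution planning (all figures hypothetical):
# if targeted efficiency can shave enough peak load in one distribution area,
# a costly substation upgrade can be deferred or avoided.
peak_load_mw = 48.0
substation_capacity_mw = 50.0
load_growth_mw = 4.0                 # projected growth triggering an upgrade

upgrade_cost = 12e6                  # cost of new distribution capacity
efficiency_cost_per_mw = 0.4e6       # targeted negawatts in that neighborhood

# How much peak load must disappear to stay within existing capacity?
shortfall_mw = peak_load_mw + load_growth_mw - substation_capacity_mw  # 2 MW
demand_side_cost = shortfall_mw * efficiency_cost_per_mw

# Buying 2 MW of targeted load reduction beats a $12M wires upgrade here,
# echoing the "order of magnitude lower capital investment" finding.
assert demand_side_cost < upgrade_cost
```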
In today’s tough investment environment, that’s an extremely important lesson to recall, test, and spread around. And of course it’s fully consistent with the focus on resilient distribution architecture, with a least-cost strategy emphasizing the best buys first, notably end-use efficiency and demand response.
Fortnightly: What efforts at RMI do you see as most promising for bringing about the neoclassical competitive marketplace you described in your 1985 article?
Lovins: In electricity, our flagship effort to understand and help create that future is the e-Lab, whose three dozen members have launched some important research initiatives.
You can’t understand the electricity system in isolation from the sectors it serves. Nearly three-fourths of U.S. electricity powers buildings. The rest runs industry. In Reinventing Fire,5 we showed how U.S. buildings could triple or quadruple their energy productivity with a 33-percent internal rate of return (IRR), while industry could double its energy productivity with a 21-percent IRR. These things can be achieved by 2050 if efficiency’s average national rate of adoption ramps up over 20 years to the levels already achieved by 2009 in the Pacific Northwest—which seems ambitious but plausible. Altogether, we showed how to run a 2.6-fold bigger economy in 2050 with no oil, coal, or nuclear energy, and with one-third less natural gas, 82 to 86 percent less carbon emissions, and a $5 trillion lower cost in net present value, counting all externalities at zero. We showed this will require no new inventions or acts of Congress, but rather will be led by business for profit—$5 trillion being ample inducement. And we found that an 80-percent renewable, highly reliable and resilient, half-distributed electricity system could cost essentially the same as business-as-usual.
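To make the IRR figures concrete, here is a small sketch—hypothetical cash flows, not RMI’s actual model—showing what a roughly 33-percent internal rate of return means in practice: a $100 retrofit returning $35 a year in energy savings for ten years:

```python
# Hypothetical example of a 33-percent IRR: a $100 efficiency retrofit
# that saves $35 per year for 10 years. Not RMI's actual analysis.
def npv(rate, cashflows):
    """Net present value of cash flows at t = 0, 1, 2, ..."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=0.0, hi=10.0, tol=1e-6):
    """Bisection search for the discount rate at which NPV crosses zero."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cashflows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

retrofit = [-100] + [35] * 10
rate = irr(retrofit)   # roughly 0.33, i.e. a ~33% internal rate of return
```

Few conventional supply-side investments clear that hurdle, which is why efficiency is the “best buy first” in a least-cost portfolio.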
Dwight D. Eisenhower said, “If a problem can’t be solved, enlarge it.” You expand the problem’s boundaries until they embrace everything the solution requires. You integrate the sectors, including buildings, industry, and transportation, and you innovate not just through technology and public policy, but also in design and strategy.
You also use integrative design—optimizing a whole building, factory, or vehicle as a system, not components in isolation, to achieve multiple rather than single benefits. This approach makes large energy savings often cheaper than small savings, turning diminishing returns into expanding returns. That’s the biggest game changer on the demand side. But also, integrating across sectors makes it easier to solve the electricity and automobile problems together than [it would be to solve them] separately. As breakthrough auto designs—some entering mass production this year in Germany—make electric automobiles affordable with great competitive advantage, their controllable off-peak loads and distributed storage can help integrate variable wind and solar power sources into the grid.
There’s increasing evidence, too, that with a properly integrated portfolio of renewables—diversified by type and location, properly forecasted, and artfully combined with flexible supply- and demand-side resources on the grid—the storage and backup required for stable and reliable power supply could well be less than business-as-usual scenarios need to manage the intermittence of large thermal stations.
Looking at it symmetrically, if we worry about firming, balancing reserves, and integration for variable renewables, then we should also calculate the same costs for traditional generating assets. Reserve margin, spinning reserve, and redundant transmission capacity aren’t free either.
We’re moving into an era where all ways to make or save energy will get to compete fairly, at honest prices, regardless of their type, technology, size, location, and ownership.
1. Amory B. Lovins, “Saving Gigabucks with Negawatts,” Public Utilities Fortnightly, March 21, 1985.
2. The Fortnightly article wasn’t the first public use of the term; that happened a few months earlier, when Lovins delivered a presentation on the same topic at the NARUC Annual Meeting.
3. E. Kyle Datta, et al., Small Is Profitable, Rocky Mountain Institute, 2002.
4. Amory B. Lovins and L. Hunter Lovins, Brittle Power: Energy Strategy for National Security, Brick House Publishing, 1982.
5. Amory B. Lovins, Reinventing Fire: Bold Business Solutions for the New Energy Era, Chelsea Green Publishing, 2011.