Big Data, Big Change

Deck: Analytics chart a path for industry transformation.

Fortnightly Magazine - August 2013

As anyone who’s been in this industry for more than a few years knows, technology trends come and go. But sometimes they seem to return after a few years with a different name, a different buzzword.

One case in point is “ERP” – enterprise resource planning software. Back in the 1990s, companies like SAP and PeopleSoft developed ERP systems for utilities that would bring disparate data together into a common warehouse. They provided customized dashboards to help utility professionals in various departments do their jobs better. These systems helped them analyze historical trends, detect anomalies and problems, and produce reports to support major investment decisions. The analysis could improve ongoing processes too.

At around the same time, many utilities also built or bought sophisticated customer information systems (CIS) and customer relationship management (CRM) systems – especially in deregulated markets, where they had to deal with new competitive pressures. These systems helped utilities mine their customer data to support marketing and customer service efforts.

ERP, CIS, and CRM couldn’t do everything, however. For example, they couldn’t erase barriers within utility organizations. IT departments – the natural teams to implement and integrate the new systems – were functionally and culturally separate from the operational groups that needed the analysis. Bringing data together from multiple silos raised security and regulatory questions – as well as technical challenges, turf anxiety, and plain-old intra-company friction. 

By Y2K, the industry started focusing on other things, such as better security and reliability. Then, in the wake of Enron and the dot-com meltdown, “back to basics” became the industry’s mantra. Magic boxes took a back seat to corporate restructuring, spinoffs, and meat-and-potatoes execution of regulated utility service. 

Fast forward 10 years, and the magic box has returned, but this time it’s bearing a new name: “big data analysis.” And it’s brought a phalanx of smart grid technologies and information streams that make ’90s-era ERP look like a puppet show.

The practical issues that stymied ERP and CRM haven’t gone away. But a whole generation of experience with advanced data systems has, in effect, opened doors in silo walls. From meter data management (MDM) to outage management systems (OMS), utility operations have adopted and implemented 21st-century information technology. And IT departments now understand the demands of operations technology (OT) more clearly than they did before. 

“Without integrating operational data with traditional IT data, I don’t think the industry would be any further along than it was five or 10 years ago,” says Steve Ehrlich, a vice president with Space Time Insight. “Analytics is the ability to look at operational data in a new light, and that’s happening right now.”

To learn how analytics capabilities have evolved, and how they’re being deployed and used today, Fortnightly spoke with experts at several companies that are leading the charge. They include:

  • Paul Bower and Ray Kasten, ABB Ventyx
  • Scott Stallard, Black & Veatch
  • Rick Geiger, Cisco
  • Michael Valocchi, IBM
  • Matt Owens, Itron
  • Steve Ehrlich, Space Time Insight

Their comments suggest that recent investments are paving the way toward a future in which big data analytics is an integral part of utility operations across the entire value chain, from power plant management to customer engagement. 

And this time the magic box seems to be here for the long haul.

FORTNIGHTLY What do you see as the primary benefits of analytics today, versus those of data analysis systems of the past? 

Michael Valocchi, IBM: It’s been interesting to see how the terminology around analytics and big data has evolved along with the applications. Previously the industry was focused on the data we could get, rather than the business value we wanted. As we implemented automated meters and smart grid technology, we had all this data and said, “OK, what could we do with it?” We’ve reached the point now where we have flipped the question. The benefit now is in integrating systems, asking what problems we want to solve, and then getting the data. For example, outage notification could involve operational analytics, customer analytics, and marketing. Big data analytics allows us to integrate it all and then drive value from there.

Paul Bower, ABB Ventyx: Big data analytics is a new thing on the scene for utilities. But two analytics issues have been with us for a while. Those involve dealing with the broad range of data in the systems space, and also the veracity and quality of that data. What we haven’t seen before is the huge volumes that people term “big data.” We’re seeing more granularity in metering information, and trying to convert semi-structured and unstructured data into structured data for analytics. 

Ray Kasten, ABB Ventyx: We’re taking data from half-a-dozen different sources. Where historically that was isolated data – like offline oil condition measurements, vibration sensors, work management information, and cost information – we’re bringing it all together and giving customers actionable intelligence. Where analytics in the past was about reactive maintenance, now it’s about time-based or condition-based maintenance.

It used to be that only about 25 to 50 people at a company used an outage management system. They were mostly people in a dispatch center, and a handful of engineers who executed work orders. Now the data is being used by more than 3,000 users at some utilities. We’re presenting big data with granularity based on the user’s needs. It’s intelligence for the COO, or for the supervisor in the field.

Matt Owens, Itron: For 15 years we’ve been using advanced statistical models to forecast energy demand, bringing in weather data and other factors to help fine tune those models. California ISO uses it for day-ahead forecasting to manage the transmission network. Even before smart metering we had solutions around revenue protection that were based on analytics. They looked at data from AMR systems – billing and move-in and move-out data, and tamper flags. We had an hourly load profile for every customer, and it was synthesized for transformer loading analysis, so we could look at asset health. Xcel Energy in the early 2000s used our tools for their proactive transformer replacement program.
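
To make the forecasting idea concrete, here is a minimal sketch with invented data and coefficients – not Itron’s actual models – showing how hourly load can be regressed on weather and time-of-day features to produce a day-ahead forecast.

```python
# Minimal sketch (invented data, not Itron's models): regress hourly load on
# weather and time-of-day features, then forecast tomorrow from forecast temps.
import numpy as np

rng = np.random.default_rng(0)

# One year of hourly temperatures (deg F) and synthetic load (MW).
hours = np.arange(24 * 365)
temp = 55 + 20 * np.sin(2 * np.pi * hours / (24 * 365)) + rng.normal(0, 3, hours.size)
load = (400 + 4.0 * np.maximum(temp - 65, 0) + 2.5 * np.maximum(55 - temp, 0)
        + 30 * np.sin(2 * np.pi * (hours % 24) / 24) + rng.normal(0, 10, hours.size))

# Feature matrix: intercept, cooling/heating-degree terms, time-of-day harmonics.
X = np.column_stack([
    np.ones_like(temp),
    np.maximum(temp - 65, 0),
    np.maximum(55 - temp, 0),
    np.sin(2 * np.pi * (hours % 24) / 24),
    np.cos(2 * np.pi * (hours % 24) / 24),
])
coef, *_ = np.linalg.lstsq(X, load, rcond=None)

# Day-ahead forecast from tomorrow's forecast temperatures (hypothetical values).
tomorrow_temp = np.full(24, 78.0)
tomorrow_hours = np.arange(24)
X_new = np.column_stack([
    np.ones(24),
    np.maximum(tomorrow_temp - 65, 0),
    np.maximum(55 - tomorrow_temp, 0),
    np.sin(2 * np.pi * tomorrow_hours / 24),
    np.cos(2 * np.pi * tomorrow_hours / 24),
])
print("Forecast peak (MW):", round((X_new @ coef).max(), 1))
```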

Now with smart metering, there are countless opportunities. I see analytics as a way to provide a wide range of benefits to utilities and in many cases help them transform the business. As the grid becomes more two-way – these days the term is “transactive energy” – analytics will play a larger role. With smart metering data, there are many more applications, and things you can do with analytics solutions that you couldn’t do with monthly billing data before.

FORTNIGHTLY What examples of analytics in practice at utilities today demonstrate the technology’s benefits and capabilities?

Owens, Itron: Avista Utilities is using our solution for grid analytics – transformer loading, power quality, conservation voltage reduction (CVR), and revenue protection. Those solutions are aimed at optimizing the grid and improving reliability. Avista has smart meters deployed in Pullman, Wash., and they’re using the data for different benefits. One is power quality. By collecting voltage data you can continuously monitor what’s happening at the edge of the grid from a power delivery standpoint. Under-voltage conditions or faults affecting certain customers can trigger an investigation to see if you’ve got a problem on a line. Avista is using our solution for that.
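
As an illustration of the voltage-monitoring idea Owens describes, the sketch below – with hypothetical meter reads and thresholds, not Avista’s or Itron’s implementation – flags meters whose interval voltage falls below the ANSI low limit so an engineer can investigate the line.

```python
# Minimal sketch (assumed data): flag meters whose interval voltage readings
# fall below the ANSI C84.1 Range A lower limit (114 V on a 120 V base).
LOW_LIMIT_V = 114.0

# Hypothetical hourly voltage reads per meter, keyed by meter ID.
voltage_reads = {
    "meter_001": [121.2, 120.8, 119.9, 120.4],
    "meter_002": [116.3, 113.8, 112.9, 115.1],   # sagging voltage
    "meter_003": [122.0, 121.7, 121.9, 122.3],
}

def undervoltage_exceptions(reads, limit=LOW_LIMIT_V, min_intervals=2):
    """Return meters with at least `min_intervals` readings below the limit."""
    flagged = {}
    for meter, volts in reads.items():
        low = [v for v in volts if v < limit]
        if len(low) >= min_intervals:
            flagged[meter] = min(low)
    return flagged

for meter, worst in undervoltage_exceptions(voltage_reads).items():
    print(f"{meter}: low-voltage exception, worst reading {worst} V")
```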

They’re also using it for volt-VAR optimization and CVR. Part of Avista’s ARRA project included the purchase of capacitor bank controllers and other CVR equipment. Analytics won’t solve the whole problem, but it will help solve a big part of it. Avista picked six feeders to install CVR equipment, with the objective of reducing the total energy on the feeder while maintaining service levels to customers. They establish baseline voltage levels with our system, and then monitor the equipment and the feeder, and make adjustments, fine-tuning voltage levels. The goal is a 1 to 1.5-percent reduction in energy delivered to that feeder. When you expand that to the whole network, it can mean big savings.
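
The CVR arithmetic is straightforward to illustrate. The back-of-the-envelope sketch below uses invented feeder energy numbers; a real measurement and verification study would also weather-normalize the two periods.

```python
# Back-of-the-envelope sketch of the CVR measurement idea (invented numbers):
# compare feeder energy before and after voltage reduction.
baseline_kwh_per_day = 118_000      # pre-CVR feeder energy (hypothetical)
post_cvr_kwh_per_day = 116_400      # after lowering voltage toward the low end

reduction = 1 - post_cvr_kwh_per_day / baseline_kwh_per_day
print(f"Feeder energy reduction: {reduction:.1%}")   # ~1.4%, inside the 1-1.5% goal

# A common rule of thumb expresses this as a CVR factor:
# percent energy change per percent voltage change (assumed voltage change).
voltage_reduction_pct = 2.0
cvr_factor = (reduction * 100) / voltage_reduction_pct
print(f"Implied CVR factor: {cvr_factor:.2f}")
```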

Steve Ehrlich, Space Time Insight: At Hydro One in Toronto, their challenge was to make a rate-case justification for replacing a lot of old distribution assets. How could they know which ones were in greater need of replacement than others? Our system helps them view about 4.5 million T&D assets on a map. They can drill down to street level, and see all sensors and performance indicators associated with an asset. Or they can view assets on a survivability curve, and rank them on a score driven by an analytical formula. It’s a very visual approach, but it also allows them to total up the assets that need replacement and then make a rate case based on avoiding outages.
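
An asset-ranking score of this kind can be sketched in a few lines. The weights, fields, and records below are invented for illustration and are not Space Time Insight’s actual formula.

```python
# Illustrative sketch only: rank assets by a composite condition score so the
# strongest replacement candidates float to the top. Weights are assumptions.
from dataclasses import dataclass

@dataclass
class Asset:
    asset_id: str
    age_years: float
    expected_life_years: float
    defects_last_5yrs: int
    load_factor: float          # 0..1, how heavily the asset is loaded

def replacement_score(a: Asset) -> float:
    """Higher score = stronger replacement candidate (assumed weighting)."""
    age_ratio = min(a.age_years / a.expected_life_years, 1.5)
    return (0.5 * age_ratio
            + 0.3 * min(a.defects_last_5yrs / 5, 1.0)
            + 0.2 * a.load_factor)

fleet = [
    Asset("XFMR-1021", 48, 45, 3, 0.82),
    Asset("XFMR-2207", 12, 45, 0, 0.40),
    Asset("POLE-8816", 61, 50, 1, 0.55),
]

for a in sorted(fleet, key=replacement_score, reverse=True):
    print(f"{a.asset_id}: score {replacement_score(a):.2f}")
```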

The system also helps Hydro One to consolidate field service activity. They can see which jobs are related to other jobs in the same area, so crews can fix multiple problems in the same truck roll.

Bower, ABB Ventyx: ComEd is using analytics for outage management. That’s taking a lot of data, moving it into a real-time environment, and pushing it out to millions of customers in addition to internal staff. By communicating proactively with consumers, versus withholding information, they’ve improved customer satisfaction levels. [ComEd CEO] Chris Crane commented that it’s improving relations with customers.

Scott Stallard, Black & Veatch: Our platform is called Asset 360. Kentucky Utilities uses it for operational intelligence, to find anomalies in power plant systems. It focuses on early detection of issues and turning those discoveries into actions and work management. They’ve discovered problems whose resolution has paid back the system’s cost four to five times over, consistently every year.

Whether it’s a power plant, substation, transformer, or other device in the field, you apply the same logic: bring back data streams, understand the performance issues, and take actions based on that information. And you can also look forward and apply big math – predictive math – to anticipate what will happen at different assets. 
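
The early-detection idea reduces, at its simplest, to watching for readings that drift away from their recent history. The sketch below uses a rolling z-score on synthetic sensor data; it is not Black & Veatch’s Asset 360 logic.

```python
# Minimal anomaly-detection sketch (synthetic data): flag sensor readings that
# drift far from their trailing rolling average.
from collections import deque
import statistics

def rolling_anomalies(readings, window=20, threshold=3.0):
    """Yield (index, value) pairs whose z-score vs. the trailing window exceeds threshold."""
    history = deque(maxlen=window)
    for i, x in enumerate(readings):
        if len(history) == window:
            mean = statistics.fmean(history)
            stdev = statistics.pstdev(history) or 1e-9
            if abs(x - mean) / stdev > threshold:
                yield i, x
        history.append(x)

# Bearing temperature (deg C) with a creeping fault injected near the end.
temps = [71.0 + 0.1 * (i % 5) for i in range(200)] + [74.5, 76.0, 78.2]
for idx, value in rolling_anomalies(temps):
    print(f"interval {idx}: {value} deg C looks anomalous")
```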

Valocchi, IBM: CenterPoint is an interesting example, because of the competitive nature of its markets. They’re at the integration point with analytics. They were one of the leaders in smart metering and smart grid, they’ve continued to apply analytics in their operations, and they’ve embarked on applying big data analytics to customer service.

It’s about efficiency. Five years ago, people were talking about getting customers out of service centers and into IVR [interactive voice response]. Big data takes that further, allowing customers to interact with the utility at specific points in time, in different channels, with consistent information across those channels. That’s driving costs out of processes.

Rick Geiger, Cisco: If you look around the U.S., all the large utilities that have done major rollouts of smart metering infrastructure are at one place or another on the [continuum] of data analytics. At FPL, they’ve got performance and diagnostic centers that perform analytics for prediction. They’re also using PMUs [phasor measurement units] at substations, and doing analysis of line protection and control systems. ComEd in Chicago has done a lot with feeder automation, automated switches, and self-healing architecture. So has Con Edison in New York. And Southern California Edison has a very advanced system that they call the “centralized remedial action scheme.” They’ve integrated their smart meters with OMS.

FORTNIGHTLY We continue hearing about organizational silos and other challenges that stymie technology deployments at utilities. How are these challenges affecting analytics efforts, and how are they being overcome?

Ray Kasten, ABB Ventyx: Over the years one of the problems has been that four different departments at a utility might have four different answers for the same problem. We’ve been able to tie different data sources together to provide a single source of the truth. But another challenge has been dirty and incomplete data – and operational data that’s unstructured. 

Bower, ABB Ventyx: We’ve developed a consistent data model to deal with the challenges of bringing key data together – whether it’s in PDF files, unstructured sources, or very structured sources – and conforming them to that standard information model. Once deployed, it’s an open data structure that’s easy for the utility to use.
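
Conforming records from different source systems to one shape is the heart of that approach. The sketch below uses invented field names and mapping rules – not ABB Ventyx’s actual information model – to show the pattern.

```python
# Simplified sketch of conforming disparate source records to a common model.
# Field names and mappings are invented for illustration.
def from_work_mgmt(rec):
    return {"asset_id": rec["equip_no"], "event": "work_order",
            "timestamp": rec["completed_on"], "detail": rec["job_desc"]}

def from_oil_lab(rec):
    return {"asset_id": rec["transformer"], "event": "oil_sample",
            "timestamp": rec["sampled_at"],
            "detail": f"moisture={rec['moisture_ppm']}ppm"}

conformers = {"work_mgmt": from_work_mgmt, "oil_lab": from_oil_lab}

raw_feeds = [
    ("work_mgmt", {"equip_no": "XFMR-1021", "completed_on": "2013-05-02",
                   "job_desc": "Replaced bushing"}),
    ("oil_lab",   {"transformer": "XFMR-1021", "sampled_at": "2013-05-10",
                   "moisture_ppm": 32}),
]

unified = [conformers[src](rec) for src, rec in raw_feeds]
for row in unified:
    print(row)
```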

Ehrlich, Space Time Insight: Silos are clearly one of the bigger problems. The reason that individual owners of systems are so wary [of integration] is that if you take data and replicate it in another system for analytical purposes, they question whether the data is accurate. Is it copied out? Is it current? We don’t want to create another silo. So instead we access the data from real live systems and present the exact same data to users that they’d get when they look at their regular interface. That makes people more comfortable.

Geiger, Cisco: Another challenge to resolve is how all of this works together with NERC CIP requirements. Where data comes from systems within the purview of NERC CIP, utilities have to be very careful to implement both cyber and physical security, and avoid creating interconnections that cause vulnerabilities. 

Owens, Itron: Funding allocation within an organization is a big challenge. You want one platform, and that system has many different applications – power quality, outage management, financial analysis – all providing benefits to different departments. So who pays? And how do you get approval? 

Talent is another issue. “Data scientist” is a hot job right now, and I don’t think most utilities have data scientists on staff. We provide consulting services, building new algorithms and fine tuning the system to get the most out of it. But it requires a collaborative effort between the provider and the utility to build and use the system. 

Another issue is data governance. What are the sources of data records? Who has access to those systems? What are the rules for using the data? Who’s responsible for updating it? Utilities are also grappling with support for cloud solutions. Many innovative analytics solutions are based on cloud approaches. The provider can refine and improve the system faster, and it’s easier for a utility to scale the application to its needs. The utility needs to have a data governance plan in place before it can move ahead with any big analytics initiative.

Stallard, Black & Veatch: The traditional way to avoid getting into trouble is to build it incrementally, one tool at a time. But the endgame is an integrated architecture, a different way of viewing all the pieces. 

FORTNIGHTLY What lessons have been learned about how to apply analytics at utilities?

Owens, Itron: The industry has learned that if you start with a big vision and try to implement it all in one phase, then nothing gets done. That was the classic problem with data warehousing. More than 50 percent of those IT projects failed. Starting small in one area is a tactic that helps you get some success quickly, and that’s what utilities really need. 

Also utilities need to start thinking of analytics as a core part of their operations. Big data analytics has the potential to transform the way they do business. Think about distribution planning as one example. For 50 years it’s been done by engineers using rules of thumb about how they size equipment. They have planning tools, but they’re looking at models. Now you have the opportunity to look at actual data from your network to do distribution system design and planning. You can get major benefits just from doing that. 
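
The shift Owens describes – from rules of thumb to actual interval data – can be illustrated with a small example. The connectivity, readings, and ratings below are invented; a real analysis would cover a full year of intervals and account for power factor.

```python
# Sketch with invented data: sum actual interval loads from the meters behind
# each transformer and compare peak coincident load to the nameplate rating.
transformer_kva = {"T-104": 50, "T-221": 25}

# Hourly kW by meter (hypothetical AMI data), and which transformer serves each meter.
meter_kw = {
    "m1": [2.1, 3.4, 8.9, 12.2], "m2": [1.0, 1.2, 6.5, 9.8],   # on T-104
    "m3": [0.8, 1.1, 5.2, 7.9],  "m4": [1.4, 2.0, 7.7, 11.3],  # on T-221
}
served_by = {"m1": "T-104", "m2": "T-104", "m3": "T-221", "m4": "T-221"}

n_intervals = 4
for xfmr, rating in transformer_kva.items():
    meters = [m for m, t in served_by.items() if t == xfmr]
    coincident = [sum(meter_kw[m][i] for m in meters) for i in range(n_intervals)]
    peak = max(coincident)
    print(f"{xfmr}: peak {peak:.1f} kW vs {rating} kVA rating "
          f"({peak / rating:.0%} of nameplate, assuming unity power factor)")
```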

Lastly, it’s critical for customer engagement and satisfaction to understand that analytics is a continuous process. You don’t just implement one solution and say, “Now we’re good for 10 years.”

Ehrlich, Space Time Insight: When companies say that analytics isn’t working for them, it’s often because the systems built over the last five or 10 years aren’t designed to deal with real-time data feeds or data from outside a database. But if you take a situational intelligence kind of approach, it changes things dramatically because you’re combining historical data with real-time data. 

Valocchi, IBM: The key lesson is that building big data capabilities in a silo doesn’t work. People are learning that using analytics as the integration point is a much better answer. The second lesson is that you don’t start by asking what data is available. You start with the opposite question. What business problem am I trying to solve, and what data analytics do I need to solve it? You build your strategy around that, from the bottom up.

FORTNIGHTLY What do you see as the long-term vision for analytics? And where are we on the path toward realizing that vision?

Owens, Itron: Not all utilities are seeing it yet, but transactive energy is coming. Distributed energy resources are coming, it’s just a question of when. We absolutely see analytics playing a role in grid optimization and reliability [in a system with a lot of distributed energy resources]. 

At SDG&E, one problem is that they’ll have a big deployment of PV in certain areas, and clouds will come in and they’ll see a sudden drop in total power output on a particular feeder. That’s causing stress on equipment at the substation level, and within the feeder. Utilities are saying that if they had a near-real-time forecast of the output or load of distributed energy resources, it would help them better optimize and plan.
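
A first cut at that kind of feeder-level visibility might look like the sketch below, which simply alerts when aggregate PV output drops sharply between intervals. The data and threshold are invented, not SDG&E’s system.

```python
# Rough sketch (invented data): watch aggregate PV output on a feeder and
# alert when it drops sharply between intervals, e.g. as clouds move in.
def pv_ramp_alerts(output_kw, max_drop_fraction=0.30):
    """Return intervals where PV output fell more than max_drop_fraction in one step."""
    alerts = []
    for i in range(1, len(output_kw)):
        prev, cur = output_kw[i - 1], output_kw[i]
        if prev > 0 and (prev - cur) / prev > max_drop_fraction:
            alerts.append((i, prev, cur))
    return alerts

# 5-minute aggregate PV output on one feeder (kW) as a cloud bank passes.
feeder_pv_kw = [1820, 1790, 1805, 1760, 980, 940, 1510, 1780]
for i, prev, cur in pv_ramp_alerts(feeder_pv_kw):
    print(f"interval {i}: PV dropped from {prev} kW to {cur} kW")
```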

Ehrlich, Space Time Insight: The infrastructure that utilities are working with today is really complex, and it will get more complex in the future. If thousands of people are plugging their cars in at night, does the utility have the distribution capacity to deal with that? Is it necessary to change the pricing model? Many different questions come up when you bring distributed generation and electric vehicles into the fold. It makes the operating environment that much more complex. The utilities that will be most successful at answering those questions are the ones that can harness data from different systems and figure out the relationships between them.

The vision of a transactional smart grid is the right vision. But it requires that as you implement technologies on the grid, you implement analytics at the same time. Otherwise you end up with smart grid data and no idea how to process it. Analytics of the future will be integrated and rolled out at the same time as smart grid technology, so the data will become immediately accessible.

Kasten, ABB Ventyx: The future vision involves data from tens of millions of data sources. Utilities already can triangulate outages based on incoming phone calls. The next step will use things like Instagram, with photos of power poles down, correlated with utility data. All types of information will play a role in a planning model versus a reactive model.
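
The triangulation Kasten mentions can be sketched as grouping trouble calls by their upstream device. The connectivity model and calls below are invented; real outage management engines trace the full network model.

```python
# Toy sketch (invented connectivity and calls): group trouble calls by the
# protective device that feeds each caller and infer the likeliest failure point.
from collections import Counter

upstream_device = {
    "cust_17": "fuse_F12", "cust_18": "fuse_F12", "cust_19": "fuse_F12",
    "cust_44": "fuse_F31", "cust_90": "fuse_F12",
}

incoming_calls = ["cust_17", "cust_19", "cust_90", "cust_44"]

calls_per_device = Counter(upstream_device[c] for c in incoming_calls)
device, n_calls = calls_per_device.most_common(1)[0]
print(f"Probable outage at {device}: {n_calls} of {len(incoming_calls)} calls trace to it")
```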

Stallard, Black & Veatch: Unstructured data will be used only in very narrow windows for a while. If you have structured data that was designed for one purpose and you’re using it for another, you have to be careful the data is applicable. When you bring in unstructured data, like text and voice messages, tweets, and photos, you have to be even more careful. Is a tweet related to something that’s really happening, or is it a false positive? Analytics will have to root out false data from real data.

Geiger, Cisco: I think the path has become clear, but the vision is maybe too big to grasp.

The industry is going through as profound a transformation as an industry can have. It’s going from a legacy electric power supply chain that was unidirectional – generation to delivery – to one that’s just beginning to see the incursion of generation at the edge. As that grows and can no longer be managed by the existing models – whether they’re business models, control models, or engineering models – the system will be completely transformed. Generation at the edge will require a new control model, with the proliferation of active control in a system that currently has a higher percentage of electromechanical devices than active intelligent ones.

You can’t have this look like the Internet – unmanaged, with all kinds of things hanging out there that aren’t carefully monitored and controlled. The system needs a power engineering model that’s equipped to deal with 1 million network nodes, not to mention the network that connects them. Utilities have nothing close to that yet. The closest thing is in the IT department, and that means OT and IT had better get together and make this work.

The issue isn’t limited to distributed generation; it’s the whole set of distributed energy resources. If the day comes that we all have home energy systems that watch an electricity price, then the market could reach a certain price and suddenly hundreds of megawatts of load could drop off the grid – completely outside any power engineering control system, driven entirely by financial transactions.
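
The scale of that concern is easy to illustrate with invented numbers: if enough home energy systems share a similar price trigger, load steps off the grid all at once when the price crosses it.

```python
# Illustration with invented numbers: a shared price trigger produces a step
# change in load the moment the price crosses the threshold.
homes = 200_000
avg_controllable_kw = 1.5           # per home, assumed
trigger_price = 120.0               # $/MWh threshold assumed by the devices

prices = [95, 104, 112, 118, 123, 131]          # successive 5-minute prices ($/MWh)
for p in prices:
    shed_mw = homes * avg_controllable_kw / 1000 if p >= trigger_price else 0
    print(f"price ${p}/MWh -> load shed {shed_mw:.0f} MW")
```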

Remember the flash crash of a few years ago, where the Dow Jones dropped by 1,000 points in a few minutes, and nobody could figure out why? We need to learn the lessons of the stock market. Here we are, engineering a price-responsive, transactive grid. If we’re going to have things in the market that are price-responsive, we need to understand how that will interact with electric power control systems. 

What did Wall Street put in place to prevent flash crashes? Not coincidentally, they’re called “circuit breakers.” It’s all of a piece.