IT Roundtable: The Digitized Grid

Data gathering and controllability offer the quickest path to reliability.
Fortnightly Magazine - January 2005

speaking, was way behind the concepts that people were proposing about how a smart system should operate. They were looking, from a high-level point of view, at what would be the right thing to do, conceptually, to develop the system.

They introduced the state-based management concept, which categorized the system as being in a normal, emergency, or restorative state. Each state defined what the operator had to do; maybe 80 percent of the time the system was in the normal state, and 20 percent or less of the time it was having problems. This idea was very powerful and is still valid today, but the IT was way behind it. Most of the idea couldn't be implemented, and we saw things happen only gradually over the next 30 years.
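
To make the state-based idea concrete, here is a minimal Python sketch (an illustration only, not EPRI's formulation; the three state names come from the interview, everything else is assumed):

    from enum import Enum, auto

    class SystemState(Enum):
        NORMAL = auto()       # constraints satisfied; routine operation
        EMERGENCY = auto()    # operating limits violated; act immediately
        RESTORATIVE = auto()  # load has been lost; restore service

    def classify_state(limits_violated: bool, load_unserved: bool) -> SystemState:
        """Toy classifier; real criteria rest on detailed security analysis."""
        if load_unserved:
            return SystemState.RESTORATIVE
        if limits_violated:
            return SystemState.EMERGENCY
        return SystemState.NORMAL

    # Each state maps to a broad operator objective.
    OPERATOR_OBJECTIVE = {
        SystemState.NORMAL: "maintain security margins at least cost",
        SystemState.EMERGENCY: "relieve violations, even at economic cost",
        SystemState.RESTORATIVE: "reconnect lost load and return to normal",
    }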

Today we find ourselves in the inverse situation. The concepts I described are more or less unchanged, but IT has grown enormously, and we now have computers capable of handling many things amazingly fast. Data gathering from substations, generating stations, and the grid in general is becoming possible on a massive scale. A digital relay has internal buffers that can store data about the last five minutes of operation, and it can sample conditions at a millisecond rate. Extremely high-resolution data is now available for any quantity you want. So more things can be done, but they aren't being done. That is the motivation for the work we are doing right now: to use everything IT has to offer.
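
To give a sense of the data volume behind that claim, here is a minimal sketch (assumed numbers, not any vendor's relay firmware) of a rolling buffer that keeps five minutes of one quantity sampled once per millisecond:

    from collections import deque

    SAMPLE_RATE_HZ = 1_000                         # one sample per millisecond (assumed)
    WINDOW_SECONDS = 5 * 60                        # five minutes of history
    BUFFER_SIZE = SAMPLE_RATE_HZ * WINDOW_SECONDS  # 300,000 samples per quantity

    class RelayBuffer:
        """Rolling window of recent measurements; the oldest samples drop off."""

        def __init__(self) -> None:
            self.samples: deque = deque(maxlen=BUFFER_SIZE)

        def record(self, value: float) -> None:
            self.samples.append(value)

        def snapshot(self) -> list:
            """Return the stored window, e.g. for post-event analysis."""
            return list(self.samples)

    buf = RelayBuffer()
    for _ in range(BUFFER_SIZE + 10):  # simulate slightly more than five minutes
        buf.record(60.0)               # e.g. a frequency reading in Hz
    print(len(buf.snapshot()))         # 300000: only the latest window is kept

A single relay channel at that rate already yields hundreds of thousands of points per event window, which is why the architecture has to plan for massive data gathering.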

Fortnightly: How will the IntelliGrid project accomplish that?

Sobajic: With all this data becoming available, we have to re-think how we gather it and use it to assist functions. EPRI took the first step and conducted a study, and IntelliGrid is the result of that.

The IntelliGrid architecture is the foundation. It has to pick up all the data available, and this data might evolve into something that is not just electrical quantities, but other things like video or sound. Thirty years from now we don't want to say, "If we'd only thought of this…" It's a far-out system design that carries you as far as the imagination can go.

That part is behind us, though it will be refined as we go forward.

Next we will focus on system models. These are simulated environments where we can model different operations and controls. The big challenge is to make these models run closer to real time, and to calibrate them so they behave more like the real system. For example, one model is a load model. It could represent a single factory or an entire region. To simulate the load of the city of San Francisco, for example, the program uses a single impedance value. But one number cannot represent a load of that complexity. It has dynamic components, and we need to reflect that with greater fidelity.
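
To illustrate the fidelity point, compare a single-impedance load with a simple ZIP-style load model (a common textbook formulation, used here only as an example; the coefficients are assumed, not calibrated):

    def constant_impedance_load(v_pu: float, p_nominal_mw: float) -> float:
        """Entire load as one impedance: power scales with the square of voltage."""
        return p_nominal_mw * v_pu ** 2

    def zip_load(v_pu: float, p_nominal_mw: float,
                 z: float = 0.4, i: float = 0.3, p: float = 0.3) -> float:
        """ZIP model: a mix of constant-impedance, constant-current, and
        constant-power components (coefficients here are illustrative only)."""
        assert abs(z + i + p - 1.0) < 1e-9
        return p_nominal_mw * (z * v_pu ** 2 + i * v_pu + p)

    # At 5 percent low voltage the two models already diverge noticeably:
    print(constant_impedance_load(0.95, 1000.0))  # 902.5 MW
    print(zip_load(0.95, 1000.0))                 # 946.0 MW

Even this static refinement changes the answer; capturing the dynamic components mentioned above would require time-dependent models on top of it.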

The third area of IntelliGrid is wide-area operation, monitoring, and control.

Historically, the power system was operated on the basis of control areas. These were 138 individual geographic territories, with certain power lines coming in and out. The operator sits in the middle,