Whether it’s an aging workforce, the impact of competitive markets, or an outdated transmission system, today’s energy and utility organizations are facing a whole new set of challenges. What many people in the industry don’t realize is that the utility sector is not the first to face these kinds of issues.
The U.S. military is dealing with, or has dealt with, a strikingly similar set of problems in recent years. During the 1990s, the military realized that the information technology it had developed during the 1960s, ’70s, and ’80s would not be sufficient to prepare for the distributed battles it would need to fight in the future. It was using monolithic legacy systems with dated technology, closed operating systems, and proprietary codes, making it difficult to share critical information among different branches of the military.
Although the defense sector still is working through some of these issues, the good news for utility organizations is that military advances offer many lessons. In some cases, direct applications for defense technologies can help solve key utility challenges. Some of those technologies already are being used in several forward-thinking utility organizations, and some are being adapted for potential future use in solving nagging utility-industry problems.
The transmission and distribution systems that deliver energy today are, in many cases, more than 50 years old. These systems and the business practices used to support them were designed to handle lower loads and fewer customers with less stringent economic, reliability, and power-quality constraints. Because of the differences in intended and actual operating conditions, utilities are operating much closer to critical design limits and have less margin for error than ever before.
In addition to this issue of outdated technology, increased complexity poses a challenge. Utilities operate much more intricate distributed systems that must be monitored, analyzed, reconfigured, and repaired reliably, cost-effectively, and safely. This requires massive amounts of real-time operating data; consistent, accurate, and current financial data; and accessible historical data. It also requires very large and often elaborate applications. The systems required to collect, analyze, store, retrieve, and share this data and information are complex, costly, and extremely important. Operators must be able to extract useful information from megabytes of data from real-time two-way communications with distributed assets, resources, and customers.
Additionally, efficiency, cost cutting, and revenue enhancement continue to be top operations priorities as utilities look for ways to meet investors’ expectations. In this changing environment, the development of adaptive organizations with flexible business processes, systems, and technologies is becoming a necessity. Systems and processes must be enhanced to respond to evolving business conditions and regulatory positions without costly downtime or large investment.
Today, closed systems with protocols and code written by vendors in proprietary languages hamper the ability of organizations to collect, share, and analyze data and create useful information. Beyond that, there are countless challenges from both an operational perspective and a technology perspective that must be overcome for utility organizations to reach the level of flexibility necessary to run efficiently and effectively in an increasingly dynamic operating environment.
However, most industry experts believe these deficiencies can be overcome through technology. In fact, a recent report submitted to Congress by the Energy Department and Federal Energy Regulatory Commission (FERC) contends that “technology currently exists that could be used to establish a real-time transmission monitoring system to improve the reliability of the nation’s bulk power system; and emerging technologies hold the promise of greatly enhancing transmission-system integrity and operator situational awareness, thereby reducing the possibility of regional and inter-regional blackouts.”
A recent white paper on advanced transmission technologies prepared by experts from the Pacific Northwest National Laboratory and the University of Illinois states, “there is a massive backlog of prototype technology that can, given the means and incentives, be adapted to power system applications.” That’s where lessons and technologies from the defense industry apply. Because we can draw upon proven technologies and common experiences, we already have the tools and knowledge to architect, develop, and deploy technology to support complex utility business and operating processes.
The glaring lack of tools available to better manage the transmission grid continues to dog the industry. FERC has pointed to the need to develop new tools that help control-room operators identify, respond to, and track transmission system errors and events. Unfortunately, that is easier said than done. Whether within a regional transmission organization, an independent system operator, or at different utilities, operators are using disparate legacy systems to deal with massive amounts of information in multiple formats.
The military has dealt with similar challenges when attempting to share information across multiple systems to track the resources of each different military branch. For years, each military branch had its own technology systems that operated independently of the other services. If the Army was attempting to destroy a target in a given area, it would have no ability to access or even see the resources of other services in close proximity to that target in real-time.
For instance, an Air Force fighter jet might be within a mile of its target, but because the Army couldn’t access Air Force systems and see that the jet was available, it might re-route an Army plane from 30 miles away to execute the mission, wasting valuable time and resources. Army soldiers on the ground could provide valuable information to Air Force pilots (and vice versa) if an electronic link between the two was available. The Army refers to this as shared situational awareness (SSA).
The military has made tremendous strides in breaking down those information walls and creating technology vehicles to promote SSA among the different military branches. Secretary of Defense Donald Rumsfeld has said, “Possibly the most transforming thing in our force will not be a weapons system, but a set of interconnections.”
The military is using a combination of interoperability and intelligent agent technologies to create those interconnections. Those same technologies are being applied to develop new tools to help energy and utility control-room operators track and respond to transmission outages, generation emergencies, and other significant events.
PJM Interconnection, the organization responsible for overseeing the nation’s largest regional electricity market, is using an advanced logging system to increase the productivity of operators and back-office personnel by providing faster access to information and eliminating the need to manually input large amounts of data.
This “smart logging,” or SmartLog, application uses proven military technologies, including intelligent agents, to scan operational databases and recognize critical events and abnormal conditions, alerting operators to potential problems. The SmartLog system automatically updates operator logs with critical data and events; the operator then needs only to add his actions and the result to the log record, reducing the time required to record the event. Similar to its military counterparts, the SmartLog application ties into the existing set of operations tools used by PJM’s power-system operators.
The SmartLog application was designed specifically to improve the processes for operators in the control room, supervisors who manage the control room, and analysts who later need to assess and communicate the actions taken by operators. It is a prime example of how military technologies are being applied successfully to solve issues facing utilities.
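The agent-driven logging pattern described above can be sketched in a few lines. This is an illustrative mock-up, not PJM’s actual SmartLog code: the rule, the record fields, and the class names are all hypothetical.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Callable, Dict, List

@dataclass
class LogEntry:
    timestamp: datetime
    event: str
    operator_action: str = ""   # filled in later by the human operator

class LoggingAgent:
    """Scans operational records against alert rules and auto-drafts log entries."""
    def __init__(self, rules: List[Callable[[Dict], str]]):
        self.rules = rules
        self.log: List[LogEntry] = []

    def scan(self, records: List[Dict]) -> None:
        for record in records:
            for rule in self.rules:
                message = rule(record)
                if message:  # rule fired: draft the log entry automatically
                    self.log.append(LogEntry(datetime.now(), message))

# Hypothetical rule: flag a transmission line loaded beyond 90% of its rating.
def overload_rule(record: Dict) -> str:
    if record.get("load_mw", 0) > 0.9 * record.get("rating_mw", float("inf")):
        return f"Overload alert on {record['line']}: {record['load_mw']} MW"
    return ""

agent = LoggingAgent([overload_rule])
agent.scan([
    {"line": "L-101", "load_mw": 950, "rating_mw": 1000},
    {"line": "L-202", "load_mw": 400, "rating_mw": 1000},
])
# Only L-101 triggers an auto-drafted entry; the operator adds only the action taken.
for entry in agent.log:
    entry.operator_action = "Reduced dispatch on adjacent generation"
```

The division of labor mirrors the article’s description: the agent drafts the entry from the data it already sees, and the operator contributes only the part the system cannot know, the action and its result.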
Mission rehearsal and training is another key area where the military has made great strides in efficiency and effectiveness by applying new technologies. The challenge revolves around the dissemination and communication of large volumes of information—in this case, real data stored in command-and-control (C2) systems. To make mission rehearsal as realistic and relevant as possible, real C2 data is used to create training scenarios.
Using this real data creates real challenges. Effective mission rehearsal begins with the definition of the scenario, followed by the data harvesting that will initialize both the C2 and modeling and simulation (M&S) systems. The information must then be tailored for a specific training scenario that will be executed and evaluated in an after-action review.
Information-sharing is the first challenge. There are numerous C2 and M&S systems, each with its own data schema. By defining common information protocols, the defense sector has improved the responsiveness of the systems that host mission rehearsals while allowing more of the real-world C2 systems to participate within their operational settings. These are the types of technologies referred to by FERC and the Energy Department when they ask for tools to increase situational awareness and improve operator-training capabilities. Technologies exist that can capture and share real-time data among multiple organizations and agencies to increase joint situational awareness and store that data for use in more realistic and broader regional training exercises.
An information hierarchy that creates realistic battle training scenarios remains to be developed. To help solve this complicated issue, the military has developed and employed a set of complex rules engines that extract and organize C2 data for use in training scenarios based on a set of predefined criteria. Those rules engines, which have helped reduce the time it takes to set up training exercises and analyze training results, are being applied at one ISO to help simulate the impact of tariff changes and bad data on the billing settlement process.
The New York ISO is using a proprietary billing simulator driven by a complex rules engine. The simulator provides a tool to analyze market-rule changes and problems affecting a customer or segment of customers, and helps to expedite the billing-settlement process. Additionally, the simulator enables the ISO to understand the market impact of customers having financial difficulties.
In mid-2003, the ISO faced several types of billing issues: changes in the financial status of a market participant; data accuracy; rules and rule changes; and software. In each of these cases, the ISO needed to respond rapidly to determine the potential financial impacts. Those market conditions were the catalyst for the billing-simulator project.
The billing-simulator architecture includes 146 settlement use cases, or rules, and it runs up to five times faster than the ISO’s current billing system while maintaining 100 percent accuracy. The ISO is in the process of modifying its production billing systems to use the simulator rules engine and create the foundation for the organization’s next-generation billing and settlement system.
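A rules-engine-driven settlement simulator can be sketched roughly as follows. This is not the NYISO’s 146-rule engine; the two toy rules, the rates, and the function names are invented for illustration, but the structure shows why simulating a tariff change reduces to swapping or re-parameterizing a rule rather than rewriting the billing system.

```python
# Each settlement "use case" is an independent rule that inspects a market
# participant's data and returns a line-item charge. The engine just applies
# every rule and totals the results.

def energy_charge(data, price_per_mwh=50.0):
    return data["consumed_mwh"] * price_per_mwh

def transmission_charge(data, rate=2.5):
    return data["consumed_mwh"] * rate

def settle(participant_data, rules):
    """Apply every rule; return line items and the settlement total."""
    items = {rule.__name__: rule(participant_data) for rule in rules}
    return items, sum(items.values())

baseline_rules = [energy_charge, transmission_charge]
data = {"consumed_mwh": 100.0}

items, total = settle(data, baseline_rules)   # current tariff: 5000 + 250

# Simulate a proposed tariff: a higher energy price, everything else unchanged.
def proposed_energy_charge(d):
    return energy_charge(d, price_per_mwh=55.0)

_, proposed_total = settle(data, [proposed_energy_charge, transmission_charge])
impact = proposed_total - total   # financial impact of the rule change
```

Running the same participant data through both rule sets and differencing the totals is exactly the kind of “what is the financial impact of this change” question the article says the simulator was built to answer quickly.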
Many recent energy-technology efforts have been aimed at conservation. Energy consumption is at an all-time high, and concerns over dependence on oil and coal, and availability of resources in general, make it clear that consumption habits must be altered to have a lasting impact on the energy market.
This situation, along with a focus on cost savings, is pushing utilities toward advanced metering infrastructure (AMI), but these systems pull information from millions of different sources that must be funneled into a single data repository.
The military has a similar challenge when it comes to mission operations involving joint forces (Army, Navy, Air Force, Marines) or coalition forces (mission partners such as Britain in Iraq). Each has its own systems, so interoperability would allow commanding officers to react much more quickly to enemy operations, and carry out missions more efficiently at a lower level.
Time Critical Targeting (TCT) is a military operation that involves quick deployment of firepower for a time-sensitive or just-discovered target. Determination of the best solution to eliminate a target has been hampered historically by a lack of system interoperability. Cross Service Weapon Target Pairing (XSWTP) was developed to address this ongoing problem.
This service uses an infrastructure building block known as a Multi-Channel Service Oriented Architecture (MCSOA) to enable systems to share information via the Web. From a practical standpoint, this service allows a military commander to quickly survey the attack solutions, choose one, and deploy it. XSWTP enabled interoperation among three systems that were not designed to communicate with each other. MCSOA supports the quick Web enablement of legacy systems and enables dynamic collaboration. These two attributes add up to shared situational awareness (SSA).
MCSOA can help solve similar issues related to automated metering. It’s part of a solution being implemented in a large-scale automated meter reading project in Ontario that also includes a service-oriented architecture (SOA) to connect a meter data repository responsible for the collection, verification, and validation of raw data with the utility’s billing and operating system. The SOA ensures accurate, timely, and secure transmission of data, essentially taking meter data from millions of homes in Ontario and returning it to the utility in the form of billing data.
Ontario has made its smart-meter initiative a top priority, committing more than $1 billion to it. Through automated two-way communication, consumers will be able to review their household energy consumption, allowing them to decrease consumption during more expensive operating hours and ultimately conserve electricity more effectively. The key to that two-way communication is the ability not only to share data, but for the utility to be able to extract meaning and value from it. This SOA capability, born of an existing military application, provides the vehicle for sharing that data in a meaningful way from multiple systems and sources.
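The meter-to-billing pipeline described above, collecting raw reads, validating them, and returning billing data to the utility, might be sketched like this. The field names, the validation check, and the rate are hypothetical placeholders, not Ontario’s actual implementation.

```python
from collections import defaultdict

def validate(read):
    """Reject implausible reads (a stand-in for real validation/verification checks)."""
    return bool(read["meter_id"]) and read["kwh"] >= 0

def to_billing_data(raw_reads, price_per_kwh=0.10):
    """Aggregate validated reads per meter and convert usage into charges."""
    usage = defaultdict(float)
    for read in raw_reads:
        if validate(read):
            usage[read["meter_id"]] += read["kwh"]
    return {meter: round(kwh * price_per_kwh, 2) for meter, kwh in usage.items()}

reads = [
    {"meter_id": "M-1", "kwh": 12.0},
    {"meter_id": "M-1", "kwh": 8.0},
    {"meter_id": "M-2", "kwh": -5.0},   # failed read, filtered out by validation
]
bills = to_billing_data(reads)
```

In the real architecture these steps would be exposed as services, with millions of meters feeding the repository; the point of the sketch is the shape of the contract, raw reads in, validated billing data out.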
The ability to share, analyze, and use large amounts of data is a common theme among these utility and military applications. It seems like a simple concept, but the military has spent billions of dollars to make that concept a reality. Although the defense industry has not cracked the code on this issue, it certainly has made great strides.
That’s good news for the utility sector, which does not have to reinvent the wheel. It also means that utilities can leverage the investments made by the defense sector. Technologies already exist to solve some of the industry’s most pressing problems. It’s up to utilities to take advantage of that knowledge.