New tools for enhancing utility preparedness and response.
Michael Beck is the founder and managing director of MJ Beck Consulting, LLC. Seth Guikema is an assistant professor in the Department of Geography and Environmental Engineering at Johns Hopkins University, with joint appointments in civil engineering and Earth and planetary sciences. Ken Buckstaff is the managing director of First Quartile Consulting, where he leads the utility benchmarking and consulting practice. Steven Quiring is an associate professor in the Department of Geography at Texas A&M University, focusing on hydroclimatology, drought monitoring and prediction, and modeling the impact of hurricanes on infrastructure systems.
Extreme weather events aren’t new. Utilities, first responders, and government agencies have extensive experience in dealing with weather-related disasters. Usually, the response is effective and, in many cases, might be characterized as heroic. But for a variety of reasons, including increasing population density (i.e., affected customers), security concerns, ballooning damage assessments (e.g., Hurricane Katrina’s impact was estimated at $146 billion¹), extended outages, and an economy and society ever more dependent on energy infrastructure, utilities find themselves under heightened scrutiny with regard to their activities related to storm preparation and response. For example, along with post-Sandy cleanup, executives and managers at many utilities have been engaged with post-Sandy investigations, hearings, regulatory proceedings, and media response. While some of this scrutiny, unfortunately, is a search for the guilty, not all of it is so opportunistic. There are lessons to be learned, practices to be changed, modified, or adopted, and new tools to be employed.
Highly sophisticated storm outage modeling could be one of those tools.
Storm outage modeling – which might include estimating the number of customers out, protective devices activated, damages incurred, or restoration times – can be accomplished in a variety of ways, from the simple to the complex. The simple approach is based largely or solely on past experience and works well in many cases, but not all. And recent history has provided more than a few examples where there was no past experience to draw upon.
Research hasn’t shown conclusively whether storm frequency and intensity are increasing as a trend. But storm effects are leading many entities – utilities and agencies alike – to consider more sophisticated approaches to outage modeling. These approaches are based on significant advances in climatological research, enhanced GIS capabilities, specialized software, and low-cost computing power.
Sophisticated modeling doesn’t prevent storm damage – not in the short term anyway – but it can be an integral part of storm preparation and response. And with the preparation and response bar being continually raised, enhanced modeling approaches deserve a careful look.
Storm Outages Rising
Despite this year’s late-starting hurricane season, hurricane-related activity and intensity in the last few decades suggest a trend when compared to long-term historical data. The evidence, however, isn’t clear. Much of the dispute centers on the time period selected for study, anomalies in the data, and whether activity – for hurricanes or other storm types – is measured by the total number of events or only by particular categories of events (e.g., hurricane categories 1 through 5). But regardless of whether there are more storms today than in the past, storms and climate-related events in recent years have been exceedingly damaging. In fact, as shown in Figure 1, 2012 produced 11 events in the U.S., each of which caused at least $1 billion in damages.
While the debate continues as to whether there’s an increasing trend in the annual number of storms, a review of long-term data indicates that there has been a demonstrable clustering of high-dollar (i.e., greater than $1 billion) storms in the last decade or so. As shown in Figure 2, since 2000, there have been 22 Atlantic hurricanes that caused damages estimated at over $1 billion, compared with 23 Atlantic hurricanes over the 35 years from 1965 to 2000. In other words, storms are becoming more damaging in an economic sense, likely because of the increasing density of assets in many areas, and the increased value of the damaged assets.
The same trend is evident in electric utility reliability statistics. A survey of North American utilities shows that over a 10-year span (see Figure 3) the percentage difference between the frequency of outages (i.e., SAIFI data) when major events are included and when they’re excluded from measurements has narrowed slightly. That is to say, utilities seem to be getting better at reducing the frequency of interruptions caused by major events. On the other hand, the difference in total outage time (i.e., SAIDI) when major events are included has been slightly increasing. In other words, the contribution to total outage time by major events has been growing, despite the efforts by utilities to respond more effectively. In addition, it’s been reported that severe weather accounts for 58 percent of outages observed since 2002 and 87 percent of outages affecting 50,000 or more customers.² Not surprising information, maybe, but sobering.
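The with- and without-major-event comparison rests on the standard SAIFI and SAIDI reliability indices. As a minimal sketch – with made-up event figures, and leaving aside the formal IEEE 1366 major-event-day test – the two views of the indices can be computed like this:

```python
# Illustrative SAIFI/SAIDI calculation with and without major events.
# All numbers below are invented for demonstration; real reporting follows
# IEEE 1366 conventions for identifying major event days.

customers_served = 100_000

# (customers_interrupted, total_customer_minutes, is_major_event)
outage_events = [
    (5_000,     600_000, False),  # routine storm
    (2_000,     150_000, False),  # equipment failure
    (60_000,  9_000_000, True),   # hurricane (major event)
]

def saifi_saidi(events, n_customers, include_major=True):
    """SAIFI = interruptions per customer served; SAIDI = outage minutes per customer served."""
    ci = sum(e[0] for e in events if include_major or not e[2])
    cmi = sum(e[1] for e in events if include_major or not e[2])
    return ci / n_customers, cmi / n_customers

saifi_all, saidi_all = saifi_saidi(outage_events, customers_served)
saifi_ex, saidi_ex = saifi_saidi(outage_events, customers_served, include_major=False)
print(f"SAIFI incl. major events: {saifi_all:.3f}, excl.: {saifi_ex:.3f}")
print(f"SAIDI incl. major events: {saidi_all:.1f} min, excl.: {saidi_ex:.1f} min")
```

In this toy example the single major event dominates SAIDI (97.5 minutes per customer with it, 7.5 without), which mirrors the pattern in the survey data: major events drive total outage time far more than outage frequency.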
Given the significant effects of storm events, particularly on the outage time experienced by customers, it’s evident that utilities need to find ways to enhance their ability to respond quickly and effectively.
Modeling Storm Outages
Utilities have responded in various ways to the need to prepare for increasingly damaging and potentially more frequent storms. Some, particularly in areas that have experienced repeated hurricane impacts, such as the Gulf Coast and the southeast Atlantic Coast, have developed in-house outage and damage forecasting models. These vary from simple correlation-based or spreadsheet models to more advanced predictive statistical models, and they can be run several days ahead to estimate how many outages, how much damage, or what duration can be expected from a forecast weather event. The models for hurricanes (as opposed to, say, thunderstorms or snow and ice storms) are the most advanced, and a range of predictive models have been developed and implemented by a university research collaboration led by two of the authors (Guikema and Quiring), as well as at several major utilities.
An example of what a storm outage model looks like, how it works, and the output it produces, is demonstrated in the work of Guikema and Quiring and their research groups. The models were developed over about eight years in collaboration with a major utility that has experienced repeated hurricane impacts. The models use outage and damage data from past storms at a relatively high spatial resolution to train and validate a predictive model that can then be applied for future events. This is combined with wind forecasts, information about the power system, soil moisture levels prior to the storm, land use and land cover, topographic information, and utility tree trimming records. Early generations of these models (Han et al. 2009a) utilized relatively simple generalized linear regression models, but more recent models have used more flexible non-linear statistical methods and ensemble data mining techniques (Guikema and Quiring, 2012; Nateghi et al. 2013) to achieve more accurate results.
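As a rough illustration of this style of model – not the authors’ actual implementation – the following sketch trains an ensemble regressor (a random forest, via scikit-learn) to predict outage counts per grid cell. The features and data are synthetic stand-ins for the kinds of inputs described above (wind forecasts, soil moisture, system exposure, tree trimming):

```python
# Sketch of training a pre-storm outage model on historical grid-cell data,
# assuming one row per (grid cell, storm) observation. Feature names and
# data are illustrative inventions, not the authors' inputs.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 500  # grid cell-storm observations

# Hypothetical covariates: forecast gust speed, antecedent soil moisture,
# fraction of circuits recently tree-trimmed, and overhead line miles.
X = np.column_stack([
    rng.uniform(20, 90, n),    # max gust (mph)
    rng.uniform(0.1, 0.5, n),  # soil moisture fraction
    rng.uniform(0.0, 1.0, n),  # trimmed fraction
    rng.uniform(1, 40, n),     # line miles per cell
])
# Synthetic outage counts that rise nonlinearly with wind and exposure.
y = np.rint((X[:, 0] / 30) ** 2 * X[:, 3] * (1.2 - X[:, 2]) * (0.5 + X[:, 1]))

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
# Predict outages in one cell for a forecast 70-mph-gust event.
print(model.predict([[70.0, 0.35, 0.4, 25.0]]))
```

The nonlinear target is the point: a flexible ensemble method can capture interactions (wind × exposure × vegetation) that a simple linear regression would miss, which is why the later generations of these models moved beyond generalized linear models.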
A critical distinction with tremendous practical effect in such models is the difference between model fit (to past data) and model predictive accuracy. Nearly any data set can be fit arbitrarily well with a sufficiently complex statistical model, but such a model wouldn’t yield accurate predictions; it would be over-fit to the data. Model validation, in the sense of testing how well the model predicts data that aren’t used in training the model, is critical for selecting a model with good predictive accuracy. The current standard for assessing model predictive accuracy is to use out-of-sample holdout testing, and this is the approach used in the authors’ work. A portion of the data is withheld from the data set; the model is trained on the remaining data; the trained model is used to predict the outages for the withheld portion; and the error is measured. This process is repeated for different holdout samples to test the conditions in which the model offers good predictive accuracy. The approach can withhold entire storms, portions of a service area, or randomly selected grid cell-storm combinations. The authors’ analysis uses all three approaches.
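The withhold-entire-storms variant of this testing can be sketched with scikit-learn’s GroupKFold, which guarantees that no rows from a held-out storm leak into training. The storm IDs, covariates, and outage counts below are synthetic, not the authors’ data:

```python
# Sketch of out-of-sample holdout testing that withholds entire storms,
# rather than random rows. All data here are synthetic placeholders.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import GroupKFold

rng = np.random.default_rng(1)
n = 600
X = rng.uniform(0, 1, (n, 3))             # stand-in covariates per grid cell
storm_id = rng.integers(0, 6, n)          # which of 6 storms each row belongs to
y = np.rint(50 * X[:, 0] * X[:, 1] + 5 * storm_id)  # synthetic outage counts

errors = []
for train_idx, test_idx in GroupKFold(n_splits=6).split(X, y, groups=storm_id):
    m = RandomForestRegressor(n_estimators=100, random_state=0)
    m.fit(X[train_idx], y[train_idx])     # train with one storm fully withheld
    errors.append(mean_absolute_error(y[test_idx], m.predict(X[test_idx])))

print(f"mean absolute error across held-out storms: {np.mean(errors):.1f}")
```

Withholding whole storms is the more honest test for this application, because the operational question is always “how well will the model do on a storm it has never seen?” – random row-level holdouts tend to understate that error.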
Once a model has been fully validated, it can then be used operationally. This generally involves running the model every six hours to update the hurricane track and intensity forecast starting approximately five days before landfall of the storm. The output from each model run consists of a map of predicted outage intensity and an estimate of the total number of outages, either for the full service area or for sub-areas of interest. These estimates then can be used in the utility decision-making process in the days before the storm, particularly in helping inform requests for external crews and materials.
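The six-hour operational cycle reduces to re-running a fitted model on each new forecast advisory and aggregating per-cell predictions into a service-area total. In the sketch below, `predict_cell_outages` is a hypothetical stand-in for a deployed model, and the gust values are invented:

```python
# Minimal sketch of the operational update cycle: each forecast advisory
# (issued every six hours) is pushed through a fitted model and summed
# into a service-area outage estimate. The model is a toy placeholder.

def predict_cell_outages(gust_mph: float) -> float:
    """Hypothetical per-cell model: outages grow quadratically with forecast gusts."""
    return max(0.0, 0.01 * gust_mph ** 2 - 9.0)

# Forecast gusts (mph) per grid cell for two successive advisories.
advisories = {
    "T-48h": [35.0, 50.0, 65.0],
    "T-42h": [40.0, 55.0, 70.0],  # track shifted; gusts revised upward
}

for issued, gusts in advisories.items():
    total = sum(predict_cell_outages(g) for g in gusts)
    print(f"{issued}: projected service-area outages ~{total:.0f}")
```

Each advisory shifts the projected total, which is exactly the quantity decision makers track in the days before landfall when sizing requests for external crews and materials.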
As an example of the output of a prediction model, Figure 4 shows both the actual outages at the grid cell level (i.e., in this case the “grid cell” is 12,000 by 8,000 feet) and the predicted number of outages with the predictions based on the best estimate of the actual hurricane track. These predictions thus eliminate the uncertainty in the track and intensity forecast, highlighting how well the model can do when given an accurate track and intensity forecast. The area shown is approximately one quarter of a sizeable state and contains approximately 2,000 grid cells.
Figure 5 illustrates an outage prediction made in advance of landfall of Hurricane Sandy. In this case, the predictions are from a simplified version of an outage forecasting model, a version that’s based only on publicly available data and therefore can be applied along the entire coastline. The map shows an area from just north of New York City south through New Jersey. The forecasts were made on Oct. 28, 2012, while Sandy was still off the coast of the Carolinas, and the estimates for this model were developed at the level of individual census tracts. Although Sandy ended up taking a somewhat more northerly track than was forecast when these estimates were made, the total number of people without power forecast at this point in time (approximately 10 million) was reasonably well aligned with the realized outages.
Ongoing research seeks to improve pre-storm outage forecasting models. Recent efforts have focused on better incorporating uncertainty in the weather forecasts into the models, to both improve predictive accuracy for predictions made four to five days before a storm, and to better capture and represent the uncertainty in the forecasts. Also, while this example focused on modeling the impacts of hurricanes, a similar approach could be used to develop models for other weather events such as thunderstorms, wind storms, and snow and ice storms. The major difficulty in modeling other types of weather events is in accurately forecasting the weather event with enough lead time to make an outage forecast useful to utilities. Progress is being made on outage forecasting for thunderstorms, and some preliminary work has been done on outage forecasting for ice and snow events.
Putting Models to Work
Storm modeling capabilities offer benefits for many types of users and stakeholders. First among these are utilities. There are potentially significant benefits to be derived for some utilities from enhanced pre-storm planning, during-event execution, and after-event assessment and reporting. In addition, long-term planning can be facilitated through scenario testing – developing and examining “what-if” scenarios. A storm model was used to address a recent request by the governor in a southern state who asked, “What would be the impact of a Katrina-sized storm rolling through the two largest cities in our state?” Another use of scenario planning would be to test different response approaches to potential storms. Using the same predicted storm damage, a model could be used to project response times under different scenarios of mutual aid levels, specific allocation strategies for resources, and levels of automation within the electric system. That might support investment decisions for the electric system as well as enhanced plans for resource planning and allocation in future storms.
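A back-of-envelope version of such a resource scenario test might look like the following; the job counts, crew counts, and repair rates are purely illustrative assumptions, not figures from any utility:

```python
# Scenario-testing sketch: hold the predicted damage fixed and compare
# restoration time under different mutual aid levels. All numbers are
# illustrative assumptions.

predicted_damage_jobs = 12_000      # repair jobs implied by the damage model
jobs_per_crew_per_day = 4.0         # assumed average crew productivity
in_house_crews = 300

for mutual_aid_crews in (0, 500, 1000):
    crews = in_house_crews + mutual_aid_crews
    days = predicted_damage_jobs / (crews * jobs_per_crew_per_day)
    print(f"{crews} crews -> ~{days:.1f} days to restore")
```

A real analysis would layer in allocation strategy and system automation, but even this crude arithmetic shows how a fixed damage forecast lets different response postures be compared on a common footing.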
Model development can also help utilities formalize the knowledge developed over the years by their storm response teams. That, in turn, would enable them to be better prepared when long-time storm management personnel retire or leave, and would either confirm or challenge long-held beliefs about likely event effects. The result is to institutionalize knowledge of storm events, leaving the utility less dependent on the expertise of a very small number of individuals.
From the customer’s perspective, the two most important benefits are the potential for faster restoration times, and improvement in the utility’s ability to provide estimates of those restoration times. Storm modeling can enhance the utility’s ability to forecast outages, affected equipment, and damage to the electric system. The knowledge gained from the better predictions enables actions that, in turn, produce the desired benefits.
Mutual aid organizations might also benefit from a shared storm model. A region-based storm model could be utilized to share information on expected damage and customer outages, and collectively recognize where the greatest resource needs are likely to be during a major event, thereby optimizing the dispatch of resources across the entire storm area, with the resulting benefit to the greatest number of customers in a region. In the same way, emergency management agencies (EMAs) might also benefit from a storm model. Electric utilities clearly are focused on how a major event affects the electric system, while EMAs have a broader set of concerns, including multiple utility types (e.g., electric, gas, water, telecommunications, etc.), as well as the effects on transportation, food and water supplies, and traffic management. A model to address multiple elements of infrastructure could help the EMAs be better prepared and more responsive in a major event, including helping to understand the interactions of the effects on different infrastructure elements.
Finally, in the aftermath of a major storm, a storm model could be used in analysis of a utility’s performance, to help improve future performance, and to support what have become customary investigations by regulators into the performance of utilities. Just as a regulator could use a model as a standard for comparison, the utility can use the model to demonstrate its successes in the response effort. It also can highlight unique circumstances the utility hasn’t previously encountered (e.g., the flooding that occurred along the Northeast coast during Sandy) and demonstrate their effect on the utility’s ability to restore service.
Utilities and other stakeholders face increasing pressure to advance and enhance their storm preparation and response capabilities. These capabilities extend across a wide range of activities and include functions such as logistics, communications, infrastructure hardening, organization, and planning. Sophisticated storm modeling can enhance many of these areas and, as research advances, there are clear benefits to adopting new tools that will augment decision making during critical climate events.
1. National Oceanic and Atmospheric Administration (NOAA), in 2012 dollars, adjusted for inflation using the CPI; http://www.ncdc.noaa.gov/billions/events