An electric generation turbine spins wildly out of control and ceases production in a smoking mess, all because of a computer hacker. Fact or fiction?
A video was leaked to the press in late September 2007 showing exactly that scene. The video was produced for the U.S. Department of Homeland Security (DHS) by the Idaho National Laboratory and shows what happened when a generator was remotely taken over by computer hackers.
But does the video disclose a serious threat to the U.S. electric grid, or is it something more banal?
The simulated attack, named the Aurora Generator Test, was conducted in March 2007 by researchers investigating supervisory control and data acquisition (SCADA) system vulnerabilities at utility companies. The experiment involved hackers invading the plant’s control system to change the operating cycle of the generator.
DHS officials then quietly worked with the industry to fix the undisclosed vulnerability. Some security experts argue the test proves what can go wrong on a larger scale if simultaneous attacks are launched on power plants, shutting down larger portions of the grid. Other experts suggest that the test, while appearing dramatic on video, doesn’t really mean much.
“Sometimes specific tests like this focus attention on one narrow little thing, because it’s easy [to fix], instead of focusing attention and funds on the general problem,” says Erich Gunther, CTO and principal consultant at EnerNex Corp. “It isn’t as sexy as watching something blow up, but I’d prefer that we pay attention to having an overall security policy with layers of defense.”
Utilities use two primary types of control systems — distributed control systems typically within a single generating plant, and SCADA systems for large, geographically dispersed operations. SCADA systems are perceived as having vulnerabilities that make them susceptible to cyber attacks, especially because of increased connectivity of control systems to other systems and the Internet.
As the Aurora test demonstrated, SCADA systems are vulnerable because they were developed in a time when security was not as important as it is today. “SCADA was not designed to deal with the modern-day security issues we come across,” says Cheryl Traverse, president and CEO of Xceedium, a control-systems security vendor. “It’s a very insecure protocol.”
One reason for the potential vulnerability is that most SCADA devices weren’t designed for easy upgrades if software vulnerabilities are found after the equipment has been crated and shipped. “That’s going to have to change in the industry, especially as we find ways to make them more accessible to more people within an organization,” Gunther says.
In the meantime, improving the security of installed SCADA systems can require significant hands-on attention.
“SCADA systems classically are deployed over very long lifetimes,” says Darren Highfill, utility communications security architect at EnerNex. “You have to be careful how you deploy patches to make sure you don’t inadvertently break one thing when you are trying to fix another.”
Part of the problem is that many utilities’ SCADA equipment uses proprietary operating systems and proprietary protocols for gaining access and communicating among various parts of the infrastructure. Other systems use standard protocols and authentication routines that are inherently weak. “In a lot of cases it would be clear-text protocol like Telnet, which is a very insecure protocol a lot of these older machines speak,” Traverse says. “And maybe everyone can sign onto a piece of SCADA equipment with one user password, which is ‘super.’” Additionally, SCADA equipment generally allows serial access, which means a user can plug a device with serial connectivity into it and access all the ports that are connected. “There is no way to compartmentalize the user that is accessing it and there is no way of tracking that user,” Traverse says.
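To see why a clear-text protocol such as Telnet is so dangerous, consider this minimal Python sketch (an illustration, not a real SCADA session): credentials travel as plain bytes on the wire, so an eavesdropper needs no cryptanalysis at all, just string parsing. The function names and the sample credentials are hypothetical.

```python
# Illustrative only: why clear-text login protocols like Telnet are
# risky. The bytes on the wire contain the password verbatim.

def telnet_style_login(user: str, password: str) -> bytes:
    """Simulate the bytes a clear-text login would place on the wire."""
    return f"login: {user}\r\npassword: {password}\r\n".encode("ascii")

def sniff_password(captured: bytes) -> str:
    """An eavesdropper recovers the password with simple string parsing."""
    for line in captured.decode("ascii").splitlines():
        if line.startswith("password: "):
            return line[len("password: "):]
    return ""

wire_bytes = telnet_style_login("operator", "super")
print(sniff_password(wire_bytes))  # → super
```

An encrypted transport (such as SSH in place of Telnet) removes exactly this exposure, because the captured bytes no longer contain the credential in readable form.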
Despite these challenges, however, SCADA weaknesses can and must be addressed to prevent hacker attacks on critical utility infrastructure, like the one shown by the Aurora exercise. Fortunately, the security community has developed ways to prevent hackers from getting into utility systems. “We never want to say we give you an absolute foolproof solution,” Traverse warns. “But we give you double encryption — superior protocols, the ability to overcome weak authentication problems, the ability to separate and compartmentalize your users, and the ability to implement a security model and be alerted if there is a problem.”
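The compartmentalization and tracking Traverse describes can be sketched in a few lines of Python. This is not any vendor’s product, just a minimal illustration of the contrast with a single shared “super” account: each named user holds rights to specific ports, and every attempted action is recorded so it can be attributed to a person. The user and port names are invented for the example.

```python
# Minimal sketch of per-user compartmentalization plus an audit trail,
# in contrast to one shared account that everyone signs on with.
from datetime import datetime, timezone

PERMISSIONS = {            # which ports each named user may reach
    "alice": {"rtu-01", "rtu-02"},
    "bob":   {"hmi-01"},
}

audit_log = []             # who did what, when, and whether it was allowed

def access(user: str, port: str) -> bool:
    """Allow the action only if this user holds the right, and log it."""
    allowed = port in PERMISSIONS.get(user, set())
    stamp = datetime.now(timezone.utc).isoformat()
    audit_log.append((stamp, user, port, allowed))
    return allowed

print(access("alice", "rtu-01"))  # → True  (within her compartment)
print(access("bob", "rtu-01"))    # → False (denied, and still logged)
```

The point is that a denied attempt still leaves a timestamped record naming the user, which is precisely what a shared password makes impossible.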
A major problem, however, for any infrastructure protection involves what Traverse calls “the rising tide,” meaning the bad guys, the hackers who want to wreak havoc on utility systems and who always come up with new ways to invade systems. The Aurora test exposed one particular vulnerability, which companies are working to eliminate. “The question is, will we ever be ahead of the game?” Traverse says. “We constantly are trying to protect ourselves from the rising tide.”
So what did the industry learn from the Aurora hacker exercise? Maybe not much.
Utility companies certainly applied the patches prescribed by the DHS, but they were motivated to improve system security long before the Aurora exercise happened (see “CIP Goes Live”).
Furthermore, insiders suggest this particular vulnerability wasn’t among the most serious facing the industry. According to Gunther, in order for a hacker attack like the one tested in Aurora to succeed, the hacker would’ve had to know about the specific vulnerability or discover it by chance. The hacker would then have needed to gain physical access to the control system or access to the communications path. Finally, the test assumed none of the operator’s safety protocols worked and warnings went unnoticed.
“[Aurora] certainly was a viable scenario, but if you start really drilling down, the level of risk in general is pretty small,” Gunther says. And to the degree the test focused attention on one particular vulnerability, it could prove counterproductive by diverting attention from the broader need for layers of defense in utility systems.
“A test like Aurora can be fascinating to watch and see how it breaks down at the individual level,” Highfill says. “But at the end of the day you have to make sure you don’t get caught up in chasing the individual particulars of this scenario, and allow that to distort your perspective on your overall security posture.”
Gunther adds, “It’s a systems engineering problem that must be solved in the larger context of the application of the overall system. You’ve got to work at it, and a lot of people don’t want to hear that.”
The good news about Aurora, however, is that it raised awareness of security issues in general. It demonstrated clearly the importance of reviewing security systems, assessing the risks, understanding them and reacting appropriately. Such efforts represent an ongoing challenge for the industry.
“The minute you say nobody can figure out how to [hack a system], they do,” Traverse says. “Everybody needs to plan for the worst and protect themselves as best they can. We have to get ahead of the curve.”