Author Topic: How well do you know your cybersecurity hazards?


Posted by Riney (Support Administrator)
How well do you know your cybersecurity hazards?
« on: May 11, 2015, 03:01:52 PM »
How well do you know your cybersecurity hazards?

Intelligent Utility Magazine, Summer 2015
by Kathleen Wolf Davis
Matt Gibson is betting the answer to that question is "not very well," though certainly not for lack of trying.
Gibson, a long-time nuclear energy insider, currently helms an Electric Power Research Institute (EPRI) project that considers the use of hazards analysis to identify cybersecurity vulnerabilities.
Traditionally, cybersecurity (wherever it has been applied) has dealt solely in vulnerability analysis. But Gibson and his team (which includes EPRI, a few utility partners and Sandia National Laboratories) just don't think that does enough.
Enter hazards analysis, an idea that's been used for years in the health and safety field.
"In a nutshell, hazards analysis is the process of identifying the things that could happen, analyzing why and then figuring out risk," Gibson said. He added that we all do hazards analysis ourselves in our daily lives. We ask ourselves what's the worst that could happen and then try to figure out the likelihood of that happening, whether we're making a financial decision, a relationship choice or simply crossing the street.
And even within the power industry, the concept of hazards analysis, if not the separate field, has been an integral part of the engineering process since the engineering process was invented.
"I've been in the nuclear industry for over thirty years designing, building and maintaining computerized control and monitoring systems," Gibson said. "Hazards analysis is a natural part of creating those systems. We apply the techniques as part of our normal engineering process to create systems that are reliable."
In other words, a change in design or an addition of a feature may have an engineer running the basics of hazards analysis in her head, even if she labels them "what if" scenarios and not strictly "hazards analysis."
But the formal approach of hazards analysis takes that natural tendency and builds in two critical items: expert knowledge of the systems to be examined, and a formal method that lets the expert cover all the bases and not miss a potential problem (which can happen if you're just running "what ifs" in your head).
Gibson noted that, as systems become more and more complex and are hooked together in more layers and more ways, formal methods and approaches will be increasingly necessary to make sure every nook and cranny has been examined, especially in a critical area such as cybersecurity. 
"Hazards analysis hasn't been done a lot in cyber," Gibson said. "Cyber uses vulnerability. But when you talk an integrated hazards analysis, that's something that goes beyond vulnerability analysis. It encompasses equipment and could give you a better idea of the actual consequences of a cyber attack-especially when you're expanding and trying to make things holistic."
Within cybersecurity, Gibson and his team apply hazards analysis in slightly different ways than the norm. (The research itself centers on finding subtools of the standard methods to use in utility-specific ways.) For example, normal hazards analysis relies on probabilities and physics to determine mechanical or electrical failures. The rules of physics don't change.
Security, though, has a basic element that doesn't rely on physics at all and does change: namely, the will and whim of the hacker. So coming up with an exact failure mode becomes difficult, because the rules change with each new malicious actor. EPRI's research replaces the physics side of the equation (the likelihood) with new limitations, such as whether a particular hack is simple or extremely difficult to achieve.
Theoretically (and also psychologically), making something more difficult to achieve lowers the likelihood of it being accomplished. 
"It's not a perfect fit," Gibson admitted. "But it allows us to use the existing hazards tools."
But since formal hazards analysis is new both to utilities and to cybersecurity, where do we all start?
Gibson advised that first you need to understand your system and your capabilities. You begin by modeling your power plant. (Gibson's team is starting the analysis with nuclear plants but hopes that, if they can find the right formal process, it can be adapted to other generation and then to T&D in the future.)
"We understand how plants work. So, we can understand how a hazard might impact it-sometimes in minor ways and sometimes in serious ways. Hazards analysis is vulnerability analysis on steroids. It makes people truly understand the seriousness of a particular vulnerability by placing it in context," Gibson said.
You start with understanding how everything works and then add in how prior events impacted that system, which may surface system weaknesses you weren't aware of.
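In code terms, that first step might look like a simple model of the plant with a record of past events layered on top (every component and event named here is a hypothetical illustration):

```python
# Step one: model the plant. Step two: replay prior events against
# the model to surface weaknesses. All names are hypothetical.

plant_model = {
    "feedwater pump":  {"feeds": "steam generator", "backup": True},
    "steam generator": {"feeds": "turbine",         "backup": False},
    "turbine":         {"feeds": "grid",            "backup": False},
}

prior_events = [
    {"component": "steam generator", "event": "2013 level-sensor fault"},
    {"component": "turbine",         "event": "2011 vibration trip"},
]

# Any component with a history but no backup is a weakness worth a look.
for record in prior_events:
    component = record["component"]
    if not plant_model[component]["backup"]:
        print(f"weak spot: {component} ({record['event']}, no backup)")
```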
From there, Gibson's hoping that his team's work on adapting hazards analysis will come into play. They are working with the idea that the industry may be doing vulnerability analysis from the wrong angle: too far removed from how the plant's systems actually work, which creates problems for both security and budgets. For example, what if your vulnerability analysis identifies a certain component as highly vulnerable and you spend a lot of money securing it? That's the right thing to do, right? Well, yes, within vulnerability analysis. It's vulnerable. You make it less so.
But what if that component is actually of very low value in the overall system? What if it gets hit or hacked and, well, nothing much comes of it? Now you've spent lots of cash to protect something that really didn't need that protection. On the other hand, you may have a component that's less vulnerable but much, much more important to the system (it could cause a cascading problem or a lengthy outage) that you're spending less on, because your vulnerability analysis told you it was less vulnerable than the first component. That's the difference with hazards analysis: adding a system-based hierarchy to your vulnerability assessment.
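A toy example of that difference, with hypothetical components and scores, comparing what a pure vulnerability ranking and a consequence-weighted, hazards-style ranking would each have you protect first:

```python
# Two ways to decide where the security budget goes.
# Components and scores are hypothetical.

components = [
    # (name, vulnerability 1-5, consequence to the system 1-5)
    ("guest wifi portal",       5, 1),  # easy to hack, little impact
    ("turbine control network", 2, 5),  # harder to hack, cascading outage
    ("historian database",      3, 3),
]

# Vulnerability analysis alone: protect the most vulnerable first.
by_vulnerability = sorted(components, key=lambda c: c[1], reverse=True)

# Hazards-style view: weight vulnerability by system consequence.
by_hazard = sorted(components, key=lambda c: c[1] * c[2], reverse=True)

print("vuln-only priority:   ", [c[0] for c in by_vulnerability])
# -> guest wifi portal first, turbine control network last
print("hazards-style priority:", [c[0] for c in by_hazard])
# -> turbine control network first, where a hit actually hurts
```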
"It's a problem across many industries today," Gibson added. "Folks are either overdoing it or underdoing it. But the common denominator isn't people wanting to be insecure. No one's saying, `Oh, we don't care about our plant security.' Absolutely not. We just don't, collectively, have an understanding of how to evaluate a critical infrastructure facility very well."
That's what his team is trying to accomplish: blending plant modeling, known system vulnerabilities and the value of protection. And while Gibson concedes that there will always be a PR/perception issue when even something low-level is hacked (think of the recent extremist hack of U.S. Central Command's Twitter account, all over the national news), he and his team are concerned with less trivial matters: operations, reliability and safety, not PR.
His team is in the final stages of phase one, in which they are evaluating different hazards analysis methods and determining which ones lend themselves to cyber better than others. Phase two, starting soon, will blend those methods into a formal set that can be used across the industry; phase three will involve pilot projects.
While Gibson's team had hoped to find one or two hazards analysis methods that were ready for cyber without a lot of tinkering, that hasn't turned out to be the case. Still, Gibson is excited about the move into phase two (the blending phase) because he can see, at least around the edges as things congeal, how helpful this could be to the industry; perhaps much more helpful than the plug-and-play approach they originally hoped for. But first, he acknowledges, there will be a lot of work on complexities. Some methods work better at certain levels of the system and process than others; Gibson and his team call these "levels of decomp," for decomposition, the process of breaking a system down into layers (sketched below).
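A rough picture of what "levels of decomp" might mean in practice: the same facility examined at several depths, with different hazards visible at each. The levels and entries here are hypothetical:

```python
# "Levels of decomp": the same plant broken down at increasing depth.
# Different hazards analysis methods fit different levels, and an
# attacker can enter at any of them. All entries are hypothetical.

decomposition = {
    "plant":     ["loss of generation", "safety system bypass"],
    "system":    ["cooling loop control compromised"],
    "subsystem": ["pump controller firmware altered"],
    "component": ["PLC logic overwritten via engineering laptop"],
    "network":   ["unauthorized access to the control VLAN"],
}

# A complete cyber hazards analysis walks every level, since a
# hazard missed at one depth can be the entry point at another.
for level, level_hazards in decomposition.items():
    for hazard in level_hazards:
        print(f"[{level:10s}] {hazard}")
```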
"To be complete, cyber needs to look at a broad spectrum of levels. An attacker can come in at any level," he added. He also talked about how the hacker in pop culture is portrayed as a super brilliant super villain, which really isn't the case.
But he does admit that, despite rarely being the brilliant fellow we see on screen, the average hacker is doing something you're not: namely, spending time in your systems, studying them.
In the end, a good hacker isn't acing an exam without studying. They cram. They pull all-nighters. They spend hours, days, weeks, months in your system, which is likely hours, days, weeks and months more than you do. In Gibson's opinion, that needs to change (which goes right back to his first-step advice: know your system).
"Hackers aren't smarter than you," Gibson said. "They're just more motivated. You need to be equally motivated. Get more hot rodders in your organization and fewer Sunday drivers. Bring in people who can take the equipment and the processes and the software apart just like the hacker can. To defeat that hacker, you've got to be that hacker-and then know the system just a little bit better.

http://www.intelligentutility.com/magazine/article/410221/how-well-do-you-know-your-cybersecurity-hazards
"Life shrinks or expands in proportion to one's courage" Anais Nin .. and yet we must arm ourselves with fear