“Cybersecurity is not given the level of importance within utilities as it should be, because it slows things down and gets in the way of progress and innovation. In addition, it’s hard and it’s expensive. If you have something that is difficult, costs a lot of money and slows down progress, innovation and business, this is a real barrier to business operation.”
This article first appeared in Metering & Smart Energy International issue 3-2018.
This is the beginning of my conversation with Patrick Miller, a cybersecurity expert and keynote speaker at the recent SAP International Utilities conference in Lisbon. Miller, who describes the security sector as the least sexy in the utility business, is talking about why utilities and businesses generally don’t prioritise security. He believes that security is not seen as a business imperative – yet – because we are changing from a traditional analogue, physical world to a digital one, and we haven’t yet quite come to terms with the business obstacles we construct for ourselves.
Miller continues: “We have a lot of responsibility from a security discipline to change that perspective. We have been the business unit of ‘no’ for so long, and we are so busy trying to secure things, that we are not trying to make it secure in a way that doesn’t impact the user or slow the business down. This requires a unique and different way of looking at security and not a lot of people are ready or willing to think about it that way. They just want to say no, that you have to slow down and that you have to go through numerous security checks before a project can be brought to the table – and the way we are currently doing it means security is getting in the way of business progress.”
We are sitting in the exhibition area of the conference shortly after Miller’s keynote address, in which he brought some thoughts to the table about security: that security is not seen as an enabler of transformation and, importantly, that compliant does not mean secure and robust.
A lot of the challenge, Miller believes, can be addressed by properly framing the business implications of a lack of security – particularly in the boardroom – instead of moving straight to doom, gloom and possible digital Armageddon. Saying that a project needs to be delayed by a few days because of the risk and exposure it would create if it went live as it is conveys the message that changes need to be made, but in a different, easier-to-accept way.
Too often, Miller tells me, security systems seem to need to be engineered from scratch, adding to the time to operation and increasing costs. This, he says, is simply not true.
Segmenting operational silos groups similar functions together, minimising the impact of a breach across the entire organisation. And because silos can be configured in very similar ways, replication is easy, cutting down on the need to ‘reinvent the wheel’ each time a system is architected.
I wondered whether this doesn’t open a company up to increased risk of being easily hacked due to repeatability. Miller says not. Homogeneity is always a concern – in the physical world as well as the digital one – yet heterogeneity is only good when it is balanced. Having just enough differential between plants results in a model that looks the same, while the actual implementation of the endpoint technologies differs, albeit based on a common blueprint. The innovations in each iteration of a plant will ensure sufficient variance, but the basic concept will remain the same.
“It’s similar to saying we are going to build cars with headlights, steering wheels and blinkers or indicators, but each one you get into will be sufficiently different that you will need to check where the radio’s on button is, or where the high-beam switch for your lights is. This will happen naturally, so even by designing a standardised system, it will always have sufficient technological differentiation.
“This system will reduce your overheads and time to operation, and gives you the business advantage of volume purchases for software licences. Even if you do buy everything from one vendor, from one year to the next the workstations, components and interfaces will evolve. However, in the architectural diagram, workstations go ‘here’, servers go ‘here’ and how they communicate is a similar model.”
Supply chain vetting is another subject worth touching on. Miller spoke in his keynote about the need to get rid of the ‘snowflakes’, as he calls the once-off, bespoke programmes that are hard to maintain, reliant on one outdated operating system, unsupported, and dependent on that one operator in the most remote plant who knows how they work. By standardising software and suppliers, he believes, you gain access not only to more effective software engineering but also to constantly improving supply chain vetting. In fact, Miller points out, many manufacturers are making their own cables or components in an effort to secure the supply chain, adding to the degree of assurance you get from the product you buy. Depending on the vendor, you can therefore get a fair amount of supply chain security.
The standards that apply here include IEC 62443. There are some from NIST and some international standards as well, which are designed to give you the tools to ask the right questions of your vendors.
You may get answers you don’t like, but at least you know the lie of the land, what the risks are and how you can insert mitigating controls, instead of wondering where all the problems are. Over time this will improve; each time you ask your vendor the security questions, they will be better prepared, and soon enough they will provide you with products that state what the controls and security considerations are. This will likely end up becoming a differentiator for ICS providers.
Where does regulation come into play?
Regulation takes time, says Miller. It should take time. Fast regulation is generally bad regulation. Yet the challenge with regulation and technology is that technology moves very fast – faster than any legislative body can move.
“Some are very good at it compared to others, and some write legislation to certain types of technology, yet by the time the legislation is done and voted in, the technology has already changed.
“Yet trying to write legislation that is technology future-proof is exceedingly difficult: it ends up being very vague, hard to implement, hard to interpret, and leaves lots of room for variation in how you read it.
“The challenge is: how do you write it to be specific enough that everyone understands what it means and implements it in similar ways, yet future-proof enough that you have a chance to write the new legislation before all the technology has changed under your feet?”

Legislators write laws based on common practice, yet security regulation doesn’t conform to a norm, because security must account for malice – and with malice there is no telling how much time or resource will be directed against your company in the event of an attack.
The best regulation is regulation that clearly establishes: “This is the low bar. This is the minimum requirement that you need to meet. Our expectation is that you are going to do more than this, but this is the minimum.”
“Yet this is clearly not how it is communicated. So what we should see as the floor, we see as the ceiling; and it creates an artificial norm where the legislation is as far as people go, and then they stop.”
Often businesses will take the approach that if something is not necessary in order to be compliant they won’t do it, despite the risk they may be exposing themselves to.
The NERC CIP regulation is compulsory and comes with penalties and sanctions of up to $1.3 million per day per violation – the maximum fine – although no one has been penalised to that extent thus far. There was recently a $2.72 million fine issued for information leakage related to the NERC CIP standard, Miller says.
“The challenge is we have multiple agencies in charge of regulation. The electric sector generally, however, is probably the most experienced when it comes to regulations and even it is struggling when it comes to issuing penalties and sanctions as there are multiple implications when it comes to issuing fines and the potential impact on a business.”
If you think you are not a target…
Utilities are definitely targets for cyber criminals, Miller says. “There haven’t been a lot of published attacks on utilities,” he continues, with emphasis on the word ‘published’. In the industrial security space, no one talks about any of the security incidents. There is virtually zero actuarial data upon which to base risk decisions, with the result that utilities don’t know where to focus their time, money or resources because there isn’t enough data available. “We are just guessing where we need to focus,” Miller says, continuing that the lack of published incidents creates a false sense of security, as the tendency is to think that if no one is reporting it, it must not be happening. “If you were hit, would you talk about it?” he asks.
Reputational risk aside, Miller says the unintended consequence of regulation is that if an incident has occurred – even if it was completely outside the regulatory space – from a perception perspective you are assumed guilty until you can prove your innocence. The amount of effort that goes into doing that is enormous. Hence, no one talks. There have been situations where test systems have been breached and then incorrectly reported by the media, creating all sorts of challenges. While it is true that a system may have been breached, because it is a test system it is not subject to the same level of security as operational systems, and the impact on the operations of the business is therefore zero. The consequences for the business, however, can be serious, even if nothing was stolen and only a test system was affected.
In terms of the system impact – zero. In terms of the PR impact – HUGE.

What about within utility associations and organisations? Are discussions taking place among utilities? Miller has argued that legislation should take ‘data breach’ and ‘data compromise’ into account, allowing a critical infrastructure organisation, if compromised, to report the breach and be given ‘safe harbour’ from any potential fines or penalties, because it is providing a ton of actuarial data. “We now know what the threat actor did, how they operated, what they exploited, what they went after, and we can now share that information with others as a use case. In a lot of cases what you see is that no one talks because of the obvious issues. If they have an incentive to talk, they probably will.”
Being hacked, ironically, allows companies an opportunity to show how they successfully managed a breach. Miller is of the opinion that it is not a case of ‘if’ but ‘when’ a breach will occur, and the response to the breach is what will set successful companies apart from others. “If you can show how quickly you contained it, and how quickly and successfully it was handled, this will set you apart and, history has shown, will likely positively impact your stock price too,” he explains. Intentionally covering an incident up is going to negatively impact the business, but if you have a team that is well trained to manage breaches and contain the damage, it should be a manageable situation. Despite all the protections you have in place, a hacker is still likely to get through, but you can then show that the breach or attempted breach has allowed you to shore up your defences, preparing you for the next time an attempt is made.
Just not enough people
There is a severe cybersecurity workforce issue right now. Miller believes it is because utilities are not seen as an exciting space compared to technology companies such as Google or Apple, especially if you are a very talented millennial. “Utilities are known for long-term job security, but they are not known for high salaries, and the new talent pool has an expectation of being in the same company for between two and five years before they move on. They have a transient mindset when it comes to employment. What we are offering them is nowhere near that which they are interested in.”
What does the landscape look like?
The reason we haven’t seen more public or visible issues is that taking down a power grid, or blowing up an oil refinery, is considered a nation-state level act. NATO has stated that infrastructure-based cyber attacks can and may warrant a physical response. That is a severe situation. But what if you can’t determine who the attacker is? False flags are often used, whereby an attacker makes it look like another country or person is responsible. Yet, “once you get past all the posturing, hackers have a fingerprint, and it is difficult to exactly replicate the way someone writes a programme or an exploit and links code together, so there are ways we can get to a level of assurance and attribution. We could be about 90% sure that a specific nation or state may be responsible. The residual is handled with the intelligence community, so if you can correlate a high degree of certainty from a digital fingerprint, along with intelligence and corroboration from other allies, you can get to a high degree of assurance. It’s not quick and it takes time, but it’s not impossible.”
Yet, surely if more companies were encouraged to share breach attempts, it would be possible to build up a comprehensive database of coding styles and fingerprints? Miller agrees. Code databases make it possible to compare samples via artificial intelligence and identify patterns that humans may not even be able to pick out.
“Sadly, there’s not much sharing happening, and we need to get over that.” Miller believes the most effective way to share information is through multiple rings of trusted individuals, each sharing information with their own circles. It is a more social, less institutional and formalised form of sharing, but it is highly effective. “These rings of trust create a network of sharing,” he says. It is important to utilise expertise from other professionals who work in the same field, but in order to do that, utilities should encourage and allow security professionals to engage and network with their peers. “Form relationships with other professionals in your field,” Miller recommends. “Spend time networking because, at the end of the day, it will take a human to stop another human. There is no machine that can stop a determined human adversary.”