Musings on the Management of Machinery

By Ravi Nayyar

A Techno-Legal Update
Jul 12, 2024 · 14 min read

In early June, I attended the fabulous Lesley Carhart’s talk on all things OT security, hosted by Dragos in Sydney. As a fan of Lesley and as an aspiring OT security nerd, I loved it. Great explainers, points and stories from the field (including the ‘peaker plant which turned itself on’).

Always great to listen to folks who have actually been in the trenches. Or on oil platforms, with Lesley telling us about the mandatory drowning-prevention training which OT security folk have to do before they are allowed to take a helicopter ride to a rig.

As someone whose knowledge of OT security is largely from the literature, I find it a pleasure to learn from folks who have been there, done that.

Well, folks, this piece is about just that: Lesley’s talk and the conversations on OT security over drinks after her preso. This is about how the sausage is well and truly made.

And gee whiz, for someone as green as I am, the insights really were something.

1. The Basics Matter Most

First things first, operators continue to be found wanting in terms of basic security controls. Of course, no industry veteran will be surprised, but yeah, I mean basic. This was reflected in how much of Lesley’s talk was devoted to said basic controls.

I found this quite surprising at the time, given that the audience were experienced OT operations/OT security folk. Lesley is a veteran (both of the US Air Force and this discipline) and the talk was run by her employer, one of the specialist OT security vendors.

Hence, I went into the small theatre expecting the content to be quite advanced in terms of emerging fancy threat vectors, the nitty gritty of ICS protocols and gear, even war-gaming of future threats. I thought that last one would be especially relevant as the landscape hots up and the barriers to entry for OT hacking lower with, eg, modular hacking kits. In fact, I thought we would hear Lesley’s thoughts on the implications of stuff like ICEFALL, what the Russian state’s OT hackers have been doing in Ukraine, as well as the finer details of defending against Volt Typhoon-y threat actors.

Or even talk about secure-by-design (‘SbD’) being rolled out (if at all) by OT OEMs, certainly given what I’ve heard at conferences from folks in the sector about product security (especially vulnerability response) from OT vendors.

Instead, Lesley brought it all down to earth.

Her excellent talk was more an explainer of, as above, OT security basics. Stuff like MFA, training personnel and prioritising vendors who are responsive and have bug bounties when making procurement decisions. Run-of-the-mill matters.

Indeed, Lesley pointed out that most of her work in OT incident response stems from operators just not doing the actual basics right. Whether it is securing edge devices or safely managing IT-OT connectivity (eg just having DMZs).
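
To make the DMZ point concrete, here is a toy sketch of my own (not from Lesley’s talk, and with entirely hypothetical zone names, hosts and ports): the idea is that corporate IT traffic should only ever reach OT via an explicitly allowed, brokered path through the DMZ, with everything else denied by default.

```python
# A minimal, illustrative default-deny policy check for IT-to-OT traffic.
# Zones, ports and rules are hypothetical; real segmentation lives in
# firewalls and network architecture, not a Python script.

from dataclasses import dataclass


@dataclass(frozen=True)
class Flow:
    src_zone: str   # e.g. "corporate_it", "dmz", "ot_control"
    dst_zone: str
    dst_port: int


# Explicit allowlist: corporate IT may only reach the DMZ (say, a historian
# or jump host); only the DMZ may initiate connections into the OT zone.
ALLOWED = {
    ("corporate_it", "dmz", 443),   # hypothetical: web access to a DMZ historian
    ("dmz", "ot_control", 3389),    # hypothetical: brokered remote access from a jump host
}


def permit(flow: Flow) -> bool:
    """Default-deny: a flow passes only if it matches an explicit allow rule."""
    return (flow.src_zone, flow.dst_zone, flow.dst_port) in ALLOWED


# Direct IT-to-OT traffic is rejected, which is the whole point of the DMZ.
assert not permit(Flow("corporate_it", "ot_control", 502))  # Modbus straight from IT: denied
assert permit(Flow("corporate_it", "dmz", 443))             # brokered path: allowed
```

The design point is simply default-deny: a stolen corporate credential alone shouldn’t give a baddy a straight shot at the control network.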

Little wonder that the presentation was focused on the simple things being done wrongly, time and time again, by operators. Not fancy hacking techniques.

It focused on defenders who fail to exercise their agency to defeat the lion’s share of threats by getting their houses in order, rather than on the ‘winged ninja cyber monkeys’ (to quote Dr Ian Levy).

This makes one wonder why labels like ‘cyber-attack’ and ‘sophisticated’, and needless jargon more generally, are used in the mainstream press and by non-cyber executives on panels at cyber summits and in interviews.

If one looks at IT, we know the story: shadow IT, poor vulnerability management, phishing, stolen creds and a lack of MFA, misconfigurations, poor network segmentation, etc. And the state of OT security appears to be little different, at least in terms of that laxity in security culture.

Indeed, is it ‘sophisticated’ if a baddy simply uses creds for a company VPN that lacks MFA, creds stolen from an employee’s personal device because they use the same browser there as on their managed work device, in order to log into an OT facility and subsequently move laterally to fiddle with the HMI?

Or if the baddy is a malicious insider who is sacked by an OT operator, but poor ICAM means that their accounts aren’t terminated as soon as they are removed from their position, such that they can run riot in their ex-employer’s OT environments out of spite?

It would be great if the diplomats, (international) bureaucrats, corporate officers and civil society members (like think tankers talking about norms for state conduct in cyberspace and regional capacity-building) addressed this sort of issue (‘defenders stuffing up’) a bit more, relative to how much they speak about the threat from exploit shops (sorry, ‘cyber mercenaries’) or state actors doing normal signals intelligence.

And yes, I am aware of the UNGA approving, in 2015, the norm of responsible state conduct in cyberspace that ‘[s]tates should take reasonable steps to ensure the integrity of the supply chain so that end users can have confidence in the security of ICT products’. And of the G20 countries calling, in 2023, for the promotion of SbD as part of baking security and trust into the digital economy, which goes further than FVEY(+) agencies issuing joint advisories/guidance on SbD or procurement. And thus of the need to hold vendors to account for marketing nonsensical stuff. (And to define software supply chain risk management as a pillar of (inter)national security; you can read my take on that here, here and here.)

But that does not mean we should create a smokescreen of ten-foot-tall threat actors and focus our indignation entirely on vendors, in order to let reckless/negligent OT operators get away with poor controls.

Indeed, this is why my thesis concerns the regulation of the security and deployment of critical software as part of the regulation of CNI cyber resilience. It’s not just about the vendors dropping the ball with their SDLCs. It’s about the defenders neither deploying the product safely nor mitigating risks sufficiently from that product.

As Ukraine has well and truly shown us, defenders get a vote.

Reinforcing another UNGA-approved cyber norm which begins like so: ‘States should take appropriate measures to protect their critical infrastructure from ICT threats …’

‘Nuff said.

2. A Major Skills Gap?

Another key takeaway from the evening for me, especially from conversations over drinks after Lesley’s talk, was the serious skills shortage in the OT security discipline. What shocked me in my naivety wasn’t talk about the shortfall in folk who are technically trained in cyber resilience, but talk about the dearth of folk who understand how computers actually work, even after doing the average cyber resilience degree/vocational training program.

Of course, I’m a regulatory guy, so I don’t track the skilling aspect of cyber policy or what a good cyber education program requires. Yes, I understand the incredible diversity of the cyber resilience and computing disciplines such that folks will have to specialise within either of the two to gain a deep, technical understanding of a particular technology or paradigm.

And yes, I am not positioned at all to comment on the details of cyber resilience/computing curricula. So, I am only relaying to you what I heard from veterans I spoke with that evening. If you want to push back on this section of the piece, please do. I want to get the other side of the argument, especially because I am very green in cyber and OT security.

I assume a basic degree in how to protect computers from baddies in cyberspace would teach candidates how computers work. Certainly, one at my university seems to do so. But one veteran said the top universities in Australia drop the ball in their instruction, which puzzled me.

If we want to fill the oft-mentioned ‘skills gap’ in cyber, we need to ensure that our programs for cyber resilience education are actually fit-for-purpose. To speak generally, training providers like universities must ensure that the education they provide to our future incident responders, vulnerability researchers, pentesters, red-teamers, CISOs and cyber policymakers is actually preparing them for what they will be working on: computers.

After all, we are talking about something as vital as cyber resilience, which keeps modern societies and economies alive, not an 18th Century English contract law case of little relevance today (no offence, contract law people).

Broadening out the talent pipeline issue, one thing which also made my ears prick up was how OT security firms can’t find enough folks, even those trained in cyber resilience, who are worth recruiting for OT incident response roles.

Why, you ask?

Well, per the industry veterans I spoke to, it’s because most of today’s school leavers/university grads can’t think critically or laterally. Which isn’t ideal, given the complexity of OT environments, the increasing amount of software and connectivity with IT assets making them hum along, and the quite bizarre ways in which OT assets can appear to operate ‘by themselves’.

For instance, to borrow from one of Lesley’s stories (my memory’s spotty here, so bear with me), how does a trainee incident responder think their way through the problem of ‘phantom inputs’ to deduce that it’s actually moths flapping about or drops of condensation hitting an HMI touchscreen because the setup is in a shed with a door which wasn’t shut properly and thus exposed to the elements?

If people wanting to work in OT incident response (or cyber more broadly) can’t problem-solve and/or think creatively when faced with a seemingly intractable problem (like how an OT asset is switching itself on without verifiable human input), OT security vendors face a dilemma.

In this particular case, the jobs are there; it’s just that there aren’t enough folks worth recruiting for them.

And the clincher: someone said to me that they are happy to show a cyber grad the ropes when it comes to OT incident response, but if the grad is unable to go off the beaten track when it comes to ideas, the exercise is one of futility.

Not merely a cyber skills shortage, but a structural problem with the degree to which our primary, secondary and/or tertiary education systems actually teach kids how to think. If they aren’t taught how to push boundaries or be critical of their own approaches to problems, having too few capable cyber resilience people is the least of our problems.

Relative to the wider issue of a workforce not quite up to thinking creatively and rationally.

Not an OT or cyber resilience workforce. But a workforce generally.

You can’t generate a cyber/OT security talent pipeline from that, at least at the scale you need even in the shorter term, as IT and OT threat landscapes worsen and attack surfaces multiply with software dependencies and increasing(ly unsafe) IT-OT connectivity.

The solution is a more complex, whole-of-government one than the Commonwealth Minister for Cyber Security sharing the odd photo with a computing or cyber resilience student to highlight how the government is working to fill the skills gap.

This is in addition to an existing issue where, as one attendee told me, OT operators don’t have enough people to spare for staff to take time off and get trained in OT/general cyber security.

That one really is a dilemma: how do you upskill a vital employee in how to better look after a vital bespoke and/or legacy asset when you don’t have a decent stand-in to look after that asset in the meantime?

3. State Capacity?

The third key takeaway from the evening for mine was state capacity, especially in Australia. My conversations were obviously focused more on Australian cyber defence than offence, given the crowd and the overarching theme of the evening.

Now, of course, as a general matter, state capacity in cyber can never be perfect. Guns v butter and all that stuff. Especially in Western countries where there is a whole host of non-cyber stuff (like social services and healthcare) which requires taxpayer funding just as urgently as, if not more urgently than, cyber. Especially during present cost-of-living crises.

That all said, when it comes to assuring national cyber resilience, agencies aren’t exactly starved of resources.

If we look at Australia, one need only look at the roughly $10 billion being invested in ASD under Project REDSPICE — note the dot points on CNI cyber resilience and improving ‘nationwide cyber-incident response’.

I note also the Commonwealth’s commitment under the 2023–2030 Australian Cyber Security Strategy to make us a ‘world leader in cyber security’ by 2030. For example, through investments in capabilities and mechanisms to drive more robust incident response, such as a last resort ‘all-hazards consequence management power’ for the state, as well as uplifting CNI cyber resilience more generally. These aren’t small fry. (I expressed my reservations about such a power months before our strategy came out, but that’s not the topic today.)

Indeed, the point is, there is a lot of spending happening and more on the cards, as well as political will. This is on top of how the reputation of ASD (of which ACSC is a part) precedes it: FVEY and right up there.

Robustly well-resourced when it comes to defending the oodles of OT around the country.

Or so I thought.

I was quite surprised to learn from someone in the electricity sector that ACSC just don’t have enough OT security specialists.

Which throws into jeopardy their ability to actually execute intervention requests issued by Dep-Sec Home Affairs to DG ASD under Security of Critical Infrastructure Act 2018 (Cth) s 35AX (‘SOCI Act’).

But wait, what are intervention requests?

I have analysed the intervention request regime here and here, but the TL;DR is:

  • these are break-glass powers that allow ASD to be sent in to, essentially, operate a CNI asset which has been/is imminently going to be bricked such that national security is (likely to be) threatened; and
  • among other triggers being met, the Minister must be satisfied that no other regulatory system will ‘fix it’ and that the operators are incapable/unwilling to resolve the incident, and the triumvirate (PM, Defence Minister and Minister for Home Affairs) must be on board.

Point is, these powers are at the extreme end of the spectrum of ASD’s advisory function under the Intelligence Services Act 2001 (Cth) s 7(1)(ca).

Makes sense, though, given the context of national security being directly threatened. ASD is an intelligence agency, after all.

Oh, and let me point out that intervention requests cannot be authorised by the Minister unless they are satisfied that compliance is technically feasible: SOCI Act s 35AB(10)(f). When the powers were being debated in the Parliamentary Joint Committee on Intelligence and Security, ASD pointed out that they would, as a matter of course, consult with the operator of the affected CNI asset when figuring out what would be technically feasible (there are consultation requirements under section 35AD(2)).

So, at least when I wrote that earlier piece on these powers, I assumed that ASD has plenty of in-house OT security expertise. How else would they be able to consult appropriately and swiftly with the operator during a crisis?

Except the electricity sector person I spoke to that evening suggested that such expertise was not exactly abundant on this front.

Now, I’ve had the pleasure of meeting a few of the OT specialists at ACSC, including at CI-UP. They’re lovely folks and absolute guns.

But are there enough of them? I don’t know.

So, as with the earlier section of this piece on the quality of education, I want you readers to tell me if you agree with my interlocutor’s position.

Is the OT expertise/resourcing situation that bad at ACSC?

And yes, to speak generally, we know the journeys of many folks in cyber: getting world-class training in cyber resilience and operations while working in Sigint agencies (if not the military itself), then checking out (after required service periods) for better pay and conditions in the private sector. I’m not at all dissing folks who follow that route; rather, I’m wondering out loud whether FVEY agencies like ASD do enough to retain their talent, especially their OT talent.

A vital question, given that a true OT specialist is priceless in a national security sense.

The skills are built over years of tinkering/responding to incidents in a whole variety of OT environments or even a niche of OT environments, given that you can have people who spend most/all of their careers becoming specialists in, say, pipeline security.

It is such specialists who are pillars of the (regulatory) capacity and literacy of the state. For example, they help cyber resilience agencies identify, with respect to their CNI cyber resilience missions, what:

  • threat types and responses need to be planned for;
  • legal and policy reforms, and OT security metrics, are necessary;
  • resources (including training and funding) are required for whole-of-government efforts to assure OT cyber resilience; and
  • institutional structures and mechanisms are necessary for agencies to be credible and trusted partners for industry, including for CTI sharing, co-development of law and policy, and incident response.

I make these points while thinking of the shortcomings identified in the (past) approaches of the Fed, TSA, CISA and other US Sector Risk Management Agencies to discharging their (regulatory) missions.

Oh, and our CNI is largely OT. Our agencies need OT-literate folk to be able to protect it and enable smooth collaboration with operators.

I’ve heard that ACSC needs to spend a few meetings convincing OT operators that they’re not a regulator when trying to literally help them through CI-UP. This suggests that there is a trust deficit between the state and industry. Which is such a shame, given how amazing the CI-UP people are.

And, to reiterate an earlier point, such OT specialists can make much more money in industry, be it for an OT operator or even a specialist firm like a Dragos, Fortinet or Nozomi. Our agencies, backed by our governments, need to do what they must to retain that talent.

Indeed, the CISA Director acknowledged the pull factors at play for cyber folk generally (using herself as an example) before trying to rebut them by saying that ‘nobody joins the federal government to make any money … [rather] they come for a mission’. Make of that what you will.

(Tangentially, we know the reports/leaks on issues with CISA’s resourcing/structuring/approach. My heart goes out to the agency: no real teeth, yet condemned to jawbone, at least until CIRCIA, 6 USC §§ 681–681g, is operationalised through the agency’s rules. CISA was consulting on them earlier this year.)

All Up, a Fun Evening

It was a really cool evening (not just because it was at the start of winter). Lesley gave a great talk, told some great stories and provided some great lessons. I have been following Lesley on X/Mastodon for around a year or so and have been a fan of hers ever since, especially as an aspiring OT security nerd. Therefore, it was so cool to finally meet and talk shop with her. Cheers to Dragos for putting on this event.

In terms of my three key takeaways from the evening, well, those made it even more memorable. I’m always looking to learn from folks who actually do this stuff for a living, since my knowledge is almost entirely book-knowledge. I relished the chats ’n reality checks over drinks, be it with regard to the dearth of implementation of basic controls by OT operators, whether our cyber workforce is being given the right skills, or whether our agencies are appropriately resourced to perform their OT security mission.

Of course, as I said especially with regard to takeaways 2 and 3, I want you to provide your responses. Are those takes from the other attendees accurate? Something missing? What’s your take on the solution?

P.S. Someone I spoke to made an interesting definitional point. They said that the IT-OT convergence had already happened, namely in the 1990s with the introduction of SCADA solutions. That ‘convergence’ was still confined to OT segments of operators’ networks, though.

Hence, they distinguished that ’90s convergence from the contemporary IT-OT convergence by highlighting how the latter involves a fusion of the OT operator’s business processes with their OT processes. This is because of, eg:

  • IT running processes that govern or enable the operation and maintenance of OT (eg workforce planning and rostering, operator callout systems, vulnerability management and asset inventories);
  • the wider ‘cloudification’ of OT; and
  • the (subsequent) phenomenon of OT and IT data flying around each other’s segments of operators’ networks more and more.

What do you make of their assessment of the IT-OT convergence?
