The Cessation of Straya’s Cyber Somnolence? Yeah, Nah

By Ravi Nayyar

A Techno-Legal Update
Dec 4, 2023

So, like a lot of folks, I have thoughts about the 2023–2030 Australian Cyber Security Strategy (‘2023 Strategy’).

The thing hyped up as the remedy for our erstwhile ‘cyber slumber’, to quote our fearless Minister for Home Affairs and Cyber Security, Clare O’Neil MP. (Though, did Rhonda from Utopia come up with that term?)

Well, TL;DR = I didn’t expect much and my response is, ‘Barring a few things, sigh’.

I’ll provide brief thoughts on the 2023 Strategy as a whole, dive into a few of the shields, and leave my specific thoughts on bits and pieces in an appendix. (And all the memes were created by moi.)

Let’s dive in.

General

There’s not really much new in the 2023 Strategy versus its predecessor, Australia’s Cyber Security Strategy 2020 (‘2020 Strategy’). For a strategy which has been in the works since August 2022.

I won’t do an action-by-action comparison of the two strategies, but if you do, you’ll understand my point, especially if you have been following the Australian cyber resilience scene far longer than I have.

Makes one wonder what all the millions of dollars spent on consultants to do administrative-sounding work to support the development of the new strategy were for. Either way, happy days for the firms, when Home Affairs could have hired Masters/PhD students as research assistants/interns to do the same for much less. Nonetheless, our fearless Minister has been careful to avoid questions about the merits of these disbursements of taxpayer dollars:

Yeah, I mean, I actually — the Department decides about contracting and that sort of thing. It’s actually not something that I have power and control over. All I can say is that this is a problem that is worth… [throwing] Literally tens of billions of dollars [at solving]. I’m not kidding… I think you will find whatever the cost of this strategy is going to be is going to pay for itself hundreds of times over.

Yes, Minister.

Alrighty, to the Shields.

The Most Noble Order of the Shields

Now, the ‘six cyber shields’ (um, that s-word suggests defence, not a more holistic goal of resilience, but anyhoo). Figure 1 summarises them.

Figure 1: The six Shields from the 2023 Strategy.

I find the ordering baffling. Why put CNI as far back as 4?

Why not an order like the following:

  1. Safe technology
  2. Protected critical infrastructure
  3. World-class threat sharing and blocking
  4. Strong businesses and citizens
  5. Sovereign capabilities
  6. Resilient region and global leadership

Safe technology (never mind the cop-out from the government, as I argue below) is vital to everything in our society and economy functioning — like CNI — let alone the 2023 Strategy being implemented.

And then, you can’t do any of the stuff in the other four Shields without your CNI running (with the aid of safe technology), given the definition of CNI. See the criteria under the Security of Critical Infrastructure Act 2018 (Cth) ss 51(1)(c)-(d) (‘SOCI Act’). For example, you can’t share CTI per Shield 3 if your data centres are starved of electricity.

But ah well, that’s ‘semantics’.

Now, my thoughts on Shield 1.

Shield 1

A mixed bag.

The Good Bits

Firstly, the proposed ‘Cyber Incident Review Board’ (‘CIRB’), modelled on the likes of the American Cyber Safety Review Board (‘CSRB’) and the Australian Transport Safety Bureau (Action 5.2), is a great thing.

A specialised, most likely multi-stakeholder (if it’s modelled on the CSRB), body tasked with investigating major breaches and identifying how we can collectively improve how we tackle cyber risk.

Secondly, the proposal to co-design a legislated limited use obligation, which will closely regulate the intra-governmental sharing of data reported by industry to the Australian Signals Directorate (‘ASD’) and the National Cyber Security Coordinator, is also welcome, with the Commonwealth planning to get the ball rolling ‘by developing an interim approach for ASD’ (Action 6.2).

The Commonwealth can use the following bit of clause 21 of the Memorandum of Understanding between the National Cyber Security Centre and the Information Commissioner (‘NCSC-ICO MoU’) as a template:

For the avoidance of doubt, the NCSC will not share information from an organisation it is engaged with due to a cyber incident with the Commissioner unless it has the consent of the organisation to do so.

In the foreshadowed Australian limited use obligation, the Commonwealth can also make reference to section 40 of the Intelligence Services Act 2001 (Cth), which strictly limits what information can be shared by ASD staff, contractors and their employees.

I reckon Action 6.2 is among the most consequential elements of the 2023 Strategy, given the criticality of transparency and information-sharing by the private sector — especially CNI owners and operators — with government (during a crisis). The quicker industry can share more, and richer, CTI and telemetry, with legislated limits on what that data can be used for by government, the greater the trust will be between the public and private sectors.

Which will enable ASD to move much more quickly in a national security crisis stemming from a serious breach of cyber resilience in, say, CNI. Not least because the situational awareness of the victim organisation will be more rapidly super-charged (yes, I said that) by the state’s own ’Marvel superpower’ (no, I didn’t say that) — their situational awareness and technical expertise through ASD, a Sigint agency — in order to swiftly resolve the breach and crisis.

Action 6.2 makes clear that this limited use obligation ‘would not impact regulatory or law enforcement actions, or provide an immunity from legal liability’. Sure, in principle, this prevents moral hazards that would be caused by organisations otherwise being confident that they can avoid scrutiny for unreasonable control failures and thus breaches of their regulatory obligations. It thus promotes the rule of law.

But in practice, if the Commonwealth is looking to incentivise transparency by organisations, there are questions about what this limited use obligation will achieve in terms of shifting industry mindsets in favour of engagement with ASD and/or the National Cyber Security Coordinator in the absence of further carrots or reductions in the size of relevant sticks.

In this regard, the Commonwealth should consider instituting a mechanism for organisations constructively engaging with it to be looked upon favourably by their regulators, especially in the context of enforcement action. Such a mechanism could echo clause 37 of the NCSC-ICO MoU:

The Commissioner will on an increasing basis, continue to recognise and incentivise appropriate engagement with the NCSC on cyber security matters in its approach to regulation. Specifically, the Commissioner will publicise (on its website, in guidance, and in relevant press releases) that it looks favourably on victims of nationally significant cyber incidents who report to and engage with the NCSC and will consider whether it can be more specific on how such engagement might factor into its calculation of regulatory fines.

The Suboptimal Bits

Firstly, on the proposed CIRB, I am unsure what is meant by the claim that it ‘will not make findings of fault’. How do you conduct a ‘lessons-learned review’ without figuring out whose conduct contributed to what happened? (Of course, I understand the need to keep industry onside in all this.)

The value of such a CIRB is also limited by its ability to access information from industry and government in order to generate rich recommendations. Will it have information-gathering powers akin to those of a royal commission?

This is relevant especially if the pwned organisation claims legal professional privilege (‘LPP’) over relevant data because it fears getting sued — as Optus unsuccessfully did with the Deloitte post-breach report — thereby delaying the work of the CIRB with litigation over LPP. Perhaps the Commonwealth can legislate powers for the CIRB similar to that under the Royal Commissions Act 1902 (Cth) s 6AA in this regard?

When it comes to dealing with a Medibankesque scenario, where the organisation the subject of a CIRB review simply refuses to publish the DFIR report, the Commonwealth can look at the coercive powers of a royal commission to summon witnesses and take evidence under the Royal Commissions Act 1902 (Cth) s 2 as a model.

Devil’s advocate: Why do we need a CIRB when you can expand the purview of the Inspector-General for Intelligence and Security (‘IGIS’) to review how ASD (regulated by IGIS) and all stakeholders acted in resolving a breach? IGIS has pretty hefty powers.

Secondly, the 2023 Strategy is a damp squib, a complete let-down when it comes to cyber governance.

Our country appears reluctant to take even basic action against poor cyber governance as poor corporate governance, as opposed to treating it as a breach of sectoral law a la ASIC v RI Advice.

Reluctant, too, to impose personal liability on corporate officers, as defined by the Corporations Act 2001 (Cth) s 9 (‘Corps Act’), for breaches of their statutory duties to their companies arising out of said poor cyber governance.

Why else haven’t any of the people, who were Medibank’s or Optus’s officers just before and during the relevant breaches, been even the subject of Corps Act s 1317E proceedings that merely involve a court, on application from the Australian Securities and Investments Commission (‘ASIC’), declaring that they have breached their officers’ duties under sections 180–1 of the Corps Act (those are civil penalty provisions under section 1317E(3))?

Seriously, Medibank (a CNI operator) was done through its supply chain (valid Medibank creds stolen from a third-party service provider) and a misconfigured firewall letting the attacker straight in. The consensus is that Optus was done in 2022 through an unauthenticated API. The Minister rightly shot down the then-CEO’s narrative about it being a ‘sophisticated’ attack.
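For the uninitiated, the failure mode behind an ‘unauthenticated API’ breach is depressingly simple to sketch. The below is a toy illustration in Python — all names and data are my own inventions, nothing here is Optus’s actual code — of how an endpoint that never checks who is asking lets an attacker simply walk sequential customer identifiers:

```python
# Illustrative sketch only: a hypothetical customer API with and without an
# authorisation check. All names and data are invented for the example.

CUSTOMERS = {
    1001: {"name": "Alice", "dob": "1990-01-01"},
    1002: {"name": "Bob", "dob": "1985-05-05"},
}
VALID_TOKENS = {"token-for-1001": 1001}  # token -> the customer it belongs to


def get_customer_unauthenticated(customer_id):
    """The failure mode: the endpoint never checks who is asking."""
    return CUSTOMERS.get(customer_id)


def get_customer_authenticated(customer_id, token):
    """Minimal fix: authenticate the caller, then authorise the record."""
    if VALID_TOKENS.get(token) != customer_id:
        return None  # caller is not entitled to this customer's record
    return CUSTOMERS.get(customer_id)


# An attacker can trivially walk sequential IDs against the first endpoint...
leaked = [c for cid in range(1000, 1010)
          if (c := get_customer_unauthenticated(cid)) is not None]

# ...but the second endpoint yields nothing without the right token.
blocked = get_customer_authenticated(1002, "token-for-1001")
```

The fix is not exotic: authenticate the caller and authorise their access to the specific record before returning anything.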

Fast-forward just over a year, and the massive outage of Optus’s consumer IP network was thanks to Optus failing to configure its Provider Edge Routers appropriately, such that around 90 of them ended up ‘automatically self-isolating in order to protect themselves from an overload of IP routing information’ — as per their default settings — received after a software update (which Optus knew about) at a Singtel internet exchange (run by the parent company, whom Optus unsuccessfully tried to blame).

So an outage which had such massive cross-sectoral impacts, including by preventing 000 access for many customers, was thanks to Optus not configuring its own gear properly.

No threat actors.

Just its own stuff-up.
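For the curious, the protective behaviour at issue can be sketched as a toy model of a maximum-prefix safeguard, a common default on routers: when a peer floods the router with more routes than its configured limit, the router tears down the session and thereby ‘self-isolates’. Everything below is my own illustrative Python, not Optus’s (or any vendor’s) actual configuration or code:

```python
# Toy model of a router's maximum-prefix safeguard. Purely illustrative:
# the class, thresholds and behaviour are invented for this sketch.

class Router:
    def __init__(self, max_prefixes):
        self.max_prefixes = max_prefixes  # configured (or default) limit
        self.routes = set()
        self.session_up = True  # state of the peering session

    def receive_routes(self, prefixes):
        """Accept routes from a peer until the configured limit is exceeded."""
        if not self.session_up:
            return  # an isolated router accepts nothing
        for prefix in prefixes:
            self.routes.add(prefix)
            if len(self.routes) > self.max_prefixes:
                # Default protective behaviour: drop the session entirely,
                # isolating the router (and its downstream customers).
                self.session_up = False
                self.routes.clear()
                return


pe_router = Router(max_prefixes=100)
pe_router.receive_routes([f"10.0.{i}.0/24" for i in range(50)])   # a normal day
pe_router.receive_routes([f"10.1.{i}.0/24" for i in range(200)])  # routing flood
# pe_router.session_up is now False: the router has "self-isolated"
```

Multiply that across around 90 Provider Edge Routers and you get a network-wide outage without a single threat actor in sight.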

Oh, and it gets worse — one of our country’s most critical of CNI operators had not exercised a complete cold restart of its network. (And its then-CEO felt it was perfectly professional to leak her resignation to the Australian Financial Review the week before she officially resigned.)

But wait, there’s more, possums!

It’s not like the DP World Australia hack was because of a ‘military-grade cyber weapon’.

Rather, it was most likely because they didn’t patch their Citrix NetScaler instance prior to their compromise with, let’s face it, LockBit. The Minister certainly seemed to imply CitrixBleed was the initial access vector by publicly railing against the port facility operator for failing to patch their Citrix.

That is the CNI operator handling 40% of our maritime trade failing to do basic patch management (which would include, one would assume in this case, failing to put in place mitigating controls for the intervening period prior to their ultimately patching their Citrix).

All this while we are an island continent fundamentally dependent on sea lines of communication to continue, well, existing. Which only exacerbates the systemic problems brought on by our lacking a merchant navy.

To sum up, we have a few instances of terrible cyber governance, with terrible negative externalities for our national security, in just the last year or so. They sit among the 143 incidents that ASD responded to in FY 2022–23 at organisations self-identifying as CNI operators, an increase from the 95 such incidents reported in FY 2021–22.

And yet, here’s what Shield 1 has to offer on cyber governance.

Industry feedback has flagged that more could be done to help businesses understand what good cyber security looks like.

What an indictment. It’s 2023, for crying out loud.

As a first step, the Government will publish an overview of corporate obligations for critical infrastructure owners and operators.

ASIC already did so in 2015 with Report No 429. See Appendix 2 thereof. No mention, of course, of ASIC’s role in enforcing officers’ duties in the cyber governance context.

To add insult, ASIC isn’t even the lead agency for Action 5.1, per the Action Plan for the 2023 Strategy (‘2023 Action Plan’), rather Home Affairs and Treasury are. I didn’t realise they were the corporate regulators.

Never mind that this is not even a law reform initiative (which at least would have justified Treasury and Home Affairs being among the lead agencies), but rather one recycling existing literature on cyber governance.

Including that put out by the actual corporate regulator.

Next, the Government will consider how best to collaborate with industry to design best-practice principles to guide good cyber governance.

Excuse me!?

No wonder there is no mention of ASIC’s role as an enforcer: the Commonwealth seems to have forgotten the underlying philosophy of Recommendation 6.2 from the Financial Services Royal Commission, which called for ASIC to, ‘as its starting point, [take] the question of whether a court should determine the consequences of a contravention’.

Yes, that recommendation was about enforcing financial services law, not corporations law, but you get my point — ASIC needs to actually and aggressively enforce the Corps Act officers’ duties, rather than the government sitting down to sing kumbaya with industry.

That is what will truly ‘guide good cyber governance’.

‘Why not litigate?’, to quote Commissioner Hayne.

Oh, and note that we already have ‘best-practice principles’. They’re called rationes decidendi, from courts interpreting the officers’ duties under the Corps Act.

To help industry navigate mandatory cyber incident reporting obligations, the Government has developed a single reporting portal on cyber.gov.au that brings key reporting links together in one place.

So it’s basically something undergrad law students can put together and pitch in a day’s work at a hackathon? A collection of hyperlinks based on whom you tell the webpage you’re regulated by?

Sigh.

All up, folks, it is obvious that said breaches of cyber resilience in our CNI are merely continuing a trend of below-par cyber governance in Oz generally.

Look at the findings of ASIC’s cyber pulse survey 2023 (Report 776), based on responses from 697 organisations, including companies of a range of shapes and sizes. Across the board, the weighted average cyber maturity score was 1.66 out of 4.

When we break things down by function of the NIST framework, things are far from rosy — see Figure 2 (hint: look at the ‘Govern’ function).

Figure 2: Weighted average cyber maturity score by function for small vs medium and large organisations, per ASIC’s cyber pulse survey 2023.

The overall score for governance and risk management is 1.62 out of 4, with ASIC saying organisations can do better in:

  • ‘defining cyber security roles and responsibilities’;
  • ‘complying with regulatory obligations’; and
  • ‘identifying and prioritising vulnerabilities to information assets’.

Furthermore:

  • ‘44% [of respondents] do not manage third-party or supply chain risk’;
  • ‘58% have limited or no capability to protect confidential information adequately’;
  • ‘33% do not have a cyber incident response plan’; and
  • ‘20% have not adopted a cyber security standard’.

ASIC’s survey is in addition to the cyber security stocktake being done by the Australian Prudential Regulation Authority (‘APRA’) in 2023 to assess compliance by over 300 banks, insurers and superannuation trustees with APRA’s prudential standard for information security, CPS 234.

The findings, released in July 2023, from the assessment of 24% of APRA’s regulated population thus far are as damning as those from ASIC’s survey.

Never mind that the respondents would include CNI operators, given the categories of CNI assets under the SOCI Act s 9.

Oh, and let’s go back even further, to a May 2022 survey by the Australian Institute of Company Directors and the Australian Information Security Association of 856 directors on how their organisations did the cyber dance. Its findings are also damning.

Of the directors surveyed:

  • 22% had an ‘inadequate’ understanding of their companies’ regulatory obligations with respect to cyber;
  • 25% of those sitting on the Boards of listed companies did not consider that cyber security is a ‘high priority issue for [their Boards]’;
  • less than half of listed company directors said their Boards ‘[m]ake cyber a specific focus of a board committee’;
  • 31% said their Boards are never briefed on the cyber resilience of ‘key third-party suppliers’, while merely 37% received briefings ‘Occasionally/Ad-hoc’ on ‘execution of [a] cyber strategy or framework’;
  • half said their organisations did not apply cyber resilience frameworks like NIST/ISO 27001/Essential Eight; and
  • merely 14% of those sitting on listed company Boards and 20% of those sitting on unlisted company Boards said their organisations recruited directors with expertise in cyber.

And yet, our cyber security strategy, our Rosetta Stone for cyber policy, has not shed the voluntary self-regulation/warm, fuzzy/baby steps, etc approach when it comes to cyber governance. This is not a new problem. This is not a ‘winged cyber ninjas’ problem. This is a ‘dearth of intent’ problem.

A dearth of intent to enforce existing laws.

Not jawbone, as the ASIC Chair has done over the last two years, about officers needing to provide good cyber governance and comply with their Corps Act duties.

But enforce.

If the sword of Damocles does a wee bit more than just hang over officers who drop the ball, well, that would really set the cat among the pigeons.

Like the SEC is doing in the case of SolarWinds and its CISO (here’s my take on that action).

Especially as the threat environment gets even worse generally, not just for CNI.

Otherwise, we make Sisyphus look like he’s on holiday.

Actually, don’t we already do that?

Now, Shield 2.

Shield 2

Another mixed bag.

The Disappointing Bits

This is the one I was especially looking forward to as an aspiring software security nerd, but knew not to expect much from, given the Minister’s publicly disclaiming the creation of a European Cyber Resilience Act-style vendor liability regime. And good thing I hadn’t gotten me hopes up.

Yes, there are commitments to: co-design a mandatory IoT cyber security standard with industry (Action 8.1); implement year-old Quad commitments to harmonise procurement regulations with respect to software security (Action 8.2); and create a national security-focused SCRM framework to help industry do their C-SCRM on top of figuring out how to boot insecure gear from the Australian market (Action 8.3).

That said, will the mandatory IoT standard merely codify the Voluntary Code of Practice: Securing the Internet of Things for Consumers from 2020, flagged by the 2020 Strategy? If yes, how’s that a radical rethink of policy taking us out of the ‘cyber slumber’ under the Morrison Government? That said, the bar is subterranean for IoT/software security regulation in Australia.

Also, why go for a mandatory IoT security standard, but then a voluntary labelling regime for IoT? If both the security standard and the labelling regime would be in sync with international standards, then I’m not quite sure how the former would require compliance with criteria irrelevant to the latter.

If you’re going to require IoT vendors to do x, y and z in terms of product hygiene, which they have agreed on via the co-design process and thus would see as making their products more attractive, then how is it a compliance headache for them to be required to merely disclose that they have done x, y and z on their products and/or packaging in a manner which they have agreed to via the co-design process?

Why not make both mandatory? And why not make them enforceable co-regulatory frameworks, analogous to industry codes under the Competition and Consumer Act 2010 (Cth) pt IVB?

The problem is not remotely small enough to justify voluntary self-regulation (because that is what is foreshadowed here) ‘as a first step’, by the way.

On implementing commitments as a Quad country (from the May 2022 and May 2023 Quad Leaders’ Summits), excellent. I’ve written on the Quad’s initiative (here, here and here) and think it’s a great idea.

But I am incredibly disappointed the government is not going down the liability route in re software vendors not investing in the hygiene of their software development life cycles (‘SDLCs’). As I’ve argued in said pieces on the Quad’s initiative, existing incentive structures, if any depending on the jurisdiction, are utterly inadequate in motivating vendors to invest in and continue to assure the resilience of their SDLCs (just like how they are in Australia for companies generally to assure their cyber resilience).

You cannot possibly rely on economic incentives (like government contracts) alone to get vendors (or indeed any regulated entity) to lift their game. You need to go for smart regulation a la Gunningham, using a range of approaches, including black-letter law. If economic incentives (the wallets of educated buyers) were enough, then the invisible hand alone would have done it.

Except it didn’t.

It’s long past time the government realised that this is a software and national security problem, and thus a public goods problem.

Alas, the market has not prioritised SDLC hygiene, but the Albanese government appears to have reposed a lot of faith in the market and ignored that signal, by rejecting the idea of an explicit liability framework to recalibrate the invisible hand to favour vendors with robust SDLCs.

Meanwhile, the Council of the European Union and the European Parliament have reached provisional agreement on the EU’s proposed Cyber Resilience Act — what would be the world’s first bespoke liability regime for the security of hardware and software products — a major milestone in the EU supranational legislative process. #whycantwehavenicethings

Meanwhile, we’re committed to coming up with a ‘voluntary code of practice for app stores and app developers’ in Action 8.2 of the 2023 Strategy. Look at the sheer amount of malware in the Android ecosystem, finding its way into popular apps, for instance. How can we forget the Pinduoduo imbroglio of 2023? A case study into negative externalities of decision making by developers and app store gatekeepers.

And yes, both Apple and Google take tons of action to prevent fraud and flush out malicious apps, but then again, that is motivated purely by commercial self-interest on their part — failure to perform their basic gatekeeper roles as app store arbiters would not exactly do their reputation among users any good.

They are not specifically required by legislation, at least in Australia, to take that action, nor are they held accountable against standards for the security of the apps that they allow to be downloaded from their storefronts.

Of course, if the likes of Apple and Google rail against being regulated as gatekeepers, well, the Commonwealth can merely tell them, ‘We’ll enshrine your existing policies in law as a start’.

The Great Bit

I love Action 8.3 because I believe C-SCRM, especially software SCRM, is national security risk management (which is why regulating the hygiene of SDLCs is vital).

As the Indian External Affairs Minister put it just a year ago:

Today, technology is very much at the heart of geopolitics…

We live in an era where, in many ways, the Westphalian model of international relations is over for us in this era of technological interpenetration… I think in the digital side, we are going to hear, more and more, the concept of trusted geographies, so the moment you start talking trusted geographies, the geopolitical connotation of that is very, very clear.

C-SCRM matters, folks. Geopolitics, and thus cyber risk and foreign interference risk, pervade the lot of it.

Anything the state can do to share its situational awareness on hardware and software supply chains with industry to enable the latter to make much more informed decisions (and give CISOs/procurement officers some actionable intelligence to take to their Boards/seniors, respectively) is vital.

This will especially enable industry’s compliance with their obligations to manage supply chain risk, such as: CNI operators covered by the risk management program regime obligations under SOCI Act pt 2A, specifically required to take care of supply chain hazards under the Security of Critical Infrastructure (Critical infrastructure risk management program) Rules (LIN 23/006) 2023 (Cth) r 10 (‘SOCI RMP Rules’); or prudentially-regulated institutions under CPS 234 having to manage risks from third party service providers (note CPS 230 coming online in 2025, requiring them to do so regarding material service providers).

A framework of the sort contemplated by Action 8.3 could also contain criteria that regulated entities or organisations can plug into their procurement contracts and vendor screening procedures. Beauty.

If we are to go down the black-letter route to implement Action 8.3, we can use, as templates, the:

  • Supply Chain Rule (see 15 CFR § 7), a national security-focused screening regime under which the Secretary of Commerce can ban ‘any acquisition, importation, transfer, installation, dealing in, or use of any information and communications technology or service, including but not limited to connected software applications’ on the basis, essentially, of foreign interference risk;
  • Secure and Trusted Communications Networks Act of 2019 (codified at 47 USC §§ 1601–9), under which the Federal Communications Commission (‘FCC’) designated Huawei and ZTE as national security threats and thus producers of covered equipment per 47 USC § 1601. This led to the FCC: banning the use of Universal Service Fund (‘USF’) money to purchase said companies’ gear, per 47 USC § 1602 and 47 CFR § 54.9(a); requiring the removal of their gear by US telcos receiving USF money, per 47 CFR § 54.11(a); and making said removal, where the telco has at most ten million customers, eligible for federal funding under the Secure and Trusted Communications Networks Reimbursement Program, per 47 USC § 1603 and 47 CFR §§ 1.5000, 54.9(d); and
  • Secure Equipment Act of 2021 (codified as a statutory note to 47 USC § 1601), under which the FCC updated its equipment authorisation program (contained in 47 CFR §§ 2.901–2.1093) in November 2022 to ban the authorisation, import and marketing of covered equipment (including comms gear produced by Huawei, ZTE, Hytera, Hikvision and Dahua) in the US.

Or maybe the Commonwealth won’t go for the creation of a generalist framework — be it under statute or legislative/regulatory instrument (eg Ministerial rule-making to amend the SOCI RMP Rules under SOCI Act s 30AH(1)(c)) — modelled on the above, which allows the state to coercively require the exclusion of designated vendors?

Maybe it will be left to sectoral regulators to implement? (Though none are mentioned in the 2023 Action Plan list of accountable agencies for Action 8.3.)

Or maybe the SCRM framework, as contemplated by Action 8.3, won’t be legally enforceable?

More ‘guidance’ incoming from Home Affairs — via the Cyber and Infrastructure Security Centre/Australian Cyber Security Centre (‘ACSC’)/Australian Security Intelligence Organisation — on which vendors to avoid? Guidance which CNI responsible entities can ‘consider’ when complying with their SCRM obligations under the SOCI RMP Rules r 10 as well as sectoral obligations, if any?

After all, Action 8.3 contemplates ‘further options to limit the availability of non-secure products in the domestic market’, which is broader than black-letter law bans on vendors or service providers that have been designated by the Minister for Home Affairs/Secretary for the Department of Home Affairs/delegate of the latter.

Wouldn’t want to antagonise industry through more pesky black-letter regulation, now, would we?

On the whole with Shield 2, I can’t help but see it as a missed opportunity. We could have driven real change at home in terms of incentivising the production and support of truly secure technology, had we gone for liability frameworks for software vendors, mandatory IoT labelling and mandatory security standards for app stores and developers. The very things enabling all the other Shields.

Gee, what a surprise that we can’t have nice things.

I guess we can hang on to how we led the world in excluding Huawei from our NBN and 5G build, but not really progress much beyond that.

Now, Shield 4.

Shield 4

As Shakespeare wrote, ‘What’s in a name?’

Well, in policymaking and law reform (as we were taught in statutory interpretation in law school), a lot. Intent, for one.

Why does Shield 4 say ‘protected’, but not ‘resilient’, CNI?

‘Protected’ suggests that the CNI assets’ perimeter defences, in the cyber context, are good enough, not that the assets and their operators are ‘resilient’ in the face of successful targeting by the baddies.

Words matter.

Especially when, right before the diagram that is Figure 1 (listing the Shields) appears in the 2023 Strategy, is the following sentence:

Throughout the period covered by the 2023–2030 Australian Cyber Security Strategy (the Strategy), the Australian Government will work with industry to reinforce these shields and build our national cyber resilience.

R-word, Baby!

By the way, on the proposal in Action 14.3 for a ‘last resort all-hazards consequence management power’, note that the all-encompassing directions-issuing power of the Minister already exists under SOCI Act pt 3.

And if this is talk about extending the Commonwealth’s part 3A powers to operate after the relevant cyber security incident has been resolved, I am firmly against that — see my analysis from earlier this year.

Onto the final Shield.

Shield 6

I am especially disappointed by how Shield 6, ‘Resilient region and global leadership’ (five pages’ worth, with 7 Actions), is nowhere near as comprehensive as our 2021 International Cyber and Critical Technology Engagement Strategy (‘2021 CCTE Strategy’; the operative bits are close to 80 pages long, with 37 Actions).

The 2023 Strategy’s Action 19 is titled ‘Support a cyber resilient region as the partner of choice’, especially for our Pacific family. And we appointed Tim Watts MP — someone more than literate in cyber policy — as Assistant Minister for Foreign Affairs.

So why wasn’t Assistant Minister Watts tapped to draft a new cyber and critical technology diplomacy strategy? After all, if we were wanting to tear back the curtains, open the windows and let in some fresh air by replacing the 2020 Strategy, why not do the same with the 2021 CCTE Strategy?

In the interests of bringing our diplomacy and domestic policy on cyber under one strategy (the 2023 Strategy, as argued by our Ambassador for Cyber Affairs and Critical Technology), why seemingly water down great Actions (eg one on confidence-building measures) from the 2021 CCTE Strategy to a few sentences or words?

Also, what happened to the ‘critical technology’ component of Shield 6, especially when the 2023 Strategy refers to Ambassador Brendan Dowling as ‘Australia’s Ambassador for Cyber Affairs and Critical Technology’ (emphasis added)?

Or is all of that critical technology stuff subsumed by the ‘emerging technology’ Actions 10.1 (‘Support safe and responsible AI’) and 10.2 (‘Prepare for a post-quantum world’)? Never mind all the other technologies, such as ‘protective cyber security technologies’, that the current government itself has designated as critical technologies in the national interest.

Will we see a specialised critical technology diplomacy strategy, in another payday for the consultants? Is the 2021 CCTE Strategy still alive? The hyperlink for ‘Cyber affairs’ in the ‘Global themes’ section of the Department of Foreign Affairs and Trade’s (‘DFAT’s’) website still points to the 2021 CCTE Strategy, so there is that.

By the way, is there a reason neither the 2023 Strategy nor the 2023 Action Plan announces extra funding for DFAT’s Cyber Affairs & Critical Technology Branch (‘CYB’)? The CYB is a small branch as it is (see Figure 3), with DFAT responsible for leading the implementation of five of the seven sub-Actions under Shield 6 (see Figure 4).

Figure 3: Excerpt from the DFAT organisational chart (effective from 6 November 2023).
Figure 4: Excerpts from the 2023 Action Plan.
Figure 5: Excerpt from the Budget Measures document from the 2023–24 Commonwealth Budget.

Note that the 2023–24 federal budget does not even mention our cyber and critical technology diplomacy, let alone provide additional funding (see, eg, Figure 5).

So, is the CYB already funded to, for instance:

  • revamp our Cyber and Critical Technology Cooperation Program and develop a new strategy for gender equality, disability and social inclusion (under Action 19 of the 2023 Action Plan, though see Actions 12–14 under the 2021 CCTE Strategy);
  • ‘develop a framework to identify when and how to deploy our limited resources [for incident response] across the region’ (under Action 19 of the 2023 Action Plan);
  • ‘support the establishment of a permanent UN Programme of Action to advance peace and security in cyberspace’ (under Action 20 of the 2023 Action Plan), with us all knowing how expensive UN diplomacy can be; and
  • review our attribution framework for cyber to ‘ensure it continues to be fit for purpose’ (under Action 20 of the 2023 Action Plan)?

The CYB has, presumably, already led from the Australian side the delivery of 64 projects across Southeast Asia and the Pacific, with a total value of $45.75 million, via the pre-existing Cyber and Critical Tech Cooperation Program (2016–2025).

Like ASIC, the CYB needs to be funded appropriately to pull off what the 2023 Strategy requires it to do and/or lead.

By the way, just on the regional cyber crisis response team to be established in DFAT (per Action 19.1), why not leave crisis response to be led by the ACSC, perhaps with DFAT and the ACSC parking a liaison officer with each other so the ACSC can swing into action, supported by DFAT? I am not sure about the logic of Action 19.1, bar turf gains for DFAT.

Summing Up

As I said up top, TL;DR = I didn’t expect much and my response is, ‘Barring a few things, sigh’.

There isn’t much new in here, but there are a few things for the government to write home about, such as the CIRB; the limited use obligation for ASD and the National Cyber Security Coordinator; the SCRM framework; and the mandatory standard for IoT security.

But I am especially disappointed by the ordering of the Shields; the lack of any intent to enforce corporate law to uplift cyber governance; the absence of a proposal for a bespoke regulatory framework for software security; and Shield 6 being nowhere near as comprehensive as our 2021 CCTE Strategy.

Anyhoo, see the Appendix for my specific takes on bits here and there, eg, on CTI sharing, CNI policy and cyber diplomacy.

And tell me what you think of the 2023 Strategy and my take on it.

Appendix

From the Minister’s Foreword

We are rallying our international network of cyber guns to help break the business model of ransomware and cybercrime.

Excuse me? And I really want to hear someone with a Yorkshire accent say that.

From the Strategic Context

Australia is not starting from scratch.

But the Minister has repeatedly said the Morrison government left us (about) ‘five years behind’ on cyber law and policy.

One of Australia’s core strengths is our robust legislative system… From regulation of critical infrastructure under the Security of Critical Infrastructure Act 2018 (the SOCI Act)…

Which the Minister described as ‘bloody useless’ just last year.

Shield 1

Disrupt and deter cyber threat actors from attacking Australia

Yes, we will deter cyber threat actors. Whatever that means.

As a next step, the Government will build a ransomware playbook. This playbook will provide clear guidance to businesses and citizens on how to prepare for, deal with, and bounce back from ransom demands.

2. Build playbooks for incident response

What are the ACSC’s Cyber Incident Response Plan and Business Continuity in a Box resources, then?

Shield 3

The Government will also support industry-industry threat sharing by investing in a Threat Sharing Acceleration Fund to support the development of sector-specific ISAC in Australia.

So the government is preferring a sectoral ISACs approach? Siloes for CTI?

Why not empower a centralised ISAC for all CNI folk like CI-ISAC Australia? (Full disclosure: I played a small role in the genesis of CI-ISAC Australia and one of its co-founders is a mentor of mine.)

Also, was the ‘Threat Sharing Acceleration Fund’ coined by Jim and Rhonda from Utopia?

To promote uptake of threat sharing, the Government will encourage and incentivise industry to participate in threat sharing — with a particular focus on organisations most capable of collecting and sharing threat information at scale, such as critical infrastructure.

What does this mean?

And again, why go for sectoral ISACs and not empowering CI-ISAC Australia to bring everyone in CNI under one roof?

Drawing on the work of the Steering Group, the Government will also seek to encourage and incentivise threat blocking across the economy by those most capable of doing so — including telecommunication providers and ISPs.

Love Clean(er) Pipes, but see para [62] (page 31) of the 2020 Strategy.

Shield 4

The Government will work with industry to move the security regulation of the telecommunications sector from the Telecommunications Sector Security Reforms (TSSR) in the Telecommunications Act 1997 to the SOCI Act.

If it’s relatively painless for telcos and the obligations are (amended to be) almost identical across both regimes, fair enough.

The Government will explore options to incorporate cyber security regulation into “all hazards” requirements for the aviation and maritime sectors.

Well, the government could repeal their sectoral regs and the Minister could designate their assets to fall under SOCI Act pt 2A?

To protect our Systems of National Significance, the Government will expedite implementation of the Systems of National Significance framework.

What was the government doing re SoNSes before now?!

These are the super-duper subset of Australian CNI assets, per the criteria for a Ministerial declaration under SOCI Act s 52B.

The Minister has, to date, declared 168 CNI assets to be SoNSes.

Note the $19.5 million for 2023–24 under the last Commonwealth Budget ‘to continue work to improve the security of critical infrastructure assets and assist owners and operators to respond to significant cyber-attacks’.

The Government will also scale up ASD’s Critical Infrastructure Uplift program.

This is wonderful. I’ve met the folks running CI-UP and they’re doing excellent work.

If only all CNI operators here stopped being apprehensive about ASD/ACSC being regulators (they’re not) and thus stopped hindering the agencies’ ability to help them under CI-UP, per conversations I’ve had.

As part of this framework, the Government will consult with industry on developing enhanced review and remedy powers, including the power to direct entities to uplift risk management plans if they are seriously deficient.

Adding a bit more teeth to the thrust of SOCI Act ss 30AE-30AF, to mix metaphors?

Shield 6

Australia will continue to drive development and implementation of high-quality digital trade rules…

Awkward — the Yanks pulled support for such rules at the WTO, cancelled digital trade negotiations under the IPEF and are loath to strike new bilateral FTAs.
