Friday 12 May 2017 was a Black Friday in the truest sense of the term: not a day of panicked bargain-hunting in discounted sales, but a day that witnessed a global ransomware attack now known as WannaCry. The attack was indiscriminate and, whilst one of the major victims was our NHS, it was certainly not targeted. The cyber-attack affected some 100 countries and in excess of 200,000 computers; the exact numbers and full extent will never be known. Perhaps more surprisingly, the cost to the NHS will also never be known: despite investigation by the Department of Health and a report from the National Audit Office, we are informed that the cost is not calculable, as much of the data on the full impact of the attack is seemingly lost or unavailable. If true, that alone points to shoddy systems at the NHS.
There were certainly shoddy systems in terms of IT and cyber security. For a start, infection by the WannaCry ransomware was entirely avoidable: every NHS organisation that was infected had unpatched or unsupported Windows operating systems that allowed the malware to take hold. Significantly, in March 2017 Microsoft had issued updates that NHS trusts running Windows 7 could have applied to protect themselves, and on 17 March 2017 NHS Digital had issued a CareCERT asking NHS trusts to apply the Microsoft update. If the Department of Health's figures are to be relied upon, more than 90% of NHS devices run Windows 7, so the great majority of devices would have been protected had they been patched in line with the NHS Digital request. Trusts still running the older Windows XP operating system had been expressly told to migrate away from it, yet when the attack came on 12 May 2017 approximately 5% of the NHS remained reliant on that outdated system. Even XP can be patched, however, and following the attack Microsoft issued an XP update that would have prevented the ransomware infection.
This non-targeted ransomware attack spread via the internet and caught the NHS, which was exposed by its unpatched Windows systems. Even that exposure need not have been fatal had effective firewalls been in place to repel the threat, but there was no such line of defence: firewalls had not been maintained, so even this basic shield was missing. Prior to this Black Friday, the NHS had no joined-up cyber security and a culture of woeful non-compliance. As at 12 May 2017, only 88 out of 236 trusts had been subject to a cyber security inspection by NHS Digital, and of the 88 inspected not a single trust passed. The inspections were voluntary, and CareCERTs requesting updates and other basic cyber security measures were likewise treated as voluntary and largely ignored. The NHS trusts operated as silos, and the Department of Health had no knowledge of which had complied with the requests. The Department of Health was itself unprepared: it had been warned a year before the attack that it was at risk, yet did not produce any written report in response until July 2017, two months after the attack.
So what happened after the initial breach?
Sadly the NHS had no proper breach response plan or, if it did, it did not have one worth having. History tells us that one of the key features of a cyber-attack is the communication blackout that follows. It was Maersk's lack of preparedness for this that caused such bewilderment, and the same was true of the NHS. The very first hurdle, the loss of key communication systems, had not been properly prepared for, and staff were left scrabbling for personal mobiles to send WhatsApp messages, provided the person they needed happened to be in their personal contact list. Roles, responsibilities and reporting lines were not properly defined, with the result that emergency calls were made to assorted local and national agencies and emergency services in the uncoordinated, disorganised panic that followed the attack.
It is arguably better to have no breach response plan at all than one that is merely a box-ticking exercise, leading, as here, to complacency and increased confusion when the attack hits. Incident response plans should be tested realistically: there need to be drills in which systems are unavailable, so that staff become familiar with whom to contact and how, and with a step-by-step means of limiting damage and restoring and recovering systems.
In conclusion, the NHS had a woefully inadequate breach response plan which arguably was not a plan at all, but rather an unpractised and ineffective hypothetical policy with which none of the key personnel were sufficiently familiar. The recovery was aided by a cyber security researcher who activated a kill switch; his action prevented WannaCry from locking out further systems and devices. That was luck or intuition rather than design, as it was not done in pursuit of any implemented national cyber security policy. NHS England's IT department did not even have on-call emergency facilities in place, so there was a reliance on IT staff attending work voluntarily to assist in firefighting. The National Cyber Security Centre and National Crime Agency also pitched in, assisting the NHS and other affected organisations; it is unclear just how much worse the lines of communication and the impact might have been but for that external assistance.
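The kill switch worked because WannaCry contained a hard-coded, unregistered domain: before encrypting, each new infection tried to contact that domain and only proceeded if the attempt failed, so registering the domain flipped the check for every subsequent infection worldwide. A minimal sketch of that inverted logic, for illustration only (the domain name and function names here are hypothetical stand-ins, not the malware's actual code):

```python
import socket

# Hypothetical stand-in; the real malware embedded a long gibberish
# domain that happened to be unregistered until the researcher bought it.
KILL_SWITCH_DOMAIN = "example-kill-switch-domain.invalid"

def domain_resolves(domain: str) -> bool:
    """Return True if the domain resolves in DNS (i.e. is registered)."""
    try:
        socket.gethostbyname(domain)
        return True
    except socket.gaierror:
        return False

def should_encrypt(kill_switch_reachable: bool) -> bool:
    """The inverted check: encrypt only while the kill-switch domain is
    unreachable; once it resolves, new infections go dormant."""
    return not kill_switch_reachable

# Before registration the lookup failed, so infections detonated;
# after registration the lookup succeeded, so they exited quietly.
```

The design is why a single domain registration could halt a global outbreak: the check ran on every newly infected machine, not on any central server.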
The disjointed structure of the NHS gives little cause for hope. The Department of Health has overall responsibility for cyber security, but this is delegated down to a myriad of trusts, GP practices and social care providers. History tells us that these organisations do not all march in step and have previously failed to heed warnings or requests. The NHS has now declared 'the need to improve the protection from future cyber-attacks', but how will it actually implement such a statement of intent when it comprises silos that are seemingly ungovernable? It has set out a number of key measures in response.
This all sounds rather trite. The NAO report found that a cyber-breach response plan had already been developed before the attack hit on 12 May. It was not the absence of a plan but the inability to put any plan into practice that lay at the heart of the failure, and that can only be taught through cyber drills that replicate the loss of communication and key system support.
There needs to be a scheme of regulation and a compliance regime with teeth, ensuring routine checks and sanctions for those who fail to adhere to CareCERTs. In terms of practical steps, the Department of Health should set a minimum number of drill targets, rather like fire drills, backed by mandatory inspections by NHS Digital or external inspectors. If an organisation fails an inspection, there should be immediate remedial action and a follow-up test. No doubt the NHS and its constituent parts will take cyber-attacks more seriously going forward, but deeds, not words, are required. On this evidence, it is hard to be convinced that this will happen.