The recent ransomware attacks have focused a lot of minds on cyber security; however, many of the solutions being proposed are little more than sticking plasters over the larger underlying issue: systems are not secure by default. The ‘trend’ in software has been to launch it, then fix it. This is a very attractive proposition for businesses, as it lets them discover which ideas work and which don’t, and then iteratively improve them. Most of the gadgets we use in our lives today would not exist without this mentality. However, the dark side of this approach is that almost all software is not secure. The evidence shows that pretty much every system deployed has security flaws; the only question is who finds the flaws first – bad people or good people.
This situation is not going to be viable in the long term. Technology is becoming an ever larger part of our lives, and you cannot have an ecosystem of software collapsing every few months or years because someone has found a weakness. In 10-15 years all our transport, logistics, energy, and entertainment will depend on such systems; an attack could literally kill millions and send us back to the dark ages. The UK emergency committee (amusingly named Cobra) has a saying: “we’re 9 meals from anarchy“. A few years ago in the UK we had a ‘petrol’ strike, where the tanker drivers refused to deliver fuel; this very nearly broke the entire food supply chain in a few short days.
In IT this is solvable, but the IT industry needs to stand up and take some responsibility for the mess we’ve created. Our obsession with innovation and speed now comes at too high a price; if something doesn’t change, it will kill people. The sticking-plaster approach to security becomes more complex and less effective over time, and this will all end in tears if action is not taken. It may take government regulation to force it – hopefully not, as regulations are blunt tools – but action must be taken. Therefore I’d like to propose a manifesto (having just come out of election season in the UK) that we take on board as an industry…
We will not ship software that does not have:
- Updatability – All software must be able to be securely updated. We know we make mistakes; we know we have to be able to fix them.
- Integrity – All software should have integrity built in. If it’s been messed with, it should not just blindly charge ahead doing what the attacker wants.
- Security outcomes, not security features – a tick list of security features is not security. Focus on the outcome of a secure system, not on whether you’ve got all the ‘usual’ check boxes. This usually means you’ve done a threat analysis and have a plan for at least the known threats.
- Logging/Telemetry – If you’re being attacked, tell someone – if no-one knows, no-one can respond.
- A diverse ecosystem – if we all use the same tools, libraries, and suppliers, one attacker can take us all out. We need diversity.
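To make the first two points concrete, here’s a minimal sketch of what “securely updated” plus “integrity built in” might look like. This is an illustrative stdlib example, not any real updater’s API: real update systems use public-key signatures (e.g. Ed25519) rather than a bare digest, and `verify_update` is a hypothetical name.

```python
import hashlib
import hmac

def verify_update(payload: bytes, expected_digest: str) -> bool:
    """Refuse an update whose SHA-256 digest doesn't match the one the
    vendor published (Integrity): a patch tampered with in transit is
    rejected rather than blindly installed."""
    actual = hashlib.sha256(payload).hexdigest()
    # compare_digest does a constant-time comparison of the two digests
    return hmac.compare_digest(actual, expected_digest)

update = b"patched binary"
published = hashlib.sha256(update).hexdigest()

assert verify_update(update, published)           # genuine update applies
assert not verify_update(b"tampered", published)  # altered update refused
```

The point is not the specific mechanism but the posture: the update path exists, and it refuses anything it cannot verify.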
Software written in compliance with this manifesto would still be vulnerable to attacks; that is never going to change. However, the outcome of such an attack would be different. Let’s analyse the WannaCry ransomware attack and what could have been different…
WannaCry started by infecting a small number of PCs and then spreading.
- The first PC it infected should have noticed that something had changed (Integrity)
- The PC should have reported it – possibly to its users, local admins, or Microsoft (Logging/Telemetry).
- That would have allowed Microsoft to notice, warn people, and issue patches (Updatability)
- Users would have been able to take action (patch or turn the computer off)
- The computer itself could take action – preventing its integrity from being further compromised
- If these organizations had a diverse range of technology (say some Macs or Linux machines), they’d not have found themselves totally stuck when their Windows PCs got infected. This is not to say Macs or Linux are perfect for security, but the diversity itself is a benefit.
- Many of these computers would have had anti-virus software installed, but it didn’t work (outcomes not features). Anti-virus software, although useful in some cases, is far from a panacea and gives a false sense of security. If the people running these computers had focused on the outcome, as opposed to just ticking boxes, they might have solved this another way (e.g. disconnecting them from the internet)
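The “notice something changed, then tell someone” steps above amount to keeping a known-good baseline and comparing against it. Here is a minimal sketch of that idea, assuming stdlib Python only; the function names are illustrative, and in practice the change report would be shipped to a log pipeline or alerting system rather than returned from a function.

```python
import hashlib
from pathlib import Path

def snapshot(paths):
    """Record a SHA-256 baseline for a set of files (the known-good state)."""
    return {str(p): hashlib.sha256(Path(p).read_bytes()).hexdigest()
            for p in paths}

def detect_changes(baseline):
    """Return files that changed or vanished since the baseline (Integrity).
    Anything reported here is the 'tell someone' moment: users, local
    admins, or the vendor (Logging/Telemetry)."""
    changed = []
    for path, digest in baseline.items():
        p = Path(path)
        if (not p.exists()
                or hashlib.sha256(p.read_bytes()).hexdigest() != digest):
            changed.append(path)
    return changed
```

A monitoring job would run `detect_changes` periodically and raise an alert on any non-empty result, giving users and admins the chance to patch or pull the plug.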
The end result in a compliant ecosystem is that such an attack would be a minor annoyance, not a disaster. It’s simply a matter of building in some resilience.
If we take another (unfortunately common) example – a website with a security blunder that allows attackers to gain access to a server – these attacks often result in data loss, or worse, yet with these principles in place there are many points at which they can be stopped.
- The web application’s process suffers a code injection and runs the attacker’s code – it should notice (integrity) and report it (logging/telemetry).
- Then the first server gets infected and the attacker starts adding tools to gain remote access – it should notice (integrity) and report it (logging/telemetry).
- The system admin should notice, take action, and update the app or disable the server to prevent spreading. If the application is built with diversity, the same attack should not easily execute on other servers, thereby preventing rapid spread.
- The hacked server should ensure the credentials it uses to access other servers (databases etc.) are locked to its application code (integrity) – preventing the attacker from pivoting onwards.
- Network security should not just rely on the server being trusted, but should insist on applications authenticating themselves (integrity).
- The system admins should have reviewed the security model and made sure attackers can’t pivot (outcomes not features) and that all the expected attack trees are covered.
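The idea of applications authenticating themselves, rather than being trusted simply because of where they run, can be sketched with a per-application key that signs each request. This is a hedged illustration only: `APP_KEY` and the function names are hypothetical, and a real deployment would use mutual TLS or signed tokens from a secrets manager rather than a raw shared key in code.

```python
import hashlib
import hmac

# Hypothetical per-application credential; in a real system this would
# come from a secrets manager and be rotated, or be replaced entirely
# by mutual TLS / signed tokens.
APP_KEY = b"per-application secret"

def sign_request(body: bytes, key: bytes = APP_KEY) -> str:
    """The application signs every request it makes, so a database or
    peer service can check *which application* is calling, not merely
    which host the traffic came from (integrity)."""
    return hmac.new(key, body, hashlib.sha256).hexdigest()

def verify_request(body: bytes, tag: str, key: bytes = APP_KEY) -> bool:
    """A compromised server identity is useless to an attacker who
    doesn't also hold the application's key."""
    return hmac.compare_digest(sign_request(body, key), tag)
```

Binding credentials to the application code in this way is what stops the “pivot onwards” step: owning the box is no longer enough to impersonate the app.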
We are not going to be able to remove security as a problem from IT systems; they are now sufficiently complex that we can’t fix it outright. However, we can design a diverse ecosystem of software with integrity built in, which logs issues, allows updates, and is designed to handle the threats it’s likely to encounter. There is a case that some standards are needed. Poorly designed software is analogous to the overuse of antibiotics: everyone who does it has little motivation to fix it, but as a society we are all hurt by this behavior, and over time it’s going to get really serious for us all.