“Our investment strategy was simple. People hate to think about bad things happening, so they always underestimate their likelihood.” – Charlie Geller, The Big Short.
The above quote was one of many haunting lines from the now-classic movie The Big Short, which covered the period leading up to one of the most serious financial collapses in recent memory. Looking back now, a decade on from the financial crisis, the signs of a credit bubble seem obvious, yet very few (and now very rich) people saw it coming.
For the majority – including some of those most exposed – the risks seemed remote; mortgage credit was plentiful and home buyers were bullish. When debt obligations turned to defaults, however, the consequences were devastating, taking down some of the world’s most illustrious financial institutions, obliterating many pension and investment funds, and requiring prompt government intervention to prevent further repercussions.
In the world of cyber security, risk tends to be viewed subjectively. A global insurance company faces the risk of having a database of customer payment details stolen, a health provider faces the risk of a ransomware attack that holds its customer records to ransom, or an online retailer faces the risk of a DDoS attack that takes down its store front. These risks are typically well understood, and solutions exist to take preventative or restorative action. But there’s a greater macro risk, one to which the cyber security industry is largely blind, yet which may have catastrophic consequences if it is realised.
Understanding technical debt
To explain this macro risk, I must first cover the concept of technical debt. At first the concept may seem peculiar – we understand debt in monetary terms, where one party has an obligation to pay another at a future date. In software development, however, the term ‘technical debt’ denotes the implied cost of reworking code at a later date. When a dev team ships software too soon, or takes shortcuts to reach its desired outcome, this debt is the time that will be required to update the code and fix bugs in future.
While this isn’t problematic in itself, it can lead to significant risks if the debt is not repaid and the issues become ‘too costly’ (i.e. too resource-intensive) to fix later in the process. From a user perspective, these debt repayments can be seen in software updates. Version 1.0 of the software contains not only the functionality of the code, but an implicit ‘debt’: a promise that any bugs or other imperfections will be ironed out in v1.1. If a bug isn’t addressed until v1.7, or even v3.6.1, it may be so interwoven into the program that it’s too arduous for any developer to resolve. We can think of this as the ‘debt’ accruing ‘interest’.
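The ‘interest’ analogy can be made concrete with a deliberately simple toy model. The figures below are illustrative assumptions, not real estimates: suppose the effort needed to fix a bug grows by a fixed percentage with every release it survives, so the cost compounds like interest on a loan.

```python
# Illustrative toy model of technical debt accruing 'interest'.
# The growth rate and costs are hypothetical assumptions for demonstration.

def fix_cost(initial_cost: float, growth_rate: float, releases_deferred: int) -> float:
    """Cost of a fix after deferring it, assuming compound growth per release."""
    return initial_cost * (1 + growth_rate) ** releases_deferred

# A bug that would take 2 engineer-days to fix immediately, with an
# assumed 25% cost increase for each release it survives:
for n in (0, 3, 6, 10):
    print(f"deferred {n:>2} releases -> {fix_cost(2, 0.25, n):5.1f} engineer-days")
```

Under these assumed numbers, a two-day fix deferred for ten releases balloons to roughly eighteen engineer-days – the same dynamic, at industry scale, is the bubble this post describes.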
Our own impending debt crisis
Many years ago, the renowned VC Marc Andreessen made the prescient observation that ‘software is eating the world’. In the time since, this phenomenon has only become more obvious as ‘softwarisation’ pervades every industry. For example, a single iPhone app requires around 50,000 lines of code, while Google’s entire codebase amounts to around two billion lines across all its services. Software is so ubiquitous across our professional and personal lives that defective code can create security flaws too broad and complex to be fixed after the fact.
Not only are these flaws hard to fix, they are also subject to what’s known as ‘contagion risk’. This concept was popularised in the financial crisis and embodies the idea that when one market ‘crashes’ it has significant knock-on effects for interconnected markets, creating an exponential, systemic risk. The same applies to security debt: code is shared across multiple businesses and networks, so a serious security flaw can rapidly reach epidemic proportions.
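To see why contagion grows exponentially rather than linearly, consider a minimal branching sketch. The numbers here are assumptions for illustration only: suppose each compromised network exposes, on average, R further organisations that share the same vulnerable component.

```python
# Naive branching model of contagion risk across organisations that
# share vulnerable code. All parameters are hypothetical.

def contagion(initial: int, reproduction_rate: float, steps: int, population: int) -> list:
    """Cumulative compromised count per step, capped at the total population."""
    compromised = initial
    history = [compromised]
    for _ in range(steps):
        new_cases = compromised * reproduction_rate
        compromised = min(population, int(compromised + new_cases))
        history.append(compromised)
    return history

# One flaw, each victim exposing two more organisations per step,
# across an assumed pool of 10,000 interconnected businesses:
print(contagion(1, 2.0, 10, 10_000))
```

In this toy scenario a single compromise saturates the entire pool within ten steps – the geometric growth that made Mirai, WannaCry and NotPetya so hard to contain once loose.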
It’s this unaddressed technical debt that we should be wary of. Coding flaws can quickly become opportunities for exploitation, and as we add more connected devices and endpoints to the equation, the attack vectors broaden and the risks compound. At the extremes, it could take just a few lines of bad code to turn entire network environments toxic. Mirai, WannaCry and NotPetya are just a few recent examples of how attacks on invisible flaws can escalate to devastating effect.
Anticipating black swans
In his seminal 2007 book The Black Swan, author Nassim Taleb introduced the concept of ‘black swans’: unanticipated events with massive consequences, the risks of which are only fully appreciated in hindsight. His argument is that such events (like a large-scale cyber attack) cannot be adequately predicted, but steps can be taken to ensure the system (in this case the cyber security landscape) is sufficiently robust to withstand their impact.
To explain this concept further, Taleb uses the metaphor of a turkey at Thanksgiving: what is a ‘black swan’ surprise for the turkey (i.e. being the highlight of the dinner menu) is no surprise at all for the butcher. He implores us to be ‘the butcher, not the turkey’ – to avoid being lulled into a false sense of security and instead strengthen our ability to act when faced with unexpected risks.
The same concept applies to us in information security today. We know that within the ever-expanding mass of software that exists, the probability of flaws that can be exploited at scale is real. We therefore have a choice: become more robust to the risks, or face dealing with their consequences. Being on the more desirable side of this dilemma requires us firstly to understand the nature of the macro risk, but more deeply to recognise the range of stakeholders needed to address and minimise it. Just as the financial crisis had implications far beyond the mortgage sector, a similar crisis in security would go beyond a single software firm or IoT device manufacturer.
Building robustness into the system requires the support and collective action of government authorities, other businesses, and even other departments within our own organisations. To bring this broader range of stakeholders together, we need a common language and way of understanding risk that can help galvanise the response – in essence, lose the jargon and bring everyone into the conversation.
Letting the air out of the bubble
In this blog we’ve discussed the risks of technical debt and why the interest accumulating on software could become a multi-billion dollar ‘bubble’ that, if it bursts, the IT industry alone will be unable to contain.
To deflate this bubble, it’s incumbent on industry leaders to learn from the experience of the financial services industry and take a proactive approach, rather than relying on authorities to ‘bail out’ the industry after a major security crisis. Achieving this requires a commitment to clear, simple language that can bring more stakeholders around the table for a multilateral, coordinated and robust response.