But do we actually understand what cybersecurity is? Let alone how it manifests into its many, often malicious, guises? And most importantly, what should we do about it? In this article we explore not only defining cybersecurity, but elements of the threat landscape, like specific emerging technologies, that will continue to change our experience of it.
Tackling the basics is therefore the first step; unpacking new and complex components is the second; and building a holistic view of cybersecurity is the final step. All of this is best encapsulated in the security operation centre (SOC).
The SOC is a large-organisation tool, hailed as the Holy Grail for protecting against an increasingly complex threat landscape. Yet there are still right and wrong ways to approach the SOC, so this piece ends with practical guidance for starting things on the correct path.
Defence, security and critical national infrastructure sectors are high-stakes environments, and consequently risk-averse. But arguably their protection against the ‘invisible’ enemy fails to reflect this.
Death by buzzword
Cybersecurity isn’t new. It has been around since the 1970s, when the first computer virus was discovered. Except the word ‘cyber’, in all its forms, was scarcely used at the time. Instead, ‘information security’ – abbreviated to ‘infosec’ – prevailed.
Infosec was tangible, self-explanatory and easy to grasp: have information? It needs security.
Meanwhile, cybersecurity, despite still being related to information security, eclipsed much of the conversation from the 90s onwards. This appeared to be a very conscious, vendor-led rebranding within the market.
Soon, cyber crept its way up agendas. It rapidly became a prefix for amorphous and disconcerting words such as ‘attack’, ‘crime’ and ‘criminal’. An endless stream of new viruses, software vulnerabilities and high-profile hacks, plus a cybersecurity industry that invests sizeably in enterprise and consumer-facing advertising, all contributed to this seismic shift in focus. The global cybersecurity market is expected to be worth $352.25 billion, with an annual growth rate of 14.5%, by 2026 (Mordor Intelligence, 2020).
Despite the fact that infosec and cybersecurity are inextricably linked, it is fair to say that ‘cyber’ has had the better PR run. But, as is often the case with buzzword bingo, sound understanding and applicability did not follow, paving the way for a degree of inertia and noise around the topic, from nation states down to the individual.
According to an analysis conducted by CyberSec, more than 50% of companies examined were reluctant to invest (or further invest) in cybersecurity; however, after the business case jargon was changed and a risk-based approach to cybersecurity was adopted, 100% of the same companies opted to make an investment.
Cybersecurity jargon (there is even a glossary of terms here) is clearly confusing companies. Although every word has unique relevance, it can be extremely overwhelming to organisational leadership teams and decision makers. A typical response to this kind of haziness is to translate into layman’s terms, but the criticality may be lost and the eventual outcome may be the same: a lack of clarity and awareness.
It’s simpler than we think
According to the UK’s National Cyber Security Centre (NCSC), cybersecurity’s ‘core function is to protect the devices we all use…and the services we access – both online and at work – from theft or damage. It’s also about preventing unauthorised access to vast amounts of personal information we store on these devices, and online’.
The world’s first reported hack dates back to 1878, two years after the telephone had been invented by Alexander Graham Bell. The Bell Telephone Company was forced to fire a group of teenage boys for repeatedly and intentionally misdirecting and disconnecting customer calls. From then on, the company chose to only employ female operators.
The fundamental manifestations of the cybersecurity threat have largely remained unchanged ever since, and can be pooled into three categories: theft of IP, theft of money, or just causing harm and damage.
The perpetrators of such tactics can vary, from disgruntled employees or activists through to malicious actors trying to wreak havoc and steal from their targets. But ultimately, malicious actors today press on with the same motivations as those from hundreds of years ago, just via different techniques. Even taking this three-pronged perspective can already help dismantle some of the overwhelming ambiguity.
The biggest change: the world around us
With the consistency of the attack strategies, arguably the most fundamental shifts stem from the world and cultures around us.
Further buzzword mania comes into play when talking about the so-called ‘Digital Age’; ‘5G’, ‘the cloud’ and ‘AI’ all contribute further noise, and are sometimes misused or misunderstood in the context in which they are discussed.
It is indeed true that the pace of change today is immensely fast; where technology development used to proceed in ‘waterfall’ stages – with a period of downtime between each stage – it is now exponential. There is no opportunity for any organisation to get ‘on top’ of the risk before the landscape has evolved further and technology has iterated once again. Yet a common misconception is that rapid digitisation has forged its way in, replacing many physical elements of our infrastructure and consequently diminishing physical risks too.
But for many defence, security and critical infrastructure sectors, the physical infrastructures and risks remain. For instance, aside from disruptor banks like Monzo and Starling, most banks still have branches and physical assets of high value to safeguard. There is still plenty of physical ownership of assets and goods, and that needs securing too.
Moreover, if the focus of protecting an organisation becomes solely digital, opportunistic attackers use this to their advantage and go back to the basics, damaging and stealing from physical infrastructures. Therefore, there needs to be an ‘all hands on deck’ approach to cybersecurity – viewing any gaps as vulnerabilities.
There are two emerging technologies today that confront the cybersecurity question in particularly challenging ways, contributing to the dramatic change in landscape that will only gather momentum in the future.
5G
“The future of wireless technology holds the promise of total connectivity. But it will also be especially susceptible to cyberattacks and surveillance.”
This is the opening of The New Yorker’s review of the “terrifying potential” of 5G.
The article cites estimates that “5G will pump $12 trillion into the global economy by 2035, and add 22 million new jobs in the United States alone,” while ushering in “a fourth industrial revolution.”
The key difference between 5G and our current wireless technology is sheer speed and latency: 5G supposedly offers up to a hundred times faster speeds, which will reduce, and potentially eliminate, latency (delay) altogether. This affords opportunities for a new iteration of the Internet of Things (IoT), in which remote-control surgery could occur, and autonomous vehicles could drive safely and become the norm. 5G is essentially the conversion to a mostly all-software network, meaning a complete recalibration of the ecosystem of devices we have come to rely upon. Software comes hand-in-hand with cyber vulnerabilities, so such retooling will perhaps be the toughest part.
For example, the Brookings Institution, a non-profit public policy organisation, has identified five ways in which 5G networks are more vulnerable to cyberattacks than their predecessors in a report titled: “Why 5G Requires New Approaches to Cybersecurity.”
It is not unfounded to imagine lives at stake, either directly during remote surgeries, or indirectly, as we begin to increasingly rely upon these connected devices that are vulnerable.
Industry watchdogs have confirmed that 5G has the potential to worsen existing threats while concurrently introducing new ones. The improved speed and latency, compounded by expansive threat vectors, provide fertile ground for attacks, alongside this world of opportunity.
During a roundtable hosted by Brookings, one of the experts in 5G cybersecurity laid bare his ominous opinion: “It is an exposure that is exacerbated by a cyber cold war simmering below the surface of consumer consciousness.”
Software-defined networks
Software-defined networking (SDN) is an architecture that aims to improve network control and flexibility, allowing users to design, build and manage networks. It does so by separating the control plane from the forwarding plane.
According to Cisco: “Implemented in software, these controllers maintain a coherent view of the network domain. To applications and policy engines, SDN looks like a single logical switch.”
But to Mike Capuano, Chief Marketing Officer at Pluribus, it is much more than that: “Typically, network routers and switches only know about their neighbouring network gear. But with a properly configured SDN environment, that central entity can control everything, from easily changing policies to simplifying configuration and automation across the enterprise.”
A variety of networking trends have played into the central idea of SDN. Distributed computing at remote sites, edge and cloud computing, and IoT – each of these is enabled by, and made more cost-efficient through, a properly configured SDN environment.
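To make the plane separation concrete, here is a highly simplified sketch in Python. It is illustrative only: the class and method names (`Controller`, `Switch`, `install_rule`, `resolve`) are hypothetical and do not correspond to any real controller API such as OpenFlow – the point is simply that the switch forwards while all decisions live centrally.

```python
class Controller:
    """Central control plane: holds a coherent, network-wide policy view."""

    def __init__(self):
        self.policy = {}  # (switch_id, destination) -> output port

    def install_rule(self, switch_id, dst, out_port):
        self.policy[(switch_id, dst)] = out_port

    def resolve(self, switch_id, dst):
        # On a flow-table miss, the switch asks the controller for a decision.
        return self.policy.get((switch_id, dst), "drop")


class Switch:
    """Forwarding plane: only forwards; all decisions come from the controller."""

    def __init__(self, switch_id, controller):
        self.switch_id = switch_id
        self.controller = controller
        self.flow_table = {}  # destination -> output port (cached decisions)

    def forward(self, dst):
        if dst not in self.flow_table:  # table miss -> consult the control plane
            self.flow_table[dst] = self.controller.resolve(self.switch_id, dst)
        return self.flow_table[dst]


ctrl = Controller()
ctrl.install_rule("s1", "10.0.0.2", out_port=3)
s1 = Switch("s1", ctrl)
print(s1.forward("10.0.0.2"))  # 3 (rule installed centrally, cached locally)
print(s1.forward("10.9.9.9"))  # drop (no policy for this destination)
```

This also hints at why Capuano’s “central entity” is powerful: changing `ctrl.policy` instantly changes behaviour everywhere, without touching individual switches – and why that same centralisation becomes an attractive attack target.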
The global SDN market size is expected to grow from $13.7 billion in 2020 to $32.7 billion by 2025 at a Compound Annual Growth Rate (CAGR) of 19.0% during the forecast period (PR Newswire, 2020).
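The arithmetic behind that projection can be sanity-checked with a standard compound-growth calculation:

```python
# Projected SDN market: $13.7bn in 2020 growing at a 19.0% CAGR over 5 years.
start_bn, cagr, years = 13.7, 0.19, 5
projected_bn = start_bn * (1 + cagr) ** years
print(f"Projected 2025 market size: ${projected_bn:.1f} billion")  # ≈ $32.7 billion
```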
The emergence of the software-defined network has certainly overcome the previous hurdle of managing multiple networks at once while ensuring security, credibility and flexibility. One segment could be public-facing and low-security, carrying nothing sensitive; another could impose much stricter access requirements, with software-based firewall and encryption policies that allow the transfer of sensitive data.
However, due to the separation of the two planes, SDN is vulnerable to more attack vectors than traditional networks. Research has separated the potential attacks on SDN into three camps: application layer attacks, control layer attacks and infrastructure layer attacks. At the infrastructure layer, for example, ‘man-in-the-middle’ attacks are possible: because the switches and controllers are not directly connected for the transfer of information, a ‘man-in-the-middle’ can intercept important information without being detected, resulting in eavesdropping or black hole attacks. At the control layer, ‘denial of service’ attacks can shut down the network, making it inaccessible to its intended users; these can occur at the channel, at the controllers, or between the controller and the switches.
The COVID effect
Compounding the challenges brought about by some of the emerging technologies, the COVID-19 pandemic is yet another factor.
A great positive of the pandemic has been the recalibration and reflection period. Organisations have realised that they may not need as many physical facilities as were once deemed necessary.
However, with the move towards working from home come obvious vulnerabilities and the herculean task of securing the home-working environment. For a start, there was a dramatic surge in phishing attacks, video-call hacking and spam communications.
The Internet Crime Complaint Centre at America’s Federal Bureau of Investigation (FBI) reports that by June, daily digital crime had risen by 75% since the start of stay-at-home restrictions, and that the number of complaints received in 2020 had all but surpassed the total for 2019. This is largely due to activities moving online under distancing and stay-at-home rules (Economist, 2020).
By June 30th, the US Federal Trade Commission had received almost 140,000 reports since the start of the year, already nearly as many as in the whole of 2019. It had also received more than 570,000 reports of identity theft, again almost as many as in all of 2019.
So, bearing all of this in mind, where can we even begin to have that holistic view of the cybersecurity threats around us?
A security operation centre (SOC) provides the key to unlocking this perspective. Nonetheless, there have been many occasions on which its purpose has been misidentified and its value ultimately negated.
Security operation centres: getting them right
A SOC is a fundamental pathway towards protecting what matters most and adapting to the pace of today’s landscape. It provides a dedicated resource for monitoring, analysing and protecting an organisation from cyber-attacks, whether from malicious or negligent behaviour, and whether from internal or external sources. You can read more about their purpose and value here.
Up until recently, there had been a mismatch – a recognition that SOCs are a great asset, but a lack of knowledge and skill to correctly implement and develop one. Previously, SOCs were born out of the IT functions of a business. But this often culminated in the SOC delivering administrative tasks and building an IT ticket log, owing to the areas of expertise of its creators. Alternatively, a security team might set one up to approve tasks that could be automated, assuming the ‘approver’ role, only to end up becoming the scapegoat when tasks fail.
Now, our workforce and training tools are matching the needs of tackling cybersecurity. For the first time, the specialised skillset required for SOCs is being recognised; there are now a handful of certification schemes, and SOC consultancies are growing in number, giving the sector a professionalism that had never previously been acknowledged.
Recommendations: start small, work up
A viable route for getting started with a SOC is to adopt a risk-based cyber security approach. This means that when it comes to making cyber security-related decisions, risk is front of mind, instantly flipping the approach to proactive rather than reactive.
Beginning with the small, core risks is important for gauging who has access to what and, fundamentally, what is most crucial to protect. This could be as simple as understanding where the ‘vault’ in the bank would be, to shield the most valued resources, and then creating a security benchmark – not just having the vault, but making sure it is locked every time. A SOC is built upon this detective and preventative slant. Again, this approach becomes about measurable exposure to cyber-attacks rather than hunting for a silver bullet. The goal is not 100% security; it is meaningful risk reduction, allowing for pragmatic decisions on resource allocation.
The best method might be to outsource – either a permanent hire or a contractor – bringing a fresh pair of eyes and a purely independent perspective to map out the threat model. Sometimes the hardest part is allocating budget for this, but with new data protection provisions worldwide, there is solid reasoning behind such investment.
These regulations allow for issuing a monetary penalty for data breaches, defined by the Information Commissioner’s Office (ICO) as a “breach of security leading to the accidental or unlawful destruction, loss, alteration, unauthorised disclosure of, or access to, personal data. This means that a breach is more than just losing personal data”. The UK GDPR, for instance, carries a higher maximum penalty of £17.5 million or 4% of total annual worldwide turnover in the preceding financial year, whichever is higher. The stakes are obviously huge.
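As a rough illustration of how that “whichever is higher” cap works in practice (the figures are the UK GDPR higher maximum; the function name is ours, not from any regulation or library):

```python
def max_penalty_gbp(annual_turnover_gbp: float) -> float:
    """Higher maximum penalty under the UK GDPR: the greater of
    £17.5 million or 4% of total annual worldwide turnover."""
    return max(17_500_000, 0.04 * annual_turnover_gbp)


# For a bank with £2bn turnover, 4% dominates the fixed floor.
print(f"£{max_penalty_gbp(2_000_000_000):,.0f}")  # £80,000,000
# For a smaller firm with £100m turnover, the £17.5m floor applies.
print(f"£{max_penalty_gbp(100_000_000):,.0f}")    # £17,500,000
```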
With the sheer pace of technology advancements and the new working-from-home environments, SOCs are becoming more vital than ever. The greatly expanded and ever-growing threat vectors create endless possibilities – for incredible but also horrible situations – faster than you can say ‘5G’.