The Human Nature of Cyber Security

A couple of months ago, I was invited to Seattle to discuss the human elements of cyber security for Microsoft Modern Workplace. We talked about topics like communication, fear, social engineering, how to engage with senior executives and the idea of people as the ‘weakest link’. It inspired me to pull together some of my thoughts and work on how I define the human nature of cyber security.

“If you engage in changing your culture, if you engage in empowering your staff… then people go from being the weakest link to the biggest part of defence” Dr Jessica Barker speaking to Microsoft

Bruce Schneier popularised the concept in 1999: cyber security is about people, process and technology. Yet almost two decades later, the industry still focuses far more on technology than on the other dimensions of our discipline. My consultancy work has always centred on the human nature of cyber security, and I’ve been talking publicly about this subject (at conferences and in the media) throughout the four years I’ve been running my business. In the last year or so, it’s been gratifying and encouraging to see industry and government really start to engage actively and vocally with the sociological and psychological dimensions of cyber security.

For a long time, when the industry has considered the human nature of cyber security, it has been within the context of a narrative that ‘humans are the weakest link’. My argument has long been that if that is the case, then it is our failing as an industry. Cyber security relies on engaging, educating and supporting people, and enabling them to create, share and store information safely. If people fail to conceive of the threats they face online, if they are unable to appreciate the value of the information they handle, and if they struggle to act on the advice we provide to keep their information more secure, then we are not doing our job. At the recent NCSC flagship conference in Liverpool, Emma W expressed it perfectly:

For security to work for people, we need to communicate with people clearly, which means speaking in a way they understand. This sounds straightforward, but we use terms in this industry that most people don’t get, including terms that most of us would probably not think of as technical, like two-factor authentication. There is even a disconnect between what we, in the industry, call our profession and what the general public calls it. We need communication skills to get people engaged with the subject, to empower them to behave more securely online and to gain senior support for our work. We know from psychology that the more clearly and concisely we communicate a message, the more likely people are to actually engage with it (known as the fluency heuristic).

I often hear that the solution to the ‘people problem’ is awareness-raising training. The logic behind this makes sense, but it lacks nuance. Awareness is not an end in itself: you want to raise awareness in order to change behaviours and contribute to a positive cyber security culture. My research, such as this work on passwords from a few years ago, suggests that awareness of cyber security is actually quite high, but that we’ve been less successful in changing behaviours.

Why have we had less success in changing behaviours? One reason, of course, is that we put too much responsibility for security onto people. This, in turn, leads to security fatigue, as reported by NIST last year.

An inevitable part of cyber security awareness-raising involves talking about threats, which essentially embodies a ‘fear appeal’ (a message that attempts to arouse fear in order to influence behaviour). Research on the sociology and psychology of fear teaches us that we have to be very careful using fear appeals. If we talk about scary things in the wrong way, it can backfire and lead to denial (“hackers wouldn’t want my data, so I don’t need to worry about security”) or avoidance (“the internet is the wild west, so I try not to use it”). I’ve talked in much more detail about the implications of the sociology and psychology of fear for cyber security elsewhere, such as here.

Why are these nuances so often overlooked when it comes to the design and delivery of cyber security awareness-raising training? Partly, it is because the people responsible for the training are usually experts in technology and security, but not in people, as this research from SANS Securing The Human shows (pdf link). Exacerbating this, how many organisations train their IT and information security teams in issues relating to sociology, psychology and communications? When it comes to awareness-raising, all of our focus is on training people about cyber security; we don’t train cyber security professionals about people. I spoke about this issue at Cybercon recently, and the NCSC picked up on it at their flagship event, quoting a comment I made about the need to train not just ‘users’ in technology, but also technologists in ‘users’:

‘People: The Strongest Link’ Emma W, Cyber UK, March 2017

Lacking this training, technical professionals are often unaware of the psychological power of so-called ‘hot states’. Cyber criminals, on the other hand, are well aware of the psychological irresistibility of temptation, curiosity, ego and greed, which is why so many social engineering attacks capitalise on the power of these triggers.

Without an understanding of why people do what they do, is it any surprise that when people click links in phishing emails, the technical team labels them ‘stupid’? To the IT or infosec team, when people who have been trained to be wary of suspicious-looking emails continue to click links in suspicious-looking emails, they are being illogical and stupid. The problem with this (other than it not being true and not being very nice, of course) is that ‘the user is stupid’ narrative only creates more disconnect between cyber security and the rest of the business. When we expect little of people, we get little back (the Golem Effect), and when we expect a lot, we get a lot (the Pygmalion Effect).

Another problem with the ‘people are stupid’ narrative is that it undermines people within our industry, too. There is a tendency, in our culture, to tear people down when they make a mistake or get something wrong. Not only does this contribute to a culture of imposter syndrome, it arguably also makes our organisations more insecure. Human vulnerabilities can lead to technical ones, as I discuss here. If we continue to lazily push the ‘people are stupid’ narrative, we remain a big part of the problem, not the solution.

By Dr Jessica Barker

NB: there are many, many other elements to the human nature of cyber security, of course. For example, I haven’t even begun to tackle the motivations of malicious cyber actors or issues around management and organisational learning here. I’ve also barely scratched the surface of the culture(s) of our industry or ethics and cyber security in this blog post. Nor have I commented on diversity, what that means and why it matters. I’ll save my thoughts on those topics, and more, for another day.