Perception is Truth: Trust and Reputation in Cyber Security

Perceptions of trust

Research recently published by the Ponemon Institute has found that consumer trust in certain industries may be misplaced:

  • 68% of consumers say they trust healthcare providers to preserve their privacy and to protect personal information. In contrast, only 26% of consumers trust credit card companies
  • Yet, healthcare organisations account for 34% of all data breaches while banking, credit and financial organisations account for only 4.8%. Banking, credit and financial industries also spend two-to-three times more on cyber security than healthcare organisations

It is worth noting that the research was conducted before WannaCry hit the NHS: it would be interesting to see whether the perception of trustworthiness in healthcare providers has been affected by that high-profile incident.

Why do some sectors have a much stronger image of trustworthiness when it comes to cyber security, even when this contradicts reality? Is this consumer wishful thinking, media reporting bias or something else? Should organisations that are not rated highly for trustworthiness be more vocal about the efforts they undertake to be more secure?

Is brand protection part of your job?

The survey, “The Impact of Data Breaches on Reputation & Share Value: A Study of Marketers, IT Practitioners and Consumers in the United Kingdom”, sponsored by Centrify, also found that IT practitioners do not believe that brand protection is their responsibility:

  • 71% of IT respondents in the research do not believe that protecting their company’s brand is their responsibility
  • However, 43% of these respondents do believe a material cybersecurity incident or data breach would diminish the brand value of their company

The research surveyed three groups (IT practitioners, Chief Marketing Officers (CMOs) and consumers) to ascertain their perspectives on data breaches.

Perhaps unsurprisingly, CMOs allocate more money in their budgets to brand protection than IT:

  • 42% of CMOs surveyed say a portion of their marketing and communications budget is allocated to brand preservation and 60% of these respondents say their department collaborates with other functions in maintaining company brand
  • Only 18% of IT practitioners say they allocate a portion of the IT security budget to brand preservation and only 18% collaborate with other functions on brand protection

Should IT practitioners be more proactive in pursuing brand preservation? Or, is it the responsibility of organisations to encourage their IT departments to be more engaged in reputational protection?

Impact blind spots

A decline in stock price seems to be a blind spot for both CMOs and IT practitioners. Reputation loss due to a data breach is one of the biggest concerns for both groups, and yet:

  • Only 23% of CMOs and 3% of IT practitioners say they would be concerned about a decline in their companies’ stock price
  • In organisations that had a data breach, only 5% of CMOs and 6% of IT professionals say a negative consequence of the breach was a decline in their companies’ stock price

IT practitioners and CMOs share the same concern about the loss of reputation as the biggest impact after a breach, but after that, the concerns are specific to their function.

For CMOs, the top three concerns about a data breach are:

  • Loss of reputation (67% of respondents)
  • Decline in revenues (53% of respondents)
  • Loss of customers (46% of respondents)

For IT, the biggest concerns are:

  • The loss of their jobs (63% of respondents)
  • Loss of reputation (43% of respondents)
  • Time to recover decreases productivity (41% of respondents)

What are the implications for an organisation’s cyber security when the marketing department is so externally focused and the IT department so internally focused?

By Dr Jessica Barker




The Human Nature of Cyber Security

A couple of months ago, I was invited to Seattle to discuss the human elements of cyber security for Microsoft Modern Workplace. We talked about topics like communication, fear, social engineering, how to engage with senior executives and the idea of people as the ‘weakest link’. It inspired me to pull together some of my thoughts and work regarding how I define the human nature of cyber security.


“If you engage in changing your culture, if you engage in empowering your staff… then people go from being the weakest link to the biggest part of defence” Dr Jessica Barker speaking to Microsoft


Bruce Schneier popularised the concept in 1999: cyber security is about people, process and technology. Yet almost two decades later, the industry still focuses far more on technology than on the other dimensions of our discipline. My consultancy work has always centred on the human nature of cyber security and I’ve been talking publicly (at conferences and in the media) about this subject for the four years I’ve been running my business. In the last year or so, it’s been gratifying and encouraging to see industry and the government really start to engage, actively and vocally, with the sociological and psychological dimensions of cyber security.

For a long time, when the industry has considered the human nature of cyber security, it has been within the context of a narrative that ‘humans are the weakest link’. My argument has long been that if that is the case, then that is our failing as an industry. Cyber security relies on engaging, educating and supporting people, and enabling them to create, share and store information safely. If people are failing to conceive of the threats online, if they are unable to appreciate the value of the information they are handling, and if they are struggling to enact the advice we provide to keep their information more secure, then we are not doing our job. At the recent NCSC flagship conference in Liverpool, Emma W expressed it perfectly: 



For security to work for people, we need to communicate with people clearly, which means speaking in a way people understand. This sounds straightforward, but we use terms in this industry which most people don’t get, including terms that most of us would probably not think of as technical, like two-factor authentication. There is even a disconnect between what we, in the industry, call our profession and what the general public call it. We need communication skills to get people engaged with the subject, to empower them to behave more securely online and to win senior support for our work. We know from psychology that the more clearly and concisely we can communicate a message, the more likely people are to actually engage with it (known as the fluency heuristic).

I often hear that the solution to the ‘people problem’ is awareness-raising training. The logic behind this makes sense, but lacks nuance. Awareness is not an end in itself: you want to raise awareness to change behaviours and contribute to a positive cyber security culture. My research, such as this on passwords from a few years ago, suggests that awareness of cyber security is actually quite high, but we’ve been less successful in changing behaviours.

Why have we had less success in changing behaviours? One reason, of course, is that we put too much responsibility for security onto people. This, in turn, leads to security fatigue, as reported by NIST last year.



An inevitable part of cyber security awareness-raising involves talking about threats, which essentially embodies a ‘fear appeal’ (a message that attempts to arouse fear in order to influence behaviour). Research on the sociology and psychology of fear teaches us that we have to be very careful using fear appeals. If we talk about scary things in the wrong way, this can backfire and lead to denial (“hackers wouldn’t want my data so I don’t need to worry about security”) or avoidance (“the internet is the wild west so I try not to use it”). I’ve talked in much more detail about the implications of the sociology and psychology of fear on cyber security elsewhere, such as here.

Why are these nuances so often overlooked when it comes to the design and delivery of cyber security awareness-raising training? Partly, it is because the people responsible for the training are usually experts in technology and security, but not in people, as this research from SANS Securing The Human shows (pdf link). Exacerbating this, how many organisations train their IT and information security teams in issues relating to sociology, psychology and communications? When it comes to awareness-raising, all of our focus is on training people about cyber security; we don’t train cyber security professionals about people. I spoke about this issue at Cybercon recently and the NCSC picked up on this at their flagship event, quoting a comment I made about the need to train not just ‘users’ in technology, but also technologists in ‘users’:


‘People: The Strongest Link’ Emma W, Cyber UK, March 2017


Lacking such training, technical professionals are often unaware of the psychological power of so-called ‘hot states’. Cyber criminals, on the other hand, are well aware of the psychological irresistibility of temptation, curiosity, ego and greed, which is why so many social engineering attacks capitalise on these triggers.

Without an understanding of why people do what they do, is it any surprise that when people click links in phishing emails, the technical team will label them ‘stupid’? To the IT or infosec team, when people who have been trained to be wary of suspicious-looking emails continue to click links in suspicious-looking emails, they are being illogical and stupid. The problem with this (other than it not being true, and not being very nice) is that the ‘user is stupid’ narrative only creates more disconnect between cyber security and the rest of the business. When we expect little of people, we get little back (the Golem Effect); when we expect a lot, we get a lot (the Pygmalion Effect).

Another problem with the ‘people are stupid’ narrative is that it undermines people within our industry, too. There is a tendency, in our culture, to tear people down when they make a mistake or get something wrong. Not only does this contribute to a culture of imposter syndrome, but it arguably also makes our organisations less secure. Human vulnerabilities can lead to technical ones, as I discuss here. If we continue to lazily push the ‘people are stupid’ narrative, we continue to be a big part of the problem, not the solution.

By Dr Jessica Barker


NB: there are many, many other elements to the human nature of cyber security, of course. For example, I haven’t even begun to tackle the motivations of malicious cyber actors or issues around management and organisational learning here. I’ve also barely scratched the surface of the culture(s) of our industry or ethics and cyber security in this blog post. Nor have I commented on diversity, what that means and why it matters. I’ll save my thoughts on those topics, and more, for another day.



The Encryption Debate

Discussing the encryption debate on C4 News

I spoke on Channel 4 News earlier this week about the debate surrounding end-to-end encryption. The debate, which is often framed in terms of privacy vs security, emerged last weekend when Amber Rudd (the UK’s Home Secretary) argued that it was “completely unacceptable” that the government could not read messages protected by end-to-end encryption. Her comments were in response to reports that Khalid Masood was active on WhatsApp just before he carried out the attack on Westminster Bridge on 22 March 2017. Rudd was, therefore, talking in this case about WhatsApp, although her comments obviously have implications for other messaging services, and for the use of encryption in general.

WhatsApp explain what end-to-end encryption means when using their app:

“WhatsApp’s end-to-end encryption ensures only you and the person you’re communicating with can read what is sent, and nobody in between, not even WhatsApp. Your messages are secured with a lock, and only the recipient and you have the special key needed to unlock and read your message. For added protection, every message you send has a unique lock and key. All of this happens automatically: no need to turn on settings or set up special secret chats to secure your messages” WhatsApp FAQs
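WhatsApp builds its end-to-end encryption on the Signal Protocol, which is far too sophisticated to reproduce here, but the core idea behind that “special key” can be sketched with a toy Diffie-Hellman exchange: each side publishes a value derived from a private number, and both arrive at the same shared secret without that secret ever crossing the network. This is an illustrative sketch only, with deliberately small parameters; real deployments use vetted, authenticated key exchanges.

```python
import hashlib
import secrets

# Toy Diffie-Hellman parameters. Real systems use vetted 2048-bit+
# groups; 2**127 - 1 (a Mersenne prime) keeps this example readable.
P = 2**127 - 1
G = 3

# Each side keeps a private value and publishes only G**x mod P.
alice_private = secrets.randbelow(P - 2) + 1
bob_private = secrets.randbelow(P - 2) + 1
alice_public = pow(G, alice_private, P)
bob_public = pow(G, bob_private, P)

# Both sides combine their own private value with the other's public
# value and arrive at the same shared secret.
alice_shared = pow(bob_public, alice_private, P)
bob_shared = pow(alice_public, bob_private, P)
assert alice_shared == bob_shared  # the secret itself is never transmitted

# Hash the shared secret down to a symmetric key for encrypting messages.
message_key = hashlib.sha256(str(alice_shared).encode()).hexdigest()
```

Eavesdroppers see only the public values; recovering the private ones requires solving the discrete logarithm problem, which is what makes “nobody in between, not even WhatsApp” achievable.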

This is not the first time that anti-encryption messages from the government have hit the headlines. In 2015, for example, then-PM David Cameron called for encryption to be banned. As I mentioned earlier, the argument is often presented as being about privacy vs security. Those in favour of banning encryption argue that it would protect national security, by preventing malicious actors (such as terrorists) from communicating in secret, and claim that this overrides any individual right to privacy. The government have a challenging job to do, and surely it is never more challenging than in the wake of a terrorist attack. Nor can we expect ministers to be subject matter experts on every issue presented to them, let alone to have a deep understanding of the technical complexities of each.

The issue, from my point of view, is that this is not about security vs privacy at all. Banning or undermining encryption has security implications of its own, given that encryption underpins online banking, financial transactions and other sensitive communications. The UK government has pledged to make this country the most secure place in the world to do business online, which appears to be at odds with its messages on encryption. The true debate we need to have, then, is about security vs security.

By Dr Jessica Barker



Ransomware on Radio 2 with Jeremy Vine

I recently spoke to Jeremy Vine and his listeners on Radio 2 about ransomware.

Action Fraud reported last year that 4,000 people in the UK had fallen victim to ransomware, with over £4.5 million paid out to cyber criminals. As these are only the reported figures, the true number of people affected, and the sum paid to criminals, is unfortunately guaranteed to be significantly higher.

The first known ransomware was reported in 1989 and called the ‘AIDS Trojan’. It was spread via floppy disks and did not have much of an impact. Times have changed since 1989 and ransomware as a means of extortion has grown exponentially in recent years due to a combination of factors:

  • Society’s growing use of computers and the internet
  • Developments in the strength of encryption
  • The evolution of bitcoin and the associated opportunity for greater anonymity

Last year we saw reports of strains whereby victims are promised the decryption key if they infect two of their contacts (called Popcorn Time) and others in which criminals promise to unlock data once the victim reads some articles on cybersecurity (known as Koolova). Ransomware-as-a-service, in which criminals essentially franchise their ransomware tools on the dark web, appears to be very profitable, with Cerber ransomware reportedly infecting 150,000 devices and extracting $195,000 in ransom payments in July 2016 alone.

Listen to my chat with Jeremy Vine and his listeners for more information on ransomware and what to do if you’re hit. *Spoiler*: I recommend offline back-ups a lot and plug The No More Ransom Project, an initiative between law enforcement and IT security companies to fight ransomware.


By Dr Jessica Barker


Digital Guardian article – GDPR: Getting the Board on Board

For my second Digital Guardian blogpost, I continued looking at GDPR. As is the case with many cybersecurity projects, getting senior-level support for GDPR compliance efforts requires effective communication. As research from (ISC)2 has highlighted, one of the biggest challenges with GDPR projects is securing senior-level support (and the budget that goes with it). Read what I have to say in Digital Guardian for some tips on how to get the board on board.

By Dr Jessica Barker


Digital Guardian article: what does GDPR mean for you?

In the first of a series of blog posts I am writing for Digital Guardian, I have tackled the General Data Protection Regulation (GDPR) and what it means for companies worldwide. To find out what GDPR is and my top ten points on why it matters, read the blog post here.

It’s enforceable from 25 May 2018, which sounds like a long way off, but as time moves quickly and organisations tend to move slowly, you should start preparing for GDPR now. One of the key problems, however, seems to be getting the leadership of organisations to fully engage with GDPR and to recognise that preparing for it is a strategic, as well as an IT-related, activity. With this in mind, in my next article for Digital Guardian I will explore what to do, and how to do it, to get the business side of an organisation engaged and on board with a project like GDPR implementation.


GDPR is Coming

By Dr Jessica Barker



Infosecurity Magazine Profile

I’m very happy to be featured in this month’s Infosecurity Magazine

Michael Hill interviewed me for Infosecurity Magazine about my background, some of the big consultancy projects I carried out last year, the media work I do and much more. You can read or download the magazine here.  As always, it’s an excellent read, with articles on the cyber security implications of Trump’s presidency, an analysis of the future of encryption and a thought-piece on whether and when hacking back is ever legitimate.

Infosecurity Q1, 2017

By Dr Jessica Barker


Predictions for Cyber Crime in 2017: what small businesses need to know

Cyber crime can be pretty indiscriminate, with businesses of all sizes falling victim to attacks. For smaller businesses it can be particularly challenging to receive good cyber security information and advice. With this in mind, I contributed to an article that explores:

  • top cyber crime predictions for 2017
  • what small businesses can do to better protect themselves
  • the future of cyber crime – what’s on the horizon?

Read what I and other cyber security professionals have to say about ransomware, the Internet of Things, spear-phishing, Artificial Intelligence, and more.

By Dr Jessica Barker


Coping with Passwords on the Radio

Most people in the UK returned to work this week after the festive break and I joined Radio 2’s Drivetime show, presented by Simon Mayo, to talk about one of the pitfalls: forgetting your passwords, having not logged in for a couple of weeks.

Take a listen to my interview with Simon Mayo below for my thoughts and tips on what makes a more secure password (and why) and how to cope with many complicated passwords at once (if you don’t want to use a password manager).
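The “why” behind longer passwords comes down to simple arithmetic: every character you add multiplies the search space an attacker has to cover. A rough sketch of the calculation, assuming (generously) that the password is chosen at random:

```python
import math

def search_space_bits(length: int, charset_size: int) -> float:
    """Entropy in bits of a randomly chosen password:
    log2(charset_size ** length) = length * log2(charset_size)."""
    return length * math.log2(charset_size)

# An 8-character password drawn from all 95 printable ASCII characters:
short_complex = search_space_bits(8, 95)   # ~52.6 bits
# A 20-character passphrase of lowercase letters and spaces:
long_simple = search_space_bits(20, 27)    # ~95.1 bits

# Every extra bit doubles an attacker's brute-force work, so the long,
# simple passphrase is trillions of times harder to search exhaustively.
```

Human-chosen passwords are far from random, which is why predictable patterns (dictionary words with a trailing “1!”) fare much worse than these numbers suggest; but the length-beats-complexity intuition holds.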


My parting advice in the interview is to enable two-factor authentication; for advice and support in doing this, check out this website.

By Dr Jessica Barker
