The Human Nature of Cyber Security
The human brain is not a computer
The following is an excerpt from Dr Jessica Barker's speech at the recent IRISSCON security conference, held in the Ballsbridge Hotel in Dublin on Thursday 23rd November. vCloud.ie were delighted to attend again this year, as IRISSCON brings together some of the latest thinking and developments in the cyber security world. Dr Jessica Barker is a sociologist and cyber security expert who recently founded RedactedFirm and also runs cyber.uk.
The human brain often makes decisions based on incomplete information. We do not remember or process information in the way a computer does. Instead, we often use what are called heuristics to make decisions. This is the process whereby our brain uses shortcuts to come to a conclusion or to make a decision, allowing us to move forward quickly rather than getting bogged down in every piece of information. The trouble is that heuristics are shaped by human biases, so the decisions they produce are imperfect.
In terms of cyber security we can see this when people make poor decisions because they are not acting with all of the information. An example of this would be people not adopting best practices after cyber security awareness training, but instead falling back on old ways of thinking or on old bad habits.
A very prominent heuristic is called social proof. This is the process whereby, if you do not know how to make a decision, you look to other people. TripAdvisor and Airbnb are good examples: you read the reviews to see whether there is social proof that the place you are choosing suits your purpose.
Another example of this is re-using hotel towels. Studies have shown that if you put up a sign saying “most people in this hotel re-use their towels” rather than “please save the environment by re-using your towels”, more people will re-use their towels. So, if you want people to take an action, tell them that most people are already doing it, and make the comparison as close to them as you can.
When it comes to cyber security, we in the industry do the opposite. We shout loudly about how badly everybody behaves when it comes to internet security. We see constant headlines about how poor password management is, how most people click on malicious links, how the user is the problem. In terms of social proof, what this does psychologically is tell everybody, on a subconscious level, that it is fine to have a bad password because everybody else does as well.
We think we are scaring people into better behaviours, but what we are really doing is reassuring them that it is OK to be insecure online because everybody else is too. We need to think about how to re-frame this. For example, if we run phishing exercises within our company, instead of saying that 30% clicked on the link and that is terrible, we should focus on the 70% who did NOT click on the link and that is fantastic. The message becomes “be more like your colleagues” – use social proof to your advantage.
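To make the reframing concrete, here is a minimal sketch of how a phishing-exercise report could lead with the secure majority rather than the insecure minority. The function name and message wording are illustrative assumptions, not something from the talk:

```python
# A sketch of reframing phishing-simulation results as social proof:
# report the majority who did the right thing, not the minority who clicked.

def social_proof_message(total_recipients: int, clicked: int) -> str:
    """Frame simulation results around the colleagues who did NOT click."""
    resisted = total_recipients - clicked
    pct_resisted = round(100 * resisted / total_recipients)
    # "Be more like your colleagues": highlight the secure majority.
    return (f"{pct_resisted}% of your colleagues spotted the phish "
            f"and did not click - be like them!")

# 60 of 200 recipients clicked (30%), so the message reports the 70%.
print(social_proof_message(200, 60))
```

The design choice is simply which number leads the message: the same data, framed around the behaviour you want people to copy.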
The Optimism Bias
Another heuristic that poses a challenge in cyber security is what's known as the optimism bias. As humans we tend to be overwhelmingly optimistic about our future: we generally overestimate the likelihood of good things happening in our lives and underestimate the likelihood of bad things happening. In cyber security terms, people will most likely think “I am never going to get hacked” or “why would hackers want my data?”. When faced with this kind of optimism from executives, users, colleagues and friends, the cyber security industry tries to beat it out of people with facts and figures, and this does NOT work. It would be far more constructive if we in the industry focused less on trying to scare people and more on how we can harness that optimism bias, giving people the tools and information they need to be more secure.
The Stereotype Threat
The very existence of a stereotype puts pressure on the individuals it targets, causing them to perform less well. An example is the stereotype around women and maths – the idea that women are not as good at maths as men.
The stereotype threat has a big impact in cyber security: think of the stereotypes that “users are stupid” and that “people are the weakest link”. The more we perpetuate and talk about these stereotypes, the more we actually make the problem worse. We need to talk about users in a more positive way, empower them, and move away from the narrative that people are the problem.
Focus on self-efficacy
Self-efficacy is where people feel they have control over something and confidence in their own capabilities. In cyber security, this means people having the control and capability to carry out more secure behaviours, e.g. choosing good passwords, using two-factor authentication, and not clicking on suspicious links.
One of the key things when trying to protect your users is having a “Report a Phish” button, which increases their sense of empowerment. At the moment we tell people that if they find a suspicious email they should ignore or delete it. This is not helpful advice, because people do not like simply ignoring something. A “Report a Phish” button lets them take a positive action instead. You are also reminding them about cyber security every day, with every email they read. This is called behavioural priming: the idea that we can plant subtle reminders that positively influence behaviour without the person even being aware that they have been influenced.
- Empower people with self-efficacy
- Focus on the positives not on the negatives
- Confront stereotypes
- Prime people on security
- Don’t spread fear, spread hope.
If you have any questions about your own cyber security, or you would like to test how secure your IT systems are, then contact our team right now and we would be more than happy to talk.