Ignoring the fact that we are all human in information security


Sometime in the next seven days you will be at risk from a “data breach”, and the cause will most likely be human factors or compliance fatigue. People returning from their Christmas break will have forgotten their passwords and will be keen to get stuck into their work, so they may overlook the practices and processes they normally follow and take shortcuts. Combine that with the fatigue of returning to a job they may not enjoy, or difficulties at home such as relationship problems and family break-ups, which are so prevalent in the UK after the holiday season, and the result is data compliance issues.

Why is this relevant? We are humans, not machines. We are not conditioned, or even able, to follow rigorous procedures against a backdrop of productivity demands and constant decision-making. In most organisations, information security is designed and developed by people who do not themselves use the policies, procedures and processes that keep data safe; at least, not on a daily basis. Technological solutions are often designed by IT security experts with little interaction with those who use the tools every day. Data protection policy is often written by compliance professionals, and by lawyers who demand 100% allegiance to their policy. Very little regard is paid to the factors that affect us as humans, and the result is compliance fatigue.

Compliance Fatigue

Compliance fatigue sets in when a constant stream of instructions or complex rules is imposed because it seems the only way to make humans behave like machines. My recent trip to Japan showed me that the more rules there were (shouting through megaphones, the pointing of wands by people helping you cross the road, signs telling you not to do something), the more people ignored them, picking only the ones they felt aided their productive day, taking shortcuts where possible and disregarding the rest.

In data protection we are in danger of using the term “culture change” to impose an extra burden on employees, expecting another good dose of training and awareness to solve the problems we have as humans. Too often the “culture change” is simply a new way of adding layers of complexity to employees’ day jobs.

We need to approach cultural change differently. Developing employee-centric engagement at the start, not the end, of a privacy programme can help.

Designing Security

In their 1975 paper, The Protection of Information in Computer Systems, Jerome Saltzer and Michael Schroeder established ten principles for designing security [1].

Two of those principles are rooted in the knowledge of behavioural sciences:

Psychology
  • the security mechanism must be ‘psychologically acceptable’ to the humans who have to apply it;
Human Factors and Economics
  • each individual user, and the organisation as a whole, should have to deal with as few distinct security mechanisms as possible.

Almost 100 years before Saltzer & Schroeder, Auguste Kerckhoffs, a founding father of modern cryptography, formulated six principles for operating a secure communication system. Several of these focused on human factors; for example: “it must be easy to use, and must neither require stress of mind nor the knowledge of a long series of rules”.

Over the past 20 years, a growing body of research has examined the underlying causes of compliance fatigue and security failures. The insight that has emerged is that security measures are not adopted because humans are treated as components whose behaviour can be:

  • specified through security policies
  • controlled through security mechanisms and sanctions

But the fault does not lie primarily with the users, as suggested by the oft-used phrase that humans are the ‘weakest link’. Often, the fault lies in ignoring the requirements that Kerckhoffs, and Saltzer & Schroeder, so clearly identified: security needs to be usable, acceptable and effective.

The expectation that a set of standards, policies and training sessions is all that is needed to deliver the CORE programme outputs is unfounded. Our compliance programmes need to ensure that our design “fits the task to the human, not the human to the task”.

When employees do not behave as specified by security policies, most compliance practitioners assume the users are at fault: that they ‘just don’t understand the risks’ or ‘are just too lazy’. But research [2] has shown that non-compliance, which we now refer to as ‘rule-bending’, is caused by people facing a stark choice between doing what security requires and staying productive. Most choose productivity over security, because that is what the organisation also does.

Behaviour is Goal Driven

Behaviour is essentially goal-driven: we perform tasks to achieve goals. At work it might be ‘I want to get this offer to our employee today’; at home, ‘I want to get the best utility deal for us’. To achieve these goals, people complete a series of tasks, and we must be aware that compliance fatigue can easily set in when security competes with the conflicting demands of employees’ everyday lives.

A compliance programme will therefore fail to deliver its objectives unless it considers, at the design phase, how it will address human factors.

Armed with this knowledge, and building in not only privacy but also human factors by design, we can develop a range of employee-supportive solutions.

These may include:
  • Automate security. For instance, use implicit authentication to recognise authorised users, instead of requiring them to enter passwords many times over.
  • Where explicit human action is necessary in a security task, minimise the workload and the disruption to the primary task.
  • Design processes that trigger security mechanisms such as authentication only when necessary (a sketch of this idea follows this list).
  • Design systems that are secure by default, so that the burden of security configuration and management is not pushed on to employees.
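To make the third point more concrete, here is a minimal, purely illustrative sketch of risk-based ‘step-up’ authentication, where the user is only interrupted when contextual signals suggest elevated risk. The signal names, weights and threshold below are assumptions for illustration, not a reference implementation.

```python
from dataclasses import dataclass

@dataclass
class RequestContext:
    """Signals available for a request; the names here are illustrative assumptions."""
    known_device: bool       # device previously registered by this user
    usual_location: bool     # request comes from a location the user normally works from
    sensitive_action: bool   # e.g. exporting personal data or changing payment details

def requires_step_up_auth(ctx: RequestContext) -> bool:
    """Decide whether to interrupt the user with an extra authentication step.

    The aim is to stay out of the way for routine, low-risk activity and only
    add friction when the combination of signals suggests elevated risk.
    """
    risk = 0
    if not ctx.known_device:
        risk += 2
    if not ctx.usual_location:
        risk += 1
    if ctx.sensitive_action:
        risk += 2
    return risk >= 3  # illustrative threshold; tune to the organisation's risk appetite

# A routine action on a known device from the usual location never interrupts
# the user, while a sensitive action from an unknown device prompts for a
# second factor.
routine = RequestContext(known_device=True, usual_location=True, sensitive_action=False)
risky = RequestContext(known_device=False, usual_location=True, sensitive_action=True)
print(requires_step_up_auth(routine))  # False -> no extra prompt
print(requires_step_up_auth(risky))    # True  -> ask for a second factor
```

The point is not the particular scoring, but that the security mechanism is triggered by the design of the process rather than by asking the employee to remember and apply yet another rule.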

The benefit is that we actually make the human’s job easier, which is always an easier message to sell. When employees encounter security policies that are impossible to follow, or clearly not effective, it gives them a justification for doubting all compliance policies.

That is why compliance hygiene is essential. When policies are not being followed, compliance professionals must investigate why, in a non-confrontational manner, and establish whether it is because the policies are impossible or too onerous to follow. Redesigning the solution is the next step.

Shadow Security

Employees do not show blatant disregard for security; they try to manage the risk the best way they know how. This is what researchers call shadow security: employees designing their own solutions. These ‘amateur’ security solutions may not be entirely effective from a security perspective, but since they are workable, asking ‘how could we make that secure?’ is a good starting point for finding an effective solution that fits in with how people work. The programme then recognises employee input into compliance design.

This represents a major change of behaviours, attitudes and culture for any organisation, over and above the usual remedies for making “compliance stick”. Practitioners often respond with security awareness, education and training measures when people do not follow security policies. In practice, if people keep being told that the risk is really serious and that they must follow policy, but cannot do so, they develop resentment. This generates negative attitudes towards security and the organisation, which is counter-productive.

Three Step Approach

In practice, the three terms awareness, education and training are often used interchangeably, but they are distinct elements that build on each other:

Security Awareness

The purpose of security awareness is to catch people’s attention and convince them that security is worth engaging with. Given that many organisations face compliance and security fatigue, we need to capture people’s attention and get them to realise that:

  1. data security is relevant to them: the risks are real and could affect them;
  2. there are steps they can take to reduce the risk, and they are capable of taking those steps.

Crafting effective awareness messages is not an easy task for compliance professionals, so working with an organisation’s communications specialists can help. They know how to craft messages, nudges and scenarios that catch people’s attention, and how to reach different audiences. They can also integrate security messages into the overall set of communications to avoid message fatigue.

Security Education

Once people are willing to learn more about data security, we can provide information about the risks and what they can do to protect themselves against them. Most people currently have very incomplete, and often incorrect, mental models of cyber risks. Transforming these into more accurate models provides a basis on which to build cyber security skills.

However, it is hard to ascertain whether the education leads to more accurate mental models, or just the ones that security professionals expect people to possess.

Security Training

Training helps people acquire skills: for example, how to use a particular security mechanism correctly, or how to recognise and respond to a social engineering attack. In addition to showing people how to do something, we need to support the acquisition of the skill by letting them practise it, so they can ‘experiment’ with security decision-making and reflect on their perceptions and biases. Parts of skill acquisition can be supported online, but it is much more likely to succeed in the context of a social community.

Human Activity is 90% Automatic

A common misunderstanding is that if people complete the three steps and know what to do, their behaviour will change. But knowing what to do and how to do it is not enough. Human activity is 90% automatic, driven by routines or habits stored in the long-term workspace. New security behaviour needs embedding, but its place is already occupied by an existing behaviour (much like an old password that keeps coming to mind).

The adage that ‘old habits die hard’ accurately describes this automatic behaviour. Until we manage to push the old behaviour out, all our awareness, education and training efforts may not yield the changes in behaviour we are seeking. The new behaviour needs to become automatic. This is a challenging undertaking.

Productive activity needs to carry on while we change security behaviour, so we can only target one or two behaviours at a time, and we can embark on changing the next one or two only once these have become genuinely embedded. Nor should we conflate security awareness and education with security culture.

Does teaching compliance work?

As my mate Ali Hepher, Head Coach at Exeter Chiefs rugby club, said:

“the expectation that professional rugby players can be 100% mentally focussed over a 30 match programme fails to recognise that they are human beings and it is impossible to be that spot on with your mental focus every time…we are humans after all”

So perhaps it is not employee behaviour we need to change, but the behaviour of those of us within information security who think “compliance can be taught”.

[1] J. H. Saltzer and M. D. Schroeder, “The protection of information in computer systems,” Proceedings of the IEEE, vol. 63, no. 9, pp. 1278–1308, 1975.

[2] C. Herley, “More is not the answer,” IEEE Security & Privacy, vol. 12, no. 1, pp. 14–19, 2014; M. A. Sasse, S. Brostoff, and D. Weirich, “Transforming the ‘weakest link’—a human/computer interaction approach to usable and effective security,” BT Technology Journal, vol. 19, no. 3, pp. 122–131, 2001.


Find out more about our data compliance services and how we can help embed change in your organisation.
