Examining Organizational Cybersecurity User Experience

Introduction

In “Things That Make Us Smart: Defending Human Attributes in the Age of the Machine” (Norman, 1993), renowned usability expert Don Norman examines the interactions between humans and technology. He argues that technology should be built around models that fit the human mind rather than around designs that force us to conform to the machine’s ideals. The book was written nearly thirty years ago, but the issues it raises still apply today. This paper examines those issues within the scope of modern business, identifying common problems faced by users, and then explores how security professionals can shape a secure user experience.

Identifying the Problem

The pandemic did not just change working conditions for users and how they interact with an organization; it also changed how malicious groups interact with end users. Within a short time, malicious actors shifted tactics to take advantage of the pandemic before organizations could adapt (Crossland & Ertan, 2021). This shift in tactics highlighted risks around remote work, identified in earlier studies, that many organizations had not properly prepared for.

Until the pandemic, it was considered acceptable to blame the end user, because human error “falls into the category of acceptable accident causes” (Hollnagel, 1983). A primary example of the issues raised by the quick shift toward remote work can be found in papers such as “Reigning in the Remote Employee: Applying Social Learning Theory to Explain Information Security Policy Compliance Attitudes,” which states: “Remote working has been shown to impact an employee’s intended behaviour concerning cyber security policy, with remote working colleagues having an altered perception of security and privacy policy awareness, as well as lower intentions and ability to comply with information security” (Johnston et al., 2010).

Many of these issues were already known to cybersecurity professionals. However, the cybersecurity industry was implementing controls based on flawed data. This risk, and the resulting malicious focus, was not “detected via standard survey questions used by information security surveys (which ask about security behaviour alone) - but only when put in the context of other organizational goals. Taken together with the results from the qualitative studies this highlights that most of the studies in this review - which focus on security behaviour alone - may produce results that will not apply in real-world circumstances because their ‘tunnel vision’ on security ignores the other factors driving security behaviour” (Karlsson et al., 2017).

Solutions

As users moved to remote work, cybersecurity policies adapted in a way that was not congruent with the industry’s common idea of “trust but verify.” Users already experienced navigating cybersecurity information “as ‘intimidating’ and ‘overwhelming’ leading to practices of avoidance” (Slupska et al., 2021). Cybersecurity policies, and thus the methods used to enforce them, did not always account for the fact that “goals, values and norms drive behaviour of non-security experts. These studies show that security is ultimately a social construct and as such needs to be negotiated between the different stakeholders in the ecosystem. The benefits of a collaborative stance are also suggested by economics: the time and effort individuals and organizations can expend is a limited resource - and economics tells us that we ignore such constraints at peril to our security goals” (Heath et al., 2018).

The starting place for ensuring that cybersecurity goals can be achieved is clear: “Security leadership colleagues should ensure employees at all levels understand the purpose of cyber security controls and the justification for using them, no matter whether they are in their homes or in an office environment, leveraging executive leadership support where this is required” (Crossland & Ertan, 2021). Instead of treating users as the weakest link, organizations need to “demonstrate that creating a sense of procedural fairness regarding rules and regulations is the key to effective information security management. In sum, it is important that, far from presuming that they are the ‘weakest link,’ our end users be dealt with fairly and with trust” (Saltzer & Schroeder, 1975).

Conclusion

As the methods of work change and user interaction with technology adapts to new practices, software, and ideas, cybersecurity professionals should refer to the foundational 1975 paper “The Protection of Information in Computer Systems” (Saltzer & Schroeder, 1975), in which one of the key principles, grounded in human factors and economics, states that each user, and the organization as a whole, should have to deal with as few distinct security mechanisms as possible. If the various layers of socio-technical interaction do not begin to focus on this core tenet, we will continue to face a situation where we force users to comply with the machine, instead of building a machine that works best for our users.

References

Crossland, G., & Ertan, A. (2021). Remote working and (in)security.

Heath, C. P. R., Hall, P. A., & Coles-Kemp, L. (2018). Holding on to dissensus: Participatory interactions in security design. Strategic Design Research Journal, 11(2). https://doi.org/10.4013/sdrj.2018.112.03

Hollnagel, E. (1983). Human error. NATO Conference on Human Error, Bellagio, Italy. https://erikhollnagel.com/onewebmedia/URTEXT%20on%20HE.pdf

Johnston, A., Wech, B., Jack, E., & Beavers, M. (2010). Reigning in the remote employee: Applying social learning theory to explain information security policy compliance attitudes (Vol. 3).

Karlsson, F. J., Karlsson, M., & Åström, J. (2017). Measuring employees’ compliance – the importance of value pluralism. Information & Computer Security, 25(3), 279–299.

Norman, D. (1993). Things that make us smart: Defending human attributes in the age of the machine. Diversion Books.

Saltzer, J. H., & Schroeder, M. D. (1975). The protection of information in computer systems. Proceedings of the IEEE, 63(9), 1278–1308. http://web.mit.edu/Saltzer/www/publications/protection/index.html

Slupska, J., Dawson Duckworth, S. D., Ma, L., & Neff, G. (2021). Participatory threat modelling. Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems.