One of the talking points at Canon’s recent Parliamentary Reception was the relationship between digital security mechanisms, the state and individuals. How can the public sector manage this relationship and still deliver the security that’s required?
Lizzie Coles-Kemp, Deputy Director of the Research Institute in Science of Cyber Security: I think that’s something that everyone grapples with. Ian Levy from the NCSC has talked about the importance of social groupings and communities in circulating important information about protection. I think he’s right – digitalisation does transform how those relationships work.
The good news is that the research we’re doing shows people are taking this notion of engagement very seriously. How do you engage people in conversations about something that often seems quite dry, quite abstract? In the Research Institute in Science of Cyber Security, one of the things that has emerged from the membership and from the interest groups we work with is the need to talk about the different ways in which we can engage. That’s not just the method and the medium. It’s not just about whether we use posters or text messages, although that’s important, but also about the important skill of listening.
It’s certainly clear with the groups that I work with, and have worked with for the last 10 or so years, that they have very clear views on what it is that makes them feel safe and secure. You’re talking about the individual, so it’s what makes the individual feel safe and secure. They have their own worries and concerns, and sometimes those are different from where part of the state, or industry, might be coming from.
Part of that engagement is also finding out where security interests overlap, where they diverge, where they conflict, and how we’re going to manage that, rather than saying simply, “Do this or do that”. Yes, those instructions are important but there’s also another layer to it.
Do you think this is the classic case of the government finding solutions and fitting the problem to them, rather than actually finding out what the real problems are and then working on the solution?
That’s a really interesting question. I think there’s technological design and there’s security around the technological design, but there’s also a whole question about the social types of security. It’s where people and their attitudes towards security weave into the way that they have interactions and relationships with not just the state but also with institutions – banks, shops – but also with each other. We’ve got different layers going on here and I think, coming back to your question, there are always going to be occasions when technology is designed directly from today’s needs; only sometimes is it future-looking. That means there’s always going to be back and forth when it comes to security.
However, what we are very clearly beginning to see is this movement towards understanding what security needs actually are. I don’t think that we’re over the line yet regarding how to do it, but it is an emerging trend within security practice in government.
That brings us nicely onto the next point, which is about the confidence people have in the state with regards to security. Can government do anything to improve this and build greater trust? Or is it simply a case that, as people’s understanding of security grows and solutions develop, whether technical or social, the problem will solve itself?
I think that this is, again, a really important question. I think one of the things we do is smuggle a lot of fundamental structural change about markets, interactions and propositions into technical solutions. When we roll out a digital service, quite often there are fundamental changes to the way that service works – for example, the conditions for entry into that service, or the constraints that come with it.
The security industry needs to think more about how security fits into design before taking it to the public. That requires a change in the way we do security design today. Currently, we think protection first, but I think that we are increasingly going to have to think much more about what the benefit is, and how the proposition benefits all those different parties. Where do people lose out? With these answers we can understand the risks and protection, how they sit within that overall benefit conversation, and build people-centred services. This will allow the public sector to move beyond simply making the service smooth or easy to use, and to engage people much more in the design of those services.
Being able to access a well-designed service that has security built in will boost an individual’s sense of safety and security. Knowing that I can use a service in a protected way that enables me to manage my financial security, manage my home security, manage my job security so that I’m able to go out and achieve the things that I want and need to achieve, is empowering.
How can we achieve that?
I think it’s very much about skills. Currently we’re doing great work in building up coding skills, and we’re talking very much about technical security protection. Equally, though, we need to do more to develop ‘softer’ analogue skills. The irony is that as we move more towards a digital future, we also need more face-to-face and conversational skills to bring those digital imaginings into reality and weave them into everyday life.
You’ve outlined a lot of ways that we need to change, and why cyber security is about much more than technology. Who should be driving it?
The default security answer is always everybody. However, what we’re clearly seeing from our research is that there needs to be greater clarity around roles and responsibilities. That doesn’t mean just conceptualising the user – whoever the mythical user is – as somebody who doesn’t really understand, who doesn’t really engage, because that’s not the wider, more complicated picture.
I think that there needs to be a clearer understanding of what government does in this role and what government provides, but I think we all have to watch ourselves when we say, “Oh, the market will decide or it’s market driven”. I think we need to be much more nuanced and much more concrete about what we mean by the market and what those rules of the market are and where ethics and social justice and fairness come into the market.
We’re all living on one digital network. Therefore, we need to think about our interactions with others and understand that if we disadvantage people, that doesn’t just affect their interests; it also affects ours.
Bringing that into a security point of view, we’ve not really looked at the implications of markets locking people out. What are the implications of that in a digital environment? Those impacts need to be understood, because technical controls are just technical controls. They will always operate in a social, economic and political context, so we need to build security in a way that appreciates this.