Tried, Tested and Proven

For all the talk of foreign states hacking various organisations, criminal gangs seeking to monetise information and hacktivist groups, the truth is that most security incidents are caused by organisations’ own staff, who are normally trying to do something good for their employer. It is not malicious, just misguided, and it is often in line with company practice, culture or policy. Generally these incidents are small or near misses, but there are enough big ones to cause concern.

The events at the 56 Dean Street clinic inspired this blog post. 56 Dean Street is a sexual health clinic specialising in the treatment of HIV, and the crux of its incident was that a marketing email was sent with 780 email addresses in the CC field rather than the intended BCC field. The subsequent publicity thrust an otherwise low-profile organisation, and a relatively small disclosure, into the public consciousness. However, the 56 Dean Street clinic is just the latest in a long line of damaging, accidental information disclosures, and it is in this wider context that we’re discussing the issue.

I’m not writing to name and shame, but rather to answer the question of how we can stop such incidents from happening elsewhere.

In most cases I would be reluctant to blame the individual. Yes, they clicked send, wrongly addressed an envelope, lost a USB stick, incorrectly transferred funds or whatever, but the reality is that they were just doing their job using the tools made available to them and, ultimately, they made a simple, easily foreseen error. User awareness training is important, but it will not address every eventuality. There comes a point at which ways of working must be improved, such that users do not need to make a conscious decision to do things properly; they simply do their work, using the tools provided, and everything around them delivers a correct, secure outcome. That said, before something new can be implemented, someone, somewhere needs to realise that there is an issue.

I’m also reluctant to directly blame an organisation’s information security team for such an event. They cannot be expected to know everything that happens in an organisation, and they can only respond to what they are aware of. Big Brother they are not. The people who do have a better understanding of what happens around them, day in, day out, are managers and supervisors. We make managers responsible for many things: health and safety, equality, discipline, prioritisation, training and so on, most of which lie outside their primary area of expertise. Security is effectively another area of corporate responsibility that needs to be assigned to managers. What is required is an awareness of danger that triggers the warning: “Stop! That doesn’t look safe.” Someone standing on a chair to change a lightbulb isn’t much different from someone handling information poorly. Once you’re looking, it is pretty obvious. While it will probably be fine, the consequences of making the wrong call can be severe. The individual performing the action may not realise this, but surely we can expect their line manager to step in? This is where some indirect blame is often attributed to the infosec team, because who else will press this agenda and lead the education programme?

It is also a question of company culture. For almost any task in an organisation, such as sending global emails or transferring information, there is the “proper way” (which, amongst other advantages, is usually safer) and the “cheap bodge”. The latter is almost always devoid of control and prone to going wrong. Sending a global email with 780 addresses in the CC/BCC field is a really lousy way of communicating, especially when there are a wide range of options for sending these types of communications properly as individualised emails (and this is just one pertinent example; there are many others). Whoever is championing the need to send marketing emails should be an expert in their field and should recognise that corners are being cut, decreasing value and introducing risk. They may not be able to resolve the issue, but they should be drawing attention to it and driving for a better way. This comes back to the point made earlier about giving people the right tools to work with and getting good information security outcomes by default. I really cringe when I hear people complaining about users doing stupid things like using a personal Dropbox to transfer corporate information; the truth is that the user needed to transfer a large quantity of information and had not been provided with a better way of doing it. Whose fault is that?
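To make the “individualised emails” point concrete, here is a minimal sketch in Python of sending each recipient their own message rather than one email with everyone in CC/BCC. The SMTP host, sender address and recipient list are placeholders of my own, not details from the 56 Dean Street incident; a real marketing platform would add consent, unsubscribe and bounce handling on top of this.

```python
# Minimal sketch: one message per recipient, so no one ever sees
# anyone else's address. All host names, addresses and the recipient
# list are illustrative placeholders.
import smtplib
from email.message import EmailMessage

SMTP_HOST = "smtp.example.org"          # placeholder mail server
SENDER = "newsletter@example.org"       # placeholder sender address
recipients = ["alice@example.com", "bob@example.com"]  # would come from a vetted mailing list

with smtplib.SMTP(SMTP_HOST) as smtp:
    for address in recipients:
        msg = EmailMessage()
        msg["From"] = SENDER
        msg["To"] = address             # exactly one recipient per message
        msg["Subject"] = "Clinic newsletter"
        msg.set_content("Body of the newsletter goes here.")
        smtp.send_message(msg)
```

Because each message is addressed to exactly one recipient, there is no CC or BCC field to get wrong in the first place, which is precisely the kind of “secure by default” tooling argued for above.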

I think it is human nature to blame the individual who made the mistake: sending a marketing email incorrectly, losing an unencrypted laptop, failing to lock the cupboard, falling off a table while changing a lightbulb. However, until we take a step back, look at the journey that led to the incident and accept that the responsibility really lies elsewhere, we will never put a stop to these silly (but not insignificant), all-too-predictable incidents. The marketing email was sent by someone trying to promote the company, the person who lost the unencrypted laptop was just trying to take some work home, the person who failed to lock the cupboard was distracted by a ringing phone, and the person who fell off the table changing a lightbulb had asked for it to be changed four times already. Where does the fault truly lie? It has to be with the management tier and upwards, and that is where the changes need to happen if these types of incidents are to be prevented in future.

This article was written by Simon Saunders, Head of Consultancy at Portcullis.
If you’re interested in learning how to better prepare yourself and your business for an incident, or want to know how you can train your employees on cyber security, you can speak to one of our security consultancy team members on 0208 868 0098 or go to the contact us page.