William Gibson’s book Virtual Light includes a bar which goes by the name of “Cognitive Dissidents”. I noticed this last night when I was reading in bed, and it seemed apposite, because I wanted to write about cognitive bias, and the fact that I’d noticed it so strikingly was, I realised, an example of exactly that: in this case, The Frequency Illusion, or The Baader-Meinhof Effect. Cognitive biases are everywhere, and there are far, far more of them than you might expect.
The problem is that we think of ourselves as rational beings, and it’s quite clear from decades – in some cases, centuries – of research that we’re anything but. We’re very likely to tell ourselves that we’re rational, and it’s such a common fallacy that The Illusion of Validity (another cognitive bias) will help us believe it. Cognitive biases are, according to Wikipedia, “systematic patterns of deviation from norm or rationality in judgment”, or, put maybe more simply, “our brains managing to think things which seem sensible, but aren’t.”[1]
The Wikipedia entry above gives lots of examples of cognitive bias – lots and lots of examples – and I’m far from being an expert in the field. The more I think about risk and how we consider risk, however, the more I’m convinced that we – security professionals and those with whom we work – need to have a better understanding of our own cognitive biases and those of the people around us. We like to believe that we make decisions and recommendations rationally, but it’s clear from the study of cognitive bias that:
- we generally don’t; and
- even if we do, we shouldn’t expect those to whom we present them to consider them entirely rationally.
I should be clear, before we continue, that there are opportunities for abuse here. There are techniques, beloved of advertisers and the media, for manipulating our thinking to their ends, and we could use them to try to manipulate others to our own. One example is The Framing Effect. If you want your management not to fund a new anti-virus product because you have other ideas for the earmarked funding, you might say:
- “Our current product is 80% effective!”
Whereas if you do want them to fund it, you might say:
- “Our current product is 20% ineffective!”
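Both statements describe exactly the same measurement, of course: only the framing differs. A trivial sketch makes the point – the 80% figure and the detection_rate name are invented purely for illustration:

```python
# Illustrative only: the same underlying measurement, framed two ways.
detection_rate = 0.80  # hypothetical figure for "our current product"

positive_frame = f"Our current product is {detection_rate:.0%} effective!"
negative_frame = f"Our current product is {1 - detection_rate:.0%} ineffective!"

print(positive_frame)  # Our current product is 80% effective!
print(negative_frame)  # Our current product is 20% ineffective!
```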
People react in different ways, depending on how the same information is presented, and the way the two statements above are framed aims to manipulate your listeners towards the outcome you want. So, don’t do this, and, more importantly, look out for vendors[2] who are doing this, and call them out on it. Here, then, are three of the more obvious cognitive biases that you may come across:
- Irrational escalation or Sunk cost fallacy – this is the tendency for people to keep throwing money or resources at a project, vendor or product when it’s clear that it’s no longer worth it, with the rationale that to stop spending money (or resources) now would waste what has already been spent – when, in fact, it’s already gone. This often comes across as misplaced pride, or people just not wanting to let go of a pet project because they’ve become attached to it, but it’s really dangerous for security, because if something clearly isn’t effective, we should be throwing it out, not sending good money after bad.
- Normalcy bias – this is the refusal to address a risk because it has never happened before, and it’s an interesting one in security, for the simple reason that so many products and vendors are trying to make us do exactly that: address risks which have never happened to us. What needs to happen here is that a good risk analysis is performed, and measures are then put in place to deal with those risks which are actually high priority, not those which may never happen, or which don’t seem likely at first glance.
- Observer-expectancy effect – this is when people who are looking for particular results find them, because they have (consciously or unconsciously) misused the data. This is common in situations where there is a belief that a particular attack or threat is likely, and the data available (log files, for instance) are used in a way which confirms this expectation, rather than being analysed and presented in more neutral ways – the sketch below shows the difference.
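To make that last one concrete, here’s a minimal sketch: the log format, the IP addresses and the “suspect” prefix are all invented for illustration, and the point is the shape of the query, not the specifics.

```python
from collections import Counter

# Hypothetical, highly simplified access-log lines: "<source-ip> <result>"
log_lines = [
    "203.0.113.7 login-failure",
    "203.0.113.7 login-failure",
    "198.51.100.23 login-failure",
    "198.51.100.23 login-failure",
    "198.51.100.23 login-failure",
    "192.0.2.14 login-success",
]

SUSPECT_PREFIX = "203.0.113."  # the range we already "know" is attacking us

# Biased analysis: only count the thing we expect to find.
suspect_failures = sum(
    1
    for line in log_lines
    if line.startswith(SUSPECT_PREFIX) and "login-failure" in line
)
print(f"Failures from suspect range: {suspect_failures}")  # 2 - "see, we were right!"

# More neutral analysis: summarise failures across *all* sources first.
failures_by_source = Counter(
    line.split()[0] for line in log_lines if "login-failure" in line
)
print(failures_by_source.most_common())
# [('198.51.100.23', 3), ('203.0.113.7', 2)] - another source actually tops the list
```

The biased query returns a number which “proves” the expectation; a neutral summary of exactly the same log lines points somewhere else entirely.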
I intend to address more specific cognitive biases in future articles, tying them even more closely to security concerns – if you have any particular examples or war stories, I’d love to hear them.
1 – my words
2 – or, I suppose, underhand colleagues…