Every day we are bombarded with data – so much that we simply cannot process it all, and even the fraction we can manage is inevitably at the mercy of our subjective biases. So when making decisions we all need a Red Team to keep us honest.
In 2016, a report by IBM Marketing Cloud revealed that 90% of the data in the world had been created in the previous two years, with 2.5 quintillion bytes of data being produced every day. By some estimates the typical human now consumes 100 GB of data a day. That’s the equivalent of 20,000 songs or 200 hours of standard-definition video hitting your five senses every day. The problem is that while we have 21st century data, we still have prehistoric brains.
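As a sanity check on those comparisons, a quick back-of-the-envelope calculation shows the figures are roughly consistent. The per-item sizes below (about 5 MB per song, about 0.5 GB per hour of standard-definition video) are assumptions for illustration, not numbers from the report:

```python
# Rough check of the "100 GB a day" equivalences.
# Assumed sizes (illustrative, not from the IBM report):
MB_PER_SONG = 5            # typical compressed audio track
GB_PER_SD_VIDEO_HOUR = 0.5  # typical standard-definition video

songs_gb = 20_000 * MB_PER_SONG / 1_000   # convert MB to GB
video_gb = 200 * GB_PER_SD_VIDEO_HOUR

print(songs_gb)  # 100.0 GB
print(video_gb)  # 100.0 GB
```

Both comparisons land on the quoted 100 GB, so the equivalences hold under these assumed file sizes.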
The challenge for leaders, managers and organisations is how to cut through the clutter and glean value from all this data in order to make better decisions. The technical answer is to develop an appropriate data infrastructure or framework, in which information is shared across the business rather than hidden in silos. But the difficult part is establishing a culture of data, ensuring that decisions are evidence-based and driven by the data rather than by finger-in-the-wind guesswork, gut feel and top-down decree.
In broad terms, the flow of turning data into something of value runs: data – information – knowledge – wisdom. Data is received by all our senses, but let’s confine ourselves to the text documents, emails, phone calls, Skype calls, images, videos and audio clips that we typically consume each day at work. From this we must first glean information: the facts, theories, ideas and statistics we collect for reference and analysis. Analysing that information creates knowledge, an awareness of or familiarity with a subject or situation. And knowledge, judiciously applied, becomes wisdom: the ability to make sound judgements based on what you know.
But no stage of this process is impartial or infallible. Even raw data reflects the assumptions and biases of those who gathered it. And it gets worse as we process the data. Fundamentally the problem is this: despite having vastly more data, we are still using minds that are genetically little different from those of our prehistoric forebears. Our ability to take in and sift through all this information remains much as it was 70,000 years ago, at the onset of the cognitive revolution.
Neuroscience now provides evidence for what we already suspected: we make decisions in our gut, somewhere deep within us, and oddly before we even realise we have made them. We then use our cognitive abilities to rationalise those decisions and muster arguments, so that we and others believe they are based on factual data and logical thinking. Our sense of objectivity is an illusion. In truth, we interpret or ignore information in whatever way confirms our subjective preconceptions.
Psychologists Daniel Kahneman, Paul Slovic and Amos Tversky introduced the concept of psychological bias in the early 1970s, later collecting their findings in the book ‘Judgment Under Uncertainty: Heuristics and Biases’. They describe psychological bias as the tendency to make decisions or take action in an illogical way – for example, subconsciously making selective use of data, or feeling pressured to make a particular decision to fit the preconceptions of powerful colleagues. It is the opposite of objective, measured judgement, and it can lead to missed opportunities and poor decision making.
There are a variety of psychological biases, but basing decisions on our preconceptions is known as confirmation bias which, as described above, happens when we look for information that supports our existing beliefs and reject information that goes against what we believe. Good current examples include people’s attitudes toward President Trump and Brexit. People in both camps, for or against, are largely fixed in their views and seek evidence to confirm those beliefs within a social echo chamber of like-minded people.
Even in the best of circumstances it’s hard to spot psychological bias in ourselves, because it often stems from subconscious thinking. We all like to think that we are objective and that it is other people who suffer from the delusions of subjectivity. But that just isn’t true. Avoiding confirmation bias is therefore a matter of seeking out ways to challenge what you think. One of the most powerful methods is to make major decisions with the support of other people. The problem here, though, is the social echo chamber: we prefer to build groups and teams in our own image, from people who are like us.
That’s where the Red Team comes in. A Red Team is a group of people who challenge your point of view, because they are different from you and will offer dissenting views. You can also seek out information that challenges your opinions, or assign someone on your team to play “devil’s advocate” for major decisions. But the key is to identify people and sources you respect, but may not like – particularly when they tell you something you don’t want to hear.
In short, we have a better chance of making balanced decisions – ones that accommodate more of the relevant data and potential impacts – if we work with a diverse group of colleagues: a Red Team. People whose perspectives differ from ours, whether through gender, ethnicity, life experience, expertise, thinking style, hierarchical level in the organisation, or in a range of other ways. They might irritate you, but pearls aren’t made without some grit in the oyster.