As someone who works in marketing, my job is to persuade. To do that I need to understand the perspective of everyone involved in the transaction – usually my client and their customers. I’m always asking myself questions about what success looks like to those people. I want to know what they need and want, or how I can give someone whose mind I’m trying to change the room to do so without losing face. I want to know how I can get them to feel satisfied with their sale, purchase, or decision.
I also need to understand why and how people think the way they do and make choices.
Looking at the opposing views on everything from COVID vaccinations to gender inequality to religious beliefs to how we should protect the environment, it often feels like people are singing from completely different song sheets.
Opinions aren’t facts. They aren’t inherently right or wrong. Our opinions, including mine, are based on our values, the information we have available to us, and the unique way our brains have processed that information. Because of the way our brains work, it’s hard for us to distinguish between what we believe to be true (opinion) and what’s actually true (proven fact).
Cognitive science is the study of thought, learning, and mental organization. It draws on aspects of psychology, linguistics, philosophy, and computer modelling.
From the extensive research conducted by cognitive scientists, we know how human beings process the enormous amount of information we’re exposed to in our environment. If we tried to deal with all of it in every moment we’d be overwhelmed, so our brains have evolved to process what’s relevant and tune out to what’s not.
The human brain processes what’s relevant using mental shortcuts called heuristics. Heuristics use what we already know from learning and experience to help us reach quick decisions. Most of the time, these processes are pretty accurate. When there are errors in our thinking, it can lead to cognitive biases, which we’ll talk about later.
Daniel Kahneman is a psychologist and Nobel Prize winner who is known for his work on the psychology of judgement and decision-making. In his words, “…heuristics are the ‘shortcuts’ that humans use to reduce task complexity in judgment and choice, and biases are the resulting gaps between normative behaviour and the heuristically determined behaviour.”
There are three types of heuristics.
· Representative heuristics help us make judgements about probability. Say we meet someone in a hospital corridor who is neatly groomed, walking purposefully, and wearing a white lab coat while holding a clipboard. Based on our knowledge and experience, it’s highly likely we’ll assume that person is a doctor in that hospital, even though we don’t have any hard evidence.
· Anchoring and adjustment heuristics relate to the human habit of putting extra value on the first piece of information we receive about something we don’t know much about. That first piece of information becomes an anchor point, and we adjust it as additional information arrives. It works like this: if someone asked you to estimate the value of something you knew nothing about, you’d probably have trouble coming up with an initial figure. But if that person gave you a figure they had in mind, with some examples of similar items they’d seen for sale recently and some information on their relative condition, you could make adjustments to that anchor price and arrive at a fairly accurate estimate.
· Availability heuristics relate to our tendency to use the information that comes to mind most quickly and easily when we make decisions about the future. For example, if you’re trying to decide which politician to vote for and you recall a news article or social media post about one of them being involved in something a bit dodgy – even if you can’t remember the detail or whether that’s what the headline actually meant – you may base your voting decision on that and be satisfied that you’ve given it proper thought.
You can see that even though people might believe they’re thinking carefully about their decisions, they can end up making decisions that are highly inaccurate.
As Daniel Kahneman says, heuristics or mental shortcuts can result in cognitive biases. Let’s look at a few examples to see how two people can form completely different perspectives based on their observations of the same events.
You might recognise some of these:
· Cognitive dissonance describes the discomfort we feel when we hold conflicting beliefs and attitudes. We usually deal with the clash by rejecting, debunking, or avoiding new information.
· Confirmation bias describes our underlying tendency to notice, focus on, and give greater credence to evidence that fits with our existing beliefs.
· The framing effect describes when our decisions are influenced by the way information is presented. Equivalent information can be more or less attractive depending on what features are highlighted.[1]
Once we have an opinion about something, it becomes part of our identity. We become someone who, for example, believes in free speech or believes that eating meat is wrong, and we identify with other people who believe the same thing.
Bruce Hood is an experimental psychologist and philosopher who specialises in developmental cognitive neuroscience. In his book The Self Illusion, he has a lot to say about groups and how they work. He says it so well that I’m going to quote him directly.
“Whether we like it or not, we’re all members of different groups. Some group membership is relatively fixed and independent of what we want – age, sex, race, height and nationality, for example – though sometimes we try to change even these: we lie about our age, cross-dress, have surgery, wear elevator shoes and become a nationalised citizen. Other groups we aspire to join throughout our lifetime – the in-crowd, the jet-set, the highfliers, the intelligentsia or the seriously wealthy. Some of us are assigned to groups we’d rather not join – the poor, the uneducated, the criminal classes, or the drug addicts. People do not normally choose to be any of these but we are all members of groups whether we like it or not. Furthermore, it is in our human nature to categorise each other into groups. Even those who don’t want to be categorised are a group unto themselves – they are the dropouts and outsiders.”
“We categorise others because it makes it much easier to deal with strangers when we know where they’re coming from. We don’t have to do as much mental work trying to figure out how to respond and react much more quickly when we categorise. This is a general principle of our brains – we tend to summarise previous experiences to be prepared for future encounters. It’s likely to be an evolutionary adaptation to optimise processing loads and streamline responses. When we identify someone as belonging to a particular group, this triggers all the stereotypes we possess for that group, which in turn influences how we behave towards that person. The problem is that stereotypes can be very wide of the mark when it comes down to the characteristics of the individual.”
When we judge others, or when we criticise other people for what we view as judgemental attitudes and behaviour, we need to pause. We all have stereotypes in our minds. When we demand that people not judge others according to stereotypes, we’re almost asking the impossible. Instead, we can ask them to do something that is far more powerful and effective in influencing their beliefs and opinions.
We can ask them to consider the other person’s words or behaviour from a different perspective. We can ask them to consider what might be motivating the person they’re judging to behave the way they are. Or ask them to practice empathy and consider what they would do if they were in that person’s exact situation and had walked through life in their shoes.
Empathy, compassion and altruism are the tools of the third age of man (see day 2 – The Future), a circular system with the power to cross the boundaries and scale the walls we’ve put up to make ourselves feel safe since we stopped living as hunter-gatherers.
The next time you feel powerless to change or understand other people’s behaviour, remember that you have something far more powerful at your disposal: the ability to change your perspective.
Sources
· The Self Illusion: Who Do You Think You Are? – Bruce Hood
· The Decision Lab cognitive bias index: https://thedecisionlab.com/biases-index/