How our brains trick us into conflict
How is it possible that when presented with the same information, we come to hold such widely different views of reality?
The views we form about the world undergo a series of modifications before they morph into unassailable and resolute beliefs. But these changes are influenced by more than just the ability to comprehend and sort information, or our own inborn prejudices. The exercise of free choice can elicit an unfortunate need to take sides.
Dr. Kris De Meyer is a computational neuroscientist, currently working in the Department of Neuroimaging at King's College London's Institute of Psychiatry, Psychology and Neuroscience. His documentary, Right Between Your Ears – a film about how we become convinced we're right, even when we're wrong – is a must-see. He also teaches neuroscience- and psychology-based public engagement workshops to students in the environmental sciences. I'm a fan of his work because I'm interested in human conflict and the behaviour of groups as indicators of the human condition. It happens that Dr. De Meyer has a brilliant metaphor for the way beliefs and deeply held convictions form.
If two people hold similar views, they can be pictured as standing together at the top of a pyramid, where their morals, ethics, opinions, preferences and beliefs are more or less the same. The introduction of new information, however, may cause each to see things in a slightly different light, and accordingly each takes a first step down a side of the pyramid. In this way their morals, ethics, opinions, preferences and beliefs move slightly apart. More information and debate result in more steps down the pyramid, and the distance between them increases.
As the pair continue their descent, their views become ever more divergent, their convictions stronger and more entrenched, and they drift further apart. Moreover, as they seek (and find) more reasons to support their beliefs, they grow to dislike each other.
Changing opinions create conflict and a need for self-justification. This need is a deeply embedded part of human nature – everybody experiences it – and it inevitably leads to further behavioural measures, such as defending our decisions to others, including friends and family, and yet more self-justification.
Clear examples can be found in the emotional angst, disbelief, frustration and anger that followed Brexit and the election of Donald Trump. Part of the reason is that political rhetoric doesn't persuade equally or fairly – it splits and polarises opinion and breaks up alliances and friendships.
The more we argue our corner with others, the more convinced we become that we are right and they are wrong. The further down the pyramid we go, the more prone we are to confirmation bias (a very dangerous condition!) and the more open we become to outrageous, scandalous and false propositions about our opponents. Our dislike for anyone holding views in opposition to our own increases our willingness to accept derogatory stories as the truth. The more certain we are of our own position, the more willing we are to denigrate, insult and destroy the reputations of those on the other side of the pyramid. And you can bet your life they will be doing exactly the same to us!
In 1957, the psychologist Leon Festinger introduced the idea of cognitive dissonance. According to Festinger, we hold many opinions and beliefs about the world and ourselves, and when these clash, the discrepancy culminates in a state of tension. Cognitive dissonance perfectly describes the inconsistencies we perceive in other people's views and behaviour – but rarely in our own.
In situations where attitudes, beliefs or behaviours produce feelings of tension or discomfort, our natural reaction is to try to restore harmony. We evolved to seek consistency in our attitudes and beliefs, so any situation where cognitions become inconsistent creates a powerful desire to restore that consistency. This can sometimes give rise to irrational and extreme behaviour.
Festinger’s proposition was that the inconsistencies we detect in our own beliefs create emotional discomfort that acts as a force to reduce the inconsistency by modifying beliefs or adding new ones. Cognitive dissonance becomes the catalyst of opinion change.
In our modern, information-driven, technologically advanced society, we have greater freedom of choice than ever before. But freedom of choice can create dissonance when it involves difficult trade-offs. Choice can result in a growing commitment to the chosen option, which in turn may lead to belief change.
Almost 60 years of research and hundreds of experiments have, as one would expect, shown that dissonance operates most strongly when events impact our core beliefs – especially the beliefs we hold about ourselves as good, competent and intelligent people. Take, for example, the effect the attack on the World Trade Center had on our erstwhile comfortable and familiar view of the world. Suddenly we were presented with a choice of which side to be on. The opinions and beliefs we adopted remain in place years later. The opinions formed in the West were not necessarily the same as those held by people in other parts of the world, and these conflicting opinions have resulted in mistrust and, in some places, outright hatred.
How do our beliefs become so firmly entrenched? What psychological processes are involved in this kind of decision-making?
We all consider ourselves rational, competent and intelligent people who would never wish harm on any of our fellow human beings – except of course those we consider a threat – and it's easy to view anyone whose beliefs are diametrically opposed to ours as a threat. Sometimes it's hard to understand why other people can't see what to us is patently obvious! Are they blind? Surely no one can be that misguided or ignorant of world affairs, or so terminally stupid they can't see the bigger picture that is plain enough to anyone blessed with a thinking brain. After all, we are not only more enlightened, we are in possession of a broader and more informed view of the world! Aren't we?
It's no coincidence that people on opposite sides of a polarised debate judge each other in similar terms – our evolutionary social brains predispose us to this way of thinking. This behaviour has its roots in evolving groups' need to protect themselves from other, possibly predatory, groups. A cursory look at the trouble spots of the world proves the point – the nightly TV news constantly reminds us of the futility and stupidity of pointless human conflict: Sunnis killing Shias, Shias killing Sunnis, Brexiteers bickering with Remainers, Democrats squabbling with Republicans… We are treated to a seemingly never-ending procession of people in tears, bemoaning the result of a democratic vote and demanding a second referendum because the outcome of the first was not to their liking, or seemed unfair, or fixed, or the work of Russian hackers. If only these fools could see things as clearly as we do!
Babies can evaluate the behaviour of others by the time they're six months old. Even at that early age, they can tell the difference between the safety and comfort of their mother's arms and the unpredictability of strangers. The ability to discriminate is part of the human survival strategy and is obvious in the school playground, where alliances are formed and dissolved on a regular basis. As we grow up, we develop powerful automatic cognitive processes that protect us from being cheated – a good thing in its own way, of course – but this type of social awareness can also inadvertently mislead us.
Social media can often make matters worse, because words on a screen cannot help us evaluate the perspective and intentions of others: there is no observation of body language, no interpretation of facial expression. Emails, tweets and text messages can seem peremptory, even rude, without the face-to-face interaction we've been used to for the last hundred thousand years. It's all too easy to believe that those on the other side of the debate really are an abusive bunch of knuckle-dragging Neanderthals!
What is lacking is a better understanding of how our views change so easily and rapidly. Dr. De Meyer’s pyramid analogy provides a useful model of how people’s opinions change from weak to moderate to strong convictions – about politicians, footballers, or X Factor finalists.
Sure, having strong convictions can help us achieve fine and selfless acts, but we must also learn to control antipathy and mistrust, and try harder to understand where others are coming from – you never know, it might turn out to be the top of the same pyramid we started out from!