A World Divided –
Is digital to blame?
by Nigel Barlow
We seem to be more divided than ever.
Brexiteer or adamant Remainer, for Trump or anti, #metoo or believing the guys are having a hard time of it, immigration as necessary or to be prevented, straight or gay, pro-life or for freedom to choose.
Our world has polarised into myriad pairs of opposites. Binary yes/no, fake/real and me/you thinking is prevalent everywhere.
Although we can’t say for sure that the tsunami of digitalisation is causing this dangerous splintering of viewpoints, there’s certainly a strong correlation: at the very least, digital communication amplifies and accelerates difference and opposition. As we carry on obliviously liking and disliking all we see, we may be reducing our minds to their most basic elements, an endless chain of 0s and 1s. Binary brains, with all the divisiveness, separation and fixed mindsets that inevitably follow.
Now I am not suggesting a return to analogue. The digital cat is long out of the bag, and will only continue its advance, at least while we crave convenience, low cost and immediacy above all other considerations.
What is needed is a dose of straightforward, analogue thinking for insight into how to counterbalance the excess of the polarised, binary option. It’s tearing us apart.
Think of how democratic Parliaments are supposed to work: they claim to encourage ‘debate’ (one of Tony Blair’s favourite words). This normally takes the form of one party putting forward a thesis. Perhaps a grand term for saying something like, “Doesn’t the right honourable member believe he and his government have been unprepared and incompetent when they have . . .” (add your own slip or disaster). But however crude, it is a thesis, albeit usually on the way to being rebutted.
Not surprisingly the antithesis that results is either a stone wall or counter-attack:
“No, I don’t agree. Perhaps you should consider your own appalling record on. . .”
It’s 1+1 = 0.
The Hellenic philosophers knew better. For them, the process of thesis countered by antithesis should ideally result in a third stage: synthesis. A coming together or combination of energies and ideas that is a new and useful resolution of a problem.
Westminster, the mother of Parliaments, is simply not set up for synthesising ideas. Given the shallow gene pool the UK’s leaders emerge from, it’s hardly surprising that it’s all a facsimile of an Oxford Union debate: for/against, ayes and noes. Minds are locked into their ‘not listening’ mode, while the game of thesis-antithesis is played vociferously, like a political form of Quidditch. Or a referendum, which has divided Europe.
When this bickering behaviour is televised, even though we might ‘tut, tut’ at the childish, point-scoring debates, we ingest much of this style of thinking. Grey is for wimps, so take your positions everyone!
It’s either/or, never both/and, or a stepping stone to a new position.
But democratic wrangles have been going on for centuries - what’s going digital got to do with this?
What Digital Does To Our Thinking
Digital is a child of Homo sapiens. But, like most children, it tends to grow beyond what we expected of it. It doesn’t cause polarity in thinking, but it does strongly influence it in three main ways. . .
Acceptance. We accept binary, either/or thinking more readily because in subtle ways this is the kind of thinking the new technology makes habit-forming. We could call this Facebook Thinking – everything is to be Liked or Disliked at the tap of a finger. We are so overwhelmed with the data flow that to stop and think in our everyday life of a third position where we might slide between opposites, or accept some grey, is just too much work.
The continual filling in of mandatory fields and boxes further drives our awareness into pre-determined categories. We’ve developed the habit of accepting these binary choices dozens of times a day, making it difficult to get out and solve real problems in more subtle and complex ways. This is a conceptual box we have to think our way out of, but the conditioning of the internet is all-pervasive and insidious.
Amplification. Does the internet really facilitate a global conversation? ‘Con’ means with, but the majority of online communications are more about talking at or to. The fact that millions can view a tweet instantaneously means that two-dimensional and simplistic thinking is amplified through the collective consciousness.
Our response is emotional and visceral, which is why Twitter has been described as the amygdala or fear centre of the internet. It’s an apt analogy: think of all the rage, anger and hatred that is amplified through the neutral channels of the web world. It’s divisive thinking, spread virally.
Acceleration. All is haste – news, views and information of all kinds hit our brains all day, and sometimes all night, long. Have you tried watching old movies, say from 25 to 30 years ago? Most find they are agonisingly slow, even when that’s the mood; take Death In Venice as an example. The same friend who pointed this out to me fast-forwards through a tennis player’s serve so he can get right to the moment the ball is struck. Instant, contactless transactions keep us deciding and spending in a blur of time. No wonder that one of our most precious resources, human attention, is at such a premium. Our minds may crave stillness, but our attention wavers or is staccato, stabbing into this and that option the digital universe throws at us.
Our attention has been chopped up into smaller and smaller slices by the competing and often simultaneous cries for attention from new devices and sources of information. One study calculated that to check your mobile 2,000 times a day isn’t out of the ordinary, while many Millennials sleep with their digital tools, tablets and phones.
Our shrinking attention span is reflected in song introductions being half the length they were a short generation ago, while movies hit a high drama scene sooner to stop you mentally grazing elsewhere. Publishers talk about readers ‘snacking on content,’ and we are absorbed in the web’s juicy offerings while supposedly having a meal, dinner conversation or work meeting. We’re all mildly ADHD, with our awareness split into shorter and shorter spans.
Hold on. So far this sounds like a standard rant against digital distraction. But there is a more nuanced, not necessarily either/or, good/bad position here. It’s possible to both laud the great benefits to society, science and commerce that come from a digital connection, while simultaneously being concerned about its impact on our mental ability to process complex situations.
As F. Scott Fitzgerald remarked, ‘The test of a first-rate intelligence is the ability to hold two opposed ideas in the mind at the same time and still retain the ability to function.’
Either/or thinkers are baffled by this proposition. I have a friend who is a respected professor of biology and very pro science and technology, but anti GMO. He has been vilified by his scientific colleagues, who believe he is letting the side down. “You’re either with us or against us” is the prevalent thinking here, reminiscent of George Bush’s pronouncements after 9/11. No wonder the world is dividing into opposing tribes when at the root level of cognition we seem to prefer certainties, rigid and unambiguous positions. In a messy world, the algorithm that decides your creditworthiness, whether you should go to jail or stay free, need this treatment or that, is a simplistic tool that parallels the explosion in simplistic thinking.
Solutions To A Bias For Binary
So what can be done?
To achieve the desired synthesis, the coming together of the best of both sides of an argument, we could train the whole population in synthesis-seeking.
Chances of success? Highly unlikely, as this is still working on the surface level of the intellect, similar to a student debate. You could learn the game without ever shifting from your own binary ‘I’m OK – you’re not OK’ stance. People’s viewpoints are embedded in their emotions and neurophysiology. Even in their senses, they ‘smell’ a cover up, they like to ‘touch’ what they consider to be real, and so on.
We need to go deeper. The levels of the mind that exist beyond thesis/antithesis, either/or are the more creative and connected kinds of thought. There needn’t be anything mystical about this. Steve Jobs described creativity as ‘just connecting things’, as he did successfully with integrating beauty and machines, design and technology. The best of both.
A fundamental shift of society’s mindset away from destructive and oppositional thinking to a more creative and inclusive outlook is what’s needed. After all, authoritarian leaders, isms and rigid philosophies create their power base by dividing the world into its binary elements, its noughts and ones, us and them. They are creating the mindset that fears otherness. Today, the concept of ‘other’ is alive in all forms of fundamentalism, whether religious, philosophical or political. When an American President describes a group of immigrants with alleged gang ties by saying, ‘These aren’t people - these are animals,’ it can only propagate division, fear and hatred.
What’s also needed is new ways of thinking more fluidly, accessing levels of thought that are connected and inclusive of apparent differences, deeper than the surface level of opposition and the cold certainties of digital bytes. Neuroscience is rapidly opening up our intimate understanding of how our brain’s functioning creates our ‘perceptual world’, while simultaneously verifying the benefits of older approaches to expanding the mind, such as the practice of meditation.
However we accomplish tapping into more integrated levels of thinking, where true synthesis is possible, it could be necessary for our survival. ‘Wars begin in the minds of men’ are wise words from the preamble to the UNESCO Constitution, but far too little has been done to work on the level of the mind and arrest the causes of unnecessary conflict. At the very least, we have to pursue a deepened knowledge of how to perceive otherness – unfamiliar ideas and faces – without rejection, premature judgement or stereotyping.
A practical way of visualising how we can be more integrated, non-binary thinkers is for us to become ‘Foxhogs.’ Let me explain.
The philosopher Isaiah Berlin believed he could divide thinkers into two camps: foxes and hedgehogs. The fox knows many things, including different creative strategies for avoiding the hunter. In contrast, the hedgehog knows only one thing, and in most situations - other than proximity to the motor car - this ability to roll into a protective ball is effective.
Writer Ian Leslie in his book Curious remarks how today we need to be a combination of the two styles: possessing in-depth, hedgehog-like knowledge of one particular field, together with a fox-like ability to put this information into a wider and useful context.
It works. Darwin’s fox-like familiarity with fields as diverse as demographics, botany and geology was necessary for him to make insightful connections and see the bigger picture of evolution.
In a time of narrowed specialisms, it can be hard to see out of the box of one’s own field to perceive underlying unity. The parts predominate over the whole. Consciously becoming a foxhog will help . . .
What’s also called for is more fluid thinking, which is a better way to describe a flexible view than saying it’s grey. Grey sounds both weak and dull, definitely unappealing compared with the dopamine rush we get on hearing information that confirms our prejudices. This can dig us deeper into a fixed mental trench.
To think in a more fluid way needn’t be thought of as ‘wet’; the most innovative scientific minds of the last 150 years are mostly those who saw the underlying pattern, not just the surface phenomena. They are ‘lumpers’ rather than ‘splitters.’
We hear all the time that life is becoming more uncertain, but we can in fact be increasingly certain that we will live longer and healthier lives, have less chance of a road crash or robbery than in the past, and are less likely to be caught up in war and violence, except on Netflix.
Certainty gets a very bad press from commentators and experts on creative and dialectic thinking alike. Doubt is the new cool kid on the block; certainty is for the naïve and mindless masses. However, when we introduce the concept of levels of thought, it’s possible to think in ways that allow the co-existence of two notions: firstly, that doubt is a sensible position to adopt in the absence of evidence that something is true, while at the same time knowing that there are matters we can usefully be certain about.
These include our interdependence on a small and increasingly crowded planet, our higher desire to think beyond the narrow box of our own preoccupations, our shared humanity, and our essential nature as social beings to put unity ahead of division.
Utopia? Much harder to think about than the usual future commentator’s trope, which is a lazy assumption of an impending dystopia. It takes profound, synthesising thinking to bring about a more positive future, but we do have the mental and survival instincts to make it happen if we choose it. The world may be bad, but at the same time the data shows it’s getting much better. Tasks can be happily outsourced to digital tools, but the kind of thinking needed to preserve and expand what’s best in our global village has to be taken more seriously. We are truly in a ‘wisdom race’ to develop our own minds – a kind of Moore’s Law for the brain - at a pace that parallels the insinuation of digital technology into all areas of our lives. It’s a both/and rather than either/or approach that’s clearly needed.
Above all, we need to move beyond ‘referendum thinking’. It’s a numbers game that’s a facsimile of real democracy, and only amplifies division and ranting across the debate room floor. No-one is happy with the result, not even the supposed winners.
Oxford, November 2018