If you don’t have time to read all of this – here’s a wonderful cartoon
The Paradumb Paradox
Green politics and economics are based on an understanding of ecology – the branch of biology that deals with the relations and interactions between organisms and their environment (aka territory), including the other organisms within it.
Ecology is, like climate change, a relatively new science and one little understood, or even considered, by many climate change sceptics. And ecology is intimately connected with the study of evolution – a science also rejected by some climate change contrarians.
One area of study which looks back to the very birth of humanity, but has not been greatly pursued to date, is enquiry into the emergence of consciousness and its perhaps inevitable companion: the ‘world-view’ – or, as I prefer to call it (because ‘world-view’ doesn’t sound sufficiently overwhelming), the ‘personal paradigm’ – from the Greek paradeigma: to show side by side. (Actually, I really prefer to call it The Paradumb – you’ll soon see why.)
Diagram from Kaizan
“Sometimes people hold a core belief that is very strong. When they are presented with evidence that works against that belief, the new evidence cannot be accepted. It would create a feeling that is extremely uncomfortable, called cognitive dissonance. And because it is so important to protect the core belief, they will rationalise, ignore and even deny anything that doesn’t fit in with the core belief.”
The paradigm resides at the very heart of our consciousness. Debra Yearwood sums it up nicely: “Paradigms help us to interpret, define and engage in the world around us. Without our paradigms we would constantly be struggling to determine and define what we see, what we hear, and what we should do about it. Our paradigms help us to move through our lives seamlessly.”
But I’m more interested in how it behaves, and why, than in the physiological workings – because properly understanding the Paradumb, and finding out how to convert, subvert, bypass, evolve or otherwise diminish its increasingly negative effects, is rapidly becoming critical. It turns out that while your paradigm does help to carry you (and/or your group) through a time of crisis, it can also impede rationality – to the point where it puts you in real danger.
We could debate whether the paradigm is 1) a purely personal phenomenon which has group functionality and is therefore of benefit to your gene pool, 2) a group function – a manifestation of the herd mentality which forces individuals to think and act in concert for the good of all, or 3) a manifestation of colonial hive thinking or collective consciousness.
I suspect the last, but in any event, our own framework of beliefs is certainly intertwined with what our family, close friends, colleagues and/or congregation believe. Some of us are individualists, but very few are comfortable with complete isolation. Most of us want or need the comfort of shared values and explanations – aka a group paradigm. The problems come when we need to hold two or more of these at the same time; personal beliefs at odds with the teaching of a church, say, or political convictions at odds with our family lifestyle – resulting in cognitive dissonance.
This is, we now know, the chief reason it’s so hard to change your personal paradigm: Not only may your subconscious be hard-wired to conform, you are likely to be aware that adopting beliefs different from those of people you care about and rely upon may cost you your place in your group hierarchy, or even risk ostracism. (I know I struggled with this when I first started to study climate change properly at MA level, returning from a week of back-to-back lectures in Machynlleth to family and friends who were, at the time, bemused and dubious.)
So even the most maverick thinker is bound to compare their ideas with those of the people they admire and respect. And wherever there is agreement, that idea gains weight as a belief. And the stronger the correlation, the greater the risk of confirmation bias.
Today, it is becoming apparent that the echo-chamber effect of social media (plus possible reinforcement by Google autocomplete results), is now helping to unleash a new globalised form of group paradigm, one that authoritarian religions (for all their faults) and authoritative media (for all its faults) used to keep in check: The Post Truth Society.
(This BBC programme is well worth 45 minutes of your time – it’s utterly fascinating).
At one end of the spectrum the problem can be partially explained by the Dunning–Kruger effect: Some people simply do not have sufficient intelligence to appreciate that they are not as clever as they think they are – therefore anyone using evidence or arguments they can’t fathom must be lying for some nefarious reason.
This may well explain the wackier belief systems such as Flat-Earth-ism and (since Darwin) Creationism: The science-supported theory (being ‘still’ a theory does not mean it’s not true, by the way) is too complex to grasp, so the believer clings to a simpler, more powerful narrative / fairy story which ‘works’ for them. Staggeringly, today these and similar beliefs are gaining, not losing, traction – especially among less-well-educated people who’ve been taught that their group’s religious narratives are more truthful than science, and have never had reason to move away from that position. (And here’s a great blog about flat earth and creationism.)
But we can’t just blame lack of IQ or poor education, because at the other end of the spectrum we have plenty of very clever, well-educated people suffering from the same disability. In fact education, especially self-education (something the internet both facilitates and deregulates), can often exacerbate the problem. (Perfect examples of this can be found in the deeply troubling Global Warming Policy Foundation – including one former Chancellor of the Exchequer, no less, whose understanding of climate science is more muddled than that of most ten-year-olds.)
One of the reasons that clever people wind up believing untruths is the weaponisation of language – something Trump, for example (there are many others), has become surprisingly adept at – though whether by accident or design remains to be seen:
‘Language works by activating brain structures called “frame-circuits” used to understand experience. They get stronger when we hear the activating language. Enough repetition can make them permanent, changing how we view the world.’ … [Trump is] a super salesman with an instinctive ability to manipulate thought by 1) framing first 2) repeating often, and 3) leading others to repeat his words by getting people to attack him within his own frame.’ (Guardian).
The paradumb phenomenon is not static. It can be influenced as above, but mostly it works symbiotically with the self-justification process that we all use to manage our cognitive dissonance.
This is a journey which can turn mild-mannered people into murderers, upright officers into torturers, and socially-committed politicians into tyrants. It is explained brilliantly in what is probably the most enlightening book I have ever read: Tavris and Aronson’s “Mistakes Were Made (But Not By Me)” – which explains wisely that no-one EVER thinks they are the bad guy.
This is from the Guardian review:
“when the authors need to sum up the mechanics of self-justification in one pithy line, it is a British politician, Lord Molson (1903-1991), who provides the soundbite: ‘I will look at any additional evidence to confirm the opinion to which I have already come.’
MRI scans confirm that when we are confronted with dissonant information, the reasoning areas of our brains all but shut down. And it’s not only politicians who indulge in self-justification. For which of us, on buying the more expensive appliance, has not then spent weeks kidding ourselves the cheaper model would have been unreliable or downright dangerous?
But how do we square two dissonant cognitions when one of them is the belief that we are decent people and the other is the knowledge that we have inflicted pain on an innocent victim?
Ask any kid who wallops a younger brother. “I’m decent, but I hit him,” the argument runs, “therefore he must have deserved it.” It’s the most vicious of circles. Aggression begets self-justification, which begets more aggression, and thus do the authors lead us, one small step at a time, down the road to Abu Ghraib and to all those deeds throughout the ages whose doers were never the monsters we’d prefer them to be but just decent people like us.”
And at each small step along the road towards the Big Mistake, our paradumb clicks round one more notch, to provide a new unshakable foundation upon which to transit to the next, ever more erroneous, position. (Though of course the journey from error to truth can be just as tortuous).
So today we have huge numbers of intelligent, well-educated people who cannot begin to accept climate science (here is a great talk from George Marshall, who wrote Don’t Even Think About It, on this very topic), or are blinded to other basic fields of knowledge, such as medicine (have you seen what some say about vaccines over there? sheesh), and instead cling to ever madder and madder and madder conspiracy theories.
In fact – we might even classify those religions which fly in the face of accepted science as conspiracy theories. All are manifestations of the same human trait.
At the most extreme end, we encounter political and/or religious fanaticism – which may or may not be closely linked to mental illness – manifesting in unhealthy religious practices, the election of dictators like Trump (and they usually are elected, or at least have huge popular support, at first), at least some of the support for Brexit, bug-eyed beliefs such as man never having gone to the moon or us all being poisoned by chemtrails, and – most worrying of all – the refusal to accept that the Earth is warming dangerously.
Here is a neat article offering 6 reasons why people believe in conspiracy theories.
Actually, some would separate out the religious from the political paradigm, and there is evidence suggesting that the political typically Trumps the religious (such as the Pope’s failure to convince American Catholics about climate change), but in the end it makes little difference.
We are all slaves to our paradigm(s), even when they are built on flawed information. There is in fact evidence now emerging that conservatives are more susceptible to believing provable untruths than liberals, and also, it seems, more addicted to the comfort of the paradumb panacea – but none of us is immune.
Incidentally, if you want to insure yourself against the tyranny of the paradumb, you should read two books back to back: Factfulness by Hans Rosling and family, which is an absolutely life-changing explanation of exactly how world-views manage to override truth – essential reading for every person on the planet in my opinion – and The Tiger That Isn’t: Seeing Through A World of Numbers by Dilnot and Blastland (of More or Less on BBC Radio 4).
If not challenged – something that takes evidence, confidence and clear thinking (and those books, perhaps) to achieve – the paradumb can and does frequently override both fact and logic, to the point of putting an individual or group at risk. (You could even say the scientific method emerged in the Age of Enlightenment specifically as a means of defeating the paradigm phenomenon.)
This cannot be an evolutionary advantage.
So why did the paradigm emerge, and how did it become such a central force in human behaviour?
We don’t know for sure when early humans became self-aware (and was this, intriguingly, caused by a virus?), but it’s obvious that our first attempts to understand a complex universe would have been rudimentary. So much would have been inexplicable that our developing brains would have struggled constantly to cope with new and often terrifying information – even to the point of mental illness.
Anne Applebaum expresses it very well in this article about Donald Trump and his use of conspiracy theories to win power:
“In its essence, a conspiracy theory is the modern equivalent of a myth: It’s a story that people tell to explain otherwise inexplicable events. The appeal of conspiracy theories is deep, because the human need for meaning is so profound. Why does the sun rise in the morning? Because that’s when the God Apollo rides his chariot across the sky. Why did Antonin Scalia die right now, in the middle of a presidential campaign? Because someone is pulling strings behind the scenes, trying to manipulate events.
“The human brain is designed to reject the random, the haphazard, the arbitrary. It’s just too much for many people to accept that accidents happen, planes crash, ships founder on the rocks, and elderly people die in their sleep: A dramatic event must fit into a larger narrative. For others, conspiracy theories are a useful way to explain personal failure: It wasn’t me, it was the freemasons/capitalists/Jews/immigrants/murderers of Scalia who deprived me of success.”
So, back at the dawning of human consciousness, believable explanations for the passage of sun and moon, weather, sickness, death, predators, pests and other threats would be not only desirable, but necessary.
And it seems fair to assume that those best able to tame inexplicable happenings and coax them into a coherent, comforting story – in other words, to use their imaginations to fill in the gaps in the narrative and so provide an appearance of reason – would flourish better than those who could not, thus passing their ability on to their young, and so, eventually, by natural selection, hard-wiring the paradigm into our brains.
There is even some evidence to justify going one whole step further – to suggest that the paradigm originally evolved not just as a way to provide reason – but as a system which would guarantee that the smartest person always won the argument, and therefore would survive to pass on their genes.
Mercier and Sperber suggest in “The Enigma of Reason” that narrative actually evolved to ‘prevent us from getting screwed by the other members of our group. Living in small bands of hunter-gatherers, our ancestors were primarily concerned with their social standing, and with making sure that they weren’t the ones risking their lives on the hunt while others loafed around in the cave. There was little advantage in reasoning clearly, while much was to be gained from winning arguments.‘
This might explain why facts don’t change our minds.
Actually, not only do facts frequently fail to change minds, contrary evidence can even reinforce the paradumb – via something called the Backfire Effect:
“The Misconception: When your beliefs are challenged with facts, you alter your opinions and incorporate the new information into your thinking.
The Truth: When your deepest convictions are challenged by contradictory evidence, your beliefs get stronger.”
And this has been going on ever since humans first started to think. So today, a function which once helped to ensure our survival is doing the very opposite – pressing our feet down on the accelerator, when we desperately need to be stamping on the brake.
So maybe we can say that the phenomenon we call the paradigm, and the phenomenon we call narrative, or story-telling, are no more than internal and external manifestations of the same thing: Both rely on imagination to corral perceptions into a memorable and seemingly logical ‘truth.’
It’s been suggested that the chief reason Homo sapiens succeeded was not superior eyesight, opposable thumbs, an omnivorous diet or even a larger brain (Neanderthals were as well endowed in that department, and much stronger). It was the development of language.
And some go further, to suggest that language developed specifically to facilitate the telling of stories. But how would narrative offer an evolutionary advantage?
Well, unlike other animals, who rely on showing their young what to eat, how to kill, what to avoid, how to heal, we – uniquely in the animal kingdom as far as we know – are able to share stories which impart vital survival information in a neat, memorable package. It is this memorability – propelled by the skill with which the story is told – which allows the information to be accessed by the listener long after the story-teller has departed. In fact detailed information can be passed from generation to generation, in a nuanced form which greatly enhances whatever genetic memory may exist in humans, and so provides a major advantage to the species. (Genetic memory passes instructions directly from parent to offspring – as in swallows, who never fly with their parents, yet still manage to navigate all the way to Africa without ever touching the ground.)
Stories also allow hunters and gatherers to tell others where the food and dangers are – somewhat like the bees’ waggle dance – only much much more effectively, because the information can be passed on to others, adapted, amended, and retained almost indefinitely.
It is worth mentioning that the way humans seem to have developed the story as survival strategy may have delivered one specific consequence which is now causing major problems – the binary choice.
Robert McKee, in his famous book and seminars (which I have attended) entitled ‘Story’, suggests that there is an essential narrative – to which all other stories defer – which has at its climax a simple, single choice that the protagonist must make to achieve either redemption or retribution. Whether this is a cause or a consequence of human society’s obsession with binary states (married / single, guilty / not guilty, male / female, friend / enemy, good guys / bad guys, migrant / refugee, Left / Right, Leave / Remain, etc), when plainly all of these are in reality nuanced, complex situations, I don’t know. But I do know that most people prefer to boil down their personal decision-making to a binary choice if they possibly can (it is usually essential, in fact) – and the easier politicians can make this for them, the more support they seem to get.
So our personal paradigm is, effectively, our over-arching story – the one against which all other stories and experience must be measured, before being given credence as ‘truth’ – and our paradigm adapted accordingly.
And story-telling is the very means by which we form and police a group paradigm – as is necessary for the survival and advancement of our troupe.
So the paradigm has to be, and has thus become, robust. In fact it is downright obstinate. Unless and until the evidence is overwhelming, the paradumb – which forms quickly upon the arrival of a conundrum – will Trump (sorry) any new thinking, to the point of being anti-intellectual.
And this is justified as an expression of personal rights (as in ‘free speech’):
And it’s not just Americans who do this, sadly.
In general, paradumbs are stubborn, though it does seem that some people are less wedded to theirs than others. It’s actually possible that the flexibility of the paradigm may vary according to circumstance. The perception of an advancing threat may encourage or discourage change – but I’m not myself aware of any research on this.
I’ve actually changed my own a few times as a result of major evidence. I even had a short period as a born-again evangelical Pentecostal Christian (a paradumb I toppled into from the claustrophobia of an English public school, and one that was much harder to move on from than it was to adopt). And I had to have another major rethink when I began to study climate change properly – just as these guys did.
But many seem never to alter, or even to question, their major belief system – though whether this is because of their DNA, their early upbringing, or their learned experience as an adult remains unclear.
Kings and Swingers
Desmond Morris suggests in The Naked Ape that we are a strange hybrid between a) the hairy, placid, vegetarian, largely matriarchal, herd ape that swung through ancient jungles, and b) the far-ranging, ruthless, small-family, largely patriarchal and naked omnivore which we decided to become when we emerged upright from the trees, and began to compete with the big carnivorous predators of the open plains.
Morris suggests we have never fully resolved this transition, which may explain various strange anomalies in human behaviour – not least the selfish (or even sociopathic) / altruistic conundrum, which presents differently in colonial and predator animals.
Altruism – as much of a survival strategy as selfishness – is another strange phenomenon not yet fully understood by science.
There is strong disagreement over the extent to which we are simply born ruthless, as suggested by Richard Dawkins, who promotes Kin Selection Theory in The Selfish Gene, and to what extent altruism can trump selfishness, as suggested by EO Wilson and others with Group Selection theory.
Kin Selection Theory posits that usually we support ourselves and our own progeny at the expense of all others, but are more likely to sacrifice ourselves for relatives than non-relatives. A good example would be helping your tribe by not having offspring yourself (and so reducing pressure on resources), thus contributing to the success of your genome through those with the closest genetic code to your own.
Group Selection Theory demurs.
By pure coincidence, Wilson was a guest on the BBC’s The Life Scientific on the very day I was writing this section of this blog. He confirmed Jim Al-Khalili’s summary of Group Selection as this:
“Altruism cannot be explained away as genetic greed. Selfishness beats altruism within a group, but altruistic groups will always beat selfish groups.”
If Dawkins is right, we are in big trouble (see the Tragedy of the Commons below). But if Wilson is, then there are grounds for hope – but only if the whole of mankind can be persuaded to behave as a single group, in the context of a universally recognised, finite territory, aka the global ecosystem.
I would suggest that whatever the rights and wrongs of the above theories, there does seem to be a barrier, or even a binary switch, between selfish and altruistic behaviour – with those at either end or side largely incapable of understanding the motivation of those at the other.
Only the other day I heard a lecturer refer to ‘the human condition’ – as if this was single and universal. My experience is that it’s not at all universal – in fact, huge and largely unchanging (unchangeable?) differences exist in how selfishly people behave (though obviously people do behave differently at different times and in different contexts).
It’s been said that outright conservatives believe that all humans are intrinsically bad, and so yearn for a tight ship to protect themselves (I worry that these demand too much misery in return for their comfort), whereas firm liberals think everyone is intrinsically good, and only turned bad by environmental or social factors (and I worry that these may be dangerously naive).
And never the twain shall meet. For example, I have close, life-long friends, whose intellect and experience I respect, who simply cannot begin to accept that anyone is ever motivated by anything other than selfishness. Even sainted altruists, they think, can only be behaving like that for ultimately selfish reasons; because it makes them feel good, or because they want to go to Heaven.
Likewise I know many people to whom selfishness is pure anathema, and who are equally convinced that co-operative altruism is the natural state of humanity, and even mild self-justification is de facto criminality. (George Monbiot finds, with some good references, that “We’re Not as Selfish As We Think We Are” here).
So how did this disparity originate?
Could this be because my friends are disposed towards different applications of altruism – within a group, or between groups, perhaps? – according to how they interpret their ‘group’?
If so, then socialists and progressives might envisage a very large group (how large is very variable from person to person – but anything from, say, ‘Trade Union’ to ‘All of Mankind’).
Conservatives and libertarians (as opposed to liberals), on the other hand, might think in terms of immediate family only. And of course sociopaths (and is that condition genetic or environmental, I wonder?) think only of themselves, and often not even, it would seem, of their own progeny.
I recently heard about the Israeli sociologist Aaron Antonovsky (who coined the term Salutogenesis), who interviewed concentration camp survivors after WWII. He found that even an experience as disturbing as that failed to change most people’s personal paradigm. The empty-cuppers (conservatives?) thought the Holocaust merely proved that humans were indeed monstrous, while the full-cuppers (liberals?) remained convinced that it was merely a freak blip in an essentially healthy society, caused by a few rogue individuals.
There is some intriguing research on the role of threat values in the development of political opinion. But how deep does the phenomenon go? George Lakoff has an answer:
“In the 1990s, as part of my research in the cognitive and brain sciences, I undertook to answer a question in my field: How do the various policy positions of conservatives and progressives hang together? Take conservatism: What does being against abortion have to do with being for owning guns? What does owning guns have to do with denying the reality of global warming? How does being anti-government fit with wanting a stronger military? How can you be pro-life and for the death penalty? Progressives have the opposite views. How do their views hang together?
The answer came from a realization that we tend to understand the nation metaphorically in family terms: We have founding fathers. We send our sons and daughters to war. We have homeland security. The conservative and progressive worldviews dividing our country can most readily be understood in terms of moral worldviews that are encapsulated in two very different common forms of family life: The Nurturant Parent family (progressive) and the Strict Father family (conservative).
In the strict father family, father knows best. He knows right from wrong and has the ultimate authority to make sure his children and his spouse do what he says, which is taken to be what is right. Many conservative spouses accept this worldview, uphold the father’s authority, and are strict in those realms of family life that they are in charge of. When his children disobey, it is his moral duty to punish them painfully enough so that, to avoid punishment, they will obey him (do what is right) and not just do what feels good. Through physical discipline they are supposed to become disciplined, internally strong, and able to prosper in the external world. What if they don’t prosper? That means they are not disciplined, and therefore cannot be moral, and so deserve their poverty. This reasoning shows up in conservative politics in which the poor are seen as lazy and undeserving, and the rich as deserving their wealth. Responsibility is thus taken to be personal responsibility not social responsibility. What you become is only up to you; society has nothing to do with it. You are responsible for yourself, not for others — who are responsible for themselves.”
Is it too simplistic, I wonder, to render Lakoff’s Nurturant Parent Family and Strict Father Family as simply Maternal and Paternal?
But whether we call it kin vs group, nurturant vs strict, matriarchal vs patriarchal, altruist vs selfish, progressive vs conservative, or left vs right, we seem to be looking at the same basic scale, with behaviours driven by a fixed perception of evolutionary success guided by either one system or the other.
So what would cause people to interpret their ‘altruistic unit’ differently, and how fixed is that interpretation? Is it genetic or learned? To what extent is it locked inside our personal paradigm? And what evidence or experience do we need to overcome the Backfire Effect and trigger a ‘paradigm shift’ – a change in our belief system, and therefore a shift in our behaviour – individually or in groups – along that scale?
Here are a few bold assumptions.
In terms of the ratio of donations to income, rich people often appear to be more self-justifying (‘because I worked for it,’ or ‘because I’m worth it’ as in Kin Selection theory – as explained in the Mistakes book), while poorer people often seem to be more generous (perhaps because the pooling of resources is a better strategy when you don’t have a lot – as would be required by Group Selection Theory).
There is another intriguing idea: attractive people are, research suggests, more likely to be right-wing because, regardless of wealth or privilege (which, as we’ve seen, can also draw people to the right), they’re typically treated better and so have more trouble empathising with the less fortunate. If true, it would help to explain why some otherwise charming friends are worryingly unable to think rationally about social problems. Plus, of course, evolutionary theory suggests that attractive people are more likely to breed successfully – so once again, political preference may be DNA-rooted.
But to what extent is the split between left and right based on intelligence? Certainly, the great majority of academics are left-leaning. But is this a result of a predominantly higher IQ – or a side effect of being trained to think rationally, and base their opinions on evidence?
Because it also takes brains to be a successful entrepreneur – so maybe we are talking about a different type of IQ (remember that the tests are written by academics, and dyslexics like me, for example, always come out badly).
Entrepreneurs need a risk-taking intelligence (we might call it wits), where getting bogged down in evidence can be unhelpful. To make money, you need to be able to weigh up your strategy quickly before making a deal, and then strike at what seems to be the most advantageous moment – sometimes more on a hunch and a sense of weakness in your ‘opponent’ than on hard evidence. But then, once the ink is dry on the contract, you need to be able to subvert your fears and adopt a fatalistic stance, because if you can’t, you won’t remain an entrepreneur for very long. (However, it’s worth noting that larger, well-established businesses do present as more altruistic – though they are, perhaps, not always as truly altruistic as they present.)
I’ve certainly observed that nearly all of the entrepreneurs I know (SME directors, farmers and others that I’ve heard in media reports seem to have similar traits) tend to be intelligent and well informed, but also pro-Brexit without any real idea of how it will work out, worryingly non-committal about Trump, and at best dubious about climate change. I find that these friends are usually willing to listen to my side of an argument at first, but soon get bored with nuanced evidence, and demand instead a leap of faith.
This may be a good strategy when it’s just your own money at stake, but it’s less sensible when whole communities (or, in the case of climate change, mass extinctions and pandemics – even the entire human race) are facing major damage. Interestingly, nearly all these entrepreneurs come from comparatively privileged, wealthy backgrounds – as is typically the case.
A recent study from Harvard Kennedy School and Northeastern University may help to explain this, if, as they suggest, conservatives are more likely to believe untruths.
Of course, some well-off people are very generous and philanthropic (but sometimes not until they have more money than they know what to do with, or even only for tax reasons – miaow!), and some poor people are ruthlessly selfish and/or criminal, and not just because they need to be or have not been taught right from wrong. (And are some people only poor because they are, in fact, too generous? It is, as they say, complicated.)
But in any event, there would appear to be no statistical correlation between level of income and commitment to altruism. So how to explain these different strategies?
Chimps and Champs
Jared Diamond postulates in The Third Chimpanzee that it is all to do with the availability of new territory and resources, into which to expand your tribe – or lack of it.
Diamond noticed that the (largely patriarchal) chimps were free to range and ‘capture’ new territory, whereas the (largely matriarchal) bonobos had become trapped in a niche between the Congo river and the mountains. The chimps, with plenty of land to fight over and flee to, had become war-like and even cannibalistic, whereas the bonobos had simply had to learn to co-operate – because it was better than internecine war and the likely collapse of the breeding population.
He suggests that when there’s new territory available, the chimpanzee’s aggressive, masculine approach seems to work best for us – as in the USA’s ‘Go West’ mentality (aka Kin Selection behaviour). When there is not, the bonobos’ feminine, co-operative strategy can stop us all fighting in selfish lumps – as in the UK in 1939 (aka Group Selection behaviour).
So does the difference come down to something as simple as a patriarchal or matriarchal system? It might – though history is littered with almost as many (kin-selecting?) ruthless women as (group-selecting?) male tyrants.
In either event, the availability of new territory and resources seems to be an underlying imperative.
Naomi Klein, author of the highly recommended This Changes Everything, said recently, prior to a visit to Australia:
“Climate denial is pervasive in English-speaking countries such as Australia, Canada, the US and the UK because of a colonial settler mentality. Countries founded on a powerful frontier mentality have this idea of limitless nature that can be endlessly extracted. Climate change is threatening to that because there are limits and you have to respect those limits. Where that frontier narrative is strongest is where denialism is strongest.”
The UK would not, one would think, fit the frontier descriptor – though perhaps we could blame Thatcherism for the similarity. But the theory is compelling for the others.
Could these countries have self-selected as chimpish (because the entrepreneurial types were the most likely to emigrate), while the stay-at-homes in the old European countries became ever more bonobo-like? In which case, could these traits have now become at least partially genetic – or are they purely learned?
Humans are generally thought to be more like our war-mongering cousin than our touchy-feely one. But actually we do display both behaviours – and the split is not (at least, I don’t think it can be) a purely male-centric / female-centric thing.
Recently the University of Nebraska published some fascinating research, which suggests that our political bias may be genetic, rather than learned. If so, could this be because our selfish/altruistic bias is also genetic?
On the other hand, new evidence from Berkeley suggests that selfishness may be learned, because, it seems, the more treasure we amass, the less altruistic we tend to be – and the more we self-justify our advantage. Danny Dorling explores similar evidence in Inequality and the 1%, as do Kate Pickett and Richard Wilkinson in The Spirit Level – but the most convincing explanations come from the aforementioned Mistakes Were Made (But Not by Me).
If true, this would also make sense in evolutionary terms, and could be applied to both individuals and family groups (who benefit from the aggressor’s wealth – directly and by Kin Selection).
It’s even been suggested that the various traits may emanate from different hominid ancestors – which diverged (maybe proto chimp and proto bonobo?), adapted to new survival strategies, and then interbred again. How you behave could be just a matter of which code is predominant in your genes, so that only very strong environmental or intellectual stimuli can change your view (which might then, of course, tend to revert as soon as the pressure abated).
If so, this might help to explain the sociopathic tendencies of some company directors (a self-selecting ruthless largely male group if ever there was one), who seem genuinely incapable of fathoming the damage that their business models inflict on people and/or environment – regardless of the weight of evidence placed before them.
So maybe we can summarise all this by saying that there seem to be three axes upon which we can plot the forces which define our paradigms and position in the Kin / Group selection model:
- Intelligence (which may in fact have three termini; ‘Dunning-Kruger’ stupidity, academic capacity, and entrepreneurial ‘wits’)
- A preference for evidence against a preference for an easy-to-believe story
- Genetic predisposition (which may or may not include attractiveness and/or ancient bonobo/chimp DNA).
And there may be three drivers which can change our beliefs, and therefore our position on that graph:
- Education, with movement towards academic capability or entrepreneurial wit depending on the nature and quality of that education
- Experience, with evidence demanding movement in one direction, and mystery or a personal crisis pushing in the other
- Wealth/comfort, which can insulate against all of the above in either direction, according to better or worse circumstances.
And most movement is slow, because of cognitive dissonance, self-justification, peer pressure etc. – but Damascene conversions (paradigm shifts) may occur when external stimuli are strong enough.
So, we might ask, to what extent are our own conclusions concerning climate change being fixed by our personal paradigm?
Well, I’m willing to accept that mine must be (as you’ve no doubt spotted, this whole blog is no more than an exposition of the Tom Bliss Paradumb), but I do hope that I’m doing my best constantly to challenge my beliefs, and to remain open to new opinion and evidence. (I even read the ‘denialist’ blogs on a regular basis, just in case!)
So. To what extent are your views being fixed and/or skewed by your own paradumb? It’s not an easy question to answer, is it?
I do think one thing is for certain, though: the altruism / selfishness / (sociopathy?) phenomenon is going to present major challenges when it comes to fitting 14 billion people onto one small planet – a large part of which may well be, by then, a very hostile environment indeed.
Territory and Growth
As we saw in The Urbal Fix, modern human society has now developed to a point where it may already have reached ‘Overshoot’ – exceeding a range of planetary boundaries which define the carrying capacity of the Earth, and so risking ecological collapse.
This act of collective self-destruction is often presented as a Tragedy of the Commons – an ecological term which describes a situation where individual members of a group, acting independently and rationally (according to each one’s self-interest), actually damage the interests of the group as a whole – and thus of the individuals – by depleting a common resource.
The concept derives from William Forster Lloyd, who noted that if one commoner was to increase his herd on the common grazing land he would reap a benefit. But if too many other commoners did the same thing, the land would soon become overgrazed, and would eventually be destroyed.
Planet Earth is our ‘common’, and we are overgrazing it. In fact we are figuratively and literally running out of world, and have entered a sort of compression zone which pushes up against the absolute limits of the biosphere. Witness the increasingly dangerous efforts to extract resources from the deep ocean and the Arctic, and the accelerating destruction of rainforest and depletion of water tables. (It has even been suggested that the current turmoil in the Middle East and Africa has its roots in climate change as well as in dwindling fossil fuel reserves.)
It was not always so. Once, as with all animals, the root of all human prosperity was territory – of which there was plenty – which could be captured and held.
From this territory could be extracted produce (as long as the local ecosystem services were healthy), raw materials (until they ran out) and rent (as long as there were people who could afford it). And upon the trading of these the economy developed. We called it classical economics – and the greatest growth could be derived, not surprisingly, from land speculation. It still can, for now – as Mark Twain said, ‘Buy land – they’re not making it any more’.
There were obvious benefits; wealth for more people, improvements in health and living standards, increased consumer choice and much else – but it was all predicated on a system which literally took no account of the size or fragility of the planet.
Originally, of course, growth was slow – almost imperceptible in modern terms – and only a tiny percentage of the population was wealthy. But with the creation of commercial banking in the 18th century, the extraction of fossil fuels could begin in earnest – and the exploitation of coal, then oil, then gas (along with heavy and rare earth metals) quickly became exponential, along with the concomitant consumption – and pollution.
Then, about thirty years ago, the authorities started to realise that the economy was not growing as quickly as they believed it should (or as quickly, perhaps, as a growing world population, educated by a growing global media, demanded).
The capitalist system had always required money to be made, without effort, from money. This is justified as fair recompense to the lender by the borrower for lack of access to his treasure. But this was becoming increasingly difficult, because of increasing pressure from the compression zone mentioned above. Sadly, few realised this (though the writing was on the wall, not least in Limits to Growth and Silent Spring).
Most believed, as they still do, that the by-then Gordian trading systems merely needed axing. So instead of reining in consumption to a sustainable level, and starting to rectify environmental damage to ensure the future viability of human society within the global ecosystem, they decided to break the link between wealth and territory.
The constraints which protected necessary and benign debts (like buying a house, health insurance or a reasonable pension) were removed, and gambling replaced territory as the main source of growth. It was known as neoliberalism.
The gold/dollar standard was abandoned, and fractional reserve banking was unleashed, while intrinsically unsafe extraction, destructive agricultural practices and polluting industrial development were – behind cosmetic environmental laws – given the (un)green light.
It worked – in the short term. Consumption boomed – but so did the cost, and not just to the environment. Gradually it became clear that ‘consumer choice’ is not actually a good thing.
In fact, as Dan O’Neill explains in Enough is Enough, after a short gratification period it mostly makes people less happy and less healthy, and the longer-term damage to consumer communities (health and wellbeing problems, criminal and social dysfunction etc) can be severe – as can the damage to producer communities, many of whom in the Global South pay a terrible price for our ‘Westernised’ lifestyles.
So by the turn of the 21st century, and especially after the financial crisis of 2008 (triggered, predictably, by land speculation expressed as unsupportable mortgages), the system was in crisis. And none of the supposed solutions – from left, centre or right – was heading anywhere near the right direction.
The old political system had proved it simply could not deliver the necessary change, and there was, and remains, an urgent need for a new ‘green’ economic model – one which could deliver self-advancement without causing a tragedy of the commons.
There are, of course, other behavioural triggers which go back to early human evolution but may have now outlived their usefulness as survival strategies.
Much of what drives consumerism goes back to our hunter (‘boy’s toys’) / gatherer (‘retail therapy’) origins, so these urges will surely need to be satiated – but in ways that don’t create the terminal damage we are seeing today.
Another semi-conscious driver is the biophobia / biophilia spectrum (as explored by David Orr in Earth in Mind).
Different humans have very different attitudes to nature – Woody Allen is a psychotic biophobe, for example – but most of us find flowers attractive – supposedly because they inform us that later they will become fruit.
Most find the seaside fascinating, possibly – as Alister Hardy suggested in 1960 – because we spent some time in our evolution as semi-aquatic creatures (David Attenborough made two very interesting programmes about this in 2016). This is disputed by many scientists (more here), but it’s fairly well-established that humans have frequently lived close to the littoral zone (especially when the sea is warmer than the air), so we may have acquired a genetic attraction to this rich food source.
And then – an appreciation of rural views is explained as offering a good prospect for hunting, and so on. (There is also some very interesting work on why men and women navigate differently – but that’s for another time).
But of course nature can also be capricious, dangerous and damaging. It can snatch away vital resources in a trice, and decimate a tribe. So as the human paradigm evolved, so did a need to protect beneficial nature (mainly food), to separate from harmful nature (pests, predators and poisons), and eventually in modern man a strong desire to tame or even defeat nature.
I was not sure whether to be amused or appalled when public consultation over the establishment of a stunning rare wild-flower meadow produced an outraged response that it was ‘all just long grass and weeds – and it made my dog wet.’ Many house-buyers immediately set about cutting down trees and bushes around their new property, and often pave their new garden too. Millions of ‘metrosexuals’ adore urban environments, consume their countryside in antiseptic packages – and, more importantly, completely fail to use their votes on behalf of the nature upon which their very existence depends.
The extent to which this is genetic or learned is unclear, but our attitude to nature would certainly seem to form a large part of our paradigms, and as such, it must represent another major ‘stickage’ which we will certainly need to address – if we are to learn again to live in ‘sufficient’ harmony with the biosphere.