Spot the non-sequiturs in this story from last Thursday's Times (behind paywall):
Human beings are predisposed to believe in God and the afterlife, according to a study by academics at the University of Oxford.
The findings of a three-year, £1.9 million research project suggest that there is an inbuilt bias in the mind towards seeing the world in religious or spiritual terms. This means that public life will always have a strong religious dimension, and that religion will always have an impact on public life, the project leaders say.
“It means you cannot separate religion and public life,” said Roger Trigg, a philosophy professor and co-director of the project. Professor Trigg, from the Ian Ramsey Centre in the Theology Faculty at Oxford, said: “The mind is open to supernatural agency. There are lots of explanations. It is certainly linked to basic cognitive architecture, in other words, the way we think.”
So, it appears researchers have discovered that humans have an innate predisposition towards religious belief. I’m not qualified to assess the validity of this finding. But I’m curious as to how this leads automatically to the conclusion that ‘religion will always have an impact on public life’. How on earth did 'public life' sneak in there?
The term is, of course, taken straight from the current highly-charged debates about the place of religion in society, and this is very much a political rather than a scientific claim. One of the tropes of anti-secularist discourse has been that secularists and atheists are actively seeking to exclude religion from ‘public life’, or from something called ‘the public square’. It’s become one of those truisms for which little evidence is ever produced. As Helena Kennedy said last week about the Coalition’s constant claim that the Labour government left the economy in a terrible mess, if you repeat something often enough, people will eventually come to believe it (even in the absence of evidence and argument), and it will become part of 'common sense'. In the religion and secularism debate, we're used to a variety of wearyingly familiar tropes of this kind: so, the ‘new’ atheism is always ‘militant’, secularism is essentially ‘aggressive’, and liberalism is just as ‘fundamentalist’ as some forms of religion.
Returning to the Oxford research: As Deborah Cameron said in a recent radio debate with Simon Baron-Cohen, about supposedly ‘inbuilt’ gender differences, it’s not that the findings of neuroscience are necessarily ‘wrong’, it’s rather that neuroscientists claim too much for them. They over-ambitiously seek to draw a straight line from some aspect of our biological make-up to attitudes and activities that are deeply embedded in human culture, society and history. Listening to researchers of this ilk (and despite its theological patina, this research on religion is clearly making claims of a neuroscientific nature), you get the impression that thousands of years of history, social organisation and cultural development, not to mention philosophical reflection, count for nothing, and that our behaviour is directly determined by our genes or our brain cells, as if we lived in a laboratory rather than in complex, multi-layered human societies.
Professor Trigg almost concedes this:
He said that it was too simplistic to talk in terms of being “hard-wired” or “programmed” to believe in God, however. Environmental factors also applied, and humans were not naturally monotheistic. The supernatural instinct could manifest in polytheism or other belief systems as well.
Well, it's a relief to learn that we're not all programmed to be Christians or Muslims. However, an implied admission of the project's hubris comes later:
The research has raised philosophical questions, such as why it is that if God does exist, he makes it so difficult for humans to believe in him or her. “It is not obvious,” Professor Trigg said. “Others might say it would be an encroachment on human freedom if we were too forced to believe in God.”
It’s not clear how an academic research project, even one lasting three years and costing nearly two million pounds, ever thought it was going to solve philosophical and theological problems that have mystified humankind for millennia.
As I say, I don't feel qualified (not being a neuroscientist) to evaluate the findings of these studies, but even a humble scholar of the humanities and social sciences like me can see that there might be a problem with aspects of their methodology. For example:
One study by Emily Reed Burdett and Dr Barrett at Oxford suggested that children below the age of five found it easier to believe in some superhuman properties than to understand similar human limitations. Children were asked whether their mother would know the contents of a box into which she could not see. Those aged three believed that their mother and God would always know the contents, but by the age of four many started to understand that their mothers were not all-seeing and all-knowing while continuing to believe in an all-seeing, all-knowing supernatural agent such as God.
As every sociologist, or psychologist with an ounce of sociocultural awareness, will tell you, there's simply no way of identifying some 'instinctive' understanding of the world that precedes involvement in a social world of shared ideas and values. Even 'children under five' - especially those with the ability to understand and answer a researcher's questions - have acquired language, which comes imprinted with a mass of cultural assumptions.
This study raises as many questions as it answers. Where, we might ask, did these children derive their concept of 'God' as an 'all-seeing, all-knowing supernatural agent'? Are we supposed to believe that's 'inbuilt' too?
One of the researchers involved in this particular study continued:
This project does not set out to prove God or gods exist. Just because we find it easier to think in a particular way does not mean that it is true in fact. If we look at why religious beliefs and practices persist in societies across the world, we conclude that individuals bound by religious ties might be more likely to co-operate as societies. Interestingly, we found that religion is less likely to thrive in populations living in cities in developed nations where there is already a strong social support network.
Or, one might add, where religious ideas have been challenged by science or by competing philosophies. Again, this supposed finding ignores cultural and social differences between societies and historical periods, in its overweening attempt to identify ahistorical, decontextualised commonalities, and thus to prove the universality - and universal usefulness - of religion.
None of which is to deny the legitimate role of faith in public life. But when religion is forced to fall back on arguments about the social value of faith, rather than attempting to prove its truthfulness, and when theology turns to neuroscience to support its claims, it's a sign of weakness rather than strength, and of desperation rather than confidence.