Field of Science

Give me the child and I'll give you the adult

The Pew Research Center has a new survey out today looking into the reasons Americans give for switching in and out of religion (and between sects). Apart from the fact that there's a lot of switching going on, there's nothing particularly earth-shattering in it.

Apparently the main reason Americans give up on religion is that they just do. Well, really!

People also tend to make these life choices early on. But here's something interesting that the Pew Center doesn't pick up on.

Look at the ages of people who join the 'unaffiliated' group, but who were raised either Protestant or Catholic. More than 70% made the switch before they reached 25.

Now look at the people who were raised unaffiliated, but joined a faith group. Only 56% did that before they reached 25. In other words, they tend to be a bit older.

Switching between Protestant and Catholic tends to happen at still older ages. That's understandable, because it requires a smaller revolution in thought.

Why the difference between switching in and out of the unaffiliated group? It might reflect Terror Management, i.e. that as you get closer to the end of life, you start to get a bit more fearful of death.

Or it might simply be that you tend to decide the major themes of your life quite early on, and they become entrenched.

Regardless, these data suggest that there's quite a small window for secular ideas to take root. This fits with other data suggesting a strong 'cohort effect' in religious belief.

Which is why the new drive in the UK to bring atheism to school kids makes sense. The Jesuits would no doubt approve!

_______________________________________________________________________________________
This work by Tom Rees is licensed under a Creative Commons Attribution-Share Alike 2.0 UK: England & Wales License.

Purity and supersense unites Christians but divides liberals from conservatives

Jon Haidt, a research psychologist at the University of Virginia, reckons that he's put his finger on the root of differences in moral attitudes between liberals and conservatives in the US.

Buried in the data (although Haidt doesn't pick up on it) there's a little nugget to be teased out about the differences between believers and non-believers, which is this: Christians really differ from non-religionists in just one facet of morality (purity/sanctity).

The purity moral factor is a strange one, but it's closely related to what Bruce Hood calls Supersense.

So, from the beginning. According to Haidt, morality can be divided into five factors, but liberals only care about two of them:

In every sample we've looked at, in the United States, in other Western countries, and even among our Latin American and Eastern Asian respondents, we find that people who self-identify as liberals endorse moral values and statements related to the two individualizing foundations primarily, whereas self-described conservatives endorse values and statements related to all five foundations. It seems that the moral domain encompasses more for conservatives—it's not just about Gilligan's care and Kohlberg's justice. It's also about Durkheim's issues of loyalty to the group, respect for authority, and sacredness.

Here's an example of what he found in one internet-based study conducted at Project Implicit. He asked participants whether they agreed with statements relating to the five factors.

You can see that as you move across the political spectrum from liberal to conservative, people become less willing to endorse the values of avoiding harm and being fair, and more willing to endorse ingroup loyalty, authority, and purity.

Now, to my mind this doesn't really support Haidt's conclusions. Far from conservatives endorsing all five factors of morality, there seems to be a trade-off.

But where it gets really interesting is in a study of church sermons. The researchers did a word-analysis of sermons from liberal and conservative churches, and found a similar pattern.
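
The kind of dictionary-based word count involved can be sketched in a few lines. This is only a toy illustration: the mini-lexicon and the sermon snippet below are invented, and the real study used a much larger moral-foundations word list.

```python
import re
from collections import Counter

# Invented mini-lexicon: a handful of cue words per moral foundation.
FOUNDATION_WORDS = {
    "harm":      {"suffer", "cruel", "protect", "care"},
    "fairness":  {"justice", "equal", "rights", "fair"},
    "ingroup":   {"loyal", "nation", "family", "together"},
    "authority": {"obey", "duty", "respect", "tradition"},
    "purity":    {"pure", "sacred", "sin", "clean"},
}

def foundation_profile(text):
    """Count how often words from each moral foundation appear in a text."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter()
    for foundation, vocab in FOUNDATION_WORDS.items():
        counts[foundation] = sum(1 for w in words if w in vocab)
    return counts

sermon = "We must obey God's sacred law, respect tradition, and keep our hearts pure."
print(foundation_profile(sermon))
# In this snippet, authority words (obey, respect, tradition) and purity
# words (sacred, pure) dominate; harm and fairness words are absent.
```

Run over a large corpus of sermons, counts like these give each church a rough moral-foundations profile that can be compared across the liberal-conservative divide.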

The results were broadly similar to the general population - conservatives valued fairness less and actually dismissed the importance of avoiding harm.

But, based on the results in the general population you'd expect a big difference in purity, whereas in fact the two are pretty similar.

Which means that the differences seen in the general population are due to the non-religionists. They have fewer hangups about purity.

Now, this aspect of morality is essentially a superstition (a la Supersense). So this study holds out the prospect that weaning people away from Churches might actually decrease this type of superstition (or at least the hold it has on the political scene).


PS. If you're interested in Haidt's study, he gave a presentation on it last year at TED.
_____________________________________________________________________________________
ResearchBlogging.org
Graham, J., Haidt, J., & Nosek, B. (2009). Liberals and conservatives rely on different sets of moral foundations. Journal of Personality and Social Psychology, 96 (5), 1029-1046.


Atheists are unhappy? Really?

Most studies looking at the demography of religion take a broad look - religion as a continuum from the highly devout to the stridently atheist. They tend to find that the more religious people are, the happier they are.

But it's pretty hard to gauge what that actually means for atheists. Atheists are different from religious people with doubts. Also, many atheists in religious countries find themselves marginalised and excluded.

Luke Galen, a psychologist at Grand Valley State University, has recently completed a study comparing members of the Center For Inquiry (CFI) in Michigan with local churchgoers. He gave a talk about it to CFI Michigan, which you can find as a podcast, with slides, over at Reasonable Doubts.

The full thing is well worth a listen (if you have a spare hour, as I did on a flight earlier this week!). But for the time pressured, here's what I think is the most interesting take-home.

Atheists in the USA are a pretty reviled group, and perhaps this social exclusion contributes to their reported unhappiness. However, Galen found that, on average, CFI members were as happy as the Church members, and in fact probably somewhat happier than average.

What makes Galen's study interesting is that the members of CFI, although mostly atheists, have a social group to give them a bit of positive affirmation and group lovin' (if they want it!).

Another key result is shown in the figure. The people with the highest emotional stability are those at either extreme of the belief scale. The people with the problem are those in the middle. So it seems that, in this group at least, confident non-believers who have a like-minded peer group have similar emotional characteristics to the confident believers.

In other words, it's not the belief or non-belief that counts, it's personal conviction and social recognition that contributes to happiness.

Galen found a number of other interesting differences between the two groups. Compared with the Churchgoers, CFI members were:
  • More open to experience
  • Less agreeable (i.e. more independent minded, more likely to argue their case and less likely to accept another's views).
  • Better educated
  • Less conscientious.
Now, what I wonder, looking at all this, is whether you'd get the same results in a country like the UK, where atheism is far more acceptable and religious attendance is considered somewhat socially abnormal. I'd be willing to bet that atheism is not strongly linked to low agreeableness, for a start.

Anthropomorphic gods turn religious transgressions into moral outrage

The previous post took a look at a recent brain scanning study which found that Orthodox Christians tend to relate to their God in an unorthodox way - pretty much as they would to another human being.

Here's another study that tries to puzzle out some of the implications of that. What Carey Morewedge (Carnegie Mellon University) and Michael Clear (Harvard University) wanted to know was this: how does your view of the nature of God affect how you judge the breaking of religious rules?

What they did was take a group of 43 Christian students at Harvard and measure the extent to which their view of god agreed with the standard theological line (“God can occupy space without in any way distorting it”, “God can do any number of things at the same time”, “God knows everything”, “God can read minds”).

They also measured the degree to which these students thought about God in anthropomorphic terms (i.e. accepting, caring, comforting, controlling, forgiving, judging, loving, responsive, and wrathful, but not impersonal, distant or unavailable, which were reverse-scored).

Then they presented them with a number of little stories in which people broke one of the ten commandments. Some of the stories depicted acts that are immoral by most people's standards:

“Molly entered a department store with the hope of buying a new watch. When she realized that she did not have enough money to get the watch she wanted, Molly placed it in her purse and walked out of the store undetected.”

While others were purely religious transgressions:

“One day, on his way home from work, Sam got stuck in a large traffic jam. As he attempted to change lanes, the car behind him pulled out quickly and cut him off. Sam exclaimed loudly, “God!” and proceeded to wait for another opening in the traffic.”

When they asked whether these stories were wrong from a religious perspective, they found that the concept of God that people held (theological or anthropomorphic) didn't matter. Whichever scale they looked at, the higher the subjects scored, the more likely they were to judge the people in the stories as having broken religious rules.

They got a different result when they asked if the stories illustrated an immoral act. Judgements of immorality were pretty much entirely driven by the degree to which the subjects thought of God in an anthropomorphic way.

In other words, students who believed in a distant, impersonal god did not judge breaches of the ten commandments to be morally wrong, even though they knew those acts broke their religious rules.

Explaining these results is pretty tough. Morewedge and Clear suggest that it might be because believers in an anthropomorphic god may react to religious transgressions in the same way they might if a person is harmed.

We have an inbuilt predisposition to look unfavourably on actions that harm somebody, even unintentionally. If you think of God as a person, a friend and an ally, then perhaps insulting God (by breaking God's rules) takes on moral overtones.

_____________________________________________________________________________________

ResearchBlogging.org
Morewedge, C., & Clear, M. (2008). Anthropomorphic God Concepts Engender Moral Judgment. Social Cognition, 26 (2), 182-189. DOI: 10.1521/soco.2008.26.2.182

The difference between God, Santa Claus, and Humpty Dumpty is all in the mind

Here's a new study of brain activity of Christians who were either praying (i.e. a personal prayer), reciting the Lord's Prayer, reciting a nursery rhyme, or making wishes to Santa Claus.

What they found was that, unlike the other activities, personal prayer lit up sections of the brain that seem to be connected with normal social interactions. It activated so-called 'theory of mind' processing.

In the graphic shown, they're looking specifically at a region of the brain (the temporo-parietal junction) that is thought to be involved in figuring out why people are behaving the way they are.

In other words, these Christians (all members of a rather hardcore Danish Lutheran sect) seem to believe in a God who is like a good friend, with whom you can have a conversation, rather than a kind of disembodied, primal force.

All this won't be terribly surprising to cognitive anthropologists, or to anyone who has made a close study of western religion as it is actually practised. But it will probably surprise theologians, who insist that that is absolutely not what God is - and take people like Dawkins to task for their naivety in suggesting otherwise.

As the study authors explain:

This finding ... offers important insights to the study of theology, in which Christian doctrine on God’s nature includes abstract concepts like God’s omnipresence, omniscience, omnipotence, the Trinity and the Holy Spirit. Interestingly, in terms of brain function, our results suggest that the Inner Mission [that's the name of the Christian sect] participants mainly think of God as a person, rather than as an abstract entity.

In fact, this brain imaging study confirms an analysis from earlier this year showing that people very much treat prayer like a conversation with a friend. And it also confirms another recent imaging study, which found that other aspects of religious thinking also press into action fairly standard brain circuitry dealing with normal, real world interactions.

This new study has some rather interesting details, however. In addition to looking at what happens when these Christians engage in personal prayer, the researchers also looked at the results of ritual prayer.

They found that reciting the Lord's Prayer was pretty much the same as reciting a nursery rhyme. And remember that these were highly orthodox Christians, who take the Lord's Prayer very seriously and recite it regularly.

The Rev Dr John Polkinghorne has recently opined that the sense of wonder that scientists feel (on good days) is "in a sense, an act of worship". I rather doubt that, based on these results!

And what about poor old Santa Claus? Well, Justin Barrett argues that even fervent belief in St Nick is not the same as believing in God (basically on the somewhat tendentious grounds that adults don't believe).

This study found that these Christians were completely unmoved by making wishes to Santa. Perhaps unsurprising, given that he is a false god and all that!

_____________________________________________________________________________________

ResearchBlogging.org
Schjoedt, U., Stodkilde-Jorgensen, H., Geertz, A., & Roepstorff, A. (2009). Highly religious participants recruit areas of social cognition in personal prayer. Social Cognitive and Affective Neuroscience. DOI: 10.1093/scan/nsn050

How magic-based medicine thrives

Why is the world overflowing with quack treatments? The list is endless, and runs the full gamut from folk remedies for colds to exorcisms for mental illness. What they all have in common is that they are stubbornly popular, despite the fact that they don't work or are even harmful.

Well here's a thought: perhaps they are popular because they don't work. Sound ridiculous? Then read on…

New out in PLoS ONE is a model of the uptake of medical treatments, both effective and ineffective, built by a multidisciplinary team: evolutionary biologists Mark Tanaka (University of New South Wales) and Kevin Laland (University of St Andrews), and anthropologist Jeremy Kendal.

They started from two basic assumptions: that people start using new treatments (both effective and ineffective) when they see other people using them, and that they stop using treatments if their disease drags on. Crucially, the model creators assume that people have no direct way of telling whether a treatment is effective or not.

What happens next depends on the other assumptions. The simplest case is for illnesses from which people recover naturally and don't relapse. In this scenario, ineffective, magic-based medicine is in fact highly likely to spread.

Why? Simply because the disease lasts longer if the treatment doesn't work, and that means more opportunities for others to learn about it and pick it up themselves.
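
That mechanism can be made concrete with a back-of-the-envelope calculation. The recovery and contact numbers below are illustrative, not taken from the paper:

```python
def expected_demonstrations(recovery_prob_per_day, contacts_per_day):
    """A sick person 'demonstrates' their treatment to onlookers each day
    until they recover. With a constant daily recovery probability p, the
    expected illness duration is 1/p days (a geometric distribution), so
    the expected number of demonstrations is contacts_per_day / p."""
    return contacts_per_day / recovery_prob_per_day

natural_recovery = 0.10  # chance of recovering spontaneously each day
treatment_boost  = 0.20  # extra daily recovery chance from a genuine cure
contacts         = 2     # people who observe the patient each day

effective   = expected_demonstrations(natural_recovery + treatment_boost, contacts)
ineffective = expected_demonstrations(natural_recovery, contacts)

print(f"genuine cure:     {effective:.1f} demonstrations")   # 6.7
print(f"ineffective cure: {ineffective:.1f} demonstrations") # 20.0
```

All else being equal, the ineffective treatment gets three times as many chances to be copied, simply because its users stay visibly ill for longer.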

Now, there is a countervailing effect, of course. Diseases that drag on are likely to trigger people to abandon their treatment. Which effect wins (picking up ineffective treatments, or abandoning treatments as the disease drags on) depends on the assumptions you feed into the model.

Illnesses with a quick spontaneous recovery and people who are reluctant to abandon treatments are fertile territory for magical cures. One implication is that people and societies that are highly conservative (meaning that they are resistant to change) will find themselves lumbered with useless medicines.

If you complicate the model, then the prospects for effective treatments get better. For example, if people can learn about treatments from those who have recovered, then the advantage that magical medicine gets from prolonging disease drops.

And if the disease can recur, then the chances of transmitting effective, rather than ineffective, treatments go up again.

So this model makes some pretty concrete predictions. Magic-based medicine will be particularly popular for treating diseases that clear up by themselves, that rarely recur, and where people turn for advice mainly to those who have the disease now, rather than to those who had it at some time in the past. And it will also be prevalent in societies where tradition has a powerful hold.

Is this prediction valid? Well, my totally subjective opinion is that, in Western society at least, most purveyors of quack medicine target people with chronic, non-life threatening conditions. A quick shufty at WorldHealthCenter reveals that the top 5 products all target chronic digestive problems or fear of heavy metal poisoning.

Not really what this model predicts. But this probably reflects the prevalence of these health concerns and the prophylactic nature of the treatment, rather than a predilection for magical cures.

What's more, there is another powerful factor at work here – prestige bias (i.e. we tend to defer to the opinion of doctors, especially for serious conditions). So proper testing of the model will take some real investigative work.

There's a nagging question though. What about that assumption made right up front – that people adopt treatments regardless of their effectiveness? Common sense says that's not going to happen – people are obviously going to prefer the treatments that appear to be effective.

Except it's not that simple to work out which treatments work, and which don't. Modern medicine relies on large, carefully controlled trials to sort the wheat from the chaff – and even then it can take trials recruiting hundreds or even thousands of patients to detect the signal through the statistical noise.
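
To get a feel for the numbers, here is the standard normal-approximation sample-size formula for comparing two proportions. The recovery rates used are made up for illustration:

```python
import math

def n_per_arm(p_control, p_treatment, z_alpha=1.96, z_power=0.84):
    """Patients needed per arm for roughly 80% power at two-sided
    alpha = 0.05, using the normal approximation for two proportions."""
    variance = p_control * (1 - p_control) + p_treatment * (1 - p_treatment)
    effect = p_treatment - p_control
    return math.ceil((z_alpha + z_power) ** 2 * variance / effect ** 2)

# Even a treatment that lifts recovery from 50% to 60% -- a substantial
# effect by the standards of many real drugs -- needs hundreds of patients:
print(n_per_arm(0.50, 0.60))  # 385 per arm, ~770 in total
```

An individual watching a handful of neighbours recover (or not) is sampling far below that threshold, which is why casual observation is so poor at separating effective treatments from magical ones.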

Your average person – no matter how smart or how diligent – hasn't got a hope. In some ways, it's a miracle that any traditional remedies are effective at all!
_____________________________________________________________________________________

ResearchBlogging.org
Tanaka, M., Kendal, J., & Laland, K. (2009). From Traditional Medicine to Witchcraft: Why Medical Treatments Are Not Always Efficacious. PLoS ONE, 4 (4). DOI: 10.1371/journal.pone.0005192


The problem with studies on the social effects of religion...

One of the many newspaper columns published over the weekend was this one, in The Miami Herald, on the alleged beneficial effects of religion. Most of it was drawn from the work of Mike McCullough, a psychologist at the University of Miami.

McCullough's research suggests that religious people of all faiths, by sizable margins, do better in school, live longer, have more satisfying marriages and are generally happier than their nonbelieving peers.

Yes well, that's all true enough, but does it justify the claim that religion causes all these effects? In fact, the evidence is surprisingly weak. Take, for example, McCullough's recent study, quoted in the article:

In the Journal of Drug Issues, he reported that in neighborhoods plagued by alcoholism, church attendance helps more than Alcoholics Anonymous.

In fact, what they showed was that women who say that they go to Church once a week (or more) also say that they rarely binge drink. This doesn't really prove that Church attendance reduces binge drinking.

But this study does very nicely exemplify all of the problems that bedevil research into the social effects of religion (all of which McCullough acknowledges):

What people say they do and what they actually do are different things.
Most studies into the effects of religion rely on self-reported behaviour. The subjects fill in a questionnaire reporting how often they go to church, and (in this case) how often they get drunk.

But we know that how people want to see themselves strongly influences what they put in these sorts of surveys. How many people are going to put down that they both go to Church regularly and get drunk regularly, when they know that the two together are highly socially unacceptable?

Correlation is not causation.
Religious people are different from non-religious people. Hard drinkers are likely to avoid going to Church for a variety of reasons. Almost all studies are purely cross-sectional. That is, they look at people at a single point in time. But that really proves nothing, especially if you don't control for personality type.

Even longitudinal studies are suspect. Suppose it works like this. A heavy drinker decides to turn her life around. She starts going to church (because, in the US, that's the premier source of support networks), and with the help of her new friends, she turns her life around. Is this really a story of religion causing temperance?

There's no such thing as 'religion'.
OK, this will probably come as a surprise to many. But religion is a nebulous concept, and the reason is that it's actually a label applied to a bunch of different things - most notably participating in ritual activities, and a variety of supernatural beliefs.

Mixing the two up should be verboten. And yet that is exactly what McCullough does in this study. He begins by theorizing about why religion should reduce binge drinking - because it contravenes religious beliefs in a variety of ways. He then goes on to look at how church attendance, not religious belief, is linked to less drinking. There are lots of reasons why people go to church - and only some of them have anything to do with religious beliefs.

So is there any practical way of untangling this sticky mess? I think there is, and there is a limited amount of data out there that's highly suggestive.

Firstly, if you want to make the arguments that McCullough is making, that adopting a religion helps people who would otherwise get into trouble, then you really need to do an interventional study.

This means getting a group of people and giving half of them religious instruction, and the other half some secular alternative - like engagement in a support group. Amazingly, given the amount of money spent on religion, and the widespread belief that it is effective, these sorts of studies are almost never done!

However, there are two recent examples: a study that looked at whether virginity pledges were effective, and another that looked at whether spiritual guidance helps drug addicts. Neither showed any benefit.

Secondly, you could do research in populations where the religious are in the minority. That would help sort out whether it's just a socialization effect. In other words, people who want to conform and have the willpower to participate in wider society will turn to religious groups in a religious society, and non-religious groups in a secular one.

This also is an under-researched area (there just isn't the interest in non-religious countries). But one recent study, in Scottish teens who were mostly non-religious, found that religiosity did not affect sexual behaviour.

And finally, you can get a reality check on whether encouraging religion is really a useful way to focus society's energy by looking at non-religious countries. Across a wide range of outcomes, less religious countries are happier, healthier and more secure than religious ones.

If you want to make the world a better place, then worrying about religion is not the place to start.

_____________________________________________________________________________________

ResearchBlogging.org
Hill, T., & McCullough, M. (2008). Religious Involvement and the Intoxication Trajectories of Low Income Urban Women. Journal of Drug Issues, 38 (3).

Doing what you're told: how ritual behaviour and beliefs can be inherited

Jesse Bering wrote recently of how children soak up the opinions of those that they regard as reliable, and treat them as fact. He was talking about the work of Paul Harris at Harvard and Melissa Koenig at the University of Minnesota, but it put me in mind of another experiment on imitation that I've been meaning to blog about for a while, but never got round to.

First the paper on imitation. Now you might think that human infants, being smarter than chimpanzee infants, would be much more willing to figure things out for themselves and not just blindly copy what they're shown. In fact, the opposite is true.

This fact has been known for a long time, and was widely assumed to be a social effect - young kids just wanting to please adults. What Frank Keil (Yale University) showed was that in fact it's because they genuinely believe that what they are copying is essential to the task at hand - even if it seems ridiculous.

What they did was set up simple puzzles, like the one pictured. Then they showed the kids how to open it, using a mix of relevant actions and irrelevant, 'magical' ones (like pushing the rod with a wand, rather than pulling it out with their hand).

Not only did the children copy the adults faithfully, but they persisted even when they were told that some of the actions were irrelevant, and even after they thought the experiment was over. In fact, the only thing that could shake their conviction was physically separating the magical action from the puzzle box (young kids have a built-in predisposition to think that causally connected objects must be physically connected).

What Keil concludes is that children have a built-in tendency to assume that whatever adults do must be sensible and necessary, even if they can't figure out why. This is probably down to the fact that kids have to survive in an enormously complex human culture, in which things are often done for reasons that are not readily apparent.

But the consequence is that, once aberrant behaviour creeps in (perhaps in a manner similar to that of Skinner's pigeons), it can be incredibly difficult to shake.

So what of Harris' study? Well, what he showed was that young kids are able to estimate the reliability of adults as informants. What's more, they are then highly likely to believe what these trusted adults tell them.

What these results mean is that the religious beliefs and behaviours of kids (and possibly adults) have a lot more to do with culture, and a lot less to do with some kind of innate psychological predisposition, than is often claimed.

In a later paper, Harris takes aim at the idea that we are 'born to believe'. Although he concedes that there is some evidence for this, he thinks that the power of testimony in forging the world-view of even the very young has been underestimated.

_____________________________________________________________________________________
ResearchBlogging.org
Lyons, D., Young, A., & Keil, F. (2007). The hidden structure of overimitation. Proceedings of the National Academy of Sciences, 104 (50), 19751-19756. DOI: 10.1073/pnas.0704452104

Koenig, M., Clement, F., & Harris, P. (2004). Trust in Testimony: Children's Use of True and False Statements. Psychological Science, 15 (10), 694-698. DOI: 10.1111/j.0956-7976.2004.00742.x

Harris, P., & Koenig, M. (2006). Trust in Testimony: How Children Learn About Science and Religion. Child Development, 77 (3), 505-524. DOI: 10.1111/j.1467-8624.2006.00886.x

Can choosing the right god reduce anxiety and other psychoses?

Religion and mental health seems to be a double-edged sword. Religion features in a lot of psychotic delusions, but there's also a lot of evidence linking religious belief to better mental health. There's some new research which suggests that part of the problem in teasing out the relationships is that it's not belief itself, but rather the type of god that you believe in, that matters.

The data came from a 2004 US survey on religion and health, which included measures of what people thought about god, and also a variety of health measures including some standard psychiatric scales.

The researchers took out all those who said they had no religion (just 3%). What's more, they found no effect of how religious people said they were, or of how often they went to church. So this is not a study comparing religion with non-belief.

The essence of what they found is shown in the graph. They looked at whether people agreed with each of three statements describing god: Close and Loving, Approving and Forgiving, and Creating and Judging.

Believers in a close and loving god had the lowest incidence of a range of psychoses, especially anxiety. Belief in a creator/judge god had the opposite effect.

Last year they also showed that there is a similar relationship between beliefs about the afterlife and mental health. People who believed that the afterlife will bring nice things, like union with god, peace and tranquillity, reunion with loved ones, or paradise, were less psychotic. Those who thought that there is no afterlife, or that the afterlife is just a pale and shadowy existence, or even those who thought it would bring reincarnation or the possibility of punishment, had more psychoses.

So what's going on? Of course there's a significant chicken-and-egg problem here. Is it that particular religious beliefs make people less psychotic, or is it simply that psychotic people have negative opinions both of other people and of god?

But Kevin Flannelly, the lead researcher, has a theory to explain these results. He suggests that there is a system in the brain that's specifically designed to alert you to threats in the environment around you (the Evolutionary Threat Assessment System, or ETAS).

Dysfunction of this system, he believes, underpins a lot of mental health problems. The idea is that activation of the ETAS causes a decrease in the sense of security and also an increase in psychiatric symptoms.

Flannelly thinks that belief in a close and loving god (i.e. a protective one that's looking out for you) decreases activation of the ETAS.

To be sure, alternative theories abound - in particular attachment theory, which suggests that God acts as a surrogate parent. But this doesn't explain why belief in a loving god does not reduce somatisation.

Whatever the cause, Flannelly's results are further evidence that positive beliefs about God are linked to less psychosis, especially anxiety-related psychosis (see also Xanax of the people).


_____________________________________________________________________________________

ResearchBlogging.org
Flannelly, K., Galek, K., Ellison, C., & Koenig, H. (2009). Beliefs about God, Psychiatric Symptoms, and Evolutionary Psychiatry. Journal of Religion and Health. DOI: 10.1007/s10943-009-9244-z

Flannelly, K., & Galek, K. (2009). Religion, Evolution, and Mental Health: Attachment Theory and ETAS Theory. Journal of Religion and Health. DOI: 10.1007/s10943-009-9247-9

Paleolithic graffiti and the mismeasurement of religion

Cave paintings are a kind of ritualised art. Quite what magic was intended is a matter of debate, but that the purpose was magic and ritual was pretty much assumed by 20th century paleoanthropologists. The famous archaeologist Henri Breuil proposed that they were a kind of sympathetic magic.

Twaddle, according to Dale Guthrie, author of The Nature of Paleolithic Art.

According to Guthrie, paleolithic art probably has more in common with schoolboy graffiti than religious art as we understand it. Pascal Boyer explains:

Parietal art is, overwhelmingly, about big mammals and big women ... Who would draw obsessively about these limited themes? Whose mental life is teeming with fantasies of plump women and dangerous pursuits? The themes of parietal art suggest that most artists were young men, in feverish pursuit of both girls and game, young men who would derive some vicarious pleasure from depicting in lavish detail what could be experienced all too rarely in the flesh. This would seem to reduce a lot of rock art to the level of common graffiti.

And here's what Guthrie has to say:

For me, to recognize that so many of the preserved Paleolithic images were done casually, by both sexes and all age-groups, more often than not by youngsters, who even left their tracks under renditions of wounded bulls and swollen vulvas, in no way makes Paleolithic sites less hallowed. The possibility that adolescent giggles and snickers may have echoed in dark cave passages as often as the rhythm of a shaman’s chant demeans neither artists nor art.

So why have so many paleoanthropologists, never mind the rest of us, got it so wrong for so long? Boyer argues that it's because religion is so much a part of our lives that we just find it hard to get our heads round the idea that religion as we know it simply didn't exist in the Palaeolithic (indeed, one argument has it that it was the Neolithic invention of organised religion that precipitated civilisation).

It seems to me that many people are really committed to the existence of “religion”, as the integrated package of metaphysics, morality and coalitional dynamics that we are familiar with in large historical, state-based societies. No matter that most anthropologists have repeated at great length, that there is no such thing in most societies - many people just want to see the other as religious.

... "religions”, with doctrine, corporate identity, brand of services, etc., certainly did not exist before large state societies. There is therefore no point in looking for the Pleistocene origins of salvation doctrines or religious intolerance. There is no origin of “religion” in that sense. There may be evolutionary underpinnings to thoughts about non-existent agents, or to the compulsion to engage in ritualized behavior, but these are found in many forms of human experience that have nothing to do with gods, spirits and ancestors.

This simple point, altogether banal in cultural anthropology, seems almost impossible to convey to a larger audience. To a degree, a belief in “religion” seems convenient both to members of modern religious guilds, for obvious reasons, but also to the Dennetts and Dawkinses of recent fame.

The background to this is Boyer's argument that there really is no such thing as religion (see his 2008 paper, Evolutionary Perspectives on Religion). In the same way that the term 'tree' is a prescientific category that does not hold up to scientific scrutiny, so 'religion' is a culturally specific idea that breaks down when you try to project it onto different times and places.

Secularisation in the US will be swamped by religious fertility and immigration

There was a lot of noise recently about the ARIS survey, which showed a dramatic increase over the past decade in the numbers of non-religious Americans.

So, what does the future hold? Ever more secularisation? Perhaps, if people continue to switch out of religion.

But the demographic picture is more complicated than that. Religious people have more children than non-religious. Sure some of their children will lose their faith as they reach adulthood, but only a minority.

Then, too, the USA is a nation of immigrants, and immigration continues at a high rate. Immigrants tend to be more religious than the natives (they tend to come from poorer, more religious countries).

The Association for the Study of Religion, Economics, and Culture is just concluding its 2009 conference in Arlington, Virginia. Among the presentations was one by Prof Eric Kaufmann of Birkbeck College, London.

Using estimates of switching, fertility, and immigration for the religious types, he's put together a projection for what the religious picture will look like in the USA in 2050. Here's what he concludes:

  • The main drivers of religious affiliation to 2043 are immigration and secularization. However, fertility matters more in the long term.
  • Muslims will outnumber Jews by approximately 2020.
  • Jews, white Catholics and liberal Protestants will decline.
  • Protestants will decline from a majority in 2003 to 40 percent by 2043; Catholics may outnumber Protestants by mid-century.
  • The non-religious will increase their share of the white population but not of the total population.
  • Secularization will plateau by 2043 and will reverse thereafter.
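The way these three forces interact can be sketched with a toy generational model. The rates below are illustrative assumptions invented for this sketch, not Kaufmann's actual parameters; the point is only to show how high religious fertility and religious immigration can offset steady switching out.

```python
# Toy cohort projection: how fertility, switching, and immigration
# interact over generations. All rates are illustrative assumptions,
# NOT Kaufmann's estimates.

def project(religious, secular, generations,
            fert_r=2.5, fert_s=1.7,        # children per couple (assumed)
            switch_out=0.15,               # religious -> secular, per generation (assumed)
            switch_in=0.05,                # secular -> religious, per generation (assumed)
            migrants=10.0,                 # immigrants per generation (assumed)
            migrant_religious=0.8):        # share of immigrants who are religious (assumed)
    for _ in range(generations):
        # children born to each group (fertility is per couple, hence / 2)
        born_r = religious * fert_r / 2
        born_s = secular * fert_s / 2
        # some of each new generation switches affiliation
        religious = born_r * (1 - switch_out) + born_s * switch_in
        secular   = born_s * (1 - switch_in) + born_r * switch_out
        # immigrants arrive, mostly religious
        religious += migrants * migrant_religious
        secular   += migrants * (1 - migrant_religious)
    return religious, secular

r, s = project(religious=70.0, secular=30.0, generations=5)
print(f"religious share after 5 generations: {r / (r + s):.0%}")
```

With these made-up numbers, switching out never secularises the population: the religious share settles in the mid-60s rather than falling, which is the qualitative pattern Kaufmann describes.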

The 'culture wars' that play such an important role in US politics will be affected. Kaufmann predicts that opinion on abortion is likely to become more pro-life, but that attitudes regarding homosexuality will be stable, reflecting more liberal attitudes among younger cohorts but more conservative attitudes among demographically growing groups.

Of course, there are a number of assumptions that have to be made to make these kinds of predictions. Will the switching out rates change? Will the composition, never mind the scale, of migration change? I reckon so.

Nevertheless, it seems reasonable that the broad picture will not look so different. There is no impending mass secularization of the USA. Fertility rates and immigration will overwhelm conversions.

Educating Peter: how education increases churchgoing but erodes belief

Here's a conundrum. In the USA, better educated people are more likely to go to church services. Yet, when you look across different religious denominations, those that have a generally better educated membership have the lowest level of attendance.

The reason, according to research by Edward Glaeser at Harvard University and Bruce Sacerdote at Dartmouth College, is that education simultaneously drives attendance but lowers the intensity of religious belief (which indirectly lowers attendance).

They used both US and international data to make their point. In Western countries, educated people are more likely to go to church but, interestingly, the situation is reversed in the former Communist countries. They put this down to the powerful anti-religious component of Communist school curricula.

However, there is an 'impressive negative relationship between education and religious beliefs' (God, Heaven and the Devil) across the international sample. Meanwhile in the US, they estimate that being in the top 5% or so of education will lower your chances of believing in heaven by about 10%.

Then they model how beliefs and education combine to drive attendance and find, not surprisingly, that beliefs are more important. But education is a strong second – about half as powerful as beliefs in explaining religious attendance.

As a result, the direct effect (education increasing attendance) is a little bit stronger than the indirect effect (education decreasing beliefs, and so decreasing attendance).
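That direct-versus-indirect split is a standard mediation decomposition, and can be made concrete with a couple of lines of arithmetic. The coefficients below are invented for illustration and are not Glaeser and Sacerdote's estimates.

```python
# Sketch of the direct-vs-indirect decomposition described above.
# Coefficients are illustrative assumptions, NOT the paper's estimates.
# Model: belief     = b0 + b_edu_belief * education
#        attendance = a0 + a_direct * education + a_belief * belief

b_edu_belief = -0.10   # education lowers belief (assumed magnitude)
a_direct     =  0.08   # education directly raises attendance (assumed)
a_belief     =  0.50   # belief raises attendance (assumed)

# Effect on attendance of one extra unit of education:
direct_effect   = a_direct
indirect_effect = b_edu_belief * a_belief   # works via reduced belief
total_effect    = direct_effect + indirect_effect

print(f"direct: {direct_effect:+.3f}, indirect: {indirect_effect:+.3f}, "
      f"total: {total_effect:+.3f}")
```

With these assumed numbers the direct effect (+0.08) slightly outweighs the indirect one (-0.05), so on net more education means more attendance, which is the pattern the paper reports.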

Why does education have this effect? With beliefs, they provide some evidence that the effect probably works both ways – that education reduces beliefs, but also that people with stronger religious beliefs are less likely to get a good education. This would match with earlier research that suggests religious fundamentalists deny educational opportunities to their children.

As far as attendance goes, they show that the link to education is probably the same as the link between education and other kinds of formal social activity. Educated people are conscientious (they stick at education, after all), and school demands the same sorts of skills that are needed for other social activities.

In other words, educated people are more socially engaged. In countries where going to church is the done thing, that's what they do - even though they don't believe what they hear when they get there!

_____________________________________________________________________________________

ResearchBlogging.orgEdward L. Glaeser, & Bruce I. Sacerdote (2008). Education and Religion. Journal of Human Capital, 2 (2), 188-215. DOI: 10.1086/590413