
The idea of brainwashing is 'pseudoscientific' and 'dehumanises people'

Sister of women who helped plan 1978 Jonestown massacre says she is disheartened by how casually the word is used

Rebecca Moore
Thursday 19 July 2018 12:17 BST
The settlement is remembered in the phrase 'drinking the Kool-Aid': more than 900 people died there in 1978 after drinking poison-laced punch (Rex Features)

Nearly 40 years ago, my two sisters, Carolyn Layton and Annie Moore, were among those who planned the mass deaths in Jonestown on 18 November 1978.

Part of a movement called Peoples Temple, which was led by a charismatic pastor named Jim Jones, they had moved with 1,000 other Americans to the South American nation of Guyana in order to create a communal utopia. Under pressure from concerned relatives and the media, however, they implemented a plan of group murder and suicide. Jonestown is remembered in the phrase “drinking the Kool-Aid”, because more than 900 people died after drinking poison-laced punch. My two sisters and nephew were among those who died.

In the wake of this tragedy, you might think that I would be amenable to the idea that they had been brainwashed. It would absolve them of their heinous actions and offer an easy explanation for their behaviour.

Many argue that people join “cults” – or “new religious movements”, the term scholars prefer – because they’ve been brainwashed. The thinking goes that they’ve undergone some sort of programming that allows others to manipulate them against their will.

How else to explain why people become immersed in fringe groups that seem so alien to their previous, more socially acceptable lives? How else to account for the fact that – in some cases – they’ll even commit crimes?

But like the word “cult”, the term brainwashing seems to only be applied to groups we disapprove of. We don’t say that soldiers are brainwashed to kill other people; that’s basic training. We don’t say that fraternity members are brainwashed to haze their members; that’s peer pressure.

As a scholar of religious studies, I’m disheartened by how casually the word “brainwashing” gets thrown around, whether it’s used to describe a politician’s supporters, or individuals who are devoutly religious.

I reject the idea of brainwashing for three reasons: it is pseudoscientific, it ignores research-based explanations for human behaviour, and it dehumanises people by denying their free will.

No scientific grounding

Brainwashing is invoked so frequently to describe religious conversions that it has acquired a certain cachet, as if it were grounded in scientific theory.

But brainwashing presents what scientists call an “untestable hypothesis”. In order for a theory to be considered scientifically credible, it must be falsifiable; that is, it must be able to be proven incorrect. For example, as soon as things fall up instead of down, we will know that the theory of gravity is false.

Since we cannot really prove that brainwashing does not exist, it fails to meet the standard criteria of the scientific method.

In addition, there seems to be no way to have a conversation about brainwashing: you either accept it or you don’t. You can’t argue with someone who says “I was brainwashed.” But real science seeks argument and disagreement, as scholars challenge their colleagues’ theories and presuppositions.

Finally, if brainwashing really existed, more people would join and stay in these groups. But studies have shown that members of new religions generally leave the group within a few years of joining.

Even advocates of brainwashing theories are abandoning the term in the face of such criticism, using more scientific-sounding expressions such as “thought reform” and “coercive persuasion” in its stead.

Conversion, conditioning and coercion

Once we move beyond brainwashing as an explanation for people’s behaviours, we can actually learn quite a bit about why individuals are drawn to new ideas and alternative religions or make choices at odds with their previous lifestyles.

There are at least three scientific, neutral and precise terms that can replace brainwashing.

The first is “conversion”, which describes an individual’s striking change in attitude, emotion or viewpoint. It’s typically used in the context of religious transformation, but it can describe other radical changes – from voting for the “wrong” candidate to joining Earth First!

It can be sudden and dramatic, as in the case of Saint Paul, who had been persecuting the early church but then stopped after supposedly hearing a voice from heaven. Or it can be a slow and gradual process, similar to the way Mahatma Gandhi came to understand his role and mission as a leader for Indian independence.

We usually think of conversion as a voluntary process. But when we look at accounts of well-respected converts – Saint Augustine comes to mind – we find exactly what the philosopher William James said we would: converts begin as passive recipients of a transcendent, life-changing event. They don’t plan for it; it just happens. But they cannot go back to the way things were before their experience.

Next, there’s conditioning, which refers to the psychological process of learning to behave in a certain way in response to certain stimuli. As we grow up and experience life, we become conditioned by parents, teachers, friends and society to think and feel in certain predictable ways. We get rewarded for some things we do and punished for others. This influences how we behave. There is nothing evil or nefarious about this process.

Studies have shown that many of the people who seek out new religions may be predisposed or conditioned to finding a group that fosters their world view.

But what about the nice people who, in rare cases, end up doing terrible things after joining a new religious movement?

Again, the process of conditioning seems to offer some explanation. For example, peer pressure has the powerful ability to condition people to conform to specific roles they are assigned. In the Stanford Prison Experiment, participants were randomly assigned the role of either guard or prisoner – with the guards soon becoming abusive and the prisoners becoming passive.

Meanwhile, deference to authority, which Stanley Milgram studied in his famous 1961 experiment, may encourage people to do what they know is wrong. In the case of Mr Milgram’s experiment, participants applied what they believed were electric shocks to individuals, even as they heard simulated screams of pain.

And finally, coercion can also help explain why people may act against their own values, even committing crimes on occasion.

If someone is told to do something – and threatened with physical, emotional or spiritual harm if they don’t – it’s coercion. Just because someone carries out an order, it doesn’t mean they agree with it. Prisoners of war may publicly denounce their home country or claim allegiance to the enemy just to survive. When they are released from captivity, however, they revert to their true beliefs.

In other words, coercion – or exhaustion, or hunger – can make people do things they might not otherwise do. We don’t need a theory of thought reform to understand the power of fear.

A denial of agency

True believers certainly exist. My sisters fall into that category. They sincerely promoted the cause of the Peoples Temple – no matter how misguided it was under the leadership of Jim Jones – because of their deep commitment to its ideals. This commitment arose from their conversion experiences and their gradual, conditioned acceptance of ethical misbehaviour.

I do not consider them brainwashed, however. They made decisions and choices more or less freely. They knew what they were doing. The same is true for members of the Branch Davidians: they accepted and believed the word of God as interpreted by David Koresh.

If brainwashing actually existed, we would expect to see many more dangerous people running around, planning to carry out reprehensible schemes.

Instead, we find that people frequently abandon their beliefs as soon as they leave coercive environments. This is not to deny the difficulty of leaving certain groups, whether they’re political parties, religious movements, social clubs or even business organisations.

Nevertheless, people can leave these groups and abandon their beliefs – and do.

Should we consider situational hurdles and peer pressure forms of brainwashing? If that were the case, then everything – and nothing – would constitute mind control.

We have studies that illuminate processes of conversion and conditioning. We have historical examples that demonstrate what people do under compulsion.

The brainwashing explanation ignores this social scientific research. It infantilises individuals by denying them personal agency and suggesting that they are not responsible for their actions. The courts don’t buy brainwashing.

Why should we?

Rebecca Moore is emerita professor of religious studies at San Diego State University.

This article was originally published on The Conversation. Read the original article here.
