In my last post, I asked if brainwashing is actually a thing. If you can’t be bothered to read that, here’s the Cliff’s Notes edition: mainstream sociologists of religion generally deny that brainwashing (as most people understand it) really exists, but lots of former members of controlling religious groups identify with descriptions of brainwashing. In this series, I’m trying to get to the bottom of why that is.
Robert Jay Lifton did not coin the term brainwashing or write the first book about it, but his Thought Reform and the Psychology of Totalism was the first rigorous academic treatment of the topic. After interviewing multiple survivors of Chinese Communist prison camps, where inmates were subjected to ‘thought reform’ programmes, Lifton created a taxonomy that is still influential today. He claimed that thought reform, or brainwashing, had eight components. You’ll often see brief guides to his framework on ex-cult or anti-cult websites. Here’s my summary:
1) Milieu control
The group controls your environment, the people you mix with, the places you can go, and the information to which you have access.
2) Mystical manipulation
Most summaries I have seen of Lifton’s framework describe this idea badly, perhaps because it is the least clearly explained concept in Lifton’s book. I am not entirely sure myself what Lifton means by the ‘mystical’ part, as distinct from run-of-the-mill manipulation. What he argued was that the group orchestrates events and behaviours so that they appear to happen spontaneously. To the people being brainwashed, events seem to happen of their own volition, or of other members’, when in fact they have been manipulated into that behaviour.
Lifton argued that because the brainwashers are convinced that they are doing ‘God’s will’ or the equivalent thereof, almost anything could be justified to make it happen. Anything which contributes to achieving this “higher purpose” is right, and anything which gets in its way is selfish and undesirable.
Other summaries of Lifton’s framework I have seen explain it this way:
The group attributes supernatural influence where none is present (for instance, declaring an accident that befell a member who left to be ‘God’s punishment’), or manipulates situations so that they appear spontaneous (members believe that their new feelings and behaviour have arisen naturally from joining their new group).
However, Lifton doesn’t actually say that in his book and I’m not convinced this is an accurate (or at least complete) interpretation of his meaning. I do think it’s one way people can be manipulated though.
From a psychological perspective, the idea of manipulating people into doing something is very interesting. As I’ve blogged in the past, cognitive dissonance theory predicts that people will change their beliefs to bring them in line with their own behaviour. If you force people to behave a certain way, though, you wouldn’t expect cognitive dissonance to act in the same way. Here’s how the two thought processes might go:
Unforced behaviour: I’ve been having ecstatic religious experiences for a year. I must be a believer in this religion.
Forced behaviour: I’ve been having ecstatic religious experiences, but I don’t really believe any of this. I’m just doing it because they’re forcing me.
If, however, you were manipulated into behaving a particular way, the thought pattern would look much more like the unforced one. I think, then, that manipulation is clearly a potentially effective tool of indoctrination.
3) Demand for purity
The world is sharply divided into the pure and the impure, black and white. Those being brainwashed are pressured both to separate themselves from impure people (although this may be done for them through the first step, milieu control) and to purge themselves of impure ideas and thoughts. The idea is that you should strive to become the perfect Believer. Of course, this is impossible, and the constant failure creates a cycle of shame that makes you feel more and more dependent on the group.
4) Cult of confession
Those being brainwashed are expected to confess their sins to the group or to a group leader. And of course, these sins are often invented sins, arbitrary evils that the group has declared unacceptable. Sexual sins are a great one here, because virtually all humans have sexual impulses. By telling your members that these are wrong, they feel ashamed just for being human. By having to confess to the group, you increase your feelings of dependency on the group and you give the leaders ammunition to use against you in the future.
Incidentally, ‘biblical counselling’ methods like those advocated by Jay Adams reek of the cult of confession.
5) Sacred science
The group’s dogma becomes a ‘science’. That is, the basic ideas are implicitly or explicitly unquestionable and claim to be absolute truth.
6) Loaded language
I wrote a whole post about this for Samantha Field’s blog. If you’re familiar with George Orwell’s Newspeak from Nineteen Eighty-Four, you’ve read one example of a loaded language system. In high-control groups, language is loaded in all kinds of ways. There’s jargon which makes communication with outsiders more difficult. And there are thought-terminating clichés: brief, reductive labels you can stick on things, which end thought on the subject. When Islamists label something ‘haram’, for example, that is a thought-terminating cliché. It is forbidden. There is no need for any further consideration of whether it is bad.
7) Doctrine over person
I’m not sure Lifton really needed this category, because it adds little that we don’t already know from ‘sacred science’. But clearly, if the dogmas of the group are absolute truth, they are more important than any individual member. This means that if your experiences contradict the teachings of the group, your experiences are to be dismissed.
For me ‘doctrine over person’ is encapsulated by the charismatic preachers of my childhood who used to get us to chant “I walk by faith. I am not moved by what I see. I am not moved by what I feel. I am only moved by God’s Word”. If the teachings of the group hadn’t worked out for us, it was us, not the teachings, who were to blame.
8) Dispensing of existence
The final stage is the logical conclusion of such a black-and-white way of thinking. When the group possesses the Absolute Truth, it has the say over who has the right to exist and who does not. We have seen this most graphically with Da’esh and al-Qaeda, but the same idea is there in the concept of eternal damnation: the group knows who ultimately has the right to exist and who does not. It’s there in Stalin’s purges and in the camps at Auschwitz. Even where this concept doesn’t reach the extreme of killing people, it might express itself through denying people the right to other ways of life. You see shades of this in illiberal political systems, apartheid in particular, which treat people as nonpersons.
So what does all this mean?
Lifton acknowledged that all of these exist, to a certain extent, in most groups, so he saw totalism as a matter of degree. Still, it’s easy to think of groups in which all eight characteristics are extremely prevalent, and according to Lifton’s framework, such groups might be said to practise brainwashing. Later in his career, however, Lifton suggested we should stop using the word brainwashing because it is so easily misunderstood, tied up as it is with popular misconceptions about turning people into robots.
So yes, if you’ve been through a programme that embodies all eight of these factors, you could say you’d been through a brainwashing programme. The problem is that these programmes rarely produce individuals who behave like the tabloid notion of a brainwashed person. You could go through all this and still not believe everything the group wants you to believe. You wouldn’t necessarily undergo a radical personality change, or have a thousand-yard stare, or be incapable of communication apart from repeating the group’s soundbites.
Most of Lifton’s interviewees had never entirely believed what they were told, even while they were in the Chinese prison camps. Among those who did believe, the effects wore off rapidly after they left. Only one of his subjects really took on the Communist dogma hook, line and sinker, and after a year of living away from Communist influence, even she began to show every sign of changing her mind.
So yes, brainwashing is a thing—and a very clearly defined thing—but we’re not actually much closer to knowing why it works on some people more than others, or how some people get dramatically indoctrinated. What Lifton gives us is a very insightful description of how groups with totalist ideologies tend to work.