How Beliefs Work
By Mark Tyrrell
In this article I'm going to talk about some of the ways in which people come to believe the things they do: how brainwashing works, and the distinction you can make between belief based on emotional conditioning and knowledge based on real experience. When you understand how belief systems operate, you gain a key insight into the behavior of the people around you. And, of course, you gain some highly valuable self-knowledge.
Hypnosis can be used to alter perception and therefore belief. When you work with hypnosis you learn something very profound about the nature of belief. Some subjects can be hypnotized to see things that are not there, or not to see things that are there. And it's not just about seeing. They may actually eat an onion after being instructed to see it as an apple, and respond to it in every way as if it were a perfectly genuine apple.
The point here is that during these hypnotic episodes the subjects believe completely in the reality of what they experience. This illustrates to the hypnotist - and hopefully to the subject - that what you believe can be completely disconnected from objective reality. Similarly, when you dream, you believe totally in the reality of the dream events you experience. But that absolute belief counts for nothing when you wake up and see that what you believed so wholeheartedly just a few moments before has no basis in reality.
People assume that their beliefs are important, and they are, but how reliable are they? And how can perceptions be manipulated to alter beliefs? How do we come to believe things that are absurd, dangerous, or patently wrong?
The realization that beliefs are highly malleable is, in fact, fairly modern. In the old days people were seen as willfully choosing to be 'bad' or 'good' rather than being prey to implanted convictions and beliefs.
It was only in the 1950s that British psychiatrist William Sargant demonstrated how the Chinese had succeeded in remolding the beliefs of captive American servicemen during the Korean War. Around the time his findings were published, the term 'brainwashing' entered common use. Sargant also demonstrated how a person of one religion can be converted to another once you understand how belief itself works, regardless of what it is that someone believes.
Sargant suggested that beliefs largely stem from the accident of our environment rather than being personally worked out and adopted. The very accident of our birth lands us in an environment already full of the beliefs of other people - ready-made dogmas just waiting for us. These 'accepted ideas' are, as it were, grafted onto us without our conscious participation. As a result, whatever their personal characteristics or level of education, a person born into medieval Japan would have a very different set of beliefs from someone born into Victorian England or early 21st-century America.
But taking our beliefs as reflections of unchanging, universal truths seriously limits our scope for self-knowledge. Now I want to focus specifically on:
- Belief based on selected instances
- The 'engineering' of belief through 'brainwashing'
- Belief as contamination from others, or what we might call 'viral belief.'
When I was very young I had a cat called Morris. One day my beloved cat was run over and killed by a car. I was distraught - until I noticed that his eyes were open. I knew that when you are dead, your eyes are closed, not open. So Morris wasn't dead after all.
When the Spanish conquistadors invaded parts of South America on horseback, some of the indigenous people believed that the man and horse before them were a single creature rather than two separate ones - because they had never seen a horse before.
When AIDS began to spread in the 1980s, people believed you could catch it through mere physical contact with an infected person. British policemen at rallies of HIV-infected people wore plastic surgical gloves in case they had to make any arrests.
Now all of these things I have mentioned are beliefs. And the cunning thing about beliefs is that they do a pretty good imitation of knowledge. Beliefs are good at offering you the feeling of certainty. Because a belief feels strong, we assume that it must be true.
Basing a belief on selected instances means not waiting to discern the full pattern of something, but instead taking a single aspect of it and over-applying it. I assumed a cat could only be dead if its eyes were closed, because I had only ever seen TV actors playing dead with their eyes closed. I made a faulty and incomplete link due to insufficient experience.
The indigenous South Americans had never seen horses, nor tamed animals used for personal transport - so they made the faulty link that the invaders must be a new species altogether.
And people in the 1980s linked the new infectious disease to other infectious diseases they knew about, assuming that it must be the same as, say, scabies - which can pass through skin to skin contact.
The point here is that beliefs can be formed through 'proof by selected instances'. The bigger pattern, of course, is that cats can have open eyes and still be dead - something I learned pretty quickly. There are animals called horses that can be used to transport human beings, and so on.
So beliefs can be formed by jumping to conclusions based on incomplete information. The person who knows nothing about aeronautics assumes a plane to be a giant metal bird. Linking something new to something that, from incomplete experience, you think it resembles can lead to faulty beliefs.
But desisting from hastily assigning a meaning to something genuinely new based on faulty comparisons takes maturity. When you find out for yourself and take the time to see what the new pattern holds, then you have knowledge rather than mere belief. In the decades since the death of Morris I have seen death enough to understand that the deceased can have eyes both open and closed. This knowledge has replaced belief.
We can form beliefs about other people in much the same way. You meet someone for the first time and they appear distracted. You quickly form the belief that they are basically a distracted person and are likely to go on thinking of them as 'the distracted type'. We like to form quick and complete judgments even before all the evidence is in: belief through selected, but incomplete, instances.
That person could have had a raging toothache at the time - a good enough reason for anyone to appear distracted - but if you didn't know that, the human tendency to take a single element and assume it is indicative of the whole pattern could just kick in and offer you a handy snap judgment. This tendency is endemic in human thinking and plays a central role in the formation of beliefs.
When you think about the nature of belief itself, rather than just the content of your beliefs, you start to become more objective not only about yourself but also about the environment in which you find yourself.
The role of emotion and brainwashing
It has been said that your beliefs have more to do with your emotional state than with your intelligence. Successfully and intentionally engineering the formation of a specific belief ('brainwashing') requires a whipping up of the emotions - not calm reasoned argument. We often attempt to employ reasoned logic afterwards in an effort to shore up emotionally driven belief. Psychologists call this 'rationalization' or 'justification'.
From Hitler rallies to Princess Diana's funeral, it is the raising of the emotional pitch which provides the glue in which beliefs 'stick'. As you know, otherwise perfectly reasonable people can come to believe all kinds of bizarre things. But the intensity of conviction has little to do with the reality of the belief.
Someone who may be wonderfully naturally intelligent but whose emotional life is unstable will be prey to all kinds of weird and wonderful beliefs about themselves or others. It is their emotionality which renders them vulnerable. Asking 'How could someone be so stupid as to believe that?' is to miss the point. They believe it not with their rationality, but with their emotionality. You can't contact their rationality, so to speak, until the emotionality calms down and gets out of the way.
This is exactly why, when doing hypnotherapy with, for example, a depressed person chock full of damaging beliefs, we need to calm the mind and body first, before presenting wider, more balanced ideas. We have to help realign the lens so it doesn't distort so much.
It is only very recently in the west that we have taken on board the realization that believing something with all your heart may not, of itself, be a sufficient reason for believing it to be true. Testing the validity of a belief requires being able to consider it calmly and having the opportunity to verify it free from the distortions wrought by emotion or limited experience.
Don't get me wrong here: strong belief may be based on good, solid, sensible, empirical foundations. But it can also be based on the emotional conditioning that occurs not just inside 'cults' but inside all cultures and societies. Strong emotional focus on something makes us more impressionable concerning what we may come to believe about it. We all know that when we are 'in lust' with somebody we can believe all kinds of positive things about them which, when the lust dies off a little, turn out to be either gross exaggerations or not true at all. What makes these beliefs compelling is not logic but emotion. It's the intensity that forms beliefs.
Contamination from others
And here is another point I'd like to raise: emotional intensity is infectious - it can spread like wildfire. 'Other people believe this so strongly - so surely it must be true!'
Researchers found that young men and women find one another more physically attractive if they meet high up on a swaying rope bridge. The emotional intensity and raised heartbeat they experience is wrongly attributed to physical attraction rather than to the precariousness of their situation.
If you want someone to quickly believe something about you or your ideas, rouse their emotions and implant the ideas while they are emotional. A skilled hypnotherapist - such as the late Dr Milton Erickson - understands this and knows how to raise the emotional pitch in a client so that new ideas can fix in the person's mind. This may have to be done after their old habitual ideas have had a chance to recede. So, paradoxically, you may have to calm a person down to make them more receptive, only to excite them again in order to fix a new, more therapeutic and beneficial, idea. This is still not generally understood by therapists or psychologists.
So raised emotions, attitudes and beliefs all spread across groups, even across whole communities and cultures. We don't need ordinarily recognizable hypnosis to brainwash people because any strongly whipped up emotional state becomes hypnotic in nature because of the fixation and narrowed focus it creates.
This is why it is so important to examine where beliefs come from. Certainly, when we are working psychotherapeutically with people it's important to calm the person down to the extent that objectivity gets a chance to operate in their mind. When this is done, we can begin to examine whether, say, a particular damaging belief is really their own originally, or has just accidentally spread or been grafted onto them by another person.
For example, we know that depression runs in families. It was long believed that it must therefore be genetic. However, despite occasional media hype, no actual depression gene has ever been found. Dr Michael Yapko, a world authority on depression, argues that it is just as likely - if not more so - that depressive thinking and attitudes are passed through generations by emotional contagion. In other words, it runs in the families which start it and keep it going.
It's also important to remember that people (and that includes you and me) feel attached to their beliefs. Being exposed to a greater truth, or even just other viewpoints, is unsettling. People will go to extraordinary lengths to protect their beliefs from being assailed. Dr Robert Cialdini, in his book on influence, showed that feeling we are behaving consistently with our previous and publicly stated position can, for some people, become more important than discovering the real truth about something and risking being accused of back-pedaling by others. I think you can see this happening on a daily basis in the world of politics.
Beliefs can be hard to shift. Flexible people can update their perceptions when they get new information. But for many, it's the first impressions and judgments that stick - perhaps because they save us from the inconvenience of having to think more carefully.
For many practical purposes unverified beliefs may be fine. We believe the sun will rise tomorrow. I believe my car works through fuel propulsion because others have told me it does, and that's alright. But the danger signals should light up when we start to become highly emotional about something, or when we suspect that people around us are using justifications to explain away what would ordinarily be counterintuitive or cruel activities. Then it's time to step back and examine the pattern of the situation as a whole.
So to summarize what I've been talking about here:
- We can create false beliefs through 'proof by selected instances' - like my five-year-old self forming beliefs about what constitutes a dead cat.
- We can wrongly assume that intensity of belief is indicative of veracity or accuracy.
- When emotions are whipped up, beliefs can be implanted through that raised emotionality - which is why cult leaders keep people emotional, and why you look so much more attractive up on that rope bridge.
- Beliefs can be infectious and are more likely to be swallowed unexamined if many other people already hold that belief. For the practical workings of much of life this isn't necessarily a problem.
I also mentioned that admitting you were wrong can be too much for some people. Some folks, even when presented with conflicting evidence or new knowledge, value consistency over facts. I also rather cheekily suggested that we might see this clearly in politics every day.