Why We Believe Our Own Lies

Psychology August 08, 2012 / By Michael Michalko
The power of cognitive dissonance in our daily lives.

Leon Festinger was an American social psychologist, responsible for the development of the Theory of Cognitive Dissonance, which suggests that when people are persuaded to say things and to behave in ways that are inconsistent with their beliefs, an uncomfortable psychological tension is aroused. This tension will lead people to change their beliefs to fit their actual behavior, rather than the other way around, as originally thought.

Festinger’s ground-breaking social psychological experiments provide a central insight into the stories we tell ourselves about why we think and behave the way we do. His experiments were wonderfully deceptive and ingenious. Following is a thought experiment modeled after Festinger’s work. Imagine you are the participant in the experiment and imagine how you would respond to the experience.


You are told the experiment is about how your expectations affect the actual experience of a task. Apparently there are two groups; in the other group, participants are given a particular expectation about the study. To instill the expectation subtly, participants in that group are informally briefed by a student who has apparently just completed the task. In your group, though, you'll do the task with no expectations.

Your first task is extremely boring. You are asked to move some spools around in a box for half an hour, then for the next half hour you move pegs around a board. At the end of the tasks, the experimenter thanks you for taking part and tells you that many other people find the task pretty interesting. This puzzles you, as you found it dreadfully boring.

Then the experimenter asks for your help. The participant coming in after you is in the other condition he mentioned before you did the task: the condition in which participants are given an expectation before carrying out the task, namely that the task is actually really interesting. Unfortunately, the person who usually sets up this expectation hasn't turned up. He offers you $1 as a token of appreciation if you agree.

You agree to help. You are introduced to the next participant, who is about to do the same task you just completed. As instructed, you tell her that the task she's about to do is really interesting. Then the experimenter returns, thanks you again, once more tells you that many people enjoyed the task, and hopes you found it interesting.

Then you are ushered through to another room, where you are interviewed about the experiment you've just done. One of the questions asks how interesting you found the task. This makes you pause for a minute and think. Now it seems to you that the task wasn't as boring as you first thought. You start to see how even the repetitive movements of the spools and pegs had a certain symmetrical beauty. And it was all in the name of science, after all. This was a worthwhile endeavor, and you hope the experimenters get some interesting results out of it. You figure that, on reflection, it wasn't as bad as you first thought. You rate it moderately interesting.

After the experiment, you go and talk to your friend, who was also in the experiment. Comparing notes, you find that your experiences were almost identical except for one vital difference. She was offered way more than you to brief the next student: $100! You ask her about the task with the spools and pegs. "Oh," she replies. "That was the most boring task imaginable. I gave it the lowest rating possible."

"No," you insist. "It wasn't that bad. Actually, when you think about it, it was pretty interesting." She looks at you mockingly, shakes her head, and walks away.

The Power of Cognitive Dissonance

What you've just experienced is the power of cognitive dissonance. Social psychologists studying cognitive dissonance are interested in how we deal with two thoughts that contradict each other. In this case: you thought the task was boring to start with, and then you were persuaded to tell someone else the task was interesting. But you're not the kind of person to casually go around lying to people. So how can you reconcile your view of yourself as an honest person with lying to the next participant? Your mind resolves this conundrum by deciding that actually, the study was pretty interesting after all. You are helped to this conclusion by the experimenter, who tells you other people also thought the study was pretty interesting. The other participant, meanwhile, has no need to justify her lie. She was paid $100 to lie, and she lied for the money. She experienced no contradictory thoughts or dissonance and stayed true to her belief about the experiment.


Numerous studies of cognitive dissonance have been performed in the years since Festinger's theory was first published. The effect of contradictory thoughts on changing beliefs has been confirmed over and over. It explains much about us and our everyday behavior. Here are some examples provided by Morton Hunt in The Story of Psychology:

When you try to join a group, the harder the barriers to entry, the more you value your membership. To resolve the dissonance between the hoops you were forced to jump through and the reality of what turns out to be a pretty average club, you convince yourself the club is, in fact, fantastic.

People will interpret the same information in radically different ways to support their own views of the world. When deciding our view on a contentious point, we conveniently forget whatever contradicts our own theory and remember everything that fits. People quickly adjust their values to fit their behavior, even when it is clearly immoral. Those stealing from their employer will believe that "everyone does it" and that the theft is commonly accepted as part of the cost of doing business. Or, alternatively, "I'm underpaid, so I deserve a little something extra on the side." People who cheat on their tax returns will believe that the government expects you to cheat and that people will think you are stupid if you don't.

Soldiers, when asked to perform immoral tasks, will believe that it is their solemn duty to follow orders from their superiors which absolves them of any personal guilt. Politicians justify lying about their personal beliefs and principles to win votes on the grounds that their election is ultimately for the good of the country. Government leaders lie to their people about their real goals in waging war on the grounds that the average person is too ignorant to understand the geopolitical realities of the world.

Historically, many government leaders end up believing their own lies. An example is Adolf Hitler, who started WWII to acquire more territory and power. He told the German people that the war was started by Poland and that, despite his many offers of a negotiated peace, he was refused time and time again. He then said he invaded Russia to prevent Bolshevism from conquering Europe. He died believing Germany was the victim of a worldwide Jewish conspiracy against the German people.

I’m sure you can think of a number of situations in which people resolve cognitive dissonance through rationalizations. The church minister who justifies personal wealth on the grounds that God wants true believers to live in luxury. The son who justifies not visiting his parent in a nursing home because of a lack of time. The father who justifies abandoning his family because they are better off without him. The criminal who justifies his crimes because of environmental or racial factors. The married man who has a one-night stand because what his wife doesn’t know can’t hurt her. The billionaire who inherited his wealth but tells others he is a self-made man. The church bishops who secretly relocated priests who were child abusers to other parishes because God forgives all who confess their sins. The football coaches who ignore criminal acts in their programs because it is not their responsibility to inform the appropriate authorities. And so on.

Being aware of this can help us avoid falling foul of the most dangerous consequences of cognitive dissonance: believing our own lies.
Michael Michalko is the author of the highly acclaimed Thinkertoys: A Handbook of Creative Thinking Techniques; Cracking Creativity: The Secrets of Creative Genius; ThinkPak: A Brainstorming Card Deck; and Creative Thinkering: Putting Your Imagination to Work.

Article Featured Image: "Vanitas Pinokio" by RaySys (2009)
