Unlearning Mistaken Ideas


Synopsis

"The psychological study of misconceptions shows that all of us possess many beliefs that are flawed or flat-out wrong—and also that we cling to these fallacies with remarkable tenacity."

"Often mistaken, never in doubt." That wry phrase describes us all more than we'd like to admit. The psychological study of misconceptions shows that all of us possess many beliefs that are flawed or flat-out wrong—and also that we cling to these fallacies with remarkable tenacity. An earlier issue of The Brilliant Report examined how we can know what we don't know. This week we'll look at ways to actively disabuse ourselves or others of erroneous conceptions. Although much of this research concerns misguided notions of how the physical world works, the techniques it has produced can be used to correct any sort of deficient understanding.
The most important thing to realize is that just telling isn't enough. Most methods of instruction and training assume that if you provide people with the right information, it will replace any mistaken information listeners may already possess. But this just isn't so. Especially when our previous beliefs (even though faulty) have proved useful to us, and when they appear to be confirmed by everyday experience, we are reluctant to let them go. Donna Alvermann, a language and literacy researcher at the University of Georgia, notes that in study after study, "students ignored correct textual information when it conflicted with their previously held concepts. On measures of free recall and recognition, the students consistently let their incorrect prior knowledge override incoming correct information." It's what our mothers called "in one ear and out the other." Here, three ways to make that new information push out the old:

Highlight the mistaken notion. The simplest way to correct mistaken notions is to point them out as the accurate information is being presented. In a 2010 article in the International Journal of Science and Mathematics Education, researcher Christine Tippett offers an example from a science book for children: "Some people believe that a camel stores water in its hump. They think that the hump gets smaller as the camel uses up water. But this idea is not true. The hump stores fat and grows smaller only if the camel has not eaten for a long time. A camel can also live for days without water because water is produced as the fat in its hump is used up." Note the three-part structure: the misapprehension is described, declared false, and replaced by an accurate version. Although such "refutation text" is very effective in debunking misconceptions, Tippett notes, it's rarely used in informational books for children or in textbooks for older learners.

Issue an advance alert. For more deeply embedded beliefs that resist simple clarification, teachers, managers and other leaders can ask people to "activate" these prior beliefs, then instruct them to attend carefully to ways in which the correct explanation differs from their current conviction. For example, Donna Alvermann and a co-author conducted an experiment in which students in an introductory physics class were asked to draw, and then explain, the path a marble would take if shot from a tabletop. The investigators' instructions contained this advice: "If you thought that the path the marble would take would be straight down, straight out and then straight down, or straight out and then curved down, your ideas may be different from what the laws of physics would suggest. As you read the following text, be sure to pay attention to those ideas presented that may be different from your own." The students who were "forewarned" with these instructions, the authors note, "showed marked improvement in learning information that conflicted with their existing knowledge."

Create a confrontation. For the most tenaciously held beliefs, it may be necessary to stage an intervention. In a 2002 article in the American Journal of Physics, researchers from the University of Washington note that "students often finish a standard introductory course or an advanced undergraduate course on relativity with some fundamentally incorrect beliefs." It's frequently not enough for instructors to point out the discrepancy between learners' convictions and the way things actually work, they note; learners have to perceive this discrepancy themselves, at which point they'll be motivated to resolve it. The Washington researchers designed tutorials in which students were led to confront the fact that they held two mutually exclusive ideas, one mistaken and one correct (in this case, about the concept of time in special relativity). The students then were helped to discard naive beliefs and fully embrace scientific ones. The key is creating an uncomfortable sense of cognitive dissonance; only then are we willing to trade our private versions of reality for something that looks more like the real world. Abstracts of the studies referenced here can be found on my blog.

(Want to read past issues of The Brilliant Report? You'll find them here.)


Tags: cognitive dissonance, information, knowledge, misconceptions
