Helping Students Overcome Confirmation Bias

Posted December 5, 2018

By Andrew Marx

Steve has to write a research paper for his psychology course. Given wide latitude by his instructor, he elected to take up a project assessing the relative merits of cognitive therapy as an alternative to medication for the treatment of depression. The trouble is, Steve already strongly believes that cognitive therapy is a superior alternative. That prior judgment is likely to inform his research and the paper he’ll eventually write.

Steve begins by searching for articles on this matter. He uses key phrases such as “advantages of talk therapy”, “why cognitive therapy is better than medication”, and “serious risks of SSRIs.” He does not bother to complete searches with phrases such as “when is medication a better alternative to cognitive therapy” or “comparison of outcomes for different therapeutic approaches to depression.” That is, he is searching in a manner that will tend to lead him to articles that support his current position. He will find confirmation for the view he already holds. He is also avoiding searches that would tend to lead him to material that could disconfirm his position.

[Image: Screengrab of a Google search for "why cognitive therapy is better than medication"]
Nevertheless, Steve finds and reads some articles whose conclusions run counter to his view. He finds them to be lacking, though, citing weak methodology, lack of author expertise, and so on. In short, he finds sources that support his view to be more authoritative than those that don’t.

These are typical manifestations of confirmation bias. As a teacher of psychology, you may already explore it as a concept with your students, but it’s clear that many students are also affected by it in practice. I’m not a psychology professor, but I teach critical thinking in an interdisciplinary general education program that also emphasizes research and writing. Over the years, I’ve seen many research projects culminating in argumentative essays gone awry, thanks to confirmation bias. This post explores how it can arise in coursework and discusses what we might do to mitigate it.

Definitions of confirmation bias are many and varied, but I’ll proceed with this understanding.

“Confirmation bias refers to a type of selective thinking whereby one tends to notice and to look for what confirms one's beliefs, and to ignore, not look for, or undervalue the relevance of what contradicts one's beliefs.” (Carroll, 2016)


We can see both aspects of this present in Steve’s work. He went about looking for relevant evidence in a way that heavily favored confirming data over disconfirming data, and he heavily discounted the latter when he found it.

Many teachers who assign argumentative essays require students to acknowledge and respond to counterarguments. Do such requirements succeed in getting students to treat multiple perspectives fairly and comprehensively? Often they do not, because students tend to misrepresent counterarguments or address them dismissively. Worse than merely failing, such requirements may make matters worse!

A well-known 1979 study conducted at Stanford University supports this outlook (Lord, Ross & Lepper, 1979). Of the undergraduate participants in the study, half were supporters of capital punishment, and half opposed the practice. Each subject was presented with the results of two studies. One study presented evidence confirming the efficacy of capital punishment as a deterrent, and the other presented disconfirming evidence. Students tended to rate the methodology and reasoning of whichever study happened to support their side more favorably. The tendencies to overrate the strength of favorable sources and underrate the strength of unfavorable ones moved people on both sides to more extreme positions on the issue. The effect of exposing students to a balanced set of opposing views was not moderating. It was polarizing.

In my experience, students are more likely to handle opposing sources and opposing views poorly when they are directly put to the task of taking a side and arguing for it. For the typical student in that situation, everything is about finding and organizing evidence that advances a single goal: winning the argument. They have little reason to avoid the worst tendencies of confirmation bias.

Part of the problem is that students are induced to present their arguments with confidence, which encourages them to dismiss opposing arguments. So, we ought to try to present different stakes for them. This could mean shifting research and writing goals away from argument, or it could mean bringing in intermediate steps. A different kind of assignment can focus on research as an end in itself, where students must think about the merits of sources, that is, their authority and their content strengths. This can culminate in products such as critical annotated bibliographies, where students can be evaluated on (and perhaps more importantly, rewarded for) fair and thorough evaluations of sources.

[Image: Student studying at a computer, from Pexels]
How could this help to lessen the impact of confirmation bias on students? When they are made to take opposing sources and arguments seriously and consider them more carefully (which they are more likely to do when they know they’ll be graded on precisely that), those same sources and arguments will not be so easily dismissed or downplayed further down the road. Once a student has acknowledged the merits of a study that challenges their prior judgments, it cannot then be dealt with as trivial in an argumentative research paper.

Another strategy that holds promise involves disabusing students of a certain notion. Many students approach arguments as contests that they don’t just have to win; they often feel a need to dominate! For the typical student, is it enough that the balance of evidence tips in their favor, or is it important that every single issue bearing on their argument go their way? I’ve had students admit to having thought dichotomously about the positions they take in debates, thinking along the lines of “Either I’m right about everything on this issue, or I’m on the wrong side of it altogether.” Students can get better at avoiding such false dichotomies.

Conversations with students like that one have led me to suspect that another kind of fallacy is at play, one that only fuels confirmation bias. I have dubbed it (tentatively) “The fallacy of the overwhelming case.” It is a flawed approach to reasoning wherein one assumes that any point that would, if true, support a conclusion, should be taken as true. Working under such an assumption, students may feel obliged to argue for points where the evidence is not on their side. This increases the psychological pressure to be dismissive of compelling evidence to the contrary.

Think about those students who argue for or against capital punishment. Four crucial issues that bear on the morality of capital punishment come to mind.

1. Does the threat of capital punishment deter heinous crimes?

2. Are there misdeeds so terrible that they warrant death?

3. Does the possibility of mistaken convictions make an irrevocable punishment unjust?

4. Can people who have committed evil find redemption?

We may disagree about their relative weight, but a student who covers the standard bases in the capital punishment debate would probably wish to address all those issues, and probably more. A pro-capital-punishment researcher would answer the first two questions in the affirmative and the last two in the negative, and marshal evidence and reasons to support those answers. An opponent of capital punishment would more likely defend the opposite answer to each question.

Were I ever to approve a student’s proposal for this topic (and that seems unlikely!) for a major argumentative paper, I would urge that person to consider that the best answer to at least one of those questions might not be the one they like. A supporter of capital punishment could, for example, concede Question 1 or 3 to the opposition. Likewise, an opponent could concede Question 1 (agreeing for the sake of argument that the threat of capital punishment can be a deterrent) but defend an abolitionist view on other grounds.

You might consider requiring this approach of students. As they develop arguments to support a position, they could also have to prepare to concede at least one point to an opposing side. In so doing, they may gain a better appreciation for moderation and qualification in their arguments.

I wish I could guarantee that these methods would practically eliminate the effects of confirmation bias. In truth, the outlook among many cognitive psychologists on this matter is grim. There are success stories to be told, though. And if we can’t make confirmation bias go away, we can try to deflect it a little. By incorporating new steps into your research projects, you may still see confirmation bias reinforcing many of your students’ prior views, but you may steer them toward more reasonable versions of those views.

Bio

Andrew Marx is a faculty member of University College at Virginia Commonwealth University. There, he teaches interdisciplinary courses such as Focused Inquiry, Inquiry and the Craft of Argument, and Pseudoscience. He earned his Ph.D. in philosophy from the City University of New York, but continues to draw upon his undergraduate major in psychology from the University of Delaware to infuse his courses on critical thinking and ethics with insights from that discipline.

References

Carroll, R.T. (2016). Confirmation bias. The Skeptic’s Dictionary. Retrieved from http://skepdic.com/confirmbias.html

Lord, C. G., Ross, L., & Lepper, M. R. (1979). Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence. Journal of Personality and Social Psychology, 37(11), 2098-2109.