False-Equivalence Leads to Inaccurate Views On the Connection Between Vaccines and Autism
September 21, 2012
The discussion about false-equivalence in the media tends to focus on abstract philosophical questions about the media’s role and responsibilities. This media-centric view isn’t unwarranted — after all, journalists are the ones who will have to solve the problem — but it does ignore an important part of the equation: How does false-equivalence specifically influence public opinion?
According to a new study by Cornell’s Graham Dixon and George Mason’s Christopher Clarke, the answer is: significantly, and in a bad way. Specifically, they found that articles presenting a false balance between opposing views on the link between vaccines and autism make people more unsure about the absence of such a link:
To investigate how balanced presentations of the autism-vaccine controversy influence judgments of vaccine risk, we randomly assigned 327 participants to news articles that presented balanced claims both for and against an autism-vaccine link, antilink claims only, prolink claims only, or unrelated information. Readers in the balanced condition were less certain that vaccines did not cause autism and more likely to believe experts were divided on the issue. The relationship between exposure to balanced coverage and certainty was mediated by the belief that medical experts are divided about a potential autism-vaccine link.
At the very least, the study helps refute the great myth of false-equivalence — that people form opinions by cleverly navigating opposing arguments rather than using the quantity of coverage as a heuristic for figuring out what’s legitimate. As the authors point out, the results should be replicated using a wider range of articles in which various degrees of false-equivalence have been quantified, but the study is a good start toward building a body of experimental evidence that demonstrates the downside of false-equivalence.
It’s also worth noting that participants who read balanced articles were no more likely to doubt the autism-vaccine link than people who read only a pro-link article. (In fact, people who read the balanced story were less certain than pro-link readers that there was no link, although the difference was not significant.) In other words, those who read both pro-X and anti-X arguments were just as likely (or slightly more likely) to believe X as those who read only pro-X. This strikes right at the fear of those who worry about false equivalence because it hints at the legitimizing effect of balance. When somebody reads a story that presents only one side, they may still understand that the other side exists and is the accepted view. But when the two sides are presented together, it legitimizes the unsubstantiated view and hammers home the idea that the views are equals. In terms of the autism-vaccine link, somebody who reads a story that argues only in favor of the link might still have the prior knowledge necessary to recognize it as a one-sided story about bogus science. But when they see the pro- and anti-link views presented side by side, it can cloud their perception and make them uncertain that there is any difference in the respectability of the two views.
Dixon, G.N., & Clarke, C.E. (2012). Heightening Uncertainty Around Certain Science: Media Coverage, False Balance, and the Autism-Vaccine Controversy. Science Communication. DOI: 10.1177/1075547012458290