Have you seen this video? It has been making the rounds, and SGR recently featured it in our weekly 10 in 10 Update. I’m sure most of you reading this, especially regular Facebook users, have witnessed the phenomenon the video outlines: from viral videos to cat memes, information can spread like a communicable disease. Unfortunately, misinformation can spread just as rapidly.
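The epidemiology analogy can be made concrete with a toy model. Here is a minimal sketch, with numbers invented purely for illustration (nothing below comes from the video): treat each share as a contagious contact, where every sharer exposes some friends and each exposed friend reshares with some probability. When the average number of new sharers per sharer climbs above one, the post behaves like an outbreak.

```python
import random

def simulate_spread(seed_sharers=1, friends_per_share=20,
                    reshare_prob=0.06, max_rounds=10):
    """Toy branching-process model of a post spreading.

    Each sharer exposes friends_per_share friends, and each exposed
    friend reshares with probability reshare_prob. The product
    friends_per_share * reshare_prob acts like an epidemic's basic
    reproduction number: above 1, the cascade tends to keep growing.
    """
    sharers = seed_sharers
    total_reached = seed_sharers
    for _ in range(max_rounds):
        exposed = sharers * friends_per_share
        # Each exposed friend independently decides whether to reshare.
        sharers = sum(random.random() < reshare_prob for _ in range(exposed))
        total_reached += exposed
        if sharers == 0:
            break
    return total_reached

# With 20 * 0.06 = 1.2 "new sharers per sharer," many runs fizzle out
# after a round or two, but the ones that catch reach hundreds of people.
print(simulate_spread())
```

The point of the sketch is that virality is a threshold effect: nudge the reshare probability slightly up or down and a post flips from dying quietly to reaching everyone, which is exactly why misinformation travels as fast as the cat memes do.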
While funny memes and Buzzfeed articles are more entertaining than dangerous, misinformation can be extremely harmful, and sometimes the people spreading these “mental sneezes,” and the ones catching them, don’t realize it. One of the great challenges of the Information Age is distinguishing factual, evidence-based information from misinformation. If you are a frequent Facebooker or Googler, this can be a daunting task, both because of the speed at which misinformation travels and because search engines may be folding our personal biases and habits into their filtering.
A while back, I read a book called The Filter Bubble. It examines the inner workings of internet filters and the ways they can promote bias and threaten rational discourse. In one example, the book compares the searches of two women who both Googled the term “BP” (as in, the oil company). The results Google returned were very different: one woman got results mostly about the BP oil spill; the other got mostly investment information about the company. The women had similar backgrounds and similar political beliefs, but Google was primed to retrieve results based on their previous search habits. If even similar people get such different results, you may suspect that people with drastically different political views and backgrounds would retrieve drastically different results from identical searches.
What this suggests is that the same algorithms that help Google and Facebook determine what kinds of food and clothing we like (based on our previous habits) and target ads accordingly are also determining the seemingly reliable information served up to us every time we search. In other words, Google is in your head, and it knows what you want to hear.
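To make the mechanism concrete, here is a minimal sketch of a personalized ranker, assuming a deliberately simplified, hypothetical scheme: each result is scored by how many of its topic keywords overlap with terms from the user’s past searches. Real search engines weigh hundreds of signals, and the sample data below is invented for illustration, not drawn from the book.

```python
def personalized_rank(results, user_history):
    """Rank results by keyword overlap with the user's search history.

    `results` maps a result title to its topic keywords; `user_history`
    is a set of terms the user has searched before. This is a toy
    stand-in for the many signals a real engine would use.
    """
    def score(item):
        title, keywords = item
        return len(keywords & user_history)
    return [title for title, _ in
            sorted(results.items(), key=score, reverse=True)]

bp_results = {
    "BP oil spill cleanup updates":    {"spill", "environment", "gulf"},
    "BP quarterly earnings report":    {"stock", "dividend", "investing"},
    "BP share price forecast":         {"stock", "investing", "market"},
    "Gulf coast environmental damage": {"environment", "gulf", "wildlife"},
}

# Two users, one identical query, two different search histories.
environmentalist = {"environment", "wildlife", "climate", "gulf"}
investor = {"stock", "dividend", "portfolio", "investing"}

print(personalized_rank(bp_results, environmentalist))
print(personalized_rank(bp_results, investor))
```

Run on these two hypothetical histories, the same four “BP” results come back in opposite orders: the first user sees the spill coverage at the top, the second sees the financial pages. Neither user ever learns that the other page of results exists.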
Sometimes this is a good thing: we want the information we’re after delivered as quickly as possible, right? The problem, of course, is that it isn’t always reliable information. And even when it isn’t reliable, we are still willing to accept it as reliable if it confirms our previously held notions. We tend to seek out information that makes us feel validated. And then we tend to post that information on Facebook or Reddit and argue about it for hours, even days, on end, because even the arguing reinforces our belief in the opinion-confirming information.
Social psychologist Leon Festinger developed the theory of cognitive dissonance to explain our tendency to seek out information that confirms and reinforces our previously held assumptions. The theory contends that we “seek consistency in our beliefs and attitudes in any situation where two cognitions are inconsistent” (read more here). When we are confronted with information that conflicts with our beliefs and attitudes, we experience cognitive dissonance, a sort of “brain noise” that results from trying to hold two or more inconsistent beliefs at the same time. There are several ways we try to silence this noise:
- We change our beliefs, behaviors, and/or opinions (but not usually).
- We try to justify the belief, behavior, and/or opinion by altering the conflicting information so that it is a better fit with what we already think and feel.
- We try to justify beliefs, behaviors, and/or opinions by adding new ideas that make us feel better about what we already think and feel.
- We decide to ignore any new conflicting information that would call our beliefs, behaviors, and/or opinions into question.
As we learned from Randy’s blog on Monday, bad decisions are often made in groups. Herd mentality can give bad decisions a bigger impact, make mistakes more difficult to correct, and help misinformation spread faster. To make better decisions, we need to be well informed: receptive to new information even when it conflicts with our opinions and beliefs, and able to spot harmful misinformation so that we don’t spread it. Next week, we’ll discuss some practices that can help us do this.
Written by:
Muriel Call
Research Coordinator
governmentresource.com