Bad News: Why We Fall for Fake News
We are drawn to bad news: an experiment
Imagine you’re first in a chain of four people. You get to read the original story; your job is then to tell it, from memory, to the person in position number two. Person number two retells it to number three, and finally number three passes it along to number four at the end of the chain. Each person writes his or her version down, so that the researchers have a record of how the story got warped along the way. On average, around five of the eight bad details made it into the last person’s retelling, but only two or three of the good details. Around half of the ambiguous details got lost along the way, and those that survived rarely stayed ambiguous: as the story made its way along the chain, they were increasingly likely to become negative.
We choose to consume stories about bad things. We’re more likely to want to share them with other people. We remember the bad stuff better than the good stuff, and even innocent details can become bad over time.
Why is that? Evolution may have something to do with it.
Since serious illness, injury, or death can decrease your chances of passing along your genes to the next generation, it would make sense for our minds to evolve to be particularly attentive to bad news. Does this sound familiar? Feeling compelled to keep up with the news, even while suspecting that it isn’t good for you. Suspecting that your time could be better spent. Perhaps even yearning for the imagined tranquility of simply tuning it out.
In 2001, Cass Sunstein, a lawyer and First Amendment specialist, coined the term echo chamber to describe how the internet encourages people to connect with like-minded others. Sunstein argued that an essential element of democracy is mutual awareness, if not understanding, of opposing points of view, which is possible only through unfettered exposure to a wide range of viewpoints.
How propaganda works
The more you’re exposed to a claim, the more easily you can recall it, and the more likely you are to accept it as true. The relative ease with which you can think it through gives you a misleading feeling of being better informed about it. It helps you imagine how the claim might be true. And to the extent that you can easily imagine it might be true, it’s easier to take the next step and assume that it is true.
Manipulation
In one experiment, something about seeing the object they were thinking about made participants lean toward accepting a claim as true. We can be manipulated by images, but we can also be manipulated without images.
When it comes to figuring out what’s true and what’s not, what we see and what we’re told are sometimes less important than what we make of it.
Interpretation and the fluid nature of truth as we perceive it
We each interpret the world through the lens of our experiences, beliefs, fears, and desires. “From this point of view,” Hastorf and Cantril concluded, “it is inaccurate and misleading to say that different people have different ‘attitudes’ concerning the same ‘thing.’ For the ‘thing’ simply is not the same for different people whether the ‘thing’ is a football game, a presidential candidate, Communism, or spinach.”
Kahan and colleagues called the phenomenon “cultural cognition,” referring to the way that our beliefs and values can unconsciously influence how we perceive the world around us. “Who saw what,” they concluded, depended “on the relationship between the event and the subjects’ own values.”
One study presented participants with identical welfare policies that were said to be strongly supported either by the majority of congressional Democrats or by the majority of congressional Republicans. When the policy was said to be favored by Democrats, self-described Democrats said it was a good policy, and Republicans said it was a bad policy. When the policy was said to be favored by Republicans, the pattern reversed.
Some studies, for instance, presented people with scientific evidence supposedly confirming or contesting the efficacy of some policy, such as the death penalty or gun control measures. Again, when people liked the conclusions, they rated the “evidence” as more persuasive; when they didn’t like the conclusions, they rated the evidence as weaker.
Post-truth
In 2016, Oxford Dictionaries declared post-truth the word of the year, defining it as “relating to or denoting circumstances in which objective facts are less influential in shaping public opinion than appeals to emotion and personal belief.”
This is sometimes called relativism, in contrast to our earlier absolutism, and it makes the challenge of determining what’s true infinitely more difficult. At this point in our intellectual development, Kuhn explains, many of us fall deep into “a poisoned well of doubt.” After all, if knowledge consists not of facts but of opinions, and if opinions are basically chosen rather than ascertained from some absolute reality, then by what standard is any one opinion superior to any other, man? Hoisting oneself out of the deep well of doubt “is achieved at much greater effort than the quick and easy fall into its depths,” Kuhn goes on. The key lies in reconciling the insights of both absolutism and relativism into a happy middle ground. Yes, all knowledge is uncertain; we can’t usually crack reality open and peer directly into its depths to ascertain what’s true. Some degree of judgment is required.
Fact vs opinion
Usually we can measure opinions against reality by some kind of yardstick. “While everyone has a right to their opinion, some opinions are in fact more right than others, to the extent that they are better supported by argument and evidence,” Kuhn explains. She calls this our “evaluativist” phase.
We see the world through the lens of our existing beliefs. When we think something is true, it looks more like a fact. When we think something isn’t true, it looks like an opinion.