The late Senator from New York, Daniel Patrick Moynihan, is often credited with saying, “Everyone is entitled to their own opinion, but not their own facts.”
Go to the Web to see if anyone agrees with some pet theory or hunch of yours: could all that white pizza you ate last night be the cause of your dizzy spell? Isn’t it weird how sinkholes seem to appear whenever a drone flies over the same area?
Whatever it is you are “wondering about,” go to the Net, search on the key words, and you will almost certainly find one or more sites or persons somewhere with a similar suspicion or belief. And when we engage in self-confirmation bias, we use whatever shreds of confirming information we turn up to “prove” to ourselves that, aha, there was indeed something to our suspicion after all! But most of the time these “discoveries” are only a reflection of the power of large numbers: with billions of pages on the Web, and who knows how many words, created by millions of people, probability alone makes it very likely that some page, someplace, created at some point, supports your pet theory.
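The arithmetic behind this large-numbers effect is easy to sketch. The per-page probability and page count below are hypothetical, chosen only to show how a vanishingly unlikely match on any single page becomes a near-certainty across the whole Web:

```python
# Hypothetical numbers, purely for illustration:
p_match = 1e-8   # assumed chance that any single page echoes your hunch
n_pages = 1e9    # assumed rough scale of pages on the Web

# P(at least one page matches) = 1 - P(no page matches at all)
p_at_least_one = 1 - (1 - p_match) ** n_pages
print(round(p_at_least_one, 5))  # very close to 1: a match is nearly guaranteed
```

A one-in-a-hundred-million event per page, repeated a billion times, yields better than 99.99% odds of at least one “confirmation” existing somewhere.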
And once you are convinced that you’ve found the truth or confirmed an existing point of view, it may not matter to you what actual evidence exists to dispute it or provide alternative explanations. According to recent research by Brendan Nyhan, PhD; Jason Reifler, PhD; Sean Richey, PhD; and Gary L. Freed, MD, MPH, on efforts to educate parents about the safety of vaccinations, if we hold a particular belief firmly, we find ways to hold on to and justify that prior viewpoint even when shown clear contradictory evidence, particularly if the belief is important to how we view the way the world works. (In fact, the research has even shown that presenting someone with disconfirming information can backfire and actually increase the person’s belief in the incorrect information.) This gets into the complex and fascinating territory of coincidence, conspiracy theories, and fringe beliefs, which I will discuss in future blog posts.
If a man is offered a fact which goes against his instincts, he will scrutinize it closely, and unless the evidence is overwhelming, he will refuse to believe it. If, on the other hand, he is offered something which affords a reason for acting in accordance to his instincts, he will accept it even on the slightest evidence.
And not only, as Bertrand Russell noted, does there appear to be a general human propensity to reject information that contradicts one’s existing beliefs, but over the last 20 years or so in the U.S. there has probably been an increase in the distrust of others’ differing views, even, or especially, when they come from authoritative sources.
This rise in the propensity to reject others’ views can be attributed to several reinforcing factors: increased polarization of values within the country and growing cynicism about the integrity of our institutions. Those institutions include corporations, educational institutions, the medical establishment, religious organizations, and of course the government, where distrust is driven in particular by the corrupting influence of money. All this distrust has seeped into a larger suspicion of pronouncements from on high, from “the experts,” and even of what some deride as experts’ “so-called facts.”
And it’s hard to blame people for not trusting what our institutions and authorities say and do. Examples of misplaced trust abound: weapons of mass destruction in Iraq; preventable fatal errors in hospitals; cover-ups of misdeeds at VA hospitals and in the church; universities pushing certain student loans for self-interested purposes; doctors writing addictive OxyContin prescriptions; and the list goes on…
Some people also reject evidence and facts when the disconfirming information could lead to consequences they simply do not like or want to happen. For example, some who dispute the reality and consequences of human-caused climate change do so out of concern that accepting that reality could (and probably would) lead to more government rules and regulation, which to those persons would be anathema.
Of course, it’s not just some conservatives whose ideology trumps facts. There are also some on the left who will reject out of hand any data, evidence, or findings that could create what they perceive as undesirable changes in government policies, laws, or accepted social practices.
Finally, almost nobody, other than perhaps Socrates, who famously said “I know that I know nothing,” thinks they are wrong. (Though, if challenged on his remark, perhaps Socrates too would have clung strongly to his core belief that he “knows nothing”!)
Ultimately, when it comes to deciding whether to believe something that seems outlandish, contrary to your own accepted beliefs, fringe-y, politically untenable, unheard of, or just weird, so much of it comes down to trust. And we all have different touchstones on whom and what we trust. Some trust The New York Times. Some trust their gut. Some trust results of careful scientific research. Some trust their own life experiences. Some trust their intuition. Some trust MSNBC. Some trust what their parents told them. Some trust Oprah. Many of us simply trust when we hear something that confirms what we already agree with or suspect is true.
Bottom line: the matter of whom to trust is complex, and of course there is no single way to decide. But here are five principles I try to follow when deciding whether or not to trust an unknown source:
1. We don’t know everything, so remain open to what others have to say. Try to set aside, or at least loosen your grip on, your current beliefs when doing research. As someone once remarked, “research is a river—it takes you where it wants you to go.”
2. Our current state of knowledge and understanding of the world has changed and will change again. Just look at dietary recommendations. Before the 1960s, eating fats was considered fine. In the 1970s and 1980s, fat was seen as the enemy of good health. Later, it was determined that some fats were OK and others were not. Today some researchers are promoting eating more fat overall. Since the best predictor of the future is the past, you can probably count on the current perspective changing again (and again and again) as we move into the future.
3. Apply the critical thinking of the journalist and the scholar: be open but also skeptical. Be open because we don’t know everything; be skeptical because you need some yardstick for resolving conflicting claims and coming to your own decision.
4. Understand and apply the notion of probability as a check on jumping to conclusions about the larger meaning of a coincidence. Remember that when it comes to “amazing coincidences” we all have a tendency to remember the hits but forget the misses, and that, as my first-year college probability professor told our class, “rare events do happen.” (But remember too that some coincidences can in fact be meaningful: the odd and unexpected coinciding of events is significant if it reflects some unseen, hidden, or yet-to-be-discovered underlying force or reason behind the surprise.) Again: open but skeptical.
5. What seems to be in opposition does not have to be contradictory; things are not always mutually exclusive and either/or. Natural climatic cycles AND human-induced warming can both exist; there is corruption in some medical circles, but vaccinations are STILL most likely safe. Stay alert to false and unnecessary either/or dichotomies.
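The “hits and misses” point in principle 4 can be made concrete with a short simulation (a sketch with made-up parameters, not any formal study): in a million fair coin flips, a streak of ten heads, roughly a one-in-a-thousand event at any given position, still shows up hundreds of times.

```python
import random

random.seed(0)  # fixed seed so the illustration is reproducible

flips = [random.random() < 0.5 for _ in range(1_000_000)]  # True = heads

# Count every position where a streak of 10 consecutive heads begins.
streaks = sum(all(flips[i:i + 10]) for i in range(len(flips) - 9))
print(streaks)  # on the order of 1_000_000 / 2**10, i.e. close to a thousand
```

Seen in isolation, any one of those streaks looks “amazing”; seen against the million flips that produced it, it is exactly what probability predicts.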
How do you determine who to trust on the Web?
Next Week: How Wise is the Crowd Really? Exploring the Phenomenon of the Wisdom of Crowds