Why Do We Believe in the Unbelievable? It’s Natural

January 10, 2018

Pixabay

By Rick Snedeker

No wonder we human beings chronically believe in unbelievable things, such as the existence of gods and demons and “love at first sight.”

Indeed, science has repeatedly demonstrated that such willful credulity may be as much a natural function of our humanity as the compulsion to join social clubs. In fact, the tendencies are related.

Several new books speak to this cautionary reality, as discussed in a fascinating piece by Elizabeth Kolbert in The New Yorker titled “Why Facts Don’t Change Our Minds: New Discoveries About the Human Mind Show the Limitations of Reason.”

Kolbert noted that subjects of a 1975 Stanford University study, who were purposely deceived into thinking they could easily distinguish an actual suicide note from a bogus one, continued to believe their natural perceptivity was superior to others’ even after learning they had been tricked into that false assumption. In fact, all the test subjects had proved more or less equally able to spot ringers. The researchers “dryly observed” that, despite evidence they had been bamboozled, the subjects’ mental “impressions [remained] remarkably perseverant,” Kolbert wrote.

Researchers noted a similar effect in another Stanford study a few years later, in which subjects were again fooled, this time into believing they had a knack for identifying risk-takers over more cautious folks. The scientists found that even when the hoodwinked subjects were shown incontrovertible evidence that they were dead wrong, they failed to “make appropriate revisions in those beliefs.”

Kolbert asks, “How did we come to be this way?”

She refers to a new book by European cognitive scientists Hugo Mercier and Dan Sperber, The Enigma of Reason (2017). The authors conclude that reason itself is “an evolved trait, like bipedalism or three-color vision. It emerged on the savannas of Africa, and has to be understood in that context.” Mercier and Sperber believe their research supports the idea that “hypersocial” humans developed reason not to make rational decisions but to “resolve the problems posed by living in collaborative groups,” Kolbert wrote. Cooperative alliances are very difficult to create and sustain, and are best nurtured by communal consensus, the researchers explained. They noted that reason also evolved to help humans avoid “being screwed by members of their group,” so to speak, according to Kolbert.

The tendency to automatically believe in chimeras may seem odd or irrational to some in a modern, empirical context. But such beliefs present as “shrewd” in a primitive hunter-gatherer environment, where shared beliefs and personal social success within a finite group were probably much more immediately critical than, say, skillfully parsing metaphysics, the researchers concluded.

Other studies show that humans are not only quick to accept the beliefs and core assumptions of their group; they then cling jealously to them and aggressively attack any opposing ideas. Communal consensus is a closed, self-perpetuating system and, in my view, a convincing template for how religions form, become rooted in societies and survive for millennia.

The importance of mankind’s social instincts is reiterated in another book published in 2017, The Knowledge Illusion: Why We Never Think Alone, by American cognitive scientists Steven Sloman and Philip Fernbach. In a study using, oddly, toilets, Sloman and Fernbach demonstrated that people tend to subconsciously conflate the knowledge and skills of their entire in-group with their own, leading them to vastly exaggerate their personal capabilities. Toilets, it turns out, are far more complicated than they seem, as the test subjects discovered. The study revealed that because anyone can easily flush a toilet, most people assume they intuitively understand how toilets work far better than they actually do. This is how communal “reasoning” often works.

In the same way, people instinctively believe they completely understand knives while being clueless about metallurgy, the authors noted. Yet a person is still more powerful with a knife than without one, even if it was created by someone else in the group who in turn may be ignorant of how knife-making arose in the first place. Unfortunately, this tendency to meld with one’s group is double-edged: it can also lead people who falsely believe in phantasms and in their own personal power to make irrational and potentially dangerous decisions affecting the group.

Think Donald Trump and the GOP.

Sloman and Fernbach see science as a discipline that can take people’s natural tendency toward flawed thinking and redirect it in more verifiable directions. Because science requires solid evidence, repeatedly reconfirmed, the system inevitably leads toward concrete, objective reality, even as human cognition remains congenitally mired in the superstitions and gullibility of our species’ primordial past.

Another new book, published in 2016, shows how our primitive tendencies toward self-delusion can have dangerous consequences in the present. In Denying to the Grave: Why We Ignore Facts That Will Save Us, authors Jack Gorman, a psychiatrist, and his daughter, Sara Gorman, a public-health specialist, warn that irrational resistance to vaccination is but one egregious example of such biased thinking. Of course, science has resoundingly demonstrated that vaccinations (“one of the triumphs of modern medicine,” the Gormans wrote) can safely halt infectious disease in its tracks. What’s hazardous is not being vaccinated. Still, so-called anti-vaxxers persist, often for religious reasons, as objectors did in Boston in the 1700s, when inoculation was first introduced in America.

The Gormans conclude that such instinctive human bias must have had adaptive value in the dim past, noting that some research suggests “people experience genuine pleasure—a rush of dopamine—when processing information that supports their beliefs.” In other words, it’s pleasurable to be stubbornly defiant in the face of opposition.

If ever there was a time in America to emphasize critical thinking, it’s now, in the maelstrom of the “post-truth” era. These new books on mankind’s perpetual self-delusions give renewed vigor to the idea that the best way forward is always to rigorously question, not heedlessly embrace, our natural instincts. To stop and think deeply about what we’re thinking.


Rick Snedeker is a retired newspaper and magazine writer and editor currently living beside a lovely creek in the upper Midwest. Over a 40-year career, he lived and worked in Arizona, Saudi Arabia and South Dakota. He continues to write for various U.S. newspapers, magazines and online media, and at his personal blog, the Apostate Apostle, which focuses on humanist topics and the history of religious skepticism.

