
On Refusing Epistemic Interaction: Dealing with People Who Won’t Accept Facts

September 13, 2021

I recently produced a piece or two on the ontology of abstract objects and the epistemic barrier; go and read them for the foundation. But let’s now get psychological.

One thing that came up in conversation between two commenters is the problem of interacting with people who refuse to accept facts and data contrary to their beliefs, and who suffer such cognitive dissonance that changing their minds is a hopeless pipe dream.

This was a comment from one of the two (Chris Morris):

How to deal with someone refusing to “pierce an epistemic barrier” is an interesting (and increasingly urgent, I think) problem.

I’ve just asked someone on Facebook who has posted an article about the 18th century slave trade as a business if they can express any sympathy with the victims of the trade. This is part of the response:

You have not given any evidence of any specific suffering of fellow human beings..involved in the slave trade assuming it seems that people are better off dead than alive… Some years ago I heard an interview on French Radio in which an interviewer asked a descendent of a King in French West Africa if he was not ashamed of his ancestors role in the slave trade…He replied that he was proud of him as a humanitarian because as I have said both transportation and the slave trade gave people the chance of life rather than death… Many of those involved especially when Bristol was the main port were Quakers who saw it as a special mission to save people,… What you perhaps are not fully aware of is that the anti-slave trade movement (as the descendent of one of its driving forces wrote) “was the first successful propagandist agitation of the modern type, and its methods were afterwards imitated by …myriad societies and leagues.” And the whole point of propaganda is that it is one sided and will use any means to win the argument on the basis that “the end justifies the means”… And both the Scots and the Irish have a long history of successfully distorting the truth “in a just cause”. And propaganda usually tries to appeal to the most visceral and powerful emotions of armchair readers who are quite prepared to imagine the very worst of other people…(and were in the age of Gothic Horror) …And one interesting piece of information that I gleaned only from C.R. Fay (a proud Lancastrian) was that one of the ways that Liverpool managed to undercut Bristol and become the main slave trading port was by using the notorious “parish apprentices” system (later used in Lancashire factories) in order to staff slave ships with pauper apprentices forced to sea, and no doubt such lads forced away from their families and out to sea were very ready to complain about almost every aspect of their lives and the trade.. But this idea of slave ships being manned by children just underlines the fact that Africans were in control of what went on in Africa, and made sure the system worked to their benefit… which was the main point of my OP..

He assures me that my view of the slave trade is a “Hollywood fairy tale”.

This reminds me of the video I posted the other day about how Trump supporters just don’t accept facts and fact-checking, even when they are right in front of their faces:

Why does this happen? If it doesn’t cost you anything to stay in your nonsense bubble – when you pay no penalty but gain certain psychological rewards – then you will stay there. Moreover, people invest their emotions and identities in the claims they make. “Trump won the election and Biden didn’t – it was a fraud” becomes who I am, the clothes I wear, the people I hang out with or talk to online, and so on. When that is what drives you, facts play second fiddle.

A fact is no longer something undeniable; it just needs to be something that makes sense to the person claiming it. And the same goes, in reverse, for denying the facts presented by others. If a consensus fact doesn’t cohere with everything else about someone’s life, they can simply drop it. A “fact” becomes a preference and a comfort, and stops being, well, a fact.

In philosophy, we have the Correspondence Theory of truth, the Coherence Theory, logical validity, and other such rational positions. But what prevails for the masses appears to be the smell test. If it sounds good, smells good, feels good, it’s in, it’s true. Throw in some confirmation bias, and if some new “fact” confirms or coheres with your bedrock of nonsense, then you get a burgeoning nonsense network.

When you have time, read the New Yorker’s “Why Facts Don’t Change Our Minds”. After all, you can’t reason someone out of a position they didn’t reason themselves into:

If reason is designed to generate sound judgments, then it’s hard to conceive of a more serious design flaw than confirmation bias….

Mercier and Sperber prefer the term “myside bias.” Humans, they point out, aren’t randomly credulous. Presented with someone else’s argument, we’re quite adept at spotting the weaknesses. Almost invariably, the positions we’re blind about are our own….

In “Denying to the Grave: Why We Ignore the Facts That Will Save Us” (Oxford), Jack Gorman, a psychiatrist, and his daughter, Sara Gorman, a public-health specialist, probe the gap between what science tells us and what we tell ourselves. Their concern is with those persistent beliefs which are not just demonstrably false but also potentially deadly, like the conviction that vaccines are hazardous. Of course, what’s hazardous is not being vaccinated; that’s why vaccines were created in the first place. “Immunization is one of the triumphs of modern medicine,” the Gormans note. But no matter how many scientific studies conclude that vaccines are safe, and that there’s no link between immunizations and autism, anti-vaxxers remain unmoved. (They can now count on their side—sort of—Donald Trump, who has said that, although he and his wife had their son, Barron, vaccinated, they refused to do so on the timetable recommended by pediatricians.)

The Gormans, too, argue that ways of thinking that now seem self-destructive must at some point have been adaptive. And they, too, dedicate many pages to confirmation bias, which, they claim, has a physiological component. They cite research suggesting that people experience genuine pleasure—a rush of dopamine—when processing information that supports their beliefs. “It feels good to ‘stick to our guns’ even if we are wrong,” they observe.

The Gormans don’t just want to catalogue the ways we go wrong; they want to correct for them. There must be some way, they maintain, to convince people that vaccines are good for kids, and handguns are dangerous. (Another widespread but statistically insupportable belief they’d like to discredit is that owning a gun makes you safer.) But here they encounter the very problems they have enumerated. Providing people with accurate information doesn’t seem to help; they simply discount it. Appealing to their emotions may work better, but doing so is obviously antithetical to the goal of promoting sound science. “The challenge that remains,” they write toward the end of their book, “is to figure out how to address the tendencies that lead to false scientific belief.”

Then, of course, there is cognitive dissonance, a common subject around here (Psychology Today):

As a currently relevant example, consider that incontrovertible proof has been discovered that Donald Trump has lied, cheated, stolen, and committed very serious crimes that most people would face significant jail time for. What’s more, he has implemented policies that are hurting many people who voted for him. Nevertheless, a great many people who voted for Trump vigorously defend their choice, despite the tremendous amount of credible, incriminating information about him that has come to light since November 8, 2016. Information that, had they been aware of before then, might have dissuaded some from casting their ballot the way they did.

In practice, here is how the people who voted for Trump reduce their cognitive dissonance when confronted with an ever increasing amount of highly disturbing facts about him:

  1. Change their behavior or belief by integrating the conflicting information into their world view. For example, “I will no longer support our current leadership, and I will vote differently in the next election.”
  2. Justify their behavior or belief by changing the conflicting cognition. For example, “The president is doing what’s best for the country and is the victim of a political witch hunt.”
  3. Justify their behavior or their beliefs by adding new cognitions. For instance, “The president is a successful businessman and must know what he’s doing—he’s just too smart for most people to understand.”
  4. Ignore or deny information that conflicts with their existing beliefs. For example, “All that stuff is just ‘fake news,’ and you can’t trust it.”

The bottom line is that when there is a conflict between our attitudes and our behavior, we tend to change our attitudes to make them consistent with our behavior rather than change our behavior to make it consistent with our attitudes.

What do we do with people who refuse to accept plain facts, with people who will more happily bury their heads in the sand than change their minds or accept that they were wrong?

This is about education, but also about tribalism – or desiring to move away from it.

On the education front, we could take a leaf out of Finland’s book, with its formalised war on misinformation. You can watch this hilarious segment from one of Samantha Bee’s contributors (I genuinely enjoyed this):

Here is a more serious exposition:

Education and critical thinking: this is the key. We all need to be aware of our biases in order to mitigate them as much as possible. My former writing colleague, psychologist Dr Caleb Lack, wrote an excellent chapter on cognitive biases and the hopes of mitigating them in one of my earlier books, 13 Reasons to Doubt [UK]. It’s worth buying just for that! As BBC Future observes:

The first theory of confirmation bias is the most common. It’s the one you can detect in expressions like “You just believe what you want to believe”, or “He would say that, wouldn’t he?” or when someone is accused of seeing things a particular way because of who they are, what their job is or which friends they have. Let’s call this the motivational theory of confirmation bias. It has a clear prescription for correcting the bias: change people’s motivations and they’ll stop being biased.

The alternative theory of confirmation bias is more subtle. The bias doesn’t exist because we only believe what we want to believe, but instead because we fail to ask the correct questions about new information and our own beliefs. This is a less neat theory, because there could be one hundred reasons why we reason incorrectly – everything from limitations of memory to inherent faults of logic. One possibility is that we simply have a blindspot in our imagination for the ways the world could be different from how we first assume it is. Under this account the way to correct confirmation bias is to give people a strategy to adjust their thinking. We assume people are already motivated to find out the truth, they just need a better method. Let’s call this the cognition theory of confirmation bias.

The article discusses some studies before stating:

The finding is good news for our faith in human nature. It isn’t that we don’t want to discover the truth, at least in the microcosm of reasoning tested in the experiment. All people needed was a strategy which helped them overcome the natural human short-sightedness to alternatives.

The moral for making better decisions is clear: wanting to be fair and objective alone isn’t enough. What’s needed are practical methods for correcting our limited reasoning – and a major limitation is our imagination for how else things might be. If we’re lucky, someone else will point out these alternatives, but if we’re on our own we can still take advantage of crutches for the mind like the “consider the opposite” strategy.

Such approaches offer long-term solutions, and for these we need to work hard and play the long game. It’s no easy win.

Regulation of the media, and especially of online media, is something that must be considered. BBC News and Sky News in the UK have responsible output because they are carefully regulated – in a good way. There certainly need to be structural and strategic plans to deal with the misinformation that supplies the content of such people’s beliefs.

The second point is about tribalism. How do we deal with that? Well, part of the problem in the US is the two-party system. The US needs much more nuanced political representation, and this definitely includes an overhaul of the Electoral College system.

More parties and better representative democracy would go some way toward halting political polarisation – a polarisation that is pushing the GOP toward an insane populist authoritarian right obsessed with a cult of personality around the Orangenfuhrer, Donald Trump.

And there’s me playing into that polarisation, no doubt.

But when we have had four years of Trump outright lying and outright denying truths, bare-faced and unrepentant, there are consequences. His followers are, well, following. They are taking more than a leaf out of Trump’s book; they are bashing their opponents with the whole book.

Humans don’t change behaviour of their own accord. Not easily. This is why I have long argued for regulation to drive environmental change – it simply won’t be consumer-led. Ban the tungsten lightbulb and you get innovation in producing much cheaper and more efficient LED lights and the like. When it comes to tackling rampant disinformation, and the resultant cultish conspiracy theorists refusing to accept obvious truths, the challenge is obvious, but the problem is pervasive and deep-rooted, as well as multifarious.

The most immediate solutions will be things like regulation, and things like Facebook and Twitter fact-checking. But you can see what people do with those from that CNN video above: it’s all just part of the liberal elite controlling social media.

Just another facet of the conspiracy theorist’s model of the world.

Even from a short-term conversational point of view, I like to apply logical analogies to someone’s claims to tie them in knots and expose their rational inconsistencies. But even this obvious exhibition of an interlocutor’s rational inadequacies usually just makes them angry, entrenching them in their original position all the more vehemently and emotionally. It happened only last night when I slam-dunked one of my 11-year-old twins with a logical sucker punch. Yeah, take that, kid. He just got angry and entrenched.

That’s a lotta words to conclude that it’s an uphill struggle with no clear solution. Any ideas?

 





