How Cognitive Effort Influences Our Beliefs

It was about 10 years ago that I first heard the term “cognitive shortcut.” We all use them, and we must, because we can’t possibly research every issue thoroughly enough to reach a well-supported conclusion. So we use shortcuts, group identification being the most obvious. But that isn’t the only way cognitive shortcuts affect what we believe and how easily we accept a claim, as Katy Waldman notes.

Truthiness is “truth that comes from the gut, not books,” Colbert said in 2005. The word became a lexical prize jewel for Frank Rich, who alluded to it in multiple columns, including one in which he accused John McCain’s 2008 campaign of trying to “envelop the entire presidential race in a thick fog of truthiness.” Scientists who study the phenomenon now also use the term. It humorously captures how, as cognitive psychologist Eryn Newman put it, “smart, sophisticated people” can go awry on questions of fact.

Newman, who works out of the University of California–Irvine, recently uncovered an unsettling precondition for truthiness: The less effort it takes to process a factual claim, the more accurate it seems. When we fluidly and frictionlessly absorb a piece of information, one that perhaps snaps neatly onto our existing belief structures, we are filled with a sense of comfort, familiarity, and trust. The information strikes us as credible, and we are more likely to affirm it—whether or not we should.

It’s simply easier to accept claims that fit with what we already believe, and that isn’t automatically a bad thing. Philosophers like to talk about our “background knowledge” as rightly influencing what we are and aren’t going to accept as valid. But this is also one of the roots of confirmation bias, and we can go badly astray when we casually accept a claim just because it comports with our preconceived notions. This is why you see people on Facebook so readily accepting fake quotes attributed to people, especially those they consider political enemies.

Comments
  • Abby Normal

    Yea, that sounds about right.

  • John Pieret

    Yea, that sounds about right.

    Abby Normal is always on the right side of the issues, so I agree.

  • janicot

    One of my favorite quotes is from George Polya:

    “Do not believe anything, but question only what is worth questioning”.

    BTW I know it’s good because he was never wrong about anything (like Abby Normal).

  • a_ray_in_dilbert_space

    Bayesian probability: Past experience affects the probability we assign to interpretations of current facts, and even to the facts themselves as they whiz by us.

    However, if your prior probability for any outcome is zero (“It can’t happen”), your posterior probability will always be zero (“It didn’t happen”), no matter what the evidence says.
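
    A minimal worked form of that point, assuming nothing beyond textbook Bayes’ rule:

        P(H | E) = P(E | H) · P(H) / P(E)

    If the prior P(H) = 0, the numerator is zero, so the posterior P(H | E) = 0 no matter how strong the evidence E is; no amount of data can revive a hypothesis assigned zero prior probability.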

  • abb3w

    Actually, this looks a little more complicated than merely confirmation bias and background knowledge.

    Occam’s Razor says that the simplest idea consistent with the data is the one most probably correct. There’s a mathematical theorem with a formalized “simplest” for which this can be proven. The measure can be thought of as the size of a computer program: among programs that output the required data, shorter is better.

    However, what this seems to be saying (though the details in the paper may indicate otherwise; I couldn’t figure out which of Eryn Newman’s papers this was about) is that humans don’t measure by the size of the program but by how long the program takes to finish running. That provides an efficient way around the halting problem, but at the price that ideas whose implications are hard to work out may be dismissed even when correct (formal versions of both measures are sketched below).

    There may also be a secondary optimization, akin to a cost on writing new information to long-term storage.

    Of course, my assessment may just result from how it neatly fits into my existing cognitive structures….
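
    For the record, here are standard formalizations of the two measures above (textbook definitions, not drawn from Newman’s paper): Kolmogorov complexity scores an idea by the length of the shortest program that reproduces the data,

        K(x) = min { |p| : U(p) = x }

    where U is a fixed universal machine. Levin’s time-weighted variant, Kt, also charges for runtime:

        Kt(x) = min { |p| + log t(p) : U(p) = x }

    Because each candidate program is only ever run within a bounded time budget, bounding Kt never requires solving the halting problem; the trade-off is exactly the one described above, since a fast-to-evaluate idea can beat a shorter but slower one.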

  • felidae

    I think we should call this the “bumper sticker effect.”

  • http://cheapsignals.blogspot.com Gretchen

    Would calling it “intuition” have sounded too feminine? Because only women are supposed to have intuitions, or something?

    Because that’s what a conclusion based on “gut feelings” is. It doesn’t need a new name. Jesse Prinz has an excellent book about it called Gut Reactions: A Perceptual Theory of Emotion.

    Jonathan Haidt’s work is largely about the extent to which our moral stances are derived from intuitions. He and Prinz agree on this, but have different beliefs about its origin: Haidt thinks these moral intuitions are evolved (at least in part), whereas Prinz thinks they are socially constructed.

  • grumpyoldfart

    Believe nothing you hear and only half what you see.

  • Michael Heath

    At least for me, the most powerful tool for pausing before intuitively accepting something as true is to stay cognizant that we’re too easily susceptible to certain assertions in certain contexts. An assertion presented by someone with whom we identify shouldn’t trigger reflexive acceptance; it should instead raise a warning flag that we’re vulnerable to being gullible about that very assertion. Not that we should distrust a scientist versus ‘some person on the Internet’; only that we need to consider such assertions actively, not passively.

    When Ed blogged at ScienceBlogs, a couple of commenters there used to cite the findings on this topic and occasionally elaborate on the research field. Those comments were part of some very useful comment threads, which were unfortunately deleted after National Geographic purchased that group blog.