It was about 10 years ago that I first heard the term “cognitive shortcut.” We all use them, and we must, because we can’t thoroughly research every issue to reach a well-supported conclusion. So we rely on shortcuts, group identification being the most obvious. But that isn’t the only way cognitive shortcuts shape what we believe and how easily we accept a claim, as Katy Waldman notes.
Truthiness is “truth that comes from the gut, not books,” Colbert said in 2005. The word became a lexical prize jewel for Frank Rich, who alluded to it in multiple columns, including one in which he accused John McCain’s 2008 campaign of trying to “envelop the entire presidential race in a thick fog of truthiness.” Scientists who study the phenomenon now also use the term. It humorously captures how, as cognitive psychologist Eryn Newman put it, “smart, sophisticated people” can go awry on questions of fact.
Newman, who works out of the University of California–Irvine, recently uncovered an unsettling precondition for truthiness: The less effort it takes to process a factual claim, the more accurate it seems. When we fluidly and frictionlessly absorb a piece of information, one that perhaps snaps neatly onto our existing belief structures, we are filled with a sense of comfort, familiarity, and trust. The information strikes us as credible, and we are more likely to affirm it—whether or not we should.
It’s simply easier to accept claims that fit with what we already believe. And this isn’t automatically a bad thing. Philosophers like to talk about our “background knowledge” as rightly influencing what we will and won’t accept as valid. But this is also one of the roots of confirmation bias, and we can go badly astray when we casually accept a claim simply because it comports with our preconceived notions. This is why you see people on Facebook so easily accepting fake quotes attributed to public figures, especially figures they consider political enemies.