Satisfying Narratives and Feynman’s First Principle

May 5, 2016


“The first principle is that you must not fool yourself — and you are the easiest person to fool.”

— Richard Feynman, 1974

About a week ago, I posted a piece of satire that some people enjoyed, some rolled their eyes at, and some were confused by. That's not entirely unexpected. I was taking out some frustration in vague (and not-so-vague) ways through that piece, so I don't fault anyone for reading into or misreading anything I said there.

But I feel like there are some serious points that are worth ripping out of that satirical context for a second look.

The day after I published the aforementioned piece, the great site FiveThirtyEight published a piece by Daniel Engber entitled “Who Will Debunk The Debunkers?” As a person who values skepticism (and enjoys the quality of 538’s posts in general), I was intrigued. Engber starts off with a story many of us may have heard:

Popeye loved his leafy greens and used them to obtain his super strength, Arbesman’s book explained, because the cartoon’s creators knew that spinach has a lot of iron. Indeed, the character would be a major evangelist for spinach in the 1930s, and it’s said he helped increase the green’s consumption in the U.S. by one-third. But this “fact” about the iron content of spinach was already on the verge of being obsolete, Arbesman said: In 1937, scientists realized that the original measurement of the iron in 100 grams of spinach — 35 milligrams — was off by a factor of 10. That’s because a German chemist named Erich von Wolff had misplaced a decimal point in his notebook back in 1870, and the goof persisted in the literature for more than half a century. […]

All these tellings and retellings miss one important fact: The story of the spinach myth is itself apocryphal. It’s true that spinach isn’t really all that useful as a source of iron, and it’s true that people used to think it was. But all the rest is false: No one moved a decimal point in 1870; no mistake in data entry spurred Popeye to devote himself to spinach; no misguided rules of eating were implanted by the sailor strip. The story of the decimal point manages to recapitulate the very error that it means to highlight: a fake fact, but repeated so often (and with such sanctimony) that it takes on the sheen of truth.

In that sense, the story of the lost decimal point represents a special type of viral anecdote or urban legend, one that finds its willing hosts among the doubters, not the credulous. It’s a rumor passed around by skeptics — a myth about myth-busting. Like other Russian dolls of distorted facts, it shows us that, sometimes, the harder that we try to be clear-headed, the deeper we are drawn into the fog.

If you consider yourself a skeptic and don’t find this disquieting, I question your skepticism. (Or maybe you’ve heard this one before.)

The irony that Engber lays out by discussing cases like this is that we have gotten good at fashioning tidy, inaccurate, satisfying stories to explain the dangers of fashioning tidy, inaccurate, satisfying stories as explanations.

Narratives have a special kind of power. In some ways, they are the way we make sense of the sensory information we encounter during every conscious moment, the framework we use to synthesize it all into a coherent whole that we can parse. But like so many other cognitive processes, they can go awry and actually lead us to — shall we say — play a little fast and loose with the data.

Some of this is completely unconscious, too. This past fall, I attended Skepticon 8, and Saturday night’s final speaker was Destin Sandlin of the YouTube channel SmarterEveryDay. (You can read a full account of that event here.) Sandlin was speaking on topics both favorable (skepticism!) and unfavorable (science and faith…) to the audience.

At one point, Sandlin was making a point about how science and faith can be compatible, and he mentioned that the idea of the “Big Bang” was conceived by a priest, Georges Lemaître. This was entirely a passing remark, meant just to bolster the idea that a religious person could in fact do good science (which I consider a trivial claim, but nevertheless), but someone in the audience not too far from me yelled, not exceptionally loudly, “No, that was Fred Hoyle!” ¹

This was a rather oddly timed debunking, but aside from that, it was also a false one. Fred Hoyle did indeed coin the term “big bang,” but Lemaître did in fact conceive the model and Hoyle rejected Lemaître’s interpretation in favor of his own “steady state” model.

Was that person, as Engber puts it, “blinkered by their own feelings of superiority” over Sandlin, a Christian? I don’t know, but something about the idea that Hoyle (an atheist) was responsible rather than a Christian, or maybe something about the idea of Sandlin spreading an easily debunked falsehood, must have seemed right. And there was a “fact” just waiting in the wings of that person’s memory, ready to fill the empty spot where the actual truth should have gone.

It wasn’t malice. It wasn’t even necessarily ignorance. It was just a damned good narrative.

Properly conceived, our mental narratives would be pliable, ready to adapt to new information as we observe things that challenge our existing stories about reality, about ourselves, about others. But too often they aren’t. We look at a moral person suffering and think, “Well, there must be some reason for that.” We see people like us and think, “They must be trustworthy.” (Or conversely, we look at people not like us and suspect the worst.)

And more significantly, we let our emotions about our narratives shape them. That was ultimately the point of my satire: that too many people are willing to accept a satisfying story over a well-substantiated one (or over abstaining from a conclusion in the absence of good evidence). If people disagree with you, it’s easier — and more satisfying — to think the worst of them in terms of their character, knowledge, or intellect than to accept that there might not be sufficient evidence for your own position, or even that the root of the disagreement is not so easily identified at all. If your candidate is losing, it’s easier — and again more satisfying — to posit systemic problems with the electorate or the voting system than to accept your candidate’s own failure to persuade voters. And so on.

These are not trivial problems. They are deeply serious ones, the kind of cognitive errors that can thoroughly obscure the truth from us. They are also the hardest to identify.

Which is where I come back to Feynman, whose “first principle” is a note of caution that every person should take under constant advisement: You must not fool yourself — and you are the easiest person to fool.

If something seems too good to be true, it probably is. If your narrative for a certain trend or a person’s actions or a problem’s cause is too satisfying, then stop and check it again — and again and again. And always be wary of convenient stories, ones that confirm your prejudices and don’t challenge your comfortable view of the world.

We are our own storytellers. It’s up to us to make sure that we aren’t telling ourselves fairy tales.


Image via Pixabay

¹ For the record, Sandlin either didn’t hear or simply ignored the comment, which was better for all involved.
