When Do You Reject Your Intuitions?

A while ago, a commenter emailed me to ask if I could recommend any books on human cognitive bias, and now that I’ve finished Thinking, Fast and Slow by Daniel Kahneman, I can, with great enthusiasm.  When we study flaws in human reasoning, we usually start with glaring ones, and find out that they’re just the most obvious examples of a broader problem (and the subtler errors are the more pernicious ones).  In the book, Kahneman has a really interesting riff on the Müller-Lyer illusion.

All the lines are the same length, but the different orientations of the arrows trick you into thinking the middle one is longest.  Kahneman writes:

Now that you have measured the lines, you – your System 2, the conscious being you call “I” – have a new belief: you know that the lines are equally long. But you still see the bottom line as longer… To resist the illusion, there is only one thing you can do: you must learn to mistrust your impression of the length of lines when fins are attached to them. To implement that rule, you must be able to recognize the illusory pattern and recall what you know about it. If you can do this, you will never again be fooled by the Müller-Lyer illusion. But you will still see one line as longer than the other.

Learning that we err isn’t enough to fix our flaws.  It’s a constant struggle against a part of our nature to not get fooled by our heuristics.  And, as Kahneman points out, sometimes we’re never going to beat them, we’re just going to be able to remember we’re wrong in time to not act on them.

I have mild prosopagnosia (face-blindness) and I have a lot of trouble recognizing people.  My junior year of college, I had a lot of trouble telling my roommate (GirlOne) apart from a different girl who had a leadership position in the debate group I was running (GirlTwo).  This meant that, about once a week, I’d come back to the suite, or go from my room into the common room, and be convinced that GirlTwo was in my dorm — and since there was no reason she’d be there casually, this presumably meant the debate group was having some kind of political crisis, and I’d start feeling panicky.

It was never the case that GirlTwo was lying in wait for me in the common room — it was always just my roommate, GirlOne.  I couldn’t stop making the visual error, but I got a lot better at remembering that my intuition was pretty much always wrong, so I felt less jumpy.  I had to learn to stop privileging my flawed reactions and actively practice overriding my senses.

Eliezer Yudkowsky highlights a different sphere where we have to strive against our intuitions in the introduction to his sequence on quantum mechanics.  He writes:

I am not going to tell you that quantum mechanics is weird, bizarre, confusing, or alien. QM is counterintuitive, but that is a problem with your intuitions, not a problem with quantum mechanics. Quantum mechanics has been around for billions of years before the Sun coalesced from interstellar hydrogen. Quantum mechanics was here before you were, and if you have a problem with that, you are the one who needs to change. QM sure won’t. There are no surprising facts, only models that are surprised by facts; and if a model is surprised by the facts, it is no credit to that model…

In the coming sequence on quantum mechanics, I am going to consistently speak as if quantum mechanics is perfectly normal; and when human intuitions depart from quantum mechanics, I am going to make fun of the intuitions for being weird and unusual. This may seem odd, but the point is to swing your mind around to a native quantum point of view.

The trouble is that, in a lot of cases, it’s not as obvious that our intuitions are wrong as it is in the optical illusion or my faceblindness or quantum mechanics.  The challenge is trying to figure out which intuitions need to be subverted, and how confident we need to be to override them.  Because fighting intuitions can sound a lot like brainwashing.  In my experience with my roommate, I was literally trying to unsee what my eyes were telling me I did see.

Heuristics and reflexes aren’t bad in themselves, so how do we decide when the errors don’t outweigh the convenience, or when we want to try and subvert them in particular circumstances, or when we want to burn them out entirely?  This kind of problem is going to come up in a more specific way in tomorrow’s post for the Patheos Book Club, so I’d be interested in your general principles (and intuitions, if you trust them!) today.

About Leah Libresco

Leah Anthony Libresco graduated from Yale in 2011. She works as an Editorial Assistant at The American Conservative by day, and by night writes for Patheos about theology, philosophy, and math at www.patheos.com/blogs/unequallyyoked. She was received into the Catholic Church in November 2012.

  • http://www.realphilosophers.org/unnaturalatheism Tom

    One approach is to try to measure, empirically, what kinds of intuitions tend to go astray. (I take an “intuition” to be an intellectual seeming.)

    Three distinctions:
    (1) Descriptive facts are about the way things are, and normative facts are about the way they should be. Descriptive facts are “ises” and normative facts are “oughts.” (Ethical facts (if they exist) are a proper subset of normative facts.)
    (2) Nonmodal facts are about the way things actually are (contingent truths), and modal facts are about necessity, possibility, and impossibility. (We can confine our scope to metaphysical necessity: whether something could or couldn’t be true, or exist. Intuitions about merely physical modality are less reliable.)
    (3) A priori beliefs are beliefs formed independently of sensory or empirical observation; empirical beliefs are formed because of sensory or empirical observation.

    In terms of our experience, intuitions reporting normative or modal facts, and intuitions that produce a priori beliefs, seem to be much more reliable than the other sorts of intuition. For example, we seem to be only very rarely wrong about whether various truths are necessary, or possible, or impossible. And we sometimes find what seems to be ethical disagreement, but it very commonly turns out to arise from a more basic non-ethical disagreement, about, e.g., whether a certain human is actually a “person,” or about religion. (Even the various reasons people have given for skepticism about ethical intuition can usually be answered by further study of the sources of biases.) As for a priori beliefs, these tend to be about normative or modal truths.

    So if you’re trying to use intuition to discover contingent, descriptive, empirical truths, you’re likely to go astray. But there doesn’t seem to be any good reason for serious skepticism about the other uses of intuition.

  • http://last-conformer.net/ Gilbert

I think I give my intuitions the benefit of the doubt and discard them only where I have a positive reason to distrust them. Since that reason will itself be empirical, that effectively lets the strongest intuition win out.

    My justification is that the only alternative is letting the weaker intuition win out and that is even less appealing. Ultimately we need to think and act and that means we need to trust something.

    But practically speaking I don’t think that’s the real problem. The most important illusions and biases are already known, so in the abstract we already know when to distrust ourselves. The real problem is recognizing when the theoretically known rules actually apply. This is more a moral than a strictly epistemological problem: We need the humility to know when we are likely to be wrong and the honesty and courage not to resort to the easy excuses that are available in all interesting cases.

    • @b

      >>The most important illusions and biases are already known…

Known by Wikipedia, yes. Present in the minds of the two billion people with access to it, no.

  • deiseach

    I have no problem remembering people’s faces, but I am hopeless at remembering names, which means that I am always in the situation of “Oh, hey, that’s – er, Liz? No – Jane! No- Kate?” when it comes to meeting someone I worked with for five years but it’s been a couple of months since I changed jobs, long enough for me to forget her name. Not to mention when there were three people with the same first name but different surnames, so when I had gotten it into my head that these three women were all Niamh (and not one of the two Julies), I still managed to get tripped up over which Niamh was which.

    The moral of this story? Um – I never trust my intuitions, primarily because I don’t seem to have any in the first place. If it’s not a rule I learned by having it drilled into my thick skull, I’m lost :-)

  • Pingback: Should Atheism Be A ‘Comfortable’ Belief?

  • anon atheist

I go by “what you can’t measure probably does not exist.” I’m going to be chided for being an evil empiricist, but I don’t care.

    • Ray

      FWIW, I think there’s a lot to be said for this intuition, and I think people who object to it either vastly underestimate what can be measured (and you don’t need to measure something well to demonstrate its existence) or they are trying to sell something.

  • Pingback: Quasi-Transhumanist Charismatic Christians
