“What’s Hard is Simple, What’s Natural Comes Hard”

This post is part of Patheos’s book club for T.M. Luhrmann’s When God Talks Back: Understanding the American Evangelical Relationship with God. I received a review copy free of charge.


In T.M. Luhrmann’s ethnographic study of charismatic evangelical Christians, When God Talks Back, communing with God is a strenuous practice.  Cultivating a personal, two-way relationship with God is a choice for these Christians, and the sheer level of effort they put into changing their minds raises a lot of epistemological red flags for me.  The people Luhrmann profiles sound like they’re brainwashing themselves and making it hard to ever recognize their error in the future.

A few days after finishing the book, though, I noticed a weird discrepancy in the way I think about religion.  Whether or not I believe in God, I don’t find anything weird about the idea that hewing to religious morality is hard and requires continual discipline and mortification of the self.  Heck, I think that’s true of most atheistic moral philosophies as well.  The difficulty isn’t necessarily a cue that moral obligations are foreign to us or unnatural, just that we’ve strayed from our telos, and it takes a wrenching effort to get back on track.

Back when I saw Freud’s Last Session (a two-person play that imagines a debate/conversation between C.S. Lewis and Sigmund Freud), I thought Freud-in-the-play lost the argument when he complained that Christian morals were unlivable and therefore unreasonable, not that they were false.  I think I’m sufficiently out of sync with the person I ought to be that it’s not surprising that fixing myself might require a fairly radical and difficult overhaul of my habits of thought.  So why do I still feel very different from the people Luhrmann interviewed?

The brainhacking I do, from forswearing free food to trying to accept gifts, is an attempt to entrench a moral virtue as habit.  I’ve already been convinced I need to change my behavior, and now I’m trying to make the new practice sticky.  I’m suspicious of the people in Luhrmann’s book because they seem to be overhauling their minds as part of the evidence-gathering stage, not the post-decision implementation stage.

There are some disciplines that require a big shift in thinking before you can even understand what you’re studying (my go-to example is topology, but quantum mechanics is in this category as well).  But these changes seem safer since they’re a few degrees removed from moral philosophy.  Having a bad idea of four-dimensional geometry isn’t likely to lead you into the kind of error where you hurt other people or yourself (though I realize some of the commenters who see my neo-Platonism as pernicious may disagree about this last point).

Historically, most people who ask you to override your senses are tricking you and/or harming you (though some may be doing it unintentionally).  Most “visionaries” are wrong and therefore have reason to fear truth.  It’s prudent to be chary of this kind of epistemological change unless you’ve got some robust error-correction measures in place.

In math and quantum mechanics, there’s not a terribly long lag between learning a new mode of abstract thought and seeing positive consequences.  And the benefits you accrue in these disciplines (being able to write proofs, read papers, and imagine tesseracts) are a lot less ambiguous than possibly being in contact with God.  Ultimately, I see brain-hacking as pretty similar to making any other kind of radical personal change: it’s better to do it as the result of a considered decision, not as a way of exploring a possible choice.

 

The title of this post comes from “Anyone Can Whistle,” the title song of the Sondheim musical of the same name. A recording from the original production is embedded below:

[Embedded YouTube video: “Anyone Can Whistle,” original production recording]
Patheos will be hosting a live chat with the author of When God Talks Back on Friday, April 27th at 2pm EST.

About Leah Libresco

Leah Anthony Libresco graduated from Yale in 2011. She works as a statistician for a school in Washington, D.C. by day, and by night writes for Patheos about theology, philosophy, and math at www.patheos.com/blogs/unequallyyoked. She was received into the Catholic Church in November 2012.

  • http://thinkinggrounds.blogspot.com Christian H

    “Most visionaries are wrong and therefore have reason to fear truth.”

    I don’t know what you mean by visionary. So I don’t know how you come to the conclusion that most of them are wrong. And even if they were wrong, would that mean they would have reason to fear truth? Presumably they do not think they are wrong? Or are you using “visionary” to mean “charlatan”? Charlatans are by definition wrong, so then why do you say “most”? (That is, I don’t think you can mean charlatan, but that’s the only kind of person related to this conversation that I can think of who would think that what they are selling is false, and would therefore fear truth.)

    • SAK7

      I had the same question when I came across that line. In fact the whole paragraph seems out of place. “Historically, most people who ask you to override your senses are tricking you and/or harming you (though some may be doing it unintentionally)….” I’d argue you are tricking yourself, not being tricked unless someone else controls your senses… such as an anesthesiologist who we can presume isn’t working to harm you as your senses are overridden.

The closing line, however, leaves me puzzled: “…it’s better to do it as the result of a considered decision, not as a way of exploring a possible choice.” Can you explain?

      • leahlibresco

        It’s all right to use irrational/arational techniques to try to reinforce habits that I rationally decided were worth cultivating. When I give myself a piece of very dark chocolate after going to the gym, I’m just using conditioning tools. There’s no reason those two events need to be linked. But I decided it was worth conditioning myself into this behavior because I have rational reasons to expect I’ll extend the functional period of my life with regular exercise.

        When I was still dating my Catholic boyfriend, a friend suggested I try to condition myself into belief by listening to religious music I found emotionally moving. I rejected her suggestion since I only wish to believe if God exists. Using non-rational tools to change my behavior and affect would be a mistake here.

        • deiseach

          I agree with you about the emotional manipulation bit, because that’s building your house on sand. A belief that is only based on evoked positive reactions is going to crumple like wet cardboard the second the hard parts of life hit.

    • leahlibresco

I’m going to go back and throw in scare quotes to make this clearer. I meant people a la this xkcd. Not every one of them is a charlatan (though plenty of people who ask you to ignore your senses are), but some of them might be self-deluded in the way Carl Sagan describes in his Dragon in the Garage example. They say they believe and they believe they believe, but they’re always ready ahead of time with a reason why the empirical world won’t match their predictions, so they already know on some level that there’s a mismatch. Someone who actually believed, instead of believing-in-belief, would be surprised when things didn’t come out as they predicted.

      • deiseach

The possibility of fraud is always there, of course. During the heyday of the Spiritualist movement in the late 19th/early 20th centuries, some of the most gullible persons who fell for mediumistic fakes were the scientists investigating them, precisely because they were so convinced they were rational, unbiased observers who couldn’t be fooled.

        Thing is, if someone is telling me “I am now a unicorn and if you take this amazing natural, harmless herb, you too can become a unicorn!” and I don’t see any evidence of a horn, mane, hooves or tail, then I can be pretty sure – as you say – that this is a non-rational tool to change behaviour and it would be a mistake to engage in it. And if someone is telling me “I’m an alcoholic and I went to AA and the Higher Power of the 12-step plan changed my life”, and I can see that they’re not passing out drunk on the floor and reeking of booze every day anymore, then I can see that this is a non-rational (non-rational if I have no belief in higher powers, that is) tool to change behaviour that brings about a good outcome.

        But what about someone who tells me that belief in a higher power is changing them, and I don’t see particular evidence of a massive change, and they tell me that’s because they’re not finished changing yet? It could be self-delusion, or maybe they really haven’t reached the endpoint of the titre yet.

  • SAK7

So when a goal is deemed rational, such as exercise that appears to extend life based on current research that can be empirically tested, employing “mind games” to foster a conditioned behavior is acceptable.

If the goal, however, is not able to be empirically tested, say the concept of an immortal soul, then these same conditioning techniques, which would “bulk up” the soul muscle, wouldn’t be valid?

Because this soul can’t be hooked up to a machine, with readings taken to catalog responses and changes to stimuli, might it not exist? But we can observe a certain joy in “true believers”… might that be worth cataloging, or is it a consequence of endorphins released by said “mind games”?

I’m reminded of Zeno’s Paradox, which notes that exactly halfway to any goal is a midpoint. You must reach the midpoint before reaching the destination. But are there not an infinite number of midpoints to reach? How can we ever arrive… and yet we do. http://www.pippinbarr.com/games/letsplayancientgreekpunishment/LetsPlayAncientGreekPunishment.html
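    (For what it’s worth, the standard mathematical resolution is that the infinitely many midpoint-legs form a geometric series that sums to a finite total: 1/2 + 1/4 + 1/8 + … = 1. Covering infinitely many midpoints therefore only requires covering a finite distance, and, at a constant speed, only a finite time.)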

  • Ray

    Leah

If, as you suggested in response to the Evil God Challenge, it is impossible to act contrary to the “objective moral law” while knowing all the facts, aren’t you mucking with your knowledge by mucking with your behavior? It seems to me that making a sharp distinction between brainhacking to produce action and brainhacking to produce belief requires the rejection of this sort of objective moral law.


