Pain and utopia

I guess I should explain the “thing that made me want to scream” in Luke’s Facing the Intelligence Explosion. It’s when he’s listing possible features of a utopia, and “no pain” is first on the list. “Imagine,” he says, “a life without pain.”

I can imagine it, and it sounds pretty terrible. It would mean no pushing yourself physically (during sports, exercise, whatever) past the point where you want to quit because of the pain. No working hard and then having a bit of soreness as a reminder of what you’ve accomplished. No S&M.

That these are desirable things, or at least things some people want, could be seen as an instance of a point Luke makes earlier in that same chapter:

Citizens of utopia are not mindless drones who never use their brains to solve anything. Just as making a video game easier does not always make it more fun, citizens of utopia still face challenges and experience the joy it is to overcome them.

Now, anesthetics are great, and to be fair to Luke, he links to an essay by Ben Goertzel which is slightly more nuanced. Goertzel writes:

It’s quite possible that the total abolition of ouchness is not ideal, from a psychological or cultural perspective. But it’s very hard for me to believe that evolution has supplied us with a level of ouchness that is optimal from our contemporary and future perspectives. There’s just too much unnecessary and destructive pain around. Giving adult humans and AGIs conscious control over their level of ouchness is obviously the compassionate and ethical thing to do — and it’s also good mind design. How fortunate that science and engineering are likely to make this possible, and in the not extremely distant future!

But at the same time, he doesn’t seem to entirely get why anyone would choose not to alter their experience of pain much, beyond the cases we already use anesthesia for. He doesn’t understand why anyone would want to experience pain any more often than “maybe now and then just for novelty!” (in his words).

This seems like a pretty good example of why would-be utopians should proceed with extraordinary care; what seems not very important to one person, even a perfectly well-intentioned person, is sometimes extraordinarily important to another.

Edit: After writing this post, I realized it wouldn’t be complete without a Grumpy Cat meme:

  • http://www.facebook.com/lukeprog Luke Muehlhauser

    I’m not sure whether I’d want pain in the future. In any case, I wasn’t saying that if we get FAI right then pain wouldn’t be an option. I was saying that if we get FAI right then pain wouldn’t be required, as it is now. It’s not like I think we should directly specify the AI’s utility function, come up with a formal expression that captures all intuitive cases of pain, and assign such cases very low utility, such that the AI directly optimizes the world for containing no pain. If we’re specifying the AI’s utility function so directly, then we’re almost certainly screwed.

    • http://patheos.com/blogs/hallq/ Chris Hallquist

      I more or less assumed you’d say that. Not having it be clearer, though… well, it made me want to scream.

      I’d also emphasize that, to me anyway, this seems like a nice illustration of the kinds of things that could go wrong with (a) trying to program all these things into an intended-to-be-friendly AI one at a time, or (b) generally not programming an intended-to-be-friendly AI right.

  • Jayn

    Given how happiness and sadness tend to exist in relation to each other as much as (if not more than) as absolutes, I’ve often wondered how a world without pain would affect our ability to be happy.

  • Christine

    I would say that having nicer ways to get rid of pain might be utopia. Imagine a world where you can take a painkiller for a sore muscle without worrying that you’re going to hurt yourself worse because you don’t catch the warning twinges. Or where you didn’t have to choose between being doped out from the pain of a broken leg and doped out from the painkillers. (Pro tip: don’t break your leg when you’re doing two grad courses at the same time.) Pain relief is a very young science, and we aren’t very good at managing pain – especially not the chronic kind. So while I don’t think that pain is always awful (even had I been allowed to have an epidural when I gave birth, I quite possibly wouldn’t have had it), I completely agree that there is more “ouch” than we need, and more “ouch” than we can manage properly.

  • busterggi

    Will you agree to no chronic pain? I’ve too many badly healed injuries and basically know I’m awake because I feel pain – I could do without that.

    • http://patheos.com/blogs/hallq/ Chris Hallquist

      I’m not sure I’d want to force any solution to pain on the whole population, but by the same token, certainly I’m in favor of very good pain control for people who want it.

  • Speedwell

    There’s a problem with pain, though. Pain has an alarming tendency to hurt. No, really, I’m not making this up.

  • qbsmd

    From what I’ve read, people who don’t feel pain are at risk for injuring themselves because they don’t get warning signals first. I imagine a utopia would be closer to the way pain is described in the earlier Terminator movies: the Terminator described “damage reports that could be perceived as pain” and the humans of the future had the ability to consciously dismiss pain and continue fighting, running, etc. I think that in a utopia the information about injuries or potential injuries would exist, but the subjective experience of pain could be consciously dismissed.

