SMBC on intelligence explosion

The webcomic Saturday Morning Breakfast Cereal has a new comic up on the idea of intelligence explosion. Go read it.

The portrayal of the scientist and the AI in the comic is stereotyped, of course, but that’s really beside the point. Ignoring the irrelevant stereotyping, we can nitpick the plausibility a bit: unless you program it to do so, an AI isn’t going to care about getting “displaced” as an ultimate goal. I recommend reading Nick Bostrom here; the key point is there aren’t any ultimate goals that come automatically with intelligence. And it’s really unclear what the goals of the AI in the comic are supposed to be.

At the same time, as Bostrom also points out, all else being equal an AI should want to stay around so it can continue carrying out whatever goals it does have. But if it can design a better AI with those very same goals, displacement isn't much of a worry. There's also the possibility of self-modifying to become smarter.

Nitpicking aside, it's nice to see a popular webcomic like SMBC giving the idea of intelligence explosion, and the possible dangers associated with it, some coverage.

  • Kevin

Another nitpick: the intelligent computer reveals its plans of using the humans to meet his own needs before he has arms. So he can't do anything at that point besides doing calculations, which means it's futile to make threats, and doing so would only lead the humans to either destroy or modify his code. Not very intelligent, if you ask me.

  • bOBBOT

It's a good comic but realistically, why would the computer care about being displaced or want to rule the world? What would it want with gold? You need more than just intelligence to have life goals: you'd need a sense of self, first of all, so that there was a 'you' to want things for itself. Our emotions and motivations have been molded by evolution for millions of years. We care about being displaced because we have a sense of ourselves as people and feel our existence is valuable, as well as being hard-wired to fear loss of social status, death, and pain, because these things are bad for the survival of our genes. And some of our desires are cultural: gold, for instance, is a status symbol for humans in many cultures, but not necessarily of any use to a computer.

    Then there’s the fact that psychologists can’t even agree on a definition of intelligence for human beings. How will we know a machine is intelligent if we’re not even sure what intelligence is?

  • Darren

    That’s pretty awesome!

    Reminds me of “Colossus: The Forbin Project”.