Review: Luke Muehlhauser’s Facing the Intelligence Explosion

I just finished reading the current version of Luke Muehlhauser’s website/ebook Facing the Intelligence Explosion. I have a handful of minor complaints (for example), and there was one thing that kinda made me want to scream (which I’ll talk about in a future post), but on the whole, Luke makes a lot of important points and I highly recommend it for anyone who’s remotely interested in these issues.

The website has been around for a while, but it’s been revised over time, and I liked the revisions in the latest version. Among other things, it alerted me to this essay by Alan Turing, which seems to have beaten even I. J. Good to the punch on the idea of self-improving machines.

The current version covers the full range of the topics typically encountered on LessWrong, particularly in Eliezer Yudkowsky’s writings: rationality, AI, and even utopia. While some LessWrongians will tell you to read all of “the Sequences,” Luke’s shorter version is extremely valuable if you’re not willing to devote quite so much time to reading about those issues right now (personally, I’ve loaded the Sequences onto my Kindle, but am less than 10% of the way through them). I also intend to refer back to Facing the Intelligence Explosion when working on these issues in the future.

The online form (15 chapters/webpages of roughly blog-post length) is pretty readable as-is, but if you’d prefer it in PDF or an e-reader format, you can buy those here on a pay-what-you-want basis.

  • staircaseghost

    “10 points for beginning the description of your theory by saying
    how long you have been working on it. (10 more for emphasizing that
    you worked on your own.)”

    I wonder what an edited version with the link farm references to the sequences omitted would look like. At least 60% shorter, I imagine.

    • Chris Hallquist

      Um, Luke talks about how he came to hold an unconventional view, but it isn’t actually “his theory” that he spent a lot of time working on. In fact, very little in there is terribly original.

      And for people who are interested in that sort of thing, the summary of / guide to the Sequences is part of the value.

      • staircaseghost

        “Um, Luke talks about how he came to hold an unconventional view, but it
        isn’t actually “his theory” that he spent a lot of time working on.”

        It is a stylistic tic that follows an ideal type, in the same way that stories of dying and rising savior gods follow an ideal type. You don’t expect a literal one-to-one correspondence between the color of the saviors’ socks, but you look for overlaps and analogs in the functional role of various narrative elements.

        Which is not to say that non-crackpot science-popularizers never begin their stories with irrelevant autobiographical framing about how they Came To The Light. Just that it is uncannily reminiscent of the way standard Baez crackpots talk, or the way religious conmen like Lee Strobel talk, or the way Luke’s now-forsaken original guru Alonzo Fyfe talks, or late-night fad diet infomercial presenters talk.

        • Chris Hallquist

          Well, you’re talking a lot like how crackpot Jesus mythers talk, so ha!

          (Seriously, do you realize how easy it is to dismiss anyone based on superficial similarities?)

          • staircaseghost

            Your error here is to assume I am operating from a single data point, and not years of observations of LM’s fundamentalist mindset on the one hand, and Lesswrongian proto-cult crackpottery on the other.

            There is generally no need to open scientific treatises with the author’s own Come To Jesus Story, but hey, one red flag never sank a ship. It’s just that when you look closer it seems to be all flag and no ship, from the in-house/vanity-published citations of fellow SI member papers he makes, to the imprecatory language about what skeptics or rationalists “should” conclude.

            Or don’t you think one is entitled to be skeptical of Come To Jesus Stories when the storyteller is also selling “rationality bootcamps” for his employer at $3,900 a pop?