Neuroscience basics & some criticism of Eliezer Yudkowsky

I’ve just written a post at LessWrong covering some basics of neuroscience and criticizing a claim by Eliezer Yudkowsky that the brain “just doesn’t really look all that complicated.” If you just want to read the latter part of the post, I’ve put my criticisms below the fold on this one:

First of all, it is not true that the brain’s division into only 52 major areas is evidence that it is not very complex, because the complexity of its macroscopic organization tells us nothing about the complexity of its microscopic wiring. The brain consists of tens of billions of neurons, and a single neuron can make hundreds of synapses with other neurons. The details of how synapses are set up vary greatly. Under a microscope, at least, the brain looks very complex.
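Just to put rough numbers on that (order-of-magnitude arithmetic using the figures above, not a precise count):

    10^{10}\ \text{neurons} \times 10^{2}\ \text{synapses per neuron} \approx 10^{12}\ \text{synapses}

That is on the order of a trillion connections whose individual details can vary.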
The argument from the small size of the genome is more plausible, especially if Eliezer is thinking in terms of Kolmogorov complexity, which is based on the size of the smallest computer program needed to build something. However, it does not follow that if the genome is not very complex, the brain must not be very complex, because the brain may be built not just from the genome, but also from information supplied by the outside environment. We have good reason to think this is how the brain is actually set up, not just in cases we would normally associate with learning and memory, but also for some of the most basic and near-universal features of the brain. For example, in normal mammals, the neurons in the visual cortex are organized into “ocular dominance columns,” but these fail to form if the animal is raised in darkness.
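To spell out the formal point (my gloss, not Eliezer’s): the Kolmogorov complexity of an object x, relative to a universal machine U, is the length of the shortest program that outputs x, and there is a conditional version in which the program also gets to read a side input y:

    K_U(x) = \min \{\, |p| : U(p) = x \,\}
    K_U(x \mid y) = \min \{\, |p| : U(p, y) = x \,\}

A small genome bounds only something like the conditional quantity, with the developmental environment playing the role of y; it puts no comparable bound on the unconditional complexity of the finished brain.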
More importantly, there is no reason to think getting a lot of power out of a relatively simple design requires insights into the nature of intelligence itself. To use Eliezer’s own example of Windows Vista: imagine if, for some reason, Microsoft decided that it was very important for the next generation of its operating system to be highly compressible. Microsoft tells this to its programmers, and they set about looking for ways to make an operating system do most of what the current version of Windows does while being more compressible. They end up doing a lot of things that are only applicable to their situation, and couldn’t be used to make a much more powerful operating system. For example, they might look for ways to recycle pieces of code, and make particular pieces of code do as many different things in the program as possible.
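A toy illustration of the kind of trick I mean (the snippets below are made up for this post, not taken from any real operating system): reusing one parameterized piece of code shrinks the description of the same behavior, and you can check the effect on description length directly.

    import zlib

    # Version A: the same behavior spelled out three separate times (no reuse).
    duplicated = (
        'print("Hello, Alice, welcome to the system")\n'
        'print("Hello, Bob, welcome to the system")\n'
        'print("Hello, Carol, welcome to the system")\n'
    )

    # Version B: one reusable piece of code doing the work of all three.
    reused = (
        'for name in ("Alice", "Bob", "Carol"):\n'
        '    print(f"Hello, {name}, welcome to the system")\n'
    )

    for label, src in (("duplicated", duplicated), ("reused", reused)):
        raw = len(src.encode())
        packed = len(zlib.compress(src.encode()))
        print(f"{label:10s} raw={raw:4d} bytes  zlib-compressed={packed:4d} bytes")
    # Nothing here required insight into how to build better programs, just
    # bookkeeping tricks that squeeze the same behavior into fewer bytes.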
In this case, would we say that they had discovered deep insights into how to build powerful operating systems? Well, no. And there’s reason to think that life on Earth uses similar tricks to get a lot of apparent complexity out of relatively simple genetic codes. Genes code for proteins. In a phenomenon known as “alternative splicing,” there may be several ways to combine the parts of a gene, allowing one gene to code for several proteins. And even a single, specific protein may perform several roles within an organism. A receptor protein, for example, may be plugged into different signaling cascades in different parts of an organism.
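To give a rough sense of the combinatorics (a toy sketch with invented exon names, not data from any real gene): even a very simplified model of exon skipping lets a short genetic description specify a much larger space of possible products.

    from itertools import combinations

    # A hypothetical gene with six exons (names invented for illustration).
    exons = ["E1", "E2", "E3", "E4", "E5", "E6"]

    def possible_transcripts(exons, min_len=2):
        """List transcripts formed by keeping exons in genomic order and
        optionally skipping some (a much-simplified model of exon skipping)."""
        transcripts = []
        for k in range(min_len, len(exons) + 1):
            for subset in combinations(exons, k):  # combinations preserve order
                transcripts.append("-".join(subset))
        return transcripts

    variants = possible_transcripts(exons)
    print(f"{len(exons)} exons -> {len(variants)} possible transcripts")
    # 6 exons -> 57 possible transcripts in this toy model: the description
    # stays short while the space of potential products multiplies.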
  • Alasdair

    I would stay away from LessWrong if I were you. Everything I’ve read about it suggests it’s effectively a cult based around the ideas of this Yudkowsky guy, not meaningfully different from Raelism or Scientology. Just because a group claims to be ‘rationalist’ doesn’t mean they have any understanding of what that means.

    • http://alephsquared.wordpress.com aleph squared

      Despite all the writing they do about the affect heuristic and halo effects, they seem remarkably incapable of recognizing them in themselves.*

      *Most of the ones I’ve encountered, anyway. Some of the longer-running members I’ve run into manage to avoid the cult-like attitudes of the rest.

    • http://www.mccaughan.org.uk/g/ g

      I think you may be failing to distinguish “having certain features in common with” from “not meaningfully different from”.

      Some important features of movements like Raelism and Scientology — features essential, I think, to why they have the bad reputation they rightly have — include:

      1. They require that their adherents assent to a bunch of firmly held propositions that, to pretty much the entire rest of the world, are demonstrably false.

      (I don’t think there is any such thing among the LW folks. There are ideas they tend to take more seriously than the rest of the world — strong AI and cryonics, for instance — but so far as I can tell there’s nothing resembling a requirement for LW people to think that superhuman AI will ever actually be produced, or that cryonics will actually end up working.)

      2. They require large financial commitments from their members.

      (The Singularity Institute has donation drives every now and then. That’s about it.)

      3. They make a big deal of authority and hierarchy.

      (People on LW tend to think Eliezer Yudkowsky is right about a lot of things. That’s about it.)

      If all you mean by “effectively a cult” is “a bunch of people united in part by a general agreement with the ideas of a particular person” then yeah, I suppose you could call LW a cult. But it’s not clear what would be so very awful about “cults” in that sense. If you mean something like “a psychologically abusive organization which exploits people in the name of batshit crazy ideas for the benefit of its leaders” (which, FWIW, is more what I’d use the word to mean) then no, it doesn’t look credible to me that LW is “effectively a cult” or close to being so.

      If I were looking for an uncomplimentary analogue of Less Wrong to illustrate the intellectual dangers that it might lead to, I’d point to something like Objectivism rather than Raelism or Scientology. (I think Objectivism is mostly bullshit, in case it matters to anyone. Speaking of which, http://lesswrong.com/lw/m1/guardians_of_ayn_rand/ might be of interest.)

      Full disclosure: I am a moderately frequent participant at Less Wrong. I have never donated to the Singularity Institute, am not signed up for cryonics, and am not sure what I think about the prospects of a technological singularity. I think Eliezer Yudkowsky is a very clever chap who’s probably right about lots of things, but if anyone told me I should accept everything he says or give him all my money I’d laugh in their face.

  • http://avatars.imvu.com/jamesskaar jamesskaar

    focussing on just the dna would be a tad silly; there’s epigenetics (though likely not an influence in this case), rna, and a bunch of other things in cells. that other stuff probably has enough effect to increase the complexity plenty. there is the problem that, yes, the complexity comes from somewhere; it’s not too hard to say that signals sent by hormones during development cause nearly all of it. the other option is that widely varying developmental conditions end up producing the same kinds of complexity, rather than differentiating.

  • Pingback: Weekend recap: my series on William Lane Craig is finished! | The Uncredible Hallq

  • Pingback: “No different from fundamentalists/ a cult” | The Uncredible Hallq

