There have been a bunch of interesting comments already on my post on Mind/Brain division, so I’m pulling some out here to reply to in depth.
dcb wrote:
I have to agree that this is a hard sell to me. I am not an atheist, so perhaps there is something big that I am missing, but if the physical basis for the you-program is lost, in what sense is the “ported” version still you? Sure, it might think it is you, but wouldn’t that be basically a figment of its reconstructed imagination?
This is especially so because, if we could in fact completely capture the physical state of your brain and create an effective duplicate, and assuming the result is in fact a working mind that cannot be distinguished from you as you are now (something I do not think is a given), it is conceivable that the original you might still be operational, so to speak, or that multiple copies could be stamped out. Which is actually “you”? None but the original, I think.
Problems about persistence of identity are always interesting and lead to fun hypotheticals. If I ported my consciousness to a different body or to different hardware, that wouldn't be wholly unprecedented: my body is constantly changing as it is. Swapping meatsuits would be a jump discontinuity rather than a gradual change, but so would a medically necessary amputation or other radical physical change. In either of these examples, although some parts of my current identity would change and those changes would influence my future character, the continuity/persistence of my identity through time would not be destroyed. So I don’t think installing my consciousness on a different substrate weakens the persistence of my identity, except insofar as the culture shock of the shift could drive me mad.
As for the question of doubling by running different copies: I think each copy is equally me, but they diverge from the moment of creation. Duplicate mes are essentially the same as multiple mes existing in parallel worlds (you know, the kind that split when you make significant choices). In dcb’s example, the only difference is that instead of running in parallel worlds, my doppelgangers would exist in the same universe. Both would have equal claim to being ‘me.’
Nate asked:
And with regards to the comment on dualism, are we not also proving that consciousness DOES require a physical manifestation, doing away with dualistic soul gobbledygook? Who cares if you’ve created a non-human analog for a human brain – it still only functions by virtue of a set of physical processes.
Playathomedad added:
As for potentially transferring a mind to another medium, replacing neurons with silicon is hardly separating the mind from the physical material from which it arises. One can copy a computer file from one medium to another. Would you say that the file exists independently of the material on which it is stored?
So, the short answer is that I do think there’s a sense in which the software or file can exist without hardware. Maybe this is the almost-math major in me talking, but I think of algorithms as entities that exist in their own right, whether or not they are written down or run on a particular machine. I don’t think I go quite so far when I think about humans; it would be most accurate to say I’m agnostic on that question. Either way, I don’t think the mind’s reliance on some kind of physical substrate is a disproof of dualism. There’s still an important distinction between the algorithm and the substrate: one confers intentionality on the other. Additionally, the algorithm is not necessarily subject to entropy and death in the same way that the body is.
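To make the file analogy a bit more concrete, here’s a minimal Python sketch (the filenames, directories, and contents are just illustrative stand-ins, not anything from the original discussion). The same byte-for-byte content can sit on two entirely separate pieces of storage, and its identity, as captured by a checksum, doesn’t depend on which piece of storage it happens to occupy at the moment.

```python
import hashlib
import shutil
import tempfile
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Return the SHA-256 digest of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

# Two separate "media" -- here just two temporary directories standing in
# for, say, a hard drive and a USB stick.
with tempfile.TemporaryDirectory() as medium_a, tempfile.TemporaryDirectory() as medium_b:
    original = Path(medium_a) / "mind.txt"
    original.write_text("the pattern, not the particular atoms")

    copy = Path(medium_b) / "mind.txt"
    shutil.copy(original, copy)

    # The two files occupy entirely different physical storage,
    # yet their contents are identical.
    assert sha256_of(original) == sha256_of(copy)

    # Destroying the original doesn't change what the copy is.
    original.unlink()
    print(sha256_of(copy))  # same digest as before the original was deleted
```

The point isn’t that a mind is literally a text file, only that “what the file is” is a question about the pattern, not about whichever particular chunk of material it’s currently written on.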