Simulated Ethics and Brainmodding

It’s a week of tough questions about transhumanism.  After reading my post on moral hazard for uploaded humans, Gilbert asked:

OK, so people value other people based on some kind of proximity function. In practice they instinctively use some evaluation procedure that generalizes badly to modern situations (where interactions can be more remote) and even worse to uploads. Perhaps a bit like different polynomials may be functionally identical on some finite fields but not on others, so that using the “wrong” one may be efficient for one domain but catastrophic for another.

Going by your premises the solution seems obvious: Mass upload and then fix the bug.

That is, of course, unless patching healthy minds is somehow taboo. Which it is, for the same reason doing the same to the body is: Both are unnatural in the philosophical sense of the term. But if human nature wasn’t intrinsically worthy of respect, well then you would be better advised to chip away at reverence for the mind-as-it-is so we’d be more comfortable studying how it leads us astray and making any helpful and feasible modifications.
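
(To make Gilbert’s analogy concrete: over the two-element field F2, the polynomials x² and x agree at every input – 0² = 0 and 1² = 1 – so they are identical as functions there.  Over the three-element field F3 they come apart: 2² = 4 ≡ 1 (mod 3), which is not 2.  A rule indistinguishable from the right one on a small domain can fail as soon as the domain grows.)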

Once again, let me start with the very practical and then wander out to the philosophical.  Being able to upload and/or simulate consciousness is no guarantee that we’d understand how it works or how to fix it.  Geneticists can sequence the genes of various species and can even synthesize an entire genome without knowing what any of the genes do or how you would alter their function.  So the fact that human minds could be run outside of conventional bodies wouldn’t mean we’d be able to have direct influence over their workings or that any changes we could make would have predictable effects.

That caveat aside, I think Gilbert’s question raises some really interesting ethical dilemmas.  You run into the expected problems of identity, but the uncertainty involved in altering the mind also brings up issues of medical ethics.  How would you test a mind-modification that you thought might have the potential to do a lot of good?

It’s not all science fiction.  Check out Transcranial Magnetic Stimulation (http://en.wikipedia.org/wiki/Transcranial_magnetic_stimulation) or brain implants for depression (http://www.nature.com/news/2011/110207/full/news.2011.76.html).

As is standard procedure for risky but promising treatments, the mod could be offered to volunteers in the virtual world who were painfully deficient with regard to their moral instincts and wanted to be better (the equivalent of this guy from C.S. Lewis’s The Great Divorce).  It might be tempting to just run computer models, but once you can adequately simulate human minds, that sounds a lot like creating people for the sole purpose of research and then destroying them.

The best option I can come up with is letting volunteers bifurcate.  When they agreed to enter the trial, they’d duplicate their software/consciousness and make the change to one of the copies.  (Fellow scientists reading this are already squeeing at the rigor of the control group set-up).  But at the end of the experiment, what would you do?

[Image: Cantor set – or the flow chart for your experimental design]
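
If it helps to see the shape of the trial, here is a toy sketch in Python.  Every name in it is invented for illustration – the Mind class, the patch, the single “empathy” number – since nobody knows what an upload’s actual interface would look like:

```python
import copy
import random

class Mind:
    """Stand-in for an uploaded consciousness (purely hypothetical)."""
    def __init__(self, name, empathy=0.3):
        self.name = name
        self.empathy = empathy  # the one trait the mod is meant to improve

    def live(self, years):
        # Both copies accumulate independent experience from the moment
        # of duplication, so some drift between them is unavoidable.
        self.empathy += random.gauss(0, 0.05) * years

def apply_patch(mind):
    """The experimental mind-modification; its effect is uncertain by design."""
    mind.empathy += random.gauss(0.2, 0.1)

def run_trial(volunteer, years=5):
    control = volunteer                 # the unmodified original
    treated = copy.deepcopy(volunteer)  # the bifurcated copy
    apply_patch(treated)                # modify exactly one of the two
    for arm in (control, treated):
        arm.live(years)
    # The trial answers the easy question (did the mod help?); the hard
    # question -- what to do with two diverged people -- begins here.
    return control.empathy, treated.empathy

print(run_trial(Mind("volunteer-1")))
```

The deepcopy line is the whole dilemma in miniature: the moment it runs, there are two subjects rather than one experiment, and nothing that happens afterward can be “just” data cleanup.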

Maybe both copies would want to go on as they are, but, if a change dramatically improved our lives, would the control-group version of you want to merge with the augmented self?  Is there any way to talk about such a merge (or maybe the initial treatment, too) without it being the same thing as death?

I’m suspicious of any attempt to define a flaw as central to our own identity, but I don’t know how radical a change you can go through and still have any connection to your previous self.  For example, I was pretty roundly disliked before I went to college and, in that environment, developed some pretty bad moral habits.  I certainly hope that my future self will treat people more like people and less like prompts on a kind of morality SAT, but I don’t wish that Future!Leah would act differently because she had our past erased.

I suspect I’m rebelling mainly against the way the change happens, not its magnitude, so I don’t know if I trust my revulsion.  We applaud major identity shifts provided they take place over a long timespan, but I don’t see any great virtue in prolonging suffering and bad habits as an end in itself.  Why shouldn’t the bifurcated subjects remerge and abandon the warped path one of them was stuck on?

How good would a change have to be in order to make it worth your while to subsume your self in a similar but different version?  How confident are you that you would recognize the altered you as superior if it truly was?

Call it a fun sci-fi premise.  Or a decent metaphor for conversion.

About Leah Libresco

Leah Anthony Libresco graduated from Yale in 2011. She works as an Editorial Assistant at The American Conservative by day, and by night writes for Patheos about theology, philosophy, and math at www.patheos.com/blogs/unequallyyoked. She was received into the Catholic Church in November 2012.

  • Cami

    Sorry, iPod is acting up.

    I sorta skipped the last paragraph at first and I was thinking “wow, that’s kinda like conversion.” Then I read the last paragraph.

    That’s how conversion was for me. I truly was a new person in Christ when I came back to the Church.

  • keddaw

    “the fact that human minds could be run outside”
    Can it be called human? I have strong doubts, but they’re based on a narrow definition of human. I also have no problem with it.

    Our sense of permanence is false (no evidence supplied, unless this premise is strongly challenged – and then I could probably point to the comment above), so any bifurcation of our minds creates two people as soon as there is the slightest difference in experience or thought. Asking me to sacrifice myself for a slightly better version of me is going nowhere because it isn’t me! That may be a weakness in our psychological makeup, but until we sort that out the original problem is going nowhere.

  • deiseach

    Thank you for answering my questions in the other post. I’m just as fascinated by the trial you propose; suppose – as you say – at the end of the trial both copies of the individual consciousness want to go their separate ways.

    Does the ‘original’ get the last word on wiping the test copy, or does the second copy consciousness have any rights? We’re a long way yet from having to deal with questions of assigning legal rights and personhood to virtual entities, but it’s never too early to start thinking about the problems.

    The difficulty I foresee is that you end up with one of two arrangements. Either there are two classes of consciousnesses or entities – one adjudged to be the ‘real’ Leah or Deiseach or Gilbert, based on mere chronological sorting (i.e. whichever one came first is the original), and one judged to ‘just’ be a copy – in which case any modding or testing or hiring out of your copy or copies for use by others could lead to virtual slavery by the back door (or even the front door). Or you treat all such entities equally, which means there could be two, four or however many Leahs, Deiseachs and Gilberts all floating around in cyberspace and all considered – even as a legal fiction – to be the same person.

    The problem with the chronological status is that, okay, it works fine when I first upload my consciousness – whether partially, temporarily or for good – from my flesh-and-blood brain. But what happens when I’ve tinkered with my virtual consciousness to the point of becoming a second consciousness, then I make a copy, then my outside body dies, and then my copy makes a copy? Deiseach Version Three may argue that Deiseach Version Two is only a second-generation copy, not the originally uploaded one, and so has no better right to exist and should not be making the ‘wipe or keep’ decisions for Version Three.

    As to allowing all copies equal status: never mind the complications if there are ten versions of us running around at the same time – down the line, does Deiseach Version Fifty really have any greater relation to the Deiseach Version One sitting here now than I do to my great-great-grandmother? Could Fifty be said to be the same person in any meaningful way?

    My own uninformed opinion is that any upload isn’t ‘me’ (whatever the ‘me-ness’ of me may consist of) but is indeed a copy. Therefore, if I pop my clogs in the flesh, I don’t continue to exist in the virtual world, but my copy does. I’m gone; this is my twin or sibling or child that is going forward. And if there are ten copies or versions, then yes, they are all separate entities. I think that simply by virtue of existing as separate copies they’re going to have divergent experiences. I share the same childhood memories as my siblings, but if my sister dies, my holding those common memories doesn’t make me ‘her’ as any kind of replacement, much less a continuation.

    Well, we’re a couple of centuries away from those questions, anyhow. I don’t know if we’ll ever get there; I think we may never get much past wiring up our bodies with implanted chips and the like, and arguing about that is like arguing over whether wearing glasses or having a pacemaker alters the fact of your humanity. The real problem is going to be when people start screwing around with their brains.

  • keddaw

    One solution, and not one I’m entirely enamoured with, is that you put yourself into an unconscious state prior to uploading/copying and then one version, probably the new one, is awakened and asked if they are whole and if they consent to having the other version destroyed.

    I still don’t like it, as every version of me will feel real and will never consent to be destroyed even if it appears to be for the greater good. Unless the greater good is truly magnificent – and the existence of a slightly better version of me doesn’t come close to that threshold.

    • deiseach

      That’s exactly the problem, keddaw. If every version shares the same base memories and personality, every copy is going to insist that it’s the original one or, if you can demonstrate that it was created as #3 in a series of 5, that it is just as real and has the same rights to exist as the other virtual entities.

      Who gets the say on who gets wiped? How do you judge, unless you are going to maintain that there is only one true original and the rest are nothing more than the equivalent of reflections in a mirror? But that kills the whole point of the exercise: if these are only images imitating the actions of the original, as my reflection moves its arms when I move them, then you don’t have a genuine, independent, working mind of any kind, human or artificial – just an extremely elaborate wind-up toy.

  • Gilbert

    If the improvement does work, I don’t see why sharing in it would necessarily mean a merger. One could simply apply the same change to the copy, thus getting two improved versions.

    The merger question could be relevant if the change is irreversible (short of restoring the backup from before the change) and fails. But that would pretty much by definition mean an experience of continuity was inconsistent with undoing the change. In that case “merging” couldn’t mean more than giving the better version a copy of the defective version’s memories – basically a slightly tuned-up Joe Hill survival. I’m pretty sure my defective version wouldn’t agree to that.

    On the other hand, there would be reasons against allowing permanent bifurcations. For example, such a right would preclude democracy as a form of government and make a mess of the original version’s moral and legal obligations.

    As for me, I don’t think any such project can be a moral improvement because I believe in an objective human nature. Yes, you are “rebelling mainly against the way the change happens” but you are right in so rebelling because that way is actually evil.

    Plus one more semi-related thought:
    If uploading becomes possible, and isn’t regarded as a form of suicide, then it probably will be seen as a solution to neurodegenerative diseases. The simulation would probably operate at a higher level than the disease (i.e. one wouldn’t simulate the neurons’ metabolism) and thus be immune to many degenerations long before they become curable and long before physical brains could be synthesized. So there’s your reason for mass uploading: for many people it might be the only path to the immanent immortality you desire. Unless, of course, uploading itself is death.

