Hoist on my own (Virtual) Petard?

Last week, I was ragging on people who deny meaning in this life or put all their focus on a questionable world to come, and deisach wondered in the comments if I might have neglected the beam in my own eye:

“the idea of enlightenment as total disregard for the world you live in”

May I ask, how does this apply to your support for transhumanism (as I think you said you support it)?

Pardon me if my notion is the clichéd notion that you do not hold, but how is your hypothetical computer simulation world all that greatly different from the idea of uploading one’s consciousness into some form of computer storage, unmoored from a physical body, and then being able to manipulate and create one’s own reality? The dangers of the simulated world are separate from the actual dangers, that is, you can decide to either play by the rules of the simulation or ignore those rules as to whether or not fire burns or water drowns, but that has nothing to do with actual threats to your existence such as if the storage is corrupted or someone in meatspace pulls the power cord out of the socket.

This is a difficult question for me as a quasi-gnostic who doesn’t see physical existence as particularly important. I should say first that I’m mostly interested in transhumanism for its near-future possibilities (developing new senses, thinking more about human-machine interaction, not getting complacent about the transhumanist changes that have been around for a while). I’m not counting down till the day I get my robot or sim-body.

I doubt any of those opportunities will occur in my lifetime, if they happen at all, but if I had the opportunity to upload, I’m not sure what I would do. My pro-transhumanist beliefs don’t lead me to embrace every radical modification for the sake of rejecting our flawed bodies and minds. I’m just trying to get people who object to these changes to offer up more of a reason than “It’s inhuman!” Tell me why it’s bad for people, because there’s plenty of evolutionary detritus that is very bad for us and very human.

At this moment, I’d lean against humans uploading en masse. (I’m not opposed to some daring souls doing it themselves, in the same spirit as a one-way trip to Mars.) I suspect it would be bad for us morally. As deisach said, if everyone in a simulation knows they’re in a simulation, every choice feels less weighty. And even though we may have a duty to care for all people and work for their good, it gets a lot harder to think of them when we don’t see them.

Think back to the problem of drone warfare and possible alienation from moral judgement. Or, if you’d like a more classic example, consider that, in the Milgram experiments, subjects were less likely to deliver the shocks when they could see the learner rather than just hear him, and less likely still to run through the full course of shocks in the touch-proximity variation, where they had to press the learner’s hand onto the shock plate themselves. Anything that makes people feel less real is a moral hazard.

I may want people to put less emphasis on physical proximity and physical similarity, but ignoring the facts will only put people at risk. As a transhumanist (and a not-so-soft paternalist), I use these data to try to hack my brain. That could mean anything from keeping in mind that proximity is a weak point in human moral reasoning and reminding myself to stay on guard, to avoiding big decisions that affect others when I’m by myself and can’t see other people.

Here’s the takeaway: my fascination with transhumanism isn’t about abandoning the body as an end in itself. My goal is to chip away at reverence for the body-as-it-is so we’re more comfortable studying how it leads us astray and making any helpful and feasible modifications.

And also implanting LEDs anywhere that seems fun.

 

P.S. It’s possible that a lot of these moral concerns would be moot if everyone didn’t know they were running in a computer, but I’m really not clear on what goal would justify secret uploading. All I’m coming up with is beta-testing (unethical) and environmental conservation (a sledgehammer to crack a nut), so I’m inclined to treat that case as mostly irrelevant.


  • Gilbert

    OK, so people value other people based on some kind of proximity function. In practice they instinctively use an evaluation procedure that generalizes badly to modern situations (where interactions can be more remote) and even worse* to uploads. Perhaps a bit like how different polynomials may be functionally identical on some finite fields but not on others, so that using the “wrong” one may be efficient for one domain but catastrophic for another.
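
    For concreteness, a minimal Python sketch of that finite-field point (a toy example of my own devising, so the polynomials and names here are just illustrative): f(x) = x and g(x) = x² agree at every point of GF(2), but come apart over GF(3).

    ```python
    # Two polynomials that are functionally identical over GF(2) but not
    # over GF(3): a rule that works perfectly on one domain can fail
    # badly when carried over to another.

    def evaluate(coeffs, x, p):
        """Evaluate a polynomial (coefficients in ascending degree) at x, mod p."""
        return sum(c * x ** i for i, c in enumerate(coeffs)) % p

    f = [0, 1]     # f(x) = x
    g = [0, 0, 1]  # g(x) = x^2

    # Over GF(2), a^2 == a for every element (Fermat's little theorem),
    # so f and g are indistinguishable as functions:
    assert all(evaluate(f, x, 2) == evaluate(g, x, 2) for x in range(2))

    # Over GF(3) they come apart: f(2) = 2, while g(2) = 4 mod 3 = 1.
    assert any(evaluate(f, x, 3) != evaluate(g, x, 3) for x in range(3))
    ```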

    Going by your premises, the solution seems obvious: mass upload and then fix the bug.

    That is, of course, unless patching healthy minds is somehow taboo. Which it is, for the same reason doing the same to the body is: both are unnatural in the philosophical sense of the term. But if human nature weren’t intrinsically worthy of respect, well, then you would be better advised to chip away at reverence for the mind-as-it-is so we’d be more comfortable studying how it leads us astray and making any helpful and feasible modifications.

    (While we’re at it: I have meanwhile assembled my North Paw. Soon I will know if it can actually deliver the promised sense. Abstract reasoning has made me even more skeptical than when I ordered it, but in the end it’s an empirical question, and that’s how I shall settle it.)

    * Or should that be worsely? That would be the only logical form, but intuition says no.

    • leahlibresco

      First of all, you’re definitely invited to write a guest post on your North Paw experience.

      I’ll hit the main content of your comment in the next post.

      • Gilbert

        I’ll take you up on that invitation. But first I have to give the thing a chance, i.e. actually wear it for a while. So I guess I’ll send you a draft some time around New Year.

  • Pingback: Simulated Ethics and Brainmodding | Unequally Yoked

  • keddaw

    Sorry to say, but there is some flawed reasoning in here.

    If I am in a simulation, then I am aware that that is all I am; likewise, I know that that is all the other simulations are, so I see no reason I’d treat similarly afflicted subroutines any worse than I currently treat my fellow meatbags.

    Maybe I’m unusual in that it’s the pattern I find beautiful and worthy of respect and empathy. I hope not, since caring overly for the shell ignores the baby chick inside.

    Not the most literary metaphor I’ve ever come up with, but my phone lacks a thesaurus or a keyboard that lets me fully utilise Google. Deal with it!

  • Pingback: Finding the Morality Pill Hard to Swallow | Unequally Yoked

  • Pingback: 7 Quick Takes (12/9/11)

