Last week, I was ragging on people who deny meaning in this life or put all their focus on a questionable world to come, and deisach wondered in the comments if I might have neglected the beam in my own eye:
“the idea of enlightenment as total disregard for the world you live in”
May I ask, how does this apply to your support for transhumanism (as I think you said you support it)?
Pardon me if my notion is the clichéd notion that you do not hold, but how is your hypothetical computer simulation world all that greatly different from the idea of uploading one’s consciousness into some form of computer storage, unmoored from a physical body, and then being able to manipulate and create one’s own reality? The dangers of the simulated world are separate from the actual dangers, that is, you can decide to either play by the rules of the simulation or ignore those rules as to whether or not fire burns or water drowns, but that has nothing to do with actual threats to your existence such as if the storage is corrupted or someone in meatspace pulls the power cord out of the socket.
This is a difficult question for me as a quasi-gnostic who doesn’t see physical existence as particularly important. I should say first that I’m mostly interested in transhumanism for the near-future possibilities (developing new senses, thinking more about human-machine interaction, not getting complacent about the transhumanist changes that have been around a while). I’m not counting down until the day I get my robot or sim-body.
I doubt any of those opportunities will occur in my lifetime, if they happen at all, but if I had the opportunity to upload, I’m not sure what I would do. My pro-transhumanist beliefs don’t lead me to embrace every radical modification for the sake of rejecting our flawed bodies and minds. I’m just trying to get people who object to these changes to offer up more of a reason than “It’s inhuman!” Tell me why it’s bad for people, because there’s plenty of evolutionary detritus that is very bad for us and very human.
At this moment, I’d lean against humans uploading en masse. (I’m not opposed to some daring souls doing it themselves, in the same spirit as a one-way trip to Mars.) I suspect it would be bad for us morally. As deisach said, if everyone in a simulation knows they’re in a simulation, every choice you make feels less weighty. And, even though we may have a duty to care for all people and work for their good, it gets a lot harder to think of them when we don’t see them.
Think back to the problem of drone warfare and possible alienation from moral judgement. Or, if you’d like a more classic example, consider that, in the Milgram experiments, subjects were less likely to shock the other participant when they could see them, instead of just hearing them. They were even less likely to run through the full course of shocks when both participants were in the same room. Anything that makes people feel less real is a moral hazard.
I may want people to put less emphasis on physical proximity and physical similarity, but ignoring the facts will only put people at risk. As a transhumanist (and a not-so-soft paternalist), I use these data to try to hack my brain. That could be anything from just keeping in mind that proximity is a weak point in human moral reasoning and reminding myself to stay on guard to avoiding making big decisions that affect others if I’m by myself and can’t see other people.
Here’s the takeaway: my fascination with transhumanism isn’t about abandoning the body as an end in itself. My goal is to chip away at reverence for the body-as-it-is, so that we’re more comfortable studying how it leads us astray and making any helpful and feasible modifications.
And also implanting LEDs anywhere that seems fun.
P.S. It’s possible that a lot of these moral concerns would be moot if everyone didn’t know they were running in a computer, but I’m really not clear on what goal would justify secret uploading. All I’m coming up with is beta-testing (unethical) and environmental conservation (a sledgehammer to hit a nail), so I’m inclined to treat that case as mostly irrelevant.