Google Wants to Augment Your Reality

I’ve known this was coming for some time, but it’s actually more alarming than I expected it to be.

Google’s Project Glass is an augmented reality system based on the idea of wearable computers: in this case, a pair of glasses that compresses all the functionality of a smartphone (phone, messaging, camera, GPS, social networking, etc.) into one hands-free, voice-controlled device.

Here’s the first look at it in action:

[Embedded YouTube video]

The prototype appears to have a single screen positioned above the right eye, which suggests that it is possible to look away from the image.

But the New York Times is reporting that

Project Glass could hypothetically become Project Contact Lens. [Babak] Parviz, who is also an associate professor at the University of Washington, specializes in bionanotechnology, which is the fusion of tiny technologies and biology. He most recently built a tiny contact lens that has embedded electronics and can display pixels to a person’s eye.

It looks like this: [image of the prototype contact lens]

We’ve been through many stages in the development of computer technology, and a number of the most important stages involved lowering barriers between user and device. Input has evolved from punch cards to command lines to the mouse to the stylus to touch, each step bringing the technology closer to the individual.

Yet one thing has remained fixed: the screen. We’ve moved it around a bit, tarted it up with colors, shrunk it down, even started making it flexible. But the screen was always out there, at a distance. It was separate from ourselves.

I worked extensively with the early consumer “VR” headsets (I still have some around here somewhere) and found them all unsatisfactory, not just for technological or ergonomic reasons, but for the way they shut out the world and isolated the user. As the technology has improved, those problems of total immersion have remained constant, and even gotten worse.

Sensory-deprivation studies on humans have revealed an effect called “faulty source monitoring”, which can occur within 15 minutes of a test subject being placed in a sensory-deprivation room or tank. This means that the brain starts to malfunction–quickly–in its ability to recognize the source of what it is perceiving. The result? Hallucinations.

That’s an incredibly fast reaction, and it goes a long way toward explaining why total immersion VR is acutely uncomfortable for many. Normal sensory input is being subordinated to artificial sensory input. Even really vivid 3D films and theme park rides can leave people woozy and discombobulated, and it’s not just from motion sickness. It’s from the fact that your senses are being screwed with.

The glasses for Project Glass are not total immersion. They do not deprive the senses of stimulus. In fact, they add new stimulus: the data streaming through the projected image and the earpieces. However, we’re dealing with a similar problem: the ability of the brain to process radical shifts in traditional sensory input.

The video shows a person walking around the city, doing all the usual smartphone tasks with an image floating in front of him. I get irritated when I have a floater in my eye, so I can’t imagine having Clippy or some other idiotic icon prancing around my field of vision, making this a total non-starter for me. In fact, I see a whole host of issues with this kind of tech.

First, we already have problems with people walking or driving while texting. Lifting the image from the device and placing it in front of the eye doesn’t make that better. It makes it worse. People cannot effectively split their concentration between two fields of vision, but the glasses will give them the illusion that they can. It takes intense training to get a fighter pilot to use a HUD (heads-up display) correctly, but we think we’re going to strap the functional equivalent of a HUD to millions of people and just send them on their way?

Second, can you imagine a city full of people talking to themselves as they gaze at something no one else can see? It would be like living in a madhouse. Smartphones and personal music devices have already disconnected many people from each other. Augmented reality, particularly in a contact lens, would pretty much finish us off as a species. Further integration of people with technology means greater distance between people and their fellow humans.

Third, what exactly are we gaining from this? I can think of a few jobs that could benefit from augmented reality (things involving machinery, operating complex systems, or possibly medical applications), but very little about my everyday life that could be improved. It offers a very minor gain in convenience (hands-free operation and a more convenient location for the image are about it) at the sacrifice of so much.

Fourth, what will it do to our brains? If symptoms of faulty source monitoring started to emerge after only 15 minutes of sensory deprivation, what kind of neurological and psychological problems will emerge with a constant data stream in your eyeball? Are they studying the way this affects the brain? Is anyone even asking the question?

Fifth, what about vision? The constant focusing, from near to far, that this must require would put a huge strain on the eye itself. Are they studying how this affects eyesight? The video shows it being used in daylight. What about nighttime use? Can it be used at night? Does that mean the screen is illuminated? If so, does it interfere with night vision?

Sixth, how much control of our senses are we giving over to third parties? If you can stop and take a picture of anything you’re looking at, doesn’t that mean someone can access your camera and simply follow you around, in a first-person view, without you even knowing it? If you have a screen in your contact lens, how long before people start buying ad space on it? Who’s creating the applications you use with this technology, and what are their motivations and intentions? It’s one thing when you’re staring at a screen on your desk or something in your palm, a couple of feet away from your body. But when that image is integrated with your very perception of reality, it becomes that much more powerful. Is it really a good idea to let not just the technology, but the technologists, have that much access to your perceptions?

We are reaching the acceptable limit of convergence between man and machine. Our machines need to remain outside of us, apart from us, so that we can perceive them for what they are: tools to be set aside. They cannot be allowed to alter us at an ontological level, and the ability to manipulate basic sensory input gets perilously close to becoming an ontological problem. We are not merely what we perceive through our senses, but that is an awfully large part of our being. Using technology to create a whole new paradigm for the way we see and hear our world is fraught with all kinds of dangers, and for what reward? Being able to check in at a hot dog stand on Foursquare without touching your phone?

I’m sure mobile technology can evolve far beyond the current handset paradigm. The question is: do we want it–do we need it–to?

About Thomas L. McDonald

Thomas L. McDonald writes about technology, theology, history, games, and shiny things. Details of his rather uneventful life as a professional writer and magazine editor can be found in the About tab.

  • victor

    It seems like all this technology really needs to live up to its true potential is for the tiny screen on a contact lens to be connected to a tiny webcam pointed directly at the user’s navel.

  • Dennis Mahon

    The prototypes appears to have a single screen positioned above the right eye, which suggests that it is possible to look away from the image:

    Considering how irritated I get when there’s a streak of dirt or oil on the lens of my glasses, I think I’ll be giving this thing a pass.

  • http://www.robinhardy.com robin

    Given all the serious reservations about the technology which you articulate so well, it can only be a few short years before it is widely and enthusiastically adopted. Is this The Matrix, for reals?

  • http://moralmindfield.wordpress.com Brian Green

    This is a really interesting question: how close should we get to our technology and what kinds of tech and why. There seems to be some kind of mass movement towards further and further integration with communications and data-access technologies. Well, they are awfully handy, after all. You list a lot of good points, all worthy of consideration. The early-adopters will get to find the bugs for the rest of us… (That’s why I’m not an early-adopter… always middle-late).

    Another danger is the growing gulf between young and old in terms of being able to relate with each other. I’ve heard stories of more and more college students who are unable to write emails (to their profs) because they only want to text. That’s a difference in just the past ten years. How do these folks survive at Thanksgiving dinner with Grandma, there in person! Give them all glasses or contact lenses and they will only want to communicate via those and grandma will think she’s the only sane one in the room (and she might be right).

    Young people are growing up in a different tech culture than even those ten years older than they. The world is going to get really weird when every generation has different preferred communications technologies.

  • http://joshuapostema.com Joshua Postema (@JoshPostema)

    As you said, it will only be a matter of time before someone starts advertising on these types of ‘augmented reality’ lenses. And considering Google makes its money that way, the ‘matter of time’ will probably be close to non-existent.

    I just found this blog today and have really enjoyed reading through it. I’m a software engineer/technologist myself but share similar reservations about our culture’s mindless pursuit of technology. There never seems to be any consideration not just of inherent dangers, but of good things that are simply lost. Time spent reflecting on one’s own, time spent in person with family, time spent thinking and reading without distraction; these things have a hard time surviving even today. I shudder to think what it might be like a decade from now.

    Anyway, thanks for sharing your thoughts!

