Does Technology Prevent Us From Being More Thoughtful About Technology?

[Image: iPhone 5]

You’ve probably noticed that one of the recurring themes in my articles here on CAPC concerns the ever-increasing presence of technology in our lives, and its potential ramifications. (If you have any doubt that this is occurring, consider a recent Pew Internet poll which found that 44% of mobile phone users sleep next to their phones at night, and 29% described their phones as “something they can’t imagine living without.”)

This ubiquity is aided greatly by technology’s constantly growing speed, power, and capacity. It seems like every six months, some new device is released that renders all of its predecessors obsolete by virtue of its technological advancements, be they speed and performance increases, bigger and brighter screens, more powerful and ingenious applications — or something previously unforeseen.

This is obviously a good thing in many ways. I wouldn’t be publishing this piece, and you wouldn’t be reading it, if it weren’t for certain technological advances within even the last five years or so. And that’s just the tip of the iceberg, if we step back and consider the countless benefits that a plethora of technologies have introduced into our lives in the last generation. Indeed, the rate at which we envision, develop, and implement new technology is one of our species’ defining, if not crowning, achievements.

Clayton Miller, a “graphic and interaction designer” from Chicago, recently bought an iPhone 5 and was wowed by its size, speed, etc. — even when compared to his previous iPhone. But those things, and the advances they entail, got him thinking about technology, and how he used to relate to it versus how he relates to it now. For example, regarding the incredible sharpness, clarity, and verisimilitude of the visuals one sees on an iPhone 5’s “Retina” display, he writes:

The history of raster-based computer displays may be seen as a single thread of increasing medium-abstraction from the technology’s earliest green-phosphor text terminals through today’s Retina displays. The experience of using the oldest screens was deeply connected to the limitations of the technology: Far from reproducing photographs in the millions of colors discernible by humans, images were limited to a single color and two intensities; even such screens’ greatest strength, text, was far removed from capturing the subtleties of centuries’ worth of typographic refinement. In the use of these technologies, the medium itself was ever-present.

As graphics technology improved over the next few decades, the technology itself began to abstract away as images could be reproduced at greater fidelity to the human eye and typography could be rendered with at least a recognizable semblance of its heritage. With high-DPI displays, the presence of the medium is all but gone — while dynamic range and depth cues may yet evade modern LCDs, the once-constant reminder that you are viewing a computer display has become so subtle as to have disappeared.

This may sound rather abstract and philosophical, but I believe Miller’s main question is this: When technology becomes so fast, powerful, and ubiquitous as to essentially become invisible to its users, what trade-offs have we (unknowingly) made? At the risk of sounding paranoid about some impending “robopocalypse”, if technology becomes so invisible that we no longer think about it, or becomes so close that we can no longer see it for what it is, are we still in control of that technology? On a macro level, yes: after all, we don’t have self-perpetuating technology (yet). But on a micro, everyday-life level? If technology is so invisible that we don’t mind sleeping with it, or has become so entwined with our lives that we can’t envision our lives without it, then perhaps we don’t enjoy as much control as we think we do.

As I reflected on Miller’s article, and wondered what it would look like to be more thoughtful about the technology in my life, I thought of recent food-related “movements” in which people have tried to be more thoughtful, deliberate, and intentional about what they eat. These movements have manifested themselves in numerous ways, from buying “organic” food that hasn’t been treated with growth hormones and other artificial chemicals, to the rise of CSAs and community gardens, to growing support for local vendors. Among other things, I believe that these movements have been inspired by people’s desire to be more in control of the food that they and their families consume, to know more about it, and to be more empowered to make better decisions rather than live at the mercy of corporate supply chains and fast food franchises.

Is it possible to apply some of those same principles to our technological consumption? If so, what would that look like?

My desire here is not to make people paranoid about technology. I’m a geek myself, and I love my gadgets. But I hope that love never causes me to be blind or passive with regard to the technology that comes into my life, and the life of my family. Embracing technology in its rapidly developing forms is not inherently bad, but I want to do so with a critical, even skeptical (when warranted), eye.

About Jason Morehead

Jason Morehead lives in the lovely state of Nebraska with his wife, three children, zero pets, and a large collection of CDs, DVDs, books, and video games. He's a fan of Arcade Fire and Arvo Pärt, Jackie Chan and Andrei Tarkovsky, "Doctor Who" and "Community," and C.S. Lewis and Haruki Murakami. He's also a web development geek, which pays the bills — and buys new music and movies. Twitter: @jasonopus. Web: http://opus.fm.

  • Scott G

    I’ve been mulling over the idea of deliberation, old technology, and “play” lately.

    “Play” seems the most intuitive and important concept for our generation. It goes like this: when I get a tool (Scrivener, iPhone, new ratchet set), I immediately play with it. There’s something inherently joyous about putting something through its paces, seeing if it will be useful, and feeling the extension of our bodily abilities that is part of our nature as tool-wielding humans. With the increasing pace of new and better “toys” and “tools” to play with (and the increasing overlap between the two), it seems there is some danger of spending all one’s time playing with the next best thing, and none actually doing the things that technology allows.

    Deliberation is the opposite of play, I think. In play one learns by doing, while when one deliberates one thinks before doing. Traditionally, intellectuals were accused of deliberating too much; though I think after the internet arrived, that changed.

    Yet there seems to be a healthy medium in the intentional use of old technology. To me, a wet shave, or time spent kneading dough for cinnamon rolls, or the precise lettering of calligraphy, seems to act as a balm for the soul. I’m still “playing,” in the sense that I’m doing an action, rather than thinking, and in the sense that learning is therefore multisensory. But somehow these things are soothing in a way that learning Linux–or the battle system of Banner Saga–isn’t. Obviously, this is still working with technology, since there were no ovens or razors in Eden, but it is less catered to our needs, and therefore forces us to cater to the needs of the technology. In short, we have to think about what we’re doing, because older technology doesn’t make actions as easy as thought.

  • Jason Morehead (http://opus.fm/)

    @Scott: The play/deliberation dichotomy is an interesting one. I like it. Thanks for sharing.

