(This post is a continuation of yesterday’s post on “A Brief History of Tomorrow”: What Apple, Facebook, & Google Don’t Want You to Know, inspired by Yuval Harari’s book Homo Deus: A Brief History of Tomorrow.)
Big data does have benefits. Public health monitoring based on Google searches, a project called “Google Flu Trends,” can already give a warning about flu outbreaks “ten days before traditional health services.” It could be even more accurate, of course, if Google also scanned private emails for signals of flu outbreaks (340).
Or consider the forthcoming potential of autonomous cars. If we were collectively willing to give up the privacy of our location—and let algorithms know in advance where we are, where we want to go, and when we want to get there—then experts estimate that we could replace 1 billion private cars (which spend most of their time sitting around unused) with 50 million communal cars (whose use is optimized by algorithms). We would also need “far fewer roads, bridges, tunnels, and parking spaces,” and we would lose less time, and less equanimity, in traffic jams (390).
To give another example along those lines, have you ever used Google Maps or the app Waze, which Google bought in 2013? Even when I am driving familiar roads, I sometimes turn on Google Maps because it lets me know if there is an unexpected traffic backup for any number of reasons, and, if so, what my options are for re-routing. This app has frequently saved me a lot of time. And more than once I’ve ended up stuck in traffic because I was not using my mapping app. Of course, when I have it on, I’m giving Google lots of data about myself.
But my larger point is actually one level beyond that. Currently, all the various mapping apps give control to individual drivers. They present the situation and ask if you want to take an alternative route. But as some of you may have experienced, this can cause a bunch of people using the same mapping app to create a secondary traffic jam on a small side road. The next generation of mapping apps may try to “think for us”: “Maybe it will inform only half the drivers that Route #2 is open, while keeping this information secret from the other half. Thereby, pressure will ease on Route #1 without blocking Route #2” (347). More perniciously, maybe the “better” routes will be given to users who pay extra for a “pro” level of the app.
I’ll give you just one more example of the many different ways the religion of data could take us in the future. People may, one day in the not-too-distant future, find themselves picking up a smartphone and asking Apple’s Siri, Amazon’s Alexa, Google’s Assistant, or Microsoft’s Cortana—you know, depending on your Corporate Overlord of choice—“Siri/Alexa/Assistant/Cortana, whom should I marry?” And you might hear this answer:
Well, I’ve known you from the day you were born. I have read all your emails, recorded all your phone calls, and know your favorite films, your DNA and the entire biometric history of your heart. I have exact data about each date you went on, and, if you want, I can show you second-by-second graphs of your heart rate, blood pressure and sugar levels whenever you went on a date with John or Paul…. And naturally, I know them as well as I know you. Based on all this information, on my superb algorithms, and on decades’ worth of statistics about millions of relationships—I advise you to go with John, with an 87 percent probability that you will be more satisfied with him in the long run. Indeed, I know you so well that I also know you don’t like this answer. Paul is much more handsome than John, and because you secretly give external appearances too much weight, you secretly wanted me to say “Paul….” My algorithms, which are based on the most up-to-date studies and statistics—say that looks have only a 14 percent impact on the long-term success of romantic relationships. So, even though I took Paul’s looks into account, I still tell you that you would be better off with John (342).
That sort of “Big Data”-style dating advice may (or may not) sound appealing. But you might ask whether it could be a helpful perspective to consider, irrespective of whether you decide to follow its advice.
However, Big Data also has the potential to create the “Big Brother” of an Orwellian police state (350). This sort of totalitarian regime would not merely monitor our bodies and minds continually (as benign services do, providing information individuals can use for their own self-optimization), but would also seek to control and regulate our every movement and thought.
There is so much more to say about all of this. But for now I’ll say this. There is a lot of fear-mongering happening these days around immigration. I invite you to consider that all that energy might be more fruitfully spent planning for a transition to a future in which, not immigrants, but robots and other forms of Artificial Intelligence really are coming for our jobs. And the jobs under threat of robotization are not only those of bus and cab drivers, telemarketers, insurance underwriters, sports referees, cashiers, chefs, waiters, tour guides, construction laborers, and security guards, but also those of doctors, pharmacists, teachers, music composers, and artists (316-321, 328-329, 330-331).
This is where Harari’s Dataism, his religion of data, can challenge us to ask: What are our deepest, most authentic “ultimate concerns”? On the brink of a potential paradigm shift from homo sapiens (“wise humans”) to homo deus (“godlike humans”)—augmented through algorithms, nanotechnology, and wearable or implanted data processing—we have the opportunity to ask ourselves anew, “What are people for?” If we allow the answer to be that people are for the so-called bottom line of corporate profit alone, then we are headed toward some form of dystopia along the lines of Orwell’s 1984. (This point does not even get into the perversity of the Supreme Court’s Citizens United ruling that “corporations are people.”)
But there are alternative paths, in which “we the people” demand a global ethic such as the “Triple Bottom Line” of people, planet, and profit. Financial profit is still a factor, but it must be balanced against the wellbeing of people and the long-term sustainability of this planet.
The Rev. Dr. Carl Gregg is a certified spiritual director, a D.Min. graduate of San Francisco Theological Seminary, and the minister of the Unitarian Universalist Congregation of Frederick, Maryland. Follow him on Facebook (facebook.com/carlgregg) and Twitter (@carlgregg).
Learn more about Unitarian Universalism: http://www.uua.org/beliefs/principles