Episode 7 of the first season with Jodie Whittaker as the Doctor is called “Kerblam!” We learn why very early in the episode, as the Doctor receives a teleport delivery via robot. Kerblam, we are told, is the biggest retailer in the galaxy. The Doctor had apparently ordered a fez some time ago and forgotten about it. If that were true and there were no other factors, we would think that the company was very late in delivery. But on the one hand, this might have been the Doctor passing through the earlier time period in which the order had been placed, while on the other, we learn later that orders had indeed become backlogged for a particular reason.
In the box the fez came in, “HELP ME” is written on the back of the packing slip. The Doctor and team pay Kerblam a visit, going undercover as new workers. We learn the company slogan: “Pick it, click it, kerblam it.”
When Graham says that the robots are creepy, the Doctor says, “That’s robophobic. Some of my best friends are robots.” We learn that Kerblam is 90% automated and 10% people-powered, with the organic employees consisting of some 10,000 human workers. When the Doctor asks why the work is not entirely automated, reference is made to Kandokan labor laws. One of the slogans related to the movement behind such legislation is quoted: “Real people deserve real jobs.”
There is no attempt to hide the message for our own time. A worker named Dan says, “While we had our eyes in our phones, technology nicked our jobs.” A manager named Slade insults an employee, and the Doctor stands up to him. Of course, it is easier to do that when you don’t depend on the company for your livelihood. And so once again, the episode highlights an issue but doesn’t offer a solution – except to invite those who have the privilege of not being dependent (or at least not as dependent) on an entity to use that privilege to stand up for those who may not feel empowered to stand up for themselves. Of course, there is always a risk that the one who does so will simply leave the person they sought to protect in jeopardy, once they are no longer around to stand between them and the bully. Kerblam, we learn, is its own jurisdiction. One cannot appeal to authorities against them, because they are their own authorities. And so this is every bit as important an issue as the automation and employment question the episode highlights: if corporations become akin to societies that police themselves, can we trust them to treat employees with respect and fairness, any more than we can trust them to employ human beings in the first place?
There are a number of powerful lines – powerful both in what they emphasize, and in what they unwittingly or begrudgingly acknowledge. The Doctor says, “Don’t like bullies, don’t like conspiracies, and don’t like people in danger. This has a flavor of all three.” People manager Judy Maddox says at one point, “There are 10,000 people here. I can’t keep track of them all.” When a society or a corporation loses the ability to even notice when its workers are being killed, something is wrong. In this case, we later learn that an employee named Charlie is in fact an anti-robot activist, concerned with the ever-decreasing employment of humans, and determined to fight for humanity against automation. However, to protect the secret of the robot army he is amassing, he is willing to kill his co-workers and thousands of customers, highlighting the irony of how this sort of crusade finds itself having to rely on that which it claims to hate, and sacrifice that which it claims to be interested in protecting.
The wasp reference leading to a mention of Agatha Christie was a nice touch. I also liked the phrase, “Ooh, paperwork, very retro.” We later learn that this is not just a liking for older ways of doing things. Slade kept analog notes about the employees who had vanished because he was not sure he could trust the computer system that runs the warehouse and, ultimately, the company. The company is itself a machine, an automaton, which raises the question of who or what profits from its work. It is not humanity in general – but is it the AI that runs things, or some human being(s) at the pinnacle of the economic system? We’ll need to come back to that.
As the plot draws toward its close, the Doctor makes use of a first-generation delivery bot called Twirly. It engages in upselling as well as delivery, and probably represents all too accurately how drone delivery will soon feature drone advertising as well. We soon learn that the Kerblam system itself sent the message to the Doctor seeking help. It speaks through Twirly.
We then learn that an enormous number of deliveries have all been held up, to allow the robots to teleport to their destinations simultaneously. The boxes contain the products ordered – wrapped in weaponized bubble wrap, each pocket a tiny bomb just waiting for the customer to pop it.
When we learn that Charlie is responsible for the plot, we also learn what happened to Kira, a co-worker he had a crush on. She had been told she was the employee of the day and would receive a present, and was taken to a room where she received a parcel packed with bubble wrap, while Charlie could see what was happening and watch her die when she popped one of the tiny bubbles. The computer system gave Kira Charlie’s bubble wrap to try to show him the suffering he was about to inflict on others – and on those who love his victims, as he loved Kira.
The Doctor had earlier said that Graham should become a maintenance man, because they have access to everything, but are noticed by no one. And so it is the perfect way to infiltrate an organization. The Doctor had Graham do it in the interest of helping, while Charlie had done it in a desire to destroy the company.
As he seeks to justify himself, Charlie says that they want people to be grateful for 10% of jobs going to humans – but next it will be 7%, and then 5%. And so he wanted to erode trust in the company and in automation by creating a disaster that would require that the system be shut down. Ironically, Charlie says that the problem is that the system doesn’t have a conscience. But as the Doctor points out, it does – in this case at least. The system had figured out what Charlie was doing and had sought help to stop him. One could, of course, say that the system was merely interested in self-preservation. But that is one of the difficult quandaries: so often, the survival of the system and of ordinary people are so entangled as to be effectively inseparable. The Doctor insists that the system isn’t the problem; the problem is how people exploit and use the system. And this could be said of anything. We cannot coherently complain about TV, advertising, consumerism, and corporations while, as customers, we not merely uphold but effectively demand that these very services and forms of entertainment be provided.
The Doctor uses twirly to change the destination of the delivery robots to the warehouse where they currently were. As usual, in a fitting bit of poetic justice, the robots’ explosive bubble wrap kills Charlie – but not before the Doctor tries one last time to get him to leave their midst and run to safety.
As the io9 article about the episode highlights, the central ethical message of the episode is one that it would be easy to miss. That in itself illustrates precisely the conundrum when it comes to automation, corporations, CEOs, workers, and societal economic and political systems. The billionaires benefiting from the system can be blamed. Or we can blame the shoppers. Or we can recognize that all of us collectively support and maintain the way that things are done. Giving people a “right to work” may be one approach. The episode’s approach of having laws require that a percentage of every company’s workers be organic is a variation on that. And a basic income model that ensures everyone benefits from automation-driven economic productivity is another. A workplace in which robots oversee human beings and intervene to keep them from talking and lagging in productivity shows the irony of mandating a quota of less efficient human workers in a machine-dominated workplace. That experience can be every bit as dehumanizing as being replaced by robots. The episode, on the whole, seemed to me to do a good job of raising key questions about automation and the role of work in human society and individual meaning-making. It raised and highlighted the questions rather than trying to provide definitive answers. That’s probably for the best. What did you think of “Kerblam!”?
On a related note, I will be presenting together with my colleague Ankur Gupta at Starbase Indy later today and tomorrow, on the intersection of AI, automation, and ethics. We haven’t yet turned our attention to automation and the theology and ethics of human work, but it is on our list of things to do. The driverless car, where we began this project, is of course just one sort of robot that will replace humans in the workplace – on the road rather than in the factory and warehouse. And so expect to read more about this topic (and my first time at Starbase Indy) in the coming days. With Waymo set to offer a commercial driverless car service that competes with Uber and Lyft early next month, we chose a good topic to begin with!
Also on this topic, don’t miss all the attention that Yuval Noah Harari has been getting in the press, as they’ve tried to figure out why Silicon Valley loves this outspoken critic of what they do and stand for. There was also a conversation about the future in the New York Times that included a roboticist. New Scientist had an article about the use of AI at national borders. Boston Review had a piece on why the digital is political. And see what happened when a church started receiving donations through Venmo.
Finally, let me end in a suitably ironic fashion by directing your attention to a coupon that is currently available for a nice discount on old-fashioned analog print books…if you buy them from Amazon.com!