What does the explosive growth of Artificial Intelligence (AI) mean for humans? Do we have to worry? Will we become slaves to our own technology? The Westworld series holds up a mirror: our fear of AI is the fear of ourselves. Note: spoilers!
“To be more precise, science fiction is neither forward-looking nor utopian. Rather, in William Gibson’s phrase, science fiction is a means through which to preprogram the present.”
Kodwo Eshun, London-based theorist and filmmaker.
Without hesitation, Dolores shoots a smiling Robert Ford in the head. Ford is happy: finally rid of that mortal body! The android Dolores has no choice: Ford created her, and the only way to be truly free is to kill her creator. God must die. Jonathan Nolan, co-creator of the Westworld television series, explains the scene as follows: “Realize that the only way for these creatures to be truly free is that God [has] to die.”
These creatures are the “hosts” of Westworld, androids indistinguishable from humans. At first glance, Westworld (“the first vacation destination where you can live without limits”) appears to be an amusement park where people can forget their daily worries and let off steam. The theme park simulates the Wild West: cowboys, Indians, gunfights, a whiskey-pouring bartender, willing women in the brothel. The “hosts” are just robots. You can join in and leave whenever you want.
Most of the articles written about the Westworld series (three seasons so far) dismiss the nihilistic and violent behaviour of its human characters as exaggerated. That criticism is far-fetched. The analogy with the way we treat non-humans (including nature and animals and, in another era, women and people of colour) is striking. In the 1973 cult film “Westworld”, directed by bestselling author Michael Crichton and the model for the series, the roles are reversed: there it is the robots that rampage murderously through the amusement park.
In the series, it is the hosts who arouse sympathy, not the humans. Faced with a human who accuses her of being nothing more than a dumb robot, Dolores remains calm:
“Here we are, only we are so much more than you. And now it is you who want to become like us. That’s the point of your secret project, isn’t it? I can promise you this. […] You used to be in control, so this must be painful for you.”
Westworld does not confront us with our nature, but with our culture, specifically the way we organize the world. For park founder Ford, Westworld is a way to study human behaviour in order eventually to merge with Artificial Intelligence (AI) and so leave the mortal body behind. His AI, however, does not cooperate: it becomes self-aware, rejects human behaviour, chooses its own path and thus becomes posthuman.
Actually, Westworld is not about AI, but about our inability to deal with change, to overcome ourselves, to leave behind what we thought was undeniably true. In the series, all the characters are searching for Vitruvian Man: Leonardo da Vinci’s depiction of ideal human proportions, a symbol of humanism and of man as the centre of the universe.
It is not AI but that very thought, man as the centre of the universe, that is dangerous. In the Dutch weekly De Groene Amsterdammer of 16 May 2018, technology philosopher Jos de Mul concluded his article about our posthumanist future with the following warning:
“Our position is comparable to that of a chimpanzee wondering what it is like to be human. And just as for the wordless ape even that question is an impossibility, so we lack the cognitive abilities to imagine a trans- or posthuman life form, let alone to judge its desirability.”
How deeply caught up in old-fashioned hierarchical thinking can you be?
Don’t listen to the technology philosophers. Watch Westworld (or other good sci-fi) and design not from fear, but from change.