Earlier this year (2017), Tesla CEO Elon Musk announced plans to launch a company that would specialize in computerizing the human brain. He is not the only one. This is also the year that basic robotics and AI, without much protest or debate, became commonplace in our homes. Shortly before I began writing this reflection, I asked Alexa, the Amazon-built device in my home that responds to my voice commands, to play me “Wagner”, and she failed to understand me. I found myself getting into a small argument with the device and calling her racist for not understanding the way I speak English. Shortly after I finish this reflection, I will start my Roomba, my robot vacuum cleaner, and, as always, will probably find dust it failed to pick up and get frustrated. Daily conversations and interactions with AI are becoming commonplace. Nestled in them are excitement and aggravation.
Discussion of AI is ubiquitous in the sci-fi genre. Recently a fictional discussion of AI came in the form of the film Ex Machina (see this piece from the Guardian that couples a discussion of the film with recent advances in AI tech). The film was about a chauvinist’s creation: a pretty but intelligent robot. The creator acted a lot like contemporary tech CEOs, and the ending was also a bit out of the norm – the independent robot escapes by mimicking the humans who created her.
Films in the sci-fi genre are often reflections of anxieties about the future. One way this anxiety shows itself is in narratives built around the relationship between a good and a bad robot. My reflection is inspired by my recent viewing of Alien: Covenant, the Alien franchise reimagined (in 4DX – an experience I will reflect on later). In the new film (spoilers ahead) we meet a new version of the robot introduced in the prequel, Prometheus. There the audience met an android called David (played by Michael Fassbender), a robot with aspirations of his own to create life. It is his indulgence of this goal that leads to the creation of the famous alien species of the franchise. In the new film, David, whom we assumed was lost in distant worlds at the end of Prometheus, appears mostly in flashback scenes, where we see his interactions with his creator. More screen time is given to the “good robot”, a newer version of him (also played by Michael Fassbender). He is, the movie suggests, an updated model: a better companion to humans, with fewer aspirations of his own and less interest in the arts or dreams. He is the good robot.
This is not the first time a look-alike robot brother dichotomy has driven the plot of a major science fiction franchise. I am sure many sci-fi geeks like myself were immediately reminded of Star Trek: The Next Generation, where fans were introduced to the evil twin of the friendly android Data. Throughout the entire TNG run, Data constantly seeks a way to feel emotion; otherwise he is more capable than humans in every other way (intelligence, physical strength, etc.). In his quest to understand and feel human emotion, every season features Data on some sort of quest to attain it – be it his fascination with the arts or with human conversation and reactions. Data never fully achieves his goal: when playing music or acting a scene, he always lacks emotion, and he is often frustrated by this shortfall. His twin brother, whom the Enterprise crew come across in their search for a Borg army, is a version of Data by the same creator (Soong) with one major difference: he has emotions. As in the new Alien film, the inclusion of feelings in this model leads to rebellious tendencies. These tendencies are what make this character evil, the bad robot. For robots, having emotions is the first step in becoming the enemy.
I am not going to get too far into the plot parallels here; I just want to comment on this relationship between the good and the evil robot, which I believe reflects our anxieties not just about robotics but about individuality.
Both Lore (TNG) and David (Alien) aspire to autonomy. Their understanding of their creator inspires deep resentment, not appreciation, for their master. Both want to overcome their father and become something more than what they were programmed to be.
Both, because of this aspiration, are driven by a desire to lead a new group of their own making. Lore is able to manipulate and lead a Borg faction; David becomes the creator of the aliens that go on to infect humans across seven films so far, and who knows how many in the future. Rather than being followers, both characters overcome the system to create something of their own.
Whilst Data is an aspiring artist without emotions, Walter, the good robot in the new Alien film, lacks interest in the arts altogether. The capacity is there, deep within him, but the improvement in this new model is that he knows not to make it part of his routine. Unlike their good counterparts, both Lore and David are driven to create.
Most importantly, neither David nor Lore wants to conform. The framing of both stories leads the viewer to empathize with the robots – in this case the good ones, who, lacking independence, appear very much as slaves to human desires. Cloaked in sub-narratives of friendship, it is these synthetic beings’ service and allegiance that becomes the measure of their success as a species.
The two stories end differently. In Star Trek: TNG, it is Lore who loses – to the forces of the Federation (and other aliens) and to Data, who sides with his crew, not his brother. Data willingly gives up the emotion chip Lore had installed in him (Lore installs the chip so that Data can experience what he was experiencing, which for a brief period also makes Data stand up against his human masters). Data’s fear of becoming emotionally unstable like Lore ends his access to human emotions (although he does not give up the pursuit in the remainder of the show or the franchise films).
In the new Alien: Covenant, David wins. The fight scene between the two robots ends with (spoiler) the bad one of the two winning. Deceiving the humans and taking his “good” brother’s place, David, we are led to believe, will unleash the alien on yet more worlds. In this way, I assume, we are made to believe that David is the real evil, not his creation, the Alien.
Both bad robots turn violent in the name of liberation. In that way, aren’t they a display of “violence [which] frees the native from his inferiority complex and from his despair and inaction; it makes him fearless and restores his self-respect” (1963: 94)*?
When films create dichotomies and binary oppositions, they tend to carry subliminal messages. I can’t help but sense that part of the message here (more overt than subliminal) is that too much creativity and free thinking is bad for you. The narrative in both cases inclines the viewer to side with Walter and Data whilst shaming David and Lore for their independence and aspiration. Creations must act within the bounds of society – conformity should eventually win.
Not included in this reflection is a consideration of the TV series Westworld, a remake of an AI-themed Western film, which adds a whole new layer to this discussion. Here you can read about the idea of consciousness introduced in that series.
* Fanon, F. (1963) The Wretched of the Earth, trans. C. Farrington. New York: Grove Press.