How can it not know what it is? — Rick Deckard, Blade Runner
That line can be read as either a question or a declaration by Deckard, a blade runner hired to kill rogue replicants in Ridley Scott’s dystopian classic, Blade Runner. The film demands a deep dig into our personal definition of what it is to be human. The debate over whether Artificial Intelligence (AI) is sentient and able to make moral decisions, or is merely a lower level of being, calls into question our definition of life. Are we only a sum of our sinew, muscle, bone, and brain, or is there more to it? The film leaves you to wrestle with the idea.
Released in 1982 and ironically set in the year 2019, Blade Runner imagines a complicated future in which humans coexist with AI humanoid slaves who begin to develop a conscience and emotions. The Tyrell Corporation has developed AI with near-human characteristics and abilities, paired with superhuman strength. In the corporation’s rationale, it makes sense to replicate a human being who is also a disposable asset, trained to work on dangerous and remote planets. Dr. Eldon Tyrell, CEO of Tyrell Corp., coolly brags, “Commerce is our goal here at Tyrell. More human than human is our motto.”[1] The result is the “replicant” — superhuman in strength and ability. Replicants are programmed to think and interact with basic emotions, but a glitch in the algorithm produces a personality problem because those emotions are not tied to experience or memory. The replicants begin to adapt to their world, taking on deeper human tendencies: reason, survival, revenge, thoughtfulness, and cruelty. These emotions are beyond the Tyrell Corporation’s control, so as a safeguard against uncontrollable human complications, the replicants are built with a four-year lifespan, their termination euphemistically termed “retirement.”
What should give us pause is Tyrell’s pitch of “more human than human.” Awareness, emotion — what truly separates man from AI machine? The replicants’ behavior implies emotion, but their responses are more akin to animal instinct. Can they make decisions they were not programmed to make?
Science wrestles with the idea that AI may behave morally, but that question leads us to wonder about the definition of morality — the morality of a programmed mind. At the conclusion of Blade Runner, Deckard pursues Roy Batty, the last surviving rogue replicant, with the intent to destroy it. Shooting the other replicants was not a difficult decision because they were set on killing any human who got in their way, but Batty is different. He has somehow acquired empathy, while the highest mental process in the other replicants is self-preservation. Batty is in a position to kill Deckard with one deft kick, but he acts in a way that hints at more humanity beneath his skin and a soul behind his eyes. As the rain pours down on the scene, Batty looks at Deckard and tells him, “I’ve seen things you people wouldn’t believe. Attack ships on fire off the shoulder of Orion. I watched C-beams glitter in the dark near the Tannhauser gate. All those moments will be lost in time, like tears in rain. Time to die.”[2] His internal “retirement” program terminates him. More than mere programming, the remembered experiences of his four-year lifespan die with him, never to be shared. He releases the white dove he has been clutching into the stormy sky, perhaps a symbol of his developing spiritual element. We are left to wonder whether this replicant, composed of machinery and accumulated memories, stood at the edge of real human life.
How would you define what it is to be human — the qualities that divide us from AI replicants and animals? In physical characteristics, all share facial similarities and limbs that end in paws and claws or hands with nails. People have emotions, but anyone who has watched a dog-shaming video could argue that animals do too. Animals function from instinct, or perhaps from a reaction to an event grounded in memory. The replicants in Blade Runner were programmed with memories, albeit memories belonging to someone else, yet those memories are resources that aid their survival. The line between behavior rooted in instinct and memory and behavior rooted in free will is very thin.
True empathy is a human element — feeling for someone else rather than reacting, or simply the ability to ‘put ourselves in someone else’s shoes,’ to show compassion and consider outcomes. Batty’s decision to save Deckard is unexpected. He gives an empathic, rather than an instinctual, response. Even greater than empathy is the human capacity for reason and imagination. Batty’s near-melancholy reflection on his “moments lost in time” hints at the loss of an imagined future. Unlike animals and algorithms, “the human mind is capable of forming mental images of things not actually present.”[3] Only human minds are capable of deep spirituality or a sense of wonder and awe — the experience of the numinous — that sets us apart from any other being.
What is Human?
I gaze into a mirror that reveals
The features that compose a human face.
A wax museum replication forged
From atoms linked together, matrix-formed.
Now gazing back at me, a flawless clone,
Unmoving, hollow construct stares beyond
And unresponsive, cold reflecting back.
Immune to thought or movement, unlike me.
A beast will see the same array composed
Degrees of skin and hair and nose and eyes.
But beasts can sense, react, respond, and feel.
Raw fight or flight is programmed into them.
As sentient human minds perceive the good
Or evil, but can brutes return in kind?
If it exhibits life, is it alive?
Amassing algorithms make a mind
That functions with pure methodology.
And simply an experiment survives
To function as a human, less the soul.
The mystery of human-ness is grace.
I feel compassion far beyond myself.
Benevolence may mark itself in tears.
Your hurt and joy compress against my heart.
The empathy behind the gaze revealed
Sets us apart from beast or humanoid.
Look deeply into eyes, they write the tale,
That shows Imagination from without,
And infinite complexity within.

Citation Information
Annie Nardone, “What Makes Us Human?,” An Unexpected Journal 3, no. 2 (Summer 2020): 175-194.
Direct Link: https://anunexpectedjournal.com/what-makes-us-human-observations-on-blade-runner/
Endnotes
[1] Blade Runner, directed by Ridley Scott, screenplay by Hampton Fancher and David Webb Peoples (Warner Bros., 1982).
[2] Ibid.
[3] J.R.R. Tolkien, On Fairy-stories (London: HarperCollins Publishers, 2008), 59.