I recently had the kind of conversation that many a parent yearns for with their kids when my 11-year-old daughter excitedly told me about a book she had read called Frost by M. P. Kozlowsky.

The novel takes place in a post-apocalyptic setting where a father, before he dies, uploads his memories into a robot servant in order to help his daughter survive in a world filled with cannibals and rogue robots that hunt living things.  Our conversation went on to discuss how Frost is a prime example of a perennial trope in science fiction that famed author Isaac Asimov called the Frankenstein Complex: the fear that robots or other artificially created intelligences will, in time, become so advanced that they will replace, enslave, or kill off humans.

Given the exponential strides that industrialized nations have made in technology, along with the sheer amount of high-tech infrastructure we have built and come to rely on in our daily lives, that fear has become a legitimate topic of discussion in the real world.  While much of the conversation concerning advances in Artificial Intelligence (A.I.) has centered on the predictable social upheavals that will ensue as A.I. and robots put people out of work, scientists like Stephen Hawking and hi-tech entrepreneurs like Elon Musk have expressed concerns that the rapidly increasing power of A.I. presents an existential threat to humanity.

While I am not an expert in robotics or artificial intelligence and cannot see what the future holds, if we look at this issue from first principles, I think that the modern manifestation of the Frankenstein Complex is, as it always has been, grounded in something far more basic than any particular technological advancement.

Frankenstein and the Post-Modern Prometheus

The Frankenstein Complex gets its name from Mary Shelley's 1818 novel Frankenstein, which explores, as its subtitle The Modern Prometheus implies, the moral ramifications that arise when mankind presumes to create, let alone be responsible for, that which is supposed to belong to God alone: life.

To be honest though, in regard to modern fears surrounding A.I., Shelley's allusion to the Titan who stole fire from the Olympian gods is somewhat outdated.  The problem is that the Promethean quandary comes out of a worldview that at least acknowledged the existence of a distinct spiritual hierarchy separating the divine and the mortal.  In our post-modern, materialistic, scientism-tinted world, however, those traditional hierarchies have all been flattened, leaving man's subjective vision of himself as his own center of gravity in the cosmos.

From this perspective, I would posit that a more fitting progenitor of our contemporary Frankenstein Complex was J. Robert Oppenheimer, the physicist who played a key role in the development of the first atomic bomb.  It was this post-modern Prometheus who, not content with merely stealing fire from the gods, led a team of scientists in creating a device that would unleash a fire of such divine power that even President Harry S. Truman wrote of it as being “the fire destruction prophesied in the Euphrates Valley Era, after Noah and his fabulous Ark.”

As Oppenheimer watched the first successful test of the atomic bomb at the Trinity site in New Mexico in 1945, along with the pride and exaltation he felt at the Manhattan Project's success, he experienced a feeling of dread.  He famously muttered at the time, “I am become Death, the destroyer of worlds,” as it would seem that in the flash of a thousand suns, he and everyone responsible for making the bomb were able to see that they were “naked” and thus felt ashamed.  Oppenheimer admitted as much later in life when he said that “the physicists have known sin; and this is a knowledge which they cannot lose.”

It is this moral dissonance between having the power to “be as God” while at the same time still being constrained by a fallen human nature that I believe is at the root of the fear that many scientists have of the very machines they are obsessed with creating.  They are unwilling or unable to accept that their scientific inquiries could ever be subject to something as base as human passions, let alone something as benighted as the concept of sin.

Instead, in true gnostic fashion, they absolve themselves of the anxieties that arise from contending with the existence of an objective and eternal moral order by splitting their “knowing of good and evil” in two.  They lay claim to the “good” by attaching it to their intentions and actions, while distancing themselves from the “evil” that may arise from their efforts by projecting it onto the machines they create, thus turning them into monsters.

This is a point that E. Michael Jones made in his intriguing book Monsters from the Id where, in talking about the various monsters that inhabit the horror genre, he notes that every time mankind tries to abandon the moral order laid out by God, “you're going to have your passions get out of control, and when the passions get out of control they become monsters.”  This is clearly seen in the myths and legends we tell in movies and novels about our fears of A.I.: not only in the 1956 movie Forbidden Planet, from which Jones's book gets its title, but also in HAL from 2001: A Space Odyssey, Skynet from the Terminator franchise, and the computer overlords of The Matrix movies.

All of these stories are about the Frankenstein Complex, and all of them, according to Jones, arise from turning our back on God's moral order, which in regard to A.I. has to do with the tenets of the first commandment: idolatry and the pride that fuels it.  It is the kind of idolatry practiced not only by the people whom St. Paul described as being “without excuse” because God's “eternal power and deity, has been clearly perceived in the things that have been made,” but also by those who, as the author of the Book of Wisdom points out, have “their hopes set on dead things” and “give the name ‘gods’ to the works of men's hands.”

In other words, it is an idolatry born of a conceited intellect that will not admit to the evidence of God all around it, and thus seeks to fulfill its spiritual longings by imbuing the things it has created with a divine nature.  After all, think about it.  The fear of A.I. run amok is never over the likes of Data from Star Trek: The Next Generation or droids like R2-D2 from the Star Wars saga.  Nor is it over sentient robots that have hang-ups just like humans, such as Marvin from The Hitchhiker's Guide to the Galaxy series of books and movies.  Still less is there a fear of a dysfunctional robot like Bender from the Futurama TV series, who is drinking a beer when he first meets the main character Fry, newly awakened from a 1,000-year cryogenic sleep.  Fry genuinely asks, “Why would a robot need to drink?”  Bender replies, “I don't need to drink. I can quit anytime I want.”  No, the fear is always based on the prideful and conceited notion that their massive (in their own estimation) intellects will create something so much greater than themselves.

Less Human than Human

An excellent example of this conceited view of the Frankenstein Complex can be seen in the making of the 1982 movie Blade Runner.  In an interview, Philip K. Dick, the author of the novel Do Androids Dream of Electric Sheep? on which the movie was based, talked of a fundamental disagreement he had with the film's director Ridley Scott.  While Scott thought of the androids, or replicants, as being superior to humans in speed, strength, and intellect, Dick said that was a complete misreading of his story.  Concerning the replicants, Dick said, “They are cold, they are selfish, they are heartless, they are completely self-centered, they have no empathy...and to me this is a less than human entity because of that.”  To Dick, the point of his story was not so much an esoteric meditation on what it means to be human as it was about how “Rick Deckard is dehumanized in his job of tracking down replicants and killing them. And essentially he ends up like them.”

This, then, is the upshot of the pride and idolatry that fuel the modern Frankenstein Complex.  The issue should not be worrying that robots and A.I. will get so advanced that they become, as the fictional Tyrell Corporation that made the replicants in Blade Runner said of its products, “more human than human.”  The issue we should be worrying about is whether, in our pursuit of more advanced A.I., we end up like Deckard and become less human than human by reducing our valuation of the human person from Imago Dei to a jumble of biological parts.

For in removing God from his proper place in the cosmos, and placing ourselves there instead, we risk becoming the kind of person that Romano Guardini wrote about in The End of the Modern World: “an empty, soulless husk, a shadow of humanity. He looks the part, and he goes through all the motions, but he does nothing of his own volition.”  And given that we live in a world where the Frankenstein Complex is not only being seriously discussed, but where a robot has been granted citizenship and clever anti-telemarketing “bots” are able to fool real people, it's hard not to think that that world is already here.

Photo: Blade Runner 2049/ The Fan Carpet