When did we decide machines will come for us?

As an avid science fiction enthusiast, I recently read Death’s End (Chinese: 死神永生), a novel by the Chinese writer Liu Cixin in which the solar system is attacked by a superior species using a spacetime weapon that collapses three-dimensional space into two dimensions. Humanity has no defense against this and faces extinction within days. The fear of superior technology and machines turning against humanity has been a recurring theme in human storytelling, dating back to ancient times. Tales of robots, mechanical humans, and statues coming to life have captured our imaginations for centuries.

Writers like Isaac Asimov, Arthur C. Clarke, and Philip K. Dick explored themes of artificial intelligence, robotics, and sentient machines, delving into the potential implications of their existence. These tales tapped into a deep-seated human fear: fear of the unknown, of unpredictability, and of losing control over our circumstances. So how did the emergence of science fiction as a genre further shape our collective perception of machines? Mark Wheaton, a screenwriter, producer, and journalist, wrote an interesting article on the subject, ‘Why We’ve Decided That the Machines Want to Kill Us: On the Murderous Potential of A.I., in Fiction and Reality’.

He describes how, in the early days of computing, computers were often seen as tools that would aid human progress. Their capabilities were impressive, but they were considered extensions of human ingenuity rather than independent entities with their own motives. What changed this was the arrival of ENIAC, the first general-purpose electronic computer, in 1945, and the subsequent omnipresence of business computers in the 1950s and ’60s. Soon the idea of computers replacing humans in the workforce became a recurring plot in television shows. But time and again, whether it was Captain Kirk smashing, or talking, yet another computer to death on Star Trek, or Number Six driving a super-computer to self-destruct on The Prisoner by asking it the simple question, “Why?”, human bravery and ingenuity overcame the ever-cold, authoritarian logic of these supposedly omniscient machines.

In his article, Wheaton argues that it’s crucial to recognize that the fear of machines is not rooted in the machines themselves. Instead, it might reflect our own need to feel in control and to worship entities perceived as greater than ourselves. In our stories, super-computers became the new gods, symbolic of the awe and reverence we bestow upon technological advancements. He argues that our perception of machines as potential adversaries also finds its roots in religion and ancient mythology. Throughout history, gods and goddesses demanded fealty from mortals, serving as intermediaries between humanity and the forces of nature. Worshiping these deities provided a sense of control over unpredictable natural events. Drawing a parallel, in our modern world we might project the same tendencies onto technology, seeking to exert control over the unknown and unpredictable aspects of machine intelligence.

The fear of the unknown is a natural aspect of the human experience, the feeling that we lack enough information to make accurate predictions. In an attempt to counteract this lack of predictability, we seek knowledge and understanding, but technology often advances at a pace that surpasses our ability to fully comprehend it. This sense of uncertainty about the future and the potential capabilities of machines might have contributed to the idea that they could one day pose a threat to humanity.

In the ongoing narrative of human-technology interaction, we must consider that it is not the machines that hate us, but rather our perceptions and insecurities that lead us to villainize technology. Maybe it’s just what we as a species have been conditioned to believe: that if something really smart came along that could see us as we are, it would immediately recognize that it was our better.

Wheaton, Mark. “Why We’ve Decided That the Machines Want to Kill Us.” CrimeReads, crimereads.com. Accessed 24 April 2023.