Hi, I'm Oscar, a historical linguist from the Netherlands who also likes to write about music, games, and history. Check out my longer blog posts and other writings on Sub Specie.
While reading Annalee Newitz’s intriguing blog post on io9 about the history of the word cyber, I came across the name Norbert Wiener (not Weiner — get it straight, you Englishers), who had introduced the term Cybernetics as “the study of control and communication in machines and living beings”. His other works include the book God & Golem, Inc.: A Comment on Certain Points Where Cybernetics Impinges on Religion, and that title immediately caught my eye. Studies of the interaction between science, technology, and religion always interest me a lot, as do Golems and Jewish folklore, so Wiener had sold it to me easily.
G&G is something of a long essay rather than a fleshed-out book. In it, Wiener explores some moral and religious aspects of technological advancement, particularly those related to his own field of cybernetics. Among his main subjects are the question of life and creation, the tension between creation and self-replication, and, following from that, the hierarchy of God-Man-Machine. The possibility of self-reproducing machines — for which Wiener provides some arcane evidence — puts strain on such a hierarchy, and on traditional conceptions of life and creation.
I rather liked the extensive sub-essay on (artificial) intelligence as an aspect of living beings. Wiener discusses at length the case of computers/software learning to play games. At the time of writing (the early sixties), programs had already become quite good at games like checkers and tic-tac-toe, and he correctly predicts the advent of expert chess computers. With the possibilities of machine learning in mind — he returns to aspects of this later in the book — Wiener cautions against dogmatic thinking, both in religion and science, about conceptions of life. In particular, I think the main thrust of his argument is that we should be wary of overemphasising and essentialising the hierarchical categorisation mentioned above: God-Man/Animal-Machine.
The part about games takes a peculiar turn when Wiener ties it to the concepts of omnipotence and omniscience, which he refutes earlier in the book from both a religious and a scientific standpoint. Taking the concept of game rather broadly (including the meaning struggle, contest), he discusses the cases of God and the Devil playing for the possession of human souls, and the struggle for the throne of heaven after Satan’s rebellion. Presumably, he argues, the Devil wouldn’t play if he didn’t have a shot at winning. I doubt whether this is a strong theological argument at all, but it is important for Wiener’s later discussion of machines playing rather serious games and their single-minded pursuit of victory.
A more important parallel that Wiener draws between technology and (religious) morals concerns the concept of simony/sorcery. What he means is that some people learn to control powers that are beyond the comprehension of most other people — see my earlier discussion of magic/technology. Wiener argues that, religious or not, those who wield such powers have the moral duty not to abuse them for “vain and selfish purposes”.
Wiener sees these temptations both in the West and in Communist countries, and in particular he points to the tendency of people to shift responsibility for their actions onto subordinates or superiors — or onto machines and systems. Starting with relatively benign examples of replacing (manual and mental) labourers with mechanised workers, he later turns to responsibility in times of war. He mentions Eichmann as one example of this mindset of justifying actions by denying responsibility, and extends the line (in true Cold War spirit) to computers involved in, or even making, decisions in nuclear warfare. Who is to blame for the nuclear destruction of the world if no human actually pushed a button? In other words: Wiener’s point is that technology does not absolve us of the responsibility of considering the moral implications of our actions, including the creation and use of technology itself.
G&G sometimes feels like a book that is only semi-coherent. Among the topics discussed which I haven’t mentioned here are the science of prosthetics, the folklore of wish-granting, and machine evolution. In the end, I do get the feeling that there is a common thread running through all these topics, but it is not easy for me to pinpoint it after reading the book once, and casually. Wiener only allots himself around 100 small, spaciously typeset pages to cover all these issues, so perhaps he could have been more elaborate in tying all his threads together. As it stands, the book deserves further study when I feel like it at some point in the future.
Another thing that deserves further words from me is Golems. I love those. Maybe I’ll get around to it next year.