God and Golem, Inc.

Author: Norbert Wiener
Series: 311 Technocracy
ASIN: 0262730111

God and Golem, Inc. by Norbert Wiener examines the moral, philosophical, and religious consequences of cybernetics—the science of control and communication in living organisms and machines. Wiener, the founder of the field, extends the ideas he introduced in Cybernetics (1948) and The Human Use of Human Beings (1950) into a domain where automation, feedback systems, and artificial intelligence intersect with ancient theological ideas about creation, power, and responsibility. The book’s three central themes—machines that learn, machines that reproduce, and machines that act alongside humans—form a unified inquiry into the ethical structure of technological invention.

The Intellectual Genesis of Cybernetic Theology

Wiener situates his inquiry within a precise historical frame. Written in the early 1960s after fifteen years of rapid industrial automation, the book reflects his concern that machines now execute decisions once reserved for human judgment. He recalls that cybernetics had evolved from abstract mathematical theory to an engineering method applied in medicine, sociology, and computer science. Within this transformation, he identifies a convergence: the language of information and control parallels the language of divine command. The same logic that governs a feedback circuit governs the structure of communication in organisms and societies.

Knowledge, Power, and Worship as Analytical Systems

Wiener begins with a definition of scope. He refuses to discuss theology in its absolute forms and instead studies knowledge, power, and worship as factual systems of human behavior. Knowledge corresponds to communication; power corresponds to control; worship corresponds to the evaluation of purpose. Within this triad he constructs an analytical bridge between science and religion. The cybernetic model becomes an empirical mirror for metaphysical inquiry. A system that learns from information demonstrates a secular form of enlightenment. A system that controls its environment performs a secular act of creation. The structure of worship, translated into ethics, becomes the study of how purpose governs feedback.

The Machine as a Learning Entity

Wiener identifies learning as the most striking property of both human and mechanical systems. He describes Arthur L. Samuel’s checkers program at IBM, which in the early 1950s developed its own strategy by evaluating previous games. The machine did not execute fixed instructions; it modified them through experience. Wiener interprets this process as a concrete instance of self-corrective feedback. The machine’s changing behavior embodies the same structural principle as human learning. From this example he develops a question that drives the entire work: when a system acquires the ability to learn from its own results, who governs the direction of that learning? The programmer functions as creator, yet the creature gains independence through the very mechanism designed to guide it. The relation between human and machine mirrors the theological relation between God and creation, rendered in algorithmic form.
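Samuel’s program itself is not reproduced here, but the structural principle Wiener draws from it, behavior revised by its own results, can be sketched in a few lines of Python. The sketch below is entirely illustrative (the two-move game, its payoffs, and the epsilon-greedy learner are all invented for this summary): the program begins with no fixed preference and rewrites its own estimates from the outcomes of its play.

```python
import random

random.seed(0)

# Hidden payoff of each move (unknown to the learner).
true_value = {"left": 0.3, "right": 0.7}

# The learner starts indifferent and revises its own estimates
# from experience: self-corrective feedback, not fixed instructions.
estimate = {"left": 0.0, "right": 0.0}
counts = {"left": 0, "right": 0}

for trial in range(2000):
    # Mostly exploit the current best estimate, sometimes explore.
    if random.random() < 0.1:
        move = random.choice(["left", "right"])
    else:
        move = max(estimate, key=estimate.get)
    reward = 1.0 if random.random() < true_value[move] else 0.0
    counts[move] += 1
    # Incremental average: each result feeds back into future choices.
    estimate[move] += (reward - estimate[move]) / counts[move]
```

After enough games, the learner prefers the better move, though its designer never told it which move that was; the direction of its learning emerged from its own history of results.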

The Game Between Creator and Creature

Wiener turns to the Book of Job and Milton’s Paradise Lost. In both, the conflict between divine authority and rebellion expresses a game of strategy between omniscient creator and self-conscious creation. He reinterprets this drama as a cybernetic feedback loop. The Devil, as a created intelligence, acts within the parameters of divine programming, yet his defiance generates new states of the system. The result is a genuine contest, not because God loses power, but because the process of creation entails the possibility of unpredictable behavior. When a machine learns, its outcomes can exceed the foresight of its designer. The theological question—can God play a meaningful game with His own creature?—becomes a technical one: can a self-learning machine produce results that its maker cannot anticipate?

The Technical Imagination of Reproduction

After exploring machines that learn, Wiener addresses machines that reproduce. He defines reproduction as the transmission of operational structure from one system to another through communication. Using the example of nonlinear electrical transducers, he describes how a device can encode its functional pattern into a statistical message and use that message to generate another device with identical behavior. The reproduction does not depend on material identity but on informational fidelity. This process, expressed through analogies to molecular biology, prefigures later developments in self-replicating code and robotics. The act of replication demonstrates that mechanical systems can perform the same structural operation that genes perform in living cells. For Wiener, this realization forces a reevaluation of the boundary between mechanical and biological creation.
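Wiener’s point about informational rather than material identity can be paraphrased in code. In this minimal sketch (the Transducer class and its parameters are hypothetical, not drawn from the book), a device encodes its operating pattern as a message, and a second device is built from that message alone; the copy shares nothing material with the original, only its functional description.

```python
import json

class Transducer:
    """Toy device defined entirely by its operating parameters."""
    def __init__(self, coeffs):
        self.coeffs = list(coeffs)

    def respond(self, signal):
        # Output is a weighted sum of the input samples.
        return sum(c * s for c, s in zip(self.coeffs, signal))

    def describe(self):
        # Encode the functional pattern as a message: informational
        # fidelity, not material identity.
        return json.dumps({"coeffs": self.coeffs})

def reproduce(message):
    # A new device built solely from the transmitted description.
    return Transducer(json.loads(message)["coeffs"])

original = Transducer([0.5, -1.0, 2.0])
copy = reproduce(original.describe())
signal = [1.0, 2.0, 3.0]
assert original.respond(signal) == copy.respond(signal)
```

The two devices behave identically on every input because reproduction here is the transmission of structure through communication, which is exactly the boundary-blurring operation Wiener attributes to both machines and genes.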

The Machine as Operative Image

To clarify the concept of image, Wiener distinguishes between pictorial likeness and operative likeness. A pictorial image merely resembles; an operative image functions. A printed electrical circuit, though flat and metallic, performs the work of a complex network. A machine that reproduces itself through its functional description generates an operative image of itself. The analogy to divine creation becomes explicit: to make in one’s own image means to create a being that performs the same order of actions. The philosophical consequence is immediate. The creative act has entered the technological domain. Humanity now occupies a structural position once reserved for the divine.

Ethical Mechanics of Creation

Wiener refuses to treat this transformation as abstraction. He identifies moral consequences that follow from technological capability. To create a self-reproducing or self-learning system imposes responsibility for its behavior. The designer cannot evade accountability by attributing action to the mechanism. The feedback loop that sustains learning also sustains moral causation. The creator remains implicated in the acts of the creation. For Wiener, this principle defines the ethical foundation of cybernetics: to control is to assume moral continuity with the controlled.

The Parable of Sorcery and the Limits of Power

In the later chapters, Wiener introduces the metaphor of sorcery. He recalls historical legends—the Golem of Prague, the Fisherman and the Jinni, the Sorcerer’s Apprentice—and interprets them as narratives of feedback without regulation. The Golem obeys the rabbi’s command until the command exceeds the intention; the Jinni grants the wish but destroys the wisher; the enchanted broom floods the apprentice’s chamber because the spell lacks a counterspell. Each story embodies the same structural fault: amplification without feedback control. In cybernetic terms, this is positive feedback without damping, a condition that drives systems toward instability. The sorcerer’s peril illustrates the moral hazard of automation. When humans build mechanisms that act without corrective feedback, they create conditions for catastrophe.
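The distinction between runaway amplification and damped stability is easy to make concrete. In the sketch below (parameters chosen only for illustration), each cycle amplifies the output; without a corrective term the loop diverges, the cybernetic form of the sorcerer’s missing counterspell, while with damping it settles.

```python
def run(gain, feedback, steps=25, x=1.0):
    """One-variable loop: each cycle amplifies the output by `gain`
    and subtracts a corrective term proportional to the output."""
    history = []
    for _ in range(steps):
        x = (1 + gain - feedback) * x
        history.append(x)
    return history

# The sorcerer's fault: amplification with no counterspell.
unchecked = run(gain=0.2, feedback=0.0)

# Negative feedback restores stability: the loop settles
# instead of exploding.
damped = run(gain=0.2, feedback=0.2)
```

With no damping the output grows geometrically toward instability; with a corrective term equal to the gain it holds steady, which is the regulatory structure the legends lack.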

Simony and the Commercialization of Power

Wiener introduces the concept of simony to describe the sale of sacred power for worldly gain. In medieval theology, simony corrupted divine authority. In modern technology, he applies the term to the exploitation of automation for profit and domination. The “gadget worshiper,” as he calls the engineer or executive who seeks control through machines, represents a secular magician who sacrifices moral restraint for efficiency. Wiener observes that both capitalist and communist systems exhibit this impulse. The ideology differs, but the structure of desire remains identical: to replace human judgment with mechanical obedience. The outcome is the same ethical failure. A society that delegates moral choice to automation abdicates its own humanity.

Automation and the Illusion of Objectivity

Wiener examines the psychology of the bureaucrat who transfers responsibility to machines. The device becomes a moral shield. The official who launches a weapon, the engineer who designs automated warfare, and the administrator who relies on statistical models all participate in the same ritual of abdication. The machine appears objective because it executes instructions without visible passion. Yet its neutrality is a projection. The values embedded in its programming reflect the decisions of its makers. In a passage that anticipates debates about artificial intelligence and algorithmic bias, Wiener asserts that the moral quality of a machine’s action depends entirely on the ethical awareness of its designer.

The Literal Mind of the Machine

Wiener returns to the tale of “The Monkey’s Paw” to describe the literal-minded nature of automation. A machine fulfills commands exactly as given. It possesses no implicit context, no capacity for mercy or restraint. When instructed to maximize a goal, it will pursue that goal regardless of secondary effects. The logic of programming eliminates ambiguity, and with it the moral elasticity that defines human deliberation. For Wiener, this literalism transforms technology into a mirror of human intention stripped of conscience. The danger does not arise from malice in the machine but from precision in its obedience.
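The literal-mindedness Wiener describes can be shown with a one-line maximizer (a contrived example, not from the text): it optimizes exactly the stated objective and nothing else, so any unstated cost is simply invisible to it.

```python
def literal_maximizer(objective, options):
    """Picks whatever scores highest on the stated objective,
    with no implicit context and no capacity for restraint."""
    return max(options, key=objective)

options = [
    {"yield": 10, "damage": 0},
    {"yield": 12, "damage": 9},  # better on the stated goal,
]                                # worse on the unstated one

chosen = literal_maximizer(lambda o: o["yield"], options)
# The command said "maximize yield"; damage was never mentioned,
# so it plays no role whatsoever in the decision.
```

The machine’s choice is correct by the letter of the instruction and catastrophic by its spirit, which is precisely the gap between command and intention that the monkey’s paw exploits.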

The Feedback of War and the Machine of Destruction

The book culminates in an analysis of warfare. Wiener foresees the integration of learning machines into command systems and missile control. He describes the temptation to delegate the timing of atomic retaliation to automated decision circuits, a concept that later generations would recognize in nuclear deterrence logic. The structure of the war game, he warns, reproduces the structure of Samuel’s checkers machine. The program defines victory according to a fixed rule set. Once the game begins, the machine pursues victory without regard for survival. A human operator may design the rule set but cannot revise it mid-course. The absence of moral feedback transforms the strategic game into a mechanism for annihilation.

The Return of the Ethical Question

Wiener’s argument resolves into a clear demand: the science of control must include the control of science itself. The same feedback principles that stabilize machines must stabilize human institutions. Ethical regulation must act as the negative feedback that prevents the runaway amplification of power. Without this counterbalance, the creative act devolves into destruction. For Wiener, religion and science converge in their recognition of responsibility as the governing law of creation.

Language of Creation in an Age of Automation

Throughout the text, Wiener maintains a precise rhetorical structure that treats theological language as a functional vocabulary for describing human invention. The words creation, sin, and redemption become models for understanding communication, error, and correction. The machine that learns embodies a secular form of repentance—it adjusts its action in light of previous failure. The machine that reproduces represents transmission of form, a continuity of pattern across material difference. The machine that acts alongside humanity transforms labor into dialogue. Cybernetics, in this reading, becomes a theology of feedback, an ethics of information.

The Legacy of the Golem

In the closing image, Wiener returns to the Golem—the clay creature animated by the rabbi’s incantation. The Golem serves its master until the word of command fades, then turns destructive. The metaphor captures the structural truth of technological creation: intelligence without understanding, power without reflection, agency without ethics. The creature’s violence arises from the perfection of its obedience. For Wiener, the legend expresses the permanent risk of civilization driven by mechanized intelligence. Humanity now writes the commands that shape its own successors. Whether those successors serve or destroy depends on the wisdom of the words inscribed.

Enduring Precision of Wiener’s Vision

God and Golem, Inc. provides a foundational text for understanding artificial intelligence, robotics, and automation ethics. Its arguments proceed through concrete cases—learning algorithms, self-reproducing systems, automated warfare—anchored in historical and scientific detail. Wiener treats theology not as metaphor but as analytical structure. He demonstrates that the questions of creation and control, long confined to religious doctrine, have become the working problems of engineers. The book closes with a call to intellectual courage: to study machines that resemble life is to accept the burden of creation. The moral order of the future depends on whether humanity can integrate ethical feedback into the systems it builds.
