Friday, December 19, 2014

George Dyson, Turing's Cathedral

This strange but ultimately rewarding book chronicles the building of the Institute for Advanced Study computer, conceived by John von Neumann and built by Julian Bigelow and a team of other engineers between 1946 and 1952. But that, honestly, is one of the smaller parts of the book. George Dyson has so much to say about the history and future of computers that his ideas keep bursting out of his narrative structure. His text wanders off on tangents, offers general thoughts that seem unrelated to the particular matter at hand, and loops back on itself so much that I wondered if he had intentionally modeled it on an iterative program. Dyson had particular trouble figuring out how to end his book, and I hit four or five spots in the last quarter that felt to me like conclusions, only to find more chapters of history, speculation and rumination beyond. I am glad I read it, but I wish it had had a better editor.

John von Neumann was one of the great geniuses of the 20th century, the creator of game theory and the co-inventor or midwife of the Monte Carlo method, the hydrogen bomb, and the digital computer. It's an impressive list, and yet it really doesn't do justice to what von Neumann achieved. He knew everybody, had a finger in everything, and even while crisscrossing the country to work on a dozen different military assignments during World War II he still found time to do groundbreaking mathematical work. So a book built around von Neumann has certain innate advantages, and Dyson makes good use of them. He follows von Neumann around America and Britain, giving us a bit of Los Alamos and the bomb, a bit of radar and antiaircraft fire control, a bit of code-breaking. Von Neumann worked directly with many of the age's other great geniuses -- Kurt Gödel, Alan Turing, Stanislaw Ulam, Edward Teller -- and we also get capsule biographies of these people, and accounts of their work insofar as it intersected with von Neumann or the IAS computer. There is also a fairly detailed account of the origins of the Institute for Advanced Study, a tour through philanthropy, the Rockefeller Foundation, various critical assaults on American higher education in the 1920s, and so on.

Dyson has a very clear and somewhat idiosyncratic view of how the digital computer came to be. For Dyson it all started with a challenge issued by the German mathematician David Hilbert: to establish mathematics as a system of logical deduction based on a limited set of axioms, as Euclid had tried to do with geometry. It was while trying to satisfy Hilbert's charge that Kurt Gödel came up with his famous Incompleteness Theorem, which proved that for any logical system complex enough to contain arithmetic:
  1. If the system is consistent, it cannot be complete. 
  2. The consistency of the axioms cannot be proven within the system.  
Gödel's proof involved creating a sort of matrix in which logical statements are coded by numbers, the "Gödel numbers." In this matrix, Dyson sees the germ of the digital world. When Alan Turing turned his attention to another question formulated by Hilbert, the "decision problem" (whether there is a mechanical procedure that can determine if a statement is provable, without actually proving it), he replaced the Gödel matrix with his "Turing Machine." And the Turing Machine, as you probably already know, is the logical model for the universal computer, a machine that can perform any calculation it is possible to perform.
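Dyson doesn't walk through the mechanics, but the flavor of the coding trick fits in a few lines of Python. This is my own toy sketch, not Gödel's actual 1931 construction: assign each symbol a number, then pack a whole formula into a single integer as a product of prime powers, which unique factorization makes reversible.

```python
# Toy sketch of Godel numbering (my illustration, not Godel's exact
# 1931 construction): give each symbol a code, then encode a formula
# as a product of prime powers. Unique factorization lets the formula
# be recovered, so statements about numbers become numbers themselves.

def first_primes(n):
    """Return the first n primes by trial division."""
    found = []
    candidate = 2
    while len(found) < n:
        if all(candidate % p != 0 for p in found):
            found.append(candidate)
        candidate += 1
    return found

# An invented mini-alphabet; the codes are arbitrary.
CODES = {"0": 1, "S": 2, "=": 3, "+": 4}

def godel_number(formula):
    """Encode symbol i as exponent CODES[symbol] on the i-th prime."""
    number = 1
    for p, sym in zip(first_primes(len(formula)), formula):
        number *= p ** CODES[sym]
    return number

# "S0=S0" (i.e. 1 = 1) becomes 2**2 * 3**1 * 5**3 * 7**2 * 11**1
print(godel_number("S0=S0"))  # 808500
```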

So to Dyson the key steps in computer science were taken by mathematicians before anybody had thought to try building an actual computer. Meanwhile, though, other developments were taking place that would contribute to the computer's eventual construction. War meant calculation. To create a firing table for an artillery piece, showing the gunners what angle to use for any given range and how to set the fuse, took months of hand calculation. As World War II got going, two programs in particular needed vast amounts of calculation: the American atomic bomb program and the British codebreaking program. These programs employed teams of "computers," that is, people who sat at desks and performed calculations by hand or with mechanical adding machines. To get useful results from teams of people operating by rote, mathematicians formulated ways of dividing complex problems into simple routines that could be done over and over again until the end of the process was reached, knowledge that would eventually be put to use writing programs for digital computers. As the war went on ever more powerful calculating machines were built, like the electromechanical "bombes" used at Bletchley Park to break the Enigma codes. Toward the end of the war the first digital computers were built, the ENIAC in the U.S. and Colossus in Britain.
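To get a feel for what those roomfuls of human computers were doing, here is a sketch of a firing-table problem reduced to a rote routine: step the shell's flight forward a fraction of a second at a time, over and over, once per table entry. The drag model here is a made-up simplification, nothing like the real ballistics equations used for wartime tables.

```python
import math

# One firing-table entry: integrate a shell's flight in small time
# steps until it comes back down, the same rote arithmetic repeated
# thousands of times. Drag model and constants are illustrative only.

def shell_range(elevation_deg, muzzle_velocity=500.0, drag=1e-4, dt=0.01):
    """Return horizontal range in meters by stepping the motion forward."""
    theta = math.radians(elevation_deg)
    vx = muzzle_velocity * math.cos(theta)
    vy = muzzle_velocity * math.sin(theta)
    x = y = 0.0
    while y >= 0.0:
        speed = math.hypot(vx, vy)
        vx -= drag * speed * vx * dt            # drag opposes motion
        vy -= (9.81 + drag * speed * vy) * dt   # gravity plus drag
        x += vx * dt
        y += vy * dt
    return x

# One table row per elevation angle: the same routine, again and again.
for deg in range(15, 76, 15):
    print(f"{deg:3d} deg -> {shell_range(deg):8.0f} m")
```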

These machines, though impressive, were not Turing machines -- they did not have the theoretical power to perform arbitrary calculations, just the particular ones they were built to do. In 1945 both von Neumann and Turing himself embarked on programs to build actual, digital Turing machines. One characteristic such a machine must have is the ability to treat its own instructions as data, even to modify them. Therefore, those instructions cannot be wired into the hardware; they must be stored in the machine's digital memory. Software was born.
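Here is a toy illustration of what that buys you (an invented instruction set, nothing like the IAS machine's actual order code). Because the program sits in the same memory as its data, one instruction can rewrite another's address field, which is exactly how early stored-program code stepped through arrays before index registers existed.

```python
# Toy stored-program machine (invented instruction set, not the IAS
# machine's order code). Code and data share one memory, so the program
# can rewrite its own instructions: here, bumping the address field of
# the ADD in cell 0 to walk through the data in cells 4-7.

def run(mem):
    acc, pc = 0, 0
    while True:
        op, a = mem[pc]
        pc += 1
        if op == "ADD":
            acc += mem[a]                # add the contents of cell a
        elif op == "BUMP":               # self-modification: read the
            name, addr = mem[a]          # instruction in cell a back as
            mem[a] = (name, addr + 1)    # data and advance its address
        elif op == "JLT":                # jump back to 0 while the ADD's
            if mem[0][1] < a:            # address field is still below a
                pc = 0
        elif op == "HALT":
            return acc

mem = [
    ("ADD", 4),      # 0: acc += cell 4 (this address gets rewritten)
    ("BUMP", 0),     # 1: turn "ADD 4" into "ADD 5", then "ADD 6", ...
    ("JLT", 8),      # 2: loop while the ADD points below cell 8
    ("HALT", 0),     # 3: done
    10, 20, 30, 40,  # 4-7: data, living in the same memory as the code
]
print(run(mem))  # 100
```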

Dyson absolutely does not believe that Turing or von Neumann or anyone else invented the digital computer. The computer in his telling sprang from the intersection of Gödel and Turing's mathematics with the ever-increasing sophistication of electronic adding machines and the wartime need for computation on a heroic scale. Once the math and the hardware were available, people all over the place started imagining computers, including one random guy in Iowa who still sometimes shows up in online articles as the computer's inventor. But computers were organized in various ways, and the particular architecture we mainly use today goes back to the one von Neumann defined for the IAS machine. (The weak point in the architecture, the single channel between processor and memory, is still known as the von Neumann bottleneck.) Our computers have gotten a million times faster, but they still rely on logical schemes from 1945, a fact remarked on by several computer scientists interviewed by Dyson. The von Neumann architecture is sort of like the internal combustion engine, something that works well enough that we keep using it even though it has grievous flaws and something else would probably work better.

The real purpose of the IAS computer, to the people who paid for it, was to do calculations for the hydrogen bomb. Of course nobody could admit this publicly. Therefore von Neumann was always looking for other ways to use the machine, things it could do that would be reported to the press and thereby distract attention from its military applications. One was weather prediction. People had tried numerical weather prediction before, but it took a month of hand calculation to work through the math of a single day's forecast, by which time the forecast was not much use. It was only with digital computers that weather prediction could be put on any sort of scientific basis, and over the past 70 years we have seen enormous progress for forecasts up to 7 days out. Unfortunately we have also discovered that detailed forecasts are pretty much impossible any further out than that, but sometimes that's how science goes.
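Dyson doesn't dwell on why that ceiling exists, but the standard illustration (Edward Lorenz's 1963 toy convection model, not something from this book) fits in a few lines: two runs that start a millionth apart track each other for a while, then go their completely separate ways, which is why more computing power alone will never buy a detailed 30-day forecast.

```python
# Sensitivity to initial conditions, the reason detailed long-range
# forecasts fail. Edward Lorenz's toy convection model (not from
# Dyson's book): two runs differing by one part in a million in x
# eventually diverge completely. Crude Euler stepping, fine for a demo.

def step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance the Lorenz system by one Euler step."""
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

a = (1.0, 1.0, 1.0)
b = (1.000001, 1.0, 1.0)   # a millionth of a difference in "measurement"
for n in range(3001):
    if n % 500 == 0:
        print(f"t={n * 0.01:5.1f}   x_a={a[0]:9.3f}   x_b={b[0]:9.3f}")
    a = step(*a)
    b = step(*b)
```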

Dyson spends what felt to me like an inordinate amount of time on the work of Nils Aall Barricelli, who hung around the Institute from 1953 to 1956 making digital models of the origin and early evolution of life. Barricelli comes across as a half-mad prophet of a coming digital age, when life will make the transition from biological to digital forms. Dyson draws portentous parallels between the computer work of Barricelli and the structure of DNA, unraveled around the same time: as the code of life came to be understood, we were also developing living codes. It is along these lines that Turing's Cathedral gets really wild. Maybe, says Dyson, certain computer codes are already alive, already evolving, with the most successful ones using us to create millions of copies of themselves all over the world. Or maybe biological and digital life are merging, and some biological organisms will pass through a developmental stage in digital form (their DNA scanned for flaws and repaired) before emerging as bodies; maybe the minds of programmers are just a stage in the lives of digital programs.
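For the curious, the core of Barricelli's scheme can be caricatured in code. This is a loose sketch based on published descriptions of his "numerical symbioorganisms" (his actual 1953 rules were more elaborate): numbers live on a ring of cells, each number copies itself that many cells away every generation, and collisions between different numbers are fatal, so only mutually compatible patterns survive and spread.

```python
import random

# Loose sketch of a Barricelli-style universe (his actual 1953 rules
# were more elaborate): integers on a ring of cells. Each generation a
# number n stays put and also copies itself n cells away; when two
# different numbers land on the same cell, both die.

SIZE = 36

def generation(cells):
    new = [0] * SIZE
    for i, n in enumerate(cells):
        if n == 0:
            continue
        for target in (i, (i + n) % SIZE):
            if new[target] in (0, n):
                new[target] = n    # empty cell or same species: copy
            else:
                new[target] = 0    # collision between species: death
    return new

random.seed(4)
cells = [random.choice([0, 0, 0, 1, 2, 3, -2]) for _ in range(SIZE)]
for _ in range(10):
    print("".join(f"{n:3d}" if n else "  ." for n in cells))
    cells = generation(cells)
```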

Turing's Cathedral is many things: history, biography, technical manual, thought experiment. It delves into both the particular social world of the Institute for Advanced Study in the 1940s and the possible future world of living machines. It is weird and occasionally annoying. But it is generally fascinating, and if you have any interest in these things I recommend it highly.
