Turing's Cathedral: The Origins of the Digital Universe
Author: George Dyson · Language: English · Paperback – 27 Feb 2013
In 1945 a small group of brilliant engineers and mathematicians gathered at the Institute for Advanced Study in Princeton, determined to build a computer that would make Alan Turing's theory of a 'universal machine' a reality. Led by the polymath émigré John von Neumann, they created the numerical framework that underpins almost all modern computing - and ensured that the world would never be the same again.
George Dyson is a historian of technology whose interests include the development (and redevelopment) of the Aleut kayak. He is the author of Baidarka; Project Orion; and Darwin Among the Machines.
'Unusual, wonderful, visionary' Francis Spufford, Guardian
'Fascinating . . . the story Dyson tells is intensely human . . . a gripping account of ideas and invention' Jenny Uglow
'Glorious . . . as much a story of the personalities involved as of the discoveries they made, and you do not need any knowledge of computers or mathematics to enjoy the ride . . . a ripping yarn' John Gribbin, Literary Review
All formats and editions | Price | Express |
---|---|---|
Paperback (2) | 75.03 lei, 23-34 days | +29.62 lei, 6-10 days |
Penguin Books – 27 Feb 2013 | 75.03 lei, 23-34 days | +29.62 lei, 6-10 days |
Vintage Books – 10 Dec 2012 | 118.63 lei, 17-23 days | +9.64 lei, 6-10 days |
Price: 75.03 lei
Old price: 89.58 lei
-16% New
Express points: 113
Estimated price in other currencies:
14.37€ • 15.54$ • 11.97£
In stock
Economy delivery: 22 November - 03 December
Express delivery: 05-09 November for 39.61 lei
Orders by phone: 021 569.72.76
Specifications
ISBN-13: 9780141015903
ISBN-10: 014101590X
Pages: 432
Dimensions: 129 x 198 x 26 mm
Weight: 0.37 kg
Publisher: Penguin Books
Series: Penguin
Place of publication: London, United Kingdom
Biographical note
George Dyson is a historian of technology whose interests include the development (and redevelopment) of the Aleut kayak. He is the author of Baidarka; Project Orion; and Darwin Among the Machines.
Reviews
Riveting . . . conveys the electrifying sense of possibility that the first computers unleashed . . . a page-turner
Brings to life a myriad cast of extraordinary characters, each of whom contributed to ushering in today's digital age
An engrossing and well-researched book that recounts an important chapter in the history of 20th-century computing
Excerpt
Preface
POINT SOURCE SOLUTION
I am thinking about something much more important than bombs. I am thinking about computers.
—John von Neumann, 1946
There are two kinds of creation myths: those where life arises out of the mud, and those where life falls from the sky. In this creation myth, computers arose from the mud, and code fell from the sky.
In late 1945, at the Institute for Advanced Study in Princeton, New Jersey, Hungarian American mathematician John von Neumann gathered a small group of engineers to begin designing, building, and programming an electronic digital computer, with five kilobytes of storage, whose attention could be switched in 24 microseconds from one memory location to the next. The entire digital universe can be traced directly to this 32-by-32-by-40-bit nucleus: less memory than is allocated to displaying a single icon on a computer screen today.
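For readers who want to see how the quoted figures fit together, here is a quick arithmetic check, added as an illustration rather than taken from Dyson's text: a 32-by-32 matrix of 40-bit words works out to exactly five kilobytes.

```python
# Arithmetic check of the figures quoted above (illustrative, not from the book).
words = 32 * 32                       # 1,024 storage locations in a 32 x 32 matrix
bits_per_word = 40                    # each location holds one 40-bit word
total_bits = words * bits_per_word    # 40,960 bits
total_bytes = total_bits // 8         # 5,120 bytes
print(total_bits, total_bytes, total_bytes / 1024)  # 40960 5120 5.0 -> five kilobytes
```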
Von Neumann’s project was the physical realization of Alan Turing’s Universal Machine, a theoretical construct invented in 1936. It was not the first computer. It was not even the second or third computer. It was, however, among the first computers to make full use of a high-speed random-access storage matrix, and became the machine whose coding was most widely replicated and whose logical architecture was most widely reproduced. The stored-program computer, as conceived by Alan Turing and delivered by John von Neumann, broke the distinction between numbers that mean things and numbers that do things. Our universe would never be the same.
Working outside the bounds of industry, breaking the rules of academia, and relying largely on the U.S. government for support, a dozen engineers in their twenties and thirties designed and built von Neumann’s computer for less than $1 million in under five years. “He was in the right place at the right time with the right connections with the right idea,” remembers Willis Ware, fourth to be hired to join the engineering team, “setting aside the hassle that will probably never be resolved as to whose ideas they really were.”
As World War II drew to a close, the scientists who had built the atomic bomb at Los Alamos wondered, “What’s next?” Some, including Richard Feynman, vowed never to have anything to do with nuclear weapons or military secrecy again. Others, including Edward Teller and John von Neumann, were eager to develop more advanced nuclear weapons, especially the “Super,” or hydrogen bomb. Just before dawn on the morning of July 16, 1945, the New Mexico desert was illuminated by an explosion “brighter than a thousand suns.” Eight and a half years later, an explosion one thousand times more powerful illuminated the skies over Bikini Atoll. The race to build the hydrogen bomb was accelerated by von Neumann’s desire to build a computer, and the push to build von Neumann’s computer was accelerated by the race to build a hydrogen bomb.
Computers were essential to the initiation of nuclear explosions, and to understanding what happens next. In “Point Source Solution,” a 1947 Los Alamos report on the shock waves produced by nuclear explosions, von Neumann explained that “for very violent explosions . . . it may be justified to treat the original, central, high pressure area as a point.” This approximated the physical reality of a nuclear explosion closely enough to enable some of the first useful predictions of weapons effects.
Numerical simulation of chain reactions within computers initiated a chain reaction among computers, with machines and codes proliferating as explosively as the phenomena they were designed to help us understand. It is no coincidence that the most destructive and the most constructive of human inventions appeared at exactly the same time. Only the collective intelligence of computers could save us from the destructive powers of the weapons they had allowed us to invent.
Turing’s model of universal computation was one-dimensional: a string of symbols encoded on a tape. Von Neumann’s implementation of Turing’s model was two-dimensional: the address matrix underlying all computers in use today. The landscape is now three-dimensional, yet the entire Internet can still be viewed as a common tape shared by a multitude of Turing’s Universal Machines.
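The difference in access pattern can be put in a few lines of code. The sketch below is purely illustrative and not from the book: a tape is reached by stepping the head cell by cell, while an addressed matrix is reached in one step by naming a row and a column.

```python
# Illustrative sketch only (not from the book): one-dimensional tape access
# versus two-dimensional, addressed (random-access) storage.

# Turing's picture: a tape of symbols, read or written one cell at a time
# as the head steps left or right.
tape = list("0110100")
head = 0
while head < 5:                  # reaching a distant cell costs one step per cell
    head += 1
symbol = tape[head]

# Von Neumann's picture: a 32 x 32 matrix of words, any one of which is
# reachable in a single step by giving its address.
memory = [[0] * 32 for _ in range(32)]
row, col = 7, 19                 # arbitrary address, chosen for illustration
memory[row][col] = 0b1010101     # write a word directly
word = memory[row][col]          # read it back just as directly
```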
Where does time fit in? Time in the digital universe and time in our universe are governed by entirely different clocks. In our universe, time is a continuum. In a digital universe, time (T) is a countable number of discrete, sequential steps. A digital universe is bounded at the beginning, when T = 0, and at the end, if T comes to a stop. Even in a perfectly deterministic universe, there is no consistent method to predict the ending in advance. To an observer in our universe, the digital universe appears to be speeding up. To an observer in the digital universe, our universe appears to be slowing down.
Universal codes and universal machines, introduced by Alan Turing in his “On Computable Numbers, with an Application to the Entscheidungsproblem” of 1936, have prospered to such an extent that Turing’s underlying interest in the “decision problem” is easily overlooked. In answering the Entscheidungsproblem, Turing proved that there is no systematic way to tell, by looking at a code, what that code will do. That’s what makes the digital universe so interesting, and that’s what brings us here.
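Turing's argument can be summarised in a short, self-referential sketch. The version below is a textbook-style illustration rather than anything from the book, and `would_halt` is a hypothetical routine that, as the argument itself shows, cannot actually be written.

```python
# Textbook-style sketch of Turing's argument (illustrative, not from the book).
# Suppose, for contradiction, that would_halt(program, data) could always
# report whether running program on data eventually stops.

def would_halt(program, data):
    """Hypothetical oracle: True if program(data) halts, False otherwise."""
    raise NotImplementedError("Turing showed no such routine can exist.")

def contrary(program):
    # Do the opposite of whatever the oracle predicts about a program
    # examining its own source.
    if would_halt(program, program):
        while True:              # predicted to halt, so loop forever
            pass
    return "halted"              # predicted to loop forever, so halt at once

# Feeding contrary its own source leaves would_halt wrong either way:
# if it answers "halts", contrary loops; if it answers "loops", contrary halts.
# Hence there is no systematic way to tell, by looking at a code, what it will do.
```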
It is impossible to predict where the digital universe is going, but it is possible to understand how it began. The origin of the first fully electronic random-access storage matrix, and the propagation of the codes that it engendered, is as close to a point source as any approximation can get.