The Marginalian

From Francis Bacon to Hobbes to Turing: George Dyson on the History of Bits

“It is possible to invent a single machine which can be used to compute any computable sequence,” proclaimed twenty-four-year-old Alan Turing in 1936.

The digital text you are reading on your screen, which I wrote on my keyboard, flows through the underbelly of the Internet to transmit a set of ideas from my mind to yours. It all happens so fluidly, so familiarly, that we take it for granted. But it is the product of remarkable feats of technology, ignited by the work of a small group of men and women, spearheaded by John von Neumann at the Institute for Advanced Study in Princeton, New Jersey. Legendary science historian George Dyson traces the history and legacy of this pioneering work in his highly anticipated new book, Turing’s Cathedral: The Origins of the Digital Universe, based on his 2005 essay of the same name. It is the most comprehensive and ambitious account of that defining era yet, reconstructing the events and characters that coalesced into the dawn of computing and peering into the future to ask where the still-unfolding digital universe may be headed.

It begins with a captivating promise:

There are two kinds of creation myths: those where life arises out of the mud, and those where life falls from the sky. In this creation myth, computers arose from the mud, and code fell from the sky.

Art from Trees at Night by Art Young, 1926.

The book is absolutely absorbing in its entirety, but this particular excerpt on the history of bits beautifully illustrates Dyson’s gift for contextualizing the familiar with the infinitely fascinating:

A digital universe — whether 5 kilobytes or the entire Internet — consists of two species of bits: differences in space, and differences in time. Digital computers translate between these two forms of information — structure and sequence — according to definite rules. Bits that are embodied as structure (varying in space, invariant across time) we perceive as memory; and bits that are embodied as sequence (varying in time, invariant across space) we perceive as code. Gates are the intersections where bits span both worlds at the moments of transition from one instant to the next.
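To make the structure/sequence distinction concrete, here is a minimal sketch of a toy stored-program machine in Python. The four-instruction set is hypothetical, invented purely for illustration: the same list of integers reads as code when the program counter steps through it in time, and as memory when an instruction addresses it as data.

    # A toy stored-program machine: one list of integers serves as both
    # memory (structure) and code (sequence), depending on how it is read.
    def run(mem):
        pc = 0                                 # program counter: reads cells as a sequence (code)
        acc = 0                                # accumulator
        while True:
            op, arg = mem[pc], mem[pc + 1]
            if op == 0:                        # LOAD: read a cell as structure (data)
                acc = mem[arg]
            elif op == 1:                      # ADD: add the value stored at address arg
                acc += mem[arg]
            elif op == 2:                      # STORE: write the accumulator back into structure
                mem[arg] = acc
            elif op == 3:                      # HALT: stop and report
                return acc
            pc += 2                            # the sequence marches forward in time

    # Code and data share one memory: cells 0 through 7 are the program, 8 and 9 are data.
    memory = [0, 8,  1, 9,  2, 9,  3, 0,  2, 3]
    print(run(memory))                         # 5 (2 + 3, written back into cell 9)

Loosely speaking, the fetch at each step is the gate moment of the excerpt: a cell that was structure an instant ago becomes sequence the instant the program counter reaches it.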

The term bit (the contraction, by 40 bits, of “binary digit”) was coined by statistician John W. Tukey shortly after he joined von Neumann’s project in November of 1945. The existence of a fundamental unit of communicable information, representing a single distinction between two alternatives, was defined rigorously by information theorist Claude Shannon in his then-secret Mathematical Theory of Cryptography of 1945, expanded into his Mathematical Theory of Communication of 1948. “Any difference that makes a difference” is how cybernetician Gregory Bateson translated Shannon’s definition into informal terms. To a digital computer, the only difference that makes a difference is the difference between a zero and a one.
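Shannon’s definition is quantitative: the information in a choice between two alternatives peaks at exactly one bit when the alternatives are equally likely. A minimal sketch of the binary entropy formula, H(p) = −p·log₂(p) − (1−p)·log₂(1−p):

    from math import log2

    def entropy(p):
        """Shannon entropy of a binary choice with probability p, in bits."""
        if p in (0.0, 1.0):
            return 0.0                # no uncertainty, no information
        return -(p * log2(p) + (1 - p) * log2(1 - p))

    print(entropy(0.5))               # 1.0: a fair coin resolves exactly one bit
    print(entropy(0.9))               # ~0.47: a lopsided choice carries less than one bit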

That two symbols were sufficient for encoding all communication had been established by Francis Bacon in 1623. “The transposition of two Letters by five placeings will be sufficient for 32 Differences [and] by this Art a way is opened, whereby a man may expresse and signifie the intentions of his minde, at any distance of place, by objects… capable of a twofold difference onely,” he wrote, before giving examples of how such binary coding could be conveyed at the speed of paper, the speed of sound, or the speed of light.
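Bacon’s arithmetic is easy to check: five two-valued “placeings” yield 2⁵ = 32 distinct patterns, more than enough for an alphabet. Here is a minimal sketch of his biliteral scheme, with the caveat that it uses the modern 26-letter alphabet (Bacon’s own table folded i/j and u/v together):

    # Bacon's biliteral cipher: each letter becomes five symbols drawn
    # from a "twofold difference onely", here the letters 'a' and 'b'.
    def encode(text):
        out = []
        for ch in text.lower():
            if ch.isalpha():
                n = ord(ch) - ord('a')             # letter index, 0 through 25
                bits = format(n, '05b')            # five binary "placeings": 2^5 = 32 >= 26
                out.append(bits.replace('0', 'a').replace('1', 'b'))
        return ' '.join(out)

    print(encode("minde"))    # abbaa abaaa abbab aaabb aabaa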

That zero and one were sufficient for logic as well as arithmetic was established by Gottfried Wilhelm Leibniz in 1679, following the lead given by Thomas Hobbes in his Computation, or Logique of 1656. “By Ratiocination, I mean computation,” Hobbes had announced. “Now to compute, is either to collect the sum of many things that are added together, or to know what remains when one thing is taken out of another. Ratiocination, therefore is the same with Addition or Substraction; and if any man adde Multiplication and Division, I will not be against it, seeing… that all Ratiocination is comprehended in these two operations of the minde.” The new computer, for all its powers, was nothing more than a very fast adding machine, with a memory of 40,960 bits.
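Leibniz’s point can be demonstrated directly: binary addition falls out of nothing but AND, OR, and XOR, a one-bit full adder rippled across the bit positions. A minimal sketch:

    # Addition reduced to logic: a one-bit full adder built from AND, OR,
    # and XOR, rippled across the bit positions of two integers.
    def full_adder(a, b, carry):
        s = a ^ b ^ carry                          # sum bit
        carry_out = (a & b) | (carry & (a ^ b))    # carry bit
        return s, carry_out

    def add(x, y, width=8):
        result, carry = 0, 0
        for i in range(width):
            bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
            result |= bit << i
        return result

    print(add(2, 3))     # 5
    print(add(40, 2))    # 42

And the memory figure is no accident: 40,960 bits is 5 × 1,024 × 8, that is, the 5 kilobytes with which the excerpt’s opening sentence bounds the smallest digital universe.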


Published March 7, 2012

https://www.themarginalian.org/2012/03/07/george-dyson-turings-cathedral/
