Brain Pickings

Posts Tagged ‘history’

07 MARCH, 2012

Ralph Ellison on Fiction as a Voice for Injustice, a Chariot of Hope, and a Lens on the Human Condition

“…there must be possible a fiction which… can arrive at the truth about the human condition, here and now, with all the bright magic of the fairy tale.”

“Good fiction is made of what is real, and reality is difficult to come by,” beloved writer Ralph Ellison famously said. In 1953, upon winning the National Book Award for Invisible Man, the only novel he published in his lifetime, Ellison — a lover of language, a connoisseur of symbolism, and an astute observer of humanity — captured the heart of fiction in his remarkable acceptance speech:

If I were asked in all seriousness just what I considered to be the chief significance of Invisible Man as a fiction, I would reply: Its experimental attitude and its attempt to return to the mood of personal moral responsibility for democracy which typified the best of our nineteenth-century fiction.

When I examined the rather rigid concepts of reality which informed a number of the works which impressed me and to which I owed a great deal, I was forced to conclude that for me and for so many hundreds of thousands of Americans, reality was simply far more mysterious and uncertain, and at the same time more exciting, and still, despite its raw violence and capriciousness, more promising.

To see America with an awareness of its rich diversity and its almost magical fluidity and freedom I was forced to conceive of a novel unburdened by the narrow naturalism which has led after so many triumphs to the final and unrelieved despair which marks so much of our current fiction. I was to dream of a prose which was flexible, and swift as American change is swift, confronting the inequalities and brutalities of our society forthrightly, but yet thrusting forth its images of hope, human fraternity, and individual self-realization. A prose which would make use of the richness of our speech, the idiomatic expression, and the rhetorical flourishes from past periods which are still alive among us. Despite my personal failures there must be possible a fiction which, leaving sociology and case histories to the scientists, can arrive at the truth about the human condition, here and now, with all the bright magic of the fairy tale.

For more on the distinction between truth and fiction, see this omnibus of insights by some of modernity’s greatest writers.


07 MARCH, 2012

From Francis Bacon to Hobbes to Turing: George Dyson on the History of Bits

What Sir Francis Bacon has to do with the dawn of the Internet and the inner workings of your iPhone.

“It is possible to invent a single machine which can be used to compute any computable sequence,” proclaimed twenty-four-year-old Alan Turing in 1936.
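To make Turing’s claim concrete: such a machine is nothing but a finite table of rules for reading and writing symbols on a tape. Here is a minimal sketch in Python; the rule table (a toy machine that adds one to a binary number) is an illustrative invention, not Turing’s own construction.

    # A finite rule table reading and writing symbols on an unbounded tape.
    def run(tape, rules, state="start", head=0, max_steps=1000):
        cells = dict(enumerate(tape))          # the tape, sparse and unbounded
        for _ in range(max_steps):
            if state == "halt":
                break
            symbol = cells.get(head, " ")      # unwritten cells read as blanks
            write, move, state = rules[(state, symbol)]
            cells[head] = write
            head += 1 if move == "R" else -1
        return "".join(cells[i] for i in sorted(cells)).strip()

    # Toy rules: scan right to the end of the number, then carry ones leftward.
    rules = {
        ("start", "0"): ("0", "R", "start"),
        ("start", "1"): ("1", "R", "start"),
        ("start", " "): (" ", "L", "carry"),
        ("carry", "1"): ("0", "L", "carry"),
        ("carry", "0"): ("1", "R", "halt"),
        ("carry", " "): ("1", "R", "halt"),
    }

    print(run("1011", rules))                  # 1100 (binary 11 + 1 = 12)

The force of Turing’s universality is that the rule table is itself just more symbols: one fixed machine can read any such table from its tape and carry it out.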

The digital text you are reading on your screen, which I wrote on my keyboard, flows through the underbelly of the Internet to transmit a set of ideas from my mind to yours. It all happens so fluidly, so familiarly, that we take it for granted. But it is the product of remarkable feats of technology, ignited by the work of a small group of men and women, spearheaded by John von Neumann at the Institute for Advanced Study in Princeton, New Jersey. Legendary science historian George Dyson traces the history and legacy of this pioneering work in his highly anticipated new book, Turing’s Cathedral: The Origins of the Digital Universe, based on his 2005 essay of the same name. It is the most comprehensive and ambitious account of that defining era yet, reconstructing the events and characters that coalesced into the dawn of computing and peering into the future to examine where the still-unfolding digital universe may be headed.

It begins with a captivating promise:

There are two kinds of creation myths: those where life arises out of the mud, and those where life falls from the sky. In this creation myth, computers arose from the mud, and code fell from the sky.

The book is absolutely absorbing in its entirety, but this particular excerpt on the history of bits illustrates beautifully Dyson’s gift for contextualizing the familiar with the infinitely fascinating:

A digital universe — whether 5 kilobytes or the entire Internet — consists of two species of bits: differences in space, and differences in time. Digital computers translate between these two forms of information — structure and sequence — according to definite rules. Bits that are embodied as structure (varying in space, invariant across time) we perceive as memory; and bits that are embodied as sequence (varying in time, invariant across space) we perceive as code. Gates are the intersections where bits span both worlds at the moments of transition from one instant to the next.

The term bit (the contraction, by 40 bits, of “binary digit”) was coined by statistician John W. Tukey shortly after he joined von Neumann’s project in November of 1945. The existence of a fundamental unit of communicable information, representing a single distinction between two alternatives, was defined rigorously by information theorist Claude Shannon in his then-secret Mathematical Theory of Cryptography of 1945, expanded into his Mathematical Theory of Communication of 1948. “Any difference that makes a difference” is how cybernetician Gregory Bateson translated Shannon’s definition into informal terms. To a digital computer, the only difference that makes a difference is the difference between a zero and a one.

That two symbols were sufficient for encoding all communication had been established by Francis Bacon in 1623. ‘The transposition of two Letters by five placeings will be sufficient for 32 Differences [and] by this Art a way is opened, whereby a man may expresse and signifie the intentions of his minde, at any distance of place, by objects… capable of a twofold difference onely,’ he wrote, before giving examples of how such binary coding could be conveyed at the speed of paper, the speed of sound, or the speed of light.

That zero and one were sufficient for logic as well as arithmetic was established by Gottfried Wilhelm Leibniz in 1679, following the lead given by Thomas Hobbes in his Computation, or Logique of 1656. ‘By Ratiocination, I mean computation,’ Hobbes had announced. ‘Now to compute, is either to collect the sum of many things that are added together, or to know what remains when one thing is taken out of another. Ratiocination, therefore is the same with Addition or Substraction; and if any man adde Multiplication and Division, I will not be against it, seeing… that all Ratiocination is comprehended in these two operations of the minde.’ The new computer, for all its powers, was nothing more than a very fast adding machine, with a memory of 40,960 bits.
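Both of these historical footnotes translate neatly into a few lines of modern code. What follows are two hedged sketches in Python; the alphabet, symbol names, and helper functions are illustrative choices, not drawn from the book. First, Bacon’s observation: five places of a two-symbol alphabet yield 2^5 = 32 “Differences,” enough to give every letter its own code.

    # Bacon's scheme, sketched: each letter becomes a five-place
    # arrangement of two symbols, "a" and "b".
    ALPHABET = "abcdefghijklmnopqrstuvwxyz"

    def encode(letter):
        """Map a letter to its five-place two-symbol code."""
        bits = format(ALPHABET.index(letter), "05b")   # five binary places
        return bits.replace("0", "a").replace("1", "b")

    def decode(code):
        """Recover the letter from its five-place code."""
        return ALPHABET[int(code.replace("a", "0").replace("b", "1"), 2)]

    coded = [encode(c) for c in "minde"]
    print(" ".join(coded))                 # abbaa abaaa abbab aaabb aabaa
    assert "".join(decode(c) for c in coded) == "minde"

And Hobbes and Leibniz, in the same spirit: logic on zeros and ones (XOR, AND, OR) is enough to build arithmetic. A one-bit adder made of nothing but such gates, repeated across a word, is Dyson’s “very fast adding machine”; and the memory arithmetic checks out, since 1,024 words of 40 bits each comes to 40,960 bits, exactly the 5 kilobytes mentioned above.

    # Arithmetic from logic: a one-bit full adder built from gates alone.
    def full_adder(a, b, carry_in):
        """Add three bits; return (sum_bit, carry_out)."""
        sum_bit = a ^ b ^ carry_in                  # XOR gives the sum
        carry_out = (a & b) | (carry_in & (a ^ b))  # AND/OR give the carry
        return sum_bit, carry_out

    def add(x, y, width=40):                        # 40-bit words, like the IAS machine
        result, carry = 0, 0
        for i in range(width):
            s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
            result |= s << i
        return result

    assert add(19, 23) == 42
    assert 1024 * 40 == 40960 == 5 * 1024 * 8       # the machine's whole memory, in bits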

Turing’s Cathedral, though dense at times, is fascinating in its entirety.


01 MARCH, 2012

Memory Is Not a Recording Device: How Technology Shaped Our Metaphors for Remembering

Debunking the myth that memory is about “reliving” a permanent record stored in a filing cabinet.

Last year, Joshua Foer set out to hack his memory. Last week, Jonah Lehrer examined the promise and peril of memory erasure. But what, exactly, is memory — and how does it work? A new book by Alison Winter tackles precisely that — Memory: Fragments of a Modern History explores how the science and understanding of memory evolved over the past century, from early metaphors that likened it to a filing cabinet to the quasi-science of the prewar era’s “truth serums” to the psychology of false confessions and the latest neuroscience breakthroughs on how remembering works.

One particularly fascinating aspect Winter examines is the entwining of technology, media, and cultural accounts of memory. The book, Winter points out, “may tell us as much about how to approach the history of information media as it does about human memory.”

The modern sciences of memory emerged when newly developed sciences of mind coincided with the proliferation of new media: technologies for recording, transmitting, and recreating sounds and images. Photography, the phonograph, and the moving image all developed between 1850 and 1900. They became identified with memory process in a series of associations that shaped both how those processes were understood and how the technologies themselves would be used. Throughout the twentieth century, memory researchers continued to look to the most recent, cutting-edge recording technologies for insights into the nature of remembering.

[…]

There was never just one way to use recording technologies to think about memory. At one extreme, researchers suggested that they were models for memory itself. Perhaps, they reflected, memory was an internal recording that could be replayed at will… But on the other extreme, researchers also used recording devices to define precisely what memory was not. For a number of scientists, the idea that memory is a recording device rests on an unrealistic fantasy of accuracy and permanence. Instead of practices that facilitated ‘reliving’ a permanent record, they sought out ways to reveal an ineradicable role of interpretation… in the construction of knowledge and memory.

I spend a lot of time thinking about the evolution of knowledge in the age of Information Overload, and “memory” has always been a centerpiece of “knowledge.” But at a time when the rift between accessibility and access to information is gaping wider than ever, memory as a foundation for knowledge is shifting from retentional to relational, elevating the importance of the ability to retrieve and connect information over that of the ability to retain it.

As we move from storing units of data — books, music, images, footage — to saving pathways to and among them, our frameworks for thoughtful interpretation, which include the curation and contextualization of information, will become of crucial importance.

Winter, of course, goes into much greater depth in Memory — well worth a read.
