Brain Pickings

Posts Tagged ‘history’

07 MARCH, 2012

How Iconic Album Cover Illustrator R. Crumb Brought Comics to Music

What Janis Joplin has to do with rediscovering yesteryear’s forgotten masters.

Alex Steinweiss may be the father of the modern album cover, but Robert Crumb is its favorite weird uncle. Though best known as a pioneer of the underground comix movement, the subversive artist had long been fascinated with the music of the 1920s and 1930s — jazz, big band, swing, blues, Cajun — so when, in 1968, Janis Joplin asked him to design the cover for her album Cheap Thrills, it was the beginning of R. Crumb’s prolific second career illustrating hundreds of covers for artists both emerging and legendary. In fact, Crumb’s covers for yesteryear’s forgotten masters were so influential in their own right that they spurred the rediscovery of many of these old records in the 1960s and 1970s.

R. Crumb: The Complete Record Cover Collection captures Crumb’s quirky, beautiful work and his enduring legacy in 450 vibrant four-color, black-and-white, and monocolor illustrations that exude his love of music and his love of art in equal measure. Accompanying his unmistakable record covers are posters, calling cards, advertisements, and stand-alone portraits of icons like James Brown, Frank Zappa, Gus Cannon, George Jones, Woody Guthrie, and more.

In this narrated short, Crumb, who eventually learned to play the uke, banjo, and mandolin himself, talks about the convergence of his two passions:

07 MARCH, 2012

Ralph Ellison on Fiction as a Voice for Injustice, a Chariot of Hope, and a Lens on the Human Condition

“…there must be possible a fiction which… can arrive at the truth about the human condition, here and now, with all the bright magic of the fairy tale.”

“Good fiction is made of what is real, and reality is difficult to come by,” beloved writer Ralph Ellison famously said. In 1953, upon winning the National Book Award for Invisible Man, the only novel he published in his lifetime, Ellison — a lover of language, symbolism connoisseur, and astute observer of humanity — captured the heart of fiction in his remarkable acceptance speech:

If I were asked in all seriousness just what I considered to be the chief significance of Invisible Man as a fiction, I would reply: Its experimental attitude and its attempt to return to the mood of personal moral responsibility for democracy which typified the best of our nineteenth-century fiction.

When I examined the rather rigid concepts of reality which informed a number of the works which impressed me and to which I owed a great deal, I was forced to conclude that for me and for so many hundreds of thousands of Americans, reality was simply far more mysterious and uncertain, and at the same time more exciting, and still, despite its raw violence and capriciousness, more promising.

To see America with an awareness of its rich diversity and its almost magical fluidity and freedom I was forced to conceive of a novel unburdened by the narrow naturalism which has led after so many triumphs to the final and unrelieved despair which marks so much of our current fiction. I was to dream of a prose which was flexible, and swift as American change is swift, confronting the inequalities and brutalities of our society forthrightly, but yet thrusting forth its images of hope, human fraternity, and individual self-realization. A prose which would make use of the richness of our speech, the idiomatic expression, and the rhetorical flourishes from past periods which are still alive among us. Despite my personal failures there must be possible a fiction which, leaving sociology and case histories to the scientists, can arrive at the truth about the human condition, here and now, with all the bright magic of the fairy tale.

For more on the distinction between truth and fiction, see this omnibus of insights by some of modernity’s greatest writers.

07 MARCH, 2012

From Francis Bacon to Hobbes to Turing: George Dyson on the History of Bits

What Sir Francis Bacon has to do with the dawn of the Internet and the inner workings of your iPhone.

“It is possible to invent a single machine which can be used to compute any computable sequence,” proclaimed twenty-four-year-old Alan Turing in 1936.

The digital text you are reading on your screen, which I wrote on my keyboard, flows through the underbelly of the Internet to transmit a set of ideas from my mind to yours. It all happens so fluidly, so familiarly, that we take it for granted. But it is the product of remarkable feats of technology, ignited by the work of a small group of men and women, spearheaded by John von Neumann at the Institute for Advanced Study in Princeton, New Jersey. Legendary science historian George Dyson traces the history and legacy of this pioneering work in his highly anticipated new book, Turing’s Cathedral: The Origins of the Digital Universe, based on his 2005 essay of the same name — the most comprehensive and ambitious account of that defining era yet, reconstructing the events and characters that coalesced into the dawn of computing and peering into the future to examine where the still-unfolding digital universe may be headed.

It begins with a captivating promise:

There are two kinds of creation myths: those where life arises out of the mud, and those where life falls from the sky. In this creation myth, computers arose from the mud, and code fell from the sky.

The book is absolutely absorbing in its entirety, but this particular excerpt on the history of bits beautifully illustrates Dyson’s gift for contextualizing the familiar with the infinitely fascinating:

A digital universe — whether 5 kilobytes or the entire Internet — consists of two species of bits: differences in space, and differences in time. Digital computers translate between these two forms of information — structure and sequence — according to definite rules. Bits that are embodied as structure (varying in space, invariant across time) we perceive as memory; and bits that are embodied as sequence (varying in time, invariant across space) we perceive as code. Gates are the intersections where bits span both worlds at the moments of transition from one instant to the next.

The term bit (the contraction, by 40 bits, of “binary digit”) was coined by statistician John W. Tukey shortly after he joined von Neumann’s project in November of 1945. The existence of a fundamental unit of communicable information, representing a single distinction between two alternatives, was defined rigorously by information theorist Claude Shannon in his then-secret Mathematical Theory of Cryptography of 1945, expanded into his Mathematical Theory of Communication of 1948. “Any difference that makes a difference” is how cybernetician Gregory Bateson translated Shannon’s definition into informal terms. To a digital computer, the only difference that makes a difference is the difference between a zero and a one.

That two symbols were sufficient for encoding all communication had been established by Francis Bacon in 1623. ‘The transposition of two Letters by five placeings will be sufficient for 32 Differences [and] by this Art a way is opened, whereby a man may expresse and signifie the intentions of his minde, at any distance of place, by objects… capable of a twofold difference onely,’ he wrote, before giving examples of how such binary coding could be conveyed at the speed of paper, the speed of sound, or the speed of light.

That zero and one were sufficient for logic as well as arithmetic was established by Gottfried Wilhelm Leibniz in 1679, following the lead given by Thomas Hobbes in his Computation, or Logique of 1656. ‘By Ratiocination, I mean computation,’ Hobbes had announced. ‘Now to compute, is either to collect the sum of many things that are added together, or to know what remains when one thing is taken out of another. Ratiocination, therefore is the same with Addition or Substraction; and if any man adde Multiplication and Division, I will not be against it, seeing… that all Ratiocination is comprehended in these two operations of the minde.’ The new computer, for all its powers, was nothing more than a very fast adding machine, with a memory of 40,960 bits.
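
Both of Dyson’s historical points are easy to make concrete. The short Python sketch below (my own illustration, not code from the book) builds a Bacon-style cipher in which five two-valued places yield 2^5 = 32 distinct codes, more than enough for the alphabet, and then adds two numbers using nothing but single-bit logic, the sense in which zero and one suffice for arithmetic. The 40,960 bits of memory mentioned above are, for the IAS machine, 1,024 words of 40 bits each, the "5 kilobytes" of the excerpt’s opening sentence.

    # Illustrative sketch only, not code from the book.
    # (1) Five two-valued "placeings" give 2**5 = 32 differences, as Bacon observed.
    # (2) Addition can be carried out with nothing but single-bit logic.

    from itertools import product
    from string import ascii_uppercase

    # Bacon-style bilateral cipher: every five-symbol string over {a, b} is one of
    # 32 distinct codes, enough to cover a 26-letter alphabet (the assignment of
    # letters to codes here is a convenient variant, not Bacon's historical table).
    codes = [''.join(bits) for bits in product('ab', repeat=5)]
    assert len(codes) == 32
    bilateral = dict(zip(ascii_uppercase, codes))  # 'A' -> 'aaaaa', 'B' -> 'aaaab', ...

    def encode(word):
        """Signify 'the intentions of the minde' with a twofold difference only."""
        return ' '.join(bilateral[letter] for letter in word.upper())

    # Ripple-carry addition over little-endian bit lists, using only XOR, AND, OR:
    # the sense in which the new computer was 'a very fast adding machine'.
    def add_bits(x, y):
        x = x + [0] * (len(y) - len(x))  # pad the shorter list with zeros
        y = y + [0] * (len(x) - len(y))
        out, carry = [], 0
        for a, b in zip(x, y):
            out.append(a ^ b ^ carry)            # sum bit
            carry = (a & b) | (carry & (a ^ b))  # carry bit
        return out + [carry]

    print(encode('BACON'))              # aaaab aaaaa aaaba abbba abbab
    print(add_bits([1, 0, 1], [1, 1]))  # 5 + 3 = 8 -> [0, 0, 0, 1]
    print(1024 * 40)                    # the IAS machine's 40,960 bits of memory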

Turing’s Cathedral, though dense at times, is fascinating in its entirety.
