Brain Pickings

Posts Tagged ‘technology’

02 APRIL, 2012

How Ignorance Fuels Science and the Evolution of Knowledge

“We judge the value of science by the ignorance it defines.”

“Science is always wrong,” George Bernard Shaw famously proclaimed in a toast to Albert Einstein. “It never solves a problem without creating 10 more.”

In the fifth century BC, long before science as we know it existed, Socrates famously observed, “I know one thing, that I know nothing.” Some 21 centuries later, around the time he published the Principia in 1687, Sir Isaac Newton likely knew all there was to know in science — a time when it was possible for a single human brain to hold all of mankind’s scientific knowledge. Fast-forward three centuries to today, and the average high school student has more scientific knowledge than Newton did at the end of his life. But somewhere along that superhighway of progress, we seem to have developed a kind of fact-fetishism that shackles us to the allure of the known and makes us indifferent to the unknown knowable. Yet it’s the latter — the unanswered questions — that makes science, and life, interesting. That’s the eloquently argued case at the heart of Ignorance: How It Drives Science, in which Stuart Firestein sets out to debunk the popular idea that knowledge follows ignorance, demonstrating instead that it’s the other way around and, in the process, laying out a powerful manifesto for getting the public engaged with science — a public to whom, as Neil deGrasse Tyson recently reminded the Senate, the government is accountable in making the very decisions that shape the course of science.

The tools and currencies of our information economy, Firestein points out, are doing little in the way of fostering the kind of question-literacy essential to cultivating curiosity:

Are we too enthralled with the answers these days? Are we afraid of questions, especially those that linger too long? We seem to have come to a phase in civilization marked by a voracious appetite for knowledge, in which the growth of information is exponential and, perhaps more important, its availability easier and faster than ever.*

(For a promise of a solution, see Clay Johnson’s excellent The Information Diet.)

The cult of expertise — whose currency is static answers — obscures the very capacity for cultivating a thirst for ignorance:

There are a lot of facts to be known in order to be a professional anything — lawyer, doctor, engineer, accountant, teacher. But with science there is one important difference. The facts serve mainly to access the ignorance… Scientists don’t concentrate on what they know, which is considerable but minuscule, but rather on what they don’t know…. Science traffics in ignorance, cultivates it, and is driven by it. Mucking about in the unknown is an adventure; doing it for a living is something most scientists consider a privilege.

[…]

Working scientists don’t get bogged down in the factual swamp because they don’t care all that much for facts. It’s not that they discount or ignore them, but rather that they don’t see them as an end in themselves. They don’t stop at the facts; they begin there, right beyond the facts, where the facts run out. Facts are selected, by a process that is a kind of controlled neglect, for the questions they create, for the ignorance they point to.

Firestein, who chairs the Department of Biological Sciences at Columbia University, stresses that beyond simply accumulating facts, scientists use them as raw material, not finished product. He cautions:

Mistaking the raw material for the product is a subtle error but one that can have surprisingly far-reaching consequences. Understanding this error and its ramifications, and setting it straight, is crucial to understanding science.

What emerges is an elegant definition of science:

Real science is a revision in progress, always. It proceeds in fits and starts of ignorance.

(What is true of science is actually also true of all creativity. As Jonah Lehrer puts it, “The only way to be creative over time — to not be undone by our expertise — is to experiment with ignorance, to stare at things we don’t fully understand.” Einstein knew that, too, when he noted that without a preoccupation with “the eternally unattainable in the field of art and scientific research, life would have seemed… empty.” And Kathryn Schulz touched on it with her meditation on pessimistic meta-induction.)

In highlighting this commonality science holds with other domains of creative and intellectual labor, Firestein turns to the poet John Keats, who described the ideal state of the literary psyche as Negative Capability — “that is when a man is capable of being in uncertainties, Mysteries, doubts without any irritable reaching after fact & reason.” Firestein translates this to science:

Being a scientist requires having faith in uncertainty, finding pleasure in mystery, and learning to cultivate doubt. There is no surer way to screw up an experiment than to be certain of its outcome.

He captures the heart of this argument in an eloquent metaphor:

Science, then, is not like the onion in the often used analogy of stripping away layer after layer to get at some core, central, fundamental truth. Rather it’s like the magic well: no matter how many buckets of water you remove, there’s always another one to be had. Or even better, it’s like the widening ripples on the surface of a pond, the ever larger circumference in touch with more and more of what’s outside the circle, the unknown. This growing forefront is where science occurs… It is a mistake to bob around in the circle of facts instead of riding the wave to the great expanse lying outside the circle.

However, more important than the limits of our knowledge, Firestein is careful to point out, are the limits of our ignorance. (Cue Errol Morris’s fantastic 2010 five-part New York Times series, The Anosognosic’s Dilemma.) Science historian and Stanford professor Robert Proctor has even coined a term for the study of ignorance — agnotology — and, Firestein argues, it is a conduit to better understanding progress.

Science historian and philosopher Nicholas Rescher has offered a different term for a similar concept: Copernican cognitivism, suggesting that just as Copernicus showed us there was nothing privileged about our position in space by debunking the geocentric model of the universe, there is also nothing privileged about our cognitive landscape.

But the most memorable articulation of the limits of our own ignorance comes from the Victorian novella Flatland, where a three-dimensional sphere shows up in a two-dimensional land and inadvertently wreaks havoc on its geometric inhabitants’ most basic beliefs about the world as they struggle to imagine the very possibility of a third dimension.

An engagement with the interplay of ignorance and knowledge, the essential bargaining chips of science, is what elevated modern civilization from the intellectual flatness of the Middle Ages. Firestein points out that “the public’s direct experience of the empirical methods of science” helped humanity evolve from the magical and mystical thinking of Western medieval thought to the rational discourse of contemporary culture.

At the same time, Firestein laments, science today is often “as inaccessible to the public as if it were written in classical Latin.” Making it more accessible, he argues, necessitates introducing explanations of science that focus on the unknown as an entry point — a more inclusive gateway than the known.

In one of the most compelling passages of the book, he broadens this insistence on questions over answers to the scientific establishment itself:

Perhaps the most important application of ignorance is in the sphere of education, particularly of scientists… We must ask ourselves how we should educate scientists in the age of Google and whatever will supersede it… The business model of our Universities, in place now for nearly a thousand years, will need to be revised.

[…]

Instead of a system where the collection of facts is an end, where knowledge is equated with accumulation, where ignorance is rarely discussed, we will have to provide the Wiki-raised student with a taste of and for boundaries, the edge of the widening circle of ignorance, how the data, which are not unimportant, frames the unknown. We must teach students how to think in questions, how to manage ignorance. W. B. Yeats admonished that ‘education is not the filling of a pail, but the lighting of a fire.’

(For a taste of what modern science education can and should be like beyond the academy, see Joe Hanson’s It’s Okay To Be Smart, Ed Yong’s Not Exactly Rocket Science, and Bora Zivkovic’s Twitter feed.)

Firestein sums it up beautifully:

Science produces ignorance, and ignorance fuels science. We have a quality scale for ignorance. We judge the value of science by the ignorance it defines. Ignorance can be big or small, tractable or challenging. Ignorance can be thought about in detail. Success in science, either doing it or understanding it, depends on developing comfort with the ignorance, something akin to Keats’ negative capability.

* See some thoughts on the difference between access and accessibility.

Scientific American

Donating = Loving

Bringing you (ad-free) Brain Pickings takes hundreds of hours each month. If you find any joy and stimulation here, please consider becoming a Supporting Member with a recurring monthly donation of your choosing, between a cup of tea and a good dinner:





You can also become a one-time patron with a single donation in any amount:





Brain Pickings has a free weekly newsletter. It comes out on Sundays and offers the week’s best articles. Here’s what to expect. Like? Sign up.

28 MARCH, 2012

The Idea Factory: Insights on Creativity from Bell Labs and the Golden Age of Innovation

Successful innovation requires the meeting of the right people at the right place with just the right problem.

At the turn of the twentieth century, Thomas Edison was the most famous inventor in the world. He hoarded useful materials, from rare metals to animal bones, and through careful, methodical testing, he made his new inventions work, and previous inventions work better. Churning out patent after patent, Edison practiced a form of innovation that was about the what, and not about the how — the latter he could outsource and hire for.

“In 1910, few Americans knew the difference between a scientist, an engineer, and an inventor,” explains Jon Gertner at the beginning of his lively book about a place that fostered a home for all three, The Idea Factory: Bell Labs and the Great Age of American Innovation. The difference was clear to Edison, who was generally uninterested in the theory behind his inventions, filling his Menlo Park complex with specialists to do the work he’d rather not. “I can always hire mathematicians,” he said, “but they can’t hire me.”

At Menlo Park, Edison hired scientists to do the theoretical work so that he could concentrate on testing his inventions.

To be an inventor, Gertner insists, one needed “mainly mechanical skill and ingenuity, not scientific knowledge and training.” (Qualities that the ingenious Hedy Lamarr had alongside her mechanical partner George Antheil, an unlikely artistic pair who invented an essential frequency-hopping radio technique during World War II that later gave us technologies like Bluetooth and Wi-Fi.) For more than sixty years, from the 1920s to the 1980s, Bell Labs would bring together all of the above to create essential inventions of the twentieth century: the transistor, radar, the laser, communication satellites, UNIX, and the C++ programming language.

Adventure stories about 'wireless boys' and 'radio boys' were popular around the turn of the century.

It was the child-tinkerers during the first decades of the century who would populate Bell Labs during its explosive growth in the 30s and 40s. Adventure books recounted tales of “Wireless Boys” (or “Radio Boys”) who solved dastardly crimes and helped those in need, all by building their own wireless telegraphs at home. “Wireless is a thrilling pastime!” exclaims the author of one of these books:

To be a wireless boy and make your own apparatus is to have the kind of stuff in you of which successful men are made — men who, if they were shipwrecked on a desert isle at daybreak, would have something to eat by noon, a spring bed to sleep on by night and a wireless station the next day sending out an SOS to ships below the horizon, for help.

Around this time, Alexander Graham Bell’s American Telephone and Telegraph Company (AT&T) had a massive, government-sanctioned monopoly on all telephone subscriptions, buying up regional phone companies and single-handedly manufacturing all of the parts for all of the cables, switches, repeaters, and vacuum tubes. AT&T made the phones work, it made the parts that made the phones work, and it hired scientists and engineers to make the phones work better. During the 1920s, this third arm is what became Bell Labs.

Telstar I communications satellite for television signals and space data, 1962. (Alcatel-Lucent USA Inc. and the AT&T Archives and History Center)

In the beginning, Bell Labs was populated with grown-up wireless boys — physics, engineering, and chemistry grad students and junior professors seduced away from colleges with astronomically better pay. The new recruits were required to climb telephone poles, operate a switchboard, and sign a paper that sold all rights to any future patents to AT&T for a dollar.

The Picturephone, from the 1964 New York World's Fair (AT&T Archives and History Center)

Bell Labs was a place for discovery, which wasn’t always profitable, and invention, which usually was. During World War II, the US government invested $2 billion in the development of the atomic bomb, but it invested around $3 billion in the development of radar, much of which took place at Bell Labs. (“Scientists who worked on radar often quipped that radar won the war,” Gertner writes, “whereas the atomic bomb merely ended it.”)

In 1961, Bell Labs moved to a campus designed by Eero Saarinen. It was sold by the company in 2006.

During the post-war reorganization of the Labs, older management was demoted, younger management given new titles, and, most importantly, every research group was interdisciplinary: chemists mingled with physicists who chatted with metallurgists who lunched with engineers. Every building in the New Jersey campus was interconnected and no one was allowed to shut their door. This was the beginning of a newly innovative time, but not the same “genius”-driven Eureka! moments that seemingly characterized the work of Edison. Gertner writes:

At the start, forces that precede an invention merely begin to align, often imperceptibly, as a group of people and ideas converge, until over the course of months or years (or decades) they gain clarity and momentum and the help of additional ideas and actors. Luck seems to matter, and so does timing, for it tends to be the case that the right answers, the right people, the right place — perhaps all three — require a serendipitous encounter with the right problem. And then — sometimes — a leap. Only in retrospect do leaps look obvious.

(“Chance favors the connected mind,” Steven Johnson famously observed in his own exploration of how innovation happens.)

The story of The Idea Factory is one of individuals, architecture, millions of tiny moving parts, deliberate work, and, of course, luck and timing. It was a culture of creativity that worked for its age, impossible to reproduce in quite the same way, nor would we want to. Today, we might subscribe to the philosophy that “creativity is just connecting things,” as Steve Jobs once said about his own idea factory, but first someone has to test, apply, develop, and manufacture all of those connectors.

Michelle Legro is an associate editor at Lapham’s Quarterly. You can find her on Twitter.


23 MARCH, 2012

PBS Off Book: Art in the Age of the Internet

How the digital age is changing the rhetoric and regimes of creative expression.

Over the past few months, the fine folks at PBS Arts have been exploring various facets of creative culture — including typography, product design, generative art, papercraft, and more — and their evolution in the digital age as part of the ongoing Off Book series. The latest installment explores art in the era of the Internet, and features Kickstarter founder Yancey Strickler, Creative Commons mastermind Lawrence Lessig, and my dear friend Julia Kaganskiy, editor of Creators Project, along with her colleague and creative director Ciel Hunter.

“When you extend the life of a physical project on the web, and give people the ability to remix that media, they’ll do some really inventive stuff with it.” ~ Julia Kaganskiy, Creators Project

“The Internet’s incredible ability to align people with similar interests makes it very possible for normal people to make big things happen, and that’s something that wasn’t possible at any other time.” ~ Yancey Strickler, Kickstarter

“We had a regime of copyright and the Internet completely flipped the technical foundation upon which that regime had been built. […] My creative utopia is that we have a huge proportion of all of us creating all the time.” ~ Lawrence Lessig, Creative Commons

As Edward Gorey might remind you, PBS is public media supported by “viewers like you” — show them some love here.
