Brain Pickings

30 OCTOBER, 2013

How Our Minds Mislead Us: The Marvels and Flaws of Our Intuition

“The confidence people have in their beliefs is not a measure of the quality of evidence but of the coherence of the story that the mind has managed to construct.”

Every year, intellectual impresario and Edge editor John Brockman summons some of our era’s greatest thinkers and unleashes them on one provocative question, whether it’s the single most elegant theory of how the world works or the best way to enhance our cognitive toolkit. This year, he sets out on the most ambitious quest yet, a meta-exploration of thought itself: Thinking: The New Science of Decision-Making, Problem-Solving, and Prediction (public library) collects short essays and lecture adaptations from such celebrated and wide-ranging (though not in gender) minds as Daniel Dennett, Jonathan Haidt, Dan Gilbert, and Timothy Wilson, covering subjects as diverse as morality, essentialism, and the adolescent brain.

One of the most provocative contributions comes from Nobel-winning psychologist Daniel Kahneman — author of the indispensable Thinking, Fast and Slow, one of the best psychology books of 2011 — who examines “the marvels and the flaws of intuitive thinking.”

In the 1970s, Kahneman and his colleague Amos Tversky, self-crowned “prophets of irrationality,” began studying what they called “heuristics and biases” — mental shortcuts we take, which frequently result in cognitive errors. Those errors, however, reveal a great deal about how our minds work:

If you want to characterize how something is done, then one of the most powerful ways of characterizing how the mind does anything is by looking at the errors that the mind produces while it’s doing it because the errors tell you what it is doing. Correct performance tells you much less about the procedure than the errors do.

One of the most fascinating examples of heuristics and biases is what we call intuition — a complex cluster of cognitive processes, sometimes helpful but often misleading. Kahneman notes that thoughts come to mind in one of two ways: Either by “orderly computation,” which involves a series of stages of remembering rules and then applying them, or by perception, an evolutionary function that allows us to predict outcomes based on what we’re perceiving. (For instance, seeing a woman’s angry face helps us predict the general sentiment and disposition of what she’s about to say.) It is the latter mode that precipitates intuition. Kahneman explains the interplay:

There is no sharp line between intuition and perception. … Perception is predictive. . . . If you want to understand intuition, it is very useful to understand perception, because so many of the rules that apply to perception apply as well to intuitive thinking. Intuitive thinking is quite different from perception. Intuitive thinking has language. Intuitive thinking has a lot of word knowledge organized in different ways more than mere perception. But some very basic characteristics [of] perception are extended almost directly to intuitive thinking.

He then considers how the two types of mental operations established by modern cognitive science illuminate intuition:

Type 1 is automatic, effortless, often unconscious, and associatively coherent. . . . Type 2 is controlled, effortful, usually conscious, tends to be logically coherent, rule-governed. Perception and intuition are Type 1. … Type 2 is more controlled, slower, is more deliberate. . . . Type 2 is who we think we are. [And yet] if one made a film on this, Type 2 would be a secondary character who thinks that he is the hero because that’s who we think we are, but in fact, it’s Type 1 that does most of the work, and it’s most of the work that is completely hidden from us.

Type 1 also encompasses all of our practiced skills — for instance, driving, speaking, and understanding a language — which after a certain threshold of mastery enter autopilot mode. (Though this presents its own set of problems.) Underpinning that mode of thinking is our associative memory, which Kahneman unpacks:

You have to think of [your associative memory] as a huge repository of ideas, linked to each other in many ways, including causal links and other links, and activation spreading from ideas to other ideas until a small subset of that enormous network is illuminated, and the subset is what’s happening in the mind at the moment. You’re not conscious of it, you’re conscious of very little of it.

This leads to something Kahneman has termed “associative coherence” — the notion that “everything reinforces everything else.” Much like our attention, which sees only what it wants and expects to see, our associative memory looks to reinforce our existing patterns of association and deliberately discounts evidence that contradicts them. And therein lies the triumph and tragedy of our intuitive mind:

The thing about the system is that it settles into a stable representation of reality, and that is just a marvelous accomplishment. … That’s not a flaw, that’s a marvel. [But] coherence has its cost.

Coherence means that you’re going to adopt one interpretation in general. Ambiguity tends to be suppressed. This is part of the mechanism that you have here that ideas activate other ideas and the more coherent they are, the more likely they are to activate each other. Other things that don’t fit fall away by the wayside. We’re enforcing coherent interpretations. We see the world as much more coherent than it is.

Put another way, our chronic discomfort with ambiguity — which, ironically, is critical to both our creativity and the richness of our lives — leads us to lock down safe, comfortable, familiar interpretations, even if they are only partial representations of or fully disconnected from reality.
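
For readers who like to see a mechanism spelled out, here is a purely illustrative sketch of the associative machinery Kahneman describes: a network of linked ideas in which activating one node passes a decaying charge to its neighbors, until only a small subset of the network is “illuminated.” The ideas, links, and weights are invented for the example; this is a toy rendering of the idea, not Kahneman’s model.

```python
# Minimal spreading-activation sketch (ideas, links, and weights are made up).
from collections import defaultdict

# Hypothetical associative links between ideas, with association strengths.
links = {
    "banana": {"yellow": 0.9, "fruit": 0.8, "monkey": 0.5},
    "fruit": {"apple": 0.7, "sweet": 0.6},
    "yellow": {"sun": 0.4},
    "monkey": {"jungle": 0.6},
}

def spread_activation(seed, decay=0.5, threshold=0.2):
    """Spread activation outward from a seed idea until it falls below threshold."""
    activation = defaultdict(float)
    activation[seed] = 1.0
    frontier = [(seed, 1.0)]
    while frontier:
        idea, energy = frontier.pop()
        for neighbor, strength in links.get(idea, {}).items():
            passed = energy * strength * decay
            if passed > threshold and passed > activation[neighbor]:
                activation[neighbor] = passed
                frontier.append((neighbor, passed))
    # The "illuminated" subset: the handful of ideas currently active.
    return dict(sorted(activation.items(), key=lambda kv: -kv[1]))

print(spread_activation("banana"))  # only a small subset of the network lights up
```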

The Type 1 modality of thought gives rise to a System 1 of interpretation, which is at the heart of what we call “intuition” — but which is far less accurate and reliable than we like to believe:

System 1 infers and invents causes and intentions. [This] happens automatically. Infants have it. . . . We’re equipped … for the perception of causality.

It neglects ambiguity and suppresses doubt and … exaggerates coherence. Associative coherence [is] in large part where the marvels turn into flaws. We see a world that is vastly more coherent than the world actually is. That’s because of this coherence-creating mechanism that we have. We have a sense-making organ in our heads, and we tend to see things that are emotionally coherent, and that are associatively coherent.

But the greatest culprit in the failures of our intuition is another cognitive property Kahneman names “what you see is all there is” — a powerful and persistent flaw of System-1 thinking:

This is a mechanism that takes whatever information is available and makes the best possible story out of the information currently available, and tells you very little about information it doesn’t have. So what you get are people jumping to conclusions. I call this a “machine for jumping to conclusions.”

This jumping to conclusions, Kahneman adds, is immediate and based on unreliable information. And that’s a problem:

That will very often create a flaw. It will create overconfidence. The confidence people have in their beliefs is not a measure of the quality of evidence [but] of the coherence of the story that the mind has managed to construct. Quite often you can construct very good stories out of very little evidence. . . . People tend to have great belief, great faith in the stories that are based on very little evidence.

Most treacherous of all is our tendency to use our very confidence — and overconfidence — as evidence itself:

What’s interesting is that many a time people have intuitions that they’re equally confident about except they’re wrong. That happens through the mechanism I call “the mechanism of substitution.” You have been asked a question, and instead you answer another question, but that answer comes by itself with complete confidence, and you’re not aware that you’re doing something that you’re not an expert on because you have one answer. Subjectively, whether it’s right or wrong, it feels exactly the same. Whether it’s based on a lot of information, or a little information, this is something that you may step back and have a look at. But the subjective sense of confidence can be the same for intuition that arrives from expertise, and for intuitions that arise from heuristics. . . .

In other words, intuition, like attention, is “an intentional, unapologetic discriminator [that] asks what is relevant right now, and gears us up to notice only that” — a humbling antidote to our culture’s propensity for self-righteousness, and above all a reminder to allow yourself the uncomfortable luxury of changing your mind.

Thinking is excellent and mind-expanding in its entirety. Complement it with Brockman’s This Will Make You Smarter: New Scientific Concepts to Improve Your Thinking, one of the best psychology books of 2012.

22 JANUARY, 2013

This Explains Everything: 192 Thinkers on the Most Elegant Theory of How the World Works

“The greatest pleasure in science comes from theories that derive the solution to some deep puzzle from a small set of simple principles in a surprising way.”

Every year since 1998, intellectual impresario and Edge editor John Brockman has been posing a single grand question to some of our time’s greatest thinkers across a wide spectrum of disciplines, then collecting the answers in an annual anthology. Last year’s answers to the question “What scientific concept will improve everybody’s cognitive toolkit?” were released in This Will Make You Smarter: New Scientific Concepts to Improve Your Thinking, one of the year’s best psychology and philosophy books.

In 2012, the question Brockman posed, proposed by none other than Steven Pinker, was “What is your favorite deep, elegant, or beautiful explanation?” The answers, representing an eclectic mix of 192 (alas, overwhelmingly male) minds spanning psychology, quantum physics, social science, political theory, philosophy, and more, are collected in the edited compendium This Explains Everything: Deep, Beautiful, and Elegant Theories of How the World Works (UK; public library) and are also available online.

In the introduction preceding the micro-essays, Brockman frames the question and its ultimate objective, adding to history’s most timeless definitions of science:

The ideas presented on Edge are speculative; they represent the frontiers in such areas as evolutionary biology, genetics, computer science, neurophysiology, psychology, cosmology, and physics. Emerging out of these contributions is a new natural philosophy, new ways of understanding physical systems, new ways of thinking that call into question many of our basic assumptions.

[…]

Perhaps the greatest pleasure in science comes from theories that derive the solution to some deep puzzle from a small set of simple principles in a surprising way. These explanations are called ‘beautiful’ or ‘elegant.’

[…]

The contributions presented here embrace scientific thinking in the broadest sense: as the most reliable way of gaining knowledge about anything — including such fields of inquiry as philosophy, mathematics, economics, history, language, and human behavior. The common thread is that a simple and nonobvious idea is proposed as the explanation of a diverse and complicated set of phenomena.

Stanford neuroscientist Robert Sapolsky, eloquent as ever, marvels at the emergence of swarm intelligence and how it differs from the wisdom of the crowd:

Observe a single ant, and it doesn’t make much sense, walking in one direction, suddenly careening in another for no obvious reason, doubling back on itself. Thoroughly unpredictable.

The same happens with two ants, a handful of ants. But a colony of ants makes fantastic sense. Specialized jobs, efficient means of exploiting new food sources, complex underground nests with temperature regulated within a few degrees. And critically, there’s no blueprint or central source of command—each individual ant has algorithms for its behaviors. But this is not wisdom of the crowd, where a bunch of reasonably informed individuals outperform a single expert. The ants aren’t reasonably informed about the big picture. Instead, the behavior algorithms of each ant consist of a few simple rules for interacting with the local environment and local ants. And out of this emerges a highly efficient colony.

Ant colonies excel at generating trails that connect locations in the shortest possible way, accomplished with simple rules about when to lay down a pheromone trail and what to do when encountering someone else’s trail—approximations of optimal solutions to the Traveling Salesman problem. This has useful applications. In “ant-based routing,” simulations using virtual ants with similar rules can generate optimal ways of connecting the nodes in a network, something of great interest to telecommunications companies. It applies to the developing brain, which must wire up vast numbers of neurons with vaster numbers of connections without constructing millions of miles of connecting axons. And migrating fetal neurons generate an efficient solution with a different version of ant-based routing.

A wonderful example is how local rules about attraction and repulsion (i.e., positive and negative charges) allow simple molecules in an organic soup to occasionally form more complex ones. Life may have originated this way without the requirement of bolts of lightning to catalyze the formation of complex molecules.

And why is self-organization so beautiful to my atheistic self? Because if complex, adaptive systems don’t require a blueprint, they don’t require a blueprint maker. If they don’t require lightning bolts, they don’t require Someone hurling lightning bolts.
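
The “ant-based routing” Sapolsky describes is easy to watch in miniature. What follows is a toy sketch of my own, not anything from the book: an invented two-route network in which virtual ants choose edges in proportion to pheromone, shorter round trips deposit more pheromone, and evaporation erases stale trails, so the shorter route tends to accumulate the strongest trail.

```python
# Toy illustration of pheromone-trail routing (all numbers invented for the example).
import random

# Hypothetical network: two routes from nest to food, one shorter than the other.
edges = {("nest", "A"): 2, ("A", "food"): 2,   # short route, total length 4
         ("nest", "B"): 3, ("B", "food"): 4}   # long route, total length 7
pheromone = {edge: 1.0 for edge in edges}

def pick(options):
    """Choose an edge with probability proportional to its pheromone level."""
    total = sum(pheromone[e] for e in options)
    r = random.uniform(0, total)
    for e in options:
        r -= pheromone[e]
        if r <= 0:
            return e
    return options[-1]

for _ in range(200):  # 200 simulated ants make the round trip
    first = pick([("nest", "A"), ("nest", "B")])
    second = ("A", "food") if first == ("nest", "A") else ("B", "food")
    trip_length = edges[first] + edges[second]
    for edge in pheromone:            # evaporation fades every trail a little
        pheromone[edge] *= 0.95
    for edge in (first, second):      # shorter trips deposit more pheromone
        pheromone[edge] += 1.0 / trip_length

print(pheromone)  # the nest->A->food edges typically end up with the strongest trail
```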

Developmental psychologist Howard Gardner, who famously developed the seminal theory of multiple intelligences, echoes Anaïs Nin in advocating for the role of the individual and Susan Sontag in stressing the impact of individual acts on collective fate. His answer, arguing for the importance of human beings, comes as a welcome antidote to a question that runs the risk of being inherently reductionist:

In a planet occupied now by seven billion inhabitants, I am amazed by the difference that one human being can make. Think of classical music without Mozart or Stravinsky; of painting without Caravaggio, Picasso or Pollock; of drama without Shakespeare or Beckett. Think of the incredible contributions of Michelangelo or Leonardo, or, in recent times, the outpouring of deep feeling at the death of Steve Jobs (or, for that matter, Michael Jackson or Princess Diana). Think of human values in the absence of Moses or Christ.

[…]

Despite the laudatory efforts of scientists to ferret out patterns in human behavior, I continue to be struck by the impact of single individuals, or of small groups, working against the odds. As scholars, we cannot and should not sweep these instances under the investigative rug. We should bear in mind anthropologist Margaret Mead’s famous injunction: ‘Never doubt that a small group of thoughtful committed citizens can change the world. It is the only thing that ever has.’

Uber-curator Hans Ulrich Obrist, who also contributed to last year’s volume, considers the parallel role of patterns and chance in the works of iconic composer John Cage and painter Gerhard Richter, and the role of uncertainty in the creative process:

In art, the title of a work can often be its first explanation. And in this context I am thinking especially of the titles of Gerhard Richter. In 2006, when I visited Richter in his studio in Cologne, he had just finished a group of six corresponding abstract paintings which he gave the title Cage.

There are many relations between Richter’s painting and the compositions of John Cage. In a book about the Cage series, Robert Storr has traced them from Richter’s attendance of a Cage performance at the Festum Fluxorum Fluxus in Düsseldorf in 1963 to analogies in their artistic processes. Cage has often applied chance procedures in his compositions, notably with the use of the I Ching. Richter in his abstract paintings also intentionally allows effects of chance. In these paintings, he applies the oil paint on the canvas by means of a large squeegee. He selects the colors on the squeegee, but the factual trace that the paint leaves on the canvas is to a large extent the outcome of chance.

[…]

Richter’s concise title, Cage, can be unfolded into an extensive interpretation of these abstract paintings (and of other works)—but, one can say, the short form already contains everything. The title, like an explanation of a phenomenon, unlocks the works, describing their relation to one of the most important cultural figures of the twentieth century, John Cage, who shares with Richter the great themes of chance and uncertainty.

Writer, artist, and designer Douglas Coupland, whose biography of Marshall McLuhan remains indispensable, offers a lyrical meditation on the peculiar odds behind coincidences and déjà vus:

I take comfort in the fact that there are two human moments that seem to be doled out equally and democratically within the human condition—and that there is no satisfying ultimate explanation for either. One is coincidence, the other is déjà vu. It doesn’t matter if you’re Queen Elizabeth, one of the thirty-three miners rescued in Chile, a South Korean housewife or a migrant herder in Zimbabwe—in the span of 365 days you will pretty much have two déjà vus as well as one coincidence that makes you stop and say, “Wow, that was a coincidence.”

The thing about coincidence is that when you imagine the umpteen trillions of coincidences that can happen at any given moment, the fact is, that in practice, coincidences almost never do occur. Coincidences are actually so rare that when they do occur they are, in fact, memorable. This suggests to me that the universe is designed to ward off coincidence whenever possible—the universe hates coincidence—I don’t know why—it just seems to be true. So when a coincidence happens, that coincidence had to work awfully hard to escape the system. There’s a message there. What is it? Look. Look harder. Mathematicians perhaps have a theorem for this, and if they do, it might, by default, be a theorem for something larger than what they think it is.

What’s both eerie and interesting to me about déjà vus is that they occur almost like metronomes throughout our lives, about one every six months, a poetic timekeeping device that, at the very least, reminds us we are alive. I can safely assume that my thirteen-year-old niece, Stephen Hawking and someone working in a Beijing luggage-making factory each experience two déjà vus a year. Not one. Not three. Two.

The underlying biodynamics of déjà vus is probably ascribable to some sort of tingling neurons in a certain part of the brain, yet this doesn’t tell us why they exist. They seem to me to be a signal from a larger point of view that wants to remind us that our lives are distinct, that they have meaning, and that they occur throughout a span of time. We are important, and what makes us valuable to the universe is our sentience and our curse and blessing of perpetual self-awareness.

MIT social scientist Sherry Turkle, author of the cyber-dystopian Alone Together: Why We Expect More from Technology and Less from Each Other, considers the role of “transitional objects” in our relationship with technology:

I was a student in psychology in the mid-1970s at Harvard University. The grand experiment that had been “Social Relations” at Harvard had just crumbled. Its ambition had been to bring together the social sciences in one department, indeed, most in one building, William James Hall. Clinical psychology, experimental psychology, physical and cultural anthropology, and sociology, all of these would be in close quarters and intense conversation.

But now, everyone was back in their own department, on their own floor. From my point of view, what was most difficult was that the people who studied thinking were on one floor and the people who studied feeling were on another.

In this Balkanized world, I took a course with George Goethals in which we learned about the passion in thought and the logical structure behind passion. Goethals, a psychologist who specialized in adolescence, was teaching a graduate seminar in psychoanalysis. … Several classes were devoted to the work of David Winnicott and his notion of the transitional object. Winnicott called transitional the objects of childhood—the stuffed animals, the bits of silk from a baby blanket, the favorite pillows—that the child experiences as both part of the self and of external reality. Winnicott writes that such objects mediate between the child’s sense of connection to the body of the mother and a growing recognition that he or she is a separate being. The transitional objects of the nursery—all of these are destined to be abandoned. Yet, says Winnicott, they leave traces that will mark the rest of life. Specifically, they influence how easily an individual develops a capacity for joy, aesthetic experience, and creative playfulness. Transitional objects, with their joint allegiance to self and other, demonstrate to the child that objects in the external world can be loved.

[…]

Winnicott believes that during all stages of life we continue to search for objects we experience as both within and outside the self. We give up the baby blanket, but we continue to search for the feeling of oneness it provided. We find them in moments of feeling “at one” with the world, what Freud called the “oceanic feeling.” We find these moments when we are at one with a piece of art, a vista in nature, a sexual experience.

As a scientific proposition, the theory of the transitional object has its limitations. But as a way of thinking about connection, it provides a powerful tool for thought. Most specifically, it offered me a way to begin to understand the new relationships that people were beginning to form with computers, something I began to study in the late 1970s and early 1980s. From the very beginning, as I began to study the nascent digital culture, I could see that computers were not “just tools.” They were intimate machines. People experienced them as part of the self, separate but connected to the self.

[…]

When in the late 1970s, I began to study the computer’s special evocative power, my time with George Goethals and the small circle of Harvard graduate students immersed in Winnicott came back to me. Computers served as transitional objects. They bring us back to the feelings of being “at one” with the world. Musicians often hear the music in their minds before they play it, experiencing the music from within and without. The computer similarly can be experienced as an object on the border between self and not-self. Just as musical instruments can be extensions of the mind’s construction of sound, computers can be extensions of the mind’s construction of thought.

This way of thinking about the computer as an evocative object puts us on the inside of a new inside joke. For when psychoanalysts talked about object relations, they had always been talking about people. From the beginning, people saw computers as “almost-alive” or “sort of alive.” With the computer, object relations psychoanalysis can be applied to, well, objects. People feel at one with video games, with lines of computer code, with the avatars they play in virtual worlds, with their smartphones. Classical transitional objects are meant to be abandoned, their power recovered in moments of heightened experience. When our current digital devices—our smartphones and cellphones—take on the power of transitional objects, a new psychology comes into play. These digital objects are never meant to be abandoned. We are meant to become cyborg.

Anthropologist Scott Atran considers the role of the absurd in religion and cause-worship, and the Beckett-like notion of the “ineffable”:

The notion of a transcendent force that moves the universe or history or determines what is right and good—and whose existence is fundamentally beyond reason and immune to logical or empirical disproof—is the simplest, most elegant, and most scientifically baffling phenomenon I know of. Its power and absurdity perturbs mightily, and merits careful scientific scrutiny. In an age where many of the most volatile and seemingly intractable conflicts stem from sacred causes, scientific understanding of how to best deal with the subject has also never been more critical.

Call it love of Group or God, or devotion to an Idea or Cause, it matters little in the end. This is “the privilege of absurdity; to which no living creature is subject, but man only” of which Hobbes wrote in Leviathan. In The Descent of Man, Darwin cast it as the virtue of “morality,” with which winning tribes are better endowed in history’s spiraling competition for survival and dominance. Unlike other creatures, humans define the groups to which they belong in abstract terms. Often they strive to achieve a lasting intellectual and emotional bonding with anonymous others, and seek to heroically kill and die, not in order to preserve their own lives or those of people they know, but for the sake of an idea—the conception they have formed of themselves, of “who we are.”

[…]

There is an apparent paradox that underlies the formation of large-scale human societies. The religious and ideological rise of civilizations—of larger and larger agglomerations of genetic strangers, including today’s nations, transnational movements, and other “imagined communities” of fictive kin — seem to depend upon what Kierkegaard deemed this “power of the preposterous” … Humankind’s strongest social bonds and actions, including the capacity for cooperation and forgiveness, and for killing and allowing oneself to be killed, are born of commitment to causes and courses of action that are “ineffable,” that is, fundamentally immune to logical assessment for consistency and to empirical evaluation for costs and consequences. The more materially inexplicable one’s devotion and commitment to a sacred cause — that is, the more absurd—the greater the trust others place in it and the more that trust generates commitment on their part.

[…]

Religion and the sacred, banned so long from reasoned inquiry by ideological bias of all persuasions—perhaps because the subject is so close to who we want or don’t want to be — is still a vast, tangled and largely unexplored domain for science, however simple and elegant for most people everywhere in everyday life.

Psychologist Timothy Wilson, author of the excellent Redirect: The Surprising New Science of Psychological Change, explores the Möbius loop of self-perception and behavior:

My favorite is the idea that people become what they do. This explanation of how people acquire attitudes and traits dates back to the philosopher Gilbert Ryle, but was formalized by the social psychologist Daryl Bem in his self-perception theory. People draw inferences about who they are, Bem suggested, by observing their own behavior.

Self-perception theory turns common wisdom on its head. … Hundreds of experiments have confirmed the theory and shown when this self-inference process is most likely to operate (e.g., when people believe they freely chose to behave the way they did, and when they weren’t sure at the outset how they felt).

Self-perception theory is elegant in its simplicity. But it is also quite deep, with important implications for the nature of the human mind. Two other powerful ideas follow from it. The first is that we are strangers to ourselves. After all, if we knew our own minds, why would we need to guess what our preferences are from our behavior? If our minds were an open book, we would know exactly how honest we are and how much we like lattes. Instead, we often need to look to our behavior to figure out who we are. Self-perception theory thus anticipated the revolution in psychology in the study of human consciousness, a revolution that revealed the limits of introspection.

But it turns out that we don’t just use our behavior to reveal our dispositions—we infer dispositions that weren’t there before. Often, our behavior is shaped by subtle pressures around us, but we fail to recognize those pressures. As a result, we mistakenly believe that our behavior emanated from some inner disposition.

Harvard physician and social scientist Nicholas Christakis, who also appeared in Brockman’s Culture: Leading Scientists Explore Societies, Art, Power, and Technology, offers one of the most poetic answers, tracing the history of our understanding of why the sky is blue:

My favorite explanation is one that I sought as a boy. It is the explanation for why the sky is blue. It’s a question every toddler asks, but it is also one that most great scientists since the time of Aristotle, including da Vinci, Newton, Kepler, Descartes, Euler, and even Einstein, have asked.

One of the things I like most about this explanation—beyond the simplicity and overtness of the question itself—is how long it took to arrive at it correctly, how many centuries of effort, and how many branches of science it involves.

Aristotle is the first, so far as we know, to ask the question about why the sky is blue, in the treatise On Colors; his answer is that the air close at hand is clear and the deep air of the sky is blue the same way a thin layer of water is clear but a deep well of water looks black. This idea was still being echoed in the 13th century by Roger Bacon. Kepler too reinvented a similar explanation, arguing that the air merely looks colorless because the tint of its color is so faint when in a thin layer. But none of them offered an explanation for the blueness of the atmosphere. So the question actually has two, related parts: why the sky has any color, and why it has a blue color.

[…]

The sky is blue because the incident light interacts with the gas molecules in the air in such a fashion that more of the light in the blue part of the spectrum is scattered, reaching our eyes on the surface of the planet. All the frequencies of the incident light can be scattered this way, but the high-frequency (short wavelength) blue is scattered more than the lower frequencies in a process known as Rayleigh scattering, described in the 1870s. John William Strutt, Lord Rayleigh, who also won the Nobel Prize in physics in 1904 for the discovery of argon, demonstrated that, when the wavelength of the light is on the same order as the size of the gas molecules, the intensity of scattered light varies inversely with the fourth power of its wavelength. Shorter wavelengths like blue (and violet) are scattered more than longer ones. It’s as if all the molecules in the air preferentially glow blue, which is what we then see everywhere around us.

Yet, the sky should appear violet since violet light is scattered even more than blue light. But the sky does not appear violet to us because of the final, biological part of the puzzle, which is the way our eyes are designed: they are more sensitive to blue than violet light.

The explanation for why the sky is blue involves so much of the natural sciences: the colors within the visual spectrum, the wave nature of light, the angle at which sunlight hits the atmosphere, the mathematics of scattering, the size of nitrogen and oxygen molecules, and even the way human eyes perceive color. It’s most of science in a question that a young child can ask.
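
The inverse-fourth-power relationship Christakis cites makes the blue-versus-violet point easy to check with a little arithmetic. A minimal sketch, using rough textbook wavelengths rather than figures from the essay:

```python
# Rayleigh scattering scales as 1/wavelength^4: how much more do short wavelengths scatter?
violet, blue, red = 400e-9, 450e-9, 700e-9  # approximate wavelengths in meters

def relative_scattering(wavelength, reference=red):
    """Scattered intensity relative to the reference wavelength (I ~ 1/lambda^4)."""
    return (reference / wavelength) ** 4

print(f"blue vs. red:   {relative_scattering(blue):.1f}x")    # roughly 6x
print(f"violet vs. red: {relative_scattering(violet):.1f}x")  # roughly 9x, yet the sky looks
                                                               # blue because our eyes favor blue
```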

Nature editor-in-chief Philip Campbell considers the beauty of a sunrise, echoing Richard Feynman’s thoughts on science and mystery and Denis Dutton’s evolutionary theory of beauty:

Scientific understanding enhances rather than destroys nature’s beauty. All of these explanations for me contribute to the beauty in a sunrise.

Ah, but what is the explanation of beauty? Brain scientists grapple with nuclear-magnetic resonance images—a recent meta-analysis indicated that all of our aesthetic judgements seem to include the use of neural circuits in the right anterior insula, an area of the cerebral cortex typically associated with visceral perception. Perhaps our sense of beauty is a by-product of the evolutionary maintenance of the senses of belonging and of disgust. For what it’s worth, as exoplanets pour out of our telescopes, I believe that we will encounter astrochemical evidence for some form of extraterrestrial organism well before we achieve a deep, elegant or beautiful explanation of human aesthetics.

But my favorite essay comes from social media researcher and general genius Clay Shirky, author of Cognitive Surplus: Creativity and Generosity in a Connected Age, who considers the propagation of ideas in culture and the problems with Richard Dawkins’s notion of the meme in a context of combinatorial creativity:

Something happens to keep one group of people behaving in a certain set of ways. In the early 1970s, both E.O. Wilson and Richard Dawkins noticed that the flow of ideas in a culture exhibited similar patterns to the flow of genes in a species—high flow within the group, but sharply reduced flow between groups. Dawkins’ response was to assume a hypothetical unit of culture called the meme, though he also made its problems clear—with genetic material, perfect replication is the norm, and mutations rare. With culture, it is the opposite — events are misremembered and then misdescribed, quotes are mangled, even jokes (pure meme) vary from telling to telling. The gene/meme comparison remained, for a generation, an evocative idea of not much analytic utility.

Dan Sperber has, to my eye, cracked this problem. In a slim, elegant volume of 15 years ago with the modest title Explaining Culture, he outlined a theory of culture as the residue of the epidemic spread of ideas. In this model, there is no meme, no unit of culture separate from the blooming, buzzing confusion of transactions. Instead, all cultural transmission can be reduced to one of two types: making a mental representation public, or internalizing a mental version of a public presentation. As Sperber puts it, “Culture is the precipitate of cognition and communication in a human population.”

Sperber’s two primitives—externalization of ideas, internalization of expressions—give us a way to think of culture not as a big container people inhabit, but rather as a network whose traces, drawn carefully, let us ask how the behaviors of individuals create larger, longer-lived patterns. Some public representations are consistently learned and then re-expressed and re-learned—Mother Goose rhymes, tartan patterns, and peer review have all survived for centuries. Others move from ubiquitous to marginal in a matter of years. . . .

[…]

This is what is so powerful about Sperber’s idea: culture is a giant, asynchronous network of replication, ideas turning into expressions which turn into other, related ideas. … Sperber’s idea also suggests increased access to public presentation of ideas will increase the dynamic range of culture overall.

Characteristically thought-provoking and reliably cross-disciplinary, This Explains Everything is a must-read in its entirety.

04 DECEMBER, 2012

The 10 Best Psychology and Philosophy Books of 2012

From Buddhism to the relationship between creativity and dishonesty, by way of storytelling and habit.

After the best science books, art books, and design books of 2012, the season’s subjective selection of best-of reading lists continues with the most stimulating philosophy, psychology, and creativity books published this year. (Catch up on last year’s roundup here.)

THIS WILL MAKE YOU SMARTER

Every year for more than a decade, intellectual impresario and Edge editor John Brockman has been asking the era’s greatest thinkers a single annual question, designed to illuminate some important aspect of how we understand the world. In 2010, he asked how the Internet is changing the way we think. In 2011, with the help of psycholinguist Steven Pinker and legendary psychologist Daniel Kahneman, he posed an even grander question: “What scientific concept will improve everybody’s cognitive toolkit?” The answers, featuring a wealth of influential scientists, authors, and thought-architects, were collected in This Will Make You Smarter: New Scientific Concepts to Improve Your Thinking (public library) — a formidable anthology of short essays by 151 of our time’s biggest thinkers on subjects as diverse as the power of networks, cognitive humility, the paradoxes of daydreaming, information flow, collective intelligence, and a dizzying, mind-expanding range in between. Together, they construct a powerful toolkit of meta-cognition — a new way to think about thinking itself.

Brockman prefaces the essays with an important definition that captures the dimensionality of “science”:

Here, the term ‘scientific’ is to be understood in a broad sense — as the most reliable way of gaining knowledge about anything, whether it be human behavior, corporate behavior, the fate of the planet, or the future of the universe. A ‘scientific concept’ may come from philosophy, logic, economics, jurisprudence, or any other analytic enterprise, as long as it is a rigorous tool that can be summed up succinctly but has broad application to understanding the world.

Neuroscientist David Eagleman, author of the excellent Incognito: The Secret Lives of the Brain, explores the concept of “the umwelt” coined by biologist Jakob von Uexküll in 1909 — the idea that different animals in the same ecosystem pick up on different elements of their environment and thus live in different micro-realities based on the subset of the world they’re able to detect. Eagleman stresses the importance of recognizing our own umwelt — our unawareness of the limits of our awareness:

I think it would be useful if the concept of the umwelt were embedded in the public lexicon. It neatly captures the idea of limited knowledge, of unobtainable information, and of unimagined possibilities. Consider the criticisms of policy, the assertions of dogma, the declarations of fact that you hear every day — and just imagine if all of these could be infused with the proper intellectual humility that comes from appreciating the amount unseen.

Nobel laureate Daniel Kahneman, who authored one of the best psychology books of 2011, contemplates the “focusing illusion” — our tendency to misjudge the scale of impact that certain circumstances, from a pay raise to the death of a loved one, will have on our actual well-being:

Marketers exploit the focusing illusion. When people are induced to believe that they “must have” a good, they greatly exaggerate the difference that the good will make to the quality of their life. The focusing illusion is greater for some goods than for others, depending on the extent to which the goods attract continued attention over time. The focusing illusion is likely to be more significant for leather car seats than for books on tape.

Politicians are almost as good as marketers in causing people to exaggerate the importance of issues on which their attention is focused. People can be made to believe that school uniforms will significantly improve educational outcomes, or that health care reform will hugely change the quality of life in the United States — either for the better or for the worse. Health care reform will make a difference, but the difference will be smaller than it appears when you focus on it.

Martin Seligman, father of positive psychology, writes about PERMA, the five pillars of well-being — Positive Emotion, Engagement, Positive Relationships, Meaning and Purpose, and Accomplishment — reminding us that reducing disabling conditions like poverty, disease, depression, aggression, and ignorance is only one half of the life satisfaction equation:

Science and public policy have traditionally been focused solely on remediating the disabling conditions, but PERMA suggests that this is insufficient. If we want global well-being, we should also measure and try to build PERMA. The very same principle seems to be true in your own life: if you wish to flourish personally, getting rid of depression, anxiety, and anger and getting rich is not enough, you also need to build PERMA directly.

Biological anthropologist Helen Fisher, who has previously examined the neurochemistry of love and desire, zooms in on the temperament as the essential building block of the self:

Personality is composed of two fundamentally different types of traits: those of ‘character;’ and those of ‘temperament.’ Your character traits stem from your experiences. Your childhood games; your family’s interests and values; how people in your community express love and hate; what relatives and friends regard as courteous or perilous; how those around you worship; what they sing; when they laugh; how they make a living and relax: innumerable cultural forces build your unique set of character traits. The balance of your personality is your temperament, all the biologically based tendencies that contribute to your consistent patterns of feeling, thinking and behaving. As Spanish philosopher, Jose Ortega y Gasset, put it, ‘I am, plus my circumstances.’ Temperament is the ‘I am,’ the foundation of who you are.

Theoretical physicist Carlo Rovelli reminds us that uncertainty and the willingness to be proven wrong are a vital part of intellectual, and I dare add personal, growth:

The very foundation of science is to keep the door open to doubt. Precisely because we keep questioning everything, especially our own premises, we are always ready to improve our knowledge. Therefore a good scientist is never ‘certain’. Lack of certainty is precisely what makes conclusions more reliable than the conclusions of those who are certain: because the good scientist will be ready to shift to a different point of view if better elements of evidence, or novel arguments emerge. Therefore certainty is not only something of no use, but is in fact damaging, if we value reliability.

But my favorite comes from curator extraordinaire Hans Ulrich Obrist:

Lately, the word “curate” seems to be used in a greater variety of contexts than ever before, in reference to everything from exhibitions of prints by Old Masters to the contents of a concept store. The risk, of course, is that the definition may expand beyond functional usability. But I believe ‘curate’ finds ever-wider application because of a feature of modern life that is impossible to ignore: the incredible proliferation of ideas, information, images, disciplinary knowledge, and material products that we are all witnessing today. Such proliferation makes the activities of filtering, enabling, synthesizing, framing, and remembering more and more important as basic navigational tools for 21st century life. These are the tasks of the curator, who is no longer understood as simply the person who fills a space with objects but as the person who brings different cultural spheres into contact, invents new display features, and makes junctions that allow unexpected encounters and results.

[…]

To curate, in this sense, is to refuse static arrangements and permanent alignments and instead to enable conversations and relations. Generating these kinds of links is an essential part of what it means to curate, as is disseminating new knowledge, new thinking, and new artworks in a way that can seed future cross-disciplinary inspirations. But there is another case for curating as a vanguard activity for the 21st century.

As the artist Tino Sehgal has pointed out, modern human societies find themselves today in an unprecedented situation: the problem of lack, or scarcity, which has been the primary factor motivating scientific and technological innovation, is now being joined and even superseded by the problem of the global effects of overproduction and resource use. Thus moving beyond the object as the locus of meaning has a further relevance. Selection, presentation, and conversation are ways for human beings to create and exchange real value, without dependence on older, unsustainable processes. Curating can take the lead in pointing us towards this crucial importance of choosing.

The true gift of This Will Make You Smarter — of Brockman — is in acting as a potent rupture in the filter bubble of our curiosity, cross-pollinating ideas across a multitude of disciplines to broaden our intellectual comfort zones and, in the process, spark a deeper, richer, more dimensional understanding not only of science, but of life itself.

The text of the answers is also available online in its entirety.

Originally featured, with more excerpts, in February.

TINY BEAUTIFUL THINGS

When an anonymous advice columnist by the name of “Dear Sugar” introduced herself on The Rumpus on March 11, 2010, she made her proposition clear: a “by-the-book common sense of Dear Abby and the earnest spiritual cheesiness of Cary Tennis and the butt-pluggy irreverence of Dan Savage and the closeted Upper East Side nymphomania of Miss Manners.” But in the two-some years that followed, she proceeded to deliver something tenfold punchier, more honest, more existentially profound than even such an intelligently irreverent promise could foretell. This year, all of Sugar’s no-bullshit, wholehearted wisdom on life’s trickiest contexts — sometimes the simplest, sometimes the most complex, always the most deeply human — was released in Tiny Beautiful Things: Advice on Love and Life from Dear Sugar (public library), along with several never-before-published columns, under Sugar’s real name: Cheryl Strayed.

The book is titled after Dear Sugar #64, which remains my own favorite by a long stretch. It’s exquisite in its entirety, but this particular bit makes the heart tremble with raw heartness:

Your assumptions about the lives of others are in direct relation to your naïve pomposity. Many people you believe to be rich are not rich. Many people you think have it easy worked hard for what they got. Many people who seem to be gliding right along have suffered and are suffering. Many people who appear to you to be old and stupidly saddled down with kids and cars and houses were once every bit as hip and pompous as you.

When you meet a man in the doorway of a Mexican restaurant who later kisses you while explaining that this kiss doesn’t ‘mean anything’ because, much as he likes you, he is not interested in having a relationship with you or anyone right now, just laugh and kiss him back. Your daughter will have his sense of humor. Your son will have his eyes.

The useless days will add up to something. The shitty waitressing jobs. The hours writing in your journal. The long meandering walks. The hours reading poetry and story collections and novels and dead people’s diaries and wondering about sex and God and whether you should shave under your arms or not. These things are your becoming.

One Christmas at the very beginning of your twenties when your mother gives you a warm coat that she saved for months to buy, don’t look at her skeptically after she tells you she thought the coat was perfect for you. Don’t hold it up and say it’s longer than you like your coats to be and too puffy and possibly even too warm. Your mother will be dead by spring. That coat will be the last gift she gave you. You will regret the small thing you didn’t say for the rest of your life.

Say thank you.

In the introduction, Steve Almond, who once attempted to be Sugar before there was Sugar, captures precisely what makes Sugar Sugar:

The column that launched Sugar as a phenomenon was written in response to what would have been, for anyone else, a throwaway letter. Dear Sugar, wrote a presumably young man. WTF? WTF? WTF? I’m asking this question as it applies to everything every day. Cheryl’s reply began as follows:

Dear WTF,

My father’s father made me jack him off when I was three and four and five. I wasn’t good at it. My hands were too small and I couldn’t get the rhythm right and I didn’t understand what I was doing. I only knew I didn’t want to do it. Knew it made me feel miserable and anxious in a way so sickeningly particular that I can feel the same particular sickness rising this very minute in my throat.

It was an absolutely unprecedented moment. Advice columnists, after all, adhere to an unspoken code: focus on the letter writer, dispense all necessary bromides, make it all seem bearable. Disclosing your own sexual assault is not part of the code.

But Cheryl wasn’t just trying to shock some callow kid into greater compassion. She was announcing the nature of her mission as Sugar. Inexplicable sorrows await all of us. That was her essential point. Life isn’t some narcissistic game you play online. It all matters — every sin, every regret, every affliction. As proof, she offered an account of her own struggle to reckon with a cruelty she’d absorbed before she was old enough to even understand it. Ask better questions, sweet pea, she concluded. The fuck is your life. Answer it.

Originally featured in July.

WHERE THE HEART BEATS

“Good music can act as a guide to good living,” John Cage (1912-1992) once said. But what, exactly, is good music, or good living, or, for that matter, goodness itself?

In Where the Heart Beats: John Cage, Zen Buddhism, and the Inner Life of Artists (public library), longtime art critic and practicing Buddhist Kay Larson constructs a remarkable intellectual, creative, and spiritual biography of Cage — one of the most influential composers in modern history, whose impact reaches beyond the realm of music and into art, literature, cinema, and just about every other aesthetic and conceptual expression of curiosity about the world, yet also one of history’s most misunderstood artists. Fifteen years in the making, it is without a doubt the richest, most stimulating, most absorbing book I’ve read in ages — superbly researched, exquisitely written, weaving together a great many threads of cultural history into a holistic understanding of both Cage as an artist and Zen as a lens on existence.

From his early life in California, defined by his investigations into the joy of sound, to his pivotal introduction to Zen Buddhism in Japanese Zen master D. T. Suzuki’s Columbia University class, to his blossoming into a force of the mid-century avant-garde, Larson traces Cage’s own journey as an artist and a soul, as well as his intermeshing with the journeys of other celebrated artists, including Marcel Duchamp, Jasper Johns, Yoko Ono, Robert Rauschenberg, Jackson Pollock, and, most importantly, Merce Cunningham.

The book itself has a beautiful compositional structure, conceived as a conversation with Cage and modeled after Cage’s imagined conversations with Erik Satie, one of his mentors, long after Satie’s death. Interspersed in Larson’s immersive narrative are italicized excerpts of Cage’s own writing, in his own voice.

Where to begin? Perhaps at the core — the core of what Cage has come to be known for: that expansive negative space, which isn’t nihilistic, isn’t an absence, but, rather, is life-affirming, a presence. Cage himself reflects:

Our intention is to affirm this life, not to bring order out of chaos, nor to suggest improvements in creation, but simply to wake up to the very life we’re living, which is so excellent once one gets one’s mind and desires out of its way and lets it act of its own accord.

Xenia Kashevaroff

Image courtesy of the Metropolitan Museum of Art

In his early life, however, Cage was rather unable to get his “mind and desires out of the way,” leading himself into a spiral of inner turmoil. While engaged in a relationship with a man named Don Sample, he met artist Xenia Kashevaroff, the Alaskan-born daughter of a Russian priest, and quickly fell in love. The two got married and, for a while, Cage was able to appease his dissonance about his affair with Sample. But rather than gaining deeper self-knowledge, he seemed to steer further away from himself. Perhaps that’s what prompted him, sixty years later, to admonish:

I’m entirely opposed to emotions….I really am. I think of love as an opportunity to become blind and blind in a bad way….I think that seeing and hearing are extremely important; in my view they are what life is; love makes us blind to seeing and hearing.

By the 1940s, Cage’s relationship with Xenia had begun to unravel. When the two eventually divorced in 1945, Cage’s identity was thrown into turbulence. His work followed faithfully, as he set out to compose Ophelia (1946), a “two-tone poem to madness” based on Shakespeare. Larson writes:

Margaret Leng Tan asked Cage why his portrait of Ophelia is so much harsher than Shakespeare’s. She recorded his reply that ‘all madness is inherently violent, even when it is not directed towards others, for it invariably ravages the sufferer internally.’

Cage and Cunningham, circa 1948, as Cage's confusion and despair began to lift. In this classic image, taken at Black Mountain College, the perfection of their partnering seems a force of nature. Why did Cage struggle at first?

Image courtesy of John Cage Trust / Penguin

Soon, Cage began the decades-long romance with the love of his life, dancer and choreographer Merce Cunningham, which would last until the end of Cage’s life and bequeath some of the most magical collaborations in the history of 20th-century art. Around the same time, Cage began the other essential relationship of his life — that with Zen Buddhism.

Hardly anywhere does Larson’s gift for prose and grasp of the human condition shine more beautifully than in this passage articulating the profound, uncomfortable transformation that love sets in motion:

Caught in the roar of his emotions, Cage was forced to confront a question totally new to him: What is the ‘self’ that is being expressed? The self that hurts so badly it nearly kills you? The self that isn’t seen until it aches?

When Cage and Cunningham met, perhaps they felt a tremor of gravitational shift. It might have been small at first, or the shiver might have been so insistent it rattled them. Whatever the case, something evidently stirred between the two men before they came to New York. But maybe nothing was spoken.

So it is with the places preparing to teach us. It’s only when the heart begins to beat wildly and without pattern — when it begins to realize its boundlessness — that its newly adamant pulse bangs on the walls of its cage and is bruised by its enclosure.

To feel the heart pound is only the beginning. Next is to feel the hurt — the tearing of the psyche — the prelude of entry into the place one has always feared. One fears that place because of being drawn to it, loving it, and wanting to be taught by it. Without the need to be taught, who would feel the psyche rip?…. Without the bruise, who would know where the walls are?

Tying it back to Cage himself, Larson writes:

Bruised and bloodied by throwing himself against the four walls of his enclosure, and deeply shaken by his shrieking emotions, Cage stopped pacing his confinement and realized that his container had no roof. Looking up, he could see the sky. Fascinated, he set out to explore this new dimension.

What he found was a language of silence and immanence.

In 1964, John Cage was fifty-two years old and had been partnering with Merce Cunningham for two decades. The two men's bright confidence in 1948 has shifted to something calmer: the settled assurance of the bond between them -- one of the great redeeming love affairs in the history of the American arts -- which would endure until their deaths.

Image courtesy of John Cage Trust / Penguin

This Cageian inquisitiveness was indeed fundamental to both his personal life and his approach to music — an ethos reminiscent of Rilke’s counsel to live the questions. Cage:

What can be analyzed in my work, or criticized, are the questions.

One remarkable aspect of Cage’s music, derived from his close study of Indian traditions, was the notion of “disinterestedness” — which is not to be confused with “indifference.” Larson distinguishes:

From the standpoint of spiritual practice, the two words have nothing in common. Indifference borders on nihilism. It has a quality of ‘not caring.’ It is ‘apathetic.’ It expresses corrosive cynicism. Ultimately, it is poisonous, both to the practitioner and to the culture as a whole.

Disinterestedness, on the contrary, ‘is unbiased by personal interest or advantage; not influenced by selfish motives,’ according to the Random House Dictionary (1971). Disinterestedness is the natural outcome of meditation on the self and recognition of its lack of substance — then what can trouble you? Freeing one’s mind from the grip of the self leads to spiritual ease — being at home in your own skin, free of self-attachment, cured of likes and dislikes, afloat in rasa. It’s how you open your ears to the music of the world.

Cage defined disinterestedness and equated it with ‘love’ in 1948:

‘If one makes music, as the Orient would say, disinterestedly, that is, without concern for money or fame but simply for the love of making it, it is an integrating activity and one will find moments in his life that are complete and fulfilled.’

(This sentiment regarding purpose and doing what you love would come to be articulated by many other creators over the decades to come.)

Echoing something Jackson Pollock’s dad once wrote to his son in one of history’s finest letters, Cage advises:

Look at everything. Don’t close your eyes to the world around you. Look and become curious and interested in what there is to see.

Cage has become 'the man of the great smile, the outgoing laugh,' his friend Peter Yates remembered. 'Around him everyone laughs.'

Larson concludes with a beautiful metaphor for both Zen Buddhism and Cage’s legacy, reflecting on artist Bruce Nauman’s show Mapping the Studio I (Fat Chance John Cage), which was spurred by Nauman’s discovery that he had mice in his studio:

In the studio, things happen by chance. A mouse runs by. A moth flitters through space. These ‘chance events’ are random and filled with non-intention — the buzz of small creatures, caught on film, in the midst of their busy eventful lives. As far as a mouse is concerned, its life is the center of the universe. By watching through the neutral eye of the camera, we are able to see what we might not glimpse otherwise: that a ‘silent’ space is an invisible game of billiards played by beings, each at its own center, each responding to all other beings. The mice, dashing here and there, are playing out their expectations about the cat. Life fills the gaps.

There are absolutely no metaphors, just observations.

[…]

The artist maps reality. That’s the cat-and-mouse game between the artist and the world. And it’s not just the artist who plays it. Each of us is in a cat-and-mouse game with our perceptual life. Do we really see ourselves? Or do we see only what obtrudes in daylight? Do we crash through our nightlife, scattering the subtle things that abide there? Or do we simply watch without judgment, in the expectation of learning something?

Originally featured at length in July.

AS CONSCIOUSNESS IS HARNESSED TO FLESH

It’s no secret Susan Sontag’s journals have been on heavy rotation here this year. As Consciousness Is Harnessed to Flesh: Journals and Notebooks, 1964-1980 (public library), the second published volume of her diaries, offers an intimate glimpse of the inner life of a woman celebrated as one of the twentieth century’s most remarkable intellectuals, yet one who felt as deeply and intensely as she thought. Oscillating between conviction and insecurity in the most beautifully imperfect and human way possible, Sontag details everything from her formidable media diet of literature and film to her intense love affairs and infatuations to her meditations on society’s values and vices. The tome includes her insights on art, love, writing, censorship, and boredom, along with a wealth of aphorisms.

Nothing is mysterious, no human relation. Except love.

BREAKTHROUGH

What extraordinary energy we expend, as a culture and a civilization, on trying to understand where good ideas come from, how creativity works, its secrets, its origins, its mechanisms, and the five-step action plan for coaxing it into manifestation. And little compares to the anguish that comes with the blockage of creative flow.

In 2010, designer and musician Alex Cornell found himself stumped by a creative block while trying to write an article about creative block. Deterred neither by the block nor by the irony, he reached out to some of his favorite artists and asked them for their coping strategies in such an event. The response was overwhelming in both volume and depth, inspiring Cornell to put together a collection on the subject. The result is Breakthrough!: 90 Proven Strategies to Overcome Creative Block and Spark Your Imagination (public library) — a small but potent compendium of field-tested, life-approved insight on optimizing the creative process from some of today’s most exciting artists, designers, illustrators, writers, and thinkers. From the many specific strategies — walks in nature, porn, destruction of technology, weeping — a few powerful universals emerge, including the role of procrastination, the importance of a gestation period for ideas, and, above all, the reminder that the “creative block” befalls everyone indiscriminately.

Writer Michael Erard teases apart “creative block” and debunks its very premise with an emphasis on creativity as transformation:

First of all, being creative is not summoning stuff ex nihilo. It’s work, plain and simple — adding something to some other thing or transforming something. In the work that I do, as a writer and a metaphor designer, there’s always a way to get something to do something to do something else. No one talks about work block.

Also, block implies a hydraulic metaphor of thinking. Thoughts flow. Difficulty thinking represents impeded flow. This interpretation also suggests a single channel for that flow. A stopped pipe. A dammed river. If you only have one channel, one conduit, then you’re vulnerable to blockage. Trying to solve creative block, I imagine a kind of psychic Roto-Rootering.

My conceptual scheme is more about the temperature of things: I try to find out what’s hot and start there, even if it may be unrelated to what I need to be working on, and most of the time, that heats up other areas too. You can solve a lot with a new conceptual frame.

Designer Sam Potts suggests that heartbreak isn’t merely an adaptive evolutionary strategy, it’s a creative one:

Have your heart broken. It worked for Rei Kawakubo. You’ll realize the work you’d been doing wasn’t anywhere near your potential.

The inimitable Debbie Millman has kindly offered this hand-lettered version of the typeset list in the book:

Writer Douglas Rushkoff rebels:

I don’t believe in writer’s block.

Yes, there may have been days or even weeks at a time when I have not written — even when I may have wanted to — but that doesn’t mean I was blocked. It simply means I was in the wrong place at the wrong time. Or, as I’d like to argue, exactly the right place at the right time.

The creative process has more than one kind of expression. There’s the part you could show in a movie montage — the furious typing or painting or equation solving where the writer, artist, or mathematician accomplishes the output of the creative task. But then there’s also the part that happens invisibly, under the surface. That’s when the senses are perceiving the world, the mind and heart are thrown into some sort of dissonance, and the soul chooses to respond.

That response doesn’t just come out like vomit after a bad meal. There’s no such thing as pure expression. Rather, because we live in a social world with other people whose perceptual apparatus needs to be penetrated with our ideas, we must formulate, strategize, order, and then articulate. It is that last part that is visible as output or progress, but it only represents, at best, 25 percent of the process.

Real creativity transcends time. If you are not producing work, then chances are you have fallen into the infinite space between the ticks of the clock where reality is created. Don’t let some capitalist taskmaster tell you otherwise — even if he happens to be in your own head.

Musician Jamie Lidell echoes Tchaikovsky:

Cheers. Watcha gonna do with a blocked toilet? I mean, that’s all it is, right? A bung that needs pulling to let the clear waters of inspiration flow.

Maybe. Or maybe it just takes showing up. Going back again and again to write or paint or sing or cook.

Some days the genius will be in you, and you will sail. Other days the lead will line the slippers, and you’ll be staring into the void of your so-called creative mind, feeling like a fraud. It’s all part of the big ole cycle of creativity, and it’s a healthy cycle at that.

Philosopher Daniel Dennett has a special term for his method:

My strategy for getting myself out of a rut is to sit at my desk reminding myself of what the problem is, reviewing my notes, generally filling my head with the issues and terms, and then I just get up and go do something relatively mindless and repetitive. At our farm in the summer, I paint the barn or mow the hayfield or pick berries or cut firewood to length…. I don’t even try to think about the problem, but more often than not, at some point in the middle of the not very challenging activity, I’ll find myself mulling it over and coming up with a new slant, a new way of tackling the issue, maybe just a new term to use. Engaging my brain with something else to control and think about helps melt down the blockades that have been preventing me from making progress, freeing up the circuits for some new paths. My strategy could hardly be cruder, but it works so well so often that I have come to rely on it.

One summer, many years ago, my friend Doug Hofstadter was visiting me at my farm, and somebody asked him where I was. He gestured out to the big hayfield behind the house, which I was harrowing for a reseeding. ‘He’s out there on his tractor, doing his tillosophy,’ Doug said. Ever since then, tillosophy has been my term for this process. Try it; if it doesn’t work, at least you’ll end up with a painted room, a mowed lawn, a clean basement.

But as a tireless proponent of combinatorial creativity, my favorite comes from the inimitable Jessica Hagy of indexed fame, who pretty much articulates the Brain Pickings founding philosophy:

How can you defeat the snarling goblins of creative block? With books, of course. Just grab one. It doesn’t matter what sort: science fiction, science fact, pornography (soft, hard, or merely squishy), comic books, textbooks, diaries (of people known or unknown), novels, telephone directories, religious texts — anything and everything will work.

Now, open it to a random page. Stare at a random sentence.

[…]

Every book holds the seed of a thousand stories. Every sentence can trigger an avalanche of ideas. Mix ideas across books: one thought from Aesop and one line from Chomsky, or a fragment from the IKEA catalog melded with a scrap of dialog from Kerouac.

By forcing your mind to connect disparate bits of information, you’ll jump-start your thinking, and you’ll fill in blank after blank with thought after thought. The goblins of creative block have stopped snarling and have been shooed away, you’re dashing down thoughts, and your synapses are clanging away in a symphonic burst of ideas. And if you’re not, whip open another book. Pluck out another sentence. And ponder mash-ups of out-of-context ideas until your mind wanders and you end up in a new place, a place that no one else ever visited.

Marvelous.

Originally featured, with more excerpts, in October.

WHY DOES THE WORLD EXIST?

“What is it that breathes fire into the equations and makes a universe for them to describe?” wondered Stephen Hawking in A Brief History of Time. “Why does the universe go through all the bother of existing?”

This inquiry has long occupied scientists, philosophers, and deep thinkers alike, culminating in the most fundamental question of why there is something rather than nothing. That, in fact, is the epicenter of intellectual restlessness that Jim Holt sets out to resolve in Why Does the World Exist?: An Existential Detective Story (public library). Seeking to tease apart the most central existential question of all — why there is a world, rather than nothingness, a question he says is “so profound that it would occur only to a metaphysician, yet so simple it would occur only to a child” — Holt pores through millennia of science and theology, theory by theory, to question our most basic assumptions about the world, reality, and the nature of fact itself, with equal parts intelligence, irreverence, and insight.

Reflecting on his many conversations with philosophers, theologians, particle physicists, cosmologists, mystics, and writers, Holt puts things in perspective:

When you listen to such thinkers feel their way around the question of why there is a world at all, you begin to realize that your own thoughts on the matter are not quite so nugatory as you had imagined. No one can confidently claim intellectual superiority in the face of the mystery of existence. For, as William James observed, ‘All of us are beggars here.’

And while the book is remarkable in its entirety — take a closer look with Kathryn Schulz’s exquisite review for New York Magazine — one of Holt’s most fascinating conversations is with someone one wouldn’t immediately peg as an expert on cosmogony: novelist John Updike, who seems to share in Isaac Asimov’s famous contention that “the most beautiful experience we can have is the mysterious.” Holt writes:

[T]he laws amount to a funny way of saying, ‘Nothing equals something,'” Updike said, bursting into laughter. “QED! One opinion I’ve encountered is that, since getting from nothing to something involves time, and time didn’t exist before there was something, the whole question is a meaningless one that we should stop asking ourselves. It’s beyond our intellectual limits as a species. Put yourself into the position of a dog. A dog is responsive, shows intuition, looks at us with eyes behind which there is intelligence of a sort, and yet a dog must not understand most of the things it sees people doing. It must have no idea how they invented, say, the internal-combustion engine. So maybe what we need to do is imagine that we’re dogs and that there are realms that go beyond our understanding. I’m not sure I buy that view, but it is a way of saying that the mystery of being is a permanent mystery, at least given the present state of the human brain. I have trouble even believing—and this will offend you—the standard scientific explanation of how the universe rapidly grew from nearly nothing. Just think of it. The notion that this planet and all the stars we see, and many thousands of times more than those we see — that all this was once bounded in a point with the size of, what, a period or a grape? How, I ask myself, could that possibly be? And, that said, I sort of move on.

Taking a jab at the “beautiful mathematics” of string theory, Updike echoes the landmark conversation between Einstein and Indian philosopher Tagore, exclaiming:

Beautiful in a vacuum! What’s beauty if it’s not, in the end, true? Beauty is truth, and truth is beauty.

Holt invites Updike to reconcile the “brute fact theory” of science and the “God theory” of religion:

He was silent again for a moment, then continued. “Some scientists who are believers, like Freeman Dyson, have actually tackled the ultimate end of the universe. They’ve tried to describe a universe where entropy is almost total and individual particles are separated by distances that are greater than the dimensions of the present observable universe … an unthinkably dreary and pointless vacuum. I admire their scientific imagination, but I just can’t make myself go there. And a space like that is the space in which God existed and nothing else. Could God then have suffered boredom to the point that he made the universe? That makes reality seem almost a piece of light verse.”

What a lovely conceit! Reality is not a “blot on nothingness,” as Updike’s character Henry Bech had once, in a bilious moment, decided. It is a piece of light verse.

Originally featured in July.

THE POWER OF HABIT

“We are spinning our own fates, good or evil, and never to be undone. Every smallest stroke of virtue or of vice leaves its never so little scar,” William James famously wrote on habit. Indeed, try as we might to reverse-engineer willpower and flowchart our way to happiness, in the end it is habit that lies at the heart of our successes and our failures. So argues New York Times reporter Charles Duhigg in The Power of Habit: Why We Do What We Do in Life and Business (public library), proposing that the root of adhering to our highest ideals — exercising regularly, becoming more productive, sleeping better, reading more, cultivating the discipline necessary for building successful ventures — is in understanding the science and psychology of how habits work.

Duhigg, whose chief premise echoes many of Timothy Wilson’s insights in Redirect: The Surprising New Science of Psychological Change, takes a deep dive into the bleeding edge of neuroscience and behavioral psychology to explore not only why habits exist in the first place, but also how they can be reprogrammed and optimized.

Duhigg first became fascinated by the power of habit eight years ago, while in Baghdad as a newspaper reporter. There, he met an army major who was conducting a curious experiment in the small town of Kufa: After analyzing taped footage of riots in the area, the major identified a common sequence — first a crowd of Iraqis would gather in the plaza, drawing in spectators and food vendors, then eventually someone would throw a rock and all hell would break loose.

So the major summoned Kufa’s mayor and made a strange request: Get the food vendors out of the plaza. The next time the sequence began to unfold and a crowd started to gather, something different transpired — the crowd snowballed and people started chanting angry slogans, but by dusk, people had gotten hungry and restless. They looked for the familiar kebobs, but they weren’t there. Eventually, the spectators left and the chanters lost steam. By 8PM, everyone was gone.

Upon asking the major how he figured out the clever strategy, Duhigg got the following response: “Understanding habits is the most important thing I’ve learned in the army.”

Originally featured in March.

THE (HONEST) TRUTH ABOUT DISHONESTY

Behavioral economist Dan Ariely belongs to the rare breed of scientists who are both actively engaged in empirical research, running all kinds of fascinating experiments in the lab, and keenly skilled in synthesizing those findings as equally fascinating insights into human nature, then communicating those articulately and engagingly to a non-scientist reader. Adding to his track record of doing precisely that is The (Honest) Truth About Dishonesty: How We Lie to Everyone — Especially Ourselves (public library), in which Ariely asks a seemingly simple question — “is dishonesty largely restricted to a few bad apples, or is it a more widespread problem?” — and goes on to reveal the surprising, illuminating, often unsettling truths that underpin the uncomfortable answer. Like cruelty, dishonesty turns out to be a remarkably prevalent phenomenon better explained by circumstances and cognitive processes than by concepts like character.

Ariely writes in the introduction:

In addition to exploring the forces that shape dishonesty, one of the main practical benefits of the behavioral economics approach is that it shows us the internal and environmental influences on our behavior. Once we more clearly understand the forces that really drive us, we discover that we are not helpless in the face of our human follies (dishonesty included), that we can restructure our environment, and that by doing so we can achieve better behaviors and outcomes.

Particularly interesting is a chapter on the relationship between creativity and dishonesty. The same habits of mind that allow us to create elaborate ideas turn out to also be responsible for enabling dishonesty and the subsequent rationalizations justifying our immoral behavior. That penchant for justification, in fact — which Ariely places at the “control tower of thinking, reasoning, and morality” — is a powerful driver of how we steer our decisions towards what we want and then reverse-engineer justifications towards what we believe is the right thing to do.

[S]ometimes (perhaps often) we don’t make choices based on our explicit preferences. Instead, we have a gut feeling about what we want, and we go through a process of mental gymnastics, applying all kinds of justifications to manipulate the criteria. That way, we can get what we really want, but at the same time keep up the appearance — to ourselves and to others — that we are acting in accordance with our rational and well-reasoned preferences.

Here’s where it gets interesting:

[T]he difference between creative and less creative individuals comes into play mostly when there is ambiguity in the situation at hand and, with it, more room for justification… Put simply, the link between creativity and dishonesty seems related to the ability to tell ourselves stories about how we are doing the right thing, even when we are not. The more creative we are, the more we are able to come up with good stories that help us justify our selfish interests.

But could it be, Ariely wondered, that greater intelligence was responsible for better stories? One experiment measured the brain structure of pathological liars and compared it to that of normal controls — more specifically, the ratio of gray matter (the neural tissue that makes up the bulk of our brains) to white matter (the wiring that connects those brain cells). Liars, it turned out, had 14% less gray matter than the controls but had 22-26% more white matter in the prefrontal cortex, suggesting that they were more likely to make connections between different memories and ideas, as increased connectivity means greater access to the reserve of associations and memories stored in gray matter. “Intelligence,” it turned out, wasn’t correlated with dishonesty — but creativity, which we already know is all about connecting things, was.

In another experiment, Ariely tested how “moral flexibility” was related to the level of creativity required in different jobs by visiting an ad agency and studying the capacity for dishonesty in representatives of its various departments:

[T]he level of moral flexibility was highly related to the level of creativity required in their department and by their job. Designers and copy-writers were at the top of the moral flexibility scale, and the accountants ranked at the bottom. It seems that when ‘creativity’ is in our job description, we are more likely to say ‘Go for it’ when it comes to dishonest behavior.

Ultimately, Ariely explains the osmotic balance between creativity and dishonesty through our capacity for storytelling:

Just as creativity enables us to envision novel solutions to tough problems, it can also enable us to develop original paths around rules, all the while allowing us to reinterpret information in a self-serving way… [C]reativity can help us tell better stories — stories that allow us to be even more dishonest but still think of ourselves as wonderfully honest people.

Originally featured at length in May.

THE STORYTELLING ANIMAL

“The universe is made of stories, not atoms,” poet Muriel Rukeyser memorably asserted, and Harvard sociobiologist E. O. Wilson recently pointed to the similarity between innovators in art and science, both of whom he called “dreamers and storytellers.” Stories aren’t merely essential to how we understand the world — they are how we understand the world. We weave and seek stories everywhere, from data visualization to children’s illustration to cultural hegemony. In The Storytelling Animal (public library), educator and science writer Jonathan Gottschall traces the roots, both evolutionary and sociocultural, of the transfixing grip storytelling has on our hearts and minds, individually and collectively. What emerges is a kind of “unified theory of storytelling,” revealing not only our gift for manufacturing truthiness in the narratives we tell ourselves and others, but also the remarkable capacity of stories — the right kinds of them — to change our shared experience for the better.

Gottschall articulates a familiar mesmerism:

Human minds yield helplessly to the suction of story. No matter how hard we concentrate, no matter how deep we dig in our heels, we just can’t resist the gravity of alternate worlds.

Joining these favorite book trailers is a wonderful short black-and-white teaser animation:

One particularly important aspect of storytelling Gottschall touches on is the osmotic balance between the writer’s intention and the reader’s interpretation, something Mortimer Adler argued for decades ago in his eloquent case for marginalia. Gottschall writes:

The writer is not … an all-powerful architect of our reading experience. The writer guides the way we imagine but does not determine it. A film begins with a writer producing a screenplay. But it is the director who brings the screenplay to life, filling in most of the details. So it is with any story. A writer lays down words, but they are inert. They need a catalyst to come to life. The catalyst is the reader’s imagination.

In discussing the extent to which we live in stories, Gottschall puts in concrete terms something most of us suspect — fear, perhaps — on an abstract, intuitive level: the astounding amount of time we spend daydreaming.

Clever scientific studies involving beepers and diaries suggest that an average daydream is about fourteen seconds long and that we have about two thousand of them per day. In other words, we spend about half of our waking hours — one-third of our lives on earth — spinning fantasies. We daydream about the past: things we should have said or done, working through our victories and failures. We daydream about mundane stuff such as imagining different ways of handling conflict at work. But we also daydream in a much more intense, storylike way. We screen films with happy endings in our minds, where all our wishes — vain, aggressive, dirty — come true. And we screen little horror films, too, in which our worst fears are realized.

From War and Peace to pro wrestling, from REM sleep to the “fictional screen media” of commercials, from our small serialized personal stories on Facebook and Twitter to the large cultural stories of religious traditions, The Storytelling Animal dives into what science knows — and what it’s still trying to find out — about our propensity for storytelling to reveal not only the science of story but also its seemingly mystical yet palpably present power.

Originally featured in May.

QUIET

Do you feel a pang of guilt when you decline a dinner party invitation in favor of a good book and a cup of tea? Or, worse yet, do you reluctantly accept the invitation even though you’d much rather curl up with the book? You are not alone. In Quiet: The Power of Introverts in a World That Can’t Stop Talking, Susan Cain dissects the anatomy of this socially-induced guilt and delves deep into one of psychology’s most enduring tenets — that the single most important defining aspect of personality is where we fall on the introvert-extrovert spectrum — to break through the “long and storied tradition” of neatly mapping this binary division onto others, like submission and leadership, loneliness and happiness, settling and success.

Cain exposes the much more complicated interplay between these character traits and society’s metrics for fulfillment, exploring how “closeted introverts” — a self-reported one third to one half of people, including cultural icons and legendary entrepreneurs like Gandhi, Abraham Lincoln, Steve Jobs, Bill Gates, and Craig Newmark — are expending enormous energy on trying to pass as extroverts in a culture that rewards extroversion and conflates it with boldness, happiness, sociability, and success.

Introversion — along with its cousins sensitivity, seriousness, and shyness — is now a second-class personality trait, somewhere between a disappointment and a pathology. Introverts living under the Extrovert Ideal are like women living in a man’s world, discounted because of a trait that goes to the core of who they are. Extroversion is an enormously appealing personality trait, but we’ve turned it into an oppressive standard to which most of us feel we must conform.

Ultimately, Cain teases apart not only how and why we internalize society’s extroversion bias very early on, but also how we can reconnect with the valuable qualities implicit to introversion and rethink our hard-wired strengths in a culture that categorizes them as weaknesses.

I had always imagined Rosa Parks as a stately woman with a bold temperament. But when she died in 2005 at the age of ninety-two, the flood of obituaries recalled her as soft-spoken, sweet, and small in stature. They said she was ‘timid and shy’ but had ‘the courage of a lion.’ They were full of phrases like ‘radical humility’ and ‘quiet fortitude.’ What does it mean to be quiet and have fortitude? these descriptions asked implicitly. How could you be shy and courageous?

Originally featured in February.

Honorable mention: Complement Quiet with Frank Partnoy’s Wait: The Art and Science of Delay.

BONUS: MORTALITY

“One should try to write as if posthumously,” Christopher Hitchens famously opined in a New York Public Library talk three days before his fatal cancer diagnosis. “Distrust compassion; prefer dignity for yourself and others,” he advised young contrarians years earlier. How striking, then, becomes the clash between his uncompromising ethos and the equally uncompromising realities of death, recorded in Mortality (public library), his last published work, out this week — a gripping and lucid meditation on death as it was unfolding during Hitch’s last months of life. But what makes the book truly extraordinary is his profound oscillation between his characteristic, proud, almost stubborn self-awareness — that ability to look on with the eye of the critic rather than the experiencing self — and a vulnerability that is so clearly foreign to him, yet so breathlessly inevitable in dying. The ideological rigor with which he approaches his own finality, teasing apart religion and politics and other collective and thus impersonal facets of culture, cracks here and there, subtly at first, letting the discomfort of his brush with the unknown peek through, then gapes wide open to reveal the sheer human terror of ceasing to exist.

To the dumb question ‘Why me?’ the cosmos barely bothers to return the reply: Why not?

Read closer with the original article from September.


19 NOVEMBER, 2012

The Best Science Books of 2012

By:

From cosmology to cosmic love, or what your biological clock has to do with diagramming evolution.

It’s that time of year again, the time for those highly subjective, grossly non-exhaustive, yet inevitable and invariably fun best-of reading lists. To kick off the season, here are, in no particular order, my ten favorite science books of 2012. (Catch up on last year’s reading list here.)

INTERNAL TIME

“Six hours’ sleep for a man, seven for a woman, and eight for a fool,” Napoleon famously prescribed. (He would have scoffed at Einstein, then, who was known to require ten hours of sleep for optimal performance.) This perceived superiority of those who can get by on less sleep isn’t just something Napoleon shared with dictators like Hitler and Stalin, it’s an enduring attitude woven into our social norms and expectations, from proverbs about early birds to the basic scheduling structure of education and the workplace. But in Internal Time: Chronotypes, Social Jet Lag, and Why You’re So Tired (public library), a fine addition to these 7 essential books on time, German chronobiologist Till Roenneberg demonstrates through a wealth of research that our sleep patterns have little to do with laziness and other such scorned character flaws, and everything to do with biology.

In fact, each of us possesses a different chronotype — an internal timing type best defined by your midpoint of sleep, or midsleep, which you can calculate by dividing your average sleep duration by two and adding the resulting number to your average bedtime on free days, meaning days when your sleep and waking times are not dictated by the demands of your work or school schedule. For instance, if you go to bed at 11 P.M. and wake up at 7 A.M., add four hours to 11 P.M. and you get 3 A.M. as your midsleep.
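
For the computationally inclined, here is a minimal sketch of that midsleep arithmetic; the helper function and its name are illustrative assumptions rather than anything from the book:

from datetime import datetime, timedelta

def midsleep(bedtime: str, waketime: str) -> str:
    """Midpoint of sleep: bedtime plus half the sleep duration."""
    fmt = "%H:%M"
    start = datetime.strptime(bedtime, fmt)
    end = datetime.strptime(waketime, fmt)
    if end <= start:                   # the sleep episode crosses midnight
        end += timedelta(days=1)
    midpoint = start + (end - start) / 2
    return midpoint.strftime("%H:%M")

# The example from the text: bed at 11 P.M., up at 7 A.M. on a free day
print(midsleep("23:00", "07:00"))      # prints 03:00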

The distribution of midsleep in Central Europe. The midsleep times (on free days) of over 60 percent of the population fall between 3:30 and 5:30 A.M.

Roenneberg traces the evolutionary roots of different sleep cycles and argues that while earlier chronotypes might have had a social advantage in agrarian and industrial societies, today’s world of time-shift work and constant connectivity has invalidated such advantages but left behind the social stigma around later chronotypes.

This myth that early risers are good people and that late risers are lazy has its reasons and merits in rural societies but becomes questionable in a modern 24/7 society. The old moral is so prevalent, however, that it still dominates our beliefs, even in modern times. The postman doesn’t think for a second that the young man might have worked until the early morning hours because he is a night-shift worker or for other reasons. He labels healthy young people who sleep into the day as lazy — as long sleepers. This attitude is reflected in the frequent use of the word-pair early birds and long sleepers [in the media]. Yet this pair is nothing but apples and oranges, because the opposite of early is late and the opposite of long is short.

Roenneberg goes on to explore sleep duration, a measure of sleep types that complements midsleep, demonstrating just as wide a spectrum of short and long sleepers and debunking the notion that people who get up late sleep longer than others — this judgment, after all, is based on the assumption that everyone goes to bed at the same time, which we increasingly do not.

Sleep duration shows a bell-shaped distribution within a population, but there are more short sleepers (on the left) than long sleepers (on the right).

The disconnect between our internal, biological time and social time — defined by our work schedules and social engagements — leads to what Roenneberg calls social jet lag, a kind of chronic exhaustion resembling the symptoms of jet lag and comparable to having to work for a company a few time zones to the east of your home.

Unlike what happens in real jet lag, people who suffer from social jet lag never leave their home base and can therefore never adjust to a new light-dark environment … While real jet lag is acute and transient, social jet lag is chronic. The amount of social jet lag that an individual is exposed to can be quantified as the difference between midsleep on free days and midsleep on work days … Over 40 percent of the Central European population suffers from social jet lag of two hours or more, and the internal time of over 15 percent is three hours or more out of synch with external time. There is no reason to assume that this would be different in other industrialized nations.
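
Roenneberg's measure is simply the difference between the two midpoints of sleep. A rough, self-contained illustration in the same spirit (the schedules below are invented for the example, not data from the book):

from datetime import datetime, timedelta

def midsleep_hours(bedtime: str, waketime: str) -> float:
    """Midpoint of sleep, expressed as hours after midnight (3.5 means 3:30 A.M.)."""
    fmt = "%H:%M"
    start = datetime.strptime(bedtime, fmt)
    end = datetime.strptime(waketime, fmt)
    if end <= start:                          # sleep crosses midnight
        end += timedelta(days=1)
    mid = start + (end - start) / 2
    return mid.hour + mid.minute / 60

# Hypothetical schedules: a later rhythm on free days, an alarm clock on work days
mid_free = midsleep_hours("01:00", "09:00")   # 5.0, i.e. 5:00 A.M.
mid_work = midsleep_hours("23:30", "06:30")   # 3.0, i.e. 3:00 A.M.

print(f"Social jet lag: {mid_free - mid_work:.1f} hours")   # prints 2.0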

The scissors of sleep. Depending on chronotype, sleep duration can be very different between work days and free days.

Chronotypes vary with age:

Young children are relatively early chronotypes (to the distress of many young parents), and then gradually become later. During puberty and adolescence humans become true night owls, and then around twenty years of age reach a turning point and become earlier again for the rest of their lives. On average, women reach this turning point at nineteen and a half while men start to become earlier again at twenty-one … [T]his clear turning point in the developmental changes of chronotype … [is] the first biological marker for the end of adolescence.

Roenneberg points out that in our culture, there is a great disconnect between teenagers’ biological abilities and our social expectations of them, encapsulated in what is known as the disco hypothesis — the notion that if only teens would go to bed earlier, meaning not party until late, they’d be better able to wake up clear-headed and ready for school at the expected time. The data, however, indicate otherwise — adolescents’ internal time is shifted so they don’t find sleep before the small hours of the night, a pattern also found in the life cycle of rodents.

Here, we brush up against a painfully obtrusive cultural obstacle: School starts early — as early as 7 A.M. in some European countries — and teens are expected to perform well on a schedule not designed with their internal time in mind. As a result, studies have shown that many students show the signs of narcolepsy — a severe sleeping disorder that makes one fall asleep at once when given the chance, immediately entering REM sleep. The implications are worrisome:

Teenagers need around eight to ten hours of sleep but get much less during their workweek. A recent study found that when the starting time of high school is delayed by an hour, the percentage of students who get at least eight hours of sleep per night jumps from 35.7 percent to 50 percent. Adolescent students’ attendance rate, their performance, their motivation, even their eating habits all improve significantly if school times are delayed.

Similar detrimental effects of social jet lag are found in shift work, which Roenneberg calls “one of the most blatant assaults on the body clock in modern society.” (And while we may be tempted to equate shift work with the service industry, any journalist, designer, developer, or artist who works well into the night on deadline can relate — hey, it’s well past midnight again as I’m writing this.) In fact, the World Health Organization recently classified “shift work that involves circadian disruption” as a potential cause of cancer, and the consequences of social jet lag and near-narcolepsy extend beyond the usual suspects of car accidents and medical errors:

We are only beginning to understand the potentially detrimental consequences of social jet lag. One of these has already been worked out with frightening certainty: the more severe the social jet lag that people suffer, the more likely it is that they are smokers. This is not a question of quantity (number of cigarettes per day) but simply whether they are smokers or not … Statistically, we experience the worst social jet lag as teenagers, when our body clocks are drastically delayed for biological reasons, but we still have to get up at the same traditional times for school. This coincides with the age when most individuals start smoking. Assuredly there are many different reasons people start smoking at that age, but social jet lag certainly contributes to the risk.

If young people’s psychological and emotional well-being isn’t incentive enough for policy makers — who, by the way, Roenneberg’s research indicates tend to be early chronotypes themselves — to consider later school times, one would think their health should be.

The correlation between social jet lag and smoking continues later in life as well, particularly when it comes to quitting:

[T]he less stress smokers have, the easier it is for them to quit. Social jet lag is stress, so the chances of successfully quitting smoking are higher when the mismatch of internal and external time is smaller. The numbers connecting smoking with social jet lag are striking: Among those who suffer less than an hour of social jet lag per day, we find 15 to 20 percent are smokers. This percentage systematically rises to over 60 percent when internal and external time are more than five hours out of sync.

Another factor contributing to our social jet lag is Daylight Saving Time. Even though DST’s proponents argue that it’s just one small hour, the data suggest that between October and March, DST throws off our body clocks by up to four weeks, depending on our latitude, not allowing our bodies to properly adjust to the time change, especially if we happen to be later chronotypes. The result is increased social jet lag and decreased sleep duration.

But what actually regulates our internal time? Though the temporal structures of sun time — tide, day, month, and year — play a significant role in the lives of all organisms, our biological clocks keep running even in a “time-free” environment, somewhat independent of such external stimuli as light and dark. For instance, early botanical studies showed that a mimosa plant kept in a pitch-dark closet would still open and close its leaves the way it does in the day-night cycle, and subsequent studies of human subjects confined to dark bunkers showed similar preservation of their sleep and waking patterns, which followed, albeit imperfectly, the 24-hour cycle of day and night.

Our internal clocks, in fact, can be traced down to the genetic level, with individual “clock genes” and, most prominently, the suprachiasmatic nucleus, or SCN — a small region in the brain’s midline that acts as a kind of “master clock” for mammals, regulating neuronal and hormonal activity around our circadian rhythms. Roenneberg explains how our internal clocks work on the DNA level:

In the nucleus, the DNA sequence of a clock gene is transcribed to mRNA; the resulting message is exported from the nucleus, translated into a clock protein, and is then modified. This clock protein is itself part of the molecular machinery that controls the transcription of its ‘own’ gene. When enough clock proteins have been made, they are imported back into the nucleus, where they start to inhibit the transcription of their own mRNA. Once this inhibition is strong enough, no more mRNA molecules are transcribed, and the existing ones are gradually destroyed. As a consequence, no more proteins can be produced and the existing ones will also gradually be destroyed. When they are all gone, the transcriptional machinery is not suppressed anymore, and a new cycle can begin.

[…]

Despite this complexity, the important take-home message is that daily rhythms are generated by molecular mechanisms that could potentially work in a single cell, for example a single neuron of the SCN.
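
The logic of that loop — a protein that, after a delay, represses the transcription of its own gene — is enough on its own to produce a rhythm. The toy simulation below is only a sketch of that idea; the equations and every parameter are illustrative assumptions, not Roenneberg's model or measured kinetics, but with steep enough repression and a long enough delay the two variables settle into self-sustained oscillations with a period on the order of a day:

# Toy transcription-translation feedback loop: a clock protein represses
# transcription of its own mRNA after a fixed delay. All numbers are made up.
dt = 0.1                          # hours per simulation step
delay_hours = 6.0                 # assumed lag before the protein re-enters the nucleus
delay_steps = int(delay_hours / dt)

mrna, protein = 1.0, 0.0
history = [0.0] * delay_steps     # ring buffer of past protein levels

for step in range(int(96 / dt)):  # simulate four days
    delayed_protein = history[step % delay_steps]        # protein level delay_hours ago

    transcription = 1.0 / (1.0 + delayed_protein ** 4)   # repressed by its own product
    mrna += (transcription - 0.2 * mrna) * dt             # synthesis minus degradation
    protein += (0.5 * mrna - 0.2 * protein) * dt          # translation minus degradation

    history[step % delay_steps] = protein                 # store for future delayed reads

    if step % int(4 / dt) == 0:                           # report every four simulated hours
        print(f"t = {step * dt:5.1f} h   mRNA = {mrna:5.2f}   protein = {protein:5.2f}")

Watching the printed values rise and fall makes the passage's take-home point concrete: nothing outside the loop is keeping time; the rhythm is generated by the feedback itself.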

Internal Time goes on to illuminate many other aspects of how chronotypes and social jet lag impact our daily lives, from birth and suicide rates to when we borrow books from the library to why older men marry younger women, and even why innovators and entrepreneurs tend to have later chronotypes.

Originally featured at length in May.

THE WHERE, THE WHY, AND THE HOW

At the intersection of art and science, The Where, the Why, and the How: 75 Artists Illustrate Wondrous Mysteries of Science (public library) invites some of today’s most celebrated artists to create scientific illustrations and charts to accompany short essays about the most fascinating unanswered questions on the minds of contemporary scientists across biology, astrophysics, chemistry, quantum mechanics, anthropology, and more. The questions cover such mind-bending subjects as whether there are more than three dimensions, why we sleep and dream, what causes depression, how long trees live, and why humans are capable of language.

The images, which come from a mix of well-known titans and promising up-and-comers, including favorites like Lisa Congdon, Gemma Correll, and Jon Klassen, borrow inspiration from antique medical illustrations, vintage science diagrams, and other historical ephemera from periods of explosive scientific curiosity.

Above all, the project is a testament to the idea that ignorance is what drives discovery and wonder is what propels science — a reminder to, as Rilke put it, live the questions and delight in reflecting on the mysteries themselves. The book’s trio of editors urges in the introduction:

With this book, we wanted to bring back a sense of the unknown that has been lost in the age of information. … Remember that before you do a quick online search for the purpose of the horned owl’s horns, you should give yourself some time to wonder.

The motion graphics book trailer is an absolute masterpiece itself:

Pondering the age-old question of why the universe exists, Brian Yanny asks:

Was there an era before our own, out of which our current universe was born? Do the laws of physics, the dimensions of space-time, the strengths and types and asymmetries of nature’s forces and particles, and the potential for life have to be as we observe them, or is there a branching multi-verse of earlier and later epochs filled with unimaginably exotic realms? We do not know.

What existed before the big bang?

Illustrated by Josh Cochran

Exploring how gravity works, Terry Matilsky notes:

[T]he story is not finished. We know that general relativity is not the final answer, because we have not been able to synthesize gravity with the other known laws of physics in a comprehensive “theory of everything.”

How does gravity work?

Illustrated by The Heads of State

Zooming in on the microcosm of our own bodies and their curious behaviors, Jill Conte considers why we blush:

The ruddy or darkened hue of a blush occurs when muscles in the walls of blood vessels within the skin relax and allow more blood to flow. Interestingly, the skin of the blush region contains more blood vessels than do other parts of the body. These vessels are also larger and closer to the surface, which indicates a possible relationship among physiology, emotion, and social communication. While it is known that blood flow to the skin, which serves to feed cells and regulate surface body temperature, is controlled by the sympathetic nervous system, the exact mechanism by which this process is activated specifically to produce a blush remains unknown.

What is dark matter?

Illustrated by Betsy Walton

Equal parts delightful and illuminating, The Where, the Why, and the How is the kind of treat bound to tickle your brain from both sides.

Originally featured in October.

IN PURSUIT OF THE UNKNOWN

When legendary theoretical physicist Stephen Hawking was setting out to release A Brief History of Time, one of the most influential science books in modern history, his publishers admonished him that every equation included would halve the book’s sales. Undeterred, he dared include E = mc², even though cutting it out would have allegedly sold another 10 million copies. The anecdote captures the extent of our culture’s distaste for, if not fear of, equations. And yet, argues mathematician Ian Stewart in In Pursuit of the Unknown: 17 Equations That Changed the World, equations have held remarkable power in facilitating humanity’s progress and, as such, call for rudimentary understanding as a form of our most basic literacy.

Stewart writes:

The power of equations lies in the philosophically difficult correspondence between mathematics, a collective creation of human minds, and an external physical reality. Equations model deep patterns in the outside world. By learning to value equations, and to read the stories they tell, we can uncover vital features of the world around us… This is the story of the ascent of humanity, told in 17 equations.

From how the Pythagorean theorem, which linked geometry and algebra, laid the groundwork of the best current theories of space, time, and gravity to how the Navier-Stokes equation applies to modeling climate change, Stewart delivers a scientist’s gift in a storyteller’s package to reveal how these seemingly esoteric equations are really the foundation for nearly everything we know and use today.

Some of the most revolutionary of the breakthroughs Stewart outlines came from thinkers actively interested in both the sciences and the humanities. Take René Descartes, for instance, who is best remembered for his timeless soundbite, Cogito ergo sum, “I think, therefore I am.” But Descartes’ interests, Stewart points out, extended beyond philosophy and into science and mathematics. In 1639, he observed a curious numerical pattern in regular solids — what was true of a cube was also true of a dodecahedron or an icosahedron, for all of which subtracting from the number of faces the number of edges and then adding the number of vertices equaled 2. (Try it: A cube has 6 faces, 12 edges, and 8 vertices, so 6 – 12 + 8 = 2.) But Descartes, perhaps enchanted by philosophy’s grander questions, saw the equation as a minor curiosity and never published it. Only centuries later did mathematicians recognize it as monumentally important. It eventually resulted in Euler’s formula, which helps explain everything from how enzymes act on cellular DNA to why the motion of the celestial bodies can be chaotic.
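
The pattern is easy to check for yourself across all five regular solids; the face, edge, and vertex counts below are the standard ones, and the snippet is merely an illustration:

# Faces, edges, and vertices of the five Platonic solids
solids = {
    "tetrahedron":  (4, 6, 4),
    "cube":         (6, 12, 8),
    "octahedron":   (8, 12, 6),
    "dodecahedron": (12, 30, 20),
    "icosahedron":  (20, 30, 12),
}

for name, (faces, edges, vertices) in solids.items():
    print(f"{name:>12}: {faces} - {edges} + {vertices} = {faces - edges + vertices}")

# Every line ends in 2 — the invariant Descartes glimpsed and Euler later formalized.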

So how did equations begin, anyway? Stewart explains:

An equation derives its power from a simple source. It tells us that two calculations, which appear different, have the same answer. The key symbol is the equals sign, =. The origins of most mathematical symbols are either lost in the mists of antiquity, or are so recent that there is no doubt where they came from. The equals sign is unusual because it dates back more than 450 years, yet we not only know who invented it, we even know why. The inventor was Robert Recorde, in 1557, in The Whetstone of Witte. He used two parallel lines (he used an obsolete word gemowe, meaning ‘twin’) to avoid tedious repetition of the words ‘is equal to’. He chose that symbol because ‘no two things can be more equal’. Recorde chose well. His symbol has remained in use for 450 years.

The original coinage appeared as follows:

To avoide the deiouse repetition of these woordes: is equalle to: I will sette as I doe often in woorke use, a paire of paralleles, or gemowe lines of one lengthe: =, bicause noe .2. thynges, can be moare equalle.

Far from being a mere math primer or trivia aid, In Pursuit of the Unknown is an essential piece of modern literacy, wrapped in an articulate argument for why this kind of knowledge should be precisely that.

Stewart concludes by turning his gaze towards the future, offering a kind of counter-vision to algo-utopians like Stephen Wolfram and making, instead, a case for the reliable humanity of the equation:

It is still entirely credible that we might soon find new laws of nature based on discrete, digital structures and systems. The future may consist of algorithms, not equations. But until that day dawns, if ever, our greatest insights into nature’s laws take the form of equations, and we should learn to understand them and appreciate them. Equations have a track record. They really have changed the world — and they will change it again.

Originally featured in full in April.

IGNORANCE

“Science is always wrong,” George Bernard Shaw famously proclaimed in a toast to Albert Einstein. “It never solves a problem without creating 10 more.”

In the fifth century BC, long before science as we know it existed, Socrates famously observed, “I know one thing, that I know nothing.” Some 21 centuries later, by the time he published his Principia in 1687, Sir Isaac Newton likely knew all there was to know in science at the time — a time when it was possible for a single human brain to hold all of mankind’s scientific knowledge. Fast-forward to today, and the average high school student has more scientific knowledge than Newton did at the end of his life. But somewhere along that superhighway of progress, we seem to have developed a kind of fact-fetishism that shackles us to the allure of the known and makes us indifferent to the unknown knowable. Yet it’s the latter — the unanswered questions — that makes science, and life, interesting. That’s the eloquently argued case at the heart of Ignorance: How It Drives Science, in which Stuart Firestein sets out to debunk the popular idea that knowledge follows ignorance, demonstrating instead that it’s the other way around and, in the process, laying out a powerful manifesto for getting the public engaged with science — a public to whom, as Neil deGrasse Tyson recently reminded the Senate, the government is accountable in making the very decisions that shape the course of science.

The tools and currencies of our information economy, Firestein points out, are doing little in the way of fostering the kind of question-literacy essential to cultivating curiosity:

Are we too enthralled with the answers these days? Are we afraid of questions, especially those that linger too long? We seem to have come to a phase in civilization marked by a voracious appetite for knowledge, in which the growth of information is exponential and, perhaps more important, its availability easier and faster than ever.

(For a promise of a solution, see Clay Johnson’s excellent The Information Diet.)

The cult of expertise — whose currency is static answers — obscures the very capacity for cultivating a thirst for ignorance:

There are a lot of facts to be known in order to be a professional anything — lawyer, doctor, engineer, accountant, teacher. But with science there is one important difference. The facts serve mainly to access the ignorance… Scientists don’t concentrate on what they know, which is considerable but minuscule, but rather on what they don’t know…. Science traffics in ignorance, cultivates it, and is driven by it. Mucking about in the unknown is an adventure; doing it for a living is something most scientists consider a privilege.

[…]

Working scientists don’t get bogged down in the factual swamp because they don’t care all that much for facts. It’s not that they discount or ignore them, but rather that they don’t see them as an end in themselves. They don’t stop at the facts; they begin there, right beyond the facts, where the facts run out. Facts are selected, by a process that is a kind of controlled neglect, for the questions they create, for the ignorance they point to.

Firestein, who chairs the Department of Biological Sciences at Columbia University, stresses that beyond simply accumulating facts, scientists use them as raw material, not finished product. He cautions:

Mistaking the raw material for the product is a subtle error but one that can have surprisingly far-reaching consequences. Understanding this error and its ramifications, and setting it straight, is crucial to understanding science.

What emerges is an elegant definition of science:

Real science is a revision in progress, always. It proceeds in fits and starts of ignorance.

(What is true of science is actually also true of all creativity: As Jonah Lehrer puts it “The only way to be creative over time — to not be undone by our expertise — is to experiment with ignorance, to stare at things we don’t fully understand.” Einstein knew that, too, when he noted that without a preoccupation with “the eternally unattainable in the field of art and scientific research, life would have seemed… empty.” And Kathryn Schulz touched on it with her meditation on pessimistic meta-induction.)

In highlighting this commonality science holds with other domains of creative and intellectual labor, Firestein turns to the poet John Keats, who described the ideal state of the literary psyche as Negative Capability — “that is when a man is capable of being in uncertainties, Mysteries, doubts without any irritable reaching after fact & reason.” Firestein translates this to science:

Being a scientist requires having faith in uncertainty, finding pleasure in mystery, and learning to cultivate doubt. There is no surer way to screw up an experiment than to be certain of its outcome.

He captures the heart of this argument in an eloquent metaphor:

Science, then, is not like the onion in the often used analogy of stripping away layer after layer to get at some core, central, fundamental truth. Rather it’s like the magic well: no matter how many buckets of water you remove, there’s always another one to be had. Or even better, it’s like the widening ripples on the surface of a pond, the ever larger circumference in touch with more and more of what’s outside the circle, the unknown. This growing forefront is where science occurs… It is a mistake to bob around in the circle of facts instead of riding the wave to the great expanse lying outside the circle.

However, more important than the limits of our knowledge, Firestein is careful to point out, are the limits to our ignorance. (Cue Errol Morris’s fantastic 2010 five-part New York Times series, The Anosognosic’s Dilemma.) Science historian and Stanford professor Robert Proctor has even coined a term for the study of ignorance — agnotology — and, Firestein argues, it is a conduit to a better understanding of progress.

Science historian and philosopher Nicholas Rescher has offered a different term for a similar concept: Copernican cognitivism, suggesting that just as Copernicus showed us there was nothing privileged about our position in space by debunking the geocentric model of the universe, there is also nothing privileged about our cognitive landscape.

But the most memorable articulation of the limits of our own ignorance comes from the Victorian novella Flatland, where a three-dimensional sphere shows up in a two-dimensional land and inadvertently wreaks havoc on its geometric inhabitants’ most basic beliefs about the world as they struggle to imagine the very possibility of a third dimension.

An engagement with the interplay of ignorance and knowledge, the essential bargaining chips of science, is what elevated modern civilization from the intellectual flatness of the Middle Ages. Firestein points out that “the public’s direct experience of the empirical methods of science” helped humanity evolve from the magical and mystical thinking of Western medieval thought to the rational discourse of contemporary culture.

At the same time, Firestein laments, science today is often “as inaccessible to the public as if it were written in classical Latin.” Making it more accessible, he argues, necessitates introducing explanations of science that focus on the unknown as an entry point — a more inclusive gateway than the known.

In one of the most compelling passages of the book, he broadens this insistence on questions over answers to the scientific establishment itself:

Perhaps the most important application of ignorance is in the sphere of education, particularly of scientists… We must ask ourselves how we should educate scientists in the age of Google and whatever will supersede it… The business model of our Universities, in place now for nearly a thousand years, will need to be revised.

[…]

Instead of a system where the collection of facts is an end, where knowledge is equated with accumulation, where ignorance is rarely discussed, we will have to provide the Wiki-raised student with a taste of and for boundaries, the edge of the widening circle of ignorance, how the data, which are not unimportant, frames the unknown. We must teach students how to think in questions, how to manage ignorance. W. B. Yeats admonished that ‘education is not the filling of a pail, but the lighting of a fire.’

Firestein sums it up beautifully:

Science produces ignorance, and ignorance fuels science. We have a quality scale for ignorance. We judge the value of science by the ignorance it defines. Ignorance can be big or small, tractable or challenging. Ignorance can be thought about in detail. Success in science, either doing it or understanding it, depends on developing comfort with the ignorance, something akin to Keats’ negative capability.

Originally featured in April.

* See some thoughts on the difference between access and accessibility.

DREAMLAND

The Ancient Greeks believed that one fell asleep when the brain filled with blood and awakened once it drained back out. Nineteenth-century philosophers contended that sleep happened when the brain was emptied of ambitions and stimulating thoughts. “If sleep doesn’t serve an absolutely vital function, it is the greatest mistake evolution ever made,” biologist Allan Rechtschaffen once remarked. Even today, sleep remains one of the most poorly understood human biological functions, despite some recent strides in understanding the “social jetlag” of our internal clocks and the relationship between dreaming and depression. In Dreamland: Adventures in the Strange Science of Sleep (public library), journalist David K. Randall — who stumbled upon the idea after crashing violently into a wall while sleepwalking — explores “the largest overlooked part of your life and how it affects you even if you don’t have a sleep problem.” From gender differences to how come some people snore and others don’t to why we dream, he dives deep into this mysterious third of human existence to illuminate what happens when night falls and how it impacts every aspect of our days.

Most of us will spend a full third of our lives asleep, and yet we don’t have the faintest idea of what it does for our bodies and our brains. Research labs offer surprisingly few answers. Sleep is one of the dirty little secrets of science. My neurologist wasn’t kidding when he said there was a lot that we don’t know about sleep, starting with the most obvious question of all — why we, and every other animal, need to sleep in the first place.

But before we get too anthropocentrically arrogant in our assumptions, it turns out that how much sleep an animal needs isn’t correlated with how high up the evolutionary chain it sits:

Lions and gerbils sleep about thirteen hours a day. Tigers and squirrels nod off for about fifteen hours. At the other end of the spectrum, elephants typically sleep three and a half hours at a time, which seems lavish compared to the hour and a half of shut-eye that the average giraffe gets each night.

[…]

Humans need roughly one hour of sleep for every two hours they are awake, and the body innately knows when this ratio becomes out of whack. Each hour of missed sleep one night will result in deeper sleep the next, until the body’s sleep debt is wiped clean.
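
Randall’s ratio lends itself to a simple back-of-the-envelope tally. The short Python sketch below is my own illustration rather than anything from the book, and it simplifies his point: it counts only hours, whereas Randall notes that the debt is actually repaid through deeper sleep, and the example week is made up.

```python
# Illustrative tally of the "one hour of sleep per two hours awake" ratio Randall cites.
# Only the ratio comes from the book; the schedule below is invented.

SLEEP_PER_WAKING_HOUR = 1 / 2  # one hour of sleep needed for every two hours awake


def sleep_debt(days):
    """Accumulate sleep debt over a sequence of (hours_awake, hours_slept) days."""
    debt = 0.0
    for hours_awake, hours_slept in days:
        needed = hours_awake * SLEEP_PER_WAKING_HOUR
        debt += needed - hours_slept
        debt = max(debt, 0.0)  # once the debt is wiped clean, it doesn't go negative
    return debt


# A made-up week: 16 waking hours implies roughly 8 hours of needed sleep.
week = [(16, 8), (18, 5), (16, 8), (16, 9), (16, 8)]
print(f"Outstanding sleep debt: {sleep_debt(week):.1f} hours")  # -> 3.0 hours
```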

What, then, happens as we doze off, exactly? Like all science, our understanding of sleep seems to be a constant “revision in progress”:

Despite taking up so much of life, sleep is one of the youngest fields of science. Until the middle of the twentieth century, scientists thought that sleep was an unchanging condition during which time the brain was quiet. The discovery of rapid eye movements in the 1950s upended that. Researchers then realized that sleep is made up of five distinct stages that the body cycles through over roughly ninety-minute periods. The first is so light that if you wake up from it, you might not realize that you have been sleeping. The second is marked by the appearance of sleep-specific brain waves that last only a few seconds at a time. If you reach this point in the cycle, you will know you have been sleeping when you wake up. This stage marks the last drop before your brain takes a long ride away from consciousness. Stages three and four are considered deep sleep. In three, the brain sends out long, rhythmic bursts called delta waves. Stage four is known as slow-wave sleep for the speed of its accompanying brain waves. The deepest form of sleep, this is the farthest that your brain travels from conscious thought. If you are woken up while in stage four, you will be disoriented, unable to answer basic questions, and want nothing more than to go back to sleep, a condition that researchers call sleep drunkenness. The final stage is REM sleep, so named because of the rapid movements of your eyes dancing against your eyelids. In this stage of sleep, the brain is as active as it is when it is awake. This is when most dreams occur.

(Recall the role of REM sleep in regulating negative emotions.)

Randall’s most urgent point, however, echoes what we’ve already heard from German chronobiologist Till Roenneberg (see above) — in our blind lust for the “luxuries” of modern life, with all its 24-hour news cycles, artificial lighting on demand, and expectations of round-the-clock telecommunications availability, we’ve thrown ourselves into a kind of circadian schizophrenia:

We are living in an age when sleep is more comfortable than ever and yet more elusive. Even the worst dorm-room mattress in America is luxurious compared to sleeping arrangements that were common not long ago. During the Victorian era, for instance, laborers living in workhouses slept sitting on benches, with their arms dangling over a taut rope in front of them. They paid for this privilege, implying that it was better than the alternatives. Families up to the time of the Industrial Revolution engaged in the nightly ritual of checking for rats and mites burrowing in the one shared bedroom. Modernity brought about a drastic improvement in living standards, but with it came electric lights, television, and other kinds of entertainment that have thrown our sleep patterns into chaos.

Work has morphed into a twenty-four-hour fact of life, bringing its own set of standards and expectations when it comes to sleep … Sleep is ingrained in our cultural ethos as something that can be put off, dosed with coffee, or ignored. And yet maintaining a healthy sleep schedule is now thought of as one of the best forms of preventative medicine.

Reflecting on his findings, Randall marvels:

As I spent more time investigating the science of sleep, I began to understand that these strange hours of the night underpin nearly every moment of our lives.

Indeed, Dreamland goes on to explore how sleep — its mechanisms, its absence, its cultural norms — affects everyone from police officers and truck drivers to artists and entrepreneurs, permeating everything from our decision-making to our emotional intelligence.

Originally featured in August.

TREES OF LIFE

Since the dawn of recorded history, humanity has been turning to the visual realm as a sensemaking tool for the world and our place in it, mapping and visualizing everything from the body to the brain to the universe to information itself. Trees of Life: A Visual History of Evolution (public library) catalogs 230 tree-like branching diagrams, culled from 450 years of mankind’s visual curiosity about the living world and our quest to understand the complex ecosystem we share with other organisms, from bacteria to birds, microbes to mammals.

Though the use of a tree as a metaphor for understanding the relationships between organisms is often attributed to Darwin, who articulated it in his Origin of Species by Means of Natural Selection in 1859, the concept, most recently appropriated in mapping systems and knowledge networks, is actually much older, predating the theory of evolution itself. The collection is thus at once a visual record of the evolution of science and of its opposite — the earliest examples, dating as far back as the sixteenth century, portray the mythic order in which God created Earth, and the diagrams’ development over the centuries is as much a progression of science as it is of culture, society, and paradigm.

Theodore W. Pietsch writes in the introduction:

The tree as an iconographic metaphor is perhaps the most universally widespread of all great cultural symbols. Trees appear and reappear throughout human history to illustrate nearly every aspect of life. The structural complexity of a tree — its roots, trunk, bifurcating branches, and leaves — has served as an ideal symbol throughout the ages to visualize and map hierarchies of knowledge and ideas.

The Ladder of Ascent and Descent of the Intellect, not tree-like at first glance, but certainly branching dichotomously, the steps labeled from bottom to top, with representative figures on the right and upper left: Lapis (stone), Flamma (fire), Planta (plant), Brutum (beast), Homo (human), Caelum (sky), Angelus (angel), and Deus (God), a scheme that shows how one might ascend from inferior to superior beings and vice versa. After Ramon Lull (1232–1315), Liber de ascensu et descensu intellectus, written about 1305 but not published until 1512.

The 'Crust of the Earth as Related to Zoology,' presenting, at one glance, the 'distribution of the principal types of animals, and the order of their successive appearance in the layers of the earth’s crust,' published by Louis Agassiz and Augustus Addison Gould as the frontispiece of their 1848 Principles of Zoölogy. The diagram is like a wheel with numerous radiating spokes, each spoke representing a group of animals, superimposed over a series of concentric rings of time, from pre-Silurian to the 'modern age.' According to a divine plan, different groups of animals appear within the various 'spokes' of the wheel and then, in some cases, go extinct. Humans enter only in the outermost layer, at the very top of the diagram, shown as the crowning achievement of all Creation.

'Genealogy of the class of fishes' published by Louis Agassiz in his Recherches sur les poissons fossiles (Research on fossil fishes) of 1844.

The 'genealogico-geographical affinities' of plant families based on the natural orders of Carl Linnaeus (1751), published by Paul Dietrich Giseke in 1792. Each family is represented by a numbered circle (roman numerals), the diameter of which gives a rough measure of the relative number of included genera (arabic numerals).

The unique egg-shaped 'system of animals' published by German zoologist Georg August Goldfuss in his Über die Entwicklungsstufen des Thieres (On animal development) of 1817.

'Universal system of nature,' from Paul Horaninow’s Primae lineae systematis naturae (Primary system of nature) of 1834, an ingenious and seemingly indecipherable clockwise spiral that places animals in the center of the vortex, arranged in a series of concentric circles, surrounded in turn by additional nested circles that contain the plants, nonmetallic minerals, and finally metals within the outermost circle. Not surprisingly, everything is subjugated to humans (Homo) located in the center.

Ernst Haeckel’s famous 'great oak,' a family tree of animals, from the first edition of his 1874 Anthropogenie oder Entwickelungsgeschichte des Menschen (The evolution of man).

(More on Haeckel’s striking biological art here.)

Tree by John Henry Schaffner showing the relationships of the flowering plants. The early split at the base of the tree leads to the monocotyledonous plants on the left and the dicotyledons on the right.

Schaffner, 1934, Quarterly Review of Biology, 9(2):150, fig. 2; courtesy of Perry Cartwright and the University of Chicago Press.

A phylogeny of horses showing their geological distribution throughout the Tertiary, by Ruben Arthur Stirton.

Stirton, 1940, plate following page 198; courtesy of Rebecca Wells and the University of California Press.

Ruben Arthur Stirton’s revised view of horse phylogeny.

Stirton, 1959, Time, Life, and Man: The Fossil Record, p. 466, fig. 250; courtesy of Sheik Safdar and John Wiley & Sons, Inc. Used with permission.

William King Gregory’s 1946 tree of rodent relationships.

Gregory, 1951, Evolution Emerging: A Survey of Changing Patterns from Primeval Life to Man, vol. 2, p. 757; fig. 20.33; courtesy of Mary DeJong, Mai Qaraman, and the American Museum of Natural History.

The frontispiece of William King Gregory’s two-volume Evolution Emerging.

Gregory, 1951, Evolution Emerging: A Survey of Changing Patterns from Primeval Life to Man, vol. 2, p. 757; fig. 20.33; courtesy of Mary DeJong, Mai Qaraman, and the American Museum of Natural History.

Originally featured in May.

SPACE CHRONICLES

Neil deGrasse Tyson might be one of today’s most prominent astrophysicists, but he’s also a kind of existential philosopher, bringing his insights from science into the broader realm of the human condition — a kind of modern-day Carl Sagan with a rare gift for blending science and storytelling to both rub neurons with his fellow scientists and engage a popular-interest audience. In Space Chronicles: Facing the Ultimate Frontier, Tyson explores the future of space travel in the wake of NASA’s decision to put human space flight essentially on hold, using his signature wit and scientific prowess to lay out an urgent manifesto for the economic, social, moral, and cultural importance of space exploration. This excerpt from the introduction captures Tyson’s underlying ethos and echoes other great thinkers’ ideas about intuition and rationality, blending the psychosocial with the political:

Some of the most creative leaps ever taken by the human mind are decidedly irrational, even primal. Emotive forces are what drive the greatest artistic and inventive expressions of our species. How else could the sentence ‘He’s either a madman or a genius’ be understood?

It’s okay to be entirely rational, provided everybody else is too. But apparently this state of existence has been achieved only in fiction [where] societal decisions get made with efficiency and dispatch, devoid of pomp, passion, and pretense.

To govern a society shared by people of emotion, people of reason, and everybody in between — as well as people who think their actions are shaped by logic but in fact are shaped by feelings and nonempirical philosophies — you need politics. At its best, politics navigates all the mind-states for the sake of the greater good, alert to the rocky shoals of community, identity, and the economy. At its worst, politics thrives on the incomplete disclosure or misrepresentation of data required by an electorate to make informed decisions, whether arrived at logically or emotionally.

Nowhere does Tyson’s gift shine more brilliantly than in this goosebump-inducing mashup by Max Schlickenmeyer, remixing images of nature at its most inspiring with the narration of Tyson’s answer to a TIME magazine reader, who asked, “What is the most astounding fact you can share with us about the Universe?”

When I look up at the night sky and I know that, yes, we are part of this Universe, we are in this Universe, but perhaps more important than both of those facts is that the Universe is in us. When I reflect on that fact, I look up — many people feel small, because they’re small, the Universe is big — but I feel big, because my atoms came from those stars. There’s a level of connectivity — that’s really what you want in life. You want to feel connected, you want to feel relevant. You want to feel like you’re a participant in the goings on and activities and events around you. That’s precisely what we are, just by being alive.

Originally featured in March.

HIDDEN TREASURE

For the past 175 years, the National Library of Medicine in Bethesda has been building the world’s largest collection of biomedical images, artifacts, and ephemera. With more than 17 million items spanning ten centuries, it’s a treasure trove of rare, obscure, extravagant wonders, most of which remain unseen by the public and unknown even to historians, librarians, and curators. Until now.

Hidden Treasure is an exquisite large-format volume that culls some of the most fascinating, surprising, beautiful, gruesome, and idiosyncratic objects from the Library’s collection in 450 full-color illustrations. From rare “magic lantern slides” doctors used to entertain and cure inmates at the St. Elizabeth’s Hospital for the Insane to astonishing anatomical atlases to the mimeographed report of the Japanese medical team first to enter Hiroshima after the atomic blast, each of these curious objects is contextualized in a brief essay by a prominent scholar, journalist, artist, collector, or physician. What results is a remarkable journey not only into the evolution of mankind’s understanding of the physicality of being human, but also into the evolution of librarianship itself, amidst the age of the digital humanities.

The Artificial Teledioptric Eye, or Telescope (1685-86) by Johann Zahn

Zahn's baroque diagram of the anatomy of vision (left) needs to be viewed in relation to his creation of a mechanical eye (right), the scioptric ball designed to project the image of the sun in a camera obscura

Printed book, 3 volumes

International Nurse Uniform Photograph Collection (ca. 1950), Helene Fuld Health Foundation

Left to right, top to bottom: Philippines, Denmark, British Honduras; Hong Kong, Madeira, Kenya; Nepal, Dominican Republic, Colombia

Jersey City, New Jersey. 93 color photographs, glossy

Michael North, Jeffrey Reznick, and Michael Sappol remind us in the introduction:

It’s no secret that nowadays we look for libraries on the Internet — without moving from our desks or laptops or mobile phones… We’re in a new and miraculous age. But there are still great libraries, in cities and on campuses, made of brick, sandstone, marble, and glass, containing physical objects, and especially enshrining the book: the Library of Congress, Bibliotheque Nationale de France, the British Library, the New York Public Library, the Wellcome Library, the great university libraries at Oxford, Harvard, Yale, Johns Hopkins, and elsewhere. And among them is the National Library of Medicine in Bethesda, the world’s largest medical library, with its collection of over 17 million books, journals, manuscripts, prints, photographs, posters, motion pictures, sound recordings, and “ephemera” (pamphlets, matchbook covers, stereograph cards, etc.).

Complete Notes on the Dissection of Cadavers (1772)

Muscles and attachments

Kaishi Hen. Kyoto, Japan. Printed woodblock book, color illustrations

Darwin Collection (1859-1903)

The expression of emotions in cats and dogs, The Expression of the Emotions in Man and Animals (London, 1872)

London, New York, and other locations

(Also see how Darwin’s photographic studies of human emotions changed visual culture forever.)

Civil War Surgical Card Collection (1860s)

The Army Medical Museum's staff mined incoming reports for 'interesting' cases, such as a gunshot wound to the 'left side of scalp, denuding skull' or 'gunshot wound, right elbow with gangrene supervening,' and for cases that demonstrated the use of difficult surgical techniques, such as an amputation by circular incision or resection of the 'head of humerus and three inches of the left clavicle.'

Washington, DC. 146 numbered cards, with tipped-in photographs and case histories

Studies in Anatomy of the Nervous System and Connective Tissue (1875-76) by Axel Key and Gustaf Retzius

Arachnoid villi, or pacchionian bodies, of the human brain.

Studien in der Anatomie des Nervensystems und des Bindegewebes. Stockholm. Printed book, with color and black-and-white lithographs, 2 volumes.

Anti-Germ Warfare Campaign Posters (ca. 1952), Second People's Cultural Institute

Hand-drawn Korean War propaganda posters, from two incomplete sequences in the collection of Chinese medical and health materials acquired by the National Library of Medicine

Fuping County, Shaanxi Province, China. Hand-inked and painted posters on paper.

Medical Trade Card Collection (ca. 1920-1940s)

The front of a Dr. Miles' Laxative Tablets movable, die-cut advertising novelty card, lowered and raised (Elkhart, Indiana, ca. 1910)

France, Great Britain, Mexico, United States, and other countries. Donor: William Helfand

Thoughtfully curated, beautifully produced, and utterly transfixing, Hidden Treasure unravels our civilization’s relationship with that most human of humannesses. Because try as we might to order the heavens, map the mind, and chart time in our quest to know the abstract, we will have failed at being human if we neglect this most fascinating frontier of concrete existence, the mysterious and ever-alluring physical body.

Originally featured, with more images, in April.

Honorable mention: The Art of Medicine.

QUANTUM UNIVERSE

“The universe is made of stories, not atoms,” poet Muriel Rukeyser famously remarked. “We’re made of star-stuff,” Carl Sagan countered. But some of the most fascinating and important stories are those that explain atoms and “star stuff.” Such is the case of The Quantum Universe: Everything That Can Happen Does Happen by rockstar-physicist Brian Cox and University of Manchester professor Jeff Forshaw — a remarkable and absorbing journey into the fundamental fabric of nature, exploring how quantum theory provides a framework for explaining everything from silicon chips to stars to human behavior.

Quantum theory is perhaps the prime example of the infinitely esoteric becoming the profoundly useful. Esoteric, because it describes a world in which a particle really can be in several places at once and moves from one place to another by exploring the entire Universe simultaneously. Useful, because understanding the behaviour of the smallest building blocks of the universe underpins our understanding of everything else. This claim borders on the hubristic, because the world is filled with diverse and complex phenomena. Notwithstanding this complexity, we have discovered that everything is constructed out of a handful of tiny particles that move around according to the rules of quantum theory. The rules are so simple that they can be summarized on the back of an envelope. And the fact that we do not need a whole library of books to explain the essential nature of things is one of the greatest mysteries of all.

The story weaves together a century of scientific hindsight and theoretical development, from Einstein to Feynman by way of Max Planck, who coined the term “quantum” in 1900 when he proposed that hot objects emit their “black body radiation” as light in little packets of energy he called “quanta,” to arrive at a modern perspective on quantum theory and its primary role in predicting observable phenomena.

The picture of the universe we inhabit, as revealed by modern physics, [is] one of underlying simplicity; elegant phenomena dance away out of sight and the diversity of the macroscopic world emerges. This is perhaps the crowning achievement of modern science; the reduction of the tremendous complexity in the world, human beings included, to a description of the behaviour of just a handful of tiny subatomic particles and the four forces that act between them.

To demonstrate that quantum theory is intimately entwined with the fabric of our everyday, rather than a weird and esoteric fringe of science, Cox offers an example rooted in the familiar. (An example, in this particular case, based on the wrong assumption — I was holding an iPad — in a kind of ironic meta-wink from Heisenberg’s uncertainty principle.)

Consider the world around you. You are holding a book made of paper, the crushed pulp of a tree. Trees are machines able to take a supply of atoms and molecules, break them down and rearrange them into cooperating colonies composed of many trillions of individual parts. They do this using a molecule known as chlorophyll, composed of over a hundred carbon, hydrogen and oxygen atoms twisted into an intricate shape with a few magnesium and nitrogen atoms bolted on. This assembly of particles is able to capture the light that has travelled the 93 million miles from our star, a nuclear furnace the volume of a million earths, and transfer that energy into the heart of cells, where it is used to build molecules from carbon dioxide and water, giving out life-enriching oxygen as it does so. It’s these molecular chains that form the superstructure of trees and all living things, the paper in your book. You can read the book and understand the words because you have eyes that can convert the scattered light from the pages into electrical impulses that are interpreted by your brain, the most complex structure we know of in the Universe. We have discovered that all these things are nothing more than assemblies of atoms, and that the wide variety of atoms are constructed using only three particles: electrons, protons and neutrons. We have also discovered that the protons and neutrons are themselves made up of smaller entities called quarks, and that is where things stop, as far as we can tell today. Underpinning all of this is quantum theory.

But at the core of The Quantum Universe are a handful of grand truths that transcend the realm of science as an academic discipline and shine out into the vastest expanses of human existence: that in science, as in art, everything builds on what came before; that everything is connected to everything else; and, perhaps most importantly, that despite our greatest compulsions for control and certainty, much of the universe — to which the human heart and mind belong — remains reigned over by chance and uncertainty. Cox puts it this way:

A key feature of quantum theory [is that] it deals with probabilities rather than certainties, not because we lack absolute knowledge, but because some aspects of Nature are, at their very heart, governed by the laws of chance.

Originally featured in February.

BIG QUESTIONS

“If you wish to make an apple pie from scratch,” Carl Sagan famously observed in Cosmos, “you must first invent the universe.” The questions children ask are often so simple, so basic, that they turn unwittingly yet profoundly philosophical in requiring apple-pie-from-scratch type of answers. To explore this fertile intersection of simplicity and expansiveness, Gemma Elwin Harris asked thousands of primary school children between the ages of four and twelve to send in their most restless questions, then invited some of today’s most prominent scientists, philosophers, and writers to answer them. The result is Big Questions from Little People & Simple Answers from Great Minds (public library) — a compendium of fascinating explanations of deceptively simple everyday phenomena, featuring such modern-day icons as Mary Roach, Noam Chomsky, Philip Pullman, Richard Dawkins, and many more, with a good chunk of the proceeds being donated to Save the Children.

Alain de Botton explores why we have dreams:

Most of the time, you feel in charge of your own mind. You want to play with some Lego? Your brain is there to make it happen. You fancy reading a book? You can put the letters together and watch characters emerge in your imagination.

But at night, strange stuff happens. While you’re in bed, your mind puts on the weirdest, most amazing and sometimes scariest shows.

[…]

In the olden days, people believed that our dreams were full of clues about the future. Nowadays, we tend to think that dreams are a way for the mind to rearrange and tidy itself up after the activities of the day.

Why are dreams sometimes scary? During the day, things may happen that frighten us, but we are so busy we don’t have time to think properly about them. At night, while we are sleeping safely, we can give those fears a run around. Or maybe something you did during the day was lovely but you were in a hurry and didn’t give it time. It may pop up in a dream. In dreams, you go back over things you missed, repair what got damaged, make up stories about what you’d love, and explore the fears you normally put to the back of your mind.

Dreams are both more exciting and more frightening than daily life. They’re a sign that our brains are marvellous machines — and that they have powers we don’t often give them credit for, when we’re just using them to do our homework or play a computer game. Dreams show us that we’re not quite the bosses of our own selves.

Evolutionary biologist Richard Dawkins breaks down the math of evolution and cousin marriages to demonstrate that we are all related:

Yes, we are all related. You are a (probably distant) cousin of the Queen, and of the president of the United States, and of me. You and I are cousins of each other. You can prove it to yourself.

Everybody has two parents. That means, since each parent had two parents of their own, that we all have four grandparents. Then, since each grandparent had to have two parents, everyone has eight great-grandparents, and sixteen great-great-grandparents and thirty-two great-great-great-grandparents and so on.

You can go back any number of generations and work out the number of ancestors you must have had that same number of generations ago. All you have to do is multiply two by itself that number of times.

Suppose we go back ten centuries, that is to Anglo-Saxon times in England, just before the Norman Conquest, and work out how many ancestors you must have had alive at that time.

If we allow four generations per century, that’s about forty generations ago.

Two multiplied by itself forty times comes to more than a trillion. Yet the total population of the world at that time was only around three hundred million. Even today the population is seven billion, yet we have just worked out that a thousand years ago your ancestors alone were more than 150 times as numerous.

[…]

The real population of the world at the time of Julius Caesar was only a few million, and all of us, all seven billion of us, are descended from them. We are indeed all related. Every marriage is between more or less distant cousins, who already share lots and lots of ancestors before they have children of their own.
By the same kind of argument, we are distant cousins not only of all human beings but of all animals and plants. You are a cousin of my dog and of the lettuce you had for lunch, and of the next bird that you see fly past the window. You and I share ancestors with all of them. But that is another story.
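
The arithmetic is easy to verify for yourself. Here is a minimal Python sketch (my own illustration, not from the book) that doubles the ancestor count once per generation and compares it with the round population figures Dawkins cites: roughly three hundred million people a thousand years ago and seven billion today.

```python
# Rough check of the ancestor arithmetic, using the round figures from the passage above.

GENERATIONS = 40                     # about ten centuries at four generations per century
POPULATION_YEAR_1000 = 300_000_000   # approximate world population a thousand years ago
POPULATION_TODAY = 7_000_000_000     # approximate world population today

# Naively, every step back doubles the number of ancestral "slots":
# 2 parents, 4 grandparents, 8 great-grandparents, and so on.
naive_ancestors = 2 ** GENERATIONS
print(f"Naive ancestor slots {GENERATIONS} generations back: {naive_ancestors:,}")
# -> 1,099,511,627,776, i.e. more than a trillion

print(f"Times today's population: {naive_ancestors / POPULATION_TODAY:.0f}")
# -> about 157, Dawkins's "more than 150 times as numerous"

# The naive count dwarfs the real population of the time, so the same people must fill
# many slots in everyone's family tree; that is why every marriage is between cousins.
print(f"Slots per person actually alive then: {naive_ancestors / POPULATION_YEAR_1000:,.0f}")
# -> roughly 3,665 slots per person
```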

Neuroscientist David Eagleman explains why we can’t tickle ourselves:

To understand why, you need to know more about how your brain works. One of its main tasks is to try to make good guesses about what’s going to happen next. While you’re busy getting on with your life, walking downstairs or eating your breakfast, parts of your brain are always trying to predict the future.

Remember when you first learned how to ride a bicycle? At first, it took a lot of concentration to keep the handlebars steady and push the pedals. But after a while, cycling became easy. Now you’re not aware of the movements you make to keep the bike going. From experience, your brain knows exactly what to expect so your body rides the bike automatically. Your brain is predicting all the movements you need to make.

You only have to think consciously about cycling if something changes — like if there’s a strong wind or you get a flat tyre. When something unexpected happens like this, your brain is forced to change its predictions about what will happen next. If it does its job well, you’ll adjust to the strong wind, leaning your body so you don’t fall.

Why is it so important for our brains to predict what will happen next? It helps us make fewer mistakes and can even save our lives.

[…]

Because your brain is always predicting your own actions, and how your body will feel as a result, you cannot tickle yourself. Other people can tickle you because they can surprise you. You can’t predict what their tickling actions will be.

And this knowledge leads to an interesting truth: if you build a machine that allows you to move a feather, but the feather moves only after a delay of a second, then you can tickle yourself. The results of your own actions will now surprise you.

Particle physicist and cosmologist Lawrence Krauss explains why we’re all made of stardust:

Everything in your body, and everything you can see around you, is made up of tiny objects called atoms. Atoms come in different types called elements. Hydrogen, oxygen and carbon are three of the most important elements in your body.

[…]

How did those elements get into our bodies? The only way they could have got there, to make up all the material on our Earth, is if some of those stars exploded a long time ago, spewing all the elements from their cores into space. Then, about four and a half billion years ago, in our part of our galaxy, the material in space began to collapse. This is how the Sun was formed, and the solar system around it, as well as the material that forms all life on earth.

So, most of the atoms that now make up your body were created inside stars! The atoms in your left hand might have come from a different star from those in your right hand. You are really a child of the stars.

But my favorite answers are to the all-engulfing question, How do we fall in love? Author Jeanette Winterson offers this breathlessly poetic response:

You don’t fall in love like you fall in a hole. You fall like falling through space. It’s like you jump off your own private planet to visit someone else’s planet. And when you get there it all looks different: the flowers, the animals, the colours people wear. It is a big surprise falling in love because you thought you had everything just right on your own planet, and that was true, in a way, but then somebody signalled to you across space and the only way you could visit was to take a giant jump. Away you go, falling into someone else’s orbit and after a while you might decide to pull your two planets together and call it home. And you can bring your dog. Or your cat. Your goldfish, hamster, collection of stones, all your odd socks. (The ones you lost, including the holes, are on the new planet you found.)

And you can bring your friends to visit. And read your favourite stories to each other. And the falling was really the big jump that you had to make to be with someone you don’t want to be without. That’s it.

PS You have to be brave.

Evolutionary psychologist and sociologist Robin Dunbar balances out the poetics with a scientific look at what goes on inside the brain when we love:

What happens when we fall in love is probably one of the most difficult things in the whole universe to explain. It’s something we do without thinking. In fact, if we think about it too much, we usually end up doing it all wrong and get in a terrible muddle. That’s because when you fall in love, the right side of your brain gets very busy. The right side is the bit that seems to be especially important for our emotions. Language, on the other hand, gets done almost completely in the left side of the brain. And this is one reason why we find it so difficult to talk about our feelings and emotions: the language areas on the left side can’t send messages to the emotional areas on the right side very well. So we get stuck for words, unable to describe our feelings.

But science does allow us to say a little bit about what happens when we fall in love. First of all, we know that love sets off really big changes in how we feel. We feel all light-headed and emotional. We can be happy and cry with happiness at the same time. Suddenly, some things don’t matter any more and the only thing we are interested in is being close to the person we have fallen in love with.

These days we have scanner machines that let us watch a person’s brain at work. Different parts of the brain light up on the screen, depending on what the brain is doing. When people are in love, the emotional bits of their brains are very active, lighting up. But other bits of the brain that are in charge of more sensible thinking are much less active than normal. So the bits that normally say ‘Don’t do that because it would be crazy!’ are switched off, and the bits that say ‘Oh, that would be lovely!’ are switched on.

Why does this happen? One reason is that love releases certain chemicals in our brains. One is called dopamine, and this gives us a feeling of excitement. Another is called oxytocin and seems to be responsible for the light-headedness and cosiness we feel when we are with the person we love. When these are released in large quantities, they go to parts of the brain that are especially responsive to them.

But all this doesn’t explain why you fall in love with a particular person. And that is a bit of a mystery, since there seems to be no good reason for our choices. In fact, it seems to be just as easy to fall in love with someone after you’ve married them as before, which seems the wrong way round. And here’s another odd thing. When we are in love, we can trick ourselves into thinking the other person is perfect. Of course, no one is really perfect. But the more perfect we find each other, the longer our love will last.

Big Questions from Little People is a wonderful complement to The Where, the Why, and the How: 75 Artists Illustrate Wondrous Mysteries of Science and is certain to give you pause about much of what you thought you knew, or at the very least rekindle that childlike curiosity about and awe at the basic fabric of the world we live in.

Originally featured earlier this month.
