Brain Pickings

19 NOVEMBER, 2012

The Best Science Books of 2012

From cosmology to cosmic love, or what your biological clock has to do with diagramming evolution.

It’s that time of year again, the time for those highly subjective, grossly non-exhaustive, yet inevitable and invariably fun best-of reading lists. To kick off the season, here are, in no particular order, my ten favorite science books of 2012. (Catch up on last year’s reading list here.)

INTERNAL TIME

“Six hours’ sleep for a man, seven for a woman, and eight for a fool,” Napoleon famously prescribed. (He would have scoffed at Einstein, then, who was known to require ten hours of sleep for optimal performance.) This perceived superiority of those who can get by on less sleep isn’t just something Napoleon shared with dictators like Hitler and Stalin; it’s an enduring attitude woven into our social norms and expectations, from proverbs about early birds to the basic scheduling structure of education and the workplace. But in Internal Time: Chronotypes, Social Jet Lag, and Why You’re So Tired (public library), a fine addition to these 7 essential books on time, German chronobiologist Till Roenneberg demonstrates through a wealth of research that our sleep patterns have little to do with laziness and other such scorned character flaws, and everything to do with biology.

In fact, each of us possesses a different chronotype — an internal timing type best defined by your midpoint of sleep, or midsleep, which you can calculate by dividing your average sleep duration by two and adding the resulting number to your average bedtime on free days, meaning days when your sleep and waking times are not dictated by the demands of your work or school schedule. For instance, if you go to bed at 11 P.M. and wake up at 7 A.M., you sleep eight hours; add half of that, four hours, to 11 P.M. and you get 3 A.M. as your midsleep.
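To make the arithmetic concrete, here is a minimal sketch of that calculation in Python; the function name and the times are illustrative choices of mine, not the book’s.

```python
from datetime import datetime, timedelta

def midsleep(bedtime: str, wake_time: str) -> str:
    """Midpoint of sleep: bedtime plus half the sleep duration."""
    fmt = "%H:%M"
    start = datetime.strptime(bedtime, fmt)
    end = datetime.strptime(wake_time, fmt)
    if end <= start:                 # the night crosses midnight
        end += timedelta(days=1)
    return (start + (end - start) / 2).strftime(fmt)

# The example from the text: bed at 11 P.M., up at 7 A.M. -> midsleep at 3 A.M.
print(midsleep("23:00", "07:00"))    # prints 03:00
```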

The distribution of midsleep in Central Europe. The midsleep times (on free days) of over 60 percent of the population fall between 3:30 and 5:30 A.M.

Roenneberg traces the evolutionary roots of different sleep cycles and argues that while earlier chronotypes might have had a social advantage in agrarian and industrial societies, today’s world of time-shift work and constant connectivity has invalidated such advantages but left behind the social stigma around later chronotypes.

This myth that early risers are good people and that late risers are lazy has its reasons and merits in rural societies but becomes questionable in a modern 24/7 society. The old moral is so prevalent, however, that it still dominates our beliefs, even in modern times. The postman doesn’t think for a second that the young man might have worked until the early morning hours because he is a night-shift worker or for other reasons. He labels healthy young people who sleep into the day as lazy — as long sleepers. This attitude is reflected in the frequent use of the word-pair early birds and long sleepers [in the media]. Yet this pair is nothing but apples and oranges, because the opposite of early is late and the opposite of long is short.

Roenneberg goes on to explore sleep duration, a measure of sleep types that complements midsleep, demonstrating just as wide a spectrum of short and long sleepers and debunking the notion that people who get up late sleep longer than others — this judgment, after all, is based on the assumption that everyone goes to bed at the same time, which we increasingly do not.

Sleep duration shows a bell-shaped distribution within a population, but there are more short sleepers (on the left) than long sleepers (on the right).

The disconnect between our internal, biological time and social time — defined by our work schedules and social engagements — leads to what Roenneberg calls social jet lag, a kind of chronic exhaustion resembling the symptoms of jet lag and comparable to having to work for a company a few time zones to the east of your home.

Unlike what happens in real jet lag, people who suffer from social jet lag never leave their home base and can therefore never adjust to a new light-dark environment … While real jet lag is acute and transient, social jet lag is chronic. The amount of social jet lag that an individual is exposed to can be quantified as the difference between midsleep on free days and midsleep on work days … Over 40 percent of the Central European population suffers from social jet lag of two hours or more, and the internal time of over 15 percent is three hours or more out of synch with external time. There is no reason to assume that this would be different in other industrialized nations.
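Following that definition, social jet lag is simply the gap between the two midpoints. Here is a small sketch using hypothetical work-day and free-day schedules; the numbers are mine, not Roenneberg’s data.

```python
from datetime import datetime, timedelta

def midsleep_clock(bedtime: str, wake_time: str) -> float:
    """Midpoint of sleep, as hours after midnight on a 24-hour clock."""
    fmt = "%H:%M"
    start = datetime.strptime(bedtime, fmt)
    end = datetime.strptime(wake_time, fmt)
    if end <= start:                  # the night crosses midnight
        end += timedelta(days=1)
    mid = start + (end - start) / 2
    return mid.hour + mid.minute / 60

# A hypothetical late chronotype: alarm-driven work days vs. unconstrained free days.
work_mid = midsleep_clock("23:30", "06:30")   # 3.0 -> 3:00 A.M.
free_mid = midsleep_clock("01:30", "10:30")   # 6.0 -> 6:00 A.M.

# Both midpoints fall in the small hours, so a plain difference suffices here.
print(f"Social jet lag: {free_mid - work_mid:.1f} hours")   # 3.0 hours
```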

The scissors of sleep. Depending on chronotype, sleep duration can be very different between work days and free days.

Chronotypes vary with age:

Young children are relatively early chronotypes (to the distress of many young parents), and then gradually become later. During puberty and adolescence humans become true night owls, and then around twenty years of age reach a turning point and become earlier again for the rest of their lives. On average, women reach this turning point at nineteen and a half while men start to become earlier again at twenty-one … [T]his clear turning point in the developmental changes of chronotype … [is] the first biological marker for the end of adolescence.

Roenneberg points out that in our culture, there is a great disconnect between teenagers’ biological abilities and our social expectations of them, encapsulated in what is known as the disco hypothesis — the notion that if only teens would go to bed earlier, meaning not party until late, they’d be better able to wake up clear-headed and ready for school at the expected time. The data, however, indicate otherwise — adolescents’ internal time is shifted so they don’t find sleep before the small hours of the night, a pattern also found in the life cycle of rodents.

Here, we brush up against a painfully obtrusive cultural obstacle: School starts early — as early as 7 A.M. in some European countries — and teens are expected to perform well on a schedule not designed with their internal time in mind. As a result, studies have shown that many students show the signs of narcolepsy — a severe sleeping disorder that makes one fall asleep at once when given the chance, immediately entering REM sleep. The implications are worrisome:

Teenagers need around eight to ten hours of sleep but get much less during their workweek. A recent study found that when the starting time of high school is delayed by an hour, the percentage of students who get at least eight hours of sleep per night jumps from 35.7 percent to 50 percent. Adolescent students’ attendance rate, their performance, their motivation, even their eating habits all improve significantly if school times are delayed.

Similar detrimental effects of social jet lag are found in shift work, which Roenneberg calls “one of the most blatant assaults on the body clock in modern society.” (And while we may be tempted to equate shift work with the service industry, any journalist, designer, developer, or artist who works well into the night on deadline can relate — hey, it’s well past midnight again as I’m writing this.) In fact, the World Health Organization recently classified “shift work that involves circadian disruption” as a potential cause of cancer, and the consequences of social jet lag and near-narcolepsy extend beyond the usual suspects of car accidents and medical errors:

We are only beginning to understand the potentially detrimental consequences of social jet lag. One of these has already been worked out with frightening certainty: the more severe the social jet lag that people suffer, the more likely it is that they are smokers. This is not a question of quantity (number of cigarettes per day) but simply whether they are smokers or not … Statistically, we experience the worst social jet lag as teenagers, when our body clocks are drastically delayed for biological reasons, but we still have to get up at the same traditional times for school. This coincides with the age when most individuals start smoking. Assuredly there are many different reasons people start smoking at that age, but social jet lag certainly contributes to the risk.

If young people’s psychological and emotional well-being isn’t incentive enough for policy makers — who, by the way, Roenneberg’s research indicates tend to be early chronotypes themselves — to consider later school times, one would think their health should be.

The correlation between social jet lag and smoking continues later in life as well, particularly when it comes to quitting:

[T]he less stress smokers have, the easier it is for them to quit. Social jet lag is stress, so the chances of successfully quitting smoking are higher when the mismatch of internal and external time is smaller. The numbers connecting smoking with social jet lag are striking: Among those who suffer less than an hour of social jet lag per day, we find 15 to 20 percent are smokers. This percentage systematically rises to over 60 percent when internal and external time are more than five hours out of sync.

Another factor contributing to our social jet lag is Daylight Saving Time. Even though DST’s proponents argue that it’s just one small hour, the data suggest that between October and March, DST throws off our body clocks by up to four weeks, depending on our latitude, not allowing our bodies to properly adjust to the time change, especially if we happen to be later chronotypes. The result is increased social jet lag and decreased sleep duration.

But what actually regulates our internal time? Though the temporal structures of sun time — tide, day, month, and year — play a significant role in the lives of all organisms, our biological clocks evolved in a “time-free” world and are somewhat independent of such external stimuli as light and dark. For instance, early botanical studies showed that a mimosa plant kept in a pitch-dark closet would still open and close its leaves the way it does in the day-night cycle, and subsequent studies of human subjects confined to dark bunkers showed similar preservation of their sleep and waking patterns, which followed, albeit imperfectly, the 24-hour cycle of day and night.

Our internal clocks, in fact, can be traced down to the genetic level, with individual “clock genes” and, most prominently, the suprachiasmatic nucleus, or SCN — a small region in the brain’s midline that acts as a kind of “master clock” for mammals, regulating neuronal and hormonal activity around our circadian rhythms. Roenneberg explains how our internal clocks work on the DNA level:

In the nucleus, the DNA sequence of a clock gene is transcribed to mRNA; the resulting message is exported from the nucleus, translated into a clock protein, and is then modified. This clock protein is itself part of the molecular machinery that controls the transcription of its ‘own’ gene. When enough clock proteins have been made, they are imported back into the nucleus, where they start to inhibit the transcription of their own mRNA. Once this inhibition is strong enough, no more mRNA molecules are transcribed, and the existing ones are gradually destroyed. As a consequence, no more proteins can be produced and the existing ones will also gradually be destroyed. When they are all gone, the transcriptional machinery is not suppressed anymore, and a new cycle can begin.

[…]

Despite this complexity, the important take-home message is that daily rhythms are generated by molecular mechanisms that could potentially work in a single cell, for example a single neuron of the SCN.
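To get a feel for how a delayed negative feedback loop of this kind can sustain a rhythm on its own, here is a minimal simulation sketch of a generic Goodwin-style oscillator, a classic textbook abstraction of transcription-translation feedback rather than Roenneberg’s own model; every rate and the Hill exponent below are illustrative assumptions, not measured values.

```python
# A generic Goodwin-style feedback loop: mRNA (m) is translated into a clock
# protein (p), which is modified into a nuclear inhibitor (i) that represses
# transcription of its own gene. All parameter values are purely illustrative.
def simulate(hours=240.0, dt=0.01):
    m, p, i = 0.1, 0.1, 0.1
    prod, deg, n = 1.0, 0.2, 12    # production rate, degradation rate (per hour), Hill exponent
    trace = []
    for step in range(int(hours / dt)):
        dm = prod / (1.0 + i ** n) - deg * m   # transcription, shut off by the inhibitor
        dp = prod * m - deg * p                # translation into clock protein
        di = prod * p - deg * i                # modified protein returns to the nucleus
        m, p, i = m + dm * dt, p + dp * dt, i + di * dt
        trace.append((step * dt, m))
    return trace

trace = simulate()
# Print the times at which mRNA peaks; their regular spacing is the rhythm the loop produces.
peaks = [round(t, 1) for j, (t, v) in enumerate(trace[1:-1], start=1)
         if trace[j - 1][1] < v > trace[j + 1][1]]
print("mRNA peaks (hours):", peaks)
```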

Internal Time goes on to illuminate many other aspects of how chronotypes and social jet lag impact our daily lives, from birth and suicide rates to when we borrow books from the library to why older men marry younger women, and even why innovators and entrepreneurs tend to have later chronotypes.

Originally featured at length in May.

THE WHERE, THE WHY, AND THE HOW

At the intersection of art and science, The Where, the Why, and the How: 75 Artists Illustrate Wondrous Mysteries of Science (public library) invites some of today’s most celebrated artists to create scientific illustrations and charts to accompany short essays about the most fascinating unanswered questions on the minds of contemporary scientists across biology, astrophysics, chemistry, quantum mechanics, anthropology, and more. The questions cover such mind-bending subjects as whether there are more than three dimensions, why we sleep and dream, what causes depression, how long trees live, and why humans are capable of language.

The images, which come from a mix of well-known titans and promising up-and-comers, including favorites like Lisa Congdon, Gemma Correll, and Jon Klassen, borrow inspiration from antique medical illustrations, vintage science diagrams, and other historical ephemera from periods of explosive scientific curiosity.

Above all, the project is a testament to the idea that ignorance is what drives discovery and wonder is what propels science — a reminder to, as Rilke put it, live the questions and delight in reflecting on the mysteries themselves. The book’s three creators urge in the introduction:

With this book, we wanted to bring back a sense of the unknown that has been lost in the age of information. … Remember that before you do a quick online search for the purpose of the horned owl’s horns, you should give yourself some time to wonder.

The motion graphics book trailer is an absolute masterpiece in itself.

Pondering the age-old question of why the universe exists, Brian Yanny asks:

Was there an era before our own, out of which our current universe was born? Do the laws of physics, the dimensions of space-time, the strengths and types and asymmetries of nature’s forces and particles, and the potential for life have to be as we observe them, or is there a branching multi-verse of earlier and later epochs filled with unimaginably exotic realms? We do not know.

What existed before the big bang?

Illustrated by Josh Cochran

Exploring how gravity works, Terry Matilsky notes:

[T]he story is not finished. We know that general relativity is not the final answer, because we have not been able to synthesize gravity with the other known laws of physics in a comprehensive “theory of everything.”

How does gravity work?

Illustrated by The Heads of State

Zooming in on the microcosm of our own bodies and their curious behaviors, Jill Conte considers why we blush:

The ruddy or darkened hue of a blush occurs when muscles in the walls of blood vessels within the skin relax and allow more blood to flow. Interestingly, the skin of the blush region contains more blood vessels than do other parts of the body. These vessels are also larger and closer to the surface, which indicates a possible relationship among physiology, emotion, and social communication. While it is known that blood flow to the skin, which serves to feed cells and regulate surface body temperature, is controlled by the sympathetic nervous system, the exact mechanism by which this process is activated specifically to produce a blush remains unknown.

What is dark matter?

Illustrated by Betsy Walton

Equal parts delightful and illuminating, The Where, the Why, and the How is the kind of treat bound to tickle your brain from both sides.

Originally featured in October.

IN PURSUIT OF THE UNKNOWN

When legendary theoretical physicist Stephen Hawking was setting out to release A Brief History of Time, one of the most influential science books in modern history, his publishers admonished him that every equation included would halve the book’s sales. Undeterred, he dared include E = mc², even though omitting it would allegedly have sold another 10 million copies. The anecdote captures the extent of our culture’s distaste for, if not fear of, equations. And yet, argues mathematician Ian Stewart in In Pursuit of the Unknown: 17 Equations That Changed the World, equations have held remarkable power in facilitating humanity’s progress and, as such, call for rudimentary understanding as a form of our most basic literacy.

Stewart writes:

The power of equations lies in the philosophically difficult correspondence between mathematics, a collective creation of human minds, and an external physical reality. Equations model deep patterns in the outside world. By learning to value equations, and to read the stories they tell, we can uncover vital features of the world around us… This is the story of the ascent of humanity, told in 17 equations.

From how the Pythagorean theorem, which linked geometry and algebra, laid the groundwork of the best current theories of space, time, and gravity to how the Navier-Stokes equation applies to modeling climate change, Stewart delivers a scientist’s gift in a storyteller’s package to reveal how these seemingly esoteric equations are really the foundation for nearly everything we know and use today.

Some of the most revolutionary of the breakthroughs Stewart outlines came from thinkers actively interested in both the sciences and the humanities. Take René Descartes, for instance, who is best remembered for his timeless soundbite, Cogito ergo sum: “I think, therefore I am.” But Descartes’ interests, Stewart points out, extended beyond philosophy and into science and mathematics. In 1639, he observed a curious numerical pattern in regular solids — what was true of a cube was also true of a dodecahedron or an icosahedron, for all of which subtracting from the number of faces the number of edges and then adding the number of vertices equaled 2. (Try it: A cube has 6 faces, 12 edges, and 8 vertices, so 6 – 12 + 8 = 2.) But Descartes, perhaps enchanted by philosophy’s grander questions, saw the equation as a minor curiosity and never published it. Only centuries later did mathematicians recognize it as monumentally important. It eventually resulted in Euler’s formula, which helps explain everything from how enzymes act on cellular DNA to why the motion of the celestial bodies can be chaotic.
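Descartes’ observation is easy to check for all five regular solids; here is a tiny sketch doing exactly that (the face, edge, and vertex counts are the standard ones, and the code is only illustrative):

```python
# Faces, edges, and vertices of the five Platonic solids.
solids = {
    "tetrahedron":  (4, 6, 4),
    "cube":         (6, 12, 8),
    "octahedron":   (8, 12, 6),
    "dodecahedron": (12, 30, 20),
    "icosahedron":  (20, 30, 12),
}

for name, (faces, edges, vertices) in solids.items():
    # Descartes' pattern, later Euler's formula: F - E + V = 2
    print(f"{name}: {faces} - {edges} + {vertices} = {faces - edges + vertices}")
```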

So how did equations begin, anyway? Stewart explains:

An equation derives its power from a simple source. It tells us that two calculations, which appear different, have the same answer. The key symbol is the equals sign, =. The origins of most mathematical symbols are either lost in the mists of antiquity, or are so recent that there is no doubt where they came from. The equals sign is unusual because it dates back more than 450 years, yet we not only know who invented it, we even know why. The inventor was Robert Recorde, in 1557, in The Whetstone of Witte. He used two parallel lines (he used an obsolete word gemowe, meaning ‘twin’) to avoid tedious repetition of the words ‘is equal to’. He chose that symbol because ‘no two things can be more equal’. Recorde chose well. His symbol has remained in use for 450 years.

The original coinage appeared as follows:

To avoide the tediouse repetition of these woordes: is equalle to: I will sette as I doe often in woorke use, a paire of paralleles, or gemowe lines of one lengthe: =, bicause noe .2. thynges, can be moare equalle.

Far from being a mere math primer or trivia aid, In Pursuit of the Unknown is an essential piece of modern literacy, wrapped in an articulate argument for why this kind of knowledge should be precisely that.

Stewart concludes by turning his gaze towards the future, offering a kind of counter-vision to algo-utopians like Stephen Wolfram and making, instead, a case for the reliable humanity of the equation:

It is still entirely credible that we might soon find new laws of nature based on discrete, digital structures and systems. The future may consist of algorithms, not equations. But until that day dawns, if ever, our greatest insights into nature’s laws take the form of equations, and we should learn to understand them and appreciate them. Equations have a track record. They really have changed the world — and they will change it again.

Originally featured in full in April.

IGNORANCE

“Science is always wrong,” George Bernard Shaw famously proclaimed in a toast to Albert Einstein. “It never solves a problem without creating 10 more.”

In the fifth century BC, long before science as we know it existed, Socrates famously observed, “I know one thing, that I know nothing.” Some 21 centuries later, when he published his Principia in 1687, Sir Isaac Newton likely knew all there was to know in science — a time when it was possible for a single human brain to hold all of mankind’s scientific knowledge. Fast-forward to today, and the average high school student has more scientific knowledge than Newton did at the end of his life. But somewhere along that superhighway of progress, we seem to have developed a kind of fact-fetishism that shackles us to the allure of the known and makes us indifferent to the unknown knowable. Yet it’s the latter — the unanswered questions — that makes science, and life, interesting. That’s the eloquently argued case at the heart of Ignorance: How It Drives Science, in which Stuart Firestein sets out to debunk the popular idea that knowledge follows ignorance, demonstrating instead that it’s the other way around and, in the process, laying out a powerful manifesto for getting the public engaged with science — a public to whom, as Neil deGrasse Tyson recently reminded the Senate, the government is accountable in making the very decisions that shape the course of science.

The tools and currencies of our information economy, Firestein points out, are doing little in the way of fostering the kind of question-literacy essential to cultivating curiosity:

Are we too enthralled with the answers these days? Are we afraid of questions, especially those that linger too long? We seem to have come to a phase in civilization marked by a voracious appetite for knowledge, in which the growth of information is exponential and, perhaps more important, its availability easier and faster than ever.*

(For a promise of a solution, see Clay Johnson’s excellent The Information Diet.)

The cult of expertise — whose currency is static answers — obscures the very capacity for cultivating a thirst for ignorance:

There are a lot of facts to be known in order to be a professional anything — lawyer, doctor, engineer, accountant, teacher. But with science there is one important difference. The facts serve mainly to access the ignorance… Scientists don’t concentrate on what they know, which is considerable but minuscule, but rather on what they don’t know…. Science traffics in ignorance, cultivates it, and is driven by it. Mucking about in the unknown is an adventure; doing it for a living is something most scientists consider a privilege.

[…]

Working scientists don’t get bogged down in the factual swamp because they don’t care all that much for facts. It’s not that they discount or ignore them, but rather that they don’t see them as an end in themselves. They don’t stop at the facts; they begin there, right beyond the facts, where the facts run out. Facts are selected, by a process that is a kind of controlled neglect, for the questions they create, for the ignorance they point to.

Firestein, who chairs the Department of Biological Sciences at Columbia University, stresses that beyond simply accumulating facts, scientists use them as raw material, not finished product. He cautions:

Mistaking the raw material for the product is a subtle error but one that can have surprisingly far-reaching consequences. Understanding this error and its ramifications, and setting it straight, is crucial to understanding science.

What emerges is an elegant definition of science:

Real science is a revision in progress, always. It proceeds in fits and starts of ignorance.

(What is true of science is actually also true of all creativity: As Jonah Lehrer puts it “The only way to be creative over time — to not be undone by our expertise — is to experiment with ignorance, to stare at things we don’t fully understand.” Einstein knew that, too, when he noted that without a preoccupation with “the eternally unattainable in the field of art and scientific research, life would have seemed… empty.” And Kathryn Schulz touched on it with her meditation on pessimistic meta-induction.)

In highlighting this commonality science holds with other domains of creative and intellectual labor, Firestein turns to the poet John Keats, who described the ideal state of the literary psyche as Negative Capability — “that is when a man is capable of being in uncertainties, Mysteries, doubts without any irritable reaching after fact & reason.” Firestein translates this to science:

Being a scientist requires having faith in uncertainty, finding pleasure in mystery, and learning to cultivate doubt. There is no surer way to screw up an experiment than to be certain of its outcome.

He captures the heart of this argument in an eloquent metaphor:

Science, then, is not like the onion in the often used analogy of stripping away layer after layer to get at some core, central, fundamental truth. Rather it’s like the magic well: no matter how many buckets of water you remove, there’s always another one to be had. Or even better, it’s like the widening ripples on the surface of a pond, the ever larger circumference in touch with more and more of what’s outside the circle, the unknown. This growing forefront is where science occurs… It is a mistake to bob around in the circle of facts instead of riding the wave to the great expanse lying outside the circle.

However, more important than the limits of our knowledge, Firestein is careful to point out, are the limits to our ignorance. (Cue in Errol Morris’s fantastic 2010 five-part New York Times series, The Anosognosic’s Dilemma.) Science historian and Stanford professor Robert Proctor has even coined a term for the study of ignorance — agnotology — and, Firestein argues, it is a conduit to better understanding progress.

Science historian and philosopher Nicholas Rescher has offered a different term for a similar concept: Copernican cognitivism, suggesting that just like Copernicus showed us there was nothing privileged about our position in space by debunking the geocentric model of the universe, there is also nothing privileged about our cognitive landscape.

But the most memorable articulation of the limits of our own ignorance comes from the Victorian novella Flatland, where a three-dimensional sphere shows up in a two-dimensional land and inadvertently wreaks havoc on its geometric inhabitants’ most basic beliefs about the world as they struggle to imagine the very possibility of a third dimension.

An engagement with the interplay of ignorance and knowledge, the essential bargaining chips of science, is what elevated modern civilization from the intellectual flatness of the Middle Ages. Firestein points out that “the public’s direct experience of the empirical methods of science” helped humanity evolve from the magical and mystical thinking of Western medieval thought to the rational discourse of contemporary culture.

At the same time, Firestein laments, science today is often “as inaccessible to the public as if it were written in classical Latin.” Making it more accessible, he argues, necessitates introducing explanations of science that focus on the unknown as an entry point — a more inclusive gateway than the known.

In one of the most compelling passages of the book, he broadens this insistence on questions over answers to the scientific establishment itself:

Perhaps the most important application of ignorance is in the sphere of education, particularly of scientists… We must ask ourselves how we should educate scientists in the age of Google and whatever will supersede it… The business model of our Universities, in place now for nearly a thousand years, will need to be revised.

[…]

Instead of a system where the collection of facts is an end, where knowledge is equated with accumulation, where ignorance is rarely discussed, we will have to provide the Wiki-raised student with a taste of and for boundaries, the edge of the widening circle of ignorance, how the data, which are not unimportant, frames the unknown. We must teach students how to think in questions, how to manage ignorance. W. B. Yeats admonished that ‘education is not the filling of a pail, but the lighting of a fire.’

Firestein sums it up beautifully:

Science produces ignorance, and ignorance fuels science. We have a quality scale for ignorance. We judge the value of science by the ignorance it defines. Ignorance can be big or small, tractable or challenging. Ignorance can be thought about in detail. Success in science, either doing it or understanding it, depends on developing comfort with the ignorance, something akin to Keats’ negative capability.

Originally featured in April.

* See some thoughts on the difference between access and accessibility.

DREAMLAND

The Ancient Greeks believed that one fell asleep when the brain filled with blood and awakened once it drained back out. Nineteenth-century philosophers contended that sleep happened when the brain was emptied of ambitions and stimulating thoughts. “If sleep doesn’t serve an absolutely vital function, it is the greatest mistake evolution ever made,” biologist Allan Rechtschaffen once remarked. Even today, sleep remains one of the most poorly understood human biological functions, despite some recent strides in understanding the “social jetlag” of our internal clocks and the relationship between dreaming and depression. In Dreamland: Adventures in the Strange Science of Sleep (public library), journalist David K. Randall — who stumbled upon the idea after crashing violently into a wall while sleepwalking — explores “the largest overlooked part of your life and how it affects you even if you don’t have a sleep problem.” From gender differences to why some people snore and others don’t to why we dream, he dives deep into this mysterious third of human existence to illuminate what happens when night falls and how it impacts every aspect of our days.

Most of us will spend a full third of our lives asleep, and yet we don’t have the faintest idea of what it does for our bodies and our brains. Research labs offer surprisingly few answers. Sleep is one of the dirty little secrets of science. My neurologist wasn’t kidding when he said there was a lot that we don’t know about sleep, starting with the most obvious question of all — why we, and every other animal, need to sleep in the first place.

But before we get too anthropocentrically arrogant in our assumptions, it turns out the quantitative requirement of sleep isn’t correlated with how high up the evolutionary chain an organism is:

Lions and gerbils sleep about thirteen hours a day. Tigers and squirrels nod off for about fifteen hours. At the other end of the spectrum, elephants typically sleep three and a half hours at a time, which seems lavish compared to the hour and a half of shut-eye that the average giraffe gets each night.

[…]

Humans need roughly one hour of sleep for every two hours they are awake, and the body innately knows when this ratio becomes out of whack. Each hour of missed sleep one night will result in deeper sleep the next, until the body’s sleep debt is wiped clean.
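That two-to-one ratio and the idea of a debt being “wiped clean” lend themselves to a back-of-the-envelope sketch; only the ratio comes from the book, the schedule below is hypothetical.

```python
# Roughly one hour of sleep is needed per two hours awake; shortfalls accumulate
# as a "sleep debt" until they are repaid on later nights.
def remaining_debt(nights):
    debt = 0.0
    for hours_awake, hours_slept in nights:
        needed = hours_awake / 2
        debt = max(0.0, debt + needed - hours_slept)
    return debt

# Hypothetical stretch of days: two short nights, then a long recovery sleep.
nights = [(16, 8), (18, 6), (18, 6), (16, 10)]
print(f"Sleep debt still owed: {remaining_debt(nights):.1f} hours")   # 4.0 hours
```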

What, then, happens as we doze off, exactly? Like all science, our understanding of sleep seems to be a constant “revision in progress”:

Despite taking up so much of life, sleep is one of the youngest fields of science. Until the middle of the twentieth century, scientists thought that sleep was an unchanging condition during which time the brain was quiet. The discovery of rapid eye movements in the 1950s upended that. Researchers then realized that sleep is made up of five distinct stages that the body cycles through over roughly ninety-minute periods. The first is so light that if you wake up from it, you might not realize that you have been sleeping. The second is marked by the appearance of sleep-specific brain waves that last only a few seconds at a time. If you reach this point in the cycle, you will know you have been sleeping when you wake up. This stage marks the last drop before your brain takes a long ride away from consciousness. Stages three and four are considered deep sleep. In three, the brain sends out long, rhythmic bursts called delta waves. Stage four is known as slow-wave sleep for the speed of its accompanying brain waves. The deepest form of sleep, this is the farthest that your brain travels from conscious thought. If you are woken up while in stage four, you will be disoriented, unable to answer basic questions, and want nothing more than to go back to sleep, a condition that researchers call sleep drunkenness. The final stage is REM sleep, so named because of the rapid movements of your eyes dancing against your eyelids. In this stage of sleep, the brain is as active as it is when it is awake. This is when most dreams occur.

(Recall the role of REM sleep in regulating negative emotions.)

Randall’s most urgent point, however, echoes what we’ve already heard from German chronobiologist Till Roenneberg (see above) — in our blind lust for the “luxuries” of modern life, with all its 24-hour news cycles, artificial lighting on demand, and expectations of round-the-clock telecommunications availability, we’ve thrown ourselves into a kind of circadian schizophrenia:

We are living in an age when sleep is more comfortable than ever and yet more elusive. Even the worst dorm-room mattress in America is luxurious compared to sleeping arrangements that were common not long ago. During the Victorian era, for instance, laborers living in workhouses slept sitting on benches, with their arms dangling over a taut rope in front of them. They paid for this privilege, implying that it was better than the alternatives. Families up to the time of the Industrial Revolution engaged in the nightly ritual of checking for rats and mites burrowing in the one shared bedroom. Modernity brought about a drastic improvement in living standards, but with it came electric lights, television, and other kinds of entertainment that have thrown our sleep patterns into chaos.

Work has morphed into a twenty-four-hour fact of life, bringing its own set of standards and expectations when it comes to sleep … Sleep is ingrained in our cultural ethos as something that can be put off, dosed with coffee, or ignored. And yet maintaining a healthy sleep schedule is now thought of as one of the best forms of preventative medicine.

Reflecting on his findings, Randall marvels:

As I spent more time investigating the science of sleep, I began to understand that these strange hours of the night underpin nearly every moment of our lives.

Indeed, Dreamland goes on to explore how sleep — its mechanisms, its absence, its cultural norms — affects everyone from police officers and truck drivers to artists and entrepreneurs, permeating everything from our decision-making to our emotional intelligence.

Originally featured in August.

TREES OF LIFE

Since the dawn of recorded history, humanity has been turning to the visual realm as a sensemaking tool for the world and our place in it, mapping and visualizing everything from the body to the brain to the universe to information itself. Trees of Life: A Visual History of Evolution (public library) catalogs 230 tree-like branching diagrams, culled from 450 years of mankind’s visual curiosity about the living world and our quest to understand the complex ecosystem we share with other organisms, from bacteria to birds, microbes to mammals.

Though the use of a tree as a metaphor for understanding the relationships between organisms is often attributed to Darwin, who articulated it in On the Origin of Species by Means of Natural Selection in 1859, the concept, most recently appropriated in mapping systems and knowledge networks, is actually much older, predating the theory of evolution itself. The collection is thus at once a visual record of the evolution of science and of its opposite — the earliest examples, dating as far back as the sixteenth century, portray the mythic order in which God created Earth, and the diagrams’ development over the centuries is as much a progression of science as it is of culture, society, and paradigm.

Theodore W. Pietsch writes in the introduction:

The tree as an iconographic metaphor is perhaps the most universally widespread of all great cultural symbols. Trees appear and reappear throughout human history to illustrate nearly every aspect of life. The structural complexity of a tree — its roots, trunk, bifurcating branches, and leaves — has served as an ideal symbol throughout the ages to visualize and map hierarchies of knowledge and ideas.

The Ladder of Ascent and Descent of the Intellect, not tree-like at first glance, but certainly branching dichotomously, the steps labeled from bottom to top, with representative figures on the right and upper left: Lapis (stone), Flamma (fire), Planta (plant), Brutum (beast), Homo (human), Caelum (sky), Angelus (angel), and Deus (God), a scheme that shows how one might ascend from inferior to superior beings and vice versa. After Ramon Lull (1232–1315), Liber de ascensu et descensu intellectus, written about 1305 but not published until 1512.

The 'Crust of the Earth as Related to Zoology,' presenting, at one glance, the 'distribution of the principal types of animals, and the order of their successive appearance in the layers of the earth’s crust,' published by Louis Agassiz and Augustus Addison Gould as the frontispiece of their 1848 Principles of Zoölogy. The diagram is like a wheel with numerous radiating spokes, each spoke representing a group of animals, superimposed over a series of concentric rings of time, from pre-Silurian to the 'modern age.' According to a divine plan, different groups of animals appear within the various 'spokes' of the wheel and then, in some cases, go extinct. Humans enter only in the outermost layer, at the very top of the diagram, shown as the crowning achievement of all Creation.

'Genealogy of the class of fishes' published by Louis Agassiz in his Recherches sur les poissons fossiles (Research on fossil fishes) of 1844.

The 'genealogico-geographical affinities' of plant families based on the natural orders of Carl Linnaeus (1751), published by Paul Dietrich Giseke in 1792. Each family is represented by a numbered circle (roman numerals), the diameter of which gives a rough measure of the relative number of included genera (arabic numerals).

The unique egg-shaped 'system of animals' published by German zoologist Georg August Goldfuss in his Über die Entwicklungsstufen des Thieres (On animal development) of 1817.

'Universal system of nature,' from Paul Horaninow’s Primae lineae systematis naturae (Primary system of nature) of 1834, an ingenious and seemingly indecipherable clockwise spiral that places animals in the center of the vortex, arranged in a series of concentric circles, surrounded in turn by additional nested circles that contain the plants, nonmetallic minerals, and finally metals within the outermost circle. Not surprisingly, everything is subjugated to humans (Homo) located in the center.

Ernst Haeckel’s famous 'great oak,' a family tree of animals, from the first edition of his 1874 Anthropogenie oder Entwickelungsgeschichte des Menschen (The evolution of man).

(More on Haeckel’s striking biological art here.)

Tree by John Henry Schaffner showing the relationships of the flowering plants. The early split at the base of the tree leads to the monocotyledonous plants on the left and the dicotyledons on the right.

Schaffner, 1934, Quarterly Review of Biology, 9(2):150, fig. 2; courtesy of Perry Cartwright and the University of Chicago Press.

A phylogeny of horses showing their geological distribution throughout the Tertiary, by Ruben Arthur Stirton.

Stirton, 1940, plate following page 198; courtesy of Rebecca Wells and the University of California Press.

Ruben Arthur Stirton’s revised view of horse phylogeny.

Stirton, 1959, Time, Life, and Man: The Fossil Record, p. 466, fig. 250; courtesy of Sheik Safdar and John Wiley & Sons, Inc. Used with permission.

William King Gregory’s 1946 tree of rodent relationships.

Gregory, 1951, Evolution Emerging: A Survey of Changing Patterns from Primeval Life to Man, vol. 2, p. 757; fig. 20.33; courtesy of Mary DeJong, Mai Qaraman, and the American Museum of Natural History.

The frontispiece of William King Gregory’s two-volume Evolution Emerging.

Gregory, 1951, Evolution Emerging: A Survey of Changing Patterns from Primeval Life to Man, vol. 2, p. 757; fig. 20.33; courtesy of Mary DeJong, Mai Qaraman, and the American Museum of Natural History.

Originally featured in May.

SPACE CHRONICLES

Neil deGrasse Tyson might be one of today’s most prominent astrophysicists, but he’s also a kind of existential philosopher, bringing his insights from science into the broader realm of the human condition — a kind of modern-day Carl Sagan with a rare gift for blending science and storytelling to both rub neurons with his fellow scientists and engage a popular-interest audience. In Space Chronicles: Facing the Ultimate Frontier, Tyson explores the future of space travel in the wake of NASA’s decision to put human space flight essentially on hold, using his signature wit and scientific prowess to lay out an urgent manifesto for the economic, social, moral, and cultural importance of space exploration. This excerpt from the introduction captures Tyson’s underlying ethos and echoes other great thinkers’ ideas about intuition and rationality, blending the psychosocial with the political:

Some of the most creative leaps ever taken by the human mind are decidedly irrational, even primal. Emotive forces are what drive the greatest artistic and inventive expressions of our species. How else could the sentence ‘He’s either a madman or a genius’ be understood?

It’s okay to be entirely rational, provided everybody else is too. But apparently this state of existence has been achieved only in fiction [where] societal decisions get made with efficiency and dispatch, devoid of pomp, passion, and pretense.

To govern a society shared by people of emotion, people of reason, and everybody in between — as well as people who think their actions are shaped by logic but in fact are shaped by feelings and nonempirical philosophies — you need politics. At its best, politics navigates all the mind-states for the sake of the greater good, alert to the rocky shoals of community, identity, and the economy. At its worst, politics thrives on the incomplete disclosure or misrepresentation of data required by an electorate to make informed decisions, whether arrived at logically or emotionally.

Nowhere does Tyson’s gift shine more brilliantly than in this goosebump-inducing mashup by Max Schlickenmeyer, remixing images of nature at its most inspiring with the narration of Tyson’s answer to a TIME magazine reader, who asked, “What is the most astounding fact you can share with us about the Universe?”

When I look up at the night sky and I know that, yes, we are part of this Universe, we are in this Universe, but perhaps more important than most of those facts is that the Universe is in us. When I reflect on that fact, I look up — many people feel small, because they’re small, the Universe is big — but I feel big, because my atoms came from those stars. There’s a level of connectivity — that’s really what you want in life. You want to feel connected, you want to feel relevant. You want to feel like you’re a participant in the goings on and activities and events around you. That’s precisely what we are, just by being alive.

Originally featured in March.

HIDDEN TREASURE

For the past 175 years, the National Library of Medicine in Bethesda has been building the world’s largest collection of biomedical images, artifacts, and ephemera. With more than 17 million items spanning ten centuries, it’s a treasure trove of rare, obscure, extravagant wonders, most of which remain unseen by the public and unknown even to historians, librarians, and curators. Until now.

Hidden Treasure is an exquisite large-format volume that culls some of the most fascinating, surprising, beautiful, gruesome, and idiosyncratic objects from the Library’s collection in 450 full-color illustrations. From rare “magic lantern slides” doctors used to entertain and cure inmates at the St. Elizabeth’s Hospital for the Insane to astonishing anatomical atlases to the mimeographed report of the Japanese medical team first to enter Hiroshima after the atomic blast, each of these curious objects is contextualized in a brief essay by a prominent scholar, journalist, artist, collector, or physician. What results is a remarkable journey not only into the evolution of mankind’s understanding of the physicality of being human, but also into the evolution of librarianship itself, amidst the age of the digital humanities.

The Artificial Teledioptric Eye, or Telescope (1685-86) by Johann Zahn

Zahn's baroque diagram of the anatomy of vision (left) needs to be viewed in relation to his creation of a mechanical eye (right), the scioptric ball designed to project the image of the sun in a camera obscura

Printed book, 3 volumes

International Nurse Uniform Photograph Collection (ca. 1950), Helene Fuld Health Foundation

Left to right, top to bottom: Philippines, Denmark, British Honduras; Hong Kong, Madeira, Kenya; Nepal, Dominican Republic, Colombia

Jersey City, New Jersey. 93 color photographs, glossy

Michael North, Jeffrey Reznick, and Michael Sappol remind us in the introduction:

It’s no secret that nowadays we look for libraries on the Internet — without moving from our desks or laptops or mobile phones… We’re in a new and miraculous age. But there are still great libraries, in cities and on campuses, made of brick, sandstone, marble, and glass, containing physical objects, and especially enshrining the book: the Library of Congress, Bibliothèque nationale de France, the British Library, the New York Public Library, the Wellcome Library, the great university libraries at Oxford, Harvard, Yale, Johns Hopkins, and elsewhere. And among them is the National Library of Medicine in Bethesda, the world’s largest medical library, with its collection of over 17 million books, journals, manuscripts, prints, photographs, posters, motion pictures, sound recordings, and “ephemera” (pamphlets, matchbook covers, stereograph cards, etc.).

Complete Notes on the Dissection of Cadavers (1772)

Muscles and attachments

Kaishi Hen. Kyoto, Japan. Printed woodblock book, color illustrations

Darwin Collection (1859-1903)

The expression of emotions in cats and dogs, The Expression of Emotions in Man and Animals (London, 1872)

London, New York, and other locations

(Also see how Darwin’s photographic studies of human emotions changed visual culture forever.)

Civil War Surgical Card Collection (1860s)

The Army Medical Museum's staff mined incoming reports for 'interesting' cases -- such as a gunshot wound to the 'left side of scalp, denuding skull' or 'gunshot wound, right elbow with gangrene supervening' -- and cases that demonstrated the use of difficult surgical techniques, such as an amputation by circular incision or resection of the 'head of humerus and three inches of the left clavicle.'

Washington, DC. 146 numbered cards, with tipped-in photographs and case histories

Studies in Anatomy of the Nervous System and Connective Tissue (1875-76) by Axel Key and Gustaf Retzius

Arachnoid villi, or pacchionian bodies, of the human brain.

Studien in der Anatomie des Nervensystems und des Bindegewebes. Stockholm. Printed book, with color and black-and-white lithographs, 2 volumes.

Anti-Germ Warfare Campaign Posters (ca. 1952), Second People's Cultural Institute

Hand-drawn Korean War propaganda posters, from two incomplete sequences in the collection of Chinese medical and health materials acquired by the National Library of Medicine

Fuping County, Shaanxi Province, China. Hand-inked and painted posters on paper.

Medical Trade Card Collection (ca. 1920-1940s)

The front of a Dr. Miles' Laxative Tablets movable, die-cut advertising novelty card, lowered and raised (Elkhart, Indiana, ca. 1910)

France, Great Britain, Mexico, United States, and other countries. Donor: William Helfand

Thoughtfully curated, beautifully produced, and utterly transfixing, Hidden Treasure unravels our civilization’s relationship with that most human of humannesses. Because try as we might to order the heavens, map the mind, and chart time in our quest to know the abstract, we will have failed at being human if we neglect this most fascinating frontier of concrete existence, the mysterious and ever-alluring physical body.

Originally featured, with more images, in April.

Honorable mention: The Art of Medicine.

QUANTUM UNIVERSE

“The universe is made of stories, not atoms,” poet Muriel Rukeyser famously remarked. “We’re made of star-stuff,” Carl Sagan countered. But some of the most fascinating and important stories are those that explain atoms and “star stuff.” Such is the case of The Quantum Universe: Everything That Can Happen Does Happen by rockstar-physicist Brian Cox and University of Manchester professor Jeff Forshaw — a remarkable and absorbing journey into the fundamental fabric of nature, exploring how quantum theory provides a framework for explaining everything from silicon chips to stars to human behavior.

Quantum theory is perhaps the prime example of the infinitely esoteric becoming the profoundly useful. Esoteric, because it describes a world in which a particle really can be in several places at once and moves from one place to another by exploring the entire Universe simultaneously. Useful, because understanding the behaviour of the smallest building blocks of the universe underpins our understanding of everything else. This claim borders on the hubristic, because the world is filled with diverse and complex phenomena. Notwithstanding this complexity, we have discovered that everything is constructed out of a handful of tiny particles that move around according to the rules of quantum theory. The rules are so simple that they can be summarized on the back of an envelope. And the fact that we do not need a whole library of books to explain the essential nature of things is one of the greatest mysteries of all.

The story weaves a century of scientific hindsight and theoretical developments, from Einstein to Feynman by way of Max Planck, who coined the term “quantum” in 1900 to describe the “black body radiation” of hot objects through light emitted in little packets of energy he called “quanta,” to arrive at a modern perspective on quantum theory and its primary role in predicting observable phenomena.

The picture of the universe we inhabit, as revealed by modern physics, [is] one of underlying simplicity; elegant phenomena dance away out of sight and the diversity of the macroscopic world emerges. This is perhaps the crowning achievement of modern science; the reduction of the tremendous complexity in the world, human beings included, to a description of the behaviour of just a handful of tiny subatomic particles and the four forces that act between them.

To demonstrate that quantum theory is intimately entwined with the fabric of our everyday, rather than a weird and esoteric fringe of science, Cox offers an example rooted in the familiar. (An example, in this particular case, based on the wrong assumption — I was holding an iPad — in a kind of ironic meta-wink from Heisenberg’s uncertainty principle.)

Consider the world around you. You are holding a book made of paper, the crushed pulp of a tree. Trees are machines able to take a supply of atoms and molecules, break them down and rearrange them into cooperating colonies composed of many trillions of individual parts. They do this using a molecule known as chlorophyll, composed of over a hundred carbon, hydrogen and oxygen atoms twisted into an intricate shape with a few magnesium and nitrogen atoms bolted on. This assembly of particles is able to capture the light that has travelled the 93 million miles from our star, a nuclear furnace the volume of a million earths, and transfer that energy into the heart of cells, where it is used to build molecules from carbon dioxide and water, giving out life-enriching oxygen as it does so. It’s these molecular chains that form the superstructure of trees and all living things, the paper in your book. You can read the book and understand the words because you have eyes that can convert the scattered light from the pages into electrical impulses that are interpreted by your brain, the most complex structure we know of in the Universe. We have discovered that all these things are nothing more than assemblies of atoms, and that the wide variety of atoms are constructed using only three particles: electrons, protons and neutrons. We have also discovered that the protons and neutrons are themselves made up of smaller entities called quarks, and that is where things stop, as far as we can tell today. Underpinning all of this is quantum theory.

But at the core of The Quantum Universe are a handful of grand truths that transcend the realm of science as an academic discipline and shine out into the vastest expanses of human existence: that in science, as in art, everything builds on what came before; that everything is connected to everything else; and, perhaps most importantly, that despite our greatest compulsions for control and certainty, much of the universe — to which the human heart and mind belong — remains reigned over by chance and uncertainty. Cox puts it this way:

A key feature of quantum theory [is that] it deals with probabilities rather than certainties, not because we lack absolute knowledge, but because some aspects of Nature are, at their very heart, governed by the laws of chance.

Originally featured in February.

BIG QUESTIONS

“If you wish to make an apple pie from scratch,” Carl Sagan famously observed in Cosmos, “you must first invent the universe.” The questions children ask are often so simple, so basic, that they turn unwittingly yet profoundly philosophical in requiring apple-pie-from-scratch type of answers. To explore this fertile intersection of simplicity and expansiveness, Gemma Elwin Harris asked thousands of primary school children between the ages of four and twelve to send in their most restless questions, then invited some of today’s most prominent scientists, philosophers, and writers to answer them. The result is Big Questions from Little People & Simple Answers from Great Minds (public library) — a compendium of fascinating explanations of deceptively simple everyday phenomena, featuring such modern-day icons as Mary Roach, Noam Chomsky, Philip Pullman, Richard Dawkins, and many more, with a good chunk of the proceeds being donated to Save the Children.

Alain de Botton explores why we have dreams:

Most of the time, you feel in charge of your own mind. You want to play with some Lego? Your brain is there to make it happen. You fancy reading a book? You can put the letters together and watch characters emerge in your imagination.

But at night, strange stuff happens. While you’re in bed, your mind puts on the weirdest, most amazing and sometimes scariest shows.

[…]

In the olden days, people believed that our dreams were full of clues about the future. Nowadays, we tend to think that dreams are a way for the mind to rearrange and tidy itself up after the activities of the day.

Why are dreams sometimes scary? During the day, things may happen that frighten us, but we are so busy we don’t have time to think properly about them. At night, while we are sleeping safely, we can give those fears a run around. Or maybe something you did during the day was lovely but you were in a hurry and didn’t give it time. It may pop up in a dream. In dreams, you go back over things you missed, repair what got damaged, make up stories about what you’d love, and explore the fears you normally put to the back of your mind.

Dreams are both more exciting and more frightening than daily life. They’re a sign that our brains are marvellous machines — and that they have powers we don’t often give them credit for, when we’re just using them to do our homework or play a computer game. Dreams show us that we’re not quite the bosses of our own selves.

Evolutionary biologist Richard Dawkins breaks down the math of evolution and cousin marriages to demonstrate that we are all related:

Yes, we are all related. You are a (probably distant) cousin of the Queen, and of the president of the United States, and of me. You and I are cousins of each other. You can prove it to yourself.

Everybody has two parents. That means, since each parent had two parents of their own, that we all have four grandparents. Then, since each grandparent had to have two parents, everyone has eight great-grandparents, and sixteen great-great-grandparents and thirty-two great-great-great-grandparents and so on.

You can go back any number of generations and work out the number of ancestors you must have had that same number of generations ago. All you have to do is multiply two by itself that number of times.

Suppose we go back ten centuries, that is to Anglo-Saxon times in England, just before the Norman Conquest, and work out how many ancestors you must have had alive at that time.

If we allow four generations per century, that’s about forty generations ago.

Two multiplied by itself forty times comes to more than a thousand billion. Yet the total population of the world at that time was only around three hundred million. Even today the population is seven billion, yet we have just worked out that a thousand years ago your ancestors alone were more than 150 times as numerous.

[…]

The real population of the world at the time of Julius Caesar was only a few million, and all of us, all seven billion of us, are descended from them. We are indeed all related. Every marriage is between more or less distant cousins, who already share lots and lots of ancestors before they have children of their own.
By the same kind of argument, we are distant cousins not only of all human beings but of all animals and plants. You are a cousin of my dog and of the lettuce you had for lunch, and of the next bird that you see fly past the window. You and I share ancestors with all of them. But that is another story.
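
For the curious, here is a quick back-of-the-envelope check of the doubling arithmetic above, sketched in Python. This is a toy calculation of my own, not something from the book: the point is simply that the number of ancestor "slots" forty generations back vastly exceeds the number of people who were actually alive, so the same ancestors must fill many of those slots, which is exactly why every marriage turns out to be between cousins.

```python
# A toy check of the doubling argument quoted above (not from Dawkins's text).
generations = 40                      # roughly ten centuries at four generations per century
ancestor_slots = 2 ** generations     # the number of slots doubles with every generation back

world_population_then = 300_000_000   # the rough figure quoted for about 1,000 years ago
world_population_now = 7_000_000_000  # the figure quoted for today

print(f"Ancestor slots forty generations back: {ancestor_slots:,}")                        # 1,099,511,627,776
print(f"Times today's population: {ancestor_slots / world_population_now:.0f}")            # ~157
print(f"Slots per person alive back then: {ancestor_slots / world_population_then:,.0f}")  # ~3,665
```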

Neuroscientist David Eagleman explains why we can’t tickle ourselves:

To understand why, you need to know more about how your brain works. One of its main tasks is to try to make good guesses about what’s going to happen next. While you’re busy getting on with your life, walking downstairs or eating your breakfast, parts of your brain are always trying to predict the future.

Remember when you first learned how to ride a bicycle? At first, it took a lot of concentration to keep the handlebars steady and push the pedals. But after a while, cycling became easy. Now you’re not aware of the movements you make to keep the bike going. From experience, your brain knows exactly what to expect so your body rides the bike automatically. Your brain is predicting all the movements you need to make.

You only have to think consciously about cycling if something changes — like if there’s a strong wind or you get a flat tyre. When something unexpected happens like this, your brain is forced to change its predictions about what will happen next. If it does its job well, you’ll adjust to the strong wind, leaning your body so you don’t fall.

Why is it so important for our brains to predict what will happen next? It helps us make fewer mistakes and can even save our lives.

[…]

Because your brain is always predicting your own actions, and how your body will feel as a result, you cannot tickle yourself. Other people can tickle you because they can surprise you. You can’t predict what their tickling actions will be.

And this knowledge leads to an interesting truth: if you build a machine that allows you to move a feather, but the feather moves only after a delay of a second, then you can tickle yourself. The results of your own actions will now surprise you.

Particle physicist and cosmologist Lawrence Krauss explains why we’re all made of stardust:

Everything in your body, and everything you can see around you, is made up of tiny objects called atoms. Atoms come in different types called elements. Hydrogen, oxygen and carbon are three of the most important elements in your body.

[…]

How did those elements get into our bodies? The only way they could have got there, to make up all the material on our Earth, is if some of those stars exploded a long time ago, spewing all the elements from their cores into space. Then, about four and a half billion years ago, in our part of our galaxy, the material in space began to collapse. This is how the Sun was formed, and the solar system around it, as well as the material that forms all life on earth.

So, most of the atoms that now make up your body were created inside stars! The atoms in your left hand might have come from a different star from those in your right hand. You are really a child of the stars.

But my favorite answers are to the all-engulfing question, How do we fall in love? Author Jeanette Winterson offers this breathlessly poetic response:

You don’t fall in love like you fall in a hole. You fall like falling through space. It’s like you jump off your own private planet to visit someone else’s planet. And when you get there it all looks different: the flowers, the animals, the colours people wear. It is a big surprise falling in love because you thought you had everything just right on your own planet, and that was true, in a way, but then somebody signalled to you across space and the only way you could visit was to take a giant jump. Away you go, falling into someone else’s orbit and after a while you might decide to pull your two planets together and call it home. And you can bring your dog. Or your cat. Your goldfish, hamster, collection of stones, all your odd socks. (The ones you lost, including the holes, are on the new planet you found.)

And you can bring your friends to visit. And read your favourite stories to each other. And the falling was really the big jump that you had to make to be with someone you don’t want to be without. That’s it.

PS You have to be brave.

Evolutionary psychologist and sociologist Robin Dunbar balances out the poetics with a scientific look at what goes on inside the brain when we love:

What happens when we fall in love is probably one of the most difficult things in the whole universe to explain. It’s something we do without thinking. In fact, if we think about it too much, we usually end up doing it all wrong and get in a terrible muddle. That’s because when you fall in love, the right side of your brain gets very busy. The right side is the bit that seems to be especially important for our emotions. Language, on the other hand, gets done almost completely in the left side of the brain. And this is one reason why we find it so difficult to talk about our feelings and emotions: the language areas on the left side can’t send messages to the emotional areas on the right side very well. So we get stuck for words, unable to describe our feelings.

But science does allow us to say a little bit about what happens when we fall in love. First of all, we know that love sets off really big changes in how we feel. We feel all light-headed and emotional. We can be happy and cry with happiness at the same time. Suddenly, some things don’t matter any more and the only thing we are interested in is being close to the person we have fallen in love with.

These days we have scanner machines that let us watch a person’s brain at work. Different parts of the brain light up on the screen, depending on what the brain is doing. When people are in love, the emotional bits of their brains are very active, lighting up. But other bits of the brain that are in charge of more sensible thinking are much less active than normal. So the bits that normally say ‘Don’t do that because it would be crazy!’ are switched off, and the bits that say ‘Oh, that would be lovely!’ are switched on.

Why does this happen? One reason is that love releases certain chemicals in our brains. One is called dopamine, and this gives us a feeling of excitement. Another is called oxytocin and seems to be responsible for the light-headedness and cosiness we feel when we are with the person we love. When these are released in large quantities, they go to parts of the brain that are especially responsive to them.

But all this doesn’t explain why you fall in love with a particular person. And that is a bit of a mystery, since there seems to be no good reason for our choices. In fact, it seems to be just as easy to fall in love with someone after you’ve married them as before, which seems the wrong way round. And here’s another odd thing. When we are in love, we can trick ourselves into thinking the other person is perfect. Of course, no one is really perfect. But the more perfect we find each other, the longer our love will last.

Big Questions from Little People is a wonderful complement to The Where, the Why, and the How: 75 Artists Illustrate Wondrous Mysteries of Science and is certain to give you pause about much of what you thought you knew, or at the very least rekindle that childlike curiosity about and awe at the basic fabric of the world we live in.

Originally featured earlier this month.

27 DECEMBER, 2011

The 11 Best Biographies and Memoirs of 2011

By:

Illustrated correspondence, rock’n’roll, and what an old Kurt Vonnegut has to do with a young Hemingway.

After the year’s best children’s books, art and design books, photography books, science books, history books, food books, and psychology and philosophy books, the 2011 best-of series continues with the most compelling, provocative and thought-provoking biographies and memoirs featured here this year.

STEVE JOBS

In 2004, Steve Jobs asked former TIME Magazine editor and prolific biographer Walter Isaacson to write his biography. Isaacson — who has previously profiled such icons as Albert Einstein, Benjamin Franklin, and Henry Kissinger — thought the request not only presumptuous but also odd for a man of Jobs’s age. What he didn’t know was that Jobs had just been diagnosed with pancreatic cancer and had starkly brushed up against his mortality. Over the next few years, Isaacson ended up having over 40 interviews and conversations with Jobs, from which he gleaned the backbone for Steve Jobs, his highly anticipated biography — perhaps an expected pick for my omnibus of the year’s best biographies and memoirs, yet very much a deserving one, not merely because Jobs was a personal hero who shaped my own intellectual and creative development, but also because beneath the story of Jobs as an individual lies a broader story about the meat of innovation and creativity at large.

He was not a model boss or human being, tidily packaged for emulation. Driven by demons, he could drive those around him to fury and despair. But his personality and passions and products were all interrelated, just as Apple’s hardware and software tended to be, as if part of an integrated system. His tale is thus both instructive and cautionary, filled with lessons about innovation, character, leadership, and values.”

Sample the book through Isaacson’s conversation with Charlie Rose and Nick Bilton’s excellent one-on-one interview with the author.

For a complementary read, see I, Steve: Steve Jobs in His Own Words — a wonderful anthology of more than 200 quotes and excerpts from his many appearances in the media over the years.

RADIOACTIVE

Just when you thought I couldn’t possibly slip Radioactive: Marie & Pierre Curie: A Tale of Love and Fallout into another best-of reading list — it appeared among the year’s best art and design books, best science books, and best history books — here it is, again. But consider this a measure of its merit: In this cross-disciplinary gem, artist Lauren Redniss tells the story of Marie Curie — one of the most extraordinary figures in the history of science, a pioneer in researching radioactivity, a field the very name for which she coined, and not only the first woman to win a Nobel Prize but also the first person to win two Nobel Prizes, and in two different sciences — through the two invisible but immensely powerful forces that guided her life: radioactivity and love.

It’s also a remarkable feat of thoughtful design and creative vision. To honor Curie’s spirit and legacy, Redniss rendered her poetic artwork in cyanotype, an early-20th-century image printing process critical to the discovery of both X-rays and radioactivity itself — a cameraless photographic technique in which paper is coated with light-sensitive chemicals. Once exposed to the sun’s UV rays, this chemically-treated paper turns a deep shade of blue. The text in the book is a unique typeface Redniss designed using the title pages of 18th- and 19th-century manuscripts from the New York Public Library archive. She named it Eusapia LR, for the croquet-playing, sexually ravenous Italian Spiritualist medium whose séances the Curies used to attend. The book’s cover is printed in glow-in-the-dark ink.

Full review, with more images and Redniss’s TEDxEast talk, here.

AND SO IT GOES

Kurt Vonnegut — prolific author, anarchist, Second Life dweller, imaginary interviewer of the dead. And, apparently, a troubled soul. At least that’s what’s behind the curtain Charles Shields (of Mockingbird: A Portrait of Harper Lee fame) peels back in And So It Goes, subtitled Kurt Vonnegut: A Life — the first-ever true Vonnegut biography, revealing a vulnerable private man behind the public persona, a difficult and damaged man deeply scarred by his experiences.

The project began in 2006, when Shields reached out to Vonnegut in a letter, asking his permission for a planned biography. Though Vonnegut at first declined, Shields wasn’t ready to take “no” for an answer and eventually persuaded the counterculture hero into a “yes,” spending precious time with Vonnegut and his letters during the last year of the author’s life.

From his uneasy childhood to his tortured divorces to his attempted suicide to his explosion into celebrity, Vonnegut’s life was an intricate osmotic balance between private hell and public performance. As a leading figure in a movement of authors as public intellectuals and a former PR agent for GE, he knew how to craft an image that would appeal to an audience — an art timelier than ever as we watch some of yesterday’s media pundits voice increasingly disconnected opinions on today’s issues.

He read the signs of what was happening in the country, and he realized that he was going to have to be a lot hipper than a nearly 50-year-old dad in a rumpled cardigan to be a good match with what he was writing about.” ~ Charles Shields

In a lot of ways, Vonnegut was an embodiment of the spirit behind today’s Occupy movement. Shields observes on NPR:

Kurt was a disenchanted American. He believed in America, he believed in its ideals … and he wanted babies to enter a world where they could be treated well, and he wanted to emphasize that people should be kind to one another.”

But Shields makes a special point not to vilify Vonnegut or frame him as cynical. Beneath the discomfort with this new private persona lies a deep respect for the iconic author and the intricate balance between private demons and public creativity, channelled perhaps most eloquently in this quote from Vonnegut himself, printed on the book’s opening page:

I keep losing and regaining my equilibrium, which is the basic plot of all popular fiction. I am myself a work of fiction.”

The downside of And So It Goes is that it perpetuates, all too dangerously in my opinion, the myth of the creative genius as a damaged soul — something Vonnegut’s son has since attacked the book for misportraying. Nonetheless, it remains a powerful, revealing, and ultimately deeply human read.

Originally featured in November.

FELTRON REPORT 2010

Every year since 2005, Nicholas Felton has been publishing his wonderful and entertaining annual reports, which capture the minutiae of his life — drinks drunk, trips taken, methods of transportation, moods experienced, and just about everything in between — in clean, beautiful infographics. In 2010, however, Felton lost his father and decided to make his annual report a reconstruction of his father’s life based on calendars, letters, slides, postcards, passports, and other ephemera in his possession. The result is a poignant, beautiful, and tender journey into the adventures and qualities of Felton’s father through the unexpected lens of the quantitative.

The report was printed in a limited-edition run of 3,000 and is long sold out, but you can see it online in its entirety.

AN EMERGENCY IN SLOW MOTION

Iconic photographer Diane Arbus is as well known for her stunning, stark black-and-white square photographs of fringe characters — dwarfs, giants, nudists, nuns, transvestites — as she is for her troubled life and its untimely end in suicide at the age of 48. Barely a year after her death, Arbus became the first American photographer represented at the prestigious Venice Biennale. In the highly anticipated biography An Emergency in Slow Motion: The Inner Life of Diane Arbus, also one of the year’s best photography books, psychologist Todd Schultz offers an ambitious “psychobiography” of the misunderstood photographer, probing the darkness of the artist’s mind in an effort to shed new light on her art. Schultz not only got unprecedented access to Arbus’s therapist, but also closely examined some recently released, previously unpublished work and writings by Arbus and, in the process, fought an uphill battle with her estate who, as he puts it, “seem to have this idea that any attempt to interpret the art diminishes the art.”

Schultz explores the mystery of Arbus’s unsettled existence through five key areas of inquiry — her childhood, her penchant for the marginalized, her sexuality, her time in therapy, and her suicide — underpinned by a thoughtful larger narrative about secrets and sex. Ultimately, Schultz’s feat is in exposing the two-sided mirror of Arbus’s lens to reveal how the discomfort her photographs of “freaks” elicited in the viewer was a reflection of her own unease and self-perception as a hopeless outcast.

Identical Twins, Roselle, New Jersey, 1967

Child with Toy Hand Grenade in Central Park, New York City, 1962

Eddie Carmel, Jewish Giant, taken at Home with His Parents in the Bronx, NY, 1970

Poignant and provocative, An Emergency in Slow Motion: The Inner Life of Diane Arbus offers an entirely new way of relating to and understanding one of the most revered and influential postmodern photographers, in the process raising timeless and universal questions about otherness, the human condition, and the quest for making peace with the self.

Originally featured in August.

BOSSYPANTS

It’s hard not to adore Tina Fey, who has had a pretty grand year, from becoming the third female and youngest ever recipient of the Mark Twain Prize for American Humor — and giving a brilliant acceptance speech that unequivocally validates the award — to the publication of Bossypants, her excellent and impossibly funny sort-of-memoir about modern comedy, that whole gender thing and, well, life.

Once in a generation a woman comes along who changes everything. Tina Fey is not that woman, but she met that woman once and acted weird around her.”

In April, Fey brought Bossypants to the fantastic Authors@Google. Besides Fey’s lovable brand of awkward, it’s particularly priceless to watch Google’s Eric Schmidt — who’s had quite a year himself — fumble with various politically incorrect phrases and, you know, “women things.”

Originally featured in April.

YOUNG HEMINGWAY’S LETTERS

Though neither exactly a memoir nor exactly a biography, The Letters of Ernest Hemingway: Volume 1, 1907-1922 captures the lived experience and biographical milestones of the iconic author’s life through the unusual lens of his previously unpublished correspondence. After spending a decade sifting through Hemingway’s correspondence, Penn State professor Sandra Spanier collaborated with Kent State University’s Robert W. Trogdon to curate this first in what will be a series of at least 16 volumes, revealing a young Hemingway who is different, richer, and more tender than the machismo-encrusted persona we’ve come to know through his published works.

Though Hemingway had articulated to his wife in the 1950s that he didn’t want his correspondence published, his son, Patrick Hemingway, says these letters could dispel the myth of the writer as a tortured figure and distorted soul, a pop-culture image of his father he feels doesn’t tell a complete and honest story. (Note the contrast with the Vonnegut biography above.)

My principal motive for wanting it to happen was that I think it gives a much better picture of Hemingway’s life than any of his biographers to date […] [My father] was not a tragic figure. He had the misfortune to have mental troubles in old age. Up until that, he was a rather lighthearted and humorous person.” ~ Patrick Hemingway

The letters — lively, quirky, full of doodles and delightfully unusual spellings — cover everything from Hemingway’s childhood in Oak Park, Illinois, to his adventures as an ambulance driver on the Italian front in WWI to the heartbreak of his romance with a Red Cross nurse named Agnes von Kurowsky and his eventual marriage to Hadley Richardson.

From lovers to rivals to his mother, the recipients of the letters each seem to get a different piece of Hemingway, custom-tailored for them not in the hypocritical way of an inauthentic social chameleon but in the way great writers know the heart, mind, and language of their reader. The letters thus become not only a tender homage to this unknown Hemingway, revealing new insights into his creative process along the way, but also a bow before the lost art of letter-writing itself.

Originally featured in October.

LIFE

For the past 10 years, Rolling Stone Keith Richards has been consistently chosen in music magazine list after music magazine list as the rocker most likely to die. And, yet, he hasn’t. Instead, he has recorded his rocking, rolling, riveting story in Life — a formidable 547-page tome of a memoir that traces his tale from his childhood in the grey suburbs of London, to the unlikely formation and rapid rise of the Stones (who, at their peak, didn’t finish a single show in 18 months, playing five to ten minutes before the teenage fans started screaming, then fainting, then being piled unconscious on the stage by security), to the drugs and the disillusionment and the ultimate downfall. Funny, difficult, touching, harrowing, mischievous, the narrative — written with the help of James Fox — spans the entire spectrum of emotion and experience, only to always return to its heart: the love of rock.

You try going into a truck stop in 1964 or ’65 or ’66 down south or in Texas. It felt much more dangerous than anything in the city. You’d walk in and there’s the good ol’ boys and slowly you realize that you’re not going to have a very comfortable meal in there… They’d call us girls because of the long hair. ‘How you doing, girls? Dance with me.’ Hair… the little things that you wouldn’t think about that changed whole cultures.”

Best paired with Patti Smith’s Just Kids, which came out late last year.

FEYNMAN

Legendary iconoclastic physicist Richard Feynman is a longtime favorite, his insights on beauty, honors, and curiosity pure gold. Feynman is a charming, affectionate, and inspiring graphic novel biography from librarian by day, comic nonfictionist by night Jim Ottaviani and illustrator Leland Myrick, also one of the year’s best science books and a fine addition to our 10 favorite masterpieces of graphic nonfiction.

From Feynman’s childhood in Long Island to his work on the Manhattan Project to the infamous Challenger disaster, by way of quantum electrodynamics and bongo drums, the graphic narrative unfolds with equal parts humor and respect as it tells the story of one of the founding fathers of popular physics.

Colorful, vivid, and obsessive, the pages of Feynman exude the famous personality of the man himself, full of immense brilliance, genuine excitement for science, and a healthy dose of snark.

Originally featured, with more images, in October.

MOONWALKING WITH EINSTEIN

Why do we remember, and how? Is there a finite capacity to our memory reservoir? Can we hack our internal memory chip? Those questions are precisely what science writer Joshua Foer sought to unravel when he set out to cover and compete in the U.S. Memory Championship. Moonwalking with Einstein: The Art and Science of Remembering Everything is his fascinating sort-of-memoir, telling the story of his journey as he became enthralled by the secrets of the participants and learned how to play with the pre-wired quirks of the brain, optimizing it to remember information it ordinarily wouldn’t. (It’s also a fine addition to the year’s best psychology and philosophy books.)

The title refers to a memory device I used in the US Memory Championship—specifically it’s a mnemonic that helped me memorize a deck of playing cards. Moonwalking with Einstein works as a mnemonic because it’s such a goofy image. Things that are weird or colorful are the most memorable. If you try to picture Albert Einstein sliding backwards across a dance floor wearing penny loafers and a diamond glove, that’s pretty much unforgettable.” ~ Joshua Foer

In the process of studying these techniques, I learned something remarkable: that there’s far more potential in our minds than we often give them credit for. I’m not just talking about the fact that it’s possible to memorize lots of information using memory techniques. I’m talking about a lesson that is more general, and in a way much bigger: that it’s possible, with training and hard work, to teach oneself to do something that might seem really difficult.” ~ Joshua Foer

Originally featured in March.

FLOATING WORLDS

Between September 1968 and October 1969, Edward Gorey — mid-century illustrator of the macabre, whose work influenced generations of creators, from Nine Inch Nails to Tim Burton — set out to collaborate on three children’s books with author and editor Peter F. Neumeyer. Over the course of this 13-month period, the two exchanged a series of letters on topics that soon expanded well beyond the three books and into everything from metaphysics to pancake recipes.

This year, Neumeyer opened up the treasure trove of this fascinating, never-before-published correspondence in Floating Worlds: The Letters of Edward Gorey and Peter F. Neumeyer — a magnificent collection of 75 typewriter-transcribed letters, 38 stunningly illustrated envelopes, and more than 60 postcards and illustrations exchanged between the two collaborators-turned-close-friends, featuring Gorey’s witty, wise meditations on such eclectic topics as insect life, the writings of Jorge Luis Borges, and Japanese art. Though neither a biography of Gorey nor a memoir by Neumeyer, it’s a delightful and revealing blend of both, full of intellectual banter and magnificent illustrations, and is also one of the year’s finest art and design books.

In light of his body of work, and because of the interest that his private person has aroused, I feel strongly that these letters should not be lost to posterity. I still read in them Ted’s wisdom, charm, and affection and a profound personal integrity that deserves to be in the record. As for my own letters to Ted, I had no idea that he had kept them until one day a couple of years ago when a co-trustee of his estate, Andras Brown, sent me a package of photocopies of my half of the correspondence. I am very grateful for that.” ~ Peter F. Neumeyer

Equally fascinating is the unlikely story of how Gorey and Neumeyer met in the first place — a story involving a hospital waiting room, a watercolor of a housefly, and a one-and-a-half-inch scrap of paper with a dot — and the affectionate friendship into which it unfolded.

There’s a remarkable hue to Gorey’s writing, a kind of thinking-big-thoughts-without-taking-oneself-too-seriously quality. In September of 1968, in what he jokingly termed “E. Gorey’s Great Simple Theory About Art,” Gorey wrote these Yodaesque words:

This is the theory… that anything that is art… is presumably about some certain thing, but is really always about something else, and it’s no good having one without the other, because if you just have the something it is boring and if you just have the something else it’s irritating.”

Illustrations © The Edward Gorey Charitable Trust. All rights reserved.

Originally featured, with more wonderful illustrations, in September.

22 DECEMBER, 2011

The 11 Best Psychology and Philosophy Books of 2011

By:

What it means to be human, how pronouns are secretly shaping our lives, and why we believe.

After the year’s best children’s books, art and design books, photography books, science books, history books, and food books, the 2011 best-of series continues with the most compelling, provocative and thought-provoking psychology and philosophy books featured here this year.

YOU ARE NOT SO SMART

We spend most of our lives going around believing we are rational, logical beings who make carefully weighted decisions based on objective facts in stable circumstances. Of course, as both a growing body of research and our own retrospective experience demonstrate, this couldn’t be further from the truth. For the past three years, David McRaney’s cheekily titled yet infinitely intelligent You Are Not So Smart has been one of my favorite smart blogs, tirelessly debunking the many ways in which our minds play tricks on us and the false interpretations we have of those trickeries. This month, YANSS joins my favorite blog-turned-book success stories with You Are Not So Smart: Why You Have Too Many Friends on Facebook, Why Your Memory Is Mostly Fiction, and 46 Other Ways You’re Deluding Yourself — an illuminating and just the right magnitude of uncomfortable almanac of some of the most prevalent and enduring lies we tell ourselves.

The original trailer for the book deals with something the psychology of which we’ve previously explored — procrastination:

And this excellent alternative trailer is a straight shot to our favorite brilliant book trailers:

From confirmation bias — our tendency to seek out information, whether or not it’s true, that confirms our existing beliefs, something all the more perilous in the age of the filter bubble — to Dunbar’s Number, our evolution-imposed upper limit of 150 friends, which pulls into question those common multi-hundred Facebook “friendships,” McRaney blends the rigor of his career as a journalist with his remarkable penchant for synthesis, humanizing some of the most important psychology research of the past century and framing it in the context of our daily lives.

Despite his second-person directive narrative, McRaney manages to keep his tone from being preachy or patronizing, instead weaving an implicit “we” into his “you” to encompass all our shared human fallibility.

From the greatest scientist to the most humble artisan, every brain within every body is infested with preconceived notions and patterns of thought that lead it astray without the brain knowing it. So you are in good company. No matter who your idols and mentors are, they too are prone to spurious speculation.” ~ David McRaney

And in the age of Books That Should’ve Stayed Articles, it’s refreshing to see McRaney distill each of these complex phenomena in articulate, lucid narratives just the right length to be stimulating without being tediously prolix.

Originally featured in November.

MONOCULTURE

“The universe is made of stories, not atoms,” poet Muriel Rukeyser famously proclaimed. The stories we tell ourselves and each other are how we make sense of the world and our place in it. Some stories become so sticky, so pervasive that we internalize them to a point where we no longer see their storiness — they become not one of many lenses on reality, but reality itself. And breaking through them becomes exponentially difficult because part of our shared human downfall is our ego’s blind conviction that we’re autonomous agents acting solely on our own volition, rolling our eyes at any insinuation we might be influenced by something external to our selves. Yet we are — we’re infinitely influenced by these stories we’ve come to internalize, stories we’ve heard and repeated so many times they’ve become the invisible underpinning of our entire lived experience.

That’s exactly what F. S. Michaels explores in Monoculture: How One Story Is Changing Everything — a provocative investigation of the dominant story of our time and how it’s shaping six key areas of our lives: our work, our relationships with others and the natural world, our education, our physical and mental health, our communities, and our creativity.

The governing pattern a culture obeys is a master story — one narrative in society that takes over the others, shrinking diversity and forming a monoculture. When you’re inside a master story at a particular time in history, you tend to accept its definition of reality. You unconsciously believe and act on certain things, and disbelieve and fail to act on other things. That’s the power of the monoculture; it’s able to direct us without us knowing too much about it.” ~ F. S. Michaels

During the Middle Ages, the dominant monoculture was one of religion and superstition. When Galileo challenged the Catholic Church’s geocentricity with his heliocentric model of the universe, he was accused of heresy and punished accordingly, but he did spark the dawn of the next monoculture, which reached a tipping point in the seventeenth century as humanity came to believe the world was fully knowable and discoverable through science, machines and mathematics — the scientific monoculture was born.

Ours, Michaels demonstrates, is a monoculture shaped by economic values and assumptions, and it shapes everything from the obvious (our consumer habits, the music we listen to, the clothes we wear) to the less obvious, the things over which we are more reluctant to relinquish our sense of autonomy (our relationships, our religion, our appreciation of art).

A monoculture doesn’t mean that everyone believes exactly the same thing or acts in exactly the same way, but that we end up sharing key beliefs and assumptions that direct our lives. Because a monoculture is mostly left unarticulated until it has been displaced years later, we learn its boundaries by trial and error. We somehow come to know how the master story goes, though no one tells us exactly what the story is or what its rules are. We develop a strong sense of what’s expected of us at work, in our families and communities — even if we sometimes choose not to meet those expectations. We usually don’t ask ourselves where those expectations came from in the first place. They just exist — or they do until we find ourselves wishing things were different somehow, though we can’t say exactly what we would change, or how.” ~ F. S. Michaels

Neither a dreary observation of all the ways in which our economic monoculture has thwarted our ability to live life fully and authentically nor a blindly optimistic sticking-it-to-the-man kumbaya, Michaels offers a smart and realistic guide to first recognizing the monoculture and the challenges of transcending its limitations, then considering ways in which we, as sentient and autonomous individuals, can move past its confines to live a more authentic life within a broader spectrum of human values.

The independent life begins with discovering what it means to live alongside the monoculture, given your particular circumstances, in your particular life and time, which will not be duplicated for anyone else. Out of your own struggle to live an independent life, a parallel structure may eventually be birthed. But the development and visibility of that parallel structure is not the goal — the goal is to live many stories, within a wider spectrum of human values.” ~ F. S. Michaels

We’ve previously examined various aspects of this dominant story — why we choose what we choose, how the media’s filter bubble shapes our worldview, why we love whom and how we love, how money came to rule the world — but Monoculture, which comes from the lovely Red Clover, weaves these threads and many more into a single lucid narrative that’s bound to first make you somewhat uncomfortable and insecure, then give you the kind of pause from which you can step back and move forward with more autonomy, authenticity and mindfulness than ever.

The book’s epilogue captures Michaels’ central premise in the most poetic and beautiful way possible:

Once we’ve thrown off our habitual paths, we think all is lost; but it’s only here that the new and the good begins.” ~ Leo Tolstoy

Originally featured in September.

THINKING, FAST AND SLOW

Legendary Israeli-American psychologist Daniel Kahneman is one of the most influential thinkers of our time. A Nobel laureate and founding father of modern behavioral economics, his work has shaped how we think about human error, risk, judgement, decision-making, happiness, and more. For the past half-century, he has profoundly impacted the academy and the C-suite, but it wasn’t until this year’s highly anticipated release of his “intellectual memoir,” Thinking, Fast and Slow, that Kahneman’s extraordinary contribution to humanity’s cerebral growth reached the mainstream — in the best way possible.

Absorbingly articulate and infinitely intelligent, this “intellectual memoir” introduces what Kahneman calls the machinery of the mind — the dual processor of the brain, divided into two distinct systems that dictate how we think and make decisions. One is fast, intuitive, reactive, and emotional. (If you’ve read Jonathan Haidt’s excellent The Happiness Hypothesis, as you should have, this system maps roughly to the metaphor of the elephant.) The other is slow, deliberate, methodical, and rational. (That’s Haidt’s rider.)

The mind functions thanks to a delicate, intricate, sometimes difficult osmotic balance between the two systems, a push and pull responsible for both our most remarkable capabilities and our enduring flaws. From the role of optimism in entrepreneurship to the heuristics of happiness to our propensity for error, Kahneman covers an extraordinary scope of cognitive phenomena to reveal a complex and fallible yet, somehow comfortingly so, understandable machine we call consciousness.

Much of the discussion in this book is about biases of intuition. However, the focus on error does not denigrate human intelligence, any more than the attention to diseases in medical texts denies good health… [My aim is to] improve the ability to identify and understand errors of judgment and choice, in others and eventually in ourselves, by providing a richer and more precise language to discuss them.” ~ Daniel Kahneman

Among the book’s most fascinating facets are the notions of the experiencing self and the remembering self, underpinning the fundamental duality of the human condition — one voiceless and immersed in the moment, the other occupied with keeping score and learning from experience.

I am my remembering self, and the experiencing self, who does my living, is like a stranger to me.” ~ Daniel Kahneman

Kahneman spoke of these two selves and the cognitive traps around them in his fantastic 2010 TED talk:

The word happiness is just not a useful word anymore because we apply it to too many different things.”

What’s most enjoyable and compelling about Thinking, Fast and Slow is that it’s so utterly, refreshingly anti-Gladwellian. There is nothing pop about Kahneman’s psychology, no formulaic story arc, no beating you over the head with an artificial, buzzword-encrusted Big Idea. It’s just the wisdom that comes from five decades of honest, rigorous scientific work, delivered humbly yet brilliantly, in a way that will forever change the way you think about thinking.

Originally featured in October.

THE SECRET LIFE OF PRONOUNS

We’re social beings wired for communicating with one another, and as new modes and platforms of communication become available to us, so do new ways of understanding the complex patterns, motivations and psychosocial phenomena that underpin that communication. That’s exactly what social psychologist and language expert James W. Pennebaker explores in The Secret Life of Pronouns: What Our Words Say About Us — a fascinating look at what Pennebaker’s groundbreaking research in computational linguistics reveals about our emotions, our sense of self, and our perception of our belonging in society. Analyzing the subtle linguistic patterns in everything from Craigslist ads to college admission essays to political speeches to Lady Gaga lyrics, Pennebaker offers hard evidence for the insight that our most unmemorable words — pronouns, prepositions, prefixes — can be most telling of true sentiment and intention.

Both a fascinating slice of human psychology and a practical toolkit for deciphering our everyday email exchanges, tweets and Facebook statuses, the research looks at what our choice of words like “I,” “she,” “mine” and “who” reveals about our deeper thoughts, emotions and motivations — and those of the people with whom we communicate.

One of the most interesting results was part of a study my students and I conducted dealing with status in email correspondence. Basically, we discovered that in any interaction, the person with the higher status uses I-words less (yes, less) than people who are low in status.” ~ James Pennebaker

Like much of scientific discovery, Pennebaker’s interest in pronouns began as a complete fluke — in the 1980s, he and his students discovered that when people were asked to write about emotional upheavals, their physical health improved, indicating that putting emotional experiences into language changed the ways people thought about their upheavals. They eventually developed a computerized text analysis program to examine how language use might predict later health improvements, trying to find out whether there was a “healthy” way to write. To his surprise, the greatest predictor of health was people’s choice of pronouns.
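
To make the idea of a computerized text analysis program concrete, here is a minimal sketch of dictionary-based word counting in Python. The category and word list are purely illustrative assumptions of mine, not Pennebaker's actual program or dictionaries, which rely on far more extensive, validated word categories.

```python
import re

# Illustrative only: a tiny stand-in for a first-person-singular word category.
FIRST_PERSON_SINGULAR = {"i", "me", "my", "mine", "myself"}

def i_word_rate(text: str) -> float:
    """Return the share of words in `text` that are first-person singular pronouns."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word in FIRST_PERSON_SINGULAR)
    return hits / len(words)

print(i_word_rate("I think my results speak for themselves."))  # 0.2857... (2 of 7 words)
print(i_word_rate("The results speak for themselves."))         # 0.0
```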

Scientific American has an excellent interview with Pennebaker:

As I pondered these findings, I started looking at how people used pronouns in other texts — blogs, emails, speeches, class writing assignments, and natural conversation. Remarkably, how people used pronouns was correlated with almost everything I studied. For example, use of first-person singular pronouns (I, me, my) was consistently related to gender, age, social class, honesty, status, personality, and much more. Although the findings were often robust, people in daily life were unable to pick them up when reading or listening to others. It was almost as if there was a secret world of pronouns that existed outside our awareness.” ~ James Pennebaker

From gender differences that turn everything you know on its head to an analysis of the language of suicidal vs. non-suicidal poets to unexpected insights into famous historical documents, The Secret Life of Pronouns gleans insights with infinite applications, from government-level lie-detection to your everyday email inbox, and makes a fine addition to these 5 essential books on language.

Originally featured in September.

INCOGNITO

Sum: Forty Tales from the Afterlives by neuroscientist David Eagleman is one of my favorite books of the past few years, so I was thrilled for the release of Eagleman’s latest gem, Incognito: The Secret Lives of the Brain — a fascinating, dynamic, faceted look under the hood of the conscious mind to reveal the complex machinery of the subconscious. Equal parts entertaining and illuminating, the book’s case studies, examples, and insights are more than mere talking points with which to impress at the next dinner party, poised instead to radically shift your understanding of the world, other people, and your own mind.

Bringing a storyteller’s articulate and fluid narrative to a scientist’s quest, Eagleman dances across an incredible spectrum of issues — brain damage, dating, drugs, beauty, synesthesia, criminal justice, artificial intelligence, optical illusions and much more — to reveal that things we take as passive givens, from our capacity for seeing a rainbow to our ability to overhear our name in a conversation we weren’t paying attention to, are the function of remarkable neural circuitry, biological wiring and cognitive conditioning.

The three-pound organ in your skull — with its pink consistency of Jell-o — is an alien kind of computational material. It is composed of miniaturized, self-configuring parts, and it vastly outstrips anything we’ve dreamt of building. So if you ever feel lazy or dull, take heart: you’re the busiest, brightest thing on the planet.” ~ David Eagleman

Sample some of Eagleman’s fascinating areas of study with this excellent talk from TEDxAlamo:

Originally featured in June.

WHAT DOES IT MEAN TO BE HUMAN?

Last year, we explored what it means to be human from the perspectives of three different disciplines — philosophy, neuroscience, and evolutionary biology — and that omnibus went on to become one of the most-read articles in Brain Pickings history. But the question at its heart is among the most fundamental inquiries of existence, one that has puzzled, tormented, and inspired humanity for centuries. That is exactly what Joanna Bourke (of Fear: A Cultural History fame) explores in What It Means to Be Human: Historical Reflections from the 1800s to the Present.

Decades before women sought liberation in the bicycle or their biceps, a more rudimentary liberation was at stake. The book opens with a letter penned in 1872 by an anonymous author identified simply as “An Earnest Englishwoman,” a letter titled “Are Women Animals?” by the newspaper editor who printed it:

Sir, —

Whether women are the equals of men has been endlessly debated; whether they have souls has been a moot point; but can it be too much to ask [for] a definitive acknowledgement that at least they are animals?… Many hon. members may object to the proposed Bill enacting that, in statutes respecting the suffrage, ‘wherever words occur which import the masculine gender they shall be held to include women;’ but could any object to the insertion of a clause in another Act that ‘whenever the word “animal” occur it shall be held to include women?’ Suffer me, through your columns, to appeal to our 650 [parliamentary] representatives, and ask — Is there not one among you then who will introduce such a motion? There would then be at least an equal interdict on wanton barbarity to cat, dog, or woman…

Yours respectfully,

AN EARNEST ENGLISHWOMAN

The broader question at the heart of the Earnest Englishwoman’s outrage, of course, isn’t merely about gender — “women” could have just as easily been any other marginalized group, from non-white Europeans to non-Westerners to even children, or a delegitimized majority-politically-treated-as-minority more appropriate to our time, such as the “99 percent.” The question, really, is what entitles one to humanness.

But seeking an answer in the ideology of humanism, Bourke is careful to point out, is hasty and incomplete:

The humanist insistence on an autonomous, willful human subject capable of acting independently in the world was based on a very particular type of human. Human civilization had been forged in the image of the male, white, well-off, educated human. Humanism installed only some humans at the centre of the universe. It disparaged ‘the woman,’ ‘the subaltern’ and ‘the non-European’ even more than ‘the animal.’ As a result, it is hardly surprising that many of these groups rejected the idea of a universal and straightforward essence of ‘the human’, substituting something much more contingent, outward-facing and complex. To rephrase Simone de Beauvoir’s inspired conclusion about women, one is not born, but made, a human.”

Bourke also admonishes against seeing the historical trend in paradigms about humanness as linear, as shifting “from the theological towards the rationalist and scientific” or “from humanist to post-humanist.” How, then, are we to examine the “porous boundary between the human and the animal”?

In complex and sometimes contradictory ways, the ideas, values and practices used to justify the sovereignty of a particular understanding of ‘the human’ over the rest of sentient life are what create society and social life. Perhaps the very concept of ‘culture’ is an attempt to differentiate ourselves from our ‘creatureliness,’ our fleshly vulnerability.”

(Cue in 15 years of leading scientists’ meditations on “culture”.)

Bourke goes on to explore history’s varied definitions of what it means to be human, which have used a wide range of imperfect, incomplete criteria — intellectual ability, self-consciousness, private property, tool-making, language, the possession of a soul, and many more.

For Aristotle, writing in the 4th century B.C., it meant having a telos — an appropriate end or goal — and to belong to a polis where “man” could truly speak:

…the power of speech is intended to set forth the expedient and inexpedient, and therefore likewise the just and the unjust. And it is a characteristic of man that he alone has any sense of good and evil, or just and unjust, and the like, and the association of living beings who have this sense makes a family and a state.”

In the early 17th century, René Descartes, whose famous statement “Cogito ergo sum” (“I think, therefore I am”) implied only humans possess minds, argued animals were “automata” — moving machines, driven by instinct alone:

Nature which acts in them according to the disposition of their organs, as one sees that a clock, which is made up of only wheels and springs can count the hours and measure time more exactly than we can with all our art.”

For late 18th-century German philosopher Immanuel Kant, rationality was the litmus test for humanity, embedded in his categorical claim that the human being was “an animal endowed with the capacity of reason”:

[The human is] markedly distinguished from all other living beings by his technical predisposition for manipulating things (mechanically joined with consciousness), by his pragmatic predisposition (to use other human beings skillfully for his purposes), and by the moral predisposition in his being (to treat himself and others according to the principle of freedom under the laws.)”

In The Descent of Man, Darwin reflected:

The difference in mind between man and the higher animals, great as it is, is certainly one of degree and not of kind. We have seen that the senses and intuitions, the various emotions and faculties, such as love, memory, attention, curiosity, imitation, reason, etc., of which man boasts, may be found in an incipient, or even sometimes in a well-developed condition, in the lower animals.”

(For more on Darwin’s fascinating studies of emotion, don’t forget Darwin’s Camera.)

Darwin’s concern was echoed quantitatively by Jared Diamond in the 1990s when, in The Third Chimpanzee, he wondered how the 2.9% genetic difference between two kinds of birds or the 2.2% difference between two gibbons makes for a different species, but the 1.6% difference between humans and chimpanzees makes for a different genus.

In the 1930s, Bertrand Lloyd, who penned Humanitarianism and Freedom, observed a difficult paradox of any definition:

Deny reason to animals, and you must equally deny it to infants; affirm the existence of an immortal soul in your baby or yourself, and you must at least have the grace to allow something of the kind to your dog.”

In 2001, Jacques Derrida articulated a similar concern:

None of the traits by which the most authorized philosophy or culture has thought it possible to recognize this ‘proper of man’ — none of them is, in all rigor, the exclusive reserve of what we humans call human. Either because some animals also possess such traits, or because man does not possess it as surely as is claimed.”

A Möbius strip, from a 1963 poster of the woodcut by M. C. Escher: 'Which side of the strip are the ants walking on?'

M. C. Escher's 'Möbius Strip II' © The M. C. Escher Company — Holland

Curiously, Bourke uses the Möbius strip as the perfect metaphor for deconstructing the human vs. animal dilemma. Just as the one-sided surface of the strip has “no inside or outside; no beginning or end; no single point of entry or exit; no hierarchical ladder to clamber up or slide down,” so “the boundaries of the human and the animal turn out to be as entwined and indistinguishable as the inner and outer sides of a Möbius strip.” Bourke points to Derrida’s definition as the most rewarding, calling him “the philosopher of the Möbius strip.”

Ultimately, What It Means to Be Human is less an answer than it is an invitation to a series of questions, questions about who and what we are as a species, as souls, and as nodes in a larger complex ecosystem of sentient beings. As Bourke poetically puts it,

Erasing the awe-inspiring variety of sentient life impoverishes all our lives.”

And whether this lens applies to animals or social stereotypes, one thing is certain: At a time when the need to celebrate both our shared humanity and our meaningful differences is all the more painfully evident, the question of what makes us human becomes not one of philosophy alone but also of politics, justice, identity, and every fiber of existence that lies between.

Originally featured earlier this month. For a related read that missed the cut by a hair, see Christian Smith’s excellent What Is A Person.

THE EGO TRICK

How “you” are you, really? Character is something we tend to think of as a static, enduring quality, and yet we glorify stories of personal transformation. In reality, our essence oscillates between a set of hard-wired patterns and a fluid spectrum of tendencies that shift over time and in reaction to circumstances. This is exactly what journalist Julian Baggini, co-founder of The Philosopher’s Magazine, tries to reconcile in The Ego Trick: In Search of the Self — an absorbing journey across philosophy, anthropology, sociology, neuroscience, religion and psychology, painting “I” as a dynamic verb rather than a static noun, a concept in conflict with much of common sense and, certainly, with the ideals of Romantic individualism we examined this morning. In his illuminating recent talk at The RSA, Baggini probes deeper into the theory of self-creation and the essence of our identity.

The topic of personal identity is strictly speaking nonexistent. It’s important to recognize that we are not the kind of things that simply popped into existence at birth, continue to exist, the same thing, then die off the cliff edge or go into another realm. We are these very remarkably ordered collections of things. It is because we’re so ordered that we are able to think of ourselves as being singular persons. But there is no singular person there, that means we’re forever changing.” ~ Julian Baggini

For a great companion read, you won’t go wrong with Antonio Damasio’s excellent Self Comes to Mind: Constructing the Conscious Brain.

Originally featured in June.

FLOURISH

Back in the day, I had the pleasure of studying under Dr. Martin Seligman, father of the thriving positive psychology movement — a potent antidote to the traditional “disease model” of psychology, which focuses on how to relieve suffering rather than how to amplify well-being. His seminal book, Authentic Happiness, was among the 7 essential books on the art and science of happiness, and this year marked the release of his highly anticipated follow-up. Flourish: A Visionary New Understanding of Happiness and Well-being is a rather radical departure from Seligman’s prior conception of happiness, which he now frames as overly simplistic and inferior to the higher ideal of lasting well-being.

Flourish is definitely not a self-help book, though it does offer insightful techniques to optimize yourself, your relationships and your business for well-being. If anything, it can read a bit wonky at times, as Seligman delves into fascinating empirical evidence culled from years of rigorous research. But I find this remarkably refreshing and stimulating amidst the sea of dumbed-down psycho-fluff.

Relieving the states that make life miserable… has made building the states that make life worth living less of a priority. The time has finally arrived for a science that seeks to understand positive emotion, build strength and virtue, and provide guideposts for finding what Aristotle called the ‘good life.'” ~ Martin Seligman

Seligman identifies five endeavors crucial to human flourishing — positive emotion, engagement, good relationships, meaning and purpose in life, and accomplishment — and examines each in detail, ultimately proposing that public policy have flourishing as its central goal.

The content itself — happiness, flow, meaning, love, gratitude, accomplishment, growth, better relationships — constitutes human flourishing. Learning that you can have more of these things is life changing. Glimpsing the vision of a flourishing human future is life changing.” ~ Martin Seligman

Seligman’s work over the years has taken him inside the brains of British lords, Australian school kids, billionaire philanthropists, Army generals, artists, educators, scientists and countless more of humanity’s most interesting and inspired specimens. The insights gleaned from these clinical cases are both sage and surprising, inviting you to look at the pillars of your own happiness with new eyes.

Originally featured in April.

THE TELL-TALE BRAIN

V. S. Ramachandran is one of the most influential neuroscientists of our time; his work has not only made seminal contributions to the understanding of autism, phantom limbs and synesthesia, among other fascinating phenomena, but has also helped introduce neuroscience to popular culture. The fact that he is better known as Rama — you know, like Prince or Madonna or Che — is a fitting reflection of his cultural cachet. This year, in furthering the inquiry into what it means to be human, Rama released his highly anticipated new book: The Tell-Tale Brain: A Neuroscientist’s Quest for What Makes Us Human — an ambitious exploration of everything from the origins of language to our relationship with art to the very mental foundation of civilization. Both empirically rooted in specific patient cases and philosophically speculative in an intelligent, grounded way, with a healthy dose of humor thrown in for good measure, it’s an absolute masterpiece of cognitive science and a living manifesto for the study of the brain.

As heady as our progress [in the sciences of the mind] has been, we need to stay completely honest with ourselves and acknowledge that we have only discovered a tiny fraction of what there is to know about the human brain. But the modest amount that we have discovered makes for a story more exciting than any Sherlock Holmes novel. I feel certain that as progress continues through the coming decades, the conceptual twists and technological turns we are in for are going to be at least as mind bending, at least as intuition shaking, and as simultaneously humbling and exalting to the human spirit as the conceptual revolutions that upended physics a century ago. The adage that fact is stranger than fiction seems to be especially true for the workings of the brain.” ~ V. S. Ramachandran

You can sample Rama’s remarkable quest to illuminate the brain with his excellent 2007 TED talk:

Originally featured in January.

THE BELIEF INSTINCT

We’re deeply fascinated by how the human mind makes sense of the world, and religion is one of the primary sensemaking mechanisms humanity has created to explain reality. On the heels of our recent explorations of the relationship between science and religion, the neuroscience of being human and the nature of reality comes The Belief Instinct: The Psychology of Souls, Destiny, and the Meaning of Life — an ambitious new investigation by evolutionary psychologist Jesse Bering, exploring one of the most important questions of human existence. Eloquently argued and engagingly written, it provides a compelling missing link between theory of mind and the need for God.

If humans are really natural rather than supernatural beings, what accounts for our beliefs about souls, immortality, a moral ‘eye in the sky’ that judges us, and so forth?”

A leading scholar of religious cognition, Bering — who heads Oxford’s Explaining Religion Project — proposes a powerful new hypothesis for the nature, origin and cognitive function of spirituality. Far from merely regurgitating existing thinking on the subject, he connects dots across different disciplines, ideologies and materials, from neuroscience to Buddhist scriptures to The Wizard of Oz. Blending empirical evidence from seminal research with literary allusions and cultural critique, Bering examines the central tenets of spirituality, from life’s purpose to the notion of afterlife, in a sociotheological context underlined by the rigor of a serious scientist.

Originally featured in February, and one of our 7 fundamental meditations on faith.

OUT OF CHARACTER

The dichotomy of good and evil is as old as the story of the world, and timeless in its relevance to just about everything we do in life, from our political and spiritual views to our taste in music, art and literature to how we think about our simple dietary choices. But while most of us recognize that these concepts of good and bad aren’t always black-and-white categories, we never cease to be surprised when someone or something we’ve perceived as “good” does or becomes something we perceive as “bad,” from an esteemed politician’s transgression to a beloved celebrity’s slip into addiction or Scientology or other socially undesirable behavior.

In Out of Character: Surprising Truths About the Liar, Cheat, Sinner (and Saint) Lurking in All of Us, researchers David DeSteno and Piercarlo Valdesolo explore this curious disconnect through the rigorous lens of science. Drawing on their research at the Social Emotions Lab at Northeastern University, the authors offer a fascinating yet highly readable perspective on the psychology of the hero/villain spectrum of human character, inviting us to reconceive personality, both our own and that of others, with a more balanced moral view that reflects the fluidity of human psychology.

The derivation of the word ‘character’ comes from an ancient Greek term referring to the indelible marks stamped on coins. Once character was pressed into your mind or soul, people assumed it was fixed. But what modern science repeatedly shows is that this just isn’t the case. As we discuss in our book, everyone’s moral behavior is much more variable than any of us would have initially predicted.” ~ David DeSteno

In this excellent talk from Northeastern’s Insights series, DeSteno reveals some of the fascinating research behind the book and the illuminating insights that came from it.

The analogy of color is an interesting way to think about [character]. Most of us think that colors are very discrete things — something’s red, it’s got redness; something’s blue, it’s got blueness. But we are creating these categories. They’re not natural kinds, they’re not given in ways that represent fundamentally distinct things. Ultimately, what determines what colors we see are the frequencies of light waves entering our eyes, so it’s along a continuum. It’s kind of the same with character. Things blend. We assume that if someone is good, that we’ve characterized them as good, that’s a discrete category, they can’t be bad. And when they are, our categories shatter. That’s because we have this illusory, arbitrary idea of what vice and virtue mean.” ~ David DeSteno

Ultimately, Out of Character makes a compelling case for seeing human character as a grayscale continuum rather than a black-and-white dichotomy of good and bad, enlisting neuroscience and cognitive psychology to reaffirm the age-old Aristotelian view of virtue and vice as fluid, interlaced existential capacities.

Originally featured in May.
