The Marginalian

The Best Science Books of 2012

It’s that time of year again, the time for those highly subjective, grossly non-exhaustive, yet inevitable and invariably fun best-of reading lists. To kick off the season, here are, in no particular order, my ten favorite science books of 2012. (Catch up on last year’s reading list here.)

INTERNAL TIME

“Six hours’ sleep for a man, seven for a woman, and eight for a fool,” Napoleon famously prescribed. (He would have scoffed at Einstein, then, who was known to require ten hours of sleep for optimal performance.) This perceived superiority of those who can get by on less sleep isn’t just something Napoleon shared with dictators like Hitler and Stalin, it’s an enduring attitude woven into our social norms and expectations, from proverbs about early birds to the basic scheduling structure of education and the workplace. But in Internal Time: Chronotypes, Social Jet Lag, and Why You’re So Tired (public library), a fine addition to these 7 essential books on time, German chronobiologist Till Roenneberg demonstrates through a wealth of research that our sleep patterns have little to do with laziness and other such scorned character flaws, and everything to do with biology.

In fact, each of us possesses a different chronotype — an internal timing type best defined by your midpoint of sleep, or midsleep, which you can calculate by dividing your average sleep duration by two and adding the resulting number to your average bedtime on free days, meaning days when your sleep and waking times are not dictated by the demands of your work or school schedule. For instance, if you go to bed at 11 P.M. and wake up at 7 A.M., add four hours to 11 P.M. and you get 3 A.M. as your midsleep.
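
The arithmetic is simple enough to script. A minimal sketch in Python (the function name and the decimal-hours convention are mine, not Roenneberg’s):

```python
def midsleep(bedtime, duration):
    # Midpoint of sleep on a 24-hour clock: bedtime plus half the sleep duration.
    # Hours are decimal, so 11:30 P.M. is 23.5.
    return (bedtime + duration / 2) % 24

# The example above: in bed at 11 P.M. (hour 23), up at 7 A.M., so 8 hours of sleep
print(midsleep(23, 8))  # 3.0, i.e. a midsleep of 3 A.M.
```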

The distribution of midsleep in Central Europe. The midsleep times (on free days) of over 60 percent of the population fall between 3:30 and 5:30 A.M.

Roenneberg traces the evolutionary roots of different sleep cycles and argues that while earlier chronotypes might have had a social advantage in agrarian and industrial societies, today’s world of time-shift work and constant connectivity has invalidated such advantages but left behind the social stigma around later chronotypes.

This myth that early risers are good people and that late risers are lazy has its reasons and merits in rural societies but becomes questionable in a modern 24/7 society. The old moral is so prevalent, however, that it still dominates our beliefs, even in modern times. The postman doesn’t think for a second that the young man might have worked until the early morning hours because he is a night-shift worker or for other reasons. He labels healthy young people who sleep into the day as lazy — as long sleepers. This attitude is reflected in the frequent use of the word-pair early birds and long sleepers [in the media]. Yet this pair is nothing but apples and oranges, because the opposite of early is late and the opposite of long is short.

Roenneberg goes on to explore sleep duration, a measure of sleep types that complements midsleep, demonstrating just as wide a spectrum of short and long sleepers and debunking the notion that people who get up late sleep longer than others — this judgment, after all, is based on the assumption that everyone goes to bed at the same time, which we increasingly do not.

Sleep duration shows a bell-shaped distribution within a population, but there are more short sleepers (on the left) than long sleepers (on the right).

The disconnect between our internal, biological time and social time — defined by our work schedules and social engagements — leads to what Roenneberg calls social jet lag, a kind of chronic exhaustion resembling the symptoms of jet lag and comparable to having to work for a company a few time zones to the east of your home.

Unlike what happens in real jet lag, people who suffer from social jet lag never leave their home base and can therefore never adjust to a new light-dark environment … While real jet lag is acute and transient, social jet lag is chronic. The amount of social jet lag that an individual is exposed to can be quantified as the difference between midsleep on free days and midsleep on work days … Over 40 percent of the Central European population suffers from social jet lag of two hours or more, and the internal time of over 15 percent is three hours or more out of synch with external time. There is no reason to assume that this would be different in other industrialized nations.

The scissors of sleep. Depending on chronotype, sleep duration can be very different between work days and free days.
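
Combined with the midsleep arithmetic above, Roenneberg’s quantification reduces to a few lines. Another minimal sketch, under the same homemade conventions:

```python
def midsleep(bedtime, duration):
    return (bedtime + duration / 2) % 24  # midpoint of sleep on a 24-hour clock

def social_jetlag(midsleep_free, midsleep_work):
    # The gap between free-day and work-day midsleep, in hours,
    # measured the short way around the clock face
    gap = abs(midsleep_free - midsleep_work)
    return min(gap, 24 - gap)

# A late chronotype: 1 A.M. to 9 A.M. on free days, 11:30 P.M. to 6:30 A.M. on work days
print(social_jetlag(midsleep(1, 8), midsleep(23.5, 7)))  # 2.0 hours, inside the 2+ hour band
```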

Chronotypes vary with age:

Young children are relatively early chronotypes (to the distress of many young parents), and then gradually become later. During puberty and adolescence humans become true night owls, and then around twenty years of age reach a turning point and become earlier again for the rest of their lives. On average, women reach this turning point at nineteen and a half while men start to become earlier again at twenty-one … [T]his clear turning point in the developmental changes of chronotype … [is] the first biological marker for the end of adolescence.

Roenneberg points out that in our culture, there is a great disconnect between teenagers’ biological abilities and our social expectations of them, encapsulated in what is known as the disco hypothesis — the notion that if only teens would go to bed earlier, meaning not party until late, they’d be better able to wake up clear-headed and ready for school at the expected time. The data, however, indicate otherwise — adolescents’ internal time is shifted so they don’t find sleep before the small hours of the night, a pattern also found in the life cycle of rodents.

Here, we brush up against a painfully obtrusive cultural obstacle: School starts early — as early as 7 A.M. in some European countries — and teens are expected to perform well on a schedule not designed with their internal time in mind. As a result, studies have shown that many students show the signs of narcolepsy — a severe sleeping disorder that makes one fall asleep at once when given the chance, immediately entering REM sleep. The implications are worrisome:

Teenagers need around eight to ten hours of sleep but get much less during their workweek. A recent study found that when the starting time of high school is delayed by an hour, the percentage of students who get at least eight hours of sleep per night jumps from 35.7 percent to 50 percent. Adolescent students’ attendance rate, their performance, their motivation, even their eating habits all improve significantly if school times are delayed.

Similar detrimental effects of social jet lag are found in shift work, which Roenneberg calls “one of the most blatant assaults on the body clock in modern society.” (And while we may be tempted to equate shift work with the service industry, any journalist, designer, developer, or artist who works well into the night on deadline can relate — hey, it’s well past midnight again as I’m writing this.) In fact, the World Health Organization recently classified “shift work that involves circadian disruption” as a potential cause of cancer, and the consequences of social jet lag and near-narcolepsy extend beyond the usual suspects of car accidents and medical errors:

We are only beginning to understand the potentially detrimental consequences of social jet lag. One of these has already been worked out with frightening certainty: the more severe the social jet lag that people suffer, the more likely it is that they are smokers. This is not a question of quantity (number of cigarettes per day) but simply whether they are smokers or not … Statistically, we experience the worst social jet lag as teenagers, when our body clocks are drastically delayed for biological reasons, but we still have to get up at the same traditional times for school. This coincides with the age when most individuals start smoking. Assuredly there are many different reasons people start smoking at that age, but social jet lag certainly contributes to the risk.

If young people’s psychological and emotional well-being isn’t incentive enough for policy makers — who, by the way, Roenneberg’s research indicates tend to be early chronotypes themselves — to consider later school times, one would think their health should be.

The correlation between social jet lag and smoking continues later in life as well, particularly when it comes to quitting:

[T]he less stress smokers have, the easier it is for them to quit. Social jet lag is stress, so the chances of successfully quitting smoking are higher when the mismatch of internal and external time is smaller. The numbers connecting smoking with social jet lag are striking: Among those who suffer less than an hour of social jet lag per day, we find 15 to 20 percent are smokers. This percentage systematically rises to over 60 percent when internal and external time are more than five hours out of sync.

Another factor contributing to our social jet lag is Daylight Saving Time. Even though DST’s proponents argue that it’s just one small hour, the data suggest that between March and October, DST throws off our body clocks by up to four weeks, depending on our latitude, not allowing our bodies to properly adjust to the time change, especially if we happen to be later chronotypes. The result is increased social jet lag and decreased sleep duration.

But what actually regulates our internal time? Though the temporal structures of sun time — tide, day, month, and year — play a significant role in the lives of all organisms, our biological clocks evolved in a “time-free” world and are somewhat independent of such external stimuli as light and dark. For instance, early botanical studies showed that a mimosa plant kept in a pitch-dark closet would still open and close its leaves the way it does in the day-night cycle, and subsequent studies of human subjects confined to dark bunkers showed similar preservation of their sleep and waking patterns, which followed, albeit imperfectly, the 24-hour cycle of day and night.

Our internal clocks, in fact, can be traced down to the genetic level, with individual “clock genes” and, most prominently, the suprachiasmatic nucleus, or SCN — a small region in the brain’s midline that acts as a kind of “master clock” for mammals, regulating neuronal and hormonal activity around our circadian rhythms. Roenneberg explains how our internal clocks work on the DNA level:

In the nucleus, the DNA sequence of a clock gene is transcribed to mRNA; the resulting message is exported from the nucleus, translated into a clock protein, and is then modified. This clock protein is itself part of the molecular machinery that controls the transcription of its ‘own’ gene. When enough clock proteins have been made, they are imported back into the nucleus, where they start to inhibit the transcription of their own mRNA. Once this inhibition is strong enough, no more mRNA molecules are transcribed, and the existing ones are gradually destroyed. As a consequence, no more proteins can be produced and the existing ones will also gradually be destroyed. When they are all gone, the transcriptional machinery is not suppressed anymore, and a new cycle can begin.

[…]

Despite this complexity, the important take-home message is that daily rhythms are generated by molecular mechanisms that could potentially work in a single cell, for example a single neuron of the SCN.
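
The loop Roenneberg describes, in which a gene’s protein product eventually shuts down its own transcription and then decays until the cycle can restart, has the same delayed-negative-feedback architecture that mathematical biologists study as the Goodwin oscillator. A toy simulation (the parameters are my own illustrative choices, not anything from the book) shows that such feedback alone can sustain a rhythm:

```python
# A toy Goodwin-type oscillator: mRNA is translated into a clock protein whose
# modified, nuclear form represses its own gene's transcription.
# All rates and the Hill exponent n are illustrative; time is in arbitrary units.
def simulate(steps=20_000, dt=0.01, n=12):
    m, p, r = 0.1, 0.1, 0.1  # mRNA, clock protein, nuclear repressor
    peaks, prev, rising = [], m, False
    for i in range(steps):
        dm = 1.0 / (1.0 + r**n) - 0.5 * m  # repressible transcription, mRNA decay
        dp = m - 0.5 * p                   # translation, protein turnover
        dr = p - 0.5 * r                   # nuclear import/modification, decay
        m, p, r = m + dm * dt, p + dp * dt, r + dr * dt
        if rising and m < prev:            # a local maximum of the mRNA trace
            peaks.append(round(i * dt, 1))
        rising, prev = m > prev, m
    return peaks

print(simulate()[-4:])  # evenly spaced peaks: the feedback alone sustains a rhythm
```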

Internal Time goes on to illuminate many other aspects of how chronotypes and social jet lag impact our daily lives, from birth and suicide rates to when we borrow books from the library to why older men marry younger women, and even why innovators and entrepreneurs tend to have later chronotypes.

Originally featured at length in May.

THE WHERE, THE WHY, AND THE HOW

At the intersection of art and science, The Where, the Why, and the How: 75 Artists Illustrate Wondrous Mysteries of Science (public library) invites some of today’s most celebrated artists to create scientific illustrations and charts to accompany short essays about the most fascinating unanswered questions on the minds of contemporary scientists across biology, astrophysics, chemistry, quantum mechanics, anthropology, and more. The questions cover such mind-bending subjects as whether there are more than three dimensions, why we sleep and dream, what causes depression, how long trees live, and why humans are capable of language.

The images, which come from a mix of well-known titans and promising up-and-comers, including favorites like Lisa Congdon, Gemma Correll, and Jon Klassen, borrow inspiration from antique medical illustrations, vintage science diagrams, and other historical ephemera from periods of explosive scientific curiosity.

Above all, the project is a testament to the idea that ignorance is what drives discovery and wonder is what propels science — a reminder to, as Rilke put it, live the questions and delight in reflecting on the mysteries themselves. The book’s three editors urge in the introduction:

With this book, we wanted to bring back a sense of the unknown that has been lost in the age of information. … Remember that before you do a quick online search for the purpose of the horned owl’s horns, you should give yourself some time to wonder.

The motion graphics book trailer is an absolute masterpiece itself:

Pondering the age-old question of why the universe exists, Brian Yanny asks:

Was there an era before our own, out of which our current universe was born? Do the laws of physics, the dimensions of space-time, the strengths and types and asymmetries of nature’s forces and particles, and the potential for life have to be as we observe them, or is there a branching multi-verse of earlier and later epochs filled with unimaginably exotic realms? We do not know.

What existed before the big bang?
Illustrated by Josh Cochran

Exploring how gravity works, Terry Matilsky notes:

[T]he story is not finished. We know that general relativity is not the final answer, because we have not been able to synthesize gravity with the other known laws of physics in a comprehensive “theory of everything.”

How does gravity work?
Illustrated by The Heads of State

Zooming in on the microcosm of our own bodies and their curious behaviors, Jill Conte considers why we blush:

The ruddy or darkened hue of a blush occurs when muscles in the walls of blood vessels within the skin relax and allow more blood to flow. Interestingly, the skin of the blush region contains more blood vessels than do other parts of the body. These vessels are also larger and closer to the surface, which indicates a possible relationship among physiology, emotion, and social communication. While it is known that blood flow to the skin, which serves to feed cells and regulate surface body temperature, is controlled by the sympathetic nervous system, the exact mechanism by which this process is activated specifically to produce a blush remains unknown.

What is dark matter?
Illustrated by Betsy Walton

Equal parts delightful and illuminating, The Where, the Why, and the How is the kind of treat bound to tickle your brain from both sides.

Originally featured in October.

IN PURSUIT OF THE UNKNOWN

When legendary theoretical physicist Stephen Hawking was setting out to release A Brief History of Time, one of the most influential science books in modern history, his publishers admonished him that every equation included would halve the book’s sales. Undeterred, he dared include E = mc², even though cutting it out would allegedly have sold another 10 million copies. The anecdote captures the extent of our culture’s distaste for, if not fear of, equations. And yet, argues mathematician Ian Stewart in In Pursuit of the Unknown: 17 Equations That Changed the World, equations have held remarkable power in facilitating humanity’s progress and, as such, call for rudimentary understanding as a form of our most basic literacy.

Stewart writes:

The power of equations lies in the philosophically difficult correspondence between mathematics, a collective creation of human minds, and an external physical reality. Equations model deep patterns in the outside world. By learning to value equations, and to read the stories they tell, we can uncover vital features of the world around us… This is the story of the ascent of humanity, told in 17 equations.

From how the Pythagorean theorem, which linked geometry and algebra, laid the groundwork for the best current theories of space, time, and gravity to how the Navier-Stokes equation applies to modeling climate change, Stewart delivers a scientist’s gift in a storyteller’s package to reveal how these seemingly esoteric equations are really the foundation for nearly everything we know and use today.

Some of the most revolutionary of the breakthroughs Stewart outlines came from thinkers actively interested in both the sciences and the humanities. Take René Descartes, for instance, who is best remembered for his timeless soundbite, Cogito ergo sum: ‘I think, therefore I am.’ But Descartes’ interests, Stewart points out, extended beyond philosophy and into science and mathematics. In 1639, he observed a curious numerical pattern in regular solids — what was true of a cube was also true of a dodecahedron or an icosahedron, for all of which subtracting from the number of faces the number of edges and then adding the number of vertices equaled 2. (Try it: A cube has 6 faces, 12 edges, and 8 vertices, so 6 – 12 + 8 = 2.) But Descartes, perhaps enchanted by philosophy’s grander questions, saw the equation as a minor curiosity and never published it. Only centuries later did mathematicians recognize it as monumentally important. It eventually resulted in Euler’s formula, which helps explain everything from how enzymes act on cellular DNA to why the motion of the celestial bodies can be chaotic.
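
Descartes’ observation is easy to verify for all five regular solids, whose face, edge, and vertex counts are standard:

```python
# Faces - edges + vertices for the five regular solids Descartes could have checked
solids = {
    "tetrahedron":  (4, 6, 4),
    "cube":         (6, 12, 8),
    "octahedron":   (8, 12, 6),
    "dodecahedron": (12, 30, 20),
    "icosahedron":  (20, 30, 12),
}
for name, (faces, edges, vertices) in solids.items():
    print(f"{name:>12}: {faces} - {edges} + {vertices} = {faces - edges + vertices}")
```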

So how did equations begin, anyway? Stewart explains:

An equation derives its power from a simple source. It tells us that two calculations, which appear different, have the same answer. The key symbol is the equals sign, =. The origins of most mathematical symbols are either lost in the mists of antiquity, or are so recent that there is no doubt where they came from. The equals sign is unusual because it dates back more than 450 years, yet we not only know who invented it, we even know why. The inventor was Robert Recorde, in 1557, in The Whetstone of Witte. He used two parallel lines (he used an obsolete word gemowe, meaning ‘twin’) to avoid tedious repetition of the words ‘is equal to’. He chose that symbol because ‘no two things can be more equal’. Recorde chose well. His symbol has remained in use for 450 years.

The original coinage appeared as follows:

To avoide the tediouse repetition of these woordes: is equalle to: I will sette as I doe often in woorke use, a paire of paralleles, or gemowe lines of one lengthe: =, bicause noe .2. thynges, can be moare equalle.

Far from being a mere math primer or trivia aid, In Pursuit of the Unknown is an essential piece of modern literacy, wrapped in an articulate argument for why this kind of knowledge should be precisely that.

Stewart concludes by turning his gaze towards the future, offering a kind of counter-vision to algo-utopians like Stephen Wolfram and making, instead, a case for the reliable humanity of the equation:

It is still entirely credible that we might soon find new laws of nature based on discrete, digital structures and systems. The future may consist of algorithms, not equations. But until that day dawns, if ever, our greatest insights into nature’s laws take the form of equations, and we should learn to understand them and appreciate them. Equations have a track record. They really have changed the world — and they will change it again.

Originally featured in full in April.

IGNORANCE

“Science is always wrong,” George Bernard Shaw famously proclaimed in a toast to Albert Einstein. “It never solves a problem without creating 10 more.”

In the fifth century BC, long before science as we know it existed, Socrates, one of the founding figures of Western philosophy, famously observed, “I know one thing, that I know nothing.” Some 21 centuries later, by the time he published the Principia in 1687, Sir Isaac Newton likely knew all there was to know in science at the time — a time when it was possible for a single human brain to hold all of mankind’s scientific knowledge. Fast-forward to today, and the average high school student has more scientific knowledge than Newton did at the end of his life. But somewhere along that superhighway of progress, we seem to have developed a kind of fact-fetishism that shackles us to the allure of the known and makes us indifferent to the unknown knowable. Yet it’s the latter — the unanswered questions — that make science, and life, interesting. That’s the eloquently argued case at the heart of Ignorance: How It Drives Science, in which Stuart Firestein sets out to debunk the popular idea that knowledge follows ignorance, demonstrating instead that it’s the other way around and, in the process, laying out a powerful manifesto for getting the public engaged with science — a public to whom, as Neil deGrasse Tyson recently reminded the Senate, the government is accountable in making the very decisions that shape the course of science.

The tools and currencies of our information economy, Firestein points out, are doing little in the way of fostering the kind of question-literacy essential to cultivating curiosity:

Are we too enthralled with the answers these days? Are we afraid of questions, especially those that linger too long? We seem to have come to a phase in civilization marked by a voracious appetite for knowledge, in which the growth of information is exponential and, perhaps more important, its availability easier and faster than ever.*

(For a promise of a solution, see Clay Johnson’s excellent The Information Diet.)

The cult of expertise — whose currency is static answers — obscures the very capacity for cultivating a thirst for ignorance:

There are a lot of facts to be known in order to be a professional anything — lawyer, doctor, engineer, accountant, teacher. But with science there is one important difference. The facts serve mainly to access the ignorance… Scientists don’t concentrate on what they know, which is considerable but minuscule, but rather on what they don’t know…. Science traffics in ignorance, cultivates it, and is driven by it. Mucking about in the unknown is an adventure; doing it for a living is something most scientists consider a privilege.

[…]

Working scientists don’t get bogged down in the factual swamp because they don’t care all that much for facts. It’s not that they discount or ignore them, but rather that they don’t see them as an end in themselves. They don’t stop at the facts; they begin there, right beyond the facts, where the facts run out. Facts are selected, by a process that is a kind of controlled neglect, for the questions they create, for the ignorance they point to.

Firestein, who chairs the Department of Biological Sciences at Columbia University, stresses that beyond simply accumulating facts, scientists use them as raw material, not finished product. He cautions:

Mistaking the raw material for the product is a subtle error but one that can have surprisingly far-reaching consequences. Understanding this error and its ramifications, and setting it straight, is crucial to understanding science.

What emerges is an elegant definition of science:

Real science is a revision in progress, always. It proceeds in fits and starts of ignorance.

(What is true of science is actually also true of all creativity: As Jonah Lehrer puts it “The only way to be creative over time — to not be undone by our expertise — is to experiment with ignorance, to stare at things we don’t fully understand.” Einstein knew that, too, when he noted that without a preoccupation with “the eternally unattainable in the field of art and scientific research, life would have seemed… empty.” And Kathryn Schulz touched on it with her meditation on pessimistic meta-induction.)

In highlighting this commonality science holds with other domains of creative and intellectual labor, Firestein turns to the poet John Keats, who described the ideal state of the literary psyche as Negative Capability — “that is when a man is capable of being in uncertainties, Mysteries, doubts without any irritable reaching after fact & reason.” Firestein translates this to science:

Being a scientist requires having faith in uncertainty, finding pleasure in mystery, and learning to cultivate doubt. There is no surer way to screw up an experiment than to be certain of its outcome.

He captures the heart of this argument in an eloquent metaphor:

Science, then, is not like the onion in the often used analogy of stripping away layer after layer to get at some core, central, fundamental truth. Rather it’s like the magic well: no matter how many buckets of water you remove, there’s always another one to be had. Or even better, it’s like the widening ripples on the surface of a pond, the ever larger circumference in touch with more and more of what’s outside the circle, the unknown. This growing forefront is where science occurs… It is a mistake to bob around in the circle of facts instead of riding the wave to the great expanse lying outside the circle.

However, more important than the limits of our knowledge, Firestein is careful to point out, are the limits to our ignorance. (Cue in Errol Morris’s fantastic 2010 five-part New York Times series, The Anosognosic’s Dilemma.) Science historian and Stanford professor Robert Proctor has even coined a term for the study of ignorance — agnotology — and, Firestein argues, it is a conduit to better understanding progress.

Science historian and philosopher Nicholas Rescher has offered a different term for a similar concept: Copernican cognitivism, suggesting that just as Copernicus showed us there was nothing privileged about our position in space by debunking the geocentric model of the universe, there is also nothing privileged about our cognitive landscape.

But the most memorable articulation of the limits of our own ignorance comes from the Victorian novella Flatland, where a three-dimensional sphere shows up in a two-dimensional land and inadvertently wreaks havoc on its geometric inhabitants’ most basic beliefs about the world as they struggle to imagine the very possibility of a third dimension.

An engagement with the interplay of ignorance and knowledge, the essential bargaining chips of science, is what elevated modern civilization from the intellectual flatness of the Middle Ages. Firestein points out that “the public’s direct experience of the empirical methods of science” helped humanity evolve from the magical and mystical thinking of Western medieval thought to the rational discourse of contemporary culture.

At the same time, Firestein laments, science today is often “as inaccessible to the public as if it were written in classical Latin.” Making it more accessible, he argues, necessitates introducing explanations of science that focus on the unknown as an entry point — a more inclusive gateway than the known.

In one of the most compelling passages of the book, he broadens this insistence on questions over answers to the scientific establishment itself:

Perhaps the most important application of ignorance is in the sphere of education, particularly of scientists… We must ask ourselves how we should educate scientists in the age of Google and whatever will supersede it… The business model of our Universities, in place now for nearly a thousand years, will need to be revised.

[…]

Instead of a system where the collection of facts is an end, where knowledge is equated with accumulation, where ignorance is rarely discussed, we will have to provide the Wiki-raised student with a taste of and for boundaries, the edge of the widening circle of ignorance, how the data, which are not unimportant, frames the unknown. We must teach students how to think in questions, how to manage ignorance. W. B. Yeats admonished that ‘education is not the filling of a pail, but the lighting of a fire.’

Firestein sums it up beautifully:

Science produces ignorance, and ignorance fuels science. We have a quality scale for ignorance. We judge the value of science by the ignorance it defines. Ignorance can be big or small, tractable or challenging. Ignorance can be thought about in detail. Success in science, either doing it or understanding it, depends on developing comfort with the ignorance, something akin to Keats’ negative capability.

Originally featured in April.

* See some thoughts on the difference between access and accessibility.

DREAMLAND

The Ancient Greeks believed that one fell asleep when the brain filled with blood and awakened once it drained back out. Nineteenth-century philosophers contended that sleep happened when the brain was emptied of ambitions and stimulating thoughts. “If sleep doesn’t serve an absolutely vital function, it is the greatest mistake evolution ever made,” biologist Allan Rechtschaffen once remarked. Even today, sleep remains one of the most poorly understood human biological functions, despite some recent strides in understanding the “social jet lag” of our internal clocks and the relationship between dreaming and depression. In Dreamland: Adventures in the Strange Science of Sleep (public library), journalist David K. Randall — who stumbled upon the idea after crashing violently into a wall while sleepwalking — explores “the largest overlooked part of your life and how it affects you even if you don’t have a sleep problem.” From gender differences to why some people snore and others don’t to why we dream, he dives deep into this mysterious third of human existence to illuminate what happens when night falls and how it impacts every aspect of our days.

Most of us will spend a full third of our lives asleep, and yet we don’t have the faintest idea of what it does for our bodies and our brains. Research labs offer surprisingly few answers. Sleep is one of the dirty little secrets of science. My neurologist wasn’t kidding when he said there was a lot that we don’t know about sleep, starting with the most obvious question of all — why we, and every other animal, need to sleep in the first place.

But before we get too anthropocentrically arrogant in our assumptions, it turns out the quantitative requirement of sleep isn’t correlated with how high up the evolutionary chain an organism is:

Lions and gerbils sleep about thirteen hours a day. Tigers and squirrels nod off for about fifteen hours. At the other end of the spectrum, elephants typically sleep three and a half hours at a time, which seems lavish compared to the hour and a half of shut-eye that the average giraffe gets each night.

[…]

Humans need roughly one hour of sleep for every two hours they are awake, and the body innately knows when this ratio becomes out of whack. Each hour of missed sleep one night will result in deeper sleep the next, until the body’s sleep debt is wiped clean.
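
Taken literally, the book’s two-to-one rule and its self-cancelling sleep debt make for a simple ledger. A toy sketch of the bookkeeping (my own, not Randall’s), not a physiological model:

```python
def sleep_ledger(nights, hours_awake=16):
    # The book's rule of thumb: one hour of sleep for every two hours awake,
    # so a 16-hour waking day calls for 8 hours of sleep.
    needed, debt = hours_awake / 2, 0.0
    for slept in nights:
        debt = max(0.0, debt + needed - slept)  # extra sleep pays the debt down
        print(f"slept {slept}h -> running sleep debt {debt:.1f}h")

sleep_ledger([6, 6, 10, 10])  # two short nights, then recovery sleep wipes the debt clean
```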

What, then, happens as we doze off, exactly? Like all science, our understanding of sleep seems to be a constant “revision in progress”:

Despite taking up so much of life, sleep is one of the youngest fields of science. Until the middle of the twentieth century, scientists thought that sleep was an unchanging condition during which time the brain was quiet. The discovery of rapid eye movements in the 1950s upended that. Researchers then realized that sleep is made up of five distinct stages that the body cycles through over roughly ninety-minute periods. The first is so light that if you wake up from it, you might not realize that you have been sleeping. The second is marked by the appearance of sleep-specific brain waves that last only a few seconds at a time. If you reach this point in the cycle, you will know you have been sleeping when you wake up. This stage marks the last drop before your brain takes a long ride away from consciousness. Stages three and four are considered deep sleep. In three, the brain sends out long, rhythmic bursts called delta waves. Stage four is known as slow-wave sleep for the speed of its accompanying brain waves. The deepest form of sleep, this is the farthest that your brain travels from conscious thought. If you are woken up while in stage four, you will be disoriented, unable to answer basic questions, and want nothing more than to go back to sleep, a condition that researchers call sleep drunkenness. The final stage is REM sleep, so named because of the rapid movements of your eyes dancing against your eyelids. In this stage of sleep, the brain is as active as it is when it is awake. This is when most dreams occur.

(Recall the role of REM sleep in regulating negative emotions.)

Randall’s most urgent point, however, echoes what we’ve already heard from German chronobiologist Till Roenneberg (see above) — in our blind lust for the “luxuries” of modern life, with all its 24-hour news cycles, artificial lighting on demand, and expectations of round-the-clock telecommunications availability, we’ve thrown ourselves into a kind of circadian schizophrenia:

We are living in an age when sleep is more comfortable than ever and yet more elusive. Even the worst dorm-room mattress in America is luxurious compared to sleeping arrangements that were common not long ago. During the Victorian era, for instance, laborers living in workhouses slept sitting on benches, with their arms dangling over a taut rope in front of them. They paid for this privilege, implying that it was better than the alternatives. Families up to the time of the Industrial Revolution engaged in the nightly ritual of checking for rats and mites burrowing in the one shared bedroom. Modernity brought about a drastic improvement in living standards, but with it came electric lights, television, and other kinds of entertainment that have thrown our sleep patterns into chaos.

Work has morphed into a twenty-four-hour fact of life, bringing its own set of standards and expectations when it comes to sleep … Sleep is ingrained in our cultural ethos as something that can be put off, dosed with coffee, or ignored. And yet maintaining a healthy sleep schedule is now thought of as one of the best forms of preventative medicine.

Reflecting on his findings, Randall marvels:

As I spent more time investigating the science of sleep, I began to understand that these strange hours of the night underpin nearly every moment of our lives.

Indeed, Dreamland goes on to explore how sleep — its mechanisms, its absence, its cultural norms — affects everyone from police officers and truck drivers to artists and entrepreneurs, permeating everything from our decision-making to our emotional intelligence.

Originally featured in August.

TREES OF LIFE

Since the dawn of recorded history, humanity has been turning to the visual realm as a sensemaking tool for the world and our place in it, mapping and visualizing everything from the body to the brain to the universe to information itself. Trees of Life: A Visual History of Evolution (public library) catalogs 230 tree-like branching diagrams, culled from 450 years of mankind’s visual curiosity about the living world and our quest to understand the complex ecosystem we share with other organisms, from bacteria to birds, microbes to mammals.

Though the use of a tree as a metaphor for understanding the relationships between organisms is often attributed to Darwin, who articulated it in his On the Origin of Species by Means of Natural Selection in 1859, the concept, most recently appropriated in mapping systems and knowledge networks, is actually much older, predating the theory of evolution itself. The collection is thus at once a visual record of the evolution of science and of its opposite — the earliest examples, dating as far back as the sixteenth century, portray the mythic order in which God created Earth, and the diagrams’ development over the centuries is as much a progression of science as it is of culture, society, and paradigm.

Theodore W. Pietsch writes in the introduction:

The tree as an iconographic metaphor is perhaps the most universally widespread of all great cultural symbols. Trees appear and reappear throughout human history to illustrate nearly every aspect of life. The structural complexity of a tree — its roots, trunk, bifurcating branches, and leaves — has served as an ideal symbol throughout the ages to visualize and map hierarchies of knowledge and ideas.

The Ladder of Ascent and Descent of the Intellect, not tree-like at first glance, but certainly branching dichotomously, the steps labeled from bottom to top, with representative figures on the right and upper left: Lapis (stone), Flamma (fire), Planta (plant), Brutum (beast), Homo (human), Caelum (sky), Angelus (angel), and Deus (God), a scheme that shows how one might ascend from inferior to superior beings and vice versa. After Ramon Lull (1232–1315), Liber de ascensu et descensu intellectus, written about 1305 but not published until 1512.
The ‘Crust of the Earth as Related to Zoology,’ presenting, at one glance, the ‘distribution of the principal types of animals, and the order of their successive appearance in the layers of the earth’s crust,’ published by Louis Agassiz and Augustus Addison Gould as the frontispiece of their 1848 Principles of Zoölogy. The diagram is like a wheel with numerous radiating spokes, each spoke representing a group of animals, superimposed over a series of concentric rings of time, from pre-Silurian to the ‘modern age.’ According to a divine plan, different groups of animals appear within the various ‘spokes’ of the wheel and then, in some cases, go extinct. Humans enter only in the outermost layer, at the very top of the diagram, shown as the crowning achievement of all Creation.
‘Genealogy of the class of fishes’ published by Louis Agassiz in his Recherches sur les poissons fossiles (Research on fossil fishes) of 1844.
The ‘genealogico-geographical affinities’ of plant families based on the natural orders of Carl Linnaeus (1751), published by Paul Dietrich Giseke in 1792. Each family is represented by a numbered circle (roman numerals), the diameter of which gives a rough measure of the relative number of included genera (arabic numerals).
The unique egg-shaped ‘system of animals’ published by German zoologist Georg August Goldfuss in his Über die Entwicklungsstufen des Thieres (On animal development) of 1817.
‘Universal system of nature,’ from Paul Horaninow’s Primae lineae systematis naturae (Primary system of nature) of 1834, an ingenious and seemingly indecipherable clockwise spiral that places animals in the center of the vortex, arranged in a series of concentric circles, surrounded in turn by additional nested circles that contain the plants, nonmetallic minerals, and finally metals within the outermost circle. Not surprisingly, everything is subjugated to humans (Homo) located in the center.

Ernst Haeckel’s famous ‘great oak,’ a family tree of animals, from the first edition of his 1874 Anthropogenie oder Entwickelungsgeschichte des Menschen (The evolution of man).

(More on Haeckel’s striking biological art here.)

Tree by John Henry Schaffner showing the relationships of the flowering plants. The early split at the base of the tree leads to the monocotyledonous plants on the left and the dicotyledons on the right.
Schaffner, 1934, Quarterly Review of Biology, 9(2):150, fig. 2; courtesy of Perry Cartwright and the University of Chicago Press.
A phylogeny of horses showing their geological distribution throughout the Tertiary, by Ruben Arthur Stirton.
Stirton, 1940, plate following page 198; courtesy of Rebecca Wells and the University of California Press.
Ruben Arthur Stirton’s revised view of horse phylogeny.
Stirton, 1959, Time, Life, and Man: The Fossil Record, p. 466, fig. 250; courtesy of Sheik Safdar and John Wiley & Sons, Inc. Used with permission.
William King Gregory’s 1946 tree of rodent relationships.
Gregory, 1951, Evolution Emerging: A Survey of Changing Patterns from Primeval Life to Man, vol. 2, p. 757; fig. 20.33; courtesy of Mary DeJong, Mai Qaraman, and the American Museum of Natural History.
The frontispiece of William King Gregory’s two-volume Evolution Emerging.
Gregory, 1951, Evolution Emerging: A Survey of Changing Patterns from Primeval Life to Man; courtesy of Mary DeJong, Mai Qaraman, and the American Museum of Natural History.

Originally featured in May.

SPACE CHRONICLES

Neil deGrasse Tyson might be one of today’s most prominent astrophysicists, but he’s also a kind of existential philosopher, bringing his insights from science into the broader realm of the human condition — a kind of modern-day Carl Sagan with a rare gift for blending science and storytelling to both rub neurons with his fellow scientists and engage a popular-interest audience. In Space Chronicles: Facing the Ultimate Frontier, Tyson explores the future of space travel in the wake of NASA’s decision to put human space flight essentially on hold, using his signature wit and scientific prowess to lay out an urgent manifesto for the economic, social, moral, and cultural importance of space exploration. This excerpt from the introduction captures Tyson’s underlying ethos and echoes other great thinkers’ ideas about intuition and rationality, blending the psychosocial with the political:

Some of the most creative leaps ever taken by the human mind are decidedly irrational, even primal. Emotive forces are what drive the greatest artistic and inventive expressions of our species. How else could the sentence ‘He’s either a madman or a genius’ be understood?

It’s okay to be entirely rational, provided everybody else is too. But apparently this state of existence has been achieved only in fiction [where] societal decisions get made with efficiency and dispatch, devoid of pomp, passion, and pretense.

To govern a society shared by people of emotion, people of reason, and everybody in between — as well as people who think their actions are shaped by logic but in fact are shaped by feelings and nonempirical philosophies — you need politics. At its best, politics navigates all the mind-states for the sake of the greater good, alert to the rocky shoals of community, identity, and the economy. At its worst, politics thrives on the incomplete disclosure or misrepresentation of data required by an electorate to make informed decisions, whether arrived at logically or emotionally.

Nowhere does Tyson’s gift shine more brilliantly than in this goosebump-inducing mashup by Max Schlickenmeyer, remixing images of nature at its most inspiring with the narration of Tyson’s answer to a TIME magazine reader, who asked, “What is the most astounding fact you can share with us about the Universe?”

When I look up at the night sky and I know that, yes, we are part of this Universe, we are in this Universe, but perhaps more important than most of those facts is that the Universe is in us. When I reflect on that fact, I look up — many people feel small, because they’re small, the Universe is big — but I feel big, because my atoms came from those stars. There’s a level of connectivity — that’s really what you want in life. You want to feel connected, you want to feel relevant. You want to feel like you’re a participant in the goings on and activities and events around you. That’s precisely what we are, just by being alive.

Originally featured in March.

HIDDEN TREASURE

For the past 175 years, the National Library of Medicine in Bethesda has been building the world’s largest collection of biomedical images, artifacts, and ephemera. With more than 17 million items spanning ten centuries, it’s a treasure trove of rare, obscure, extravagant wonders, most of which remain unseen by the public and unknown even to historians, librarians, and curators. Until now.

Hidden Treasure is an exquisite large-format volume that culls some of the most fascinating, surprising, beautiful, gruesome, and idiosyncratic objects from the Library’s collection in 450 full-color illustrations. From rare “magic lantern slides” doctors used to entertain and cure inmates at the St. Elizabeth’s Hospital for the Insane to astonishing anatomical atlases to the mimeographed report of the Japanese medical team first to enter Hiroshima after the atomic blast, each of these curiosities is contextualized in a brief essay by a prominent scholar, journalist, artist, collector, or physician. What results is a remarkable journey not only into the evolution of mankind’s understanding of the physicality of being human, but also into the evolution of librarianship itself, amidst the age of the digital humanities.

The Artificial Teledioptric Eye, or Telescope (1685-86) by Johann Zahn
Zahn’s baroque diagram of the anatomy of vision (left) needs to be viewed in relation to his creation of a mechanical eye (right), the scioptric ball designed to project the image of the sun in a camera obscura
Printed book, 3 volumes
International Nurse Uniform Photograph Collection (ca. 1950), Helene Fuld Health Foundation
Left to right, top to bottom: Philippines, Denmark, British Honduras; Hong Kong, Madeira, Kenya; Nepal, Dominican Republic, Colombia
Jersey City, New Jersey. 93 color photographs, glossy

Michael North, Jeffrey Reznick, and Michael Sappol remind us in the introduction:

It’s no secret that nowadays we look for libraries on the Internet — without moving from our desks or laptops or mobile phones… We’re in a new and miraculous age. But there are still great libraries, in cities and on campuses, made of brick, sandstone, marble, and glass, containing physical objects, and especially enshrining the book: the Library of Congress, Bibliothèque nationale de France, the British Library, the New York Public Library, the Wellcome Library, the great university libraries at Oxford, Harvard, Yale, Johns Hopkins, and elsewhere. And among them is the National Library of Medicine in Bethesda, the world’s largest medical library, with its collection of over 17 million books, journals, manuscripts, prints, photographs, posters, motion pictures, sound recordings, and “ephemera” (pamphlets, matchbook covers, stereograph cards, etc.).

Complete Notes on the Dissection of Cadavers (1772)
Muscles and attachments
Kaishi Hen. Kyoto, Japan. Printed woodblock book, color illustrations
Darwin Collection (1859-1903)
The expression of emotions in cats and dogs, The Expression of the Emotions in Man and Animals (London, 1872)
London, New York, and other locations

(Also see how Darwin’s photographic studies of human emotions changed visual culture forever.)

Civil War Surgical Card Collection (1860s)
The Army Medical Museum’s staff mined incoming reports for ‘interesting’ cases — such as a gunshot wound to the ‘left side of scalp, denuding skull’ or ‘gunshot wound, right elbow with gangrene supervening’ — and cases that demonstrated the use of difficult surgical techniques, such as an amputation by circular incision or resection of the ‘head of humerus and three inches of the left clavicle.’
Washington, DC. 146 numbered cards, with tipped-in photographs and case histories
Studies in Anatomy of the Nervous System and Connective Tissue (1875-76) by Axel Key and Gustaf Retzius
Arachnoid villi, or pacchionian bodies, of the human brain.
Studien in der Anatomie des Nervensystems und des Bindegewebes. Stockholm. Printed book, with color and black-and-white lithographs, 2 volumes.
Anti-Germ Warfare Campaign Posters (ca. 1952), Second People’s Cultural Institute
Hand-drawn Korean War propaganda posters, from two incomplete sequences in the collection of Chinese medical and health materials acquired by the National Library of Medicine
Fuping County, Shaanxi Province, China. Hand-inked and painted posters on paper.
Medical Trade Card Collection (ca. 1920-1940s)
The front of a Dr. Miles’ Laxative Tablets movable, die-cut advertising novelty card, lowered and raised (Elkhart, Indiana, ca. 1910)
France, Great Britain, Mexico, United States, and other countries. Donor: William Helfand

Thoughtfully curated, beautifully produced, and utterly transfixing, Hidden Treasure unravels our civilization’s relationship with that most human of humannesses. Because try as we might to order the heavens, map the mind, and chart time in our quest to know the abstract, we will have failed at being human if we neglect this most fascinating frontier of concrete existence, the mysterious and ever-alluring physical body.

Originally featured, with more images, in April.

Honorable mention: The Art of Medicine.

QUANTUM UNIVERSE

“The universe is made of stories, not atoms,” poet Muriel Rukeyser famously remarked. “We’re made of star-stuff,” Carl Sagan countered. But some of the most fascinating and important stories are those that explain atoms and “star stuff.” Such is the case of The Quantum Universe: Everything That Can Happen Does Happen by rockstar-physicist Brian Cox and University of Manchester professor Jeff Forshaw — a remarkable and absorbing journey into the fundamental fabric of nature, exploring how quantum theory provides a framework for explaining everything from silicon chips to stars to human behavior.

Quantum theory is perhaps the prime example of the infinitely esoteric becoming the profoundly useful. Esoteric, because it describes a world in which a particle really can be in several places at once and moves from one place to another by exploring the entire Universe simultaneously. Useful, because understanding the behaviour of the smallest building blocks of the universe underpins our understanding of everything else. This claim borders on the hubristic, because the world is filled with diverse and complex phenomena. Notwithstanding this complexity, we have discovered that everything is constructed out of a handful of tiny particles that move around according to the rules of quantum theory. The rules are so simple that they can be summarized on the back of an envelope. And the fact that we do not need a whole library of books to explain the essential nature of things is one of the greatest mysteries of all.

The story weaves a century of scientific hindsight and theoretical developments, from Einstein to Feynman by way of Max Planck, who coined the term “quantum” in 1900 to describe the “black body radiation” of hot objects through light emitted in little packets of energy he called “quanta,” to arrive at a modern perspective on quantum theory and its primary role in predicting observable phenomena.

The picture of the universe we inhabit, as revealed by modern physics, [is] one of underlying simplicity; elegant phenomena dance away out of sight and the diversity of the macroscopic world emerges. This is perhaps the crowning achievement of modern science; the reduction of the tremendous complexity in the world, human beings included, to a description of the behaviour of just a handful of tiny subatomic particles and the four forces that act between them.

To demonstrate that quantum theory is intimately entwined with the fabric of our everyday, rather than a weird and esoteric fringe of science, Cox offers an example rooted in the familiar. (An example, in this particular case, based on the wrong assumption — I was holding an iPad — in a kind of ironic meta-wink from Heisenberg’s uncertainty principle.)

Consider the world around you. You are holding a book made of paper, the crushed pulp of a tree. Trees are machines able to take a supply of atoms and molecules, break them down and rearrange them into cooperating colonies composed of many trillions of individual parts. They do this using a molecule known as chlorophyll, composed of over a hundred carbon, hydrogen and oxygen atoms twisted into an intricate shape with a few magnesium and nitrogen atoms bolted on. This assembly of particles is able to capture the light that has travelled the 93 million miles from our star, a nuclear furnace the volume of a million earths, and transfer that energy into the heart of cells, where it is used to build molecules from carbon dioxide and water, giving out life-enriching oxygen as it does so. It’s these molecular chains that form the superstructure of trees and all living things, the paper in your book. You can read the book and understand the words because you have eyes that can convert the scattered light from the pages into electrical impulses that are interpreted by your brain, the most complex structure we know of in the Universe. We have discovered that all these things are nothing more than assemblies of atoms, and that the wide variety of atoms are constructed using only three particles: electrons, protons and neutrons. We have also discovered that the protons and neutrons are themselves made up of smaller entities called quarks, and that is where things stop, as far as we can tell today. Underpinning all of this is quantum theory.

But at the core of The Quantum Universe are a handful of grand truths that transcend the realm of science as an academic discipline and shine out into the vastest expanses of human existence: that in science, as in art, everything builds on what came before; that everything is connected to everything else; and, perhaps most importantly, that despite our greatest compulsions for control and certainty, much of the universe — to which the human heart and mind belong — remains reigned over by chance and uncertainty. Cox puts it this way:

A key feature of quantum theory [is that] it deals with probabilities rather than certainties, not because we lack absolute knowledge, but because some aspects of Nature are, at their very heart, governed by the laws of chance.

Originally featured in February.

BIG QUESTIONS

“If you wish to make an apple pie from scratch,” Carl Sagan famously observed in Cosmos, “you must first invent the universe.” The questions children ask are often so simple, so basic, that they turn unwittingly yet profoundly philosophical in requiring apple-pie-from-scratch type of answers. To explore this fertile intersection of simplicity and expansiveness, Gemma Elwin Harris asked thousands of primary school children between the ages of four and twelve to send in their most restless questions, then invited some of today’s most prominent scientists, philosophers, and writers to answer them. The result is Big Questions from Little People & Simple Answers from Great Minds (public library) — a compendium of fascinating explanations of deceptively simple everyday phenomena, featuring such modern-day icons as Mary Roach, Noam Chomsky, Philip Pullman, Richard Dawkins, and many more, with a good chunk of the proceeds being donated to Save the Children.

Alain de Botton explores why we have dreams:

Most of the time, you feel in charge of your own mind. You want to play with some Lego? Your brain is there to make it happen. You fancy reading a book? You can put the letters together and watch characters emerge in your imagination.

But at night, strange stuff happens. While you’re in bed, your mind puts on the weirdest, most amazing and sometimes scariest shows.

[…]

In the olden days, people believed that our dreams were full of clues about the future. Nowadays, we tend to think that dreams are a way for the mind to rearrange and tidy itself up after the activities of the day.

Why are dreams sometimes scary? During the day, things may happen that frighten us, but we are so busy we don’t have time to think properly about them. At night, while we are sleeping safely, we can give those fears a run around. Or maybe something you did during the day was lovely but you were in a hurry and didn’t give it time. It may pop up in a dream. In dreams, you go back over things you missed, repair what got damaged, make up stories about what you’d love, and explore the fears you normally put to the back of your mind.

Dreams are both more exciting and more frightening than daily life. They’re a sign that our brains are marvellous machines — and that they have powers we don’t often give them credit for, when we’re just using them to do our homework or play a computer game. Dreams show us that we’re not quite the bosses of our own selves.

Evolutionary biologist Richard Dawkins breaks down the math of evolution and cousin marriages to demonstrate that we are all related:

Yes, we are all related. You are a (probably distant) cousin of the Queen, and of the president of the United States, and of me. You and I are cousins of each other. You can prove it to yourself.

Everybody has two parents. That means, since each parent had two parents of their own, that we all have four grandparents. Then, since each grandparent had to have two parents, everyone has eight great-grandparents, and sixteen great-great-grandparents and thirty-two great-great-great-grandparents and so on.

You can go back any number of generations and work out the number of ancestors you must have had that same number of generations ago. All you have to do is multiply two by itself that number of times.

Suppose we go back ten centuries, that is to Anglo-Saxon times in England, just before the Norman Conquest, and work out how many ancestors you must have had alive at that time.

If we allow four generations per century, that’s about forty generations ago.

Two multiplied by itself forty times comes to more than a thousand billion. Yet the total population of the world at that time was only around three hundred million. Even today the population is seven billion, yet we have just worked out that a thousand years ago your ancestors alone were more than 150 times as numerous.

[…]

The real population of the world at the time of Julius Caesar was only a few million, and all of us, all seven billion of us, are descended from them. We are indeed all related. Every marriage is between more or less distant cousins, who already share lots and lots of ancestors before they have children of their own.

By the same kind of argument, we are distant cousins not only of all human beings but of all animals and plants. You are a cousin of my dog and of the lettuce you had for lunch, and of the next bird that you see fly past the window. You and I share ancestors with all of them. But that is another story.
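
Dawkins's doubling arithmetic is easy to rerun for yourself. Here is a minimal Python sketch (my own illustration, not from the book) that checks the numbers quoted above:

# Each generation back doubles your ancestor "slots".
generations = 40                    # ~ten centuries at four generations per century
ancestor_slots = 2 ** generations   # 1,099,511,627,776 -- over a thousand billion

world_population_then = 300_000_000    # rough figure for a thousand years ago
world_population_now = 7_000_000_000

print(ancestor_slots)                          # 1099511627776
print(ancestor_slots / world_population_now)   # ~157: "more than 150 times"
print(ancestor_slots / world_population_then)  # ~3665 slots per person alive then

# The slots vastly outnumber the people who actually existed, so the same
# real ancestors must each fill enormous numbers of slots -- which is why
# every marriage is between more or less distant cousins.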

Neuroscientist David Eagleman explains why we can’t tickle ourselves:

To understand why, you need to know more about how your brain works. One of its main tasks is to try to make good guesses about what’s going to happen next. While you’re busy getting on with your life, walking downstairs or eating your breakfast, parts of your brain are always trying to predict the future.

Remember when you first learned how to ride a bicycle? At first, it took a lot of concentration to keep the handlebars steady and push the pedals. But after a while, cycling became easy. Now you’re not aware of the movements you make to keep the bike going. From experience, your brain knows exactly what to expect so your body rides the bike automatically. Your brain is predicting all the movements you need to make.

You only have to think consciously about cycling if something changes — like if there’s a strong wind or you get a flat tyre. When something unexpected happens like this, your brain is forced to change its predictions about what will happen next. If it does its job well, you’ll adjust to the strong wind, leaning your body so you don’t fall.

Why is it so important for our brains to predict what will happen next? It helps us make fewer mistakes and can even save our lives.

[…]

Because your brain is always predicting your own actions, and how your body will feel as a result, you cannot tickle yourself. Other people can tickle you because they can surprise you. You can’t predict what their tickling actions will be.

And this knowledge leads to an interesting truth: if you build a machine that allows you to move a feather, but the feather moves only after a delay of a second, then you can tickle yourself. The results of your own actions will now surprise you.
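
Eagleman's prediction-and-surprise account lends itself to a toy model. In the Python sketch below (my own illustration of the idea, not Eagleman's), what you feel is simply whatever portion of a touch your brain failed to predict from its own motor commands:

# Toy model: the brain subtracts its predicted sensation from the incoming
# one, so perfectly predicted touches barely register.

def felt_intensity(actual_touch, predicted_touch):
    # Perceived intensity is whatever the prediction failed to cancel.
    return max(0.0, actual_touch - predicted_touch)

# Tickling yourself: the brain predicts its own movement almost exactly.
print(felt_intensity(1.0, 0.95))  # ~0.05 -- barely felt

# Someone else tickles you: no motor command, so nothing is predicted.
print(felt_intensity(1.0, 0.0))   # 1.0 -- full surprise, full tickle

# The delayed-feather machine: by the time the touch arrives, the prediction
# no longer matches, so your own action surprises you after all.
print(felt_intensity(1.0, 0.1))   # 0.9 -- ticklish again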

Particle physicist and cosmologist Lawrence Krauss explains why we’re all made of stardust:

Everything in your body, and everything you can see around you, is made up of tiny objects called atoms. Atoms come in different types called elements. Hydrogen, oxygen and carbon are three of the most important elements in your body.

[…]

How did those elements get into our bodies? The only way they could have got there, to make up all the material on our Earth, is if some of those stars exploded a long time ago, spewing all the elements from their cores into space. Then, about four and a half billion years ago, in our part of our galaxy, the material in space began to collapse. This is how the Sun was formed, and the solar system around it, as well as the material that forms all life on earth.

So, most of the atoms that now make up your body were created inside stars! The atoms in your left hand might have come from a different star from those in your right hand. You are really a child of the stars.

But my favorite answers are to the all-engulfing question, How do we fall in love? Author Jeanette Winterson offers this breathlessly poetic response:

You don’t fall in love like you fall in a hole. You fall like falling through space. It’s like you jump off your own private planet to visit someone else’s planet. And when you get there it all looks different: the flowers, the animals, the colours people wear. It is a big surprise falling in love because you thought you had everything just right on your own planet, and that was true, in a way, but then somebody signalled to you across space and the only way you could visit was to take a giant jump. Away you go, falling into someone else’s orbit and after a while you might decide to pull your two planets together and call it home. And you can bring your dog. Or your cat. Your goldfish, hamster, collection of stones, all your odd socks. (The ones you lost, including the holes, are on the new planet you found.)

And you can bring your friends to visit. And read your favourite stories to each other. And the falling was really the big jump that you had to make to be with someone you don’t want to be without. That’s it.

PS You have to be brave.

Evolutionary psychologist and sociologist Robin Dunbar balances out the poetics with a scientific look at what goes on inside the brain when we love:

What happens when we fall in love is probably one of the most difficult things in the whole universe to explain. It’s something we do without thinking. In fact, if we think about it too much, we usually end up doing it all wrong and get in a terrible muddle. That’s because when you fall in love, the right side of your brain gets very busy. The right side is the bit that seems to be especially important for our emotions. Language, on the other hand, gets done almost completely in the left side of the brain. And this is one reason why we find it so difficult to talk about our feelings and emotions: the language areas on the left side can’t send messages to the emotional areas on the right side very well. So we get stuck for words, unable to describe our feelings.

But science does allow us to say a little bit about what happens when we fall in love. First of all, we know that love sets off really big changes in how we feel. We feel all light-headed and emotional. We can be happy and cry with happiness at the same time. Suddenly, some things don’t matter any more and the only thing we are interested in is being close to the person we have fallen in love with.

These days we have scanner machines that let us watch a person’s brain at work. Different parts of the brain light up on the screen, depending on what the brain is doing. When people are in love, the emotional bits of their brains are very active, lighting up. But other bits of the brain that are in charge of more sensible thinking are much less active than normal. So the bits that normally say ‘Don’t do that because it would be crazy!’ are switched off, and the bits that say ‘Oh, that would be lovely!’ are switched on.

Why does this happen? One reason is that love releases certain chemicals in our brains. One is called dopamine, and this gives us a feeling of excitement. Another is called oxytocin and seems to be responsible for the light-headedness and cosiness we feel when we are with the person we love. When these are released in large quantities, they go to parts of the brain that are especially responsive to them.

But all this doesn’t explain why you fall in love with a particular person. And that is a bit of a mystery, since there seems to be no good reason for our choices. In fact, it seems to be just as easy to fall in love with someone after you’ve married them as before, which seems the wrong way round. And here’s another odd thing. When we are in love, we can trick ourselves into thinking the other person is perfect. Of course, no one is really perfect. But the more perfect we find each other, the longer our love will last.

Big Questions from Little People is a wonderful complement to The Where, the Why, and the How: 75 Artists Illustrate Wondrous Mysteries of Science and is certain to give you pause about much of what you thought you knew, or at the very least rekindle that childlike curiosity about and awe at the basic fabric of the world we live in.

Originally featured earlier this month.


Published November 19, 2012

https://www.themarginalian.org/2012/11/19/best-science-books-2012/
