
The 13 Best Science and Technology Books of 2013

On the heels of the year’s best reads in psychology and philosophy, art and design, history and biography, and children’s books, the season’s subjective selection of best-of reading lists continues with the finest science and technology books of 2013. (For more timeless stimulation, revisit the selections for 2012 and 2011.)

1. THIS EXPLAINS EVERYTHING

Every year since 1998, intellectual impresario and Edge editor John Brockman has been posing a single grand question to some of our time’s greatest thinkers across a wide spectrum of disciplines, then collecting the answers in an annual anthology. Last year’s answers to the question “What scientific concept will improve everybody’s cognitive toolkit?” were released in This Will Make You Smarter: New Scientific Concepts to Improve Your Thinking, one of the year’s best psychology and philosophy books.

In 2012, the question Brockman posed, proposed by none other than Steven Pinker, was “What is your favorite deep, elegant, or beautiful explanation?” The answers, representing an eclectic mix of 192 (alas, overwhelmingly male) minds spanning psychology, quantum physics, social science, political theory, philosophy, and more, are collected in the edited compendium This Explains Everything: Deep, Beautiful, and Elegant Theories of How the World Works (UK; public library) and are also available online.

In the introduction preceding the micro-essays, Brockman frames the question and its ultimate objective, adding to history’s most timeless definitions of science:

The ideas presented on Edge are speculative; they represent the frontiers in such areas as evolutionary biology, genetics, computer science, neurophysiology, psychology, cosmology, and physics. Emerging out of these contributions is a new natural philosophy, new ways of understanding physical systems, new ways of thinking that call into question many of our basic assumptions.

[…]

Perhaps the greatest pleasure in science comes from theories that derive the solution to some deep puzzle from a small set of simple principles in a surprising way. These explanations are called ‘beautiful’ or ‘elegant.’

[…]

The contributions presented here embrace scientific thinking in the broadest sense: as the most reliable way of gaining knowledge about anything — including such fields of inquiry as philosophy, mathematics, economics, history, language, and human behavior. The common thread is that a simple and nonobvious idea is proposed as the explanation of a diverse and complicated set of phenomena.

Puffer fish with Akule by photographer Wayne Levin.

Stanford neuroscientist Robert Sapolsky, eloquent as ever, marvels at the wisdom of the crowd and the emergence of swarm intelligence:

Observe a single ant, and it doesn’t make much sense, walking in one direction, suddenly careening in another for no obvious reason, doubling back on itself. Thoroughly unpredictable.

The same happens with two ants, a handful of ants. But a colony of ants makes fantastic sense. Specialized jobs, efficient means of exploiting new food sources, complex underground nests with temperature regulated within a few degrees. And critically, there’s no blueprint or central source of command—each individual ant has algorithms for its behaviors. But this is not wisdom of the crowd, where a bunch of reasonably informed individuals outperform a single expert. The ants aren’t reasonably informed about the big picture. Instead, the behavior algorithms of each ant consist of a few simple rules for interacting with the local environment and local ants. And out of this emerges a highly efficient colony.

Ant colonies excel at generating trails that connect locations in the shortest possible way, accomplished with simple rules about when to lay down a pheromone trail and what to do when encountering someone else’s trail—approximations of optimal solutions to the Traveling Salesman problem. This has useful applications. In “ant-based routing,” simulations using virtual ants with similar rules can generate optimal ways of connecting the nodes in a network, something of great interest to telecommunications companies. It applies to the developing brain, which must wire up vast numbers of neurons with vaster numbers of connections without constructing millions of miles of connecting axons. And migrating fetal neurons generate an efficient solution with a different version of ant-based routing.

A wonderful example is how local rules about attraction and repulsion (i.e., positive and negative charges) allow simple molecules in an organic soup to occasionally form more complex ones. Life may have originated this way without the requirement of bolts of lightning to catalyze the formation of complex molecules.

And why is self-organization so beautiful to my atheistic self? Because if complex, adaptive systems don’t require a blueprint, they don’t require a blueprint maker. If they don’t require lightning bolts, they don’t require Someone hurling lightning bolts.
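For the computationally curious, here is a minimal sketch of the kind of “ant-based routing” Sapolsky describes: a toy two-route pheromone simulation in Python. The rules and numbers are illustrative assumptions rather than anything from the book, but they show how local reinforcement plus evaporation can steer a colony toward the shorter path with no blueprint and no central source of command.

```python
import random

def simulate(n_ants=1000, evaporation=0.05, lengths=(4.0, 10.0), seed=42):
    """Toy model: virtual ants choose between two routes of different lengths."""
    random.seed(seed)
    pheromone = [1.0, 1.0]  # starting pheromone on each route
    for _ in range(n_ants):
        # Each ant picks a route with probability proportional to its pheromone.
        total = pheromone[0] + pheromone[1]
        route = 0 if random.random() < pheromone[0] / total else 1
        # Shorter routes get reinforced more strongly (deposit of 1/length).
        pheromone[route] += 1.0 / lengths[route]
        # Evaporation keeps stale trails from dominating forever.
        pheromone = [p * (1 - evaporation) for p in pheromone]
    return pheromone

short, long_ = simulate()
print(f"pheromone on short route: {short:.2f}, on long route: {long_:.2f}")
# After a thousand ants, nearly all the pheromone sits on the shorter route:
# an approximation of an optimal solution, discovered by purely local rules.
```

In spirit, this is also the network-routing trick Sapolsky mentions: replace the two routes with a graph of nodes and let the same local rules run.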

Developmental psychologist Howard Gardner, who famously developed the seminal theory of multiple intelligences, echoes Anaïs Nin in advocating for the role of the individual and Susan Sontag in stressing the impact of individual acts on collective fate. His answer, arguing for the importance of human beings, comes as a welcome antidote to a question that runs the danger of being inherently reductionist:

On a planet occupied now by seven billion inhabitants, I am amazed by the difference that one human being can make. Think of classical music without Mozart or Stravinsky; of painting without Caravaggio, Picasso or Pollock; of drama without Shakespeare or Beckett. Think of the incredible contributions of Michelangelo or Leonardo, or, in recent times, the outpouring of deep feeling at the death of Steve Jobs (or, for that matter, Michael Jackson or Princess Diana). Think of human values in the absence of Moses or Christ.

[…]

Despite the laudatory efforts of scientists to ferret out patterns in human behavior, I continue to be struck by the impact of single individuals, or of small groups, working against the odds. As scholars, we cannot and should not sweep these instances under the investigative rug. We should bear in mind anthropologist Margaret Mead’s famous injunction: ‘Never doubt that a small group of thoughtful, committed citizens can change the world. It is the only thing that ever has.’

Uber-curator Hans Ulrich Obrist, who also contributed to last year’s volume, considers the parallel role of patterns and chance in the works of iconic composer John Cage and painter Gerhard Richter, and the role of uncertainty in the creative process:

In art, the title of a work can often be its first explanation. And in this context I am thinking especially of the titles of Gerhard Richter. In 2006, when I visited Richter in his studio in Cologne, he had just finished a group of six corresponding abstract paintings which he gave the title Cage.

There are many relations between Richter’s painting and the compositions of John Cage. In a book about the Cage series, Robert Storr has traced them from Richter’s attendance of a Cage performance at the Festum Fluxorum Fluxus in Düsseldorf in 1963 to analogies in their artistic processes. Cage has often applied chance procedures in his compositions, notably with the use of the I Ching. Richter in his abstract paintings also intentionally allows effects of chance. In these paintings, he applies the oil paint on the canvas by means of a large squeegee. He selects the colors on the squeegee, but the factual trace that the paint leaves on the canvas is to a large extent the outcome of chance.

[…]

Richter’s concise title, Cage, can be unfolded into an extensive interpretation of these abstract paintings (and of other works)—but, one can say, the short form already contains everything. The title, like an explanation of a phenomenon, unlocks the works, describing their relation to one of the most important cultural figures of the twentieth century, John Cage, who shares with Richter the great themes of chance and uncertainty.

Writer, artist, and designer Douglas Coupland, whose biography of Marshall McLuhan remains indispensable, offers a lyrical meditation on the peculiar odds behind coincidences and déjà vus:

I take comfort in the fact that there are two human moments that seem to be doled out equally and democratically within the human condition—and that there is no satisfying ultimate explanation for either. One is coincidence, the other is déjà vu. It doesn’t matter if you’re Queen Elizabeth, one of the thirty-three miners rescued in Chile, a South Korean housewife or a migrant herder in Zimbabwe—in the span of 365 days you will pretty much have two déjà vus as well as one coincidence that makes you stop and say, “Wow, that was a coincidence.”

The thing about coincidence is that when you imagine the umpteen trillions of coincidences that can happen at any given moment, the fact is that, in practice, coincidences almost never do occur. Coincidences are actually so rare that when they do occur they are, in fact, memorable. This suggests to me that the universe is designed to ward off coincidence whenever possible—the universe hates coincidence—I don’t know why—it just seems to be true. So when a coincidence happens, that coincidence had to work awfully hard to escape the system. There’s a message there. What is it? Look. Look harder. Mathematicians perhaps have a theorem for this, and if they do, it might, by default, be a theorem for something larger than what they think it is.

What’s both eerie and interesting to me about déjà vus is that they occur almost like metronomes throughout our lives, about one every six months, a poetic timekeeping device that, at the very least, reminds us we are alive. I can safely assume that my thirteen-year-old niece, Stephen Hawking, and someone working in a Beijing luggage-making factory each experience two déjà vus a year. Not one. Not three. Two.

The underlying biodynamics of déjà vus is probably ascribable to some sort of tingling neurons in a certain part of the brain, yet this doesn’t tell us why they exist. They seem to me to be a signal from a larger point of view that wants to remind us that our lives are distinct, that they have meaning, and that they occur throughout a span of time. We are important, and what makes us valuable to the universe is our sentience and our curse and blessing of perpetual self-awareness.

Originally featured in January — read more here.

2. YOU ARE STARDUST

“Everyone you know, everyone you ever heard of, every human being who ever was … lived there — on a mote of dust suspended in a sunbeam,” Carl Sagan famously marveled in his poetic Pale Blue Dot monologue, titled after the iconic 1990 photograph of Earth. The stardust metaphor for our interconnection with the cosmos soon permeated popular culture and became a vehicle for the allure of space exploration. There’s something at once incredibly empowering and incredibly humbling in knowing that the flame in your fireplace came from the sun.

That’s precisely the kind of cosmic awe environmental writer Elin Kelsey and Toronto-based Korean artist Soyeon Kim seek to inspire in kids in You Are Stardust (public library) — an exquisite picture-book that instills that profound sense of connection with the natural world, and also among the best children’s books of the year. Underpinning the narrative is a bold sense of optimism — a refreshing antidote to the fear-appeal strategy plaguing most environmental messages today.

Kim’s breathtaking dioramas, to which this screen does absolutely no justice, mix tactile physical materials with fine drawing techniques and digital compositing to illuminate the relentlessly wondrous realities of our intertwined existence: The water in your sink once quenched the thirst of dinosaurs; with every sneeze, wind blasts out of your nose faster than a cheetah’s sprint; the electricity that powers every thought in your brain is stronger than lightning.

But rather than dry science trivia, the message is carried on the wings of poetic admiration for these intricate relationships:

Be still. Listen.

Like you, the Earth breathes.

Your breath is alive with the promise of flowers.

Each time you blow a kiss to the world, you spread pollen that might grow to be a new plant.

The book is nonetheless grounded in real science. Kelsey notes:

I wrote this book as a celebration — one to honor the extraordinary ways in which all of us simply are nature. Every example in this book is backed by current science. Every day, for instance, you breathe in more than a million pollen grains.

But what makes the project particularly exciting is that, in the face of the devastating gender gap in science education, here is a thoughtful, beautiful piece of early science education presented by two women, the most heartening such example since Lauren Redniss’s Radioactive.

A companion iPad app features sound effects, animation, an original score by Paul Aucoin, behind-the-scenes glimpses of Kim’s process in creating her stunning 3D dioramas, and even build-your-own-diorama adventures.

Originally featured in March — see more here.

3. ON LOOKING

“How we spend our days,” Annie Dillard wrote in her timelessly beautiful meditation on presence over productivity, “is, of course, how we spend our lives.” And nowhere do we fail at the art of presence more miserably and more tragically than in urban life — in the city, high on the cult of productivity, where we float past each other, past the buildings and trees and the little boy in the purple pants, past life itself, cut off from the breathing of the world by iPhone earbuds and solipsism. And yet: “The art of seeing has to be learned,” Marguerite Duras reverberates — and it can be learned, as cognitive scientist Alexandra Horowitz invites us to believe in her breathlessly wonderful On Looking: Eleven Walks with Expert Eyes (public library), also among the best psychology and philosophy books of the year — a record of her quest to walk around a city block with eleven different “experts,” from an artist to a geologist to a dog, and emerge with fresh eyes mesmerized by the previously unseen fascinations of a familiar world. It is undoubtedly one of the most stimulating books of the year, if not the decade, and the most enchanting thing I’ve read in ages. In a way, it’s the opposite but equally delightful mirror image of Christoph Niemann’s Abstract City — a concrete, immersive examination of urbanity — blending the mindfulness of Sherlock Holmes with the expansive sensitivity of Thoreau.

Horowitz begins by pointing our attention to the incompleteness of our experience of what we conveniently call “reality”:

Right now, you are missing the vast majority of what is happening around you. You are missing the events unfolding in your body, in the distance, and right in front of you.

By marshaling your attention to these words, helpfully framed in a distinct border of white, you are ignoring an unthinkably large amount of information that continues to bombard all of your senses: the hum of the fluorescent lights, the ambient noise in a large room, the places your chair presses against your legs or back, your tongue touching the roof of your mouth, the tension you are holding in your shoulders or jaw, the map of the cool and warm places on your body, the constant hum of traffic or a distant lawn-mower, the blurred view of your own shoulders and torso in your peripheral vision, a chirp of a bug or whine of a kitchen appliance.

This adaptive ignorance, she argues, is there for a reason — we celebrate it as “concentration” and welcome its way of easing our cognitive overload by allowing us to conserve our precious mental resources only for the stimuli of immediate and vital importance, and to dismiss or entirely miss all else. (“Attention is an intentional, unapologetic discriminator,” Horowitz tells us. “It asks what is relevant right now, and gears us up to notice only that.”) But while this might make us more efficient in our goal-oriented day-to-day, it also makes us inhabit a largely unlived — and unremembered — life, day in and day out.

For Horowitz, the awakening to this incredible, invisible backdrop of life came thanks to Pumpernickel, her “curly haired, sage mixed breed” (who also inspired Horowitz’s first book, the excellent Inside of a Dog: What Dogs See, Smell, and Know), as she found herself taking countless walks around the block, becoming more and more aware of the dramatically different experiences she and her canine companion were having along the exact same route:

Minor clashes between my dog’s preferences as to where and how a walk should proceed and my own indicated that I was experiencing almost an entirely different block than my dog. I was paying so little attention to most of what was right before us that I had become a sleepwalker on the sidewalk. What I saw and attended to was exactly what I expected to see; what my dog showed me was that my attention invited along attention’s companion: inattention to everything else.

The book was her answer to the disconnect, an effort to “attend to that inattention.” It is not, she warns us, “about how to bring more focus to your reading of Tolstoy or how to listen more carefully to your spouse.” Rather, it is an invitation to the art of observation:

Together, we became investigators of the ordinary, considering the block — the street and everything on it — as a living being that could be observed.

In this way, the familiar becomes unfamiliar, and the old the new.

Her approach is based on two osmotic human tendencies: our shared capacity to truly see what is in front of us, despite our conditioned concentration that obscures it, and the power of individual bias in perception — or what we call “expertise,” acquired by passion or training or both — in bringing attention to elements that elude the rest of us. What follows is a whirlwind of endlessly captivating exercises in attentive bias as Horowitz, with her archetypal New Yorker’s “special fascination with the humming life-form that is an urban street,” and her diverse companions take to the city.

First, she takes a walk all by herself, trying to note everything observable, and we quickly realize that besides her deliciously ravenous intellectual curiosity, Horowitz is a rare magician with language. (“The walkers trod silently; the dogs said nothing. The only sound was the hum of air conditioners,” she beholds her own block; passing a pile of trash bags graced by a stray Q-tip, she ponders parenthetically, “how does a Q-tip escape?”; turning her final corner, she gazes at the entrance of a mansion and “its pair of stone lions waiting patiently for royalty that never arrives.” Stunning.)

But as soon as she joins her experts, Horowitz is faced with the grimacing awareness that despite her best, most Sherlockian efforts, she was “missing pretty much everything.” She arrives at a newfound, profound understanding of what William James meant when he wrote, “My experience is what I agree to attend to. Only those items which I notice shape my mind”:

I would find myself at once alarmed, delighted, and humbled at the limitations of my ordinary looking. My consolation is that this deficiency of mine is quite human. We see, but we do not see: we use our eyes, but our gaze is glancing, frivolously considering its object. We see the signs, but not their meanings. We are not blinded, but we have blinders.

Originally featured in August, with a closer look at the expert insights. For another peek at this gem, which is easily among my top three favorite books of the past decade, learn how to do the step-and-slide.

4. WILD ONES

Wild Ones: A Sometimes Dismaying, Weirdly Reassuring Story About Looking at People Looking at Animals in America (public library) by journalist Jon Mooallem isn’t the typical story designed to make us better by making us feel bad, to scare us into behaving, into environmental empathy; Mooallem’s is not the self-righteous tone of capital-K knowing typical of many environmental activists but the scientist’s disposition of not-knowing, the poet’s penchant for “negative capability.” Rather than ready-bake answers, he offers instead directions of thought and signposts for curiosity and, in the process, somehow gently moves us a little bit closer to our better selves, to a deep sense of, as poet Diane Ackerman beautifully put it in 1974, “the plain everythingness of everything, in cahoots with the everythingness of everything else.”

In the introduction, Mooallem recalls looking at his four-year-old daughter Isla’s menagerie of stuffed animals and the odd cultural disconnect they mime:

[T]hey were foraging on the pages of every bedtime story, and my daughter was sleeping in polar bear pajamas under a butterfly mobile with a downy snow owl clutched to her chin. Her comb handle was a fish. Her toothbrush handle was a whale. She cut her first tooth on a rubber giraffe.

Our world is different, zoologically speaking — less straightforward and more grisly. We are living in the eye of a great storm of extinction, on a planet hemorrhaging living things so fast that half of its nine million species could be gone by the end of the century. At my place, the teddy bears and giggling penguins kept coming. But I didn’t realize the lengths to which humankind now has to go to keep some semblance of actual wildlife in the world. As our own species has taken over, we’ve tried to retain space for at least some of the others being pushed aside, shoring up their chances of survival. But the threats against them keep multiplying and escalating. Gradually, America’s management of its wild animals has evolved, or maybe devolved, into a surreal kind of performance art.

Yet even conservationists’ small successes — crocodile species bouncing back from the brink of extinction, peregrine falcons filling the skies once again — even these pride points demonstrate the degree to which we’ve assumed — usurped, even — a puppeteer role in the theater of organic life. Citing a scientist who lamented that “right now, nature is unable to stand on its own,” Mooallem writes:

We’ve entered what some scientists are calling the Anthropocene — a new geologic epoch in which human activity, more than any other force, steers change on the planet. Just as we’re now causing the vast majority of extinctions, the vast majority of endangered species will only survive if we keep actively rigging the world around them in their favor. … We are gardening the wilderness. The line between conservation and domestication has blurred.

He finds himself uncomfortably straddling these two animal worlds — the idyllic little-kid’s dreamland and the messy, fragile ecosystem of the real world:

Once I started looking around, I noticed the same kind of secondhand fauna that surrounds my daughter embellishing the grown-up world, too — not just the conspicuous bald eagle on flagpoles and currency, or the big-cat and raptor names we give sports teams and computer operating systems, but the whale inexplicably breaching in the life-insurance commercial, the glass dolphin dangling from a rearview mirror, the owl sitting on the rump of a wild boar silk-screened on a hipster’s tote bag. I spotted wolf after wolf airbrushed on the sides of old vans, and another wolf, painted against a full moon on purple velvet, greeting me over the toilet in a Mexican restaurant bathroom. … [But] maybe we never outgrow the imaginary animal kingdom of childhood. Maybe it’s the one we are trying to save.

[…]

From the very beginning, America’s wild animals have inhabited the terrain of our imagination just as much as they’ve inhabited the actual land. They are free-roaming Rorschachs, and we are free to spin whatever stories we want about them. The wild animals always have no comment.

So he sets out to better understand the dynamics of the cultural forces that pull these worlds together with shared abstractions and rip them apart with the brutal realities of environmental collapse. His quest, in which little Isla is a frequent companion, sends him on the trails of three endangered species — a bear, a butterfly, and a bird — which fall on three different points on the spectrum of conservation reliance, relying to various degrees on the mercy of the very humans who first disrupted “the machinery of their wildness.” On the way, he encounters a remarkably vibrant cast of characters — countless passionate citizen scientists, a theater actor who, after an HIV diagnosis, became a professional butterfly enthusiast, and even Martha Stewart — and finds in their relationship with the environment “the same creeping disquiet about the future” that Mooallem himself came to know when he became a father. In fact, the entire project was inextricably linked to his sense of fatherly responsibility:

I’m part of a generation that seems especially resigned to watching things we encountered in childhood disappear: landline telephones, newspapers, fossil fuels. But leaving your kids a world without wild animals feels like a special tragedy, even if it’s hard to rationalize why it should.

The truth is that most of us will never experience the Earth’s endangered animals as anything more than beautiful ideas. They are figments of our shared imagination, recognizable from TV, but stalking places — places out there — to which we have no intention of going. I wondered how that imaginative connection to wildlife might fray or recalibrate as we’re forced to take more responsibility for its wildness.

It also occurred to me early on that all three endangered species I was getting to know could be gone by the time Isla is my age. It’s possible that, thirty years from now, they’ll have receded into the realm of dinosaurs, or the realm of Pokémon, for that matter — fantastical creatures whose names and diets little kids memorize from books. And it’s possible, too, I realized, that it might not even make a difference, that there would still be polar bears on footsy pajamas and sea turtle-shaped gummy vitamins — that there could be so much actual destruction without ever meaningfully upsetting the ecosystems in our minds.

Originally featured in May — read more here.

5. THINKING IN NUMBERS

Daniel Tammet was born with an unusual mind — he was diagnosed with high-functioning autistic savant syndrome, which meant his brain’s uniquely wired circuits made possible such extraordinary feats of computation and memory as learning Icelandic in a single week and reciting the number pi up to the 22,514th digit. He is also among the tiny fraction of people diagnosed with synesthesia — that curious crossing of the senses that causes one to “hear” colors, “smell” sounds, or perceive words and numbers in different hues, shapes, and textures. Synesthesia is incredibly rare — Vladimir Nabokov was among its few famous sufferers — which makes it overwhelmingly hard for the majority of us to imagine precisely what it’s like to experience the world through this sensory lens. Luckily, Tammet offers a fascinating first-hand account in Thinking In Numbers: On Life, Love, Meaning, and Math (public library) — a magnificent collection of 25 essays on “the math of life,” celebrating the magic of possibility in all its dimensions. In the process, he also invites us to appreciate the poetics of numbers, particularly of ordered sets — in other words, the very lists that dominate everything from our productivity tools to our creative inventories to the cheapened headlines flooding the internet.

Reflecting on his second book, Embracing the Wide Sky: A Tour Across the Horizons of the Mind, and the overwhelming response from fascinated readers seeking to know what it’s really like to experience words and numbers as colors and textures — to experience the beauty that a poem and a prime number exert on a synesthete in equal measure — Tammet offers an absorbing simulation of the synesthetic mind:

Imagine.

Close your eyes and imagine a space without limits, or the infinitesimal events that can stir up a country’s revolution. Imagine how the perfect game of chess might start and end: a win for white, or black, or a draw? Imagine numbers so vast that they exceed every atom in the universe, counting with eleven or twelve fingers instead of ten, reading a single book in an infinite number of ways.

Such imagination belongs to everyone. It even possesses its own science: mathematics. Ricardo Nemirovsky and Francesca Ferrara, who specialize in the study of mathematical cognition, write that “like literary fiction, mathematical imagination entertains pure possibilities.” This is the distillation of what I take to be interesting and important about the way in which mathematics informs our imaginative life. Often we are barely aware of it, but the play between numerical concepts saturates the way we experience the world.

Sketches from synesthetic artist and musician Michal Levy’s animated visualization of John Coltrane’s ‘Giant Steps.’

Tammet, above all, is enchanted by the mesmerism of the unknown, which lies at the heart of science and the heart of poetry:

The fact that we have never read an endless book, or counted to infinity (and beyond!) or made contact with an extraterrestrial civilization (all subjects of essays in the book) should not prevent us from wondering: what if? … Literature adds a further dimension to the exploration of those pure possibilities. As Nemirovsky and Ferrara suggest, there are numerous similarities in the patterns of thinking and creating shared by writers and mathematicians (two vocations often considered incomparable).

In fact, this very link between mathematics and fiction, between numbers and storytelling, underpins much of Tammet’s exploration. Growing up as one of nine siblings, he recounts how the oppressive nature of existing as a small number in a large set spurred a profound appreciation of numbers as sensemaking mechanisms for life:

Effaced as individuals, my brothers, sisters, and I existed only in number. The quality of our quantity became something we could not escape. It preceded us everywhere: even in French, whose adjectives almost always follow the noun (but not when it comes to une grande famille). … From my family I learned that numbers belong to life. The majority of my math acumen came not from books but from regular observations and day-to-day interactions. Numerical patterns, I realized, were the matter of our world.

This awareness was the beginning of Tammet’s synesthetic sensibility:

Like colors, the commonest numbers give character, form, and dimension to our world. Of the most frequent — zero and one — we might say that they are like black and white, with the other primary colors — red, blue, and yellow — akin to two, three, and four. Nine, then, might be a sort of cobalt or indigo: in a painting it would contribute shading, rather than shape. We expect to come across samples of nine as we might samples of a color like indigo — only occasionally, and in small and subtle ways. Thus a family of nine children surprises as much as a man or woman with cobalt-colored hair.

Daniel Tammet. Portrait by Jerome Tabet.

Sampling from Jorge Luis Borges’s humorous fictional taxonomy of animals, inspired by the work of nineteenth-century German mathematician Georg Cantor, Tammet points to the deeper insight beneath our efforts to itemize and organize the universe — something Umberto Eco knew when he proclaimed that “the list is the origin of culture” and Susan Sontag intuited when she reflected on why lists appeal to us. Tammet writes:

Borges here also makes several thought-provoking points. First, though a set as familiar to our understanding as that of “animals” implies containment and comprehension, the sheer number of its possible subsets actually swells toward infinity. With their handful of generic labels (“mammal,” “reptile,” “amphibious,” etc.), standard taxonomies conceal this fact. To say, for example, that a flea is tiny, parasitic, and a champion jumper is only to begin to scratch the surface of all its various aspects.

Second, defining a set owes more to art than it does to science. Faced with the problem of a near endless number of potential categories, we are inclined to choose from a few — those most tried and tested within our particular culture. Western descriptions of the set of all elephants privilege subsets like “those that are very large,” and “those possessing tusks,” and even “those possessing an excellent memory,” while excluding other equally legitimate possibilities such as Borges’s “those that at a distance resemble flies,” or the Hindu “those that are considered lucky.”

[…]

Reading Borges invites me to consider the wealth of possible subsets into which my family “set” could be classified, far beyond those that simply point to multiplicity.
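A quick gloss, mine rather than Tammet’s, makes that “swelling” precise: each of a set’s n members is either in or out of any given subset, so an n-member set has 2^n subsets in all.

```latex
% A family of nine admits 2^9 = 512 possible subsets, and Cantor's theorem
% pushes the point to infinity: every set S, finite or not, has a power set
% P(S) strictly larger than S itself.
\[
  |\mathcal{P}(S)| = 2^{|S|}, \qquad 2^9 = 512, \qquad |\mathcal{P}(S)| > |S|.
\]
```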

Tammet circles back to the shared gifts of literature and mathematics, which both help cultivate our capacity for compassion:

Like works of literature, mathematical ideas help expand our circle of empathy, liberating us from the tyranny of a single, parochial point of view. Numbers, properly considered, make us better people.

Originally featured in August — read more here.

6. SMARTER THAN YOU THINK

“The dangerous time when mechanical voices, radios, telephones, take the place of human intimacies, and the concept of being in touch with millions brings a greater and greater poverty in intimacy and human vision,” Anaïs Nin wrote in her diary in 1946, decades before the internet as we know it even existed. Her fear has since been echoed again and again with every incremental advance in technology, often with simplistic arguments about the attrition of attention in the age of digital distraction. But in Smarter Than You Think: How Technology Is Changing Our Minds for the Better (public library), Clive Thompson — one of the finest technology writers I know, with regular bylines for Wired and The New York Times — makes a powerful and rigorously thought-out counterpoint. He argues that our technological tools — from search engines to status updates to sophisticated artificial intelligence that defeats the world’s best chess players — are now inextricably linked to our minds, working in tandem with them and profoundly changing the way we remember, learn, and “act upon that knowledge emotionally, intellectually, and politically,” and that this is a promising rather than perilous thing.

He writes in the introduction:

These tools can make even the amateurs among us radically smarter than we’d be on our own, assuming (and this is a big assumption) we understand how they work. At their best, today’s digital tools help us see more, retain more, communicate more. At their worst, they leave us prey to the manipulation of the toolmakers. But on balance, I’d argue, what is happening is deeply positive. This book is about the transformation.

Page from ‘Charley Harper: An Illustrated Life.’

But Thompson is nothing if not a dimensional thinker with extraordinary sensitivity to the complexities of cultural phenomena. Rather than revisiting painfully familiar and trite-by-overuse notions like distraction and information overload, he examines the deeper dynamics of how these new tools are affecting the way we make sense of the world and of ourselves. Several decades after Vannevar Bush’s now-legendary meditation on how technology will impact our thinking, Thompson reaches even further into the fringes of our cultural sensibility — past the cheap techno-dystopia, past the pollyannaish techno-utopia, and into that intricate and ever-evolving intersection of technology and psychology.

One of his most fascinating and important points has to do with our outsourcing of memory — or, more specifically, our increasingly deft, search-engine-powered skills of replacing the retention of knowledge in our own brains with the on-demand access to knowledge in the collective brain of the internet. Think, for instance, of those moments when you’re trying to recall the name of a movie but only remember certain fragmentary features — the name of the lead actor, the gist of the plot, a song from the soundtrack. Thompson calls this “tip-of-the-tongue syndrome” and points out that, today, you’ll likely be able to reverse-engineer the name of the movie you don’t remember by plugging into Google what you do remember about it. Thompson contextualizes the phenomenon, which isn’t new, then asks the obvious, important question about our culturally unprecedented solutions to it:

Tip-of-the-tongue syndrome is an experience so common that cultures worldwide have a phrase for it. Cheyenne Indians call it navonotootse’a, which means “I have lost it on my tongue”; in Korean it’s hyeu kkedu-te mam-dol-da, which has an even more gorgeous translation: “sparkling at the end of my tongue.” The phenomenon generally lasts only a minute or so; your brain eventually makes the connection. But … when faced with a tip-of-the-tongue moment, many of us have begun to rely instead on the Internet to locate information on the fly. If lifelogging … stores “episodic,” or personal, memories, Internet search engines do the same for a different sort of memory: “semantic” memory, or factual knowledge about the world. When you visit Paris and have a wonderful time drinking champagne at a café, your personal experience is an episodic memory. Your ability to remember that Paris is a city and that champagne is an alcoholic beverage — that’s semantic memory.

[…]

What’s the line between our own, in-brain knowledge and the sea of information around us? Does it make us smarter when we can dip in so instantly? Or dumber with every search?

Vannevar Bush’s ‘memex’ — short for ‘memory index’ — a primitive vision for a personal hard drive for information storage and management.

That concern, of course, is far from unique to our age — from the invention of writing to Alvin Toffler’s Future Shock, new technology has always been a source of paralyzing resistance and apprehension:

Writing — the original technology for externalizing information — emerged around five thousand years ago, when Mesopotamian merchants began tallying their wares using etchings on clay tablets. It emerged first as an economic tool. As with photography and the telephone and the computer, newfangled technologies for communication nearly always emerge in the world of commerce. The notion of using them for everyday, personal expression seems wasteful, risible, or debased. Then slowly it becomes merely lavish, what “wealthy people” do; then teenagers take over and the technology becomes common to the point of banality.

Thompson reminds us of the anecdote, by now itself familiar “to the point of banality,” about Socrates and his admonition that the “technology” of writing would devastate the Greek tradition of debate and dialectic, and would render people incapable of committing anything to memory because “knowledge stored was not really knowledge at all.” He cites Socrates’s parable of the Egyptian god Theuth and how he invented writing, offering it as a gift to the king of Egypt, Thamus, who met the present with defiant indignation:

This discovery of yours will create forgetfulness in the learners’ souls, because they will not use their memories; they will trust to the external written characters and not remember of themselves. The specific which you have discovered is an aid not to memory, but to reminiscence, and you give your disciples not truth, but only the semblance of truth; they will be hearers of many things and will have learned nothing; they will appear to be omniscient and will generally know nothing; they will be tiresome company, having the show of wisdom without the reality.

That resistance endured as technology changed shape, across the Middle Ages and past Gutenberg’s revolution, but it wasn’t without counter-resistance: Those who recorded their knowledge in writing and, eventually, collected it in the form of books argued that it expanded the scope of their curiosity and the ideas they were able to ponder, whereas the mere act of rote memorization made no guarantees of deeper understanding.

Ultimately, however, Thompson points out that Socrates was both right and wrong: It’s true that, with some deliberately cultivated exceptions and neurological outliers, few thinkers today rely on pure memorization and can recite extensive passages of text from memory. But what Socrates failed to see was the extraordinary dot-connecting enabled by access to knowledge beyond what our own heads can hold — because, as Amanda Palmer poignantly put it, “we can only connect the dots that we collect,” and the outsourcing of memory has exponentially enlarged our dot-collections.

With this in mind, Thompson offers a blueprint to this newly developed system of knowledge management in which access is critical:

If you are going to read widely but often read books only once; if you are going to tackle the ever-expanding universe of ideas by skimming and glancing as well as reading deeply; then you are going to rely on the semantic-memory version of gisting. By which I mean, you’ll absorb the gist of what you read but rarely retain the specifics. Later, if you want to mull over a detail, you have to be able to refind a book, a passage, a quote, an article, a concept.

But Thompson argues that despite history’s predictable patterns of resistance followed by adoption and adaptation, there’s something immutably different about our own era:

The history of factual memory has been fairly predictable up until now. With each innovation, we’ve outsourced more information, then worked to make searching more efficient. Yet somehow, the Internet age feels different. Quickly pulling up [the answer to a specific esoteric question] on Google seems different from looking up a bit of trivia in an encyclopedia. It’s less like consulting a book than like asking someone a question, consulting a supersmart friend who lurks within our phones.

And therein lies the magic of the internet — that unprecedented access to humanity’s collective brain. Thompson cites the work of Harvard psychologist Daniel Wegner, who first began exploring this notion of collective rather than individual knowledge in the 1980s by observing how partners in long-term relationships often divide and conquer memory tasks in sharing the household’s administrative duties:

Wegner suspected this division of labor takes place because we have pretty good “metamemory.” We’re aware of our mental strengths and limits, and we’re good at intuiting the abilities of others. Hang around a workmate or a romantic partner long enough and you begin to realize that while you’re terrible at remembering your corporate meeting schedule, or current affairs in Europe, or how big a kilometer is relative to a mile, they’re great at it. So you begin to subconsciously delegate the task of remembering that stuff to them, treating them like a notepad or encyclopedia. In many respects, Wegner noted, people are superior to these devices, because what we lose in accuracy we make up in speed.

[…]

Wegner called this phenomenon “transactive” memory: two heads are better than one. We share the work of remembering, Wegner argued, because it makes us collectively smarter — expanding our ability to understand the world around us.
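To see Wegner’s idea in computational terms, here is a toy Python sketch of transactive memory: two partners hold facts in the domains they own, and a shared metamemory routes each question to whoever is known to hold it. The classes and examples are hypothetical illustrations of the concept, not anything from Thompson’s or Wegner’s work.

```python
# Toy sketch: a couple as two memory stores plus a shared "metamemory"
# that knows who is good at which domain (hypothetical illustration).

class Partner:
    def __init__(self, name):
        self.name = name
        self.memory = {}  # the facts this partner actually retains

    def remember(self, key, fact):
        self.memory[key] = fact

    def recall(self, key):
        return self.memory.get(key)

class Couple:
    def __init__(self, a, b):
        self.partners = {a.name: a, b.name: b}
        self.metamemory = {}  # domain -> name of the partner who owns it

    def assign(self, domain, owner):
        self.metamemory[domain] = owner

    def store(self, domain, key, fact):
        # Delegate the remembering to whichever partner owns the domain.
        self.partners[self.metamemory[domain]].remember(key, fact)

    def lookup(self, domain, key):
        # Neither head holds everything; together they act as one memory.
        return self.partners[self.metamemory[domain]].recall(key)

alice, bob = Partner("alice"), Partner("bob")
household = Couple(alice, bob)
household.assign("meetings", "alice")
household.assign("geography", "bob")
household.store("geography", "km per mile", 1.609)
print(household.lookup("geography", "km per mile"))  # 1.609, bob's department
```

Googling, in Thompson’s framing, simply adds one more partner to the household: the supersmart friend who lurks within our phones.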

This ability to “google” one another’s memory stores, Thompson argues, is the defining feature of our evolving relationship with information — and it’s profoundly shaping our experience of knowledge:

Transactive memory helps explain how we’re evolving in a world of on-tap information.

He illustrates this by turning to the work of Betsy Sparrow, a graduate student of Wegner’s, who conducted a series of experiments demonstrating that when we know a digital tool will store information for us, we’re far less likely to commit it to memory. On the surface, this may appear like the evident and worrisome shrinkage of our mental capacity. But there’s a subtler yet enormously important layer that such techno-dystopian simplifications miss: This very outsourcing of memory requires that we learn what the machine knows — a kind of meta-knowledge that enables us to retrieve the information when we need it. And, reflecting on Sparrow’s findings, Thompson points out that this is neither new nor negative:

We’ve been using transactive memory for millennia with other humans. In everyday life, we are only rarely isolated, and for good reason. For many thinking tasks, we’re dumber and less cognitively nimble if we’re not around other people. Not only has transactive memory not hurt us, it’s allowed us to perform at higher levels, accomplishing acts of reasoning that are impossible for us alone. It wasn’t until recently that computer memory became fast enough to be consulted on the fly, but once it did — with search engines boasting that they return results in tenths of a second — our transactive habits adapted.

Thompson’s most important point, however, has to do with how outsourcing our knowledge to digital tools actually hampers the very process of creative thought, which relies on our ability to connect existing ideas from our mental pool of resources into new combinations, or what the French polymath Henri Poincaré famously termed “sudden illuminations.” Without a mental catalog of materials to mull over and let incubate in our fringe consciousness, our capacity for such illuminations is greatly deflated. Thompson writes:

These eureka moments are familiar to all of us; they’re why we take a shower or go for a walk when we’re stuck on a problem. But this technique works only if we’ve actually got a lot of knowledge about the problem stored in our brains through long study and focus. … You can’t come to a moment of creative insight if you haven’t got any mental fuel. You can’t be googling the info; it’s got to be inside you.

But while this is a valid concern, Thompson doubts that we’re outsourcing too many bits of knowledge and thus curtailing our creativity. He argues, instead, that we’re mostly employing this newly evolved skill to help us sift the meaningful from the meaningless, but we remain just as capable of absorbing that which truly stimulates us:

Evidence suggests that when it comes to knowledge we’re interested in — anything that truly excites us and has meaning — we don’t turn off our memory. Certainly, we outsource when the details are dull, as we now do with phone numbers. These are inherently meaningless strings of information, which offer little purchase on the mind. … It makes sense that our transactive brains would hand this stuff off to machines. But when information engages us — when we really care about a subject — the evidence suggests we don’t turn off our memory at all.

Originally featured in September — read more here.

7. COSMIC APPRENTICE

As if defining what science is and what philosophy is weren’t hard enough, delineating how the two fit together appears a formidable task, one that has spurred rather intense opinions. But that’s precisely what Dorion Sagan, who has previously examined the prehistoric history of sex, braves in the introduction to Cosmic Apprentice: Dispatches from the Edges of Science (public library) as he sets out to explore the intricate ways in which the two fields hang “in a kind of odd balance, watching each other, holding hands”:

The difference between science and philosophy is that the scientist learns more and more about less and less until she knows everything about nothing, whereas a philosopher learns less and less about more and more until he knows nothing about everything. There is truth in this clever crack, but, as Niels Bohr impressed, while the opposite of a trivial truth is false, the opposite of a great truth is another great truth.

I would say that applies to the flip side of the above flip takedown: Science’s eye for detail, buttressed by philosophy’s broad view, makes for a kind of alembic, an antidote to both. This intellectual electrum cuts the cloying taste of idealist and propositional philosophy with the sharp nectar of fact yet softens the edges of a technoscience that has arguably lost both its moral and its epistemological compass, the result in part of its being funded by governments and corporations whose relationship to the search for truth and its open dissemination can be considered problematic at best.

Sagan refutes the popular perception of science as rationally objective, a vessel of capital-T Truth, reminding us that every scientific concept and theory was birthed by a subjective, fallible human mind:

All observations are made from distinct places and times, and in science no less than art or philosophy by particular individuals. … Although philosophy isn’t fiction, it can be more personal, creative and open, a kind of counterbalance for science even as it argues that science, with its emphasis on a kind of impersonal materialism, provides a crucial reality check for philosophy and a tendency to overtheorize that [is] inimical to the scientific spirit. Ideally, in the search for truth, science and philosophy, the impersonal and autobiographical, can “keep each other honest,” in a kind of open circuit. Philosophy as the underdog even may have an advantage, because it’s not supposed to be as advanced as science, nor does it enjoy science’s level of institutional support — or the commensurate heightened risks of being beholden to one’s benefactors.

Like Richard Feynman, who argued tirelessly for the scientist’s responsibility to remain unsure, Sagan echoes the idea that willful ignorance is what drives science and the fear of being wrong is one of its greatest hindrances:

Science’s spirit is philosophical. It is the spirit of questioning, of curiosity, of critical inquiry combined with fact-checking. It is the spirit of being able to admit you’re wrong, of appealing to data, not authority, which does not like to admit it is wrong.

Sagan reflects on his father’s conviction that “the effort to popularize science is a crucial one for society,” one he shared with Richard Feynman, and what made Carl’s words echo as profoundly and timelessly as they do:

Science and philosophy both had a reputation for being dry, but my father helped inject life into the former, partly by speaking in plain English and partly by focusing on the science fiction fantasy of discovering extraterrestrial life.

In that respect, science could learn from philosophy’s intellectual disposition:

Philosophy today, not taught in grade school in the United States, is too often merely an academic pursuit, a handmaiden or apologetics of science, or else a kind of existential protest, a trendy avocation of grad students and the dark-clad coffeehouse set. But philosophy, although it historically gives rise to experimental science, sometimes preserves a distinct mode of sustained questioning that sharply distinguishes it from modern science, which can be too quick to provide answers.

[…]

Philosophy is less cocksure, less already-knowing, or should be, than the pundits’ diatribes that relieve us of the difficulties of not knowing, of carefully weighing, of looking at the other side, of having to think things through for ourselves. Dwell in possibility, wrote Emily Dickinson: Philosophy at its best seems a kind of poetry, not an informational delivery but a dwelling, an opening of our thoughts to the world.

Like Buckminster Fuller, who vehemently opposed specialization, Sagan attests to the synergetic value of intellectual cross-pollination, arguing that true breakthroughs in science require cross-disciplinary connections and that originality consists of linking up ideas whose connection was not previously suspected:

It is true that science requires analysis and that it has fractured into microdisciplines. But because of this, more than ever, it requires synthesis. Science is about connections. Nature no more obeys the territorial divisions of scientific academic disciplines than do continents appear from space to be colored to reflect the national divisions of their human inhabitants. For me, the great scientific satoris, epiphanies, eurekas, and aha! moments are characterized by their ability to connect.

“In disputes upon moral or scientific points,” advised Martine in his wonderful 1866 guide to the art of conversation, “ever let your aim be to come at truth, not to conquer your opponent. So you never shall be at a loss in losing the argument, and gaining a new discovery.” Science, Sagan suggests — at least at its most elegant — is a conversation of constant revision, where each dead end brings to life a new fruitful question:

Theories are not only practical, and wielded like intellectual swords to the death … but beautiful. A good one is worth more than all the ill-gotten hedge fund scraps in the world. A good scientific theory shines its light, revealing the world’s fearful symmetry. And its failure is also a success, as it shows us where to look next.

Supporting Neil deGrasse Tyson’s contention that intelligent design is a philosophy of ignorance, Sagan applies this very paradigm of connection-making to the crux of the age-old science vs. religion debate, painting evolution not as a tool of certitude but as a reminder of our connectedness to everything else:

Connecting humanity with other species in a single process was Darwin’s great natural historical accomplishment. It showed that some of the issues relegated to religion really come under the purview of science. More than just a research program for technoscience, it provides a eureka moment, a subject of contemplation open in principle to all thinking minds. Beyond the squabbles over its mechanisms and modes, evolution’s epiphany derives from its widening of vistas, its showing of the depths of our connections to others from whom we’d thought we were separate. Philosophy, too … in its ancient, scientifico-genic spirit of inquiry so different from a mere, let alone peevish, recounting of facts, needs to be reconnected to science for the latter to fulfill its potential not just as something useful but as a source of numinous moments, deep understanding, and indeed, religious-like epiphanies of cosmic comprehension and aesthetic contemplation.

Originally featured in April — see more here.

8. SOCIAL

“Without the sense of fellowship with men of like mind,” Einstein wrote, “life would have seemed to me empty.” It is perhaps unsurprising that the iconic physicist, celebrated as “the quintessential modern genius,” intuited something fundamental about the inner workings of the human mind and soul long before science itself had attempted to concretize it with empirical evidence. Now, it has: In Social: Why Our Brains Are Wired to Connect (public library), neuroscientist Matthew D. Lieberman, director of UCLA’s Social Cognitive Neuroscience lab, sets out to “get clear about ‘who we are’ as social creatures and to reveal how a more accurate understanding of our social nature can improve our lives and our society.” Lieberman, who has spent the past two decades using tools like fMRI to study how the human brain responds to its social context, has found over and over again that our brains aren’t merely simplistic mechanisms that only respond to pain and pleasure, as philosopher Jeremy Bentham famously claimed, but are instead wired to connect. At the heart of his inquiry is a simple question: Why do we feel such intense agony when we lose a loved one? He argues that, far from being a design flaw in our neural architecture, our capacity for such overwhelming grief is a vital feature of our evolutionary constitution:

The research my wife and I have done over the past decade shows that this response, far from being an accident, is actually profoundly important to our survival. Our brains evolved to experience threats to our social connections in much the same way they experience physical pain. By activating the same neural circuitry that causes us to feel physical pain, our experience of social pain helps ensure the survival of our children by helping to keep them close to their parents. The neural link between social and physical pain also ensures that staying socially connected will be a lifelong need, like food and warmth. Given the fact that our brains treat social and physical pain similarly, should we as a society treat social pain differently than we do? We don’t expect someone with a broken leg to “just get over it.” And yet when it comes to the pain of social loss, this is a common response. The research that I and others have done using fMRI shows that how we experience social pain is at odds with our perception of ourselves. We intuitively believe social and physical pain are radically different kinds of experiences, yet the way our brains treat them suggests that they are more similar than we imagine.

Citing his research, Lieberman affirms the notion that there is no such thing as a nonconformist, pointing out the social construction of what we call our individual “selves” — empirical evidence for what the novelist William Gibson so eloquently termed one’s “personal micro-culture” — and examines “our socially malleable sense of self”:

The neural basis for our personal beliefs overlaps significantly with one of the regions of the brain primarily responsible for allowing other people’s beliefs to influence our own. The self is more of a superhighway for social influence than it is the impenetrable private fortress we believe it to be.

Contextualizing it in a brief evolutionary history, he argues that this osmosis of sociality and individuality is an essential aid in our evolutionary development rather than an aberrant defect in it:

Our sociality is woven into a series of bets that evolution has laid down again and again throughout mammalian history. These bets come in the form of adaptations that are selected because they promote survival and reproduction. These adaptations intensify the bonds we feel with those around us and increase our capacity to predict what is going on in the minds of others so that we can better coordinate and cooperate with them. The pain of social loss and the ways that an audience’s laughter can influence us are no accidents. To the extent that we can characterize evolution as designing our modern brains, this is what our brains were wired for: reaching out to and interacting with others. These are design features, not flaws. These social adaptations are central to making us the most successful species on earth.

The implications of this span across everything from the intimacy of our personal relationships to the intricacy of organizational management and teamwork. But rather than entrusting a single cognitive “social network” with these vital functions, our brains turn out to host many. Lieberman explains:

Just as there are multiple social networks on the Internet such as Facebook and Twitter, each with its own strengths, there are also multiple social networks in our brains, sets of brain regions that work together to promote our social well-being.

These networks each have their own strengths, and they have emerged at different points in our evolutionary history moving from vertebrates to mammals to primates to us, Homo sapiens. Additionally, these same evolutionary steps are recapitulated in the same order during childhood.

He goes on to explore three major adaptations that have made us so inextricably responsive to the social world:

  • Connection: Long before there were any primates with a neocortex, mammals split off from other vertebrates and evolved the capacity to feel social pains and pleasures, forever linking our well-being to our social connectedness. Infants embody this deep need to stay connected, but it is present through our entire lives.
  • Mindreading: Primates have developed an unparalleled ability to understand the actions and thoughts of those around them, enhancing their ability to stay connected and interact strategically. In the toddler years, forms of social thinking develop that outstrip those seen in the adults of any other species. This capacity allows humans to create groups that can implement nearly any idea and to anticipate the needs and wants of those around us, keeping our groups moving smoothly.
  • Harmonizing: The sense of self is one of the most recent evolutionary gifts we have received. Although the self may appear to be a mechanism for distinguishing us from others and perhaps accentuating our selfishness, the self actually operates as a powerful force for social cohesiveness. During the preteen and teenage years, adolescents develop the neural adaptations that allow group beliefs and values to influence our own.

Originally featured in November — see more here, including Lieberman’s fantastic TEDxStLouis talk.

9. GULP

Few writers are able to write about science in a way that’s provocative without being sensationalistic, truthful without being dry, enchanting without being forced — and even fewer are able to do so on subjects that don’t exactly lend themselves to Saganesque whimsy. After all, it’s infinitely easier to inspire awe while discussing the bombastic magnificence of the cosmos than, say, the function of bodily fluids and the structures that secrete them. But Mary Roach is one of those rare writers, and that’s precisely what she proves once more in Gulp: Adventures on the Alimentary Canal (public library) — a fascinating tour of the body’s most private hydraulics.

Roach writes in the introduction:

The early anatomists had that curiosity in spades. They entered the human form like an unexplored continent. Parts were named like elements of geography: the isthmus of the thyroid, the isles of the pancreas, the straits and inlets of the pelvis. The digestive tract was for centuries known as the alimentary canal. How lovely to picture one’s dinner making its way down a tranquil, winding waterway, digestion and excretion no more upsetting or off-putting than a cruise along the Rhine. It’s this mood, these sentiments — the excitement of exploration and the surprises and delights of travel to foreign locales — that I hope to inspire with this book.

It may take some doing. The prevailing attitude is one of disgust. … I remember, for my last book, talking to the public-affairs staff who choose what to stream on NASA TV. The cameras are often parked on the comings and goings of Mission Control. If someone spots a staffer eating lunch at his desk, the camera is quickly repositioned. In a restaurant setting, conviviality distracts us from the biological reality of nutrient intake and oral processing. But a man alone with a sandwich appears as what he is: an organism satisfying a need. As with other bodily imperatives, we’d rather not be watched. Feeding, and even more so its unsavory correlates, are as much taboos as mating and death.

The taboos have worked in my favor. The alimentary recesses hide a lode of unusual stories, mostly unmined. Authors have profiled the brain, the heart, the eyes, the skin, the penis and the female geography, even the hair, but never the gut. The pie hole and the feed chute are mine.

Roach goes on to bring real science to those subjects that make teenagers guffaw and that populate mediocre standup jokes, exploring such bodily mysteries as what flatulence research reveals about death, why tasting has little to do with taste, how thorough chewing can lower the national debt, and why we like the foods we like and loathe the rest.

10. WONDERS OF THE SOLAR SYSTEM

“I know that I am mortal by nature and ephemeral,” ur-astronomer Ptolemy contemplated nearly two millennia ago, “but when I trace at my pleasure the windings to and fro of the heavenly bodies, I no longer touch earth with my feet. I stand in the presence of Zeus himself and take my fill of ambrosia.” But while the cosmos has fascinated humanity since the dawn of time, its mesmerism isn’t that of an abstract other but, rather, the very self-reflexive awareness that Ptolemy attested to, that intimate and inextricable link between the wonders of life here on Earth and the magic we’ve always found in our closest cosmic neighbors.

That’s precisely what modern-day science-enchanter Brian Cox explores in Wonders of the Solar System (public library) — the fantastic and illuminating book based on his BBC series of the same title, celebrating the spirit of exploration. A follow-up to his Wonders of Life, it is every bit as brimming with his signature blend of enthralling storytelling, scientific brilliance, and contagious conviction.

Cox begins by reminding us that preserving the spirit of exploration is both a joy and a moral obligation — especially at a time when it faces tragic threats of indifference and neglect from the very authorities whose job it is to fuel it, despite a citizenry profoundly in love with the ethos of exploration:

[The spirit of exploration] is desperately relevant, an idea so important that celebration is perhaps too weak a word. It is a plea for the spirit of the navigators of the seas and the pioneers of aviation and spaceflight to be restored and cherished; a case made to the viewer and reader that reaching for worlds beyond our grasp is an essential driver of progress and necessary sustenance for the human spirit. Curiosity is the rocket fuel that powers our civilization. If we deny this innate and powerful urge, perhaps because earthly concerns seem more worthy or pressing, then the borders of our intellectual and physical domain will shrink with our ambitions. We are part of a much wider ecosystem, and our prosperity and even long-term survival are contingent on our understanding of it.

But most revelatory of all is Cox’s gift for illustrating what our Earthly phenomena, right here on our seemingly ordinary planet, reveal about the wonders and workings of the Solar System.

Tornadoes, for instance, tell us how our star system was born — the processes that drive these giant rotating storms obey the same physics that caused clumps of matter to form at the center of nebulae five billion years ago, around which the gas cloud collapsed and began spinning ever faster, ordering the chaos, until the early Solar System was churned into existence. This universal principle, known as the conservation of angular momentum, is also what drives a tornado’s destructive spiral.

Cox synthesizes:

This is how our Solar System was born: rather than the whole system collapsing into the Sun, a disc of dust and gas extending billions of kilometers into space formed around the new shining star. In just a few hundred million years, pieces of the cloud collapsed to form planets and moons, and so a star system, our Solar System, was formed. The journey from chaos into order had begun.
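
(For readers who like to see the physics made concrete: conservation of angular momentum means the product L = Iω stays constant as a spinning cloud contracts, and because the moment of inertia I scales with the square of the radius, shrinking the radius spins the system up dramatically. Here is a minimal sketch in Python; the helper spin_up and all its numbers are purely illustrative, not figures from Cox’s book:

    # Conservation of angular momentum: L = I * omega stays constant.
    # For a contracting ring of gas, I ~ m * r^2, so omega ~ 1/r^2.
    # Illustrative numbers only -- nothing here comes from Cox's book.
    def spin_up(omega_initial, r_initial, r_final):
        """Angular speed after contracting from r_initial to r_final."""
        return omega_initial * (r_initial / r_final) ** 2

    print(spin_up(1.0, 1000.0, 1.0))  # a 1000-fold collapse spins 1,000,000x faster

The same scaling is at work in the tornado: as converging air narrows the rotating column, the spin accelerates.)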

Then we have Iceland’s icebergs and glacial lagoons, which offer remarkable insight into the nature of Saturn’s rings. Both shine with puzzling brightness — the lagoons, here on Earth, by bringing pure water that is thousands of years old and free of pollutants up from the seabed as they rise, forming ice crystals of exceptional vibrance; Saturn’s rings, young and ever-changing, by circling icy particles around the planet, constantly crashing them together and breaking them apart, thus exposing bright new facets of ice that catch the sunlight and dazzle amidst a Solar System that is otherwise “a very dirty place.”

Cox explains:

It’s difficult to imagine the scale, beauty and intricacy of Saturn’s rings here on Earth, but the glacial lagoons of Iceland can transport our minds across millions of kilometers of space and help us understand the true nature of the rings. … At first sight, the lagoon appears to be a solid sheet of pristine ice, but this is an illusion. The surface is constantly shifting, an almost organic, ever-changing raft of thousands of individual icebergs floating on the water. The structure of Saturn’s rings is similar, because despite appearances the rings aren’t solid. Each ring is made up of hundreds of ringlets and each ringlet is made up of billions of separate pieces. Captured by Saturn’s gravity, the ring particles independently orbit the planet in an impossibly thin layer.

Cox goes on to explore other such illuminating parallels, from how Alaska’s Lake Eyak illustrates the methane cycles of the universe to what Hawaii’s Big Island tells us about the forces that keep any planet alive to how the volcanic features of India’s Deccan Traps explain why Venus choked to death. He ends with T. S. Eliot’s timeless verses on the spirit of exploration and echoes Neil deGrasse Tyson’s wisdom on your ego and the cosmic perspective, concluding:

You could take the view that our exploration of the Universe has made us somehow insignificant; one tiny planet around one star amongst hundreds of billions. But I don’t take that view, because we’ve discovered that it takes the rarest combination of chance and the laws of Nature to produce a planet that can support a civilization, that most magnificent structure that allows us to explore and understand the Universe. That’s why, for me, our civilization is the wonder of the Solar System, and if you were to be looking at the Earth from outside the Solar System that much would be obvious. We have written the evidence of our existence onto the surface of our planet. Our civilization has become a beacon that identifies our planet as a home to life.

Originally featured in August — see more here.

11. SAVE OUR SCIENCE

“What is crucial is not that technical ability, but it is imagination in all of its applications,” the great E. O. Wilson offered in his timeless advice to young scientists — a conviction shared by some of history’s greatest scientific minds. And yet it is rote memorization and the unimaginative application of technical skill that our dominant education system prioritizes — so it’s no wonder it is failing to produce the Edisons and Curies of our day. In Save Our Science: How to Inspire a New Generation of Scientists, materials scientist, inventor, and longtime Yale professor Ainissa Ramirez takes on a challenge Isaac Asimov presaged a quarter century ago, advocating for the value of science education and critiquing its present failures, with a hopeful and pragmatic eye toward improving its future. She writes in the introduction:

The 21st century requires a new kind of learner — not someone who can simply churn out answers by rote, as has been done in the past, but a student who can think expansively and solve problems resourcefully.

To do that, she argues, we need to replace the traditional academic skills of “reading, ’riting, and ’rithmetic” with creativity, curiosity, critical thinking, and problem-solving. (Though, as psychology has recently revealed, problem-finding might be the more valuable skill.)

Ainissa Ramirez at TED 2012 (Photograph: James Duncan Davidson for TED)

She begins with the basics:

While the acronym STEM sounds very important, STEM answers just three questions: Why does something happen? How can we apply this knowledge in a practical way? How can we describe what is happening succinctly? Through the questions, STEM becomes a pathway to be curious, to create, and to think and figure things out.

Even for those of us who deem STEAM (wherein the A stands for “arts”) superior to STEM, Ramirez’s insights are razor-sharp and consistent with the oft-affirmed idea that creativity relies heavily upon connecting the seemingly disconnected and aligning the seemingly misaligned:

There are two schools of thought on defining creativity: divergent thinking, which is the formation of a creative idea resulting from generating lots of ideas, and a Janusian approach, which is the act of making links between two remote ideas. The latter takes its name from the two-faced Roman god of beginnings, Janus, who was associated with doorways and the idea of looking forward and backward at the same time. Janusian creativity hinges on the belief that the best ideas come from linking things that previously did not seem linkable. Henri Poincaré, a French mathematician, put it this way: ‘To create consists of making new combinations. … The most fertile will often be those formed of elements drawn from domains which are far apart.’

Another element inherent to the scientific process but hardly rewarded, if not punished, in education is the role of ignorance, or what the poet John Keats has eloquently and timelessly termed “negative capability” — the art of brushing up against the unknown and proceeding anyway. Ramirez writes:

My training as a scientist allows me to stare at an unknown and not run away, because I learned that this melding of uncertainty and curiosity is where innovation and creativity occur.

Yet these very qualities are missing from science education in the United States — and it shows. When the Programme for International Student Assessment (PISA) administered its 2006 assessment, the U.S. ranked 35th in math and 29th in science out of the 40 high-income, developed countries surveyed.

Average PISA scores versus expenditures for selected countries (Source: Organisation for Economic Co-operation and Development)

Ramirez offers historical context: When American universities first took root in the colonial days, their primary role was to educate men for the clergy, so science, technology, and math were not a priority. But then Justin Smith Morrill, a little-known congressman from Vermont who had barely completed his high school education, came along in 1861 and quietly but purposefully sponsored legislation that forever changed American education, resulting in more than 70 new colleges and universities that included STEM subjects in their curricula. This catapulted enrollment rates from the mere 2% of the population who attended higher education prior to the Civil War and greatly increased diversity in academia, with the act’s second revision in 1890 extending education opportunities to women and African-Americans.

The growth of U.S. college enrollment from 1869 to 1994. (Source: S. B. Carter et al., Historical Statistics of the United States)

But what really propelled science education, Ramirez notes, was the competitive spirit of the Space Race:

The mixture of being outdone and humiliated motivated the U.S. to create NASA and bolster the National Science Foundation’s budget to support science research and education. Sputnik forced the U.S. to think about its science position and to look hard into a mirror — and the U.S. did not like what it saw. In 1956, before Sputnik, the National Science Foundation’s budget was a modest $15.9 million. In 1958, it tripled to $49.5 million, and it doubled again in 1959 to $132.9 million. The space race was on. We poured resources, infrastructure, and human capital into putting an American on the moon, and with that goal, STEM education became a top priority.

President John F. Kennedy addresses a crowd of 35,000 at Rice University in 1962, proclaiming again his desire to reach the moon with the words, ‘We set sail on this new sea because there is new knowledge to be gained.’ Credit: NASA / Public domain

Ramirez argues for returning to that spirit of science education as an investment in national progress:

The U.S. has a history of changing education to meet the nation’s needs. We need similar innovative forward-thinking legislation now, to prepare our children and our country for the 21st century. Looking at our history allows us to see that we have been here before and prevailed. Let’s meet this challenge, for it will, as Kennedy claimed, draw out the very best in all of us.

In confronting the problems that plague science education and the public’s relationship with scientific culture, Ramirez points to the fact that women account for only 26% of STEM bachelor’s degrees and explores the heart of the glaring gender problem:

[There is a] false presumption that girls are not as good as boys in science and math. This message absolutely pervades our national mindset. Even though girls and boys sit next to each other in class, fewer women choose STEM careers than men. This is the equivalent of a farmer sowing seeds and then harvesting only half of the fields.

The precipitous drop in girls’ enrollment in STEM classes. (Source: J. F. Latimer, What’s Happened To Our High Schools)

In turning toward possible solutions, Ramirez calls out the faulty models of standardized testing, which fail to account for more dimensional definitions of intelligence. She writes:

There is a concept in physics that the observer of an experiment can change the results just by the act of observing (this is called, not surprisingly, the observer effect). For example, knowing the required pressure of your tires and observing that they are overinflated dictates that you let some air out, which changes the pressure slightly.

Although this theory is really for electrons and atoms, we also see it at work in schools. Schools are evaluated, by the federal and state governments, by tests. The students are evaluated by tests administered by the teachers. It is the process of testing that has changed the mission of the school from instilling a wide knowledge of the subject matter to acquiring a good score on the tests.

The United States is one of the most test-taking countries in the world, and the standard weapon is the multiple-choice question. Although multiple-choice tests are efficient in schools, they don’t inspire learning. In fact, they do just the opposite. This is hugely problematic in encouraging the skills needed for success in the 21st century. Standardized testing teaches skills that are counter to skills needed for the future, such as curiosity, problem solving, and having a healthy relationship with failure. Standardized tests draw up a fear of failure, since you seek a specific answer and you will be either right or wrong; they kick problem solving in the teeth, since you never need to show your work and never develop a habit of figuring things out; and they slam the doors to curiosity, since only a small selection of the possible answers is laid out before you. These kinds of tests produce thinkers who are unwilling to stretch and take risks and who cannot handle failure. They crush a sense of wonder.

Like Noam Chomsky, who has questioned why schools train for passing tests rather than for creative inquiry, and Sir Ken Robinson, who has eloquently advocated for changing the factory model of education, Ramirez urges:

While scientists passionately explore, reason, discover, synthesize, compare, contrast, and connect the dots, students drudgingly memorize, watch, and passively consume. Students are exercising the wrong muscle. An infusion of STEM taught in compelling ways will give students an opportunity to acquire these active learning skills.

Ramirez goes on to propose a multitude of small changes and larger shifts that communities, educators, cities, institutions, and policy-makers could implement — from neighborhood maker-spaces to wifi hotspots on school buses to university science festivals to new curricula and testing methods — that would begin to bridge the gap between what science education currently is and what scientific culture could and should be. She concludes, echoing Alvin Toffler’s famous words that “the illiterate of the 21st century will not be those who cannot read and write, but those who cannot learn, unlearn, and relearn”:

The skills of the 21st century need us to create scholars who can link the unlinkable. … Nurturing curious, creative problem solvers who can master the art of figuring things out will make them ready for this unknown brave new world. And that is the best legacy we can possibly leave.

Originally featured in February — see more here.

12. THE ELEMENTS OF EUCLID

Almost a century before Mondrian made his iconic red, yellow, and blue geometric compositions, and around the time that Edward Livingston Youmans was creating his stunning chemistry diagrams, an eccentric 19th-century civil engineer and mathematician named Oliver Byrne produced a striking series of vibrant diagrams in primary colors for an 1847 edition of the legendary Greek mathematical treatise Euclid’s Elements. Byrne, a vehement opponent of pseudoscience with an especial distaste for phrenology, was early to the insight that great design and graphic elegance can powerfully aid learning. He explained that in his edition of Euclid, “coloured diagrams and symbols are used instead of letters for the greater ease of learners.” The book, a masterpiece of Victorian printing and graphic design long before “graphic design” existed as a discipline, is celebrated as one of the most unusual and most beautiful books of the 19th century.

Now, the fine folks of Taschen — who have brought us such visual treasures as the best illustrations from 150 years of Hans Christian Andersen, the life and legacy of infographics godfather Fritz Kahn, and the visual history of magic — are resurrecting Byrne’s gem in the lavish tome The First Six Books of the Elements of Euclid (public library), edited by Swiss polymath Werner Oechslin.

Proof of the Pythagorean theorem

A masterwork of art and science in equal measure, this newly rediscovered treasure mesmerizes the eye with its brightly colored circles, squares, and triangles while it tickles the brain with its mathematical magic.

Originally featured in November — see more here.

13. DOES MY GOLDFISH KNOW WHO I AM?

In 2012, I wrote about a lovely book titled Big Questions from Little People & Simple Answers from Great Minds, in which some of today’s greatest scientists, writers, and philosophers answer kids’ most urgent questions, deceptively simple yet profound. It went on to become one of the year’s best books and among readers’ favorites. A few months later, Gemma Elwin Harris, the editor who had envisioned the project, reached out to invite me to participate in the book’s 2013 edition by answering one randomly assigned question from a curious child. Naturally, I was thrilled to do it, and honored to be a part of something as heartening as Does My Goldfish Know Who I Am? (public library), also among the best children’s books of the year — a compendium of primary school children’s funny, poignant, innocent yet insightful questions about science and how life works, answered by such celebrated minds as rockstar physicist Brian Cox, beloved broadcaster and voice-of-nature Sir David Attenborough, legendary linguist Noam Chomsky, science writer extraordinaire Mary Roach, stat-showman Hans Rosling, Beatle Paul McCartney, biologist and Beagle Project director Karen James, and iconic illustrator Sir Quentin Blake. As was the case with last year’s edition, more than half of the proceeds from the book — which features illustrations by the wonderful Andy Smith — are being donated to a children’s charity.

The questions range from what the purpose of science is to why onions make us cry to whether spiders can speak to why we blink when we sneeze. Psychologist and broadcaster Claudia Hammond, who recently explained the fascinating science of why time slows down when we’re afraid, speeds up as we age, and gets all warped while we’re on vacation in one of the best psychology and philosophy books of 2013, answers the most frequently asked question by the surveyed children: Why do we cry?

It’s normal to cry when you feel upset and until the age of twelve boys cry just as often as girls. But when you think about it, it is a bit strange that salty water spills out from the corners of your eyes just because you feel sad.

One professor noticed people often say that, despite their blotchy faces, a good cry makes them feel better. So he did an experiment where people had to breathe in over a blender full of onions that had just been chopped up. Not surprisingly this made their eyes water. He collected the tears and put them in the freezer. Then he got people to sit in front of a very sad film wearing special goggles which had tiny buckets hanging off the bottom, ready to catch their tears if they cried. The people cried, but the buckets didn’t work and in the end he gathered their tears in tiny test tubes instead.

He found that the tears people cried when they were upset contained extra substances, which weren’t in the tears caused by the onions. So he thinks maybe we feel better because we get rid of these substances by crying and that this is the purpose of tears.

But not everyone agrees. Many psychologists think that the reason we cry is to let other people know that we need their sympathy or help. So crying, provided we really mean it, brings comfort because people are nice to us.

Crying when we’re happy is a bit more of a mystery, but strong emotions have a lot in common, whether happy or sad, so they seem to trigger some of the same processes in the body.

(For a deeper dive into the biological mystery of crying, see the science of sobbing and emotional tearing.)

Joshua Foer, who knows a thing or two about superhuman memory and the limits of our mind, explains to 9-year-old Tom how the brain can store so much information despite being that small:

An adult’s brain only weighs about 1.4 kilograms, but it’s made up of about 100 billion microscopic neurons. Each of those neurons looks like a tiny branching tree, whose limbs reach out and touch other neurons. In fact, each neuron can make between 5,000 and 10,000 connections with other neurons — sometimes even more. That’s more than 500 trillion connections! A memory is essentially a pattern of connections between neurons.

Every sensation that you remember, every thought that you think, transforms your brain by altering the connections within that vast network. By the time you get to the end of this sentence, you will have created a new memory, which means your brain will have physically changed.
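
Foer’s arithmetic is easy to verify: even at the lower bound of 5,000 connections per neuron, 100 billion neurons yield 500 trillion connections. A quick back-of-the-envelope check in Python, using only the figures quoted above:

    # Back-of-the-envelope check of the figures in Foer's answer.
    neurons = 100e9          # about 100 billion neurons in an adult brain
    connections_per = 5_000  # lower end of the 5,000-10,000 range per neuron
    print(f"{neurons * connections_per:.0e} connections")  # -> 5e+14, i.e. 500 trillion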

Neuroscientist Tali Sharot, who has previously studied why our brains are wired for optimism, answers 8-year-old Maia’s question about why we don’t have memories from the time we were babies and toddlers:

We use our brain for memory. In the first few years of our lives, our brain grows and changes a lot, just like the rest of our body. Scientists think that because the parts of our brain that are important for memory have not fully developed when we are babies, we are unable to store memories in the same way that we do when we are older.

Also, when we are very young we do not know how to speak. This makes it difficult to keep events in your mind and remember them later, because we use language to remember what happened in the past.

In answering 8-year-old Hannah’s question about what newspapers do when there is no news, writer and journalist Oliver Burkeman, author of the excellent The Antidote: Happiness for People Who Can’t Stand Positive Thinking, offers a primer on media literacy — an important caveat on news that even we, as alleged grown-ups, frequently forget:

Newspapers don’t really go out and find the news: they decide what gets to count as news. The same goes for television and radio. And you might disagree with their decisions! (For example, journalists are often accused of focusing on bad news and ignoring the good, making the world seem worse than it is.)

The important thing to remember, whenever you’re reading or watching the news, is that someone decided to tell you those things, while leaving out other things. They’re presenting one particular view of the world — not the only one. There’s always another side to the story.

And my answer, to 9-year-old Ottilie’s question about why we have books:

Some people might tell you that books are no longer necessary now that we have the internet. Don’t believe them. Books help us know other people, know how the world works, and, in the process, know ourselves more deeply in a way that has nothing to do with what you read them on and everything to do with the curiosity, integrity and creative restlessness you bring to them.

Books build bridges to the lives of others, both the characters in them and your countless fellow readers across other lands and other eras, and in doing so elevate you and anchor you more solidly into your own life. They give you a telescope into the minds of others, through which you begin to see with ever greater clarity the starscape of your own mind.

And though the body and form of the book will continue to evolve, its heart and soul never will. Though the telescope might change, the cosmic truths it invites you to peer into remain eternal like the Universe.

In many ways, books are the original internet — each fact, each story, each new bit of information can be a hyperlink to another book, another idea, another gateway into the endlessly whimsical rabbit hole of the written word. Just like the web pages you visit most regularly, your physical bookmarks take you back to those book pages you want to return to again and again, to reabsorb and relive, finding new meaning on each visit — because the landscape of your life is different, new, “reloaded” by the very act of living.

Originally featured in November — read more of the questions and answers here.

HONORABLE MENTIONS

The Space Book: From the Beginning to the End of Time, 250 Milestones in the History of Space & Astronomy by Jim Bell, An Appetite for Wonder: The Making of a Scientist by Richard Dawkins, and The Age of Edison: Electric Light and the Invention of Modern America by Ernest Freeberg.


Published December 10, 2013

https://www.themarginalian.org/2013/12/10/best-science-technology-books-2013/
