
The 13 Best Science and Technology Books of 2013

The wonders of the gut, why our brains are wired to be social, what poetry and math have in common, swarm intelligence vs. “God,” and more.

On the heels of the year’s best reads in psychology and philosophy, art and design, history and biography, and children’s books, the season’s subjective selection of best-of reading lists continues with the finest science and technology books of 2013. (For more timeless stimulation, revisit the selections for 2012 and 2011.)

1. THIS EXPLAINS EVERYTHING

Every year since 1998, intellectual impresario and Edge editor John Brockman has been posing a single grand question to some of our time’s greatest thinkers across a wide spectrum of disciplines, then collecting the answers in an annual anthology. Last year’s answers to the question “What scientific concept will improve everybody’s cognitive toolkit?” were released in This Will Make You Smarter: New Scientific Concepts to Improve Your Thinking, one of the year’s best psychology and philosophy books.

In 2012, the question Brockman posed, proposed by none other than Steven Pinker, was “What is your favorite deep, elegant, or beautiful explanation?” The answers, representing an eclectic mix of 192 (alas, overwhelmingly male) minds spanning psychology, quantum physics, social science, political theory, philosophy, and more, are collected in the edited compendium This Explains Everything: Deep, Beautiful, and Elegant Theories of How the World Works (UK; public library) and are also available online.

In the introduction preceding the micro-essays, Brockman frames the question and its ultimate objective, adding to history’s most timeless definitions of science:

The ideas presented on Edge are speculative; they represent the frontiers in such areas as evolutionary biology, genetics, computer science, neurophysiology, psychology, cosmology, and physics. Emerging out of these contributions is a new natural philosophy, new ways of understanding physical systems, new ways of thinking that call into question many of our basic assumptions.

[…]

Perhaps the greatest pleasure in science comes from theories that derive the solution to some deep puzzle from a small set of simple principles in a surprising way. These explanations are called ‘beautiful’ or ‘elegant.’

[…]

The contributions presented here embrace scientific thinking in the broadest sense: as the most reliable way of gaining knowledge about anything — including such fields of inquiry as philosophy, mathematics, economics, history, language, and human behavior. The common thread is that a simple and nonobvious idea is proposed as the explanation of a diverse and complicated set of phenomena.

Puffer fish with Akule by photographer Wayne Levin.

Stanford neuroscientist Robert Sapolsky, eloquent as ever, marvels at the wisdom of the crowd and the emergence of swarm intelligence:

Observe a single ant, and it doesn’t make much sense, walking in one direction, suddenly careening in another for no obvious reason, doubling back on itself. Thoroughly unpredictable.

The same happens with two ants, a handful of ants. But a colony of ants makes fantastic sense. Specialized jobs, efficient means of exploiting new food sources, complex underground nests with temperature regulated within a few degrees. And critically, there’s no blueprint or central source of command—each individual ant has algorithms for its behaviors. But this is not wisdom of the crowd, where a bunch of reasonably informed individuals outperform a single expert. The ants aren’t reasonably informed about the big picture. Instead, the behavior algorithms of each ant consist of a few simple rules for interacting with the local environment and local ants. And out of this emerges a highly efficient colony.

Ant colonies excel at generating trails that connect locations in the shortest possible way, accomplished with simple rules about when to lay down a pheromone trail and what to do when encountering someone else’s trail—approximations of optimal solutions to the Traveling Salesman problem. This has useful applications. In “ant-based routing,” simulations using virtual ants with similar rules can generate optimal ways of connecting the nodes in a network, something of great interest to telecommunications companies. It applies to the developing brain, which must wire up vast numbers of neurons with vaster numbers of connections without constructing millions of miles of connecting axons. And migrating fetal neurons generate an efficient solution with a different version of ant-based routing.

A wonderful example is how local rules about attraction and repulsion (i.e., positive and negative charges) allow simple molecules in an organic soup to occasionally form more complex ones. Life may have originated this way without the requirement of bolts of lightning to catalyze the formation of complex molecules.

And why is self-organization so beautiful to my atheistic self? Because if complex, adaptive systems don’t require a blueprint, they don’t require a blueprint maker. If they don’t require lightning bolts, they don’t require Someone hurling lightning bolts.
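
For the programmatically curious, Sapolsky’s “ant-based routing” is easy to picture with a toy simulation. The sketch below is my own illustration rather than anything from the book or from a real routing system: virtual ants wander a small invented network from node A to node D, deposit pheromone along the paths they complete (more per edge for shorter paths), favor short and well-marked edges on later trips, and let pheromone evaporate, so the shortest route gradually becomes the most heavily reinforced trail.

```python
# Toy ant-based routing: local rules (lay pheromone, prefer marked short edges,
# evaporate) gradually reinforce the shortest route through a small network.
# The graph, weights, and parameters are invented purely for illustration.
import random

GRAPH = {  # node -> {neighbor: distance}
    "A": {"B": 1.0, "C": 4.0},
    "B": {"A": 1.0, "C": 2.0, "D": 1.0},
    "C": {"A": 4.0, "B": 2.0, "D": 1.0},
    "D": {"B": 1.0, "C": 1.0},
}

pheromone = {(u, v): 1.0 for u in GRAPH for v in GRAPH[u]}

def walk(src, dst):
    """One ant wanders from src to dst, choosing edges probabilistically."""
    path, node = [src], src
    while node != dst:
        options = [n for n in GRAPH[node] if n not in path] or list(GRAPH[node])
        weights = [pheromone[(node, n)] / GRAPH[node][n] for n in options]
        node = random.choices(options, weights=weights)[0]
        path.append(node)
    return path

def lay_pheromone(path, amount=1.0):
    """Shorter completed paths deposit more pheromone per edge."""
    length = sum(GRAPH[a][b] for a, b in zip(path, path[1:]))
    for a, b in zip(path, path[1:]):
        pheromone[(a, b)] += amount / length
        pheromone[(b, a)] += amount / length

def evaporate(rate=0.1):
    for edge in pheromone:
        pheromone[edge] *= 1.0 - rate

for _ in range(200):  # many ants, many trips
    lay_pheromone(walk("A", "D"))
    evaporate()

# The most heavily marked edges now trace the short A-B-D route.
print(sorted(pheromone, key=pheromone.get, reverse=True)[:4])
```

No blueprint, no central planner: each virtual ant follows a couple of local rules, and the efficient trail emerges from their accumulated traces, which is the same logic, in miniature, that the passage describes.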

Developmental psychologist Howard Gardner, who famously developed the seminal theory of multiple intelligences, echoes Anaïs Nin in advocating for the role of the individual and Susan Sontag in stressing the impact of individual acts on collective fate. His answer, arguing for the importance of human beings, comes as a welcome antidote to a question that risks being inherently reductionist:

In a planet occupied now by seven billion inhabitants, I am amazed by the difference that one human being can make. Think of classical music without Mozart or Stravinsky; of painting without Caravaggio, Picasso or Pollock; of drama without Shakespeare or Beckett. Think of the incredible contributions of Michelangelo or Leonardo, or, in recent times, the outpouring of deep feeling at the death of Steve Jobs (or, for that matter, Michael Jackson or Princess Diana). Think of human values in the absence of Moses or Christ.

[…]

Despite the laudatory efforts of scientists to ferret out patterns in human behavior, I continue to be struck by the impact of single individuals, or of small groups, working against the odds. As scholars, we cannot and should not sweep these instances under the investigative rug. We should bear in mind anthropologist Margaret Mead’s famous injunction: ‘Never doubt that a small group of thoughtful committed citizens can change the world. It is the only thing that ever has.’

Uber-curator Hans Ulrich Obrist, who also contributed to last year’s volume, considers the parallel role of patterns and chance in the works of iconic composer John Cage and painter Gerhard Richter, and the role of uncertainty in the creative process:

In art, the title of a work can often be its first explanation. And in this context I am thinking especially of the titles of Gerhard Richter. In 2006, when I visited Richter in his studio in Cologne, he had just finished a group of six corresponding abstract paintings which he gave the title Cage.

There are many relations between Richter’s painting and the compositions of John Cage. In a book about the Cage series, Robert Storr has traced them from Richter’s attendance of a Cage performance at the Festum Fluxorum Fluxus in Düsseldorf 1963 to analogies in their artistic processes. Cage has often applied chance procedures in his compositions, notably with the use of the I Ching. Richter in his abstract paintings also intentionally allows effects of chance. In these paintings, he applies the oil paint on the canvas by means of a large squeegee. He selects the colors on the squeegee, but the factual trace that the paint leaves on the canvas is to a large extent the outcome of chance.

[…]

Richter’s concise title, Cage, can be unfolded into an extensive interpretation of these abstract paintings (and of other works)—but, one can say, the short form already contains everything. The title, like an explanation of a phenomenon, unlocks the works, describing their relation to one of the most important cultural figures of the twentieth century, John Cage, who shares with Richter the great themes of chance and uncertainty.

Writer, artist, and designer Douglas Coupland, whose biography of Marshall McLuhan remains indispensable, offers a lyrical meditation on the peculiar odds behind coincidences and déjà vus:

I take comfort in the fact that there are two human moments that seem to be doled out equally and democratically within the human condition—and that there is no satisfying ultimate explanation for either. One is coincidence, the other is déjà vu. It doesn’t matter if you’re Queen Elizabeth, one of the thirty-three miners rescued in Chile, a South Korean housewife or a migrant herder in Zimbabwe—in the span of 365 days you will pretty much have two déjà vus as well as one coincidence that makes you stop and say, “Wow, that was a coincidence.”

The thing about coincidence is that when you imagine the umpteen trillions of coincidences that can happen at any given moment, the fact is that, in practice, coincidences almost never do occur. Coincidences are actually so rare that when they do occur they are, in fact, memorable. This suggests to me that the universe is designed to ward off coincidence whenever possible—the universe hates coincidence—I don’t know why—it just seems to be true. So when a coincidence happens, that coincidence had to work awfully hard to escape the system. There’s a message there. What is it? Look. Look harder. Mathematicians perhaps have a theorem for this, and if they do, it might, by default, be a theorem for something larger than what they think it is.

What’s both eerie and interesting to me about déjà vus is that they occur almost like metronomes throughout our lives, about one every six months, a poetic timekeeping device that, at the very least, reminds us we are alive. I can safely assume that my thirteen-year-old niece, Stephen Hawking and someone working in a Beijing luggage-making factory each experience two déjà vus a year. Not one. Not three. Two.

The underlying biodynamics of déjà vus is probably ascribable to some sort of tingling neurons in a certain part of the brain, yet this doesn’t tell us why they exist. They seem to me to be a signal from a larger point of view that wants to remind us that our lives are distinct, that they have meaning, and that they occur throughout a span of time. We are important, and what makes us valuable to the universe is our sentience and our curse and blessing of perpetual self-awareness.

Originally featured in January — read more here.

2. YOU ARE STARDUST

“Everyone you know, everyone you ever heard of, every human being who ever was … lived there — on a mote of dust suspended in a sunbeam,” Carl Sagan famously marveled in his poetic Pale Blue Dot monologue, titled after the iconic 1990 photograph of Earth. The stardust metaphor for our interconnection with the cosmos soon permeated popular culture and became a vehicle for the allure of space exploration. There’s something at once incredibly empowering and incredibly humbling in knowing that the flame in your fireplace came from the sun.

That’s precisely the kind of cosmic awe environmental writer Elin Kelsey and Toronto-based Korean artist Soyeon Kim seek to inspire in kids in You Are Stardust (public library) — an exquisite picture-book that instills that profound sense of connection with the natural world and is also among the best children’s books of the year. Underpinning the narrative is a bold sense of optimism — a refreshing antidote to the fear-appeal strategy plaguing most environmental messages today.

Kim’s breathtaking dioramas, to which this screen does absolutely no justice, mix tactile physical materials with fine drawing techniques and digital compositing to illuminate the relentlessly wondrous realities of our intertwined existence: The water in your sink once quenched the thirst of dinosaurs; with every sneeze, wind blasts out of your nose faster than a cheetah’s sprint; the electricity that powers every thought in your brain is stronger than lightning.

But rather than dry science trivia, the message is carried on the wings of poetic admiration for these intricate relationships:

Be still. Listen.

Like you, the Earth breathes.

Your breath is alive with the promise of flowers.

Each time you blow a kiss to the world, you spread pollen that might grow to be a new plant.

The book is nonetheless grounded in real science. Kelsey notes:

I wrote this book as a celebration — one to honor the extraordinary ways in which all of us simply are nature. Every example in this book is backed by current science. Every day, for instance, you breathe in more than a million pollen grains.

But what makes the project particularly exciting is that, in the face of the devastating gender gap in science education, here is a thoughtful, beautiful piece of early science education presented by two women, the most heartening such example since Lauren Redniss’s Radioactive.

A companion iPad app features sound effects, animation, an original score by Paul Aucoin, behind-the-scenes glimpses of Kim’s process in creating her stunning 3D dioramas, and even build-your-own-diorama adventures.

Originally featured in March — see more here.

3. ON LOOKING

“How we spend our days,” Annie Dillard wrote in her timelessly beautiful meditation on presence over productivity, “is, of course, how we spend our lives.” And nowhere do we fail at the art of presence more miserably and more tragically than in urban life — in the city, high on the cult of productivity, where we float past each other, past the buildings and trees and the little boy in the purple pants, past life itself, cut off from the breathing of the world by iPhone earbuds and solipsism. And yet: “The art of seeing has to be learned,” Marguerite Duras reverberates — and it can be learned, as cognitive scientist Alexandra Horowitz invites us to believe in her breathlessly wonderful On Looking: Eleven Walks with Expert Eyes (public library), also among the best psychology and philosophy books of the year — a record of her quest to walk around a city block with eleven different “experts,” from an artist to a geologist to a dog, and emerge with fresh eyes mesmerized by the previously unseen fascinations of a familiar world. It is undoubtedly one of the most stimulating books of the year, if not the decade, and the most enchanting thing I’ve read in ages. In a way, it’s the opposite but equally delightful mirror image of Christoph Niemann’s Abstract City — a concrete, immersive examination of urbanity — blending the mindfulness of Sherlock Holmes with the expansive sensitivity of Thoreau.

Horowitz begins by pointing our attention to the incompleteness of our experience of what we conveniently call “reality”:

Right now, you are missing the vast majority of what is happening around you. You are missing the events unfolding in your body, in the distance, and right in front of you.

By marshaling your attention to these words, helpfully framed in a distinct border of white, you are ignoring an unthinkably large amount of information that continues to bombard all of your senses: the hum of the fluorescent lights, the ambient noise in a large room, the places your chair presses against your legs or back, your tongue touching the roof of your mouth, the tension you are holding in your shoulders or jaw, the map of the cool and warm places on your body, the constant hum of traffic or a distant lawn-mower, the blurred view of your own shoulders and torso in your peripheral vision, a chirp of a bug or whine of a kitchen appliance.

This adaptive ignorance, she argues, is there for a reason — we celebrate it as “concentration” and welcome its way of easing our cognitive overload by allowing us to conserve our precious mental resources only for the stimuli of immediate and vital importance, and to dismiss or entirely miss all else. (“Attention is an intentional, unapologetic discriminator,” Horowitz tells us. “It asks what is relevant right now, and gears us up to notice only that.”) But while this might make us more efficient in our goal-oriented day-to-day, it also makes us inhabit a largely unlived — and unremembered — life, day in and day out.

For Horowitz, the awakening to this incredible, invisible backdrop of life came thanks to Pumpernickel, her “curly haired, sage mixed breed” (who also inspired Horowitz’s first book, the excellent Inside of a Dog: What Dogs See, Smell, and Know), as she found herself taking countless walks around the block, becoming more and more aware of the dramatically different experiences she and her canine companion were having along the exact same route:

Minor clashes between my dog’s preferences as to where and how a walk should proceed and my own indicated that I was experiencing almost an entirely different block than my dog. I was paying so little attention to most of what was right before us that I had become a sleepwalker on the sidewalk. What I saw and attended to was exactly what I expected to see; what my dog showed me was that my attention invited along attention’s companion: inattention to everything else.

The book was her answer to the disconnect, an effort to “attend to that inattention.” It is not, she warns us, “about how to bring more focus to your reading of Tolstoy or how to listen more carefully to your spouse.” Rather, it is an invitation to the art of observation:

Together, we became investigators of the ordinary, considering the block — the street and everything on it—as a living being that could be observed.

In this way, the familiar becomes unfamiliar, and the old the new.

Her approach is based on two osmotic human tendencies: our shared capacity to truly see what is in front of us, despite our conditioned concentration that obscures it, and the power of individual bias in perception — or what we call “expertise,” acquired by passion or training or both — in bringing attention to elements that elude the rest of us. What follows is a whirlwind of endlessly captivating exercises in attentive bias as Horowitz, with her archetypal New Yorker’s “special fascination with the humming life-form that is an urban street,” and her diverse companions take to the city.

First, she takes a walk all by herself, trying to note everything observable, and we quickly realize that besides her deliciously ravenous intellectual curiosity, Horowitz is a rare magician with language. (“The walkers trod silently; the dogs said nothing. The only sound was the hum of air conditioners,” she beholds her own block; passing a pile of trash bags graced by a stray Q-tip, she ponders parenthetically, “how does a Q-tip escape?”; turning her final corner, she gazes at the entrance of a mansion and “its pair of stone lions waiting patiently for royalty that never arrives.” Stunning.)

But as soon as she joins her experts, Horowitz is faced with the grimacing awareness that despite her best, most Sherlockian efforts, she was “missing pretty much everything.” She arrives at a newfound, profound understanding of what William James meant when he wrote, “My experience is what I agree to attend to. Only those items which I notice shape my mind”:

I would find myself at once alarmed, delighted, and humbled at the limitations of my ordinary looking. My consolation is that this deficiency of mine is quite human. We see, but we do not see: we use our eyes, but our gaze is glancing, frivolously considering its object. We see the signs, but not their meanings. We are not blinded, but we have blinders.

Originally featured in August, with a closer look at the expert insights. For another peek at this gem, which is easily among my top three favorite books of the past decade, learn how to do the step-and-slide.

4. WILD ONES

Wild Ones: A Sometimes Dismaying, Weirdly Reassuring Story About Looking at People Looking at Animals in America (public library) by journalist Jon Mooallem isn’t the typical story designed to make us better by making us feel bad, to scare us into behaving, into environmental empathy; Mooallem’s is not the self-righteous tone of capital-K knowing typical of many environmental activists but the scientist’s disposition of not-knowing, the poet’s penchant for “negative capability.” Rather than ready-bake answers, he offers instead directions of thought and signposts for curiosity and, in the process, somehow gently moves us a little bit closer to our better selves, to a deep sense of, as poet Diane Ackerman beautifully put it in 1974, “the plain everythingness of everything, in cahoots with the everythingness of everything else.”

In the introduction, Mooallem recalls looking at his four-year-old daughter Isla’s menagerie of stuffed animals and the odd cultural disconnect they mime:

[T]hey were foraging on the pages of every bedtime story, and my daughter was sleeping in polar bear pajamas under a butterfly mobile with a downy snow owl clutched to her chin. Her comb handle was a fish. Her toothbrush handle was a whale. She cut her first tooth on a rubber giraffe.

Our world is different, zoologically speaking — less straightforward and more grisly. We are living in the eye of a great storm of extinction, on a planet hemorrhaging living things so fast that half of its nine million species could be gone by the end of the century. At my place, the teddy bears and giggling penguins kept coming. But I didn’t realize the lengths to which humankind now has to go to keep some semblance of actual wildlife in the world. As our own species has taken over, we’ve tried to retain space for at least some of the others being pushed aside, shoring up their chances of survival. But the threats against them keep multiplying and escalating. Gradually, America’s management of its wild animals has evolved, or maybe devolved, into a surreal kind of performance art.

Yet even conservationists’ small successes — crocodile species bouncing back from the brink of extinction, peregrine falcons filling the skies once again — even these pride points demonstrate the degree to which we’ve assumed — usurped, even — a puppeteer role in the theater of organic life. Citing a scientist who lamented that “right now, nature is unable to stand on its own,” Mooallem writes:

We’ve entered what some scientists are calling the Anthropocene — a new geologic epoch in which human activity, more than any other force, steers change on the planet. Just as we’re now causing the vast majority of extinctions, the vast majority of endangered species will only survive if we keep actively rigging the world around them in their favor. … We are gardening the wilderness. The line between conservation and domestication has blurred.

He finds himself uncomfortably straddling these two animal worlds — the idyllic little-kid’s dreamland and the messy, fragile ecosystem of the real world:

Once I started looking around, I noticed the same kind of secondhand fauna that surrounds my daughter embellishing the grown-up world, too — not just the conspicuous bald eagle on flagpoles and currency, or the big-cat and raptor names we give sports teams and computer operating systems, but the whale inexplicably breaching in the life-insurance commercial, the glass dolphin dangling from a rearview mirror, the owl sitting on the rump of a wild boar silk-screened on a hipster’s tote bag. I spotted wolf after wolf airbrushed on the sides of old vans, and another wolf, painted against a full moon on purple velvet, greeting me over the toilet in a Mexican restaurant bathroom. … [But] maybe we never outgrow the imaginary animal kingdom of childhood. Maybe it’s the one we are trying to save.

[…]

From the very beginning, America’s wild animals have inhabited the terrain of our imagination just as much as they’ve inhabited the actual land. They are free-roaming Rorschachs, and we are free to spin whatever stories we want about them. The wild animals always have no comment.

So he sets out to better understand the dynamics of the cultural forces that pull these worlds together with shared abstractions and rip them apart with the brutal realities of environmental collapse. His quest, in which little Isla is a frequent companion, sends him on the trails of three endangered species — a bear, a butterfly, and a bird — which fall on three different points on the spectrum of conservation reliance, relying to various degrees on the mercy of the very humans who first disrupted “the machinery of their wildness.” On the way, he encounters a remarkably vibrant cast of characters — countless passionate citizen scientists, a professional theater actor who, after an HIV diagnosis, became a professional butterfly enthusiast, and even Martha Stewart — and finds in their relationship with the environment “the same creeping disquiet about the future” that Mooallem himself came to know when he became a father. In fact, the entire project was inextricably linked to his sense of fatherly responsibility:

I’m part of a generation that seems especially resigned to watching things we encountered in childhood disappear: landline telephones, newspapers, fossil fuels. But leaving your kids a world without wild animals feels like a special tragedy, even if it’s hard to rationalize why it should.

The truth is that most of us will never experience the Earth’s endangered animals as anything more than beautiful ideas. They are figments of our shared imagination, recognizable from TV, but stalking places — places out there — to which we have no intention of going. I wondered how that imaginative connection to wildlife might fray or recalibrate as we’re forced to take more responsibility for its wildness.

It also occurred to me early on that all three endangered species I was getting to know could be gone by the time Isla is my age. It’s possible that, thirty years from now, they’ll have receded into the realm of dinosaurs, or the realm of Pokémon, for that matter — fantastical creatures whose names and diets little kids memorize from books. And it’s possible, too, I realized, that it might not even make a difference, that there would still be polar bears on footsy pajamas and sea turtle-shaped gummy vitamins — that there could be so much actual destruction without ever meaningfully upsetting the ecosystems in our minds.

Originally featured in May — read more here.

5. THINKING IN NUMBERS

Daniel Tammet was born with an unusual mind — he was diagnosed with high-functioning autistic savant syndrome, which meant his brain’s uniquely wired circuits made possible such extraordinary feats of computation and memory as learning Icelandic in a single week and reciting the number pi up to the 22,514th digit. He is also among the tiny fraction of people diagnosed with synesthesia — that curious crossing of the senses that causes one to “hear” colors, “smell” sounds, or perceive words and numbers in different hues, shapes, and textures. Synesthesia is incredibly rare — Vladimir Nabokov was among its few famous sufferers — which makes it overwhelmingly hard for the majority of us to imagine precisely what it’s like to experience the world through this sensory lens. Luckily, Tammet offers a fascinating first-hand account in Thinking In Numbers: On Life, Love, Meaning, and Math (public library) — a magnificent collection of 25 essays on “the math of life,” celebrating the magic of possibility in all its dimensions. In the process, he also invites us to appreciate the poetics of numbers, particularly of ordered sets — in other words, the very lists that dominate everything from our productivity tools to our creative inventories to the cheapened headlines flooding the internet.

Reflecting on his second book, Embracing the Wide Sky: A Tour Across the Horizons of the Mind, and the overwhelming response from fascinated readers seeking to know what it’s really like to experience words and numbers as colors and textures — to experience the beauty that a poem and a prime number exert on a synesthete in equal measure — Tammet offers an absorbing simulation of the synesthetic mind:

Imagine.

Close your eyes and imagine a space without limits, or the infinitesimal events that can stir up a country’s revolution. Imagine how the perfect game of chess might start and end: a win for white, or black, or a draw? Imagine numbers so vast that they exceed every atom in the universe, counting with eleven or twelve fingers instead of ten, reading a single book in an infinite number of ways.

Such imagination belongs to everyone. It even possesses its own science: mathematics. Ricardo Nemirovsky and Francesca Ferrara, who specialize in the study of mathematical cognition, write that “like literary fiction, mathematical imagination entertains pure possibilities.” This is the distillation of what I take to be interesting and important about the way in which mathematics informs our imaginative life. Often we are barely aware of it, but the play between numerical concepts saturates the way we experience the world.

Sketches from synesthetic artist and musician Michal Levy’s animated visualization of John Coltrane’s ‘Giant Steps.’

Tammet, above all, is enchanted by the mesmerism of the unknown, which lies at the heart of science and the heart of poetry:

The fact that we have never read an endless book, or counted to infinity (and beyond!) or made contact with an extraterrestrial civilization (all subjects of essays in the book) should not prevent us from wondering: what if? … Literature adds a further dimension to the exploration of those pure possibilities. As Nemirovsky and Ferrara suggest, there are numerous similarities in the patterns of thinking and creating shared by writers and mathematicians (two vocations often considered incomparable).

In fact, this very link between mathematics and fiction, between numbers and storytelling, underpins much of Tammet’s exploration. Growing up as one of nine siblings, he recounts how the oppressive nature of existing as a small number in a large set spurred a profound appreciation of numbers as sensemaking mechanisms for life:

Effaced as individuals, my brothers, sisters, and I existed only in number. The quality of our quantity became something we could not escape. It preceded us everywhere: even in French, whose adjectives almost always follow the noun (but not when it comes to une grande famille). … From my family I learned that numbers belong to life. The majority of my math acumen came not from books but from regular observations and day-to-day interactions. Numerical patterns, I realized, were the matter of our world.

This awareness was the beginning of Tammet’s synesthetic sensibility:

Like colors, the commonest numbers give character, form, and dimension to our world. Of the most frequent — zero and one — we might say that they are like black and white, with the other primary colors — red, blue, and yellow — akin to two, three, and four. Nine, then, might be a sort of cobalt or indigo: in a painting it would contribute shading, rather than shape. We expect to come across samples of nine as we might samples of a color like indigo—only occasionally, and in small and subtle ways. Thus a family of nine children surprises as much as a man or woman with cobalt-colored hair.

Daniel Tammet. Portrait by Jerome Tabet.

Sampling from Jorge Luis Borges’s humorous fictional taxonomy of animals, inspired by the work of nineteenth-century German mathematician Georg Cantor, Tammet points to the deeper insight beneath our efforts to itemize and organize the universe — something Umberto Eco knew when he proclaimed that “the list is the origin of culture” and Susan Sontag intuited when she reflected on why lists appeal to us. Tammet writes:

Borges here also makes several thought-provoking points. First, though a set as familiar to our understanding as that of “animals” implies containment and comprehension, the sheer number of its possible subsets actually swells toward infinity. With their handful of generic labels (“mammal,” “reptile,” “amphibious,” etc.), standard taxonomies conceal this fact. To say, for example, that a flea is tiny, parasitic, and a champion jumper is only to begin to scratch the surface of all its various aspects.

Second, defining a set owes more to art than it does to science. Faced with the problem of a near endless number of potential categories, we are inclined to choose from a few — those most tried and tested within our particular culture. Western descriptions of the set of all elephants privilege subsets like “those that are very large,” and “those possessing tusks,” and even “those possessing an excellent memory,” while excluding other equally legitimate possibilities such as Borges’s “those that at a distance resemble flies,” or the Hindu “those that are considered lucky.”

[…]

Reading Borges invites me to consider the wealth of possible subsets into which my family “set” could be classified, far beyond those that simply point to multiplicity.
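
A brief aside on the arithmetic behind that “wealth of possible subsets” (my gloss, not Tammet’s): a finite set of n members has 2^n subsets, so the number of ways to carve up even a small collection grows explosively with its size.

$$
|\mathcal{P}(S)| = 2^{|S|}, \qquad \text{e.g. a nine-member set already admits } 2^{9} = 512 \text{ subsets.}
$$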

Tammet circles back to the shared gifts of literature and mathematics, which both help cultivate our capacity for compassion:

Like works of literature, mathematical ideas help expand our circle of empathy, liberating us from the tyranny of a single, parochial point of view. Numbers, properly considered, make us better people.

Originally featured in August — read more here.

6. SMARTER THAN YOU THINK

“The dangerous time when mechanical voices, radios, telephones, take the place of human intimacies, and the concept of being in touch with millions brings a greater and greater poverty in intimacy and human vision,” Anaïs Nin wrote in her diary in 1946, decades before the internet as we know it even existed. Her fear has since been echoed again and again with every incremental advance in technology, often with simplistic arguments about the attrition of attention in the age of digital distraction. But in Smarter Than You Think: How Technology Is Changing Our Minds for the Better (public library), Clive Thompson — one of the finest technology writers I know, with regular bylines for Wired and The New York Times — makes a powerful and rigorously thought-out counterpoint. He argues that our technological tools — from search engines to status updates to sophisticated artificial intelligence that defeats the world’s best chess players — are now inextricably linked to our minds, working in tandem with them and profoundly changing the way we remember, learn, and “act upon that knowledge emotionally, intellectually, and politically,” and this is a promising rather than perilous thing.

He writes in the introduction:

These tools can make even the amateurs among us radically smarter than we’d be on our own, assuming (and this is a big assumption) we understand how they work. At their best, today’s digital tools help us see more, retain more, communicate more. At their worst, they leave us prey to the manipulation of the toolmakers. But on balance, I’d argue, what is happening is deeply positive. This book is about the transformation.

Page from ‘Charley Harper: An Illustrated Life.’

But Thompson is nothing if not a dimensional thinker with extraordinary sensitivity to the complexities of cultural phenomena. Rather than revisiting painfully familiar and trite-by-overuse notions like distraction and information overload, he examines the deeper dynamics of how these new tools are affecting the way we make sense of the world and of ourselves. Several decades after Vannevar Bush’s now-legendary meditation on how technology will impact our thinking, Thompson reaches even further into the fringes of our cultural sensibility — past the cheap techno-dystopia, past the pollyannaish techno-utopia, and into that intricate and ever-evolving intersection of technology and psychology.

One of his most fascinating and important points has to do with our outsourcing of memory — or, more specifically, our increasingly deft, search-engine-powered skills of replacing the retention of knowledge in our own brains with the on-demand access to knowledge in the collective brain of the internet. Think, for instance, of those moments when you’re trying to recall the name of a movie but only remember certain fragmentary features — the name of the lead actor, the gist of the plot, a song from the soundtrack. Thompson calls this “tip-of-the-tongue syndrome” and points out that, today, you’ll likely be able to reverse-engineer the name of the movie you don’t remember by plugging into Google what you do remember about it. Thompson contextualizes the phenomenon, which isn’t new, then asks the obvious, important question about our culturally unprecedented solutions to it:

Tip-of-the-tongue syndrome is an experience so common that cultures worldwide have a phrase for it. Cheyenne Indians call it navonotootse’a, which means “I have lost it on my tongue”; in Korean it’s hyeu kkedu-te mam-dol-da, which has an even more gorgeous translation: “sparkling at the end of my tongue.” The phenomenon generally lasts only a minute or so; your brain eventually makes the connection. But … when faced with a tip-of-the-tongue moment, many of us have begun to rely instead on the Internet to locate information on the fly. If lifelogging … stores “episodic,” or personal, memories, Internet search engines do the same for a different sort of memory: “semantic” memory, or factual knowledge about the world. When you visit Paris and have a wonderful time drinking champagne at a café, your personal experience is an episodic memory. Your ability to remember that Paris is a city and that champagne is an alcoholic beverage — that’s semantic memory.

[…]

What’s the line between our own, in-brain knowledge and the sea of information around us? Does it make us smarter when we can dip in so instantly? Or dumber with every search?

Vannevar Bush’s ‘memex’ — short for ‘memory index’ — a primitive vision for a personal hard drive for information storage and management.

That concern, of course, is far from unique to our age — from the invention of writing to Alvin Toffler’s Future Shock, new technology has always been a source of paralyzing resistance and apprehension:

Writing — the original technology for externalizing information — emerged around five thousand years ago, when Mesopotamian merchants began tallying their wares using etchings on clay tablets. It emerged first as an economic tool. As with photography and the telephone and the computer, newfangled technologies for communication nearly always emerge in the world of commerce. The notion of using them for everyday, personal expression seems wasteful, risible, or debased. Then slowly it becomes merely lavish, what “wealthy people” do; then teenagers take over and the technology becomes common to the point of banality.

Thompson reminds us of the anecdote, by now itself familiar “to the point of banality,” about Socrates and his admonition that the “technology” of writing would devastate the Greek tradition of debate and dialectic, and would render people incapable of committing anything to memory because “knowledge stored was not really knowledge at all.” He cites Socrates’s parable of the Egyptian god Theuth and how he invented writing, offering it as a gift to the king of Egypt, Thamus, who met the present with defiant indignation:

This discovery of yours will create forgetfulness in the learners’ souls, because they will not use their memories; they will trust to the external written characters and not remember of themselves. The specific which you have discovered is an aid not to memory, but to reminiscence, and you give your disciples not truth, but only the semblance of truth; they will be hearers of many things and will have learned nothing; they will appear to be omniscient and will generally know nothing; they will be tiresome company, having the show of wisdom without the reality.

That resistance endured as technology changed shape, across the Middle Ages and past Gutenberg’s revolution, but it wasn’t without counter-resistance: Those who recorded their knowledge in writing and, eventually, collected it in the form of books argued that it expanded the scope of their curiosity and the ideas they were able to ponder, whereas the mere act of rote memorization made no guarantees of deeper understanding.

Ultimately, however, Thompson points out that Socrates was both right and wrong: It’s true that, with some deliberately cultivated exceptions and neurological outliers, few thinkers today rely on pure memorization and can recite extensive passages of text from memory. But what Socrates failed to see was the extraordinary dot-connecting enabled by access to knowledge beyond what our own heads can hold — because, as Amanda Palmer poignantly put it, “we can only connect the dots that we collect,” and the outsourcing of memory has exponentially enlarged our dot-collections.

With this in mind, Thompson offers a blueprint to this newly developed system of knowledge management in which access is critical:

If you are going to read widely but often read books only once; if you are going to tackle the ever-expanding universe of ideas by skimming and glancing as well as reading deeply; then you are going to rely on the semantic-memory version of gisting. By which I mean, you’ll absorb the gist of what you read but rarely retain the specifics. Later, if you want to mull over a detail, you have to be able to refind a book, a passage, a quote, an article, a concept.

But Thompson argues that despite history’s predictable patterns of resistance followed by adoption and adaptation, there’s something immutably different about our own era:

The history of factual memory has been fairly predictable up until now. With each innovation, we’ve outsourced more information, then worked to make searching more efficient. Yet somehow, the Internet age feels different. Quickly pulling up [the answer to a specific esoteric question] on Google seems different from looking up a bit of trivia in an encyclopedia. It’s less like consulting a book than like asking someone a question, consulting a supersmart friend who lurks within our phones.

And therein lies the magic of the internet — that unprecedented access to humanity’s collective brain. Thompson cites the work of Harvard psychologist Daniel Wegner, who first began exploring this notion of collective rather than individual knowledge in the 1980s by observing how partners in long-term relationships often divide and conquer memory tasks in sharing the household’s administrative duties:

Wegner suspected this division of labor takes place because we have pretty good “metamemory.” We’re aware of our mental strengths and limits, and we’re good at intuiting the abilities of others. Hang around a workmate or a romantic partner long enough and you begin to realize that while you’re terrible at remembering your corporate meeting schedule, or current affairs in Europe, or how big a kilometer is relative to a mile, they’re great at it. So you begin to subconsciously delegate the task of remembering that stuff to them, treating them like a notepad or encyclopedia. In many respects, Wegner noted, people are superior to these devices, because what we lose in accuracy we make up in speed.

[…]

Wegner called this phenomenon “transactive” memory: two heads are better than one. We share the work of remembering, Wegner argued, because it makes us collectively smarter — expanding our ability to understand the world around us.

This ability to “google” one another’s memory stores, Thompson argues, is the defining feature of our evolving relationship with information — and it’s profoundly shaping our experience of knowledge:

Transactive memory helps explain how we’re evolving in a world of on-tap information.

He illustrates this by turning to the work of Betsy Sparrow, a graduate student of Wegner’s, who conducted a series of experiments demonstrating that when we know a digital tool will store information for us, we’re far less likely to commit it to memory. On the surface, this may appear like the evident and worrisome shrinkage of our mental capacity. But there’s a subtler yet enormously important layer that such techno-dystopian simplifications miss: This very outsourcing of memory requires that we learn what the machine knows — a kind of meta-knowledge that enables us to retrieve the information when we need it. And, reflecting on Sparrow’s findings, Thompson points out that this is neither new nor negative:

We’ve been using transactive memory for millennia with other humans. In everyday life, we are only rarely isolated, and for good reason. For many thinking tasks, we’re dumber and less cognitively nimble if we’re not around other people. Not only has transactive memory not hurt us, it’s allowed us to perform at higher levels, accomplishing acts of reasoning that are impossible for us alone. It wasn’t until recently that computer memory became fast enough to be consulted on the fly, but once it did — with search engines boasting that they return results in tenths of a second — our transactive habits adapted.

Thompson’s most important point, however, has to do with how outsourcing our knowledge to digital tools actually hampers the very process of creative thought, which relies on our ability to connect existing ideas from our mental pool of resources into new combinations, or what the French polymath Henri Poincaré famously termed “sudden illuminations.” Without a mental catalog of materials to mull over and let incubate in our fringe consciousness, our capacity for such illuminations is greatly deflated. Thompson writes:

These eureka moments are familiar to all of us; they’re why we take a shower or go for a walk when we’re stuck on a problem. But this technique works only if we’ve actually got a lot of knowledge about the problem stored in our brains through long study and focus. … You can’t come to a moment of creative insight if you haven’t got any mental fuel. You can’t be googling the info; it’s got to be inside you.

But while this is a valid concern, Thompson doubts that we’re outsourcing too many bits of knowledge and thus curtailing our creativity. He argues, instead, that we’re mostly employing this newly evolved skill to help us sift the meaningful from the meaningless, but we remain just as capable of absorbing that which truly stimulates us:

Evidence suggests that when it comes to knowledge we’re interested in — anything that truly excites us and has meaning — we don’t turn off our memory. Certainly, we outsource when the details are dull, as we now do with phone numbers. These are inherently meaningless strings of information, which offer little purchase on the mind. … It makes sense that our transactive brains would hand this stuff off to machines. But when information engages us — when we really care about a subject — the evidence suggests we don’t turn off our memory at all.

Originally featured in September — read more here.

7. COSMIC APPRENTICE

As if defining what science is and what philosophy is weren’t hard enough, delineating how the two fit together appears a formidable task, one that has spurred rather intense opinions. But that’s precisely what Dorion Sagan, who has previously examined the prehistoric history of sex, braves in the introduction to Cosmic Apprentice: Dispatches from the Edges of Science (public library) as he sets out to explore the intricate ways in which the two fields hang “in a kind of odd balance, watching each other, holding hands”:

The difference between science and philosophy is that the scientist learns more and more about less and less until she knows everything about nothing, whereas a philosopher learns less and less about more and more until he knows nothing about everything. There is truth in this clever crack, but, as Niels Bohr impressed, while the opposite of a trivial truth is false, the opposite of a great truth is another great truth.

I would say that applies to the flip side of the above flip takedown: Science’s eye for detail, buttressed by philosophy’s broad view, makes for a kind of alembic, an antidote to both. This intellectual electrum cuts the cloying taste of idealist and propositional philosophy with the sharp nectar of fact yet softens the edges of a technoscience that has arguably lost both its moral and its epistemological compass, the result in part of its being funded by governments and corporations whose relationship to the search for truth and its open dissemination can be considered problematic at best.

Sagan refutes the popular perception of science as rationally objective, a vessel of capital-T Truth, reminding us that every scientific concept and theory was birthed by a subjective, fallible human mind:

All observations are made from distinct places and times, and in science no less than art or philosophy by particular individuals. … Although philosophy isn’t fiction, it can be more personal, creative and open, a kind of counterbalance for science even as it argues that science, with its emphasis on a kind of impersonal materialism, provides a crucial reality check for philosophy and a tendency to overtheorize that [is] inimical to the scientific spirit. Ideally, in the search for truth, science and philosophy, the impersonal and autobiographical, can “keep each other honest,” in a kind of open circuit. Philosophy as the underdog even may have an advantage, because it’s not supposed to be as advanced as science, nor does it enjoy science’s level of institutional support — or the commensurate heightened risks of being beholden to one’s benefactors.

Like Richard Feynman, who argued tirelessly for the scientist’s responsibility to remain unsure, Sagan echoes the idea that willful ignorance is what drives science and the fear of being wrong is one of its greatest hindrances:

Science’s spirit is philosophical. It is the spirit of questioning, of curiosity, of critical inquiry combined with fact-checking. It is the spirit of being able to admit you’re wrong, of appealing to data, not authority, which does not like to admit it is wrong.

Sagan reflects on his father’s conviction that “the effort to popularize science is a crucial one for society,” one he shared with Richard Feynman, and what made Carl’s words echo as profoundly and timelessly as they do:

Science and philosophy both had a reputation for being dry, but my father helped inject life into the former, partly by speaking in plain English and partly by focusing on the science fiction fantasy of discovering extraterrestrial life.

In that respect, science could learn from philosophy’s intellectual disposition:

Philosophy today, not taught in grade school in the United States, is too often merely an academic pursuit, a handmaiden or apologetics of science, or else a kind of existential protest, a trendy avocation of grad students and the dark-clad coffeehouse set. But philosophy, although it historically gives rise to experimental science, sometimes preserves a distinct mode of sustained questioning that sharply distinguishes it from modern science, which can be too quick to provide answers.

[…]

Philosophy is less cocksure, less already-knowing, or should be, than the pundits’ diatribes that relieve us of the difficulties of not knowing, of carefully weighing, of looking at the other side, of having to think things through for ourselves. Dwell in possibility, wrote Emily Dickinson: Philosophy at its best seems a kind of poetry, not an informational delivery but a dwelling, an opening of our thoughts to the world.

Like Buckminster Fuller, who vehemently opposed specialization, Sagan attests to the synergetic value of intellectual cross-pollination, affirming the idea that true breakthroughs in science require cross-disciplinary connections and that originality consists of linking up ideas whose connection was not previously suspected:

It is true that science requires analysis and that it has fractured into microdisciplines. But because of this, more than ever, it requires synthesis. Science is about connections. Nature no more obeys the territorial divisions of scientific academic disciplines than do continents appear from space to be colored to reflect the national divisions of their human inhabitants. For me, the great scientific satoris, epiphanies, eurekas, and aha! moments are characterized by their ability to connect.

“In disputes upon moral or scientific points,” advised Martine in his wonderful 1866 guide to the art of conversation, “ever let your aim be to come at truth, not to conquer your opponent. So you never shall be at a loss in losing the argument, and gaining a new discovery.” Science, Sagan suggests — at least at its most elegant — is a conversation of constant revision, where each dead end brings to life a new fruitful question:

Theories are not only practical, and wielded like intellectual swords to the death … but beautiful. A good one is worth more than all the ill-gotten hedge fund scraps in the world. A good scientific theory shines its light, revealing the world’s fearful symmetry. And its failure is also a success, as it shows us where to look next.

Supporting Neil deGrasse Tyson’s contention that intelligent design is a philosophy of ignorance, Sagan applies this very paradigm of connection-making to the crux of the age-old science vs. religion debate, painting evolution not as a tool of certitude but as a reminder of our connectedness to everything else:

Connecting humanity with other species in a single process was Darwin’s great natural historical accomplishment. It showed that some of the issues relegated to religion really come under the purview of science. More than just a research program for technoscience, it provides a eureka moment, a subject of contemplation open in principle to all thinking minds. Beyond the squabbles over its mechanisms and modes, evolution’s epiphany derives from its widening of vistas, its showing of the depths of our connections to others from whom we’d thought we were separate. Philosophy, too … in its ancient, scientifico-genic spirit of inquiry so different from a mere, let alone peevish, recounting of facts, needs to be reconnected to science for the latter to fulfill its potential not just as something useful but as a source of numinous moments, deep understanding, and indeed, religious-like epiphanies of cosmic comprehension and aesthetic contemplation.

Originally featured in April — see more here.

8. SOCIAL

“Without the sense of fellowship with men of like mind,” Einstein wrote, “life would have seemed to me empty.” It is perhaps unsurprising that the iconic physicist, celebrated as “the quintessential modern genius,” intuited something fundamental about the inner workings of the human mind and soul long before science itself had attempted to concretize it with empirical evidence. Now, it has: In Social: Why Our Brains Are Wired to Connect (public library), neuroscientist Matthew D. Lieberman, director of UCLA’s Social Cognitive Neuroscience lab, sets out to “get clear about ‘who we are’ as social creatures and to reveal how a more accurate understanding of our social nature can improve our lives and our society.” Lieberman, who has spent the past two decades using tools like fMRI to study how the human brain responds to its social context, has found over and over again that our brains aren’t merely simplistic mechanisms that only respond to pain and pleasure, as philosopher Jeremy Bentham famously claimed, but are instead wired to connect. At the heart of his inquiry is a simple question: Why do we feel such intense agony when we lose a loved one? He argues that, far from being a design flaw in our neural architecture, our capacity for such overwhelming grief is a vital feature of our evolutionary constitution:

The research my wife and I have done over the past decade shows that this response, far from being an accident, is actually profoundly important to our survival. Our brains evolved to experience threats to our social connections in much the same way they experience physical pain. By activating the same neural circuitry that causes us to feel physical pain, our experience of social pain helps ensure the survival of our children by helping to keep them close to their parents. The neural link between social and physical pain also ensures that staying socially connected will be a lifelong need, like food and warmth. Given the fact that our brains treat social and physical pain similarly, should we as a society treat social pain differently than we do? We don’t expect someone with a broken leg to “just get over it.” And yet when it comes to the pain of social loss, this is a common response. The research that I and others have done using fMRI shows that how we experience social pain is at odds with our perception of ourselves. We intuitively believe social and physical pain are radically different kinds of experiences, yet the way our brains treat them suggests that they are more similar than we imagine.

Citing his research, Lieberman affirms the notion that there is no such thing as a nonconformist, pointing out the social construction of what we call our individual “selves” — empirical evidence for what the novelist William Gibson so eloquently termed one’s “personal micro-culture” — and observes “our socially malleable sense of self”:

The neural basis for our personal beliefs overlaps significantly with one of the regions of the brain primarily responsible for allowing other people’s beliefs to influence our own. The self is more of a superhighway for social influence than it is the impenetrable private fortress we believe it to be.

Contextualizing it in a brief evolutionary history, he argues that this osmosis of sociality and individuality is an essential aid in our evolutionary development rather than an aberrant defect in it:

Our sociality is woven into a series of bets that evolution has laid down again and again throughout mammalian history. These bets come in the form of adaptations that are selected because they promote survival and reproduction. These adaptations intensify the bonds we feel with those around us and increase our capacity to predict what is going on in the minds of others so that we can better coordinate and cooperate with them. The pain of social loss and the ways that an audience’s laughter can influence us are no accidents. To the extent that we can characterize evolution as designing our modern brains, this is what our brains were wired for: reaching out to and interacting with others. These are design features, not flaws. These social adaptations are central to making us the most successful species on earth.

The implications of this span across everything from the intimacy of our personal relationships to the intricacy of organizational management and teamwork. But rather than entrusting a single cognitive “social network” with these vital functions, our brains turn out to host many. Lieberman explains:

Just as there are multiple social networks on the Internet such as Facebook and Twitter, each with its own strengths, there are also multiple social networks in our brains, sets of brain regions that work together to promote our social well-being.

These networks each have their own strengths, and they have emerged at different points in our evolutionary history moving from vertebrates to mammals to primates to us, Homo sapiens. Additionally, these same evolutionary steps are recapitulated in the same order during childhood.

He goes on to explore three major adaptations that have made us so inextricably responsive to the social world:

  • Connection: Long before there were any primates with a neocortex, mammals split off from other vertebrates and evolved the capacity to feel social pains and pleasures, forever linking our well-being to our social connectedness. Infants embody this deep need to stay connected, but it is present through our entire lives.
  • Mindreading: Primates have developed an unparalleled ability to understand the actions and thoughts of those around them, enhancing their ability to stay connected and interact strategically. In the toddler years, forms of social thinking develop that outstrip those seen in the adults of any other species. This capacity allows humans to create groups that can implement nearly any idea and to anticipate the needs and wants of those around us, keeping our groups moving smoothly.
  • Harmonizing: The sense of self is one of the most recent evolutionary gifts we have received. Although the self may appear to be a mechanism for distinguishing us from others and perhaps accentuating our selfishness, the self actually operates as a powerful force for social cohesiveness. During the preteen and teenage years, the neural adaptations that allow group beliefs and values to influence our own come to the fore.

Originally featured in November — see more here, including Lieberman’s fantastic TEDxStLouis talk.

9. GULP

Few writers are able to write about science in a way that’s provocative without being sensationalistic, truthful without being dry, enchanting without being forced — and even fewer are able to do so on subjects that don’t exactly lend themselves to Saganesque whimsy. After all, it’s infinitely easier to inspire awe while discussing the bombastic magnificence of the cosmos than, say, the function of bodily fluids and the structures that secrete them. But Mary Roach is one of those rare writers, and that’s precisely what she proves once more in Gulp: Adventures on the Alimentary Canal (public library) — a fascinating tour of the body’s most private hydraulics.

Roach writes in the introduction:

The early anatomists had that curiosity in spades. They entered the human form like an unexplored continent. Parts were named like elements of geography: the isthmus of the thyroid, the isles of the pancreas, the straits and inlets of the pelvis. The digestive tract was for centuries known as the alimentary canal. How lovely to picture one’s dinner making its way down a tranquil, winding waterway, digestion and excretion no more upsetting or off-putting than a cruise along the Rhine. It’s this mood, these sentiments — the excitement of exploration and the surprises and delights of travel to foreign locales — that I hope to inspire with this book.

It may take some doing. The prevailing attitude is one of disgust. … I remember, for my last book, talking to the public-affairs staff who choose what to stream on NASA TV. The cameras are often parked on the comings and goings of Mission Control. If someone spots a staffer eating lunch at his desk, the camera is quickly repositioned. In a restaurant setting, conviviality distracts us from the biological reality of nutrient intake and oral processing. But a man alone with a sandwich appears as what he is: an organism satisfying a need. As with other bodily imperatives, we’d rather not be watched. Feeding, and even more so its unsavory correlates, are as much taboos as mating and death.

The taboos have worked in my favor. The alimentary recesses hide a lode of unusual stories, mostly unmined. Authors have profiled the brain, the heart, the eyes, the skin, the penis and the female geography, even the hair, but never the gut. The pie hole and the feed chute are mine.

Roach goes on to bring real science to those subjects that make teenagers guffaw and that populate mediocre standup jokes, exploring such bodily mysteries as what flatulence research reveals about death, why tasting has little to do with taste, how thorough chewing can lower the national debt, and why we like the foods we like and loathe the rest.

10. WONDERS OF THE SOLAR SYSTEM

“I know that I am mortal by nature and ephemeral,” ur-astronomer Ptolemy contemplated nearly two millennia ago, “but when I trace at my pleasure the windings to and fro of the heavenly bodies, I no longer touch earth with my feet. I stand in the presence of Zeus himself and take my fill of ambrosia.” But while the cosmos has fascinated humanity since the dawn of time, its mesmerism isn’t that of an abstract other but, rather, the very self-reflexive awareness that Ptolemy attested to, that intimate and inextricable link between the wonders of life here on Earth and the magic we’ve always found in our closest cosmic neighbors.

That’s precisely what modern-day science-enchanter Brian Cox explores in Wonders of the Solar System (public library) — the fantastic and illuminating book based on his BBC series of the same title celebrating the spirit of exploration, and a follow-up to his Wonders of Life and every bit as brimming with his signature blend of enthralling storytelling, scientific brilliance, and contagious conviction.

Cox begins by reminding us that preserving the spirit of exploration is both a joy and a moral obligation — especially at a time when it faces tragic threats of indifference and neglect from the very authorities whose job it is to fuel it, despite a citizenry profoundly in love with the ethos of exploration:

[The spirit of exploration] is desperately relevant, an idea so important that celebration is perhaps too weak a word. It is a plea for the spirit of the navigators of the seas and the pioneers of aviation and spaceflight to be restored and cherished; a case made to the viewer and reader that reaching for worlds beyond our grasp is an essential driver of progress and necessary sustenance for the human spirit. Curiosity is the rocket fuel that powers our civilization. If we deny this innate and powerful urge, perhaps because earthly concerns seem more worthy or pressing, then the borders of our intellectual and physical domain will shrink with our ambitions. We are part of a much wider ecosystem, and our prosperity and even long-term survival are contingent on our understanding of it.

But most revelatory of all is Cox’s gift for illustrating what our Earthly phenomena, right here on our seemingly ordinary planet, reveal about the wonders and workings of the Solar System.

Tornadoes, for instance, tell us how our star system was born — the processes that drive these giant rotating storms obey the same physics that caused clumps to form at the center of nebulae five billion years ago, around which the gas cloud collapsed and began spinning ever faster, ordering the chaos, until the early Solar System was churned into existence. This universal principle, known as the conservation of angular momentum, is also what drives a tornado’s destructive spiral.
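A quick way to see why a shrinking, spinning cloud (or a column of air funneling into a tornado) must whirl faster is to write down the conserved quantity itself — a minimal back-of-the-envelope sketch, not drawn from Cox’s book, assuming an idealized parcel of mass circling the center of the collapse:

```latex
% Conservation of angular momentum for a parcel of mass m
% circling the axis of collapse at radius r with tangential speed v:
% L = m v r stays constant as the cloud contracts, so if the radius
% shrinks from r_1 to a smaller r_2, the speed must grow in proportion.
\[
  L = m\,v\,r = \mathrm{const.}
  \qquad\Longrightarrow\qquad
  v_2 = v_1\,\frac{r_1}{r_2}
\]
```

Halve the radius and the spin speed doubles — the same bookkeeping whether the rotating mass is a nebula condensing into a star system or air spiraling into a storm.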

Cox synthesizes:

This is how our Solar System was born: rather than the whole system collapsing into the Sun, a disc of dust and gas extending billions of kilometers into space formed around the new shining star. In just a few hundred million years, pieces of the cloud collapsed to form planets and moons, and so a star system, our Solar System, was formed. The journey from chaos into order had begun.

Then we have Iceland’s icebergs and glacial lagoons, which offer remarkable insight into the nature of Saturn’s rings. Both shine with puzzling brightness — the lagoons, here on Earth, by bringing pure water that is thousands of years old and free of pollutants from the bottom of the seabed to the surface as they rise, forming ice crystals of exceptional vibrance; Saturn’s rings, young and ever-changing, by circling icy ring particles around the planet, constantly crashing them together and breaking them apart, thus exposing bright new facets of ice that catch the sunlight and dazzle amidst a Solar System that is otherwise “a very dirty place.”

Cox explains:

It’s difficult to imagine the scale, beauty and intricacy of Saturn’s rings here on Earth, but the glacial lagoons of Iceland can transport our minds across millions of kilometers of space and help us understand the true nature of the rings. … At first sight, the lagoon appears to be a solid sheet of pristine ice, but this is an illusion. The surface is constantly shifting, an almost organic, ever-changing raft of thousands of individual icebergs floating on the water. The structure of Saturn’s rings is similar, because despite appearances the rings aren’t solid. Each ring is made up of hundreds of ringlets and each ringlet is made up of billions of separate pieces. Captured by Saturn’s gravity, the ring particles independently orbit the planet in an impossibly thin layer.

Cox goes on to explore other such illuminating parallels, from how Alaska’s Lake Eyak illustrates the methane cycles of the universe to what Hawaii’s Big Island tells us about the forces that keep any planet alive to how the volcanic features of India’s Deccan Traps explain why Venus choked to death. He ends with T. S. Eliot’s timeless verses on the spirit of exploration and echoes Neil deGrasse Tyson’s wisdom on your ego and the cosmic perspective, concluding:

You could take the view that our exploration of the Universe has made us somehow insignificant; one tiny planet around one star amongst hundreds of billions. But I don’t take that view, because we’ve discovered that it takes the rarest combination of chance and the laws of Nature to produce a planet that can support a civilization, that most magnificent structure that allows us to explore and understand the Universe. That’s why, for me, our civilization is the wonder of the Solar System, and if you were to be looking at the Earth from outside the Solar System that much would be obvious. We have written the evidence of our existence onto the surface of our planet. Our civilization has become a beacon that identifies our planet as a home to life.

Originally featured in August — see more here.

11. SAVE OUR SCIENCE

“What is crucial is not that technical ability, but it is imagination in all of its applications,” the great E. O. Wilson offered in his timeless advice to young scientists — a conviction shared by some of history’s greatest scientific minds. And yet it is rote memorization and the unimaginative application of technical skill that our dominant education system prioritizes — so it’s no wonder it is failing to produce the Edisons and Curies of our day. In Save Our Science: How to Inspire a New Generation of Scientists, materials scientist, inventor, and longtime Yale professor Ainissa Ramirez takes on a challenge Isaac Asimov presaged a quarter century ago, advocating for the value of science education and critiquing its present failures, with a hopeful and pragmatic eye toward improving its future. She writes in the introduction:

The 21st century requires a new kind of learner — not someone who can simply churn out answers by rote, as has been done in the past, but a student who can think expansively and solve problems resourcefully.

To do that, she argues, we need to replace the traditional academic skills of “reading, ’riting, and ’rithmetic” with creativity, curiosity, critical-thinking, and problem-solving. (Though, as psychology has recently revealed, problem-finding might be the more valuable skill.)

Ainissa Ramirez at TED 2012 (Photograph: James Duncan Davidson for TED)

She begins with the basics:

While the acronym STEM sounds very important, STEM answers just three questions: Why does something happen? How can we apply this knowledge in a practical way? How can we describe what is happening succinctly? Through the questions, STEM becomes a pathway to be curious, to create, and to think and figure things out.

Even for those of us who deem STEAM (wherein the A stands for “arts”) superior to STEM, Ramirez’s insights are razor-sharp and consistent with the oft-affirmed idea that creativity relies heavily upon connecting the seemingly disconnected and aligning the seemingly misaligned:

There are two schools of thought on defining creativity: divergent thinking, which is the formation of a creative idea resulting from generating lots of ideas, and a Janusian approach, which is the act of making links between two remote ideas. The latter takes its name from the two-faced Roman god of beginnings, Janus, who was associated with doorways and the idea of looking forward and backward at the same time. Janusian creativity hinges on the belief that the best ideas come from linking things that previously did not seem linkable. Henri Poincaré, a French mathematician, put it this way: ‘To create consists of making new combinations. … The most fertile will often be those formed of elements drawn from domains which are far apart.’

Another element inherent to the scientific process but hardly rewarded, if not punished, in education is the role of ignorance, or what the poet John Keats has eloquently and timelessly termed “negative capability” — the art of brushing up against the unknown and proceeding anyway. Ramirez writes:

My training as a scientist allows me to stare at an unknown and not run away, because I learned that this melding of uncertainty and curiosity is where innovation and creativity occur.

Yet these very qualities are missing from science education in the United States — and it shows. When the Programme for International Student Assessment (PISA) administered its triennial assessment in 2006, the U.S. ranked 35th in math and 29th in science out of the 40 high-income, developed countries surveyed.

Average PISA scores versus expenditures for selected countries (Source: Organisation for Economic Co-operation and Development)

Ramirez offers a historical context: When American universities first took root in the colonial days, their primary role was to educate men for the clergy, so science, technology, and math were not a priority. But then Justin Smith Morrill, a little-known congressman from Vermont who had barely completed his high school education, came along in 1861 and quietly but purposefully sponsored legislation that forever changed American education, resulting in more than 70 new colleges and universities that included STEM subjects in their curricula. This catapulted enrollment rates from the mere 2% of the population who attended higher education prior to the Civil War and greatly increased diversity in academia, with the act’s second revision in 1890 extending education opportunities to women and African-Americans.

The growth of U.S. college enrollment from 1869 to 1994. (Source: S. B. Carter et al., Historical Statistics of the United States)

But what really propelled science education, Ramirez notes, was the competitive spirit of the Space Race:

The mixture of being outdone and humiliated motivated the U.S. to create NASA and bolster the National Science Foundation’s budget to support science research and education. Sputnik forced the U.S. to think about its science position and to look hard into a mirror — and the U.S. did not like what it saw. In 1956, before Sputnik, the National Science Foundation’s budget was a modest $15.9 million. In 1958, it tripled to $49.5 million, and it doubled again in 1959 to $132.9 million. The space race was on. We poured resources, infrastructure, and human capital into putting an American on the moon, and with that goal, STEM education became a top priority.

President John F. Kennedy addresses a crowd of 35,000 at Rice University in 1962, proclaiming again his desire to reach the moon with the words, ‘We set sail on this new sea because there is new knowledge to be gained.’ Credit: NASA / Public domain

Ramirez argues for returning to that spirit of science education as an investment in national progress:

The U.S. has a history of changing education to meet the nation’s needs. We need similar innovative forward-thinking legislation now, to prepare our children and our country for the 21st century. Looking at our history allows us to see that we have been here before and prevailed. Let’s meet this challenge, for it will, as Kennedy claimed, draw out the very best in all of us.

In confronting the problems that plague science education and the public’s relationship with scientific culture, Ramirez points to the fact that women account for only 26% of STEM bachelor’s degrees and explores the heart of the glaring gender problem:

[There is a] false presumption that girls are not as good as boys in science and math. This message absolutely pervades our national mindset. Even though girls and boys sit next to each other in class, fewer women choose STEM careers than men. This is the equivalent to a farmer sowing seeds and then harvesting only half of the fields.

The precipitous drop in girls’ enrollment in STEM classes. (Source: J. F. Latimer, What’s Happened To Our High Schools)

In turning toward possible solutions, Ramirez calls out the faulty models of standardized testing, which fail to account for more dimensional definitions of intelligence. She writes:

There is a concept in physics that the observer of an experiment can change the results just by the act of observing (this is called, not surprisingly, the observer effect). For example, knowing the required pressure of your tires and observing that they are overinflated dictates that you let some air out, which changes the pressure slightly.

Although this theory is really for electrons and atoms, we also see it at work in schools. Schools are evaluated, by the federal and state governments, by tests. The students are evaluated by tests administered by the teachers. It is the process of testing that has changed the mission of the school from instilling a wide knowledge of the subject matter to acquiring a good score on the tests.

The United States is one of the most test-taking countries in the world, and the standard weapon is the multiple-choice question. Although multiple-choice tests are efficient in schools, they don’t inspire learning. In fact, they do just the opposite. This is hugely problematic in encouraging the skills needed for success in the 21st century. Standardized testing teaches skills that are counter to skills needed for the future, such as curiosity, problem solving, and having a healthy relationship with failure. Standardized tests draw up a fear of failure, since you seek a specific answer and you will be either right or wrong; they kick problem solving in the teeth, since you never need to show your work and never develop a habit of figuring things out; and they slam the doors to curiosity, since only a small selection of the possible answers is laid out before you. These kinds of tests produce thinkers who are unwilling to stretch and take risks and who cannot handle failure. They crush a sense of wonder.

Like Noam Chomsky, who has questioned why schools train for passing tests rather than for creative inquiry, and Sir Ken Robinson, who has eloquently advocated for changing the factory model of education, Ramirez urges:

While scientists passionately explore, reason, discover, synthesize, compare, contrast, and connect the dots, students drudgingly memorize, watch, and passively consume. Students are exercising the wrong muscle. An infusion of STEM taught in compelling ways will give students an opportunity to acquire these active learning skills.

Ramirez goes on to propose a multitude of small changes and larger shifts that communities, educators, cities, institutions, and policy-makers could implement — from neighborhood maker-spaces to wifi hotspots on school buses to university science festivals to new curricula and testing methods — that would begin to bridge the gap between what science education currently is and what scientific culture could and should be. She concludes, echoing Alvin Toffler’s famous words that “the illiterate of the 21st century will not be those who cannot read and write, but those who cannot learn, unlearn, and relearn”:

The skills of the 21st century need us to create scholars who can link the unlinkable. … Nurturing curious, creative problem solvers who can master the art of figuring things out will make them ready for this unknown brave new world. And that is the best legacy we can possibly leave.

Originally featured in February — see more here.

12. THE ELEMENTS OF EUCLID

Almost a century before Mondrian made his iconic red, yellow, and blue geometric compositions, and around the time that Edward Livingston Youmans was creating his stunning chemistry diagrams, an eccentric 19th-century civil engineer and mathematician named Oliver Byrne produced a striking series of vibrant diagrams in primary colors for an 1847 edition of the legendary Greek mathematical treatise Euclid’s Elements. Byrne, a vehement opponent of pseudoscience with an especial distaste for phrenology, was early to the insight that great design and graphic elegance can powerfully aid learning. He explained that in his edition of Euclid, “coloured diagrams and symbols are used instead of letters for the greater ease of learners.” The book, a masterpiece of Victorian printing and graphic design long before “graphic design” existed as a discipline, is celebrated as one of the most unusual and most beautiful books of the 19th century.

Now, the fine folks of Taschen — who have brought us such visual treasures as the best illustrations from 150 years of Hans Christian Andersen, the life and legacy of infographics godfather Fritz Kahn, and the visual history of magic — are resurrecting Byrne’s gem in the lavish tome The First Six Books of the Elements of Euclid (public library), edited by Swiss polymath Werner Oechslin.

Proof of the Pythagorean theorem

A masterwork of art and science in equal measure, this newly rediscovered treasure mesmerizes the eye with its brightly colored circles, squares, and triangles while it tickles the brain with its mathematical magic.

Originally featured in November — see more here.

13. DOES MY GOLDFISH KNOW WHO I AM?

In 2012, I wrote about a lovely book titled Big Questions from Little People & Simple Answers from Great Minds, in which some of today’s greatest scientists, writers, and philosophers answer kids’ most urgent questions, deceptively simple yet profound. It went on to become one of the year’s best books and among readers’ favorites. A few months later, Gemma Elwin Harris, the editor who had envisioned the project, reached out to invite me to participate in the book’s 2013 edition by answering one randomly assigned question from a curious child. Naturally, I was thrilled to do it, and honored to be a part of something as heartening as Does My Goldfish Know Who I Am? (public library), also among the best children’s books of the year — a compendium of primary school children’s funny, poignant, innocent yet insightful questions about science and how life works, answered by such celebrated minds as rockstar physicist Brian Cox, beloved broadcaster and voice-of-nature Sir David Attenborough, legendary linguist Noam Chomsky, science writer extraordinaire Mary Roach, stat-showman Hans Rosling, Beatle Paul McCartney, biologist and Beagle Project director Karen James, and iconic illustrator Sir Quentin Blake. As was the case with last year’s edition, more than half of the proceeds from the book — which features illustrations by the wonderful Andy Smith — are being donated to a children’s charity.

The questions range from what the purpose of science is to why onions make us cry to whether spiders can speak to why we blink when we sneeze. Psychologist and broadcaster Claudia Hammond, who recently explained the fascinating science of why time slows down when we’re afraid, speeds up as we age, and gets all warped while we’re on vacation in one of the best psychology and philosophy books of 2013, answers the most frequently asked question by the surveyed children: Why do we cry?

It’s normal to cry when you feel upset and until the age of twelve boys cry just as often as girls. But when you think about it, it is a bit strange that salty water spills out from the corners of your eyes just because you feel sad.

One professor noticed people often say that, despite their blotchy faces, a good cry makes them feel better. So he did an experiment where people had to breathe in over a blender full of onions that had just been chopped up. Not surprisingly this made their eyes water. He collected the tears and put them in the freezer. Then he got people to sit in front of a very sad film wearing special goggles which had tiny buckets hanging off the bottom, ready to catch their tears if they cried. The people cried, but the buckets didn’t work and in the end he gathered their tears in tiny test tubes instead.

He found that the tears people cried when they were upset contained extra substances, which weren’t in the tears caused by the onions. So he thinks maybe we feel better because we get rid of these substances by crying and that this is the purpose of tears.

But not everyone agrees. Many psychologists think that the reason we cry is to let other people know that we need their sympathy or help. So crying, provided we really mean it, brings comfort because people are nice to us.

Crying when we’re happy is a bit more of a mystery, but strong emotions have a lot in common, whether happy or sad, so they seem to trigger some of the same processes in the body.

(For a deeper dive into the biological mystery of crying, see the science of sobbing and emotional tearing.)

Joshua Foer, who knows a thing or two about superhuman memory and the limits of our mind, explains to 9-year-old Tom how the brain can store so much information despite being that small:

An adult’s brain only weighs about 1.4 kilograms, but it’s made up of about 100 billion microscopic neurons. Each of those neurons looks like a tiny branching tree, whose limbs reach out and touch other neurons. In fact, each neuron can make between 5,000 and 10,000 connections with other neurons — sometimes even more. That’s more than 500 trillion connections! A memory is essentially a pattern of connections between neurons.

Every sensation that you remember, every thought that you think, transforms your brain by altering the connections within that vast network. By the time you get to the end of this sentence, you will have created a new memory, which means your brain will have physically changed.
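For the curious, the 500-trillion figure in Foer’s answer checks out with a single multiplication — a back-of-the-envelope calculation using the lower bound he quotes (about 100 billion neurons, roughly 5,000 connections each):

```latex
% Lower-bound estimate of connections in an adult brain,
% using the figures quoted above:
\[
  10^{11} \times \left(5 \times 10^{3}\right)
  \;=\; 5 \times 10^{14}
  \;\approx\; 500\ \mathrm{trillion\ connections}
\]
```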

Neuroscientist Tali Sharot, who has previously studied why our brains are wired for optimism, answers 8-year-old Maia’s question about why we don’t have memories from the time we were babies and toddlers:

We use our brain for memory. In the first few years of our lives, our brain grows and changes a lot, just like the rest of our body. Scientists think that because the parts of our brain that are important for memory have not fully developed when we are babies, we are unable to store memories in the same way that we do when we are older.

Also, when we are very young we do not know how to speak. This makes it difficult to keep events in your mind and remember them later, because we use language to remember what happened in the past.

In answering 8-year-old Hannah’s question about what newspapers do when there is no news, writer and journalist Oliver Burkeman, author of the excellent The Antidote: Happiness for People Who Can’t Stand Positive Thinking, offers a primer on media literacy — an important caveat on news that even we, as alleged grown-ups, frequently forget:

Newspapers don’t really go out and find the news: they decide what gets to count as news. The same goes for television and radio. And you might disagree with their decisions! (For example, journalists are often accused of focusing on bad news and ignoring the good, making the world seem worse than it is.)

The important thing to remember, whenever you’re reading or watching the news, is that someone decided to tell you those things, while leaving out other things. They’re presenting one particular view of the world — not the only one. There’s always another side to the story.

And my answer, to 9-year-old Ottilie’s question about why we have books:

Some people might tell you that books are no longer necessary now that we have the internet. Don’t believe them. Books help us know other people, know how the world works, and, in the process, know ourselves more deeply in a way that has nothing to do with what you read them on and everything to do with the curiosity, integrity and creative restlessness you bring to them.

Books build bridges to the lives of others, both the characters in them and your countless fellow readers across other lands and other eras, and in doing so elevate you and anchor you more solidly into your own life. They give you a telescope into the minds of others, through which you begin to see with ever greater clarity the starscape of your own mind.

And though the body and form of the book will continue to evolve, its heart and soul never will. Though the telescope might change, the cosmic truths it invites you to peer into remain eternal like the Universe.

In many ways, books are the original internet — each fact, each story, each new bit of information can be a hyperlink to another book, another idea, another gateway into the endlessly whimsical rabbit hole of the written word. Just like the web pages you visit most regularly, your physical bookmarks take you back to those book pages you want to return to again and again, to reabsorb and relive, finding new meaning on each visit — because the landscape of your life is different, new, “reloaded” by the very act of living.

Originally featured in November — read more of the questions and answers here.

HONORABLE MENTIONS

The Space Book: From the Beginning to the End of Time, 250 Milestones in the History of Space & Astronomy by Jim Bell, An Appetite for Wonder: The Making of a Scientist by Richard Dawkins, and The Age of Edison: Electric Light and the Invention of Modern America by Ernest Freeberg.


The 13 Best Children’s, Illustrated, and Picture Books of 2013

Young Mark Twain’s lost gem, the universe in illustrated dioramas, Maurice Sendak’s posthumous love letter to the world, Kafka for kids, and more treats for all ages.

“It is an error … to think of children as a special kind of creature, almost a different race, rather than as normal, if immature, members of a particular family, and of the human family at large,” J. R. R. Tolkien wrote in his superb meditation on fantasy and why there’s no such thing as writing “for children,” intimating that books able to captivate children’s imagination aren’t “children’s books” but simply really good books. After the year’s best books in psychology and philosophy, art and design, and history and biography, the season’s subjective selection of best-of reading lists continues with the loveliest “children’s” and picture-books of 2013. (Because the best children’s books are, as Tolkien believes, always ones of timeless delight, do catch up on the selections for 2012, 2011, and 2010.)

1. ADVICE TO LITTLE GIRLS

In 1865, when he was only thirty, Mark Twain penned a playful short story mischievously encouraging girls to think independently rather than blindly obey rules and social mores. In the summer of 2011, I chanced upon and fell in love with a lovely Italian edition of this little-known gem with Victorian-scrapbook-inspired artwork by celebrated Russian-born children’s book illustrator Vladimir Radunsky. I knew the book had to come to life in English, so I partnered with the wonderful Claudia Zoe Bedrick of Brooklyn-based indie publishing house Enchanted Lion, maker of extraordinarily beautiful picture-books, and we spent the next two years bringing Advice to Little Girls (public library) to life in America — a true labor-of-love project full of so much delight for readers of all ages. (And how joyous to learn that it was also selected among NPR’s best books of 2013!)

While frolicsome in tone and full of wink, the story is colored with subtle hues of grown-up philosophy on the human condition, exploring all the deft ways in which we creatively rationalize our wrongdoing and reconcile the good and evil we each embody.

Good little girls ought not to make mouths at their teachers for every trifling offense. This retaliation should only be resorted to under peculiarly aggravated circumstances.

If you have nothing but a rag-doll stuffed with sawdust, while one of your more fortunate little playmates has a costly China one, you should treat her with a show of kindness nevertheless. And you ought not to attempt to make a forcible swap with her unless your conscience would justify you in it, and you know you are able to do it.

One can’t help but wonder whether this particular bit may have in part inspired the irreverent 1964 anthology Beastly Boys and Ghastly Girls and its mischievous advice on brother-sister relations:

If at any time you find it necessary to correct your brother, do not correct him with mud — never, on any account, throw mud at him, because it will spoil his clothes. It is better to scald him a little, for then you obtain desirable results. You secure his immediate attention to the lessons you are inculcating, and at the same time your hot water will have a tendency to move impurities from his person, and possibly the skin, in spots.

If your mother tells you to do a thing, it is wrong to reply that you won’t. It is better and more becoming to intimate that you will do as she bids you, and then afterward act quietly in the matter according to the dictates of your best judgment.

Good little girls always show marked deference for the aged. You ought never to ‘sass’ old people unless they ‘sass’ you first.

Originally featured in April — see more spreads, as well as the story behind the project, here.

2. YOU ARE STARDUST

“Everyone you know, everyone you ever heard of, every human being who ever was … lived there — on a mote of dust suspended in a sunbeam,” Carl Sagan famously marveled in his poetic Pale Blue Dot monologue, titled after the iconic 1990 photograph of Earth. The stardust metaphor for our interconnection with the cosmos soon permeated popular culture and became a vehicle for the allure of space exploration. There’s something at once incredibly empowering and incredibly humbling in knowing that the flame in your fireplace came from the sun.

That’s precisely the kind of cosmic awe environmental writer Elin Kelsey and Toronto-based Korean artist Soyeon Kim seek to inspire in kids in You Are Stardust (public library) — an exquisite picture-book that instills that profound sense of connection with the natural world. Underpinning the narrative is a bold sense of optimism — a refreshing antidote to the fear-appeal strategy plaguing most environmental messages today.

Kim’s breathtaking dioramas, to which this screen does absolutely no justice, mix tactile physical materials with fine drawing techniques and digital compositing to illuminate the relentlessly wondrous realities of our intertwined existence: The water in your sink once quenched the thirst of dinosaurs; with every sneeze, wind blasts out of your nose faster than a cheetah’s sprint; the electricity that powers every thought in your brain is stronger than lightning.

But rather than dry science trivia, the message is carried on the wings of poetic admiration for these intricate relationships:

Be still. Listen.

Like you, the Earth breathes.

Your breath is alive with the promise of flowers.

Each time you blow a kiss to the world, you spread pollen that might grow to be a new plant.

The book is nonetheless grounded in real science. Kelsey notes:

I wrote this book as a celebration — one to honor the extraordinary ways in which all of us simply are nature. Every example in this book is backed by current science. Every day, for instance, you breathe in more than a million pollen grains.

But what makes the project particularly exciting is that, in the face of the devastating gender gap in science education, here is a thoughtful, beautiful piece of early science education presented by two women, the most heartening such example since Lauren Redniss’s Radioactive.

A companion iPad app features sound effects, animation, an original score by Paul Aucoin, behind-the-scenes glimpses of Kim’s process in creating her stunning 3D dioramas, and even build-your-own-diorama adventures.

Originally featured in March — see more here.

3. THE HOLE

The Hole (public library) by artist Øyvind Torseter, one of Norway’s most celebrated illustrators, tells the story of a lovable protagonist who wakes up one day and discovers a mysterious hole in his apartment, which moves and seems to have a mind of its own. Befuddled, he looks for its origin — in vain. He packs it in a box and takes it to a lab, but still no explanation.

With Torseter’s minimalist yet visually eloquent pen-and-digital line drawings, vaguely reminiscent of Sir Quentin Blake and Tomi Ungerer yet decidedly distinctive, the story is at once simple and profound, amusing and philosophical, the sort of quiet meditation that gently, playfully tickles us into existential inquiry.

What makes the book especially magical is that a die-cut hole runs from the wonderfully gritty cardboard cover through every page and all the way out through the back cover — an especial delight for those of us who swoon over masterpieces of die-cut whimsy. In every page, the hole is masterfully incorporated into the visual narrative, adding an element of tactile delight that only an analog book can afford. The screen thus does it little justice, as these digital images feature a mere magenta-rimmed circle where the die-cut hole actually appears, but I’ve tried to capture its charm in a few photographs accompanying the page illustrations.

Originally featured in September, with lots more illustrations.

4. MY BROTHER’S BOOK

For those of us who loved legendary children’s book author Maurice Sendak — famed creator of wild things, little-known illustrator of velveteen rabbits, infinitely warm heart, infinitely witty mind — his death in 2012 was one of the year’s greatest heartaches. Now, half a century after his iconic Where The Wild Things Are, comes My Brother’s Book (public library; UK) — a bittersweet posthumous farewell to the world, illustrated in vibrant, dreamsome watercolors and written in verse inspired by some of Sendak’s lifelong influences: Shakespeare, Blake, Keats, and the music of Mozart. In fact, a foreword by Shakespeare scholar Stephen Greenblatt reveals the book is based on the Bard’s “The Winter’s Tale.”

It tells the story of two brothers, Jack and Guy, torn asunder when a falling star crashes onto Earth. Though on the surface about the beloved author’s own brother Jack, who died 18 years ago, the story is also about the love of Sendak’s life and his partner of fifty years, psychoanalyst Eugene Glynn, whose prolonged illness and eventual loss in 2007 devastated Sendak — the character of Guy reads like a poetic fusion of Sendak and Glynn. And while the story might be a universal “love letter to those who have gone before,” as NPR’s Renee Montagne suggests in Morning Edition, it is in equal measure a private love letter to Glynn. (Sendak passed away the day before President Obama announced his support for same-sex marriage, but Sendak fans were quick to honor both historic moments with a bittersweet homage.)

Indeed, the theme of all-consuming love manifests viscerally in Sendak’s books. Playwright Tony Kushner, a longtime close friend of Sendak’s and one of his most heartfelt mourners, tells NPR:

There’s a lot of consuming and devouring and eating in Maurice’s books. And I think that when people play with kids, there’s a lot of fake ferocity and threats of, you know, devouring — because love is so enormous, the only thing you can think of doing is swallowing the person that you love entirely.

My Brother’s Book ends on a soul-stirring note, tender and poignant in its posthumous light:

And Jack slept safe
Enfolded in his brother’s arms
And Guy whispered ‘Good night
And you will dream of me.’

Originally featured in February.

5. DOES MY GOLDFISH KNOW WHO I AM?

In 2012, I wrote about a lovely book titled Big Questions from Little People & Simple Answers from Great Minds, in which some of today’s greatest scientists, writers, and philosophers answer kids’ most urgent questions, deceptively simple yet profound. It went on to become one of the year’s best books and among readers’ favorites. A few months later, Gemma Elwin Harris, the editor who had envisioned the project, reached out to invite me to participate in the book’s 2013 edition by answering one randomly assigned question from a curious child. Naturally, I was thrilled to do it, and honored to be a part of something as heartening as Does My Goldfish Know Who I Am? (public library) — a compendium of primary school children’s funny, poignant, innocent yet insightful questions about science and how life works, answered by such celebrated minds as rockstar physicist Brian Cox, beloved broadcaster and voice-of-nature Sir David Attenborough, legendary linguist Noam Chomsky, science writer extraordinaire Mary Roach, stat-showman Hans Rosling, Beatle Paul McCartney, biologist and Beagle Project director Karen James, and iconic illustrator Sir Quentin Blake. As was the case with last year’s edition, more than half of the proceeds from the book — which features illustrations by the wonderful Andy Smith — are being donated to a children’s charity.

Originally featured in November — read more here.

6. LITTLE BOY BROWN

“I didn’t feel alone in the Lonely Crowd,” young Italo Calvino wrote of his visit to America, and it is frequently argued that hardly any place embodies the “Lonely Crowd” better than New York, city of “avoid-eye-contact indifference of the crowded subways.” That, perhaps, is what children’s book writer Isobel Harris set out to both affirm and decondition in Little Boy Brown (public library) — a magnificent ode to childhood and loneliness, easily the greatest ever written, illustrated by the famed Hungarian-born French cartoonist and graphic designer André François. Originally published in 1949, this timeless story that stirred the hearts of generations has been newly resurrected by Enchanted Lion.

This is the story of a four-year-old boy living with his well-to-do mother and father in a Manhattan hotel, in which the elevator connects straight to the subway tunnel below the building and plugs right into the heart of the city. And yet Little Boy Brown, whose sole friends are the doormen and elevator operators, feels woefully lonely — until, one day, his hotel chambermaid Hilda invites him to visit her house outside the city, where he blossoms into a new sense of belonging.

Underpinning the charming tale of innocence and children’s inborn benevolence is a heartwarming message about connection across the lines of social class and bridging the gaps of privilege with simple human kindness.

Hilda’s mother kissed me before she even knew who I was!

[…]

Hilda’s family is smarter than we are. They can all speak two different languages, and they can close their eyes and think about two different countries. They’ve been on the Ocean, and they’ve climbed high mountains. They haven’t got quite enough of anything. It makes it exciting when a little more comes!

The story itself, at once a romantic time-capsule of a bygone New York and a timeless meditation on what it’s like to feel so lonesome in a crowd of millions, invites us to explore the tender intersection of loneliness and loveliness. François, who studied with Picasso, illustrated a number of iconic New Yorker covers, and belongs to the same coterie of influential mid-century creative legends as Sir Quentin Blake, Tomi Ungerer, and his close friend and collaborator Ronald Searle, brings all this wonderful dimensionality to life in his singular illustrations, all the more special given that this was his first children’s book.

Originally featured in November — see more here.

7. THE MIGHTY LALOUCHE

The more you win, the more you win, the science of the “winner effect” tells us. The same interplay of biochemistry, psychology and performance thus also holds true of the opposite — but perhaps this is why we love a good underdog story, those unlikely tales of assumed “losers” beating the odds to triumph as “winners.” Stories like this are fundamental to our cultural mythology of ambition and anything-is-possible aspiration, and they speak most powerfully to our young and hopeful selves, to our inner underdogs, to the child who dreams of defeating her bully in blazing glory.

That ever-alluring parable is at the heart of The Mighty Lalouche (public library), written by Matthew Olshan, who famously reimagined Twain’s Huckleberry Finn with an all-girl cast of characters, and illustrated by the inimitable Sophie Blackall, one of the most extraordinary book artists working today, who has previously given us such gems as her drawings of Craigslist missed connections and Aldous Huxley’s only children’s book. It tells the heartening story of a humble and lithe early-twentieth-century French postman named Lalouche, his profound affection for his pet finch Geneviève, and his surprising success in the era’s favorite sport of la boxe française, or French boxing.

One day, at the height of Parisians’ infatuation with the novelty of electric cars, Lalouche’s boss at the post office informs him that a new electric autocar is replacing all walking postmen, who are too slow by comparison. Desperate to provide for himself and Geneviève, Lalouche sees a flyer offering cash to any sparring partners willing to fight the champions at the Bastille Boxing Club. Though Lalouche is small and “rather bony,” his hands are nimble and strong from handling weighty packages, and his feet are fast from racing up apartment stairs in his deliveries — so he signs up.

One should never underestimate a man who loves his finch.

Thanks to his agility and love for the birdie, to everyone’s astonishment, he goes on to defeat each of the champions in turn — even the formidable Anaconda, “the biggest, baddest beast the city has ever seen,” infamous for his deadly sleeper hold. But when the postal service chief realizes the autocar is just a gimmick good for nothing and asks whether Lalouche is willing to take his job back, the tiny champ gladly agrees, for his heart is in the joy he brings people as their mail arrives.

Underpinning the simple allegory of unlikely triumph is a deeper reflection on our present-day anxieties about whether or not machines — gadgets, robots, algorithms — will replace us. The story gently assures us that the most quintessential of human qualities and capacities — courage, integrity, love — will always remain ours and ours alone.

But what makes the book particularly exceptional are the curious archival images uncovered in the research, presented here exclusively alongside the soulful and expressive illustrations Blackall reincarnated them into:

Boxer trading cards, 1895

Boxer pose II, early 1900s

Three boxers, early 1900s

Originally featured in May — see more here.

8. GOBBLE YOU UP

For nearly two decades, independent India-based publisher Tara Books has been giving voice to marginalized art and literature through a collective of artists, writers, and designers collaborating on beautiful books based on regional folk traditions, producing such gems as Waterlife, The Night Life of Trees, and Drawing from the City. A year after I Saw a Peacock with a Fiery Tail — one of the best art books of 2012, a magnificent 17th-century British “trick” poem adapted in a die-cut narrative and illustrated in the signature Indian folk art style of the Gond tribe — comes Gobble You Up (public library), an oral Rajasthani trickster tale adapted as a cumulative rhyme in a mesmerizing handmade treasure released in a limited edition of 7,000 numbered handmade copies, illustrated by artist Sunita and silkscreened by hand in two colors on beautifully coarse kraft paper custom-made for the project. What makes it especially extraordinary, however, is that the Mandna tradition of tribal finger-painting — an ancient Indian art form practiced only by women and passed down from mother to daughter across the generations, created by soaking pieces of cloth in chalk and lime paste, which the artist squeezes through her fingers into delicate lines on the mud walls of village huts — has never before been used to tell a children’s story.

And what a story it is: A cunning jackal who decides to spare himself the effort of hunting for food by tricking his fellow forest creatures into being gobbled up whole, beginning with his friend the crane; he slyly swallows them one by one, until the whole menagerie fills his belly — a play on the classic Meena motif of the pregnant animal depicted with a baby inside its belly, reflecting the mother-daughter genesis of the ancient art tradition itself.

Indeed, Sunita herself was taught to paint by her mother and older sister — but unlike most Meena women, who don’t usually leave the confines of their village and thus contain their art within their community, Sunita has thankfully ventured into the wider world, offering us a portal into this age-old wonderland of art and storytelling.

Gita Wolf, Tara’s visionary founder, who envisioned the project and wrote the cumulative rhyme, describes the challenges of adapting this ephemeral, living art form onto the printed page without losing any of its expressive aliveness:

Illustrating the story in the Meena style of art involved two kinds of movement. The first was to build a visual narrative sequencing from a tradition which favored single, static images. The second challenge was to keep the quality of the wall art, while transferring it to a different, while also smaller, surface. We decided on using large sheets of brown paper, with Sunita squeezing diluted white acrylic paint through her fingers.

Originally featured in October — see more here.

9. BALLAD

The best, most enchanting stories live somewhere between the creative nourishment of our daydreams and the dark allure of our nightmares. That’s exactly where beloved French graphic artist Blexbolex transports us in Ballad (public library) — his exquisite and enthralling follow-up to People, one of the best illustrated books of 2011, and Seasons.

This continuously evolving story traces a child’s perception of his surroundings as he walks home from school. It unfolds over seven sequences across 280 glorious pages and has an almost mathematical beauty to it as each sequence exponentially blossoms into the next: We begin with school, path, and home; we progress to school, street, path, forest, home; before we know it, there’s a witch, a stranger, a sorcerer, a hot air balloon, and a kidnapped queen. All throughout, we’re invited to reimagine the narrative as we absorb the growing complexity of the world — a beautiful allegory for our walk through life itself.

The frontispiece makes a simple and alluring promise:

It’s a story as old as the world — a story that begins all over again each day.

The dark whimsy of Blexbolex’s unusual visual storytelling sings to us a ballad of danger and delight, serenading with the enchantment of fairy tales, the starkness of graphic novels, and the liberation of choose-your-own-adventure stories. And this is precisely where Blexbolex’s singular talent springs to life: Trained as a painter in the 1980s but having left art school to find himself as a silk-screen artist, he blends the charisma of vintage graphic design and traditional printing techniques with the dynamic mesmerism of contemporary graphic novels and experimental narratives to create an entirely new, wholly different form of bewitching visual storytelling, where a few carefully chosen words invite perpetual reinterpretation of layered and expressive scenes.

Originally featured in October — see more here.

10. THE DARK

Daniel Handler — beloved author, timelessly heartening literary jukeboxer — is perhaps better known by his pen name Lemony Snicket, under which he pens his endlessly delightful children's books. In fact, they owe much of their charisma to the remarkable creative collaborations Snicket spawns, from 13 Words illustrated by the inimitable Maira Kalman to Who Could It Be At This Hour? with artwork by celebrated cartoonist Seth. Snicket's 2013 gem, reminiscent in spirit of Maya Angelou's Life Doesn't Frighten Me, is at least as exciting — a minimalist yet magnificently expressive story about a universal childhood fear, titled The Dark (public library) and illustrated by none other than Jon Klassen.

In a conversation with NPR, Handler echoes Aung San Suu Kyi’s timeless wisdom on freedom from fear and articulates the deeper, more universal essence of the book’s message:

I think books that are meant to be read in the nighttime ought to confront the very fears that we’re trying to think about. And I think that a young reader of The Dark will encounter a story about a boy who makes new peace with a fear, rather than a story that ignores whatever troubles are lurking in the corners of our minds when we go to sleep.

Originally featured in June.

11. JANE, THE FOX AND ME

“Reading is escape, and the opposite of escape; it's a way to make contact with reality,” Nora Ephron wrote. “If I can't stand the world I just curl up with a book, and it's like a little spaceship that takes me away from everything,” Susan Sontag told an interviewer, articulating an experience at once so common and so deeply personal to all of us who have ever taken refuge from the world in the pages of a book and the words of a beloved author. It's precisely this experience that comes vibrantly alive in Jane, the Fox, and Me (public library) — a stunningly illustrated graphic novel about a young girl named Hélène, who, cruelly teased by the “mean girls” clique at school, finds refuge in Charlotte Brontë's Jane Eyre. In Jane, she sees both a kindred spirit and aspirational substance of character, one straddling the boundary between vulnerability and strength with remarkable grace — just the quality of heart and mind she needs as she confronts the common and heartbreaking trials of teenage girls tormented by bullying, by concerns over their emerging womanly shape, and by the soul-shattering feeling of longing for acceptance yet receiving none.

Written by Fanny Britt and illustrated by Isabelle Arsenault — the artist behind the magnificent Virginia Wolf, one of the best children's books of 2012 — this masterpiece of storytelling is as emotionally honest and psychologically insightful as it is graphically stunning. What makes the visual narrative especially enchanting is that Hélène's black-and-white world of daily sorrow springs to life in full color whenever she escapes with Brontë.

Originally featured in November — see more here.

12. MY FIRST KAFKA

Sylvia Plath believed it was never too early to dip children's toes in the vast body of literature. But to plunge straight into Kafka? Why not? That is precisely what Brooklyn-based writer and videogame designer Matthue Roth has done in My First Kafka: Runaways, Rodents, and Giant Bugs (public library) — a magnificent adaptation of Kafka for kids. With stunning black-and-white illustrations by London-based fine artist Rohan Daniel Eason, this gem falls — rises, rather — somewhere between Edward Gorey, Maurice Sendak, and the Graphic Canon series.

The idea came to Roth after he accidentally started reading Kafka to his two little girls, who grew enchanted with the stories. As for the choice to adapt Kafka’s characteristically dark sensibility for children, Roth clearly subscribes to the Sendakian belief that grown-ups project their own fears onto kids, who welcome rather than dread the dark. Indeed, it’s hard not to see Sendak’s fatherly echo in Eason’s beautifully haunting black-and-white drawings.

Much like Jonathan Safran Foer used Street of Crocodiles to create his brilliant Tree of Codes literary remix and Darwin’s great-granddaughter adapted the legendary naturalist’s biography into verse, Roth scoured public domain texts and various translations of Kafka to find the perfect works for his singsong transformations: the short prose poem “Excursion into the Mountains,” the novella “The Metamorphosis,” which endures as Kafka’s best-known masterpiece, and “Josefine the Singer,” his final story.

“I don’t know!”
I cried without being heard.

“I do not know.”

If nobody comes,
then nobody comes.

I’ve done nobody any harm.
Nobody’s done me any harm.
But nobody will help me.

A pack of nobodies
would be rather fine,
on the other hand.

I’d love to go on a trip — why not? —
with a pack of nobodies.

Into the mountains, of course.
Where else?

In a way, the book — like most of Kafka’s writing — also bears the odd mesmerism of literary history’s letters and diaries, the semi-forbidden pleasure of which swells under the awareness that their writers never meant for us to read the very words we’re reading, never sought to invite us into their private worlds. Kafka wished for his entire world to remain private — he never finished any of his novels and burned the majority of his manuscripts; the rest he left with his closest friend and literary executor, Max Brod, whom he instructed to burn the remaining diaries, sketches, manuscripts, and letters. It was out of love that Brod chose not to, possibly displeasing his friend but eternally pleasing the literary public.

Originally featured in July — see more here.

13. MY FATHER’S ARMS ARE A BOAT

The finest children’s books have a way of exploring complex, universal themes through elegant simplicity and breathless beauty. From my friends at Enchanted Lion, collaborators on Mark Twain’s Advice to Little Girls and makers of some of the most extraordinary picture-books you’ll ever encounter, comes My Father’s Arms Are a Boat (public library) by writer Stein Erik Lunde and illustrator Øyvind Torseter. This tender and heartening Norwegian gem tells the story of an anxious young boy who climbs into his father’s arms seeking comfort on a cold sleepless night. The two step outside into the winter wonderland as the boy asks questions about the red birds in the spruce tree to be cut down the next morning, about the fox out hunting, about why his mother will never wake up again. With his warm and assuring answers, the father watches his son make sense of this strange world of ours where love and loss go hand in hand.

Lunde, who also writes lyrics and has translated Bob Dylan into Norwegian, is a masterful storyteller who unfolds incredible richness in few words. Meanwhile, Torseter’s exquisite 2D/3D style combining illustration and paper sculpture, reminiscent of Soyeon Kim’s wonderful You Are Stardust, envelops the story in a sheath of delicate whimsy.

Above all, My Father’s Arms Are a Boat is about the quiet way in which boundless love and unconditional assurance can lift even the most pensive of spirits from the sinkhole of existential anxiety.

Originally featured in April.

HONORABLE MENTIONS

Go: A Kidd’s Guide to Graphic Design by legendary graphic designer Chip Kidd, Night Light by New York Times art director and illustrator Nicholas Blechman, and Mr. Tiger Goes Wild by Caldecott Honor artist Peter Brown.


The 13 Best Psychology and Philosophy Books of 2013

How to think like Sherlock Holmes, make better mistakes, master the pace of productivity, find fulfilling work, stay sane, and more.

After the best biographies, memoirs, and history books of 2013, the season's subjective selection of best-of reading lists continues with the most stimulating psychology and philosophy books published this year. (Catch up on the 2012 roundup here and 2011's here.)

1. ON LOOKING: ELEVEN WALKS WITH EXPERT EYES

“How we spend our days,” Annie Dillard wrote in her timelessly beautiful meditation on presence over productivity, “is, of course, how we spend our lives.” And nowhere do we fail at the art of presence more miserably and more tragically than in urban life — in the city, high on the cult of productivity, where we float past each other, past the buildings and trees and the little boy in the purple pants, past life itself, cut off from the breathing of the world by iPhone earbuds and solipsism. And yet: “The art of seeing has to be learned,” Marguerite Duras reverberates — and it can be learned, as cognitive scientist Alexandra Horowitz invites us to believe in her breathlessly wonderful On Looking: Eleven Walks with Expert Eyes (public library) — a record of her quest to walk around a city block with eleven different “experts,” from an artist to a geologist to a dog, and emerge with fresh eyes mesmerized by the previously unseen fascinations of a familiar world. It is undoubtedly one of the most stimulating books of the year, if not the decade, and the most enchanting thing I've read in ages. In a way, it's the opposite but equally delightful mirror image of Christoph Niemann's Abstract City — a concrete, immersive examination of urbanity — blending the mindfulness of Sherlock Holmes with the expansive sensitivity of Thoreau.

Horowitz begins by pointing our attention to the incompleteness of our experience of what we conveniently call “reality”:

Right now, you are missing the vast majority of what is happening around you. You are missing the events unfolding in your body, in the distance, and right in front of you.

By marshaling your attention to these words, helpfully framed in a distinct border of white, you are ignoring an unthinkably large amount of information that continues to bombard all of your senses: the hum of the fluorescent lights, the ambient noise in a large room, the places your chair presses against your legs or back, your tongue touching the roof of your mouth, the tension you are holding in your shoulders or jaw, the map of the cool and warm places on your body, the constant hum of traffic or a distant lawn-mower, the blurred view of your own shoulders and torso in your peripheral vision, a chirp of a bug or whine of a kitchen appliance.

This adaptive ignorance, she argues, is there for a reason — we celebrate it as “concentration” and welcome its way of easing our cognitive overload by allowing us to conserve our precious mental resources only for the stimuli of immediate and vital importance, and to dismiss or entirely miss all else. (“Attention is an intentional, unapologetic discriminator,” Horowitz tells us. “It asks what is relevant right now, and gears us up to notice only that.”) But while this might make us more efficient in our goal-oriented day-to-day, it also makes us inhabit a largely unlived — and unremembered — life, day in and day out.

For Horowitz, the awakening to this incredible, invisible backdrop of life came thanks to Pumpernickel, her “curly haired, sage mixed breed” (who also inspired Horowitz’s first book, the excellent Inside of a Dog: What Dogs See, Smell, and Know), as she found herself taking countless walks around the block, becoming more and more aware of the dramatically different experiences she and her canine companion were having along the exact same route:

Minor clashes between my dog’s preferences as to where and how a walk should proceed and my own indicated that I was experiencing almost an entirely different block than my dog. I was paying so little attention to most of what was right before us that I had become a sleepwalker on the sidewalk. What I saw and attended to was exactly what I expected to see; what my dog showed me was that my attention invited along attention’s companion: inattention to everything else.

The book was her answer to the disconnect, an effort to “attend to that inattention.” It is not, she warns us, “about how to bring more focus to your reading of Tolstoy or how to listen more carefully to your spouse.” Rather, it is an invitation to the art of observation:

Together, we became investigators of the ordinary, considering the block — the street and everything on it — as a living being that could be observed.

In this way, the familiar becomes unfamiliar, and the old the new.

Her approach is based on two osmotic human tendencies: our shared capacity to truly see what is in front of us, despite our conditioned concentration that obscures it, and the power of individual bias in perception — or what we call “expertise,” acquired by passion or training or both — in bringing attention to elements that elude the rest of us. What follows is a whirlwind of endlessly captivating exercises in attentive bias as Horowitz, with her archetypal New Yorker’s “special fascination with the humming life-form that is an urban street,” and her diverse companions take to the city.

First, she takes a walk all by herself, trying to note everything observable, and we quickly realize that besides her deliciously ravenous intellectual curiosity, Horowitz is a rare magician with language. (“The walkers trod silently; the dogs said nothing. The only sound was the hum of air conditioners,” she beholds her own block; passing a pile of trash bags graced by a stray Q-tip, she ponders parenthetically, “how does a Q-tip escape?”; turning her final corner, she gazes at the entrance of a mansion and “its pair of stone lions waiting patiently for royalty that never arrives.” Stunning.)

But as soon as she joins her experts, Horowitz is faced with the grimacing awareness that despite her best, most Sherlockian efforts, she was “missing pretty much everything.” She arrives at a newfound, profound understanding of what William James meant when he wrote, “My experience is what I agree to attend to. Only those items which I notice shape my mind.”:

I would find myself at once alarmed, delighted, and humbled at the limitations of my ordinary looking. My consolation is that this deficiency of mine is quite human. We see, but we do not see: we use our eyes, but our gaze is glancing, frivolously considering its object. We see the signs, but not their meanings. We are not blinded, but we have blinders.

Originally featured in August, with a closer look at the expert insights. For another peek at this gem, which is easily among my top three favorite books of the past decade, learn how to do the step-and-slide.

2. TIME WARPED

Given my soft spot for famous diaries, it should come as no surprise that I keep one myself. Perhaps the greatest gift of the practice has been the daily habit of reading what I had written on that day a year earlier; not only is it a remarkable tool of introspection and self-awareness, but it also illustrates that our memory “is never a precise duplicate of the original [but] a continuing act of creation” and how flawed our perception of time is — almost everything that occurred a year ago appears as having taken place either significantly further in the past (“a different lifetime,” I'd often marvel at this time-illusion) or significantly more recently (“this feels like just last month!”). Rather than a personal deficiency of those of us afflicted by this tendency, however, it turns out to be a defining feature of how the human mind works, the science of which is at first unsettling, then strangely comforting, and altogether intensely interesting.

That’s precisely what acclaimed BBC broadcaster and psychology writer Claudia Hammond explores in Time Warped: Unlocking the Mysteries of Time Perception (public library) — a fascinating foray into the idea that our experience of time is actively created by our own minds and how these sensations of what neuroscientists and psychologists call “mind time” are created. As disorienting as the concept might seem — after all, we’ve been nursed on the belief that time is one of those few utterly reliable and objective things in life — it is also strangely empowering to think that the very phenomenon depicted as the unforgiving dictator of life is something we might be able to shape and benefit from. Hammond writes:

We construct the experience of time in our minds, so it follows that we are able to change the elements we find troubling — whether it’s trying to stop the years racing past, or speeding up time when we’re stuck in a queue, trying to live more in the present, or working out how long ago we last saw our old friends. Time can be a friend, but it can also be an enemy. The trick is to harness it, whether at home, at work, or even in social policy, and to work in line with our conception of time. Time perception matters because it is the experience of time that roots us in our mental reality. Time is not only at the heart of the way we organize life, but the way we experience it.

Discus chronologicus, a depiction of time by German engraver Christoph Weigel, published in the early 1720s; from Cartographies of Time. (Click for details)

Among the most intriguing illustrations of “mind time” is the incredible elasticity of how we experience time. (“Where is it, this present?,” William James famously wondered. “It has melted in our grasp, fled ere we could touch it, gone in the instant of becoming.”) For instance, Hammond points out, we slow time down when gripped by mortal fear — the cliche about the slow-motion car crash is, in fact, a cognitive reality. This plays out even in situations that aren’t life-or-death per se but are still associated with strong feelings of fear. Hammond points to a study in which people with arachnophobia were asked to look at spiders — the very object of their intense fear — for 45 seconds and they overestimated the elapsed time. The same pattern was observed in novice skydivers, who estimated the duration of their peers’ falls as short, whereas their own, from the same altitude, were deemed longer.

Inversely, time seems to speed up as we get older — a phenomenon that competing theories have attempted to explain. One, known as the “proportionality theory,” uses pure mathematics, holding that a year feels faster when you're 40 than when you're 8 because it only constitutes one fortieth of your life rather than a whole eighth. Among its famous proponents are Vladimir Nabokov and William James. But Hammond remains unconvinced:

The problem with the proportionality theory is that it fails to account for the way we experience time at any one moment. We don’t judge one day in the context of our whole lives. If we did, then for a 40-year-old every single day should flash by because it is less than one fourteen-thousandth of the life they’ve had so far. It should be fleeting and inconsequential, yet if you have nothing to do or an enforced wait at an airport for example, a day at 40 can still feel long and boring and surely longer than a fun day at the seaside packed with adventure for a child. … It ignores attention and emotion, which … can have a considerable impact on time perception.
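To make the arithmetic concrete, here is a minimal sketch in Python (my own illustration, not anything from Hammond's book) that works out the proportions both sides of the argument are invoking:

# Illustrative arithmetic only: the "proportionality theory" holds that a
# stretch of time feels shorter in proportion to the life already lived.
def fraction_of_life(years_lived, stretch_in_days):
    """What fraction of a life does a given stretch of time represent?"""
    days_lived = years_lived * 365.25
    return stretch_in_days / days_lived

print(fraction_of_life(8, 365.25))   # a year at 8: 0.125, a whole eighth
print(fraction_of_life(40, 365.25))  # a year at 40: 0.025, one fortieth
print(fraction_of_life(40, 1))       # a day at 40: ~0.000068, under 1/14,000
# Hammond's objection: that last number is vanishingly small, yet a dull
# day at the airport can still feel endless, so proportion alone cannot
# explain how we experience time.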

Another theory suggests that perhaps it is the tempo of life in general that has accelerated, making things from the past appear as slower, including the passage of time itself.

But one definite change does take place with age: As we grow older, we tend to feel like the previous decade elapsed more rapidly, while the earlier decades of our lives seem to have lasted longer. Similarly, we tend to think of events that took place in the past 10 years as having happened more recently than they actually did. (Quick: What year did the devastating Japanese tsunami hit? When did we lose Maurice Sendak?) Conversely, we perceive events that took place more than a decade ago as having happened even longer ago. (When did Princess Diana die? What year was the Chernobyl disaster?) This, Hammond points out, is known as “forward telescoping”:

It is as though time has been compressed and — as if looking through a telescope — things seem closer than they really are. The opposite is called backward or reverse telescoping, also known as time expansion. This is when you guess that events happened longer ago than they really did. This is rare for distant events, but not uncommon for recent weeks.

[…]

The most straightforward explanation for it is called the clarity of memory hypothesis, proposed by the psychologist Norman Bradburn in 1987. This is the simple idea that because we know that memories fade over time, we use the clarity of a memory as a guide to its recency. So if a memory seems unclear we assume it happened longer ago.

Originally featured in July, with a deeper dive into the psychology of why time slows down when we’re afraid, speeds up as we age, and gets warped when we’re on vacation.

3. HOW TO FIND FULFILLING WORK

“If one wanted to crush and destroy a man entirely, to mete out to him the most terrible punishment,” wrote Dostoevsky, “all one would have to do would be to make him do work that was completely and utterly devoid of usefulness and meaning.” Indeed, the quest to avoid work and make a living doing what you love is a constant conundrum of modern life. In How to Find Fulfilling Work (public library) — the latest installment in The School of Life's wonderful series reclaiming the traditional self-help genre as intelligent, non-self-helpy, yet immensely helpful guides to modern living, which previously gave us Philippa Perry's How to Stay Sane and Alain de Botton's How to Think More About Sex — philosopher Roman Krznaric (remember him?) explores the roots of this contemporary quandary and guides us to its fruitful resolution:

The desire for fulfilling work — a job that provides a deep sense of purpose, and reflects our values, passions and personality — is a modern invention. … For centuries, most inhabitants of the Western world were too busy struggling to meet their subsistence needs to worry about whether they had an exciting career that used their talents and nurtured their wellbeing. But today, the spread of material prosperity has freed our minds to expect much more from the adventure of life.

We have entered a new age of fulfillment, in which the great dream is to trade up from money to meaning.

Krznaric goes on to outline two key afflictions of the modern workplace — “a plague of job dissatisfaction” and “uncertainty about how to choose the right career” — and frames the problem:

Never have so many people felt so unfulfilled in their career roles, and been so unsure what to do about it. Most surveys in the West reveal that at least half the workforce are unhappy in their jobs. One cross-European study showed that 60 per cent of workers would choose a different career if they could start again. In the United States, job satisfaction is at its lowest level — 45 per cent — since record-keeping began over two decades ago.

Of course, Krznaric points out, there’s plenty of cynicism and skepticism to go around, with people questioning whether it’s even possible to find a job in which we thrive and feel complete. He offers an antidote to the default thinking:

There are two broad ways of thinking about these questions. The first is the ‘grin and bear it’ approach. This is the view that we should get our expectations under control and recognize that work, for the vast majority of humanity — including ourselves — is mostly drudgery and always will be. Forget the heady dream of fulfillment and remember Mark Twain's maxim: “Work is a necessary evil to be avoided.” … The history is captured in the word itself. The Latin labor means drudgery or toil, while the French travail derives from the tripalium, an ancient Roman instrument of torture made of three sticks. … The message of the ‘grin and bear it’ school of thought is that we need to accept the inevitable and put up with whatever job we can get, as long as it meets our financial needs and leaves us enough time to pursue our ‘real life’ outside office hours. The best way to protect ourselves from all the optimistic pundits peddling fulfillment is to develop a hardy philosophy of acceptance, even resignation, and not set our hearts on finding a meaningful career.

I am more hopeful than this, and subscribe to a different approach, which is that it is possible to find work that is life-enhancing, that broadens our horizons and makes us feel more human.

[…]

This is a book for those who are looking for a job that is big enough for their spirit, something more than a ‘day job’ whose main function is to pay the bills.


Krznaric considers the five keys to making a career meaningful — earning money, achieving status, making a difference, following our passions, and using our talents — but goes on to demonstrate that they aren’t all created equal. In particular, he echoes 1970s Zen pioneer Alan Watts and modern science in arguing that money alone is a poor motivator:

Schopenhauer may have been right that the desire for money is widespread, but he was wrong on the issue of equating money with happiness. Overwhelming evidence has emerged in the last two decades that the pursuit of wealth is an unlikely path to achieving personal wellbeing — the ancient Greek ideal of eudaimonia or ‘the good life.’ The lack of any clear positive relationship between rising income and rising happiness has become one of the most powerful findings in the modern social sciences. Once our income reaches an amount that covers our basic needs, further increases add little, if anything, to our levels of life satisfaction.

The second false prophet of fulfillment, as Y Combinator founder Paul Graham has poignantly cautioned and Debbie Millman has poetically articulated, is prestige. Krznaric admonishes:

We can easily find ourselves pursuing a career that society considers prestigious, but which we are not intrinsically devoted to ourselves — one that does not fulfill us on a day-to-day basis.

Krznaric pits respect, which he defines as “being appreciated for what we personally bring to a job, and being valued for our individual contribution,” as the positive counterpart to prestige and status, arguing that “in our quest for fulfilling work, we should seek a job that offers not just good status prospects, but good respect prospects.”

Rather than hoping to create a harmonious union between the pursuit of money and values, we might have better luck trying to combine values with talents. This idea comes courtesy of Aristotle, who is attributed with saying, ‘Where the needs of the world and your talents cross, there lies your vocation.’

Originally featured in April — read the full article here.

4. INTUITION PUMPS

“If you are not making mistakes, you’re not taking enough risks,” Debbie Millman counseled. “Make New Mistakes. Make glorious, amazing mistakes. Make mistakes nobody’s ever made before,” Neil Gaiman advised young creators. In Intuition Pumps And Other Tools for Thinking (public library), the inimitable Daniel Dennett, one of our greatest living philosophers, offers a set of thinking tools — “handy prosthetic imagination-extenders and focus holders” — that allow us to “think reliably and even gracefully about really hard questions” — to enhance your cognitive toolkit. He calls these tools “intuition pumps” — thought experiments designed to stir “a heartfelt, table-thumping intuition” (which we know is a pillar of even the most “rational” of science) about the question at hand, a kind of persuasion tool the reverse-engineering of which enables us to think better about thinking itself. Intuition, of course, is a domain-specific ability that relies on honed critical thinking rather than a mystical quality bestowed by the gods — but that’s precisely Dennett’s point, and his task is to help us hone it.

Though most of his 77 “intuition pumps” address concrete questions, a dozen are “general-purpose” tools that apply deeply and widely, across just about any domain of thinking. The first of them is also arguably the most useful yet most uncomfortable: making mistakes.

Echoing Dorion Sagan’s case for why science and philosophy need each other, Dennett begins with an astute contribution to the best definitions of philosophy, wrapped in a necessary admonition about the value of history:

The history of philosophy is in large measure the history of very smart people making very tempting mistakes, and if you don’t know the history, you are doomed to making the same darn mistakes all over again. … There is no such thing as philosophy-free science, just science that has been conducted without any consideration of its underlying philosophical assumptions.

He speaks for the generative potential of mistakes and their usefulness as an empirical tool:

Sometimes you don’t just want to risk making mistakes; you actually want to make them — if only to give you something clear and detailed to fix.

Therein lies the power of mistakes as a vehicle for, as Rilke famously put it, “living the questions” and thus advancing knowledge in a way that certainty cannot — for, as Richard Feynman memorably noted, the scientist’s job is to remain unsure, and so seems the philosopher’s. Dennett writes:

We philosophers are mistake specialists. … While other disciplines specialize in getting the right answers to their defining questions, we philosophers specialize in all the ways there are of getting things so mixed up, so deeply wrong, that nobody is even sure what the right questions are, let alone the answers. Asking the wrong questions risks setting any inquiry off on the wrong foot. Whenever that happens, this is a job for philosophers! Philosophy — in every field of inquiry — is what you have to do until you figure out what questions you should have been asking in the first place.

[…]

Mistakes are not just opportunities for learning; they are, in an important sense, the only opportunity for learning or making something truly new. Before there can be learning, there must be learners. There are only two non-miraculous ways for learners to come into existence: they must either evolve or be designed and built by learners that evolved. Biological evolution proceeds by a grand, inexorable process of trial and error — and without the errors the trials wouldn’t accomplish anything.

Dennett offers a caveat that at once highlights the importance of acquiring knowledge and reminds us of the power of “chance-opportunism”:

Trials can be either blind or foresighted. You, who know a lot, but not the answer to the question at hand, can take leaps — foresighted leaps. You can look before you leap, and hence be somewhat guided from the outset by what you already know. You need not be guessing at random, but don’t look down your nose at random guesses; among its wonderful products is … you!

And since evolution is the highest epitome of how the process of trial and error drives progress, Dennett makes a case for understanding evolution as a key to understanding everything else we humans value:

Evolution … is the central, enabling process not only of life but also of knowledge and learning and understanding. If you attempt to make sense of the world of ideas and meanings, free will and morality, art and science and even philosophy itself without a sound and quite detailed knowledge of evolution, you have one hand tied behind your back. … For evolution, which knows nothing, the steps into novelty are blindly taken by mutations, which are random copying “errors” in DNA.

Dennett echoes Dostoyevsky (“Above all, don’t lie to yourself. The man who lies to himself and listens to his own lie comes to a point that he cannot distinguish the truth within him, or around him, and so loses all respect for himself and for others.”) and offers the key to making productive mistakes:

The chief trick to making good mistakes is not to hide them — especially not from yourself. Instead of turning away in denial when you make a mistake, you should become a connoisseur of your own mistakes, turning them over in your mind as if they were works of art, which in a way they are. … The trick is to take advantage of the particular details of the mess you’ve made, so that your next attempt will be informed by it and not just another blind stab in the dark.

We have all heard the forlorn refrain “Well, it seemed like a good idea at the time!” This phrase has come to stand for the rueful reflection of an idiot, a sign of stupidity, but in fact we should appreciate it as a pillar of wisdom. Any being, any agent, who can truly say, “Well, it seemed like a good idea at the time!” is standing on the threshold of brilliance.

Originally featured in May — read the full article here.

5. MASTERMIND: HOW TO THINK LIKE SHERLOCK HOLMES

“The habit of mind which leads to a search for relationships between facts,” wrote James Webb Young in his famous 1939 5-step technique for creative problem-solving, “becomes of the highest importance in the production of ideas.” But just how does one acquire those vital cognitive customs? That’s precisely what science writer Maria Konnikova explores in Mastermind: How to Think Like Sherlock Holmes (UK; public library) — an effort to reverse-engineer Holmes’s methodology into actionable insights that help develop “habits of thought that will allow you to engage mindfully with yourself and your world as a matter of course.”

Bridging ample anecdotes from the adventures of Conan Doyle’s beloved detective with psychology studies both classic and cutting-edge, Konnikova builds a compelling case at the intersection of science and secular spiritualism, stressing the power of rigorous observation alongside a Buddhist-like, Cageian emphasis on mindfulness. She writes:

The idea of mindfulness itself is by no means a new one. As early as the end of the nineteenth century, William James, the father of modern psychology, wrote that, ‘The faculty of voluntarily bringing back a wandering attention, over and over again, is the very root of judgment, character, and will. … An education which should improve this faculty would be the education par excellence.’ That faculty, at its core, is the very essence of mindfulness. And the education that James proposes, an education in a mindful approach to life and to thought.

[…]

In recent years, studies have shown that meditation-like thought (an exercise in the very attentional control that forms the center of mindfulness), for as little as fifteen minutes a day, can shift frontal brain activity toward a pattern that has been associated with more positive and more approach-oriented emotional states, and that looking at scenes of nature, for even a short while, can help us become more insightful, more creative, and more productive. We also know, more definitively than we ever have, that our brains are not built for multitasking — something that precludes mindfulness altogether. When we are forced to do multiple things at once, not only do we perform worse on all of them but our memory decreases and our general wellbeing suffers a palpable hit.

But for Sherlock Holmes, mindful presence is just a first step. It’s a means to a far larger, far more practical and practically gratifying goal. Holmes provides precisely what William James had prescribed: an education in improving our faculty of mindful thought and in using it in order to accomplish more, think better, and decide more optimally. In its broadest application, it is a means for improving overall decision making and judgment ability, starting from the most basic building block of your own mind.

But mindfulness, and the related mental powers it bestows upon its master, is a skill acquired with grit and practice, rather than an in-born talent or an easy feat attained with a few half-hearted tries:

It is most difficult to apply Holmes’s logic in those moments that matter the most. And so, all we can do is practice, until our habits are such that even the most severe stressors will bring out the very thought patterns that we’ve worked so hard to master.

Echoing Carl Sagan, Konnikova examines the role of intuition — a grab-bag concept embraced by some of history’s greatest scientific minds, cultural icons, and philosophers — as both a helpful directional signpost of intellectual inquiry and a dangerous blind spot:

Our intuition is shaped by context, and that context is deeply informed by the world we live in. It can thus serve as a blinder — or blind spot — of sorts. … With mindfulness, however, we can strive to find a balance between fact-checking our intuitions and remaining open-minded. We can then make our best judgments, with the information we have and no more, but with, as well, the understanding that time may change the shape and color of that information.

“I consider that a man’s brain originally is like a little empty attic, and you have to stock it with such furniture as you choose,” Holmes famously remarked. Indeed, much like the inventor’s mind, the problem-solver’s mind is the product of that very choice: The details and observations we select to include in our “brain attic” shape and filter our perception of reality. Konnikova writes:

Observation with a capital O — the way Holmes uses the word when he gives his new companion a brief history of his life with a single glance — does entail more than, well, observation (the lowercase kind). It’s not just about the passive process of letting objects enter into your visual field. It is about knowing what and how to observe and directing your attention accordingly: what details do you focus on? What details do you omit? And how do you take in and capture those details that you do choose to zoom in on? In other words, how do you maximize your brain attic’s potential? You don’t just throw any old detail up there, if you remember Holmes’s early admonitions; you want to keep it as clean as possible. Everything we choose to notice has the potential to become a future furnishing of our attics — and what’s more, its addition will mean a change in the attic’s landscape that will affect, in turn, each future addition. So we have to choose wisely.

Choosing wisely means being selective. It means not only looking but looking properly, looking with real thought. It means looking with the full knowledge that what you note — and how you note it — will form the basis of any future deductions you might make. It’s about seeing the full picture, noting the details that matter, and understanding how to contextualize those details within a broader framework of thought.

Originally featured in January — read the full article for more, including Konnikova’s four rules for Sherlockian thinking.

6. MAKE GOOD ART

Commencement season is upon us and, after Greil Marcus’s soul-stirring speech on the essence of art at the 2013 School of Visual Arts graduation ceremony, here comes an exceptional adaptation of one of the best commencement addresses ever delivered: In May of 2012, beloved author Neil Gaiman stood up in front of the graduating class at Philadelphia’s University of the Arts and dispensed some timeless advice on the creative life; now, his talk comes to life as a slim but potent book titled Make Good Art (public library).

Best of all, it’s designed by none other than the inimitable Chip Kidd, who has spent the past fifteen years shaping the voice of contemporary cover design with his prolific and consistently stellar output, ranging from bestsellers like cartoonist Chris Ware’s sublime Building Stories and neurologist Oliver Sacks’s The Mind’s Eye to lesser-known gems like The Paris Review‘s Women Writers at Work and The Letter Q, that wonderful anthology of queer writers’ letters to their younger selves. (Fittingly, Kidd also designed the book adaptation of Ann Patchett’s 2006 commencement address.)

When things get tough, this is what you should do: Make good art. I’m serious. Husband runs off with a politician — make good art. Leg crushed and then eaten by a mutated boa constrictor — make good art. IRS on your trail — make good art. Cat exploded — make good art. Someone on the Internet thinks what you’re doing is stupid or evil or it’s all been done before — make good art. Probably things will work out somehow, eventually time will take the sting away, and that doesn’t even matter. Do what only you can do best: Make good art. Make it on the bad days, make it on the good days, too.

A wise woman once said, “If you are not making mistakes, you’re not taking enough risks.” Gaiman articulates the same sentiment with his own brand of exquisite eloquence:

I hope that in this year to come, you make mistakes.

Because if you are making mistakes, then you are making new things, trying new things, learning, living, pushing yourself, changing yourself, changing your world. You’re doing things you’ve never done before, and more importantly, you’re Doing Something.

So that’s my wish for you, and all of us, and my wish for myself. Make New Mistakes. Make glorious, amazing mistakes. Make mistakes nobody’s ever made before. Don’t freeze, don’t stop, don’t worry that it isn’t good enough, or it isn’t perfect, whatever it is: art, or love, or work or family or life.

Whatever it is you’re scared of doing, Do it.

Make your mistakes, next year and forever.

Originally featured in May — read the full article here, along with a video of Gaiman’s original commencement address.

7. HOW CHILDREN SUCCEED

In How Children Succeed: Grit, Curiosity, and the Hidden Power of Character (public library) — a necessary addition to these fantastic reads on education — Paul Tough, whose writing has appeared in The New Yorker, Slate, Esquire, and The New York Times, sets out to investigate the essential building blocks of character through the findings and practical insight of exceptional educators and bleeding-edge researchers. One of his core arguments is based on the work of pioneering psychologist and 2013 MacArthur “genius” grantee Angela Duckworth, who studied under positive psychology godfather Martin Seligman at my alma mater, the University of Pennsylvania, and has done more than anyone to advance our understanding of how self-control and grit — the relentless work ethic of sustaining your commitments toward a long-term goal — impact success.

Duckworth had come to Penn in 2002, at the age of thirty-two, later in life than a typical graduate student. The daughter of Chinese immigrants, she had been a classic multitasking overachiever in her teens and twenties. After completing her undergraduate degree at Harvard (and starting a summer school for low-income kids in Cambridge in her spare time), she had bounced from one station of the mid-nineties meritocracy to the next: intern in the White House speechwriting office, Marshall scholar at Oxford (where she studied neuroscience), management consultant for McKinsey and Company, charter-school adviser.

Duckworth spent a number of years toying with the idea of starting her own charter school, but eventually concluded that the model didn’t hold much promise for changing the circumstances of children from disadvantaged backgrounds, those whom the education system was failing most tragically. Instead, she decided to pursue a PhD program at Penn. In her application essay, she shared how profoundly the experience of working in schools had changed her view of school reform and wrote:

The problem, I think, is not only the schools but also the students themselves. Here’s why: learning is hard. True, learning is fun, exhilarating and gratifying — but it is also often daunting, exhausting and sometimes discouraging. . . . To help chronically low-performing but intelligent students, educators and parents must first recognize that character is at least as important as intellect.

Duckworth began her graduate work by studying self-discipline. But when she completed her first-year thesis, based on a group of 164 eighth-graders from a Philadelphia middle school, she arrived at a startling discovery that would shape the course of her career: She found that the students’ self-discipline scores were far better predictors of their academic performance than their IQ scores. So she became intensely interested in what strategies and tricks we might develop to maximize our self-control, and whether those strategies can be taught. But self-control, it turned out, was only a good predictor when it came to immediate, concrete goals — like, say, resisting a cookie. Tough writes:

Duckworth finds it useful to divide the mechanics of achievement into two separate dimensions: motivation and volition. Each one, she says, is necessary to achieve long-term goals, but neither is sufficient alone. Most of us are familiar with the experience of possessing motivation but lacking volition: You can be extremely motivated to lose weight, for example, but unless you have the volition — the willpower, the self-control — to put down the cherry Danish and pick up the free weights, you’re not going to succeed. If a child is highly motivated, the self-control techniques and exercises Duckworth tried to teach [the students in her study] might be very helpful. But what if students just aren’t motivated to achieve the goals their teachers or parents want them to achieve? Then, Duckworth acknowledges, all the self-control tricks in the world aren’t going to help.

This is where grit comes in — the X-factor that helps us attain more long-term, abstract goals. To address this, Duckworth and her colleague Chris Peterson developed the Grit Scale — a deceptively simple test on which you rate how much twelve statements apply to you, from “I am a hard worker” to “New ideas and projects sometimes distract me from previous ones.” The results are profoundly predictive of success in such wide-ranging domains of achievement as the National Spelling Bee and the West Point military academy. Tough describes the surprising power of this seemingly mundane questionnaire:

For each statement, respondents score themselves on a five-point scale, ranging from 5, “very much like me,” to 1, “not like me at all.” The test takes about three minutes to complete, and it relies entirely on self-report — and yet when Duckworth and Peterson took it out into the field, they found it was remarkably predictive of success. Grit, Duckworth discovered, is only faintly related to IQ — there are smart gritty people and dumb gritty people — but at Penn, high grit scores allowed students who had entered college with relatively low college-board scores to nonetheless achieve high GPAs. At the National Spelling Bee, Duckworth found that children with high grit scores were more likely to survive to the later rounds. Most remarkable, Duckworth and Peterson gave their grit test to more than twelve hundred freshman cadets as they entered the military academy at West Point and embarked on the grueling summer training course known as Beast Barracks. The military has developed its own complex evaluation, called the whole candidate score, to judge incoming cadets and predict which of them will survive the demands of West Point; it includes academic grades, a gauge of physical fitness, and a leadership potential score. But the more accurate predictor of which cadets persisted in Beast Barracks and which ones dropped out turned out to be Duckworth’s simple little twelve-item grit questionnaire.

You can take the Grit Scale here (registration is free).
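For the curious, here is a minimal sketch of how a self-report instrument like this typically turns twelve ratings into a single number; the function, the sample ratings, and the choice of which item to reverse-code are illustrative assumptions of mine, and Duckworth's published scoring key remains the authoritative version:

def score_grit(ratings, reverse_coded=()):
    """Average twelve 1-5 self-ratings, flipping items where agreement
    signals less grit (e.g. the 'new ideas distract me' statement)."""
    assert len(ratings) == 12 and all(1 <= r <= 5 for r in ratings)
    adjusted = [6 - r if i in reverse_coded else r
                for i, r in enumerate(ratings)]
    return sum(adjusted) / len(adjusted)

# A hypothetical respondent who strongly agrees with "I am a hard worker"
# (item 0) and also with the distractibility statement (item 1, reverse-coded).
sample = [5, 5, 4, 4, 5, 3, 4, 5, 4, 4, 5, 5]
print(score_grit(sample, reverse_coded={1}))  # prints a value between 1 and 5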

8. THINKING: THE NEW SCIENCE OF DECISION-MAKING, PROBLEM-SOLVING AND PREDICTION

Every year, intellectual impresario and Edge editor John Brockman summons some of our era’s greatest thinkers and unleashes them on one provocative question, whether it’s the single most elegant theory of how the world works or the best way to enhance our cognitive toolkit. This year, he sets out on the most ambitious quest yet, a meta-exploration of thought itself: Thinking: The New Science of Decision-Making, Problem-Solving, and Prediction (public library) collects short essays and lecture adaptations from such celebrated and wide-ranging (though not in gender) minds as Daniel Dennett, Jonathan Haidt, Dan Gilbert, and Timothy Wilson, covering subjects as diverse as morality, essentialism, and the adolescent brain.

One of the most provocative contributions comes from Nobel-winning psychologist Daniel Kahneman — author of the indispensable Thinking, Fast and Slow, one of the best psychology books of 2012 — who examines “the marvels and the flaws of intuitive thinking.”

In the 1970s, Kahneman and his colleague Amos Tversky, self-crowned “prophets of irrationality,” began studying what they called “heuristics and biases” — mental shortcuts we take, which frequently result in cognitive errors. Those errors, however, reveal a great deal about how our minds work:

If you want to characterize how something is done, then one of the most powerful ways of characterizing how the mind does anything is by looking at the errors that the mind produces while it’s doing it because the errors tell you what it is doing. Correct performance tells you much less about the procedure than the errors do.

One of the most fascinating examples of heuristics and biases is what we call intuition — a complex cluster of cognitive processes, sometimes helpful but often misleading. Kahneman notes that thoughts come to mind in one of two ways: Either by “orderly computation,” which involves a series of stages of remembering rules and then applying them, or by perception, an evolutionary function that allows us to predict outcomes based on what we’re perceiving. (For instance, seeing a woman’s angry face helps us predict the general sentiment and disposition of what she’s about to say.) It is the latter mode that precipitates intuition. Kahneman explains the interplay:

There is no sharp line between intuition and perception. … Perception is predictive. . . . If you want to understand intuition, it is very useful to understand perception, because so many of the rules that apply to perception apply as well to intuitive thinking. Intuitive thinking is quite different from perception. Intuitive thinking has language. Intuitive thinking has a lot of word knowledge organized in different ways more than mere perception. But some very basic characteristics [of] perception are extended almost directly to intuitive thinking.

He then considers how the two types of mental operations established by modern cognitive science illuminate intuition:

Type 1 is automatic, effortless, often unconscious, and associatively coherent. . . . Type 2 is controlled, effortful, usually conscious, tends to be logically coherent, rule-governed. Perception and intuition are Type 1. … Type 2 is more controlled, slower, is more deliberate. . . . Type 2 is who we think we are. [And yet] if one made a film on this, Type 2 would be a secondary character who thinks that he is the hero because that’s who we think we are, but in fact, it’s Type 1 that does most of the work, and it’s most of the work that is completely hidden from us.

Type 1 also encompasses all of our practiced skills — for instance, driving, speaking, and understanding a language — which after a certain threshold of mastery enter autopilot mode. (Though this presents its own set of problems.) Underpinning that mode of thinking is our associative memory, which Kahneman unpacks:

You have to think of [your associative memory] as a huge repository of ideas, linked to each other in many ways, including causal links and other links, and activation spreading from ideas to other ideas until a small subset of that enormous network is illuminated, and the subset is what’s happening in the mind at the moment. You’re not conscious of it, you’re conscious of very little of it.

The Type 1 modality of thought gives rise to a System 1 of interpretation, which is at the heart of what we call “intuition” — but which is far less accurate and reliable than we like to believe:

System 1 infers and invents causes and intentions. [This] happens automatically. Infants have it. . . . We’re equipped … for the perception of causality.

It neglects ambiguity and suppresses doubt and … exaggerates coherence. Associative coherence [is] in large part where the marvels turn into flaws. We see a world that is vastly more coherent than the world actually is. That’s because of this coherence-creating mechanism that we have. We have a sense-making organ in our heads, and we tend to see things that are emotionally coherent, and that are associatively coherent.

Most treacherous of all is our tendency to use our very confidence — and overconfidence — as evidence itself:

What’s interesting is that many a time people have intuitions that they’re equally confident about except they’re wrong. That happens through the mechanism I call “the mechanism of substitution.” You have been asked a question, and instead you answer another question, but that answer comes by itself with complete confidence, and you’re not aware that you’re doing something that you’re not an expert on because you have one answer. Subjectively, whether it’s right or wrong, it feels exactly the same. Whether it’s based on a lot of information, or a little information, this is something that you may step back and have a look at. But the subjective sense of confidence can be the same for intuition that arrives from expertise, and for intuitions that arise from heuristics. . . .

In other words, intuition, like attention, is “an intentional, unapologetic discriminator [that] asks what is relevant right now, and gears us up to notice only that” — a humbling antidote to our culture’s propensity for self-righteousness, and above all a reminder to allow yourself the uncomfortable luxury of changing your mind.

Originally featured in October — read the full article here.

9. MANAGE YOUR DAY-TO-DAY

We seem to have a strange but all too human cultural fixation on the daily routines and daily rituals of famous creators, from Vonnegut to Burroughs to Darwin — as if a glimpse of their day-to-day would somehow magically infuse ours with equal potency, or replicating it would allow us to replicate their genius in turn. And though much of this is mere cultural voyeurism, there is something to be said for the value of a well-engineered daily routine to anchor the creative process. Manage Your Day-to-Day: Build Your Routine, Find Your Focus, and Sharpen Your Creative Mind (public library), edited by Behance’s 99U editor-in-chief Jocelyn Glei, delves into the secrets of this holy grail of creativity. Twenty of today’s most celebrated thinkers and doers explore such facets of the creative life as optimizing your idea-generation, defying the demons of perfectionism, managing procrastination, and breaking through your creative blocks, with insights from magnificent minds ranging from behavioral economist Dan Ariely to beloved graphic designer Stefan Sagmeister.

In the foreword to the book, Behance founder Scott Belsky, author of the indispensable Making Ideas Happen, points to “reactionary workflow” — our tendency to respond to requests and other stimuli rather than create meaningful work — as today’s biggest problem and propounds a call to arms:

It’s time to stop blaming our surroundings and start taking responsibility. While no workplace is perfect, it turns out that our gravest challenges are a lot more primal and personal. Our individual practices ultimately determine what we do and how well we do it. Specifically, it’s our routine (or lack thereof), our capacity to work proactively rather than reactively, and our ability to systematically optimize our work habits over time that determine our ability to make ideas happen.

[…]

Only by taking charge of your day-to-day can you truly make an impact in what matters most to you. I urge you to build a better routine by stepping outside of it, find your focus by rising above the constant cacophony, and sharpen your creative prowess by analyzing what really matters most when it comes to making your ideas happen.

One of the book’s strongest insights comes from Gretchen Rubin — author of The Happiness Project: Or, Why I Spent a Year Trying to Sing in the Morning, Clean My Closets, Fight Right, Read Aristotle, and Generally Have More Fun, one of these 7 essential books on the art and science of happiness, titled after her fantastic blog of the same name — who points to frequency as the key to creative accomplishment:

We tend to overestimate what we can do in a short period, and underestimate what we can do over a long period, provided we work slowly and consistently. Anthony Trollope, the nineteenth-century writer who managed to be a prolific novelist while also revolutionizing the British postal system, observed, “A small daily task, if it be really daily, will beat the labours of a spasmodic Hercules.” Over the long run, the unglamorous habit of frequency fosters both productivity and creativity.

Frequency, she argues, helps facilitate what Arthur Koestler has famously termed “bisociation” — the crucial ability to link the seemingly unlinkable, which is the defining characteristic of the creative mind. Rubin writes:

You’re much more likely to spot surprising relationships and to see fresh connections among ideas, if your mind is constantly humming with issues related to your work. When I’m deep in a project, everything I experience seems to relate to it in a way that’s absolutely exhilarating. The entire world becomes more interesting. That’s critical, because I have a voracious need for material, and as I become hyperaware of potential fodder, ideas pour in. By contrast, working sporadically makes it hard to keep your focus. It’s easy to become blocked, confused, or distracted, or to forget what you were aiming to accomplish.

[…]

Creativity arises from a constant churn of ideas, and one of the easiest ways to encourage that fertile froth is to keep your mind engaged with your project. When you work regularly, inspiration strikes regularly.

Echoing Alexander Graham Bell, who memorably wrote that “it is the man who carefully advances step by step … who is bound to succeed in the greatest degree,” and Virginia Woolf, who extolled the creative benefits of keeping a diary, Rubin writes:

Step by step, you make your way forward. That’s why practices such as daily writing exercises or keeping a daily blog can be so helpful. You see yourself do the work, which shows you that you can do the work. Progress is reassuring and inspiring; panic and then despair set in when you find yourself getting nothing done day after day. One of the painful ironies of work life is that the anxiety of procrastination often makes people even less likely to buckle down in the future.

Riffing on wisdom from her latest book, Happier at Home: Kiss More, Jump More, Abandon a Project, Read Samuel Johnson, and My Other Experiments in the Practice of Everyday Life, Rubin offers:

I have a long list of “Secrets of Adulthood,” the lessons I’ve learned as I’ve grown up, such as: “It’s the task that’s never started that’s more tiresome,” “The days are long, but the years are short,” and “Always leave plenty of room in the suitcase.” One of my most helpful Secrets is, “What I do every day matters more than what I do once in a while.”

With a sentiment reminiscent of William James’s timeless words on habit, she concludes:

Day by day, we build our lives, and day by day, we can take steps toward making real the magnificent creations of our imaginations.

Entrepreneurship guru and culture-sage Seth Godin seconds Rubin and admonishes against confusing vacant ritualization with creative rituals that actually spur productivity:

Everybody who does creative work has figured out how to deal with their own demons to get their work done. There is no evidence that setting up your easel like Van Gogh makes you paint better. Tactics are idiosyncratic. But strategies are universal, and there are a lot of talented folks who are not succeeding the way they want to because their strategies are broken.

The strategy is simple, I think. The strategy is to have a practice, and what it means to have a practice is to regularly and reliably do the work in a habitual way.

There are many ways you can signify to yourself that you are doing your practice. For example, some people wear a white lab coat or a particular pair of glasses, or always work in a specific place — in doing these things, they are professionalizing their art.

He echoes Chuck Close (“Inspiration is for amateurs — the rest of us just show up and get to work.”), Tchaikovsky (“a self-respecting artist must not fold his hands on the pretext that he is not in the mood.”), E. B. White (“A writer who waits for ideal conditions under which to work will die without putting a word on paper.”), and Isabel Allende (“Show up, show up, show up, and after a while the muse shows up, too.”), observing:

The notion that I do my work here, now, like this, even when I do not feel like it, and especially when I do not feel like it, is very important. Because lots and lots of people are creative when they feel like it, but you are only going to become a professional if you do it when you don’t feel like it. And that emotional waiver is why this is your work and not your hobby.

Originally featured in May — read the full article here. Also of note: 99U’s sequel, Maximize Your Potential, which collects practical wisdom from 21 celebrated creative entrepreneurs.

10. GIVE AND TAKE

“The principle of give and take; that is diplomacy — give one and take ten,” Mark Twain famously smirked. But for every such cynicism, there’s a heartening meditation on the art of asking and the beautiful osmosis of altruism. “The world is just,” Amelia Barr admonished in her rules for success, “it may, it does, patronize quacks; but it never puts them on a level with true men.” After all, it pays to be nice because, as Austin Kleon put it, “the world is a small town,” right?

Well, maybe — maybe not. However just the world may be, how givers and takers fare in matters of success proves to be more complicated. So argues organizational psychology wunderkind Adam Grant (remember him?), the youngest-tenured and highest-rated Wharton professor at my alma mater, in Give and Take: A Revolutionary Approach to Success (public library).

Grant’s extensive research has shed light on a crucial element of success, debunking some enduring tenets of cultural mythology:

According to conventional wisdom, highly successful people have three things in common: motivation, ability, and opportunity. If we want to succeed, we need a combination of hard work, talent, and luck. [But there is] a fourth ingredient, one that’s critical but often neglected: success depends heavily on how we approach our interactions with other people. Every time we interact with another person at work, we have a choice to make: do we try to claim as much value as we can, or contribute value without worrying about what we receive in return?

At the heart of his insight is a dichotomy of behavioral styles people adopt in pursuing success:

Takers have a distinctive signature: they like to get more than they give. They tilt reciprocity in their own favor, putting their own interests ahead of others’ needs. Takers believe that the world is a competitive, dog-eat-dog place. They feel that to succeed, they need to be better than others. To prove their competence, they self-promote and make sure they get plenty of credit for their efforts. Garden-variety takers aren’t cruel or cutthroat; they’re just cautious and self-protective. “If I don’t look out for myself first,” takers think, “no one will.”

Grant contrasts takers with givers:

In the workplace, givers are a relatively rare breed. They tilt reciprocity in the other direction, preferring to give more than they get. Whereas takers tend to be self-focused, evaluating what other people can offer them, givers are other-focused, paying more attention to what other people need from them. These preferences aren’t about money: givers and takers aren’t distinguished by how much they donate to charity or the compensation that they command from their employers. Rather, givers and takers differ in their attitudes and actions toward other people. If you’re a taker, you help others strategically, when the benefits to you outweigh the personal costs. If you’re a giver, you might use a different cost-benefit analysis: you help whenever the benefits to others exceed the personal costs. Alternatively, you might not think about the personal costs at all, helping others without expecting anything in return. If you’re a giver at work, you simply strive to be generous in sharing your time, energy, knowledge, skills, ideas, and connections with other people who can benefit from them.

Outside the workplace, Grant argues, citing Yale psychologist Margaret Clark’s research, most of us are givers in close relationships like marriages and friendships, contributing without preoccupation with keeping score. In the workplace, however, few of us are purely givers or takers — rather, what dominates is a third style:

We become matchers, striving to preserve an equal balance of giving and getting. Matchers operate on the principle of fairness: when they help others, they protect themselves by seeking reciprocity. If you’re a matcher, you believe in tit for tat, and your relationships are governed by even exchanges of favors.

True to psychologists’ repeated insistence that personality is fluid rather than fixed, Grant notes:

Giving, taking, and matching are three fundamental styles of social interaction, but the lines between them aren’t hard and fast. You might find that you shift from one reciprocity style to another as you travel across different work roles and relationships. It wouldn’t be surprising if you act like a taker when negotiating your salary, a giver when mentoring someone with less experience than you, and a matcher when sharing expertise with a colleague. But evidence shows that at work, the vast majority of people develop a primary reciprocity style, which captures how they approach most of the people most of the time. And this primary style can play as much of a role in our success as hard work, talent, and luck.

Originally featured in April — for a closer look at Grant’s findings on the science of success, read the full article here.

11. THE EXAMINED LIFE

Despite ample evidence and countless testaments to the contrary, there persists a toxic cultural mythology that creative and intellectual excellence is a passive gift bestowed upon the fortunate few by the gods of genius, rather than the product of the active application and consistent cultivation of skill. So what might the root of that stubborn fallacy be? Childhood and upbringing, it turns out, might have a lot to do with it.

In The Examined Life: How We Lose and Find Ourselves (public library), psychoanalyst and University College London professor Stephen Grosz builds on more than 50,000 hours of conversation from his quarter-century experience as a practicing psychoanalyst to explore the machinery of our inner life, with insights that are invariably profound and often provocative — for instance, a section titled “How praise can cause a loss of confidence,” in which Grosz writes:

Nowadays, we lavish praise on our children. Praise, self-confidence and academic performance, it is commonly believed, rise and fall together. But current research suggests otherwise — over the past decade, a number of studies on self-esteem have come to the conclusion that praising a child as ‘clever’ may not help her at school. In fact, it might cause her to under-perform. Often a child will react to praise by quitting — why make a new drawing if you have already made ‘the best’? Or a child may simply repeat the same work — why draw something new, or in a new way, if the old way always gets applause?

Grosz cites psychologists Carol Dweck and Claudia Mueller’s famous 1998 study, which divided 128 children ages 10 and 11 into two groups. All were asked to solve mathematical problems, but one group was praised for their intellect (“You did really well, you’re so clever.”) while the other was praised for their effort (“You did really well, you must have tried really hard.”). The kids were then given more complex problems, which those previously praised for their hard work approached with dramatically greater resilience and willingness to try different approaches whenever they reached a dead end. By contrast, those who had been praised for their cleverness were much more anxious about failure, stuck with tasks they had already mastered, and dwindled in tenacity in the face of new problems. Grosz summarizes the now-legendary findings:

Ultimately, the thrill created by being told ‘You’re so clever’ gave way to an increase in anxiety and a drop in self-esteem, motivation and performance. When asked by the researchers to write to children in another school, recounting their experience, some of the ‘clever’ children lied, inflating their scores. In short, all it took to knock these youngsters’ confidence, to make them so unhappy that they lied, was one sentence of praise.

He goes on to admonish against today’s culture of excessive parental praise, which he argues does more for lifting the self-esteem of the parents than for cultivating a healthy one in their children:

Admiring our children may temporarily lift our self-esteem by signaling to those around us what fantastic parents we are and what terrific kids we have — but it isn’t doing much for a child’s sense of self. In trying so hard to be different from our parents, we’re actually doing much the same thing — doling out empty praise the way an earlier generation doled out thoughtless criticism. If we do it to avoid thinking about our child and her world, and about what our child feels, then praise, just like criticism, is ultimately expressing our indifference.

To explore what the healthier substitute for praise might be, he recounts observing an eighty-year-old remedial reading teacher named Charlotte Stiglitz, the mother of the Nobel Prize-winning economist Joseph Stiglitz, who told Grosz of her teaching methodology:

‘I don’t praise a small child for doing what they ought to be able to do,’ she told me. ‘I praise them when they do something really difficult — like sharing a toy or showing patience. I also think it is important to say “thank you”. When I’m slow in getting a snack for a child, or slow to help them and they have been patient, I thank them. But I wouldn’t praise a child who is playing or reading.’

Rather than utilizing the familiar mechanisms of reward and punishment, Grosz observed, Charlotte’s method relied on keen attentiveness to “what a child did and how that child did it.” Presence, he argues, helps build the child’s confidence by way of indicating he is worthy of the observer’s thoughts and attention — its absence, on the other hand, divorces, in the child’s mind, the journey from the destination by instilling a sense that the activity itself is worthless unless it’s a means to obtaining praise. Grosz reminds us how this plays out for all of us, and why it matters throughout life:

Being present, whether with children, with friends, or even with oneself, is always hard work. But isn’t this attentiveness — the feeling that someone is trying to think about us — something we want more than praise?

Originally featured in May — read the full article here.

12. TO SELL IS HUMAN

Whether it’s “selling” your ideas, your writing, or yourself to a potential mate, the art of the sell is crucial to your fulfillment in life, both personal and professional. So argues Dan Pink in To Sell Is Human: The Surprising Truth About Moving Others (public library; UK) — a provocative anatomy of the art-science of “selling” in the broadest possible sense of the word, substantiated by ample research spanning psychology, behavioral economics, and the social sciences.

Pink, wary of the disagreeable twinges accompanying the claim that everyone should self-identify as a salesperson, preemptively counters in the introduction:

I’m convinced we’ve gotten it wrong.

This is a book about sales. But it is unlike any book about sales you have read (or ignored) before. That’s because selling in all its dimensions — whether pushing Buicks on a lot or pitching ideas in a meeting — has changed more in the last ten years than it did over the previous hundred. Most of what we think we understand about selling is constructed atop a foundation of assumptions that have crumbled.

[…]

Selling, I’ve grown to understand, is more urgent, more important, and, in its own sweet way, more beautiful than we realize. The ability to move others to exchange what they have for what we have is crucial to our survival and our happiness. It has helped our species evolve, lifted our living standards, and enhanced our daily lives. The capacity to sell isn’t some unnatural adaptation to the merciless world of commerce. It is part of who we are.

One of Pink’s most fascinating arguments echoes artist Chuck Close, who famously noted that “our whole society is much too problem-solving oriented. It is far more interesting to [participate in] ‘problem creation.’” Pink cites the research of celebrated social scientists Jacob Getzels and Mihaly Csikszentmihalyi, who in the 1960s recruited three dozen fourth-year art students for an experiment. They brought the young artists into a studio with two large tables. The first table displayed 27 eclectic objects that the school used in its drawing classes. The students were instructed to select one or more objects, then arrange a still life on the second table and draw it. What happened next reveals an essential pattern about how creativity works:

The young artists approached their task in two distinct ways. Some examined relatively few objects, outlined their idea swiftly, and moved quickly to draw their still life. Others took their time. They handled more objects, turned them this way and that, rearranged them several times, and needed much longer to complete the drawing. As Csikszentmihalyi saw it, the first group was trying to solve a problem: How can I produce a good drawing? The second was trying to find a problem: What good drawing can I produce?

When Csikszentmihalyi then assembled a group of art experts to evaluate the resulting works, he found that the problem-finders’ drawings were ranked much higher in creativity than the problem-solvers’. Ten years later, the researchers tracked down these art students, who at that point were working for a living, and found that about half had left the art world, while the other half had gone on to become professional artists. That latter group was composed almost entirely of problem-finders. Another decade later, the researchers checked in again and discovered that the problem-finders were “significantly more successful — by the standards of the artistic community — than their peers.” Getzels concluded:

It is in fact the discovery and creation of problems rather than any superior knowledge, technical skill, or craftsmanship that often sets the creative person apart from others in his field.

Pink summarizes:

The more compelling view of the nature of problems has enormous implications for the new world of selling. Today, both sales and non-sales selling depend more on the creative, heuristic, problem-finding skills of artists than on the reductive, algorithmic, problem-solving skills of technicians.

Another fascinating chapter reveals counterintuitive insights about the competitive advantages of introversion vs. extraversion. While new theories might extol the power of introverts over traditional exaltations of extraversion, the truth turns out to be quite different: Pink turns to the research of social psychologist Adam Grant, management professor at the Wharton School of Business at the University of Pennsylvania (my alma mater).

Grant measured where a sample of call center sales representatives fell on the introversion-extraversion spectrum, then correlated that with their actual sales figures. Unsurprisingly, Grant found that extraverts averaged $125 per hour in revenue, exceeding introverts’ $120. His most surprising finding, however, was that “ambiverts” — those who fell in the middle of the spectrum, “not too hot, not too cold” — performed best of all, with an hourly average of $155. The outliers who brought in an astounding $208 per hour scored a solid 4 on the 1-7 introversion-extraversion scale.

Pink synthesizes the findings into an everyday insight for the rest of us:

The best approach is for the people on the ends to emulate those in the center. As some have noted, introverts are ‘geared to inspect,’ while extraverts are ‘geared to respond.’ Selling of any sort — whether traditional sales or non-sales selling — requires a delicate balance of inspecting and responding. Ambiverts can find that balance. They know when to speak and when to shut up. Their wider repertoires allow them to achieve harmony with a broader range of people and a more varied set of circumstances. Ambiverts are the best movers because they’re the most skilled attuners.

Pink goes on to outline “the new ABCs of moving others” — attunement (“the ability to bring one’s actions and outlook into harmony with other people and with the context you’re [sic] in”), buoyancy (a trifecta of “interrogative self-talk” that moves from making statements to asking questions, contagious “positivity,” and an optimistic “explanatory style” of explaining negative events to yourself), and clarity (“the capacity to help others see their situations in fresh and more revealing ways and to identify problems they didn’t realize they had”).

Originally featured in February — read the full article here, where you can watch the charming companion video.

13. HOW TO STAY SANE

“I pray to Jesus to preserve my sanity,” Jack Kerouac professed in discussing his writing routine. But those of us who fall on the more secular end of the spectrum might need a slightly more potent sanity-preservation tool than prayer. That’s precisely what writer and psychotherapist Philippa Perry offers in How To Stay Sane (public library; UK), part of The School of Life’s wonderful series reclaiming the traditional self-help genre as intelligent, non-self-helpy, yet immensely helpful guides to modern living.

At the heart of Perry’s argument — in line with neurologist Oliver Sacks’s recent meditation on memory and how “narrative truth,” rather than “historical truth,” shapes our impression of the world — is the recognition that stories make us human and learning to reframe our interpretations of reality is key to our experience of life:

Our stories give shape to our inchoate, disparate, fleeting impressions of everyday life. They bring together the past and the future into the present to provide us with structures for working towards our goals. They give us a sense of identity and, most importantly, serve to integrate the feelings of our right brain with the language of our left.

[…]

We are primed to use stories. Part of our survival as a species depended upon listening to the stories of our tribal elders as they shared parables and passed down their experience and the wisdom of those who went before. As we get older it is our short-term memory that fades rather than our long-term memory. Perhaps we have evolved like this so that we are able to tell the younger generation about the stories and experiences that have formed us which may be important to subsequent generations if they are to thrive.

I worry, though, about what might happen to our minds if most of the stories we hear are about greed, war and atrocity.

Perry goes on to cite research indicating that people who watch television for more than four hours a day see themselves as far more likely to fall victim to a violent incident in the forthcoming week than their peers who watch less than two hours a day. Just as E. B. White advocated for the responsibility of the writer “to lift people up, not lower them down,” so too is it our responsibility as the writers of our own life-stories to avoid the well-documented negativity bias of modern media — because, as artist Austin Kleon wisely put it, “you are a mashup of what you let into your life.” Perry writes:

Be careful which stories you expose yourself to.

[…]

The meanings you find, and the stories you hear, will have an impact on how optimistic you are: it’s how we evolved. … If you do not know how to draw positive meaning from what happens in life, the neural pathways you need to appreciate good news will never fire up.

[…]

The trouble is, if we do not have a mind that is used to hearing good news, we do not have the neural pathways to process such news.

Yet despite the adaptive optimism bias of the human brain, Perry argues a positive outlook is a practice — and one that requires mastering the art of vulnerability and increasing our essential tolerance for uncertainty:

You may find that you have been telling yourself that practicing optimism is a risk, as though, somehow, a positive attitude will invite disaster and so if you practice optimism it may increase your feelings of vulnerability. The trick is to increase your tolerance for vulnerable feelings, rather than avoid them altogether.

[…]

Optimism does not mean continual happiness, glazed eyes and a fixed grin. When I talk about the desirability of optimism I do not mean that we should delude ourselves about reality. But practicing optimism does mean focusing more on the positive fall-out of an event than on the negative. … I am not advocating the kind of optimism that means you blow all your savings on a horse running at a hundred to one; I am talking about being optimistic enough to sow some seeds in the hope that some of them will germinate and grow into flowers.

Another key obstruction to our sanity is our chronic aversion to being wrong, entwined with our damaging fear of the unfamiliar. Perry cautions:

We all like to think we keep an open mind and can change our opinions in the light of new evidence, but most of us seem to be geared to making up our minds very quickly. Then we process further evidence not with an open mind but with a filter, only acknowledging the evidence that backs up our original impression. It is too easy for us to fall into the trap of believing that being right is more important than being open to what might be.

If we practice detachment from our thoughts we learn to observe them as though we are taking a bird’s eye view of our own thinking. When we do this, we might find that our thinking belongs to an older, and different, story to the one we are now living.

Perry concludes:

We need to look at the repetitions in the stories we tell ourselves [and] at the process of the stories rather than merely their surface content. Then we can begin to experiment with changing the filter through which we look at the world, start to edit the story and thus regain flexibility where we have been getting stuck.

Originally featured in February — read the full article here.


Great Writers Reflect on the Divide Between Private Person and Public Persona in Hand-Drawn Self-Portraits

“Only the crazed and the privileged permit themselves the luxury of disintegration into more than one self.”

“It is to my other self, to Borges, that things happen… I live, I agree to go on living, so that Borges may fashion his literature,” Jorge Luis Borges wrote in his famous essay “Borges and I,” eloquently exploring our shared human tendency to disintegrate into multiple personas as our public and private selves slip in and out of different worlds. In 1996, Daniel Halpern asked 56 of our era’s most celebrated writers to reflect on Borges’s memorable meditation and contribute their own thoughts on the relationship between the person writing and the fictional persona of the writer. The resulting short essays, alongside hand-drawn self-portraits from each author — a recurring theme today — are gathered in Who’s Writing This?: Notations on the Authorial I with Self-Portraits (public library), a priceless addition to this omnibus of famous writers’ timeless wisdom on the craft.

Edward Albee
Cynthia Ozick
Diane Ackerman

Poet Diane Ackerman, whose timelessly beautiful cosmic poems never cease to stir, speaks to our multiple coexisting inner selves and the fluidity of human personality:

Selves will accumulate when one isn’t looking, and they don’t always act wisely or well.

True to her essay’s title, “Diane Ackerman and I,” she playfully turns to the third person to further explore how this notion played out in her own life, while touching on a great many human universalities:

It was only in her middle years that she began to notice how her selves had been forming layer upon layer, translucent like skin; and, like skin, they were evolving a certain identifiable “fingerprint” — a weather system of highs and lows, loops and whirls.

[…]

Older, what she craved was to be ten or twelve selves, each passionately committed to a different field — a dancer, a carpenter, a composer, an astronaut, a miner, etc. Some would be male, some female, and all of their sensations would feed back to one central source. Surely then she would begin to understand the huge spill of life, if she could perceive it from different viewpoints, through simultaneous lives.

[…]

She thinks a lot about the pageant of being human — what it senses, loves, suffers, thrills like — while working silently in a small room, filling blank sheets of paper. It is a solitary mania. But there are times when, all alone, she could be arrested for unlawful assembly.

Mark Helprin

Mark Helprin echoes the same sentiment:

When the Queen of England speaks in the first person plural, it sounds marvelously schizoid, and probably is for her a deep embarrassment. When an American politician has gone around the bend, he begins to refer to himself in the third person. All people feel that they are more than one. Even an Eskimo who returns from the ice to sit in the shadows inside an igloo must sometimes ask himself what the hunt has done to him, must wonder why his tenderness with his children takes so long to flood back after his sinews have been bent and frozen hard in the chase. It happens to everyone and to all of us, and only the crazed and the privileged permit themselves the luxury of disintegration into more than one self.

And yet he has mastered that private integration that keeps his own multiple selves together:

However many of me there are, I have managed to fuse them into one. I cannot tell myself apart any more than the heavily breathing fox hiding under branches or in brush perceives in the mirror of his wide and alert eye a new dainty self or a different sad self or an admirably reflective self.

Margaret Atwood

In an essay titled “Me, She, and It,” Margaret Atwood — a woman of strong opinions about the problems of literature and its how-to’s — pokes at the common, flawed trope of the writerly persona as a separate, superior entity to the writer’s person:

Why do authors wish to pretend they don’t exist? It’s a way of skinning out, of avoiding truth and consequences. They’d like to deny the crime, although their fingerprints are all over the martini glasses, not to mention the hacksaw blade and the victim’s neck. Amnesia, they plead. Epilepsy. Sugar overdose. Demonic possession. How convenient to have an authorial twin, living in your body, looking out through your eyes, pushing pen down on paper or key down on keyboard, while you do what? File your nails?

Noting her own embodiment of this dichotomy, she admonishes:

A projection, a mass hallucination, a neurological disorder — call her what you will, but don’t confuse her with me.

Paul Bowles

Paul Bowles shares a similar sentiment in his essay titled “Bowles and It”:

What is this curious assumption, widely shared … that while writing, a writer can identify himself as one who is writing? The consciousness of oneself as oneself causes a short circuit, and the light goes out.

If I am writing fiction, I am being invented. I cannot retain any awareness of identity. The two states of being are antithetical. The author is not at a steering wheel: “I am driving this car. I command its movements. I can make it go wherever I please.” This assertion of identity is fatal; the writing at that point becomes meaningless.

Frank Bidart

Frank Bidart bleeds into the existential:

We fill pre-existing forms and when we fill them we change them and are changed.

He considers fiction as the mechanism of this perpetuum mobile of self-transformation:

Sweet fiction, in which bravado and despair beckon from a cold panache, in which the protected essential self suffers flashes of its existence to be immortalized by a writing self that is incapable of performing its actions without mixing our essence with what is false.

Paula Fox

In a short essay titled “Path,” Paula Fox rebels against this meta-awareness of the writer’s writing:

I cannot write of writing. To be at work, to write, must exclude thoughts about writing or about myself as a writer. To consider writing, to look at myself as a writer, holds for one sober moment, then plunges me into a tangle of misery that Cesare Pavese describes in his diary: “This terrible feeling that what you do is all wrong, so is what you think, what you are!”

It all suggests to me Heisenberg’s indeterminacy principle, which states that you can either know where a thing is or how fast it is moving — but not both simultaneously. The warring self disappears into the self-less concentration of work. Imagination is conjunctive and unifying; the sour, habitual wars of the self are disjunctive and separating.

When I begin a story at my desk, the window to my back, the path is not there. As I start to walk, I make the path.

Ward Just

Exploring his own inner duality, Ward Just indulges a play on his name:

The Just and the unJust inhabiting the same body, so close you can’t pry us apart, but we are not friends. He speaks, I edit. He plays, I work. He is famously convivial, I am a recluse. And at the end of the evening, when I’m exhausted and yearning for bed, knowing there’s an assignment to complete, he stays on, anything to keep me in the closet a little longer. And when the inevitable question comes, he answers it with aplomb, holding his glass —

Don’t mind if I do.

Allan Gurganus

Allan Gurganus, in an essay titled “The Fertile ‘We’ of One Chaste ‘I,'” considers the “inward, unsure, tender, professional empathizer” of the writer’s private self, in such stark contrast with his carefully constructed public persona:

What interests me about my own work and character is not the solid, admirable, good-nurse, self-motivated persona that I simulate toward Frans Hals warmth in scholarly talks, in photographs taken during charity banquets. That guy is about as real as his tweed jacket’s suede elbow patches and about that necessary. It’s Lint Man I’m a slave for. Poor dweeb hasn’t had a date since 1965; and hasn’t regretted that since January 1972.

He, the true writer, is the department store dummy at the very center of the whole establishment, the one left alone on display all night, a price tag stapled to every piece of clothing they’ve yanked onto him, binoculars and frog flippers included. He is the neutral, generic human form, the gray center who must always assume disguises — in order to be seen and, therefore, to feel himself.

And yet it’s “Lint Man” Gurganus relishes:

How lavish and how Godlike is Lint Man’s open-endedness. Lint Man’s specificity.

He ends on a somewhat solemn note:

The chances of achieving literary performance are, to the decimal point, the odds against becoming fully human.

That means one hundred and fifty million to one.

Which means one hundred and fifty million in one.

Ed Koren

Children’s book author and New Yorker cover artist Ed Koren offers his contribution in the medium of his forte:

Francine Prose

Francine Prose, who has taught us how to read like a writer, considers how to write like a writer in a meditation titled “She and I … and Someone Else”:

She never seems happier than when she is writing, when the work takes over, and the book (as she puts it, so unoriginally) seems to write itself. The characters are saying and doing things she hadn’t planned at all. What pleases her is that she isn’t there, she no longer feels herself present, and I…

Someone else is writing, and both she and I have vanished.

John Hawkes

John Hawkes writes:

Some time ago I discovered that I could no longer speak aloud or read aloud from a stage, even for the sake of hearing the effect that my writer’s voice produced on listeners. Now, curiously, the more I merely try to live, the more reclusive I become, the vainer I am. At last I am as vain as the one who instantly voices his silence inside me.

Arthur Miller

Arthur Miller considers the disconnect between the writer-person and the byline-persona:

I know Arthur Miller, but not “Arthur Miller” or Arthur Miller or “Miller.” About twenty-five years ago the Romanian government banned all “Miller” plays as pornographic. Privately I was very pleased, having admired Henry Miller’s work for a long time. Two theaters were in the midst of producing plays of mine and were forced to cancel them. Did this make me — slightly — Henry Miller? Or him — slightly — Arthur Miller?

He adds:

A book, a poem, a play — they start as fantasms but they end up as things, like a box of crackers or an automobile tire.

Edna O’Brien

Edna O’Brien offers a refreshing, poetic take on the old artist-muse relationship:

The other me, who did not mean to drown herself, went under the sea and remained there for a long time. Eventually she surfaced near Japan and people gave her gifts but she had been so long under the sea she did not recognize what they were. She is a sly one. Mostly at night we commune. Night. Harbinger of dream and nightmare and bearer of omens which defy the music of words. In the morning the fear of her going is very real and very alarming. It can make one tremble. Not that she cares. She is the muse. I am the messenger.

John Updike

John Updike, writing a decade before his death — a subject whose relationship with writing he once explored with such poignancy — considers the dissociation between the constructed Writer and the living person a sort of useful psychological buffer:

I created Updike out of the sticks and mud of my Pennsylvania boyhood, so I can scarcely resent it when people, mistaking me for him, stop me on the street and ask me for his autograph. I am always surprised that I resemble him so closely that we can be confused.

[…]

The distance between us is so great that the bad reviews he receives do not touch me, though I treasure his few prizes and mount them on the walls and shelves of my house, where they instantly yellow and tarnish.

[…]

Suppose, some day, he fails to show up? I would attempt to do his work, but no one would be fooled.

Max Apple
Elmore Leonard
Alice Hoffman
Frank Conroy
Henry Roth

Though Who’s Writing This?: Notations on the Authorial I with Self-Portraits is, regrettably, out of print, used copies can — and should — be tracked down for guaranteed enjoyment. Complement it with an entirely different kind of self-portrait.

Thanks, Kaye!
