Brain Pickings

Posts Tagged ‘best of’

17 DECEMBER, 2013

The Best Photography Books of 2013

From Mongolia to Mars, by way of mesmerizing mines and Manhattan’s characters.

“Needing to have reality confirmed and experience enhanced by photographs is an aesthetic consumerism to which everyone is now addicted,” Susan Sontag wrote in her timeless meditation on photography nearly three decades before the age of Instagram and the selfie. Indeed, the photographic image has not only retained but amplified its power to move, to mesmerize, to usurp power. On the heels of the year’s best books in psychology and philosophy, art and design, history and biography, science and technology, “children’s” (though we all know what that means), and pets and animals, here are 2013’s most exquisite books on photography.

1. THIS IS MARS

“Whether or not there is life on Mars now, there WILL be by the end of this century,” Arthur C. Clarke predicted in 1971 while contemplating humanity’s quest to conquer the Red Planet. “Whatever the reason you’re on Mars is, I’m glad you’re there. And I wish I was with you,” Carl Sagan said a quarter century later in his bittersweet message to future Mars explorers shortly before his death. Sagan, of course, has always been with us — especially as we fulfill, at least partially, Clarke’s prophecy: On March 10, 2006, we put a proxy of human life on, or at least very near, Mars — NASA’s Mars Reconnaissance Orbiter, with its powerful HiRISE telescope, arrived in the Red Planet’s orbit and began mapping its surface in unprecedented detail.

This Is Mars (public library) — a lavish visual atlas by French photographer, graphic designer and editor Xavier Barral, featuring 150 glorious ultra-high-resolution black-and-white images culled from the 30,000 photographs taken by NASA’s MRO, alongside essays by HiRISE telescope principal investigator Alfred S. McEwen, astrophysicist Francis Rocard, and geophysicist Nicolas Mangold — offers an unparalleled glimpse of those mesmerizing visions of otherworldly landscapes beamed back by the MRO in all their romantic granularity, making the ever-enthralling Red Planet feel at once more palpable and more mysterious than ever. At the intersection of art and science, these mesmerizing images belong somewhere between Berenice Abbott’s vintage science photography, the most enchanting aerial photography of Earth, and the NASA Art Project.

In a sentiment of beautiful symmetry to Eudora Welty’s meditation on place and fiction, Barral considers how these images simultaneously anchor us to a physical place and invite us into an ever-unfolding fantasy:

At the end of this voyage, I have gathered here the most endemic landscapes. They send us back to Earth, to the genesis of geological forms, and, at the same time, they upend our reference points: dunes that are made of black sand, ice that sublimates. These places and reliefs can be read as a series of hieroglyphs that take us back to our origins.

Originally featured in October.

2. HUMANS OF NEW YORK

The ever-evolving portrait of New York City has been painted through Gotham’s cats and its dogs, its buildings and its parks, its diaries and its letters. Underpinning all of those, of course, are the city’s true building blocks: its humans.

In the summer of 2010, Brandon Stanton — one of the warmest, most talented and most generous humans I know — lost his job as a bond trader in Chicago and was forced to reinvent his life. Having recently gotten his first camera and fallen in love with photography, he decided to follow that fertile combination of necessity and passion, and, to his parents’ terror and dismay, set out to pursue photography as a hobby-turned-vocation. (For his mother, who saw bond trading as a reputable occupation, photography “seemed like a thinly veiled attempt to avoid employment.”) Brandon recalls:

I had enjoyed my time as a trader. The job was challenging and stimulating. And I’d obsessed over markets in the same way I’d later obsess over photography. But the end goal of trading was always money. Two years of my life were spent obsessing over money, and in the end I had nothing to show for it. I wanted to spend the next phase of my life doing work that I valued as much as the reward.

In photography, he found that rewarding obsession. Approaching it with the priceless freshness of Beginner’s Mind, he brought to his new calling the gift of ignorance and an art of seeing untainted by the arrogance of expertise, hungry to make sense of the world through his lens as he made sense of his own life. And make sense of it he did: Brandon, who quickly realized that “the best way to become a photographer was to start photographing,” set out on a photo tour across several major American cities, beginning in Pittsburgh and ending up in New York City, where he had only planned to spend a week but where he found both his new home and his new calling.

And so, in a beautiful embodiment of how to find your purpose and do what you love, Brandon’s now-legendary online project documenting Gotham’s living fabric was born — at first a humble Facebook page, which blossomed into one of today’s most popular photojournalism blogs with millions of monthly readers. Now, his photographic census of the world’s most vibrant city spills into the eponymous offline masterpiece Humans of New York (public library) — a magnificent mosaic of lives constructed through four hundred of Brandon’s expressive and captivating photos, many never before featured online.

These portraits — poignant, poetic, playful, heartbreaking, heartening — dance across the entire spectrum of the human condition not with the mockingly complacent lens of a freak-show gawker but with the affectionate admiration and profound respect that one human holds for another.

In the age of the aesthetic consumerism of visual culture online, HONY stands as a warm beacon of humanity, gently reminding us that every image is not a disposable artifact to be used as social currency but a heart that beat in the blink of the shutter, one that will continue to beat with its private turbulence of daily triumphs and tribulations even as we move away from the screen or the page to resume our own lives.

The captions, some based on Brandon’s interviews with the subjects and others an unfiltered record of his own observations, add a layer of thought to the visual story: One photograph, depicting two elderly gentlemen intimately leaning into each other on a park bench, reads: “It takes a lot of disquiet to achieve this sort of quiet comfort.” Another, portraying a very old gentleman in a wheelchair with matching yellow sneakers, shorts, and baseball cap, surprises us by revealing that this is Banana George, world record-holder as the oldest barefoot water-skier.

Some are full of humor:

Damn liberal arts degree.

Others are hopelessly charming:

When I walked by, she was really moving to the music — hands up, head nodding, shoulders swinging. I really wanted to take her photo, so I walked up to the nearest adult and asked: “Does she belong to you?”

Suddenly the music stopped, and I heard: “I belong to myself!”

Others still are humbling and soul-stirring:

My wife passed away a few years back. Her name was Barbara, I used to call her Ba. My name was Lawrence, she used to call me La. When she died, I changed my name to Bala.

I stepped inside an Upper West Side nursing home, and met this man in the lobby. He was on his way to deliver a yellow teddy bear to his wife. “I visit her every day,” he said. “Even when the mind is gone, the heart shows through.”

Then there are the city’s favorite tropes: its dogs…

…and its bikes…

I’m ninety years old and I ride this thing around everywhere. I don’t see why more people don’t use them. I carry my cane in the basket, I get all my shopping done. I can go everywhere. I’ve never hit anyone and never been hit. Of course, I ride on the sidewalk, which I don’t think I’m supposed to do, but still…

…and the deuce delight of dogs on bikes:

Above all, however, there is something especially magical about framing these moments of stillness and of absolute attention to the individual amidst this bustling city of millions, a city that never sleeps and never stops.

Whatever your geographic givens, Humans of New York is an absolute masterpiece of cultural celebration, both as vibrant visual anthropology and as a meta-testament, by way of Brandon’s own story, to the heartening notion that this is indeed a glorious age in which we can make our own luck and make a living doing what we love.

Originally featured in October — see more here.

3. BLACK MAPS

For nearly three decades, photographer and visual artist David Maisel — whose gloriously haunting Library of Dust project you might recall from a few years back — has been transforming landscape photography with his stunning aerial images exploring the relationship between Earth and humanity. Now, the best of them are collected in the magnificent monograph Black Maps: American Landscape and the Apocalyptic Sublime (public library) — a lavish large-format tome featuring more than 100 of Maisel’s surreally entrancing portraits of our worldly reality, at once beautiful and tragic. From cyanide-leaching ponds to open-pit mines to the sprawl of urbanization, Maisel’s mesmerizing photographs — which, without context, could be mistaken as much for abstract expressionism as they could for cellular microscopy — capture fragments of the landscape that “correspond to the structure of human thought and feeling.”

From 'The Mining Project' © David Maisel

From 'The Mining Project' © David Maisel

From 'The Mining Project' © David Maisel

From 'The Mining Project' © David Maisel

From 'Oblivion' © David Maisel

From 'Terminal Mirage' © David Maisel

From 'Terminal Mirage' © David Maisel

From 'Terminal Mirage' © David Maisel

4. DOROTHEA LANGE

At the same time that pioneering photographer Berenice Abbott was busy capturing the urban fabric and trailblazing anthropologist Margaret Mead was laying the groundwork for modern anthropology, Dorothea Lange mastered the intersection of the two in her influential Depression-era photojournalism and documentary photography. In Dorothea Lange: Grab a Hunk of Lightning (public library), Lange’s goddaughter Elizabeth Partridge, an accomplished and prolific author in her own right, presents a first-of-its-kind career-spanning monograph of the legendary photographer’s work, placing her most famous and enduring photographs in a biographical context that adds new dimension to these iconic images.

Among the biographical sketches is also the story of Lange’s best-known, infinitely expressive, most iconic photograph of all — Migrant Mother, depicting an agricultural worker named Florence Owens Thompson with her children — which came to capture the harrowing realities of the Great Depression not merely as an economic phenomenon but as a human tragedy.

Migrant Mother, 1936

In 1935, Lange and her second husband, the Berkeley economics professor and self-taught photographer Paul Taylor, were transferred to the Resettlement Administration (RA), one of Roosevelt’s New Deal programs designed to help the country recover from the Depression. Lange began working as a Field Investigator and Photographer under Roy Stryker, head of the Information Division.

Resettlement Administration Report, 'Rural Rehabilitation Camps for Migrants' by Paul Taylor and Dorothea Lange. Lange had absorbed Taylor's working habits, particularly the practice of listening attentively to the migrant workers and taking handwritten notes on what they said. (Prints & Photographs Division, Library of Congress)

In early February of 1936, while living in a small two-bedroom house in California with Taylor and her two step-children, Lange received an assignment to photograph California’s rural and urban slums and farmworkers. She was supposed to spend a month on the road, but severe weather along the coast delayed her departure. When she finally set out for Los Angeles, the first destination on her route, she wrote in a letter to Stryker:

Tried to work in the pea camps in heavy rain from the back of the station wagon. I doubt that I got anything. . . . Made other mistakes too. . . . I make the most mistakes on subject matter that I get excited about and enthusiastic. In other words, the worse the work, the richer the material was.

Accompanying this photograph was Lange's handwritten caption: 'Old Negro -- the kind the planters like. He hoes, picks cotton, and is full of good humor.' Aldridge Plantation, Mississippi, 1937 (Prints & Photographs Division, Library of Congress)

It was in the pea camps that she captured her most iconic image less than two weeks later — an image that, due to its unshakable grip of empathy, would transcend the status of mere visual icon and effect critical cultural awareness on both a social and political level. Partridge writes:

Two weeks of sleet and steady rain had caused a rust blight, destroying the pea crop. There was no work, no money to buy food. Dorothea approached “the hungry and desperate mother,” huddled under a torn canvas tent with her children. The family had been living on frozen vegetables they’d gleaned from the fields and birds the children killed. Working quickly, Dorothea made just a few exposures, climbed back in her car, and drove home.

Dorothea knew the starving pea pickers couldn’t wait for someone in Washington, DC to act. They needed help immediately. She developed the negatives of the stranded family, and rushed several photographs to the San Francisco News. Two of her images accompanied an article on March 10th as the federal government rushed twenty thousand pounds of food to the migrants.

Another shot of Florence Owens Thompson. Lange's caption from her notebook: 'Migrant agricultural worker’s family. Seven children without food. Mother aged thirty-two. Father is a native Californian.' Nipomo, California, 1936 (Prints & Photographs Division, Library of Congress)

The most remarkable part of the story, however, is that this was an image Lange almost didn’t take: At the end of that cold and wretched winter, she had been on the road for almost a month, with only the insufficient protection of her camera lens between her and the desperate, soul-stirringly dejected living and working conditions of California’s migratory farm workers. Downhearted and weary, both physically and psychologically, she decided she had seen and captured enough, packed up her clunky camera equipment, and headed north on Highway 101, bickering with herself in her notebook: “Haven’t you plenty of negatives already on the subject? Isn’t this just one more of the same?” But then something happened — a fleeting glance, one of those pivotal chance encounters that shape lives. Partridge transports us to that fateful March day:

The cold, wet conditions of Northern California gave way to sweltering heat in Los Angeles, a “vile town,” Dorothea wrote. By the beginning of March she was headed home, exhausted, her camera bags packed on the front seat beside her.

Hours later, the hand-lettered “Pea pickers camp” sign flashed by her. Did she have it in her to try one more time?

She did.

The long, hard rains that had delayed Dorothea at the outset of her journey had deluged the Nipomo pea pickers. And even as Dorothea drove north and homeward, the camp was still floundering in water and mud. Not long before Dorothea arrived, Florence Thompson and four of her six children, along with some of the other stranded migrants, had moved to a higher, sandy location nearby. Thompson left word at the first camp for her partner, Jim Hill, on where to find them. Earlier in the day he’d set off walking with Thompson’s two sons to find parts for their broken-down car.

The sandy camp in front of a windbreak of eucalyptus trees is where Dorothea pulled in and found Florence Thompson and her children. They were waiting for Hill and the boys to show up, for the ground to dry, for crops to ripen for harvesting. They were waiting for their luck to change.

In minutes, Dorothea took the photograph that would become the definitive icon of the Great Depression, intuitively conveying the migrants’ perilous predicament in the frame of her camera.

Dorothea Lange’s studio and darkroom, Berkeley, California (Photograph: Rondal Partridge, c. 1957 / Helen Dixon Collection)

Originally featured in November.

5. BEFORE THEY PASS AWAY

In the late 1990s, photographer Jimmy Nelson became fascinated by Earth’s last living indigenous tribes. It took him a decade to begin documenting their fascinating lives, but once he did, what came out of his 4×5 camera was nothing short of mesmerizing — a glimpse of what feels like a parallel universe, or rather parallel multiverses, to our Western eyes, yet one full of our immutable shared humanity. The magnificent results are now gathered in Before They Pass Away (public library) — a lavish large-format tome featuring 500 of Nelson’s striking photographs, standing somewhere between Jeroen Toirkens’s visual catalog of Earth’s last nomads and Rachel Sussman’s photographic record of the oldest living things in the world.

The journey took Nelson all over the world, from the deserts of Africa to the steppes of Siberia. He writes:

I wanted to create an ambitious aesthetic photographic document that would stand the test of time. A body of work that would be an irreplaceable ethnographic record of a fast disappearing world.

The semi-nomadic Kazakhs, descended from the Huns, have been herding in the valleys of Mongolia since the 19th century and take great pride in their ancient art of eagle-hunting.

The Huli of Papua New Guinea migrated to the island about 45,000 years ago. Today, the remaining tribes often fight with one another for resources — land, livestock, women. To intimidate the enemy, the largest tribe, the Huli wigmen, continue the ancient tradition of painting their faces in yellow, red and white and making elaborate wigs of their own hair.

Though the Gauchos of South America might appear more “modern” than other indigenous tribes, these free-spirited nomadic horsemen have remained a self-contained culture since they first started roaming the prairies in the 1700s.

A distinct ethnic group and even more distinct cultural collective, Tibetans, descended from aboriginal and nomadic Qiang tribes, are known for their prayer flags, sky burials, spirit traps, and festival devil dances, which encapsulate their history and beliefs.

The Maasai endure as one of the oldest and greatest warrior cultures. As they migrated from the Sudan in the 15th century, they took possession of the local tribes’ cattle and conquered much of the Rift Valley. To this day, they depend on the natural cycles of rainfall and drought for their cattle, which remain their core source of sustenance.

The reindeer-herding Nenets of northern Arctic Russia have thrived for over a millennium at temperatures ranging from 58°F below zero in the winter to 95°F in the summer, migrating across more than 620 miles per year, 30 of which involve the grueling crossing of the frozen Ob River.

Originally featured in November — see more here, including Nelson’s entertaining and moving TEDxAmsterdam talk.

6. FACES OF JUSTICE

On the heels of Aung San Suu Kyi’s timeless wisdom on freedom from fear comes Justice: Faces of the Human Rights Revolution (public library) by New York-based photographer Mariana Cook — who gave us this heartwarming portrait of Maurice Sendak and his dog Herman, a fine addition to history’s beloved literary pets. The humanist upgrade to Platon’s Power, Cook’s magnificent black-and-white portraits, poetic and dignified, capture 99 beloved luminaries ranging from Archbishop Desmond Tutu, who spearheaded the opposition to apartheid, to President Jimmy Carter to Sir Sydney Kentridge, who served as the lead lawyer in the 1962 trial of Nelson Mandela, to Justice Ruth Bader Ginsburg, who helped champion this week’s historic win for marriage equality.

Cook frames the project in her preface:

How do people come to feel so passionately about fairness and freedom that they will risk their livelihoods, even their lives, to pursue justice? A few years ago, I became fascinated by such people—people for whom the “rule of law” is no mere abstraction, for whom human rights is a fiercely urgent concern. I wanted to give a face to social justice by making portraits of human rights pioneers. I am a photographer. I understand by seeing. Peering through the camera lens, I hoped to gain an understanding of how they become so devoted to the rights and dignity of others.

Ludmilla Alexeeva

Photograph: Mariana Cook

Desmond Tutu

Photograph: Mariana Cook

Aung San Suu Kyi

Photograph: Mariana Cook

Raja Shehadeh

Photograph: Mariana Cook

Hina Jilani

Photograph: Mariana Cook

Takna Sangpo

Photograph: Mariana Cook

Ruth Bader Ginsburg

Photograph: Mariana Cook

Nicholas Kristof

Photograph: Mariana Cook

Accompanying each portrait is a micro-essay exploring the life, legacy, and singular spirit of its subject.

Originally featured in June.

7. VIVIAN MAIER: SELF-PORTRAITS

In 2007, 26-year-old amateur historian and collector John Maloof wandered into the auction house across from his home and won, for $380, a box of 30,000 extraordinary negatives by an unknown artist whose street photographs of mid-century Chicago and New York rivaled those of Berenice Abbott and predated modern fixtures like Humans of New York by decades. They turned out to be the work of a mysterious nanny named Vivian Maier, who made a living by raising wealthy suburbanites’ children and made her life by capturing the world around her in exquisite detail and striking composition. Mesmerized, Maloof began tracking down more of Maier’s work and amassed more than 100,000 negatives, thousands of prints, 700 rolls of undeveloped color film, home movies, audio interviews, and even her original cameras. Only after Maier’s death in 2009 did her remarkable work gain international acclaim — exhibitions were staged all over the world, a magnificent monograph of her photographs was published, and a documentary was made.

But it wasn’t until 2013 that the most intimate and revealing of her photographs were at last released in Vivian Maier: Self-Portraits (public library) — a collection befitting the year of the “selfie” and helping to officially declare this the season of the creative self-portrait.

Maloof writes in the foreword:

As secretive as Vivian Maier was in life, in death her mystery has only deepened. Without the creator to reveal her motives and her craft, we are left to piece together the life and intent of an artist based on scraps of evidence, with no way to gain definitive answers.

There is, however, something fundamentally unsettling about this proposition — after all, a human being is a constantly evolving open question rather than a definitive answer, a fluid self only trapped by the labels applied from without. And so even though Maloof argues that the book answers “the nagging question of who Vivian Maier really was” by revealing her true self through her self-portraits, what it really does — and what its greatest, most enchanting gift is — is take us along as silent companions on a complex woman’s journey of self-knowledge and creative exploration, a journey without a definitive destination but one that is its own reward.

It’s also, however, hopelessly human to try to interpret others and assign them into categories based on the “scraps of evidence” they bequeath. I was certainly not immune to this tendency, as I began to suspect Maier was a queer woman who found in her art a vehicle for connection, for belonging, for feeling at once a part of the society she documented and an onlooker forever separated by her lens. Because we know so little about Maier’s life, this remains nothing more than intuitive speculation — but one I find increasingly hard to dismiss as her self-portraits peel off another layer of guarded intimacy.

The beauty and magnetism of Vivian Maier: Self-Portraits is that it leaves you with your own interpretations, not with definitive answers but with crystalline awareness of Maier’s elusive selfhood.

Originally featured in November.

* * *

Catch up on all the year’s best-of reading lists here.

Donating = Loving

Bringing you (ad-free) Brain Pickings takes hundreds of hours each month. If you find any joy and stimulation here, please consider becoming a Supporting Member with a recurring monthly donation of your choosing, between a cup of tea and a good dinner:

You can also become a one-time patron with a single donation in any amount:
Brain Pickings has a free weekly newsletter. It comes out on Sundays and offers the week’s best articles. Here’s what to expect. Like? Sign up.

16 DECEMBER, 2013

Cats, Dogs, and the Human Condition: The Year’s Best Books about Pets and Animals

Artful cats, literary dogs, Bob Dylan, and a whole lot of non-human genius.

After the year’s best books in psychology and philosophy, art and design, history and biography, science and technology, and “children’s” (though we all know what that means), the season’s subjective selection of best-of reading lists continues with the year’s loveliest reads about our fellow non-human beings.

1. E. B. WHITE ON DOGS

Literary history brims with famous authors who adored their pets, and E. B. White — extraordinary essayist, celebrator of New York, champion of integrity, upholder of linguistic style — was chief among them. He had a dozen dogs over the course of his long life, including his beloved Scotty Daisy, who not only was the sole witness to White’s wedding to the love of his life but also “wrote” this utterly endearing letter to Katharine White on the occasion of her pregnancy. In E. B. White on Dogs (public library), Martha White, Elwyn’s granddaughter and literary executor, collects the beloved author’s finest letters, poems, sketches, and essays celebrating his canine companions.

In the introduction to the anthology, White’s granddaughter poignantly observes that her grandfather revealed so much of himself through his writing about his dogs, riffing on his poignant obituary for Daisy:

My grandfather also suffered from a chronic perplexity, I believe, and he spent his career trying to take hold of it, not infrequently through the literary device of his dogs.

In this particular case, it seems, Malcolm Gladwell was wrong in asserting, “Dogs are not about something else. Dogs are about dogs.” Dogs, for White, were about dogs, but also about how to be human.

Sample this fantastic collection with some of White’s gems here and here.

2. LOST CAT

“Dogs are not about something else. Dogs are about dogs,” Malcolm Gladwell asserted indignantly in the introduction to The Big New Yorker Book of Dogs. Though hailed as memetic rulers of the internet, cats have also enjoyed a long history as artistic and literary muses, but never have they been at once more about cats and more about something else than in Lost Cat: A True Story of Love, Desperation, and GPS Technology (public library) by firefighter-turned-writer Caroline Paul and illustrator extraordinaire Wendy MacNaughton, she of many wonderful collaborations — a tender, imaginative memoir infused with equal parts humor and humanity, also among the best biographies, memoirs, and history books of the year. Though “about” a cat, this heartwarming and heartbreaking tale is really about what it means to be human — about the osmosis of hollowing loneliness and profound attachment, the oscillation between boundless affection and paralyzing fear of abandonment, the unfair promise of loss implicit to every possibility of love.

After Caroline crashes an experimental plane she was piloting, she finds herself severely injured and spiraling into the depths of depression. It both helps and doesn’t that Caroline and Wendy have just fallen in love, soaring in the butterfly heights of new romance, “the phase of love that didn’t obey any known rules of physics,” until the crash pulls them into a place that would challenge even the most seasoned and grounded of relationships. And yet they persevere as Wendy patiently and lovingly takes care of Caroline.

When Caroline returns from the hospital with a shattered ankle, her two thirteen-year-old tabbies — the shy, anxious Tibby (short for Tibia, affectionately — and, in these circumstances, ironically — named after the shinbone) and the sociable, amicable Fibby (short for Fibula, after the calf bone on the lateral side of the tibia) — are, short of Wendy, her only joy and comfort:

Tibia and Fibula meowed happily when I arrived. They were undaunted by my ensuing stupor. In fact they were delighted; suddenly I had become a human who didn’t shout into a small rectangle of lights and plastic in her hand, peer at a computer, or get up and disappear from the vicinity, only to reappear through the front door hours later. Instead, I was completely available to them at all times. Amazed by their good luck, they took full feline advantage. They asked for ear scratches and chin rubs. They rubbed their whiskers along my face. They purred in response to my slurred, affectionate baby talk. But mostly they just settled in and went to sleep. Fibby snored into my neck. Tibby snored on the rug nearby. Meanwhile I lay awake, circling the deep dark hole of depression.

Without my cats, I would have fallen right in.

And then, one day, Tibby disappears.

Wendy and Caroline proceed to flyer the neighborhood, visit every animal shelter in the vicinity, and even, in their desperation, enlist the help of a psychic who specializes in lost pets — but to no avail. Heartbroken, they begin to mourn Tibby’s loss.

And then, one day five weeks later, Tibby reappears. But once the initial elation of the recovery has worn off, Caroline begins to wonder where he’d been and why he’d left. He is now no longer eating at home and regularly leaves the house for extended periods of time — Tibby clearly has a secret place he now returns to. Even more worrisomely, he’s no longer the shy, anxious tabby he’d been for thirteen years — instead, he’s a half pound heavier, chirpy, with “a youthful spring in his step.” But why would a happy cat abandon his loving lifelong companion and find comfort — find himself, even — elsewhere?

When the relief that my cat was safe began to fade, and the joy of his prone, snoring form — sprawled like an athlete after a celebratory night of boozing — started to wear thin, I was left with darker emotions. Confusion. Jealousy. Betrayal. I thought I’d known my cat of thirteen years. But that cat had been anxious and shy. This cat was a swashbuckling adventurer back from the high seas. What siren call could have lured him away? Was he still going to this gilded place, with its overflowing food bowls and endless treats?

There was only one obvious thing left to do: Track Tibby on his escapades. So Caroline, despite Wendy’s lovingly suppressed skepticism, heads to a spy store — yes, those exist — and purchases a real-time GPS tracker, complete with a camera that they program to take snapshots every few minutes, which they then attach to Tibby’s collar.

What follows is a wild, hilarious, and sweet tale of tinkering, tracking, and tenderness. Underpinning the obsessive quest is the subtle yet palpable subplot of Wendy and Caroline’s growing love for each other, the deepening of trust and affection that happens when two people share in a special kind of insanity.

“Every quest is a journey, every journey a story. Every story, in turn, has a moral,” writes Caroline in the final chapter, then offers several “possible morals” for the story, the last two of which embody everything that makes Lost Cat an absolute treat from cover to cover:

6. You can never know your cat. In fact, you can never know anyone as completely as you want.

7. But that’s okay, love is better.

Take a closer look here, then hear MacNaughton and Paul in conversation about combining creative collaboration with a romantic relationship.

3. DOG SONGS

Mary Oliver is not only one of the sagest and most beloved poets of our time, a recipient of a Pulitzer Prize and a National Book Award, but is also among literary history’s greatest pet-lovers. Dog Songs (public library) collects her most soul-stirring poems and short prose celebrating that special human-canine relationship and what it reveals about the meaning of our own lives — a beautiful manifestation of Oliver’s singular sieve for extracting from the particularities of the poetic subject the philosophical universalities of the human condition to illuminate what it means to live a good life, a full life, a life of purpose and presence.

Inhale, for instance, this:

LUKE

I had a dog
  who loved flowers.
    Briskly she went
        through the fields,

yet paused
  for the honeysuckle
    or the rose,
        her dark head

and her wet nose
  touching
    the face
         of every one

with its petals
  of silk,
    with its fragrance
         rising

into the air
  where the bees,
    their bodies
        heavy with pollen,

hovered—
  and easily
     she adored
        every blossom,

not in the serious,
  careful way
    that we choose
        this blossom or that blossom—

the way we praise or don’t praise—
  the way we love
     or don’t love—
        but the way

we long to be—
  that happy
    in the heaven of earth—
        that wild, that loving.

Amidst the poetic, there are also the necessary, playfully practical reminders of how dogs illustrate the limitations of our own sensory awareness:

A dog can never tell you what she knows from the smells of the world, but you know, watching her, that you know almost nothing.

Then there are the fictional — or are they? — conversations with Oliver’s dog Ricky, which brim with love and wisdom. In one, titled “Show Time,” they watch a dog show on TV and wince at the unfortunate, borderline abusive grooming the contestants have had to endure. Ricky exclaims:

“If I ever meet one of these dogs I’m going
to invite him to come here, where he can
be a proper dog.”

Okay, I said. But remember, you can’t fix
everything in the world for everybody.

“However,” said Ricky, “you can’t do
anything at all unless you begin. Haven’t
I heard you say that once or twice, or
maybe a hundred times?”

In another poem, Oliver affectionately acknowledges that innocent canine gift for employing a dog’s intellect toward his own self-gratification, as when he dupes both you and the other household human into feeding him breakfast:

Be prepared. A dog is adorable and noble. A dog is a true and loving friend. A dog is also a hedonist.

In a short prose piece, Oliver considers the wretched elephant in every dog-lover’s room:

Dogs die so soon. I have my stories of that grief, no doubt many of you do also. It is almost a failure of will, a failure of love, to let them grow old — or so it feels. We would do anything to keep them with us, and to keep them young. The one gift we cannot give.

One of her most poignant meditations strokes the heart of why dogs are so much more than the ornament Virginia Woolf’s nephew reduced them to. It comes in the collection’s concluding essay, emanating the loving-kindness of Buddhism and condensing that in the prism of the dog:

Because of the dog’s joyfulness, our own is increased. It is no small gift. It is not the least reason why we should honor as well as love the dog of our own life, and the dog down the street, and all the dogs not yet born. What would the world be like without music or rivers or the green and tender grass? What would this world be like without dogs?

LITTLE DOG’S RHAPSODY IN THE NIGHT

He puts his cheek against mine
and makes small, expressive sounds.
And when I’m awake, or awake enough

he turns upside down, his four paws
  in the air
and his eyes dark and fervent.

“Tell me you love me,” he says.

“Tell me again.”

Could there be a sweeter arrangement? Over and over
he gets to ask.
I get to tell.

But even more powerful is the other direction of that affirmative affection — the wholehearted devotion of dogs, who love us unconditionally and in the process teach us to love; in letting us see ourselves through their eyes, they help us believe what they see, believe that we are worthy of love, that we are love.

THE SWEETNESS OF DOGS

What do you say, Percy? I am thinking
of sitting out on the sand to watch
the moon rise. It’s full tonight.
So we go

and the moon rises, so beautiful it
makes me shudder, makes me think about
time and space, makes me take
measure of myself: one iota
pondering heaven. Thus we sit, myself

thinking how grateful I am for the moon’s
perfect beauty and also, oh! how rich
it is to love the world. Percy, meanwhile,
leans against me and gazes up
into my face. As though I were just as wonderful
as the perfect moon.

Ultimately, the closing verses of the poem “Percy Wakes Me” speak for the entire collection:

This is a poem about Percy.
This is a poem about more than Percy.
Think about it.

And oh how much more is Dog Songs about.

4. THE BIG NEW YORKER BOOK OF CATS

“Dogs are not about something else. Dogs are about dogs,” Malcolm Gladwell proclaimed in the introduction to The Big New Yorker Book of Dogs, one of the best art books of 2012 and among the finest pet-related books of all time. Cats, on the other hand — despite their long history as literary muses, poetic devices, creative catalysts, and targets of artful grievances — are largely about something else, about some facet or other of our human needs, desires, and conceits: our relationships, our cities, our grappling with mortality.

So bespeaks The Big New Yorker Book of Cats (public library), the highly anticipated feline sequel to last year’s canine edition, also among the best art and design books of the year — a shiny, well-fed tome that gathers the best cat-coddling articles, essays, short stories, poems, cartoons, covers, and other feats of literature and art from the New Yorker archives. Spanning nearly nine decades, the collection features contributions from such celebrated minds as John Updike, Margaret Atwood, James Thurber, Susan Orlean, and even the patron saint of “the other side,” famed dog-lover E. B. White.

In the foreword, the great New Yorker film critic Anthony Lane lays out the decrees of cat-connoisseurship:

The first rule of felinology: you need to learn to look at cats down to the last whisker, every bit as closely as they look at you. To them, remember, nothing is lost in the dark.

And another solemn dictum:

Serious cat people, like first-rate art critics, are chivvied by passion into perspicacity. Believing is seeing.

Lane considers the singular allure of using the feline psyche as literary fodder:

This will never be anything but challenging, even if you wear motorcycle gauntlets and a knight’s visor, but it remains a quest to which many writers are lured. Perhaps they view it as a kind of scratching post — a ready-made, abrasive chance to sharpen their natural skills.

Even Joyce, Lane tells us, was privy to it — in the fourth chapter of Ulysses, he tackled a “very specific quandary, the spelling of a cat’s ululation … and came up with the infinitesimal swell of ‘mkgnao’ into ‘mrkgnao.’” Lane illustrates the affectionate absurdity of it all with a tongue-in-cheek invitation: “Try both, out loud, but not after eating crackers, and see if you can tell them apart.”

More than anything, however, the anthology embodies the cat’s defining characteristic: its cluster of opposites, rolled together into a giant hairball of cultural attitudes — something, perhaps, at once uncomfortably and assuringly reflective of our own chronically conflicted selves. Lane writes:

So it is, as this well-fed book stretches out in languor, that the array of feline opposites starts to emerge. Cats must be destroyed; cats should be saved. Cats are like us; no, cats are not of this world. Cats can be savored for their fellowship, then eaten for their flesh. . . . Cats exist in these pages, as they do throughout our lives, both as obsessively singular … and as a barely controllable mass, doomed to proliferate forever, like poison ivy or biographies of Napoleon. Above all, for every cat who is liked, accepted or worshipped from afar, there is another who peers into our eyes — those hopeless orbs, superfluous at night — and spies only horror, indifference, and fear.

Indeed, despite the bountiful and often ardent cat-lovers among literary history’s famous pet-owners, Lane challenges the very notion that cats and literature go together:

Perhaps we need to rethink the assumption, deep-rooted but far from well grounded, that writers and cats are a good mix. Sure, Mark Twain had cats, such as Sour Mash and Blatherskite, and, up at the more louche and loping end of American literature, in the life and work of Poe, Kerouac, William Burroughs, Charles Bukowski, Edward Gorey, and Stephen King, you are never that far from the patter of ominous paws; whether a cat has been reared on a diet of neat Burroughs would find a niche at The New Yorker, however, is open to debate. We aim at the scrutable, the translucent, the undrugged, and the verified; whether we even get close is not for us to say, but such aspirations find no echo in the bosom of the cat. The cat sneers at clarity and career plans, and even its major stratagems can be dropped upon a whim. . . .

One of the best pieces in the collection, both for the sheer joy of exquisite language and for its disarming insight into the baffling paradoxes of the human-feline psychic bond, is a long 2002 feature by Susan Orlean, titled “The Lady and the Tigers.” Beyond the undeniable freakshow mesmerism of a true story about a New Jersey woman who owns more than two dozen tigers for no other reason than her intense love for the species, the essay, much like good visual caricature, also reveals a whole lot about the psychology of our ordinary relationships with small domestic cats through this woman’s extraordinary relationship with her gigantic felines. Take, for instance, the evolution of the woman’s tiger menagerie:

After arriving in Jackson, Byron-Marasek got six more tigers — Bengal, Hassan, Madras, Marco, Royal, and Kizmet — from McMillan and from Ringling Brothers. The next batch — Kirin, Kopan, Bali, Brunei, Brahman, and Burma — were born in the back yard after Byron-Marasek allowed her male and female tigers to commingle. More cubs were born, and more tigers obtained, and the tiger population of Holmeson’s Corner steadily increased. Byron-Marasek called her operation the Tigers Only Preservation Society. Its stated mission was, among other things, to conserve all tiger species, to return captive tigers to the wild, and “to resolve the human/tiger conflict and create a resolution.”

And so we get the perfect Orleanean spear at the heart of the human condition in all its absurdity:

You know how it is — you start with one tiger, then you get another and another, then a few are born and a few die, and you start to lose track of details like exactly how many tigers you actually have.

Tucked between the essays and short stories are also a number of delightful poems, such as this 1960 gem by Ted Hughes:

TOMCATS

Daylong this tomcat lies stretched flat
As an old rough mat, no mouth and no eyes.
Continual wars and wives are what
Have tattered his ears and battered his head.

Like a bundle of old rope and iron
Sleeps till blue dusk. Then reappear
His eyes, green as ringstones: he yawns wide red,
Fangs fine as a lady’s needle and bright.

A tomcat sprang at a mounted knight,
Locked round his neck like a trap of hooks
While the knight rode fighting its clawing and bite.
After hundreds of years the stain’s there

On the stone where he fell, dead of the tom:
That was at Barnborough. The tomcat still
Grallochs odd dogs on the quiet,
Will take the head clean off your simple pullet.

Is unkillable. From the dog’s fury,
From gunshot fired point-blank he brings
His skin whole, and whole
From owlish moons of bekittenings

Among ashcans. He leaps and lightly
Walks upon sleep, his mind on the moon
Nightly over the round world of men
Over the roofs go his eyes and outcry.

(The poem was penned the year Frieda, his daughter with Sylvia Plath, was born — a child nursed on nursery rhymes — so one can’t help but find in Hughes’s playful verses the hint of an irreverent nursery rhyme.)

In his 1992 piece “Cat Man,” George Steiner tells the story of “the most illustrious, compelling cat in the history of literature” — a Montparnasse tabby named Bébert, who was abandoned by his Germany-bound owners at the onset of WWII and met his second owner, the novelist, physician and “manic crank” Louis-Ferdinand Destouches, better known as Céline, in Paris. Bébert promptly proceeded to enthrall the man into describing him as “magic itself, tact by wavelength.” When the cat’s time came in his Sphinx-like years at the end of 1952, the obituary Destouches wrote — rivaled only by E. B. White’s remembrance of his beloved dog Daisy — was nothing short of a literary micro-masterpiece:

After many an adventure, jail, bivouac, ashes, all of Europe … he died agile and graceful, impeccably, he had jumped out the window that very morning. . . . We, who are born old, look ridiculous in comparison!

Perhaps the most recurring theme of all, however, is the concept of the cat not as an extension of the human self, as a dog might be, but rather as something otherworldly, mysterious, with a mind of its own onto which we may project our human intentions and interpretations, but one which we will ultimately never comprehend — a force of nature, often as uncontrollable as its elements, as in this 1960 poem by Elizabeth Bishop:

ELECTRICAL STORM

Dawn an unsympathetic yellow.
Cra-aack! — dry and light.
The house was really struck.
Crack! A tinny sound, like a dropped tumbler.
Tobias jumped in the window, got in bed –
silent, his eyes bleached white, his fur on end.
Personal and spiteful as a neighbor’s child,
thunder began to bang and bump the roof.
One pink flash;
then hail, the biggest size of artificial pearls.
Dead-white, wax-white, cold –
diplomats’ wives’ favors
from an old moon party –
they lay in melting windrows
on the red ground until well after sunrise.
We got up to find the wiring fused,
no lights, a smell of saltpetre,
and the telephone dead.

The cat stayed in the warm sheets.
The Lent trees had shed all their petals:
wet, stuck, purple, among the dead-eye pearls.

Originally featured in October, with lots more art and excerpts.

5. WILD ONES

Wild Ones: A Sometimes Dismaying, Weirdly Reassuring Story About Looking at People Looking at Animals in America (public library) by journalist Jon Mooallem, also among the best science and technology books of the year, isn’t the typical story designed to make us better by making us feel bad, to scare us into behaving, into environmental empathy; Mooallem’s is not the self-righteous tone of capital-K knowing typical of many environmental activists but the scientist’s disposition of not-knowing, the poet’s penchant for “negative capability.” Rather than ready-bake answers, he offers instead directions of thought and signposts for curiosity and, in the process, somehow gently moves us a little bit closer to our better selves, to a deep sense of, as poet Diane Ackerman beautifully put it in 1974, “the plain everythingness of everything, in cahoots with the everythingness of everything else.”

In the introduction, Mooallem recalls looking at his four-year-old daughter Isla’s menagerie of stuffed animals and the odd cultural disconnect they mime:

[T]hey were foraging on the pages of every bedtime story, and my daughter was sleeping in polar bear pajamas under a butterfly mobile with a downy snow owl clutched to her chin. Her comb handle was a fish. Her toothbrush handle was a whale. She cut her first tooth on a rubber giraffe.

Our world is different, zoologically speaking — less straightforward and more grisly. We are living in the eye of a great storm of extinction, on a planet hemorrhaging living things so fast that half of its nine million species could be gone by the end of the century. At my place, the teddy bears and giggling penguins kept coming. But I didn’t realize the lengths to which humankind now has to go to keep some semblance of actual wildlife in the world. As our own species has taken over, we’ve tried to retain space for at least some of the others being pushed aside, shoring up their chances of survival. But the threats against them keep multiplying and escalating. Gradually, America’s management of its wild animals has evolved, or maybe devolved, into a surreal kind of performance art.

Yet even conservationists’ small successes — crocodile species bouncing back from the brink of extinction, peregrine falcons filling the skies once again — even these pride points demonstrate the degree to which we’ve assumed — usurped, even — a puppeteer role in the theater of organic life. Citing a scientist who lamented that “right now, nature is unable to stand on its own,” Mooallem writes:

We’ve entered what some scientists are calling the Anthropocene — a new geologic epoch in which human activity, more than any other force, steers change on the planet. Just as we’re now causing the vast majority of extinctions, the vast majority of endangered species will only survive if we keep actively rigging the world around them in their favor. … We are gardening the wilderness. The line between conservation and domestication has blurred.

He finds himself uncomfortably straddling these two animal worlds — the idyllic little-kid’s dreamland and the messy, fragile ecosystem of the real world:

Once I started looking around, I noticed the same kind of secondhand fauna that surrounds my daughter embellishing the grown-up world, too — not just the conspicuous bald eagle on flagpoles and currency, or the big-cat and raptor names we give sports teams and computer operating systems, but the whale inexplicably breaching in the life-insurance commercial, the glass dolphin dangling from a rearview mirror, the owl sitting on the rump of a wild boar silk-screened on a hipster’s tote bag. I spotted wolf after wolf airbrushed on the sides of old vans, and another wolf, painted against a full moon on purple velvet, greeting me over the toilet in a Mexican restaurant bathroom. … [But] maybe we never outgrow the imaginary animal kingdom of childhood. Maybe it’s the one we are trying to save.

[…]

From the very beginning, America’s wild animals have inhabited the terrain of our imagination just as much as they’ve inhabited the actual land. They are free-roaming Rorschachs, and we are free to spin whatever stories we want about them. The wild animals always have no comment.

So he sets out to better understand the dynamics of the cultural forces that pull these worlds together with shared abstractions and rip them apart with the brutal realities of environmental collapse. His quest, in which little Isla is a frequent companion, sends him on the trails of three endangered species — a bear, a butterfly, and a bird — which fall on three different points on the spectrum of conservation reliance, relying to various degrees on the mercy of the very humans who first disrupted “the machinery of their wildness.” On the way, he encounters a remarkably vibrant cast of characters — countless passionate citizen scientists, a professional theater actor who, after an HIV diagnosis, became a professional butterfly enthusiast, and even Martha Stewart — and finds in their relationship with the environment “the same creeping disquiet about the future” that Mooallem himself came to know when he became a father. In fact, the entire project was inextricably linked to his sense of fatherly responsibility:

I’m part of a generation that seems especially resigned to watching things we encountered in childhood disappear: landline telephones, newspapers, fossil fuels. But leaving your kids a world without wild animals feels like a special tragedy, even if it’s hard to rationalize why it should.

The truth is that most of us will never experience the Earth’s endangered animals as anything more than beautiful ideas. They are figments of our shared imagination, recognizable from TV, but stalking places — places out there — to which we have no intention of going. I wondered how that imaginative connection to wildlife might fray or recalibrate as we’re forced to take more responsibility for its wildness.

It also occurred to me early on that all three endangered species I was getting to know could be gone by the time Isla is my age. It’s possible that, thirty years from now, they’ll have receded into the realm of dinosaurs, or the realm of Pokémon, for that matter — fantastical creatures whose names and diets little kids memorize from books. And it’s possible, too, I realized, that it might not even make a difference, that there would still be polar bears on footsy pajamas and sea turtle-shaped gummy vitamins — that there could be so much actual destruction without ever meaningfully upsetting the ecosystems in our minds.

Originally featured in May — read more here.

6. THE GENIUS OF DOGS

For much of modern history, dogs have inspired a wealth of art and literature, profound philosophical meditations, scientific curiosity, deeply personal letters, photographic admiration, and even some cutting-edge data visualization. But what is it that makes dogs so special in and of themselves, and so dear to us?

Despite the mind-numbing title, The Genius of Dogs: How Dogs Are Smarter than You Think (public library; UK) by Brian Hare, evolutionary anthropologist and founder of the Duke Canine Cognition Center, and Vanessa Woods offers a fascinating tour of radical research on canine cognition, from how the self-domestication of dogs gave them a new kind of social intelligence to what the minds of dogs reveal about our own. In fact, one of the most compelling parts of the book has less to do with dogs and more with genius itself.

In examining the definition of genius, Hare echoes British novelist Amelia E. Barr, who wisely noted in 1901 that “genius is nothing more nor less than doing well what anyone can do badly.” Hare points out that standardized tests provide a very narrow — and thus poor — definition of genius:

As you probably remember, tests such as IQ tests, GREs, and SATs focus on basic skills like reading, writing, and analytical abilities. The tests are favored because on average, they predict scholastic success. But they do not measure the full capabilities of each person. They do not explain Ted Turner, Ralph Lauren, Bill Gates, and Mark Zuckerberg, who all dropped out of college and became billionaires.

Instead, Hare offers a conception of genius that borrows from Howard Gardner’s seminal 1983 theory of multiple intelligences:

A cognitive approach is about celebrating different kinds of intelligence. Genius means that someone can be gifted with one type of cognition while being average or below average in another.

For a perfect example, Hare points to animal scientist Temple Grandin:

Temple Grandin, at Colorado State University, is autistic yet is also the author of several books, including Animals Make Us Human, and has done more for animal welfare than almost anyone. Although Grandin struggles to read people’s emotions and social cues, her extraordinary understanding of animals has allowed her to reduce the stress of millions of farm animals.

The cognitive revolution changed the way we think about intelligence. It began in the decade that all social revolutions seemed to have happened, the sixties. Rapid advances in computer technology allowed scientists to think differently about the brain and how it solves problems. Instead of the brain being either more or less full of intelligence, like a glass of wine, the brain is more like a computer, where different parts work together. USB ports, keyboards, and modems bring in new information from the environment; a processor helps digest and alter the information into a usable format, while a hard drive stores important information for later use. Neuroscientists realized that, like a computer, many parts of the brain are specialized for solving different types of problems.

An example of this comes from the study of memory, which we already know is fascinating in its fallibility:

One of the best-studied cognitive abilities is memory. In fact, we usually think of geniuses as people who have an extraordinary memory for facts and figures, since such people often score off the charts on IQ tests. But just as there are different types of intelligence, there are different types of memory. There is memory for events, faces, navigation, things that occurred recently or long ago — the list goes on. If you have a good memory in one of these areas, it does not necessarily mean your other types of memory are equally good.

Ultimately, the notion of multiple intelligences is what informs the research on dog cognition:

There are many definitions of intelligence competing for attention in popular culture. But the definition that has guided my research and that applies throughout the book is a very simple one. The genius of dogs — of all animals, for that matter, including humans — has two criteria:

  1. A mental skill that is strong compared with others, either within your own species or in closely related species.
  2. The ability to spontaneously make inferences.

(This second criterion comes strikingly close to famous definitions of creativity.)

Originally featured in February.


7. ANIMAL WISE

Most people who have observed animals even briefly wouldn’t question their emotional lives and their thriving inner worlds. While anthropomorphic animal tales have populated storytelling for as long as humanity has existed, in Animal Wise: The Thoughts and Emotions of Our Fellow Creatures (public library) — part of my collaboration with The New York Public Library — science writer Virginia Morell takes us on an unprecedented tour of laboratories around the world and explores the work of pioneering animal cognition researchers to reveal the scientific basis for our basic intuition about what goes on in the hearts and minds of our fellow beings, from the laughter of rats to the intellectual curiosity of dolphins.

Observing her puppy Quincy invent a game, Morell contextualizes how far the study of animal sentience has come in the last half-century:

Why was I surprised when our pup invented a game? I think because at that time, in the late 1980s — not so very long ago — scientists were still stuck on the question “Do animals have minds?” A cautious search was under way for the answer, and the researchers’ caution had spilled over to society at large. In those days, if you suggested that dogs had imaginations or that rats laughed or had some degree of empathy for another’s pain, certain other people (and not just scientists) were likely to sneer at you and accuse you of being sentimental and of anthropomorphizing — interpreting an animal’s behavior as if the creature were a human dressed up in furs or feathers.

She goes on to explore how modern science has illuminated such marvels as how birds think, how an elephant’s memory works, how ants learn, and what goes on in the imagination of dolphins.

8. IF DOGS RUN FREE

As a lover of canine-centric literature and art, an aficionado of lesser-known children’s books by luminaries of grown-up culture — including gems by Mark Twain, Maya Angelou, James Joyce, Sylvia Plath, William Faulkner, Virginia Woolf, Gertrude Stein, Anne Sexton, T. S. Eliot, and John Updike — and a previous admirer of Bob Dylan’s music adapted in picturebook form, I was thrilled for the release of If Dogs Run Free (public library) — an utterly delightful adaptation of the beloved 1970 Dylan song from the album New Morning by celebrated illustrator Scott Campbell.

Originally featured in September.

BONUS: A CAT-HATER’S HANDBOOK

“If you want to concentrate deeply on some problem, and especially some piece of writing or paper-work,” Muriel Spark advised, “you should acquire a cat.” But while felines may have found their way into Joyce’s children’s books, Indian folk art, and Hemingway’s heart, their cultural status is quite different from that of dogs, which are in turn celebrated as literary muses, scientific heroes, philosophical stimuli, cartographic data points, and unabashed geniuses. In fact, there might even be a thriving subculture of militant anti-felinists — or so suggests A Cat-Hater’s Handbook (public library), a vintage gem by William Cole and beloved children’s book illustrator Tomi Ungerer, originally conceived in 1963, but not published until 1982. Since I chanced upon a surviving copy this year, and since it’s so impossibly wonderful, I’m throwing it in as a bonus pick.

The back cover boasts:

What’s so cute about an animal that loves absolutely nothing, makes your house smell terrible, and has a brain the size of an under-developed kidney bean? At last, a book that dares to answer these and other feline questions with the sane and sensible answer:

Not a damned thing!

Also included is a selection of “scathing anti-feline poetry and prose” from the likes of William Faulkner, Mark Twain, and Shel Silverstein.

Cole writes in the introductory pages:

Ailurophobia is, dictionarily speaking, a fear of cats. But words have a way of gradually sliding their meanings into something else, and ailurophobia is now accepted as meaning a strong dislike of the animals. Ailurophobes abound. Quiet cat-haters are everywhere. Often, a casual remark that I was doing anti-cat research would bring sparkle to the eyes of strangers. Firm bonds of friendship were immediately established. Mute lips were unsealed, and a delightful flow of long-repressed invective transpired. It was heart warming to find that what I thought would be a lonely crusade is truly a great popular cause.

What you’ll find, of course, is that underpinning Ungerer’s delightfully irreverent illustrations and Cole’s subversive writing is self-derision rather than cat-derision, as this cat-hater’s handbook reveals itself to be a cat-lover’s self-conscious and defiant love letter to the messy, unruly, all-consuming, but ultimately deeply fulfilling relationship with one’s loyal feline friend.

The intelligence of cats is a subject that arouses the cat-lover to fever pitch. Of course, there are all kinds of intelligences; the intelligence of a dolphin, for example, is particularly dolphinesque — it is suited to his surroundings and must be equated in those terms. Scientists balk at making comparative statements about animal intelligence. I spoke to one at the American Museum of Natural History who said that ‘a general judgement, from the literature, would put the intelligence of cats below dogs and above rats.’ (Which is the right place for them, anyway.)

On average, each suburban or country cat will kill 10 to 50 birds a year.

A Cat-Hater’s Handbook is, sadly, out of print, but used copies still abound online and are possibly available at your local public library.


10 DECEMBER, 2013

The 13 Best Science and Technology Books of 2013


The wonders of the gut, why our brains are wired to be social, what poetry and math have in common, swarm intelligence vs. “God,” and more.

On the heels of the year’s best reads in psychology and philosophy, art and design, history and biography, and children’s books, the season’s subjective selection of best-of reading lists continues with the finest science and technology books of 2013. (For more timeless stimulation, revisit the selections for 2012 and 2011.)

1. THIS EXPLAINS EVERYTHING

Every year since 1998, intellectual impresario and Edge editor John Brockman has been posing a single grand question to some of our time’s greatest thinkers across a wide spectrum of disciplines, then collecting the answers in an annual anthology. Last year’s answers to the question “What scientific concept will improve everybody’s cognitive toolkit?” were released in This Will Make You Smarter: New Scientific Concepts to Improve Your Thinking, one of the year’s best psychology and philosophy books.

In 2012, the question Brockman posed, proposed by none other than Steven Pinker, was “What is your favorite deep, elegant, or beautiful explanation?” The answers, representing an eclectic mix of 192 (alas, overwhelmingly male) minds spanning psychology, quantum physics, social science, political theory, philosophy, and more, are collected in the edited compendium This Explains Everything: Deep, Beautiful, and Elegant Theories of How the World Works (UK; public library) and are also available online.

In the introduction preceding the micro-essays, Brockman frames the question and its ultimate objective, adding to history’s most timeless definitions of science:

The ideas presented on Edge are speculative; they represent the frontiers in such areas as evolutionary biology, genetics, computer science, neurophysiology, psychology, cosmology, and physics. Emerging out of these contributions is a new natural philosophy, new ways of understanding physical systems, new ways of thinking that call into question many of our basic assumptions.

[…]

Perhaps the greatest pleasure in science comes from theories that derive the solution to some deep puzzle from a small set of simple principles in a surprising way. These explanations are called ‘beautiful’ or ‘elegant.’

[…]

The contributions presented here embrace scientific thinking in the broadest sense: as the most reliable way of gaining knowledge about anything — including such fields of inquiry as philosophy, mathematics, economics, history, language, and human behavior. The common thread is that a simple and nonobvious idea is proposed as the explanation of a diverse and complicated set of phenomena.

Puffer fish with Akule by photographer Wayne Levin.

Stanford neuroscientist Robert Sapolsky, eloquent as ever, marvels at the wisdom of the crowd and the emergence of swarm intelligence:

Observe a single ant, and it doesn’t make much sense, walking in one direction, suddenly careening in another for no obvious reason, doubling back on itself. Thoroughly unpredictable.

The same happens with two ants, a handful of ants. But a colony of ants makes fantastic sense. Specialized jobs, efficient means of exploiting new food sources, complex underground nests with temperature regulated within a few degrees. And critically, there’s no blueprint or central source of command—each individual ant has algorithms for its behaviors. But this is not wisdom of the crowd, where a bunch of reasonably informed individuals outperform a single expert. The ants aren’t reasonably informed about the big picture. Instead, the behavior algorithms of each ant consist of a few simple rules for interacting with the local environment and local ants. And out of this emerges a highly efficient colony.

Ant colonies excel at generating trails that connect locations in the shortest possible way, accomplished with simple rules about when to lay down a pheromone trail and what to do when encountering someone else’s trail—approximations of optimal solutions to the Traveling Salesman problem. This has useful applications. In “ant-based routing,” simulations using virtual ants with similar rules can generate optimal ways of connecting the nodes in a network, something of great interest to telecommunications companies. It applies to the developing brain, which must wire up vast numbers of neurons with vaster numbers of connections without constructing millions of miles of connecting axons. And migrating fetal neurons generate an efficient solution with a different version of ant-based routing.

A wonderful example is how local rules about attraction and repulsion (i.e., positive and negative charges) allow simple molecules in an organic soup to occasionally form more complex ones. Life may have originated this way without the requirement of bolts of lightning to catalyze the formation of complex molecules.

And why is self-organization so beautiful to my atheistic self? Because if complex, adaptive systems don’t require a blueprint, they don’t require a blueprint maker. If they don’t require lightning bolts, they don’t require Someone hurling lightning bolts.
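The “ant-based routing” Sapolsky describes can be made concrete with a toy simulation. The sketch below is purely illustrative — a minimal ant-colony loop under assumed rules, with `ant_shortest_path` and all its parameters invented for this example, not drawn from any system Sapolsky cites. Each virtual ant follows only local rules, yet shorter completed routes accumulate more pheromone and the colony converges on the shortest path with no central planner:

```python
import random

def ant_shortest_path(graph, start, goal, n_ants=50, n_rounds=30,
                      evaporation=0.5, seed=0):
    """Toy ant-based search for a short start -> goal route.

    graph maps each node to {neighbor: edge_length}. Each virtual ant
    walks probabilistically, favoring short edges that carry more
    pheromone; completed walks deposit pheromone in proportion to how
    short they were, and all trails evaporate between rounds.
    """
    rng = random.Random(seed)
    pheromone = {(u, v): 1.0 for u in graph for v in graph[u]}
    best_path, best_len = None, float("inf")

    for _ in range(n_rounds):
        walks = []
        for _ in range(n_ants):
            node, path, seen = start, [start], {start}
            while node != goal:
                options = [v for v in graph[node] if v not in seen]
                if not options:          # dead end: this ant gives up
                    path = None
                    break
                weights = [pheromone[(node, v)] / graph[node][v]
                           for v in options]
                node = rng.choices(options, weights)[0]
                path.append(node)
                seen.add(node)
            if path:
                length = sum(graph[a][b] for a, b in zip(path, path[1:]))
                walks.append((path, length))
                if length < best_len:
                    best_path, best_len = path, length
        for edge in pheromone:           # trails evaporate each round
            pheromone[edge] *= 1 - evaporation
        for path, length in walks:       # shorter walks reinforce more
            for a, b in zip(path, path[1:]):
                pheromone[(a, b)] += 1.0 / length
    return best_path, best_len

# A four-node network: the short way from A to D is A -> B -> D (length 3).
graph = {
    "A": {"B": 1, "C": 2},
    "B": {"A": 1, "D": 2},
    "C": {"A": 2, "D": 2},
    "D": {"B": 2, "C": 2},
}
```

On the sample graph, `ant_shortest_path(graph, "A", "D")` settles on the route A → B → D, even though no individual ant ever compares the two candidate routes directly.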

Developmental psychologist Howard Gardner, who famously developed the seminal theory of multiple intelligences, echoes Anaïs Nin in advocating for the role of the individual and Susan Sontag in stressing the impact of individual acts on collective fate. His answer, arguing for the importance of human beings, comes as a welcome antidote to a question that suffers the danger of being inherently reductionist:

In a planet occupied now by seven billion inhabitants, I am amazed by the difference that one human being can make. Think of classical music without Mozart or Stravinsky; of painting without Caravaggio, Picasso or Pollock; of drama without Shakespeare or Beckett. Think of the incredible contributions of Michelangelo or Leonardo, or, in recent times, the outpouring of deep feeling at the death of Steve Jobs (or, for that matter, Michael Jackson or Princess Diana). Think of human values in the absence of Moses or Christ.

[…]

Despite the laudatory efforts of scientists to ferret out patterns in human behavior, I continue to be struck by the impact of single individuals, or of small groups, working against the odds. As scholars, we cannot and should not sweep these instances under the investigative rug. We should bear in mind anthropologist Margaret Mead’s famous injunction: ‘Never doubt that a small group of thoughtful committed citizens can change the world. It is the only thing that ever has.’

Uber-curator Hans Ulrich Obrist, who also contributed to last year’s volume, considers the parallel role of patterns and chance in the works of iconic composer John Cage and painter Gerhard Richter, and the role of uncertainty in the creative process:

In art, the title of a work can often be its first explanation. And in this context I am thinking especially of the titles of Gerhard Richter. In 2006, when I visited Richter in his studio in Cologne, he had just finished a group of six corresponding abstract paintings which he gave the title Cage.

There are many relations between Richter’s painting and the compositions of John Cage. In a book about the Cage series, Robert Storr has traced them from Richter’s attendance of a Cage performance at the Festum Fluxorum Fluxus in Düsseldorf in 1963 to analogies in their artistic processes. Cage has often applied chance procedures in his compositions, notably with the use of the I Ching. Richter in his abstract paintings also intentionally allows effects of chance. In these paintings, he applies the oil paint on the canvas by means of a large squeegee. He selects the colors on the squeegee, but the factual trace that the paint leaves on the canvas is to a large extent the outcome of chance.

[…]

Richter’s concise title, Cage, can be unfolded into an extensive interpretation of these abstract paintings (and of other works)—but, one can say, the short form already contains everything. The title, like an explanation of a phenomenon, unlocks the works, describing their relation to one of the most important cultural figures of the twentieth century, John Cage, who shares with Richter the great themes of chance and uncertainty.

Writer, artist, and designer Douglas Coupland, whose biography of Marshall McLuhan remains indispensable, offers a lyrical meditation on the peculiar odds behind coincidences and déjà vus:

I take comfort in the fact that there are two human moments that seem to be doled out equally and democratically within the human condition—and that there is no satisfying ultimate explanation for either. One is coincidence, the other is déjà vu. It doesn’t matter if you’re Queen Elizabeth, one of the thirty-three miners rescued in Chile, a South Korean housewife or a migrant herder in Zimbabwe—in the span of 365 days you will pretty much have two déjà vus as well as one coincidence that makes you stop and say, “Wow, that was a coincidence.”

The thing about coincidence is that when you imagine the umpteen trillions of coincidences that can happen at any given moment, the fact is, that in practice, coincidences almost never do occur. Coincidences are actually so rare that when they do occur they are, in fact, memorable. This suggests to me that the universe is designed to ward off coincidence whenever possible—the universe hates coincidence—I don’t know why—it just seems to be true. So when a coincidence happens, that coincidence had to work awfully hard to escape the system. There’s a message there. What is it? Look. Look harder. Mathematicians perhaps have a theorem for this, and if they do, it might, by default, be a theorem for something larger than what they think it is.

What’s both eerie and interesting to me about déjà vus is that they occur almost like metronomes throughout our lives, about one every six months, a poetic timekeeping device that, at the very least, reminds us we are alive. I can safely assume that my thirteen-year-old niece, Stephen Hawking and someone working in a Beijing luggage-making factory each experience two déjà vus a year. Not one. Not three. Two.

The underlying biodynamics of déjà vus is probably ascribable to some sort of tingling neurons in a certain part of the brain, yet this doesn’t tell us why they exist. They seem to me to be a signal from a larger point of view that wants to remind us that our lives are distinct, that they have meaning, and that they occur throughout a span of time. We are important, and what makes us valuable to the universe is our sentience and our curse and blessing of perpetual self-awareness.

Originally featured in January — read more here.

2. YOU ARE STARDUST

“Everyone you know, everyone you ever heard of, every human being who ever was … lived there — on a mote of dust suspended in a sunbeam,” Carl Sagan famously marveled in his poetic Pale Blue Dot monologue, titled after the iconic 1990 photograph of Earth. The stardust metaphor for our interconnection with the cosmos soon permeated popular culture and became a vehicle for the allure of space exploration. There’s something at once incredibly empowering and incredibly humbling in knowing that the flame in your fireplace came from the sun.

That’s precisely the kind of cosmic awe environmental writer Elin Kelsey and Toronto-based Korean artist Soyeon Kim seek to inspire in kids in You Are Stardust (public library) — an exquisite picture-book that instills that profound sense of connection with the natural world, and also among the best children’s books of the year. Underpinning the narrative is a bold sense of optimism — a refreshing antidote to the fear-appeal strategy plaguing most environmental messages today.

Kim’s breathtaking dioramas, to which this screen does absolutely no justice, mix tactile physical materials with fine drawing techniques and digital compositing to illuminate the relentlessly wondrous realities of our intertwined existence: The water in your sink once quenched the thirst of dinosaurs; with every sneeze, wind blasts out of your nose faster than a cheetah’s sprint; the electricity that powers every thought in your brain is stronger than lightning.

But rather than dry science trivia, the message is carried on the wings of poetic admiration for these intricate relationships:

Be still. Listen.

Like you, the Earth breathes.

Your breath is alive with the promise of flowers.

Each time you blow a kiss to the world, you spread pollen that might grow to be a new plant.

The book is nonetheless grounded in real science. Kelsey notes:

I wrote this book as a celebration — one to honor the extraordinary ways in which all of us simply are nature. Every example in this book is backed by current science. Every day, for instance, you breathe in more than a million pollen grains.

But what makes the project particularly exciting is that, in the face of the devastating gender gap in science education, here is a thoughtful, beautiful piece of early science education presented by two women, the most heartening such example since Lauren Redniss’s Radioactive.

A companion iPad app features sound effects, animation, an original score by Paul Aucoin, behind-the-scenes glimpses of Kim’s process in creating her stunning 3D dioramas, and even build-your-own-diorama adventures.

Originally featured in March — see more here.

3. ON LOOKING

“How we spend our days,” Annie Dillard wrote in her timelessly beautiful meditation on presence over productivity, “is, of course, how we spend our lives.” And nowhere do we fail at the art of presence more miserably and more tragically than in urban life — in the city, high on the cult of productivity, where we float past each other, past the buildings and trees and the little boy in the purple pants, past life itself, cut off from the breathing of the world by iPhone earbuds and solipsism. And yet: “The art of seeing has to be learned,” Marguerite Duras reverberates — and it can be learned, as cognitive scientist Alexandra Horowitz invites us to believe in her breathlessly wonderful On Looking: Eleven Walks with Expert Eyes (public library), also among the best psychology and philosophy books of the year — a record of her quest to walk around a city block with eleven different “experts,” from an artist to a geologist to a dog, and emerge with fresh eyes mesmerized by the previously unseen fascinations of a familiar world. It is undoubtedly one of the most stimulating books of the year, if not the decade, and the most enchanting thing I’ve read in ages. In a way, it’s the opposite but equally delightful mirror image of Christoph Niemann’s Abstract City — a concrete, immersive examination of urbanity — blending the mindfulness of Sherlock Holmes with the expansive sensitivity of Thoreau.

Horowitz begins by pointing our attention to the incompleteness of our experience of what we conveniently call “reality”:

Right now, you are missing the vast majority of what is happening around you. You are missing the events unfolding in your body, in the distance, and right in front of you.

By marshaling your attention to these words, helpfully framed in a distinct border of white, you are ignoring an unthinkably large amount of information that continues to bombard all of your senses: the hum of the fluorescent lights, the ambient noise in a large room, the places your chair presses against your legs or back, your tongue touching the roof of your mouth, the tension you are holding in your shoulders or jaw, the map of the cool and warm places on your body, the constant hum of traffic or a distant lawn-mower, the blurred view of your own shoulders and torso in your peripheral vision, a chirp of a bug or whine of a kitchen appliance.

This adaptive ignorance, she argues, is there for a reason — we celebrate it as “concentration” and welcome its way of easing our cognitive overload by allowing us to conserve our precious mental resources only for the stimuli of immediate and vital importance, and to dismiss or entirely miss all else. (“Attention is an intentional, unapologetic discriminator,” Horowitz tells us. “It asks what is relevant right now, and gears us up to notice only that.”) But while this might make us more efficient in our goal-oriented day-to-day, it also makes us inhabit a largely unlived — and unremembered — life, day in and day out.

For Horowitz, the awakening to this incredible, invisible backdrop of life came thanks to Pumpernickel, her “curly haired, sage mixed breed” (who also inspired Horowitz’s first book, the excellent Inside of a Dog: What Dogs See, Smell, and Know), as she found herself taking countless walks around the block, becoming more and more aware of the dramatically different experiences she and her canine companion were having along the exact same route:

Minor clashes between my dog’s preferences as to where and how a walk should proceed and my own indicated that I was experiencing almost an entirely different block than my dog. I was paying so little attention to most of what was right before us that I had become a sleepwalker on the sidewalk. What I saw and attended to was exactly what I expected to see; what my dog showed me was that my attention invited along attention’s companion: inattention to everything else.

The book was her answer to the disconnect, an effort to “attend to that inattention.” It is not, she warns us, “about how to bring more focus to your reading of Tolstoy or how to listen more carefully to your spouse.” Rather, it is an invitation to the art of observation:

Together, we became investigators of the ordinary, considering the block — the street and everything on it — as a living being that could be observed.

In this way, the familiar becomes unfamiliar, and the old the new.

Her approach is based on two osmotic human tendencies: our shared capacity to truly see what is in front of us, despite our conditioned concentration that obscures it, and the power of individual bias in perception — or what we call “expertise,” acquired by passion or training or both — in bringing attention to elements that elude the rest of us. What follows is a whirlwind of endlessly captivating exercises in attentive bias as Horowitz, with her archetypal New Yorker’s “special fascination with the humming life-form that is an urban street,” and her diverse companions take to the city.

First, she takes a walk all by herself, trying to note everything observable, and we quickly realize that besides her deliciously ravenous intellectual curiosity, Horowitz is a rare magician with language. (“The walkers trod silently; the dogs said nothing. The only sound was the hum of air conditioners,” she beholds her own block; passing a pile of trash bags graced by a stray Q-tip, she ponders parenthetically, “how does a Q-tip escape?”; turning her final corner, she gazes at the entrance of a mansion and “its pair of stone lions waiting patiently for royalty that never arrives.” Stunning.)

But as soon as she joins her experts, Horowitz is faced with the grimacing awareness that despite her best, most Sherlockian efforts, she was “missing pretty much everything.” She arrives at a newfound, profound understanding of what William James meant when he wrote, “My experience is what I agree to attend to. Only those items which I notice shape my mind”:

I would find myself at once alarmed, delighted, and humbled at the limitations of my ordinary looking. My consolation is that this deficiency of mine is quite human. We see, but we do not see: we use our eyes, but our gaze is glancing, frivolously considering its object. We see the signs, but not their meanings. We are not blinded, but we have blinders.

Originally featured in August, with a closer look at the expert insights. For another peek at this gem, which is easily among my top three favorite books of the past decade, learn how to do the step-and-slide.

4. WILD ONES

Wild Ones: A Sometimes Dismaying, Weirdly Reassuring Story About Looking at People Looking at Animals in America (public library) by journalist Jon Mooallem isn’t the typical story designed to make us better by making us feel bad, to scare us into behaving, into environmental empathy; Mooallem’s is not the self-righteous tone of capital-K knowing typical of many environmental activists but the scientist’s disposition of not-knowing, the poet’s penchant for “negative capability.” Rather than ready-bake answers, he offers instead directions of thought and signposts for curiosity and, in the process, somehow gently moves us a little bit closer to our better selves, to a deep sense of, as poet Diane Ackerman beautifully put it in 1974, “the plain everythingness of everything, in cahoots with the everythingness of everything else.”

In the introduction, Mooallem recalls looking at his four-year-old daughter Isla’s menagerie of stuffed animals and the odd cultural disconnect they mime:

[T]hey were foraging on the pages of every bedtime story, and my daughter was sleeping in polar bear pajamas under a butterfly mobile with a downy snow owl clutched to her chin. Her comb handle was a fish. Her toothbrush handle was a whale. She cut her first tooth on a rubber giraffe.

Our world is different, zoologically speaking — less straightforward and more grisly. We are living in the eye of a great storm of extinction, on a planet hemorrhaging living things so fast that half of its nine million species could be gone by the end of the century. At my place, the teddy bears and giggling penguins kept coming. But I didn’t realize the lengths to which humankind now has to go to keep some semblance of actual wildlife in the world. As our own species has taken over, we’ve tried to retain space for at least some of the others being pushed aside, shoring up their chances of survival. But the threats against them keep multiplying and escalating. Gradually, America’s management of its wild animals has evolved, or maybe devolved, into a surreal kind of performance art.

Yet even conservationists’ small successes — crocodile species bouncing back from the brink of extinction, peregrine falcons filling the skies once again — even these pride points demonstrate the degree to which we’ve assumed — usurped, even — a puppeteer role in the theater of organic life. Citing a scientist who lamented that “right now, nature is unable to stand on its own,” Mooallem writes:

We’ve entered what some scientists are calling the Anthropocene — a new geologic epoch in which human activity, more than any other force, steers change on the planet. Just as we’re now causing the vast majority of extinctions, the vast majority of endangered species will only survive if we keep actively rigging the world around them in their favor. … We are gardening the wilderness. The line between conservation and domestication has blurred.

He finds himself uncomfortably straddling these two animal worlds — the idyllic little-kid’s dreamland and the messy, fragile ecosystem of the real world:

Once I started looking around, I noticed the same kind of secondhand fauna that surrounds my daughter embellishing the grown-up world, too — not just the conspicuous bald eagle on flagpoles and currency, or the big-cat and raptor names we give sports teams and computer operating systems, but the whale inexplicably breaching in the life-insurance commercial, the glass dolphin dangling from a rearview mirror, the owl sitting on the rump of a wild boar silk-screened on a hipster’s tote bag. I spotted wolf after wolf airbrushed on the sides of old vans, and another wolf, painted against a full moon on purple velvet, greeting me over the toilet in a Mexican restaurant bathroom. … [But] maybe we never outgrow the imaginary animal kingdom of childhood. Maybe it’s the one we are trying to save.

[…]

From the very beginning, America’s wild animals have inhabited the terrain of our imagination just as much as they’ve inhabited the actual land. They are free-roaming Rorschachs, and we are free to spin whatever stories we want about them. The wild animals always have no comment.

So he sets out to better understand the dynamics of the cultural forces that pull these worlds together with shared abstractions and rip them apart with the brutal realities of environmental collapse. His quest, in which little Isla is a frequent companion, sends him on the trails of three endangered species — a bear, a butterfly, and a bird — which fall on three different points on the spectrum of conservation reliance, relying to various degrees on the mercy of the very humans who first disrupted “the machinery of their wildness.” On the way, he encounters a remarkably vibrant cast of characters — countless passionate citizen scientists, a professional theater actor who, after an HIV diagnosis, became a professional butterfly enthusiast, and even Martha Stewart — and finds in their relationship with the environment “the same creeping disquiet about the future” that Mooallem himself came to know when he became a father. In fact, the entire project was inextricably linked to his sense of fatherly responsibility:

I’m part of a generation that seems especially resigned to watching things we encountered in childhood disappear: landline telephones, newspapers, fossil fuels. But leaving your kids a world without wild animals feels like a special tragedy, even if it’s hard to rationalize why it should.

The truth is that most of us will never experience the Earth’s endangered animals as anything more than beautiful ideas. They are figments of our shared imagination, recognizable from TV, but stalking places — places out there — to which we have no intention of going. I wondered how that imaginative connection to wildlife might fray or recalibrate as we’re forced to take more responsibility for its wildness.

It also occurred to me early on that all three endangered species I was getting to know could be gone by the time Isla is my age. It’s possible that, thirty years from now, they’ll have receded into the realm of dinosaurs, or the realm of Pokémon, for that matter — fantastical creatures whose names and diets little kids memorize from books. And it’s possible, too, I realized, that it might not even make a difference, that there would still be polar bears on footsy pajamas and sea turtle-shaped gummy vitamins — that there could be so much actual destruction without ever meaningfully upsetting the ecosystems in our minds.

Originally featured in May — read more here.

5. THINKING IN NUMBERS

Daniel Tammet was born with an unusual mind — he was diagnosed with high-functioning autistic savant syndrome, which meant his brain’s uniquely wired circuits made possible such extraordinary feats of computation and memory as learning Icelandic in a single week and reciting the number pi up to the 22,514th digit. He is also among the tiny fraction of people diagnosed with synesthesia — that curious crossing of the senses that causes one to “hear” colors, “smell” sounds, or perceive words and numbers in different hues, shapes, and textures. Synesthesia is incredibly rare — Vladimir Nabokov was among its few famous sufferers — which makes it overwhelmingly hard for the majority of us to imagine precisely what it’s like to experience the world through this sensory lens. Luckily, Tammet offers a fascinating first-hand account in Thinking In Numbers: On Life, Love, Meaning, and Math (public library) — a magnificent collection of 25 essays on “the math of life,” celebrating the magic of possibility in all its dimensions. In the process, he also invites us to appreciate the poetics of numbers, particularly of ordered sets — in other words, the very lists that dominate everything from our productivity tools to our creative inventories to the cheapened headlines flooding the internet.

Reflecting on his second book, Embracing the Wide Sky: A Tour Across the Horizons of the Mind, and the overwhelming response from fascinated readers seeking to know what it’s really like to experience words and numbers as colors and textures — to experience the beauty that a poem and a prime number exert on a synesthete in equal measure — Tammet offers an absorbing simulation of the synesthetic mind:

Imagine.

Close your eyes and imagine a space without limits, or the infinitesimal events that can stir up a country’s revolution. Imagine how the perfect game of chess might start and end: a win for white, or black, or a draw? Imagine numbers so vast that they exceed every atom in the universe, counting with eleven or twelve fingers instead of ten, reading a single book in an infinite number of ways.

Such imagination belongs to everyone. It even possesses its own science: mathematics. Ricardo Nemirovsky and Francesca Ferrara, who specialize in the study of mathematical cognition, write that “like literary fiction, mathematical imagination entertains pure possibilities.” This is the distillation of what I take to be interesting and important about the way in which mathematics informs our imaginative life. Often we are barely aware of it, but the play between numerical concepts saturates the way we experience the world.

Sketches from synesthetic artist and musician Michal Levy's animated visualization of John Coltrane's 'Giant Steps.'

Tammet, above all, is enchanted by the mesmerism of the unknown, which lies at the heart of science and the heart of poetry:

The fact that we have never read an endless book, or counted to infinity (and beyond!) or made contact with an extraterrestrial civilization (all subjects of essays in the book) should not prevent us from wondering: what if? … Literature adds a further dimension to the exploration of those pure possibilities. As Nemirovsky and Ferrara suggest, there are numerous similarities in the patterns of thinking and creating shared by writers and mathematicians (two vocations often considered incomparable).

In fact, this very link between mathematics and fiction, between numbers and storytelling, underpins much of Tammet’s exploration. Growing up as one of nine siblings, he recounts how the oppressive nature of existing as a small number in a large set spurred a profound appreciation of numbers as sensemaking mechanisms for life:

Effaced as individuals, my brothers, sisters, and I existed only in number. The quality of our quantity became something we could not escape. It preceded us everywhere: even in French, whose adjectives almost always follow the noun (but not when it comes to une grande famille). … From my family I learned that numbers belong to life. The majority of my math acumen came not from books but from regular observations and day-to-day interactions. Numerical patterns, I realized, were the matter of our world.

This awareness was the beginning of Tammet’s synesthetic sensibility:

Like colors, the commonest numbers give character, form, and dimension to our world. Of the most frequent — zero and one — we might say that they are like black and white, with the other primary colors — red, blue, and yellow — akin to two, three, and four. Nine, then, might be a sort of cobalt or indigo: in a painting it would contribute shading, rather than shape. We expect to come across samples of nine as we might samples of a color like indigo—only occasionally, and in small and subtle ways. Thus a family of nine children surprises as much as a man or woman with cobalt-colored hair.

Daniel Tammet. Portrait by Jerome Tabet.

Sampling from Jorge Luis Borges’s humorous fictional taxonomy of animals, inspired by the work of nineteenth-century German mathematician Georg Cantor, Tammet points to the deeper insight beneath our efforts to itemize and organize the universe — something Umberto Eco knew when he proclaimed that “the list is the origin of culture” and Susan Sontag intuited when she reflected on why lists appeal to us. Tammet writes:

Borges here also makes several thought-provoking points. First, though a set as familiar to our understanding as that of “animals” implies containment and comprehension, the sheer number of its possible subsets actually swells toward infinity. With their handful of generic labels (“mammal,” “reptile,” “amphibious,” etc.), standard taxonomies conceal this fact. To say, for example, that a flea is tiny, parasitic, and a champion jumper is only to begin to scratch the surface of all its various aspects.

Second, defining a set owes more to art than it does to science. Faced with the problem of a near endless number of potential categories, we are inclined to choose from a few — those most tried and tested within our particular culture. Western descriptions of the set of all elephants privilege subsets like “those that are very large,” and “those possessing tusks,” and even “those possessing an excellent memory,” while excluding other equally legitimate possibilities such as Borges’s “those that at a distance resemble flies,” or the Hindu “those that are considered lucky.”

[…]

Reading Borges invites me to consider the wealth of possible subsets into which my family “set” could be classified, far beyond those that simply point to multiplicity.

Tammet circles back to the shared gifts of literature and mathematics, which both help cultivate our capacity for compassion:

Like works of literature, mathematical ideas help expand our circle of empathy, liberating us from the tyranny of a single, parochial point of view. Numbers, properly considered, make us better people.

Originally featured in August — read more here.

6. SMARTER THAN YOU THINK

“The dangerous time when mechanical voices, radios, telephones, take the place of human intimacies, and the concept of being in touch with millions brings a greater and greater poverty in intimacy and human vision,” Anaïs Nin wrote in her diary in 1946, decades before the internet as we know it even existed. Her fear has since been echoed again and again with every incremental advance in technology, often with simplistic arguments about the attrition of attention in the age of digital distraction. But in Smarter Than You Think: How Technology Is Changing Our Minds for the Better (public library), Clive Thompson — one of the finest technology writers I know, with regular bylines for Wired and The New York Times — makes a powerful and rigorously thought-out counterpoint. He argues that our technological tools — from search engines to status updates to sophisticated artificial intelligence that defeats the world’s best chess players — are now inextricably linked to our minds, working in tandem with them and profoundly changing the way we remember, learn, and “act upon that knowledge emotionally, intellectually, and politically,” and that this is a promising rather than perilous thing.

He writes in the introduction:

These tools can make even the amateurs among us radically smarter than we’d be on our own, assuming (and this is a big assumption) we understand how they work. At their best, today’s digital tools help us see more, retain more, communicate more. At their worst, they leave us prey to the manipulation of the toolmakers. But on balance, I’d argue, what is happening is deeply positive. This book is about the transformation.


But Thompson is nothing if not a dimensional thinker with extraordinary sensitivity to the complexities of cultural phenomena. Rather than revisiting painfully familiar and trite-by-overuse notions like distraction and information overload, he examines the deeper dynamics of how these new tools are affecting the way we make sense of the world and of ourselves. Several decades after Vannevar Bush’s now-legendary meditation on how technology will impact our thinking, Thompson reaches even further into the fringes of our cultural sensibility — past the cheap techno-dystopia, past the pollyannaish techno-utopia, and into that intricate and ever-evolving intersection of technology and psychology.

One of his most fascinating and important points has to do with our outsourcing of memory — or, more specifically, our increasingly deft, search-engine-powered skills of replacing the retention of knowledge in our own brains with the on-demand access to knowledge in the collective brain of the internet. Think, for instance, of those moments when you’re trying to recall the name of a movie but only remember certain fragmentary features — the name of the lead actor, the gist of the plot, a song from the soundtrack. Thompson calls this “tip-of-the-tongue syndrome” and points out that, today, you’ll likely be able to reverse-engineer the name of the movie you don’t remember by plugging into Google what you do remember about it. Thompson contextualizes the phenomenon, which isn’t new, then asks the obvious, important question about our culturally unprecedented solutions to it:

Tip-of-the-tongue syndrome is an experience so common that cultures worldwide have a phrase for it. Cheyenne Indians call it navonotootse’a, which means “I have lost it on my tongue”; in Korean it’s hyeu kkedu-te mam-dol-da, which has an even more gorgeous translation: “sparkling at the end of my tongue.” The phenomenon generally lasts only a minute or so; your brain eventually makes the connection. But … when faced with a tip-of-the-tongue moment, many of us have begun to rely instead on the Internet to locate information on the fly. If lifelogging … stores “episodic,” or personal, memories, Internet search engines do the same for a different sort of memory: “semantic” memory, or factual knowledge about the world. When you visit Paris and have a wonderful time drinking champagne at a café, your personal experience is an episodic memory. Your ability to remember that Paris is a city and that champagne is an alcoholic beverage — that’s semantic memory.

[…]

What’s the line between our own, in-brain knowledge and the sea of information around us? Does it make us smarter when we can dip in so instantly? Or dumber with every search?

Vannevar Bush's 'memex' -- short for 'memory index' -- a primitive vision for a personal hard drive for information storage and management.

That concern, of course, is far from unique to our age — from the invention of writing to Alvin Toffler’s Future Shock, new technology has always been a source of paralyzing resistance and apprehension:

Writing — the original technology for externalizing information — emerged around five thousand years ago, when Mesopotamian merchants began tallying their wares using etchings on clay tablets. It emerged first as an economic tool. As with photography and the telephone and the computer, newfangled technologies for communication nearly always emerge in the world of commerce. The notion of using them for everyday, personal expression seems wasteful, risible, or debased. Then slowly it becomes merely lavish, what “wealthy people” do; then teenagers take over and the technology becomes common to the point of banality.

Thompson reminds us of the anecdote, by now itself familiar “to the point of banality,” about Socrates and his admonition that the “technology” of writing would devastate the Greek tradition of debate and dialectic, and would render people incapable of committing anything to memory because “knowledge stored was not really knowledge at all.” He cites Socrates’s parable of the Egyptian god Theuth and how he invented writing, offering it as a gift to the king of Egypt, Thamus, who met the present with defiant indignation:

This discovery of yours will create forgetfulness in the learners’ souls, because they will not use their memories; they will trust to the external written characters and not remember of themselves. The specific which you have discovered is an aid not to memory, but to reminiscence, and you give your disciples not truth, but only the semblance of truth; they will be hearers of many things and will have learned nothing; they will appear to be omniscient and will generally know nothing; they will be tiresome company, having the show of wisdom without the reality.

That resistance endured as technology changed shape, across the Middle Ages and past Gutenberg’s revolution, but it wasn’t without counter-resistance: Those who recorded their knowledge in writing and, eventually, collected it in the form of books argued that it expanded the scope of their curiosity and the ideas they were able to ponder, whereas the mere act of rote memorization made no guarantees of deeper understanding.

Ultimately, however, Thompson points out that Socrates was both right and wrong: It’s true that, with some deliberately cultivated exceptions and neurological outliers, few thinkers today rely on pure memorization and can recite extensive passages of text from memory. But what Socrates failed to see was the extraordinary dot-connecting enabled by access to knowledge beyond what our own heads can hold — because, as Amanda Palmer poignantly put it, “we can only connect the dots that we collect,” and the outsourcing of memory has exponentially enlarged our dot-collections.

With this in mind, Thompson offers a blueprint to this newly developed system of knowledge management in which access is critical:

If you are going to read widely but often read books only once; if you’re going to tackle the ever-expanding universe of ideas by skimming and glancing as well as reading deeply; then you are going to rely on the semantic-memory version of gisting. By which I mean, you’ll absorb the gist of what you read but rarely retain the specifics. Later, if you want to mull over a detail, you have to be able to refind a book, a passage, a quote, an article, a concept.

But Thompson argues that despite history’s predictable patterns of resistance followed by adoption and adaptation, there’s something immutably different about our own era:

The history of factual memory has been fairly predictable up until now. With each innovation, we’ve outsourced more information, then worked to make searching more efficient. Yet somehow, the Internet age feels different. Quickly pulling up [the answer to a specific esoteric question] on Google seems different from looking up a bit of trivia in an encyclopedia. It’s less like consulting a book than like asking someone a question, consulting a supersmart friend who lurks within our phones.

And therein lies the magic of the internet — that unprecedented access to humanity’s collective brain. Thompson cites the work of Harvard psychologist Daniel Wegner, who first began exploring this notion of collective rather than individual knowledge in the 1980s by observing how partners in long-term relationships often divide and conquer memory tasks in sharing the household’s administrative duties:

Wegner suspected this division of labor takes place because we have pretty good “metamemory.” We’re aware of our mental strengths and limits, and we’re good at intuiting the abilities of others. Hang around a workmate or a romantic partner long enough and you begin to realize that while you’re terrible at remembering your corporate meeting schedule, or current affairs in Europe, or how big a kilometer is relative to a mile, they’re great at it. So you begin to subconsciously delegate the task of remembering that stuff to them, treating them like a notepad or encyclopedia. In many respects, Wegner noted, people are superior to these devices, because what we lose in accuracy we make up in speed.

[…]

Wegner called this phenomenon “transactive” memory: two heads are better than one. We share the work of remembering, Wegner argued, because it makes us collectively smarter — expanding our ability to understand the world around us.

This ability to “google” one another’s memory stores, Thompson argues, is the defining feature of our evolving relationship with information — and it’s profoundly shaping our experience of knowledge:

Transactive memory helps explain how we’re evolving in a world of on-tap information.

He illustrates this by turning to the work of Betsy Sparrow, a graduate student of Wegner’s, who conducted a series of experiments demonstrating that when we know a digital tool will store information for us, we’re far less likely to commit it to memory. On the surface, this may look like an evident and worrisome shrinkage of our mental capacity. But there’s a subtler yet enormously important layer that such techno-dystopian simplifications miss: This very outsourcing of memory requires that we learn what the machine knows — a kind of meta-knowledge that enables us to retrieve the information when we need it. And, reflecting on Sparrow’s findings, Thompson points out that this is neither new nor negative:

We’ve been using transactive memory for millennia with other humans. In everyday life, we are only rarely isolated, and for good reason. For many thinking tasks, we’re dumber and less cognitively nimble if we’re not around other people. Not only has transactive memory not hurt us, it’s allowed us to perform at higher levels, accomplishing acts of reasoning that are impossible for us alone. It wasn’t until recently that computer memory became fast enough to be consulted on the fly, but once it did — with search engines boasting that they return results in tenths of a second — our transactive habits adapted.

Thompson’s most important point, however, has to do with how outsourcing our knowledge to digital tools can actually hamper the very process of creative thought, which relies on our ability to connect existing ideas from our mental pool of resources into new combinations — what the French polymath Henri Poincaré famously termed “sudden illuminations.” Without a mental catalog of materials over which to mull and let incubate in our fringe consciousness, our capacity for such illuminations is greatly deflated. Thompson writes:

These eureka moments are familiar to all of us; they’re why we take a shower or go for a walk when we’re stuck on a problem. But this technique works only if we’ve actually got a lot of knowledge about the problem stored in our brains through long study and focus. … You can’t come to a moment of creative insight if you haven’t got any mental fuel. You can’t be googling the info; it’s got to be inside you.

But while this is a valid concern, Thompson doubts that we’re outsourcing too many bits of knowledge and thus curtailing our creativity. He argues, instead, that we’re mostly employing this newly evolved skill to help us sift the meaningful from the meaningless, but we remain just as capable of absorbing that which truly stimulates us:

Evidence suggests that when it comes to knowledge we’re interested in — anything that truly excites us and has meaning — we don’t turn off our memory. Certainly, we outsource when the details are dull, as we now do with phone numbers. These are inherently meaningless strings of information, which offer little purchase on the mind. … It makes sense that our transactive brains would hand this stuff off to machines. But when information engages us — when we really care about a subject — the evidence suggests we don’t turn off our memory at all.

Originally featured in September — read more here.

7. COSMIC APPRENTICE

As if defining what science is and what philosophy is weren’t hard enough, delineating how the two fit together appears a formidable task, one that has spurred rather intense opinions. But that’s precisely what Dorion Sagan, who has previously examined the prehistoric history of sex, braves in the introduction to Cosmic Apprentice: Dispatches from the Edges of Science (public library) as he sets out to explore the intricate ways in which the two fields hang “in a kind of odd balance, watching each other, holding hands”:

The difference between science and philosophy is that the scientist learns more and more about less and less until she knows everything about nothing, whereas a philosopher learns less and less about more and more until he knows nothing about everything. There is truth in this clever crack, but, as Niels Bohr impressed, while the opposite of a trivial truth is false, the opposite of a great truth is another great truth.

I would say that applies to the flip side of the above flip takedown: Science’s eye for detail, buttressed by philosophy’s broad view, makes for a kind of alembic, an antidote to both. This intellectual electrum cuts the cloying taste of idealist and propositional philosophy with the sharp nectar of fact yet softens the edges of a technoscience that has arguably lost both its moral and its epistemological compass, the result in part of its being funded by governments and corporations whose relationship to the search for truth and its open dissemination can be considered problematic at best.

Sagan refutes the popular perception of science as rationally objective, a vessel of capital-T Truth, reminding us that every scientific concept and theory was birthed by a subjective, fallible human mind:

All observations are made from distinct places and times, and in science no less than art or philosophy by particular individuals. … Although philosophy isn’t fiction, it can be more personal, creative and open, a kind of counterbalance for science even as it argues that science, with its emphasis on a kind of impersonal materialism, provides a crucial reality check for philosophy and a tendency to overtheorize that [is] inimical to the scientific spirit. Ideally, in the search for truth, science and philosophy, the impersonal and autobiographical, can “keep each other honest,” in a kind of open circuit. Philosophy as the underdog even may have an advantage, because it’s not supposed to be as advanced as science, nor does it enjoy science’s level of institutional support — or the commensurate heightened risks of being beholden to one’s benefactors.

Like Richard Feynman, who argued tirelessly for the scientist’s responsibility to remain unsure, Sagan echoes the idea that willful ignorance is what drives science and the fear of being wrong is one of its greatest hindrances:

Science’s spirit is philosophical. It is the spirit of questioning, of curiosity, of critical inquiry combined with fact-checking. It is the spirit of being able to admit you’re wrong, of appealing to data, not authority, which does not like to admit it is wrong.

Sagan reflects on his father’s conviction that “the effort to popularize science is a crucial one for society,” one he shared with Richard Feynman, and what made Carl’s words echo as profoundly and timelessly as they do:

Science and philosophy both had a reputation for being dry, but my father helped inject life into the former, partly by speaking in plain English and partly by focusing on the science fiction fantasy of discovering extraterrestrial life.

In that respect, science could learn from philosophy’s intellectual disposition:

Philosophy today, not taught in grade school in the United States, is too often merely an academic pursuit, a handmaiden or apologetics of science, or else a kind of existential protest, a trendy avocation of grad students and the dark-clad coffeehouse set. But philosophy, although it historically gives rise to experimental science, sometimes preserves a distinct mode of sustained questioning that sharply distinguishes it from modern science, which can be too quick to provide answers.

[…]

Philosophy is less cocksure, less already-knowing, or should be, than the pundits’ diatribes that relieve us of the difficulties of not knowing, of carefully weighing, of looking at the other side, of having to think things through for ourselves. Dwell in possibility, wrote Emily Dickinson: Philosophy at its best seems a kind of poetry, not an informational delivery but a dwelling, an opening of our thoughts to the world.

Like Buckminster Fuller, who vehemently opposed specialization, Sagan attests to the synergetic value of intellectual cross-pollination, echoing the idea that true breakthroughs in science require cross-disciplinary connections and that originality consists of linking up ideas whose connection was not previously suspected:

It is true that science requires analysis and that it has fractured into microdisciplines. But because of this, more than ever, it requires synthesis. Science is about connections. Nature no more obeys the territorial divisions of scientific academic disciplines than do continents appear from space to be colored to reflect the national divisions of their human inhabitants. For me, the great scientific satoris, epiphanies, eurekas, and aha! moments are characterized by their ability to connect.

“In disputes upon moral or scientific points,” advised Martine in his wonderful 1866 guide to the art of conversation, “ever let your aim be to come at truth, not to conquer your opponent. So you never shall be at a loss in losing the argument, and gaining a new discovery.” Science, Sagan suggests — at least at its most elegant — is a conversation of constant revision, where each dead end brings to life a new fruitful question:

Theories are not only practical, and wielded like intellectual swords to the death … but beautiful. A good one is worth more than all the ill-gotten hedge fund scraps in the world. A good scientific theory shines its light, revealing the world’s fearful symmetry. And its failure is also a success, as it shows us where to look next.

Supporting Neil deGrasse Tyson’s contention that intelligent design is a philosophy of ignorance, Sagan applies this very paradigm of connection-making to the crux of the age-old science vs. religion debate, painting evolution not as a tool of certitude but as a reminder of our connectedness to everything else:

Connecting humanity with other species in a single process was Darwin’s great natural historical accomplishment. It showed that some of the issues relegated to religion really come under the purview of science. More than just a research program for technoscience, it provides a eureka moment, a subject of contemplation open in principle to all thinking minds. Beyond the squabbles over its mechanisms and modes, evolution’s epiphany derives from its widening of vistas, its showing of the depths of our connections to others from whom we’d thought we were separate. Philosophy, too … in its ancient, scientifico-genic spirit of inquiry so different from a mere, let alone peevish, recounting of facts, needs to be reconnected to science for the latter to fulfill its potential not just as something useful but as a source of numinous moments, deep understanding, and indeed, religious-like epiphanies of cosmic comprehension and aesthetic contemplation.

Originally featured in April — see more here.

8. SOCIAL

“Without the sense of fellowship with men of like mind,” Einstein wrote, “life would have seemed to me empty.” It is perhaps unsurprising that the iconic physicist, celebrated as “the quintessential modern genius,” intuited something fundamental about the inner workings of the human mind and soul long before science itself had attempted to concretize it with empirical evidence. Now, it has: In Social: Why Our Brains Are Wired to Connect (public library), neuroscientist Matthew D. Lieberman, director of UCLA’s Social Cognitive Neuroscience lab, sets out to “get clear about ‘who we are’ as social creatures and to reveal how a more accurate understanding of our social nature can improve our lives and our society.” Lieberman, who has spent the past two decades using tools like fMRI to study how the human brain responds to its social context, has found over and over again that our brains aren’t merely simplistic mechanisms that only respond to pain and pleasure, as philosopher Jeremy Bentham famously claimed, but are instead wired to connect. At the heart of his inquiry is a simple question: Why do we feel such intense agony when we lose a loved one? He argues that, far from being a design flaw in our neural architecture, our capacity for such overwhelming grief is a vital feature of our evolutionary constitution:

The research my wife and I have done over the past decade shows that this response, far from being an accident, is actually profoundly important to our survival. Our brains evolved to experience threats to our social connections in much the same way they experience physical pain. By activating the same neural circuitry that causes us to feel physical pain, our experience of social pain helps ensure the survival of our children by helping to keep them close to their parents. The neural link between social and physical pain also ensures that staying socially connected will be a lifelong need, like food and warmth. Given the fact that our brains treat social and physical pain similarly, should we as a society treat social pain differently than we do? We don’t expect someone with a broken leg to “just get over it.” And yet when it comes to the pain of social loss, this is a common response. The research that I and others have done using fMRI shows that how we experience social pain is at odds with our perception of ourselves. We intuitively believe social and physical pain are radically different kinds of experiences, yet the way our brains treat them suggests that they are more similar than we imagine.

Citing his research, Lieberman affirms the notion that there is no such thing as a nonconformist, pointing out the social construction of what we call our individual “selves” — empirical evidence for what the novelist William Gibson so eloquently termed one’s “personal micro-culture” — and observes “our socially malleable sense of self”:

The neural basis for our personal beliefs overlaps significantly with one of the regions of the brain primarily responsible for allowing other people’s beliefs to influence our own. The self is more of a superhighway for social influence than it is the impenetrable private fortress we believe it to be.

Contextualizing it in a brief evolutionary history, he argues that this osmosis of sociality and individuality is an essential aid in our evolutionary development rather than an aberrant defect in it:

Our sociality is woven into a series of bets that evolution has laid down again and again throughout mammalian history. These bets come in the form of adaptations that are selected because they promote survival and reproduction. These adaptations intensify the bonds we feel with those around us and increase our capacity to predict what is going on in the minds of others so that we can better coordinate and cooperate with them. The pain of social loss and the ways that an audience’s laughter can influence us are no accidents. To the extent that we can characterize evolution as designing our modern brains, this is what our brains were wired for: reaching out to and interacting with others. These are design features, not flaws. These social adaptations are central to making us the most successful species on earth.

The implications of this span across everything from the intimacy of our personal relationships to the intricacy of organizational management and teamwork. But rather than entrusting a single cognitive “social network” with these vital functions, our brains turn out to host many. Lieberman explains:

Just as there are multiple social networks on the Internet such as Facebook and Twitter, each with its own strengths, there are also multiple social networks in our brains, sets of brain regions that work together to promote our social well-being.

These networks each have their own strengths, and they have emerged at different points in our evolutionary history moving from vertebrates to mammals to primates to us, Homo sapiens. Additionally, these same evolutionary steps are recapitulated in the same order during childhood.

He goes on to explore three major adaptations that have made us so inextricably responsive to the social world:

  • Connection: Long before there were any primates with a neocortex, mammals split off from other vertebrates and evolved the capacity to feel social pains and pleasures, forever linking our well-being to our social connectedness. Infants embody this deep need to stay connected, but it is present through our entire lives.
  • Mindreading: Primates have developed an unparalleled ability to understand the actions and thoughts of those around them, enhancing their ability to stay connected and interact strategically. In the toddler years, forms of social thinking develop that outstrip those seen in the adults of any other species. This capacity allows humans to create groups that can implement nearly any idea and to anticipate the needs and wants of those around us, keeping our groups moving smoothly.
  • Harmonizing: The sense of self is one of the most recent evolutionary gifts we have received. Although the self may appear to be a mechanism for distinguishing us from others and perhaps accentuating our selfishness, the self actually operates as a powerful force for social cohesiveness. During the preteen and teenage years, our brains develop the neural adaptations that allow group beliefs and values to influence our own.

Originally featured in November — see more here, including Lieberman’s fantastic TEDxStLouis talk.

9. GULP

Few writers are able to write about science in a way that’s provocative without being sensationalistic, truthful without being dry, enchanting without being forced — and even fewer are able to do so on subjects that don’t exactly lend themselves to Saganesque whimsy. After all, it’s infinitely easier to inspire awe while discussing the bombastic magnificence of the cosmos than, say, the function of bodily fluids and the structures that secrete them. But Mary Roach is one of those rare writers, and that’s precisely what she proves once more in Gulp: Adventures on the Alimentary Canal (public library) — a fascinating tour of the body’s most private hydraulics.

Roach writes in the introduction:

The early anatomists had that curiosity in spades. They entered the human form like an unexplored continent. Parts were named like elements of geography: the isthmus of the thyroid, the isles of the pancreas, the straits and inlets of the pelvis. The digestive tract was for centuries known as the alimentary canal. How lovely to picture one’s dinner making its way down a tranquil, winding waterway, digestion and excretion no more upsetting or off-putting than a cruise along the Rhine. It’s this mood, these sentiments — the excitement of exploration and the surprises and delights of travel to foreign locales — that I hope to inspire with this book.

It may take some doing. The prevailing attitude is one of disgust. … I remember, for my last book, talking to the public-affairs staff who choose what to stream on NASA TV. The cameras are often parked on the comings and goings of Mission Control. If someone spots a staffer eating lunch at his desk, the camera is quickly repositioned. In a restaurant setting, conviviality distracts us from the biological reality of nutrient intake and oral processing. But a man alone with a sandwich appears as what he is: an organism satisfying a need. As with other bodily imperatives, we’d rather not be watched. Feeding, and even more so its unsavory correlates, are as much taboos as mating and death.

The taboos have worked in my favor. The alimentary recesses hide a lode of unusual stories, mostly unmined. Authors have profiled the brain, the heart, the eyes, the skin, the penis and the female geography, even the hair, but never the gut. The pie hole and the feed chute are mine.

Roach goes on to bring real science to those subjects that make teenagers guffaw and that populate mediocre standup jokes, exploring such bodily mysteries as what flatulence research reveals about death, why tasting has little to do with taste, how thorough chewing can lower the national debt, and why we like the foods we like and loathe the rest.

10. WONDERS OF THE UNIVERSE

“I know that I am mortal by nature and ephemeral,” ur-astronomer Ptolemy contemplated nearly two millennia ago, “but when I trace at my pleasure the windings to and fro of the heavenly bodies, I no longer touch earth with my feet. I stand in the presence of Zeus himself and take my fill of ambrosia.” But while the cosmos has fascinated humanity since the dawn of time, its mesmerism isn’t that of an abstract other but, rather, the very self-reflexive awareness that Ptolemy attested to, that intimate and inextricable link between the wonders of life here on Earth and the magic we’ve always found in our closest cosmic neighbors.

That’s precisely what modern-day science-enchanter Brian Cox explores in Wonders of the Solar System (public library) — the fantastic and illuminating book based on his BBC series of the same title celebrating the spirit of exploration, a companion to his Wonders of Life and every bit as brimming with his signature blend of enthralling storytelling, scientific brilliance, and contagious conviction.

Cox begins by reminding us that preserving the spirit of exploration is both a joy and a moral obligation — especially at a time when it faces tragic threats of indifference and neglect from the very authorities whose job it is to fuel it, despite a citizenry profoundly in love with the ethos of exploration:

[The spirit of exploration] is desperately relevant, an idea so important that celebration is perhaps too weak a word. It is a plea for the spirit of the navigators of the seas and the pioneers of aviation and spaceflight to be restored and cherished; a case made to the viewer and reader that reaching for worlds beyond our grasp is an essential driver of progress and necessary sustenance for the human spirit. Curiosity is the rocket fuel that powers our civilization. If we deny this innate and powerful urge, perhaps because earthly concerns seem more worthy or pressing, then the borders of our intellectual and physical domain will shrink with our ambitions. We are part of a much wider ecosystem, and our prosperity and even long-term survival are contingent on our understanding of it.

But most revelatory of all is Cox’s gift for illustrating what Earthly phenomena, right here on our seemingly ordinary planet, reveal about the wonders and workings of the Solar System.

Tornadoes, for instance, tell us how our star system was born — the processes that drive these giant rotating storms obey the same physical laws that caused clumps to form at the center of nebulae five billion years ago, around which the gas cloud collapsed and began spinning ever faster, ordering the chaos, until the early Solar System was churned into existence. This universal principle, known as the conservation of angular momentum, is also what drives a tornado’s destructive spiral.

Cox synthesizes:

This is how our Solar System was born: rather than the whole system collapsing into the Sun, a disc of dust and gas extending billions of kilometers into space formed around the new shining star. In just a few hundred million years, pieces of the cloud collapsed to form planets and moons, and so a star system, our Solar System, was formed. The journey from chaos into order had begun.
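The principle Cox invokes is easy to see in miniature: angular momentum stays fixed as a rotating body contracts, so a shrinking radius forces a faster spin. A minimal sketch of that relationship, using illustrative numbers rather than real nebular parameters:

```python
# Conservation of angular momentum for a contracting, rotating body,
# idealized as a point mass at radius r: L = m * r**2 * omega stays fixed.
# The radii and angular velocities below are illustrative, not real values.

def spin_after_contraction(r_initial, r_final, omega_initial):
    """Return the new angular velocity after the radius shrinks,
    assuming m * r**2 * omega is conserved (the mass m cancels out)."""
    return omega_initial * (r_initial / r_final) ** 2

# A body that contracts to one-tenth of its radius spins 100 times faster —
# the same spin-up that turns a slowly turning gas cloud into a whirling disc.
omega = spin_after_contraction(r_initial=10.0, r_final=1.0, omega_initial=2.0)
print(omega)  # 200.0
```

The quadratic dependence on radius is why even a modest contraction produces a dramatic spin-up, whether in a protoplanetary nebula or in the column of air feeding a tornado.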

Then we have Iceland’s icebergs and glacial lagoons, which offer remarkable insight into the nature of Saturn’s rings. Both shine with puzzling brightness — the lagoons, here on Earth, by bringing pure water that is thousands of years old and free of pollutants from the bottom of the seabed to the surface as they rise, forming ice crystals of exceptional vibrance; Saturn’s rings, young and ever-changing, by circling icy ring particles around the planet, constantly crashing them together and breaking them apart, thus exposing bright new facets of ice that catch the sunlight and dazzle amidst a Solar System that is otherwise “a very dirty place.”

Cox explains:

It’s difficult to imagine the scale, beauty and intricacy of Saturn’s rings here on Earth, but the glacial lagoons of Iceland can transport our minds across millions of kilometers of space and help us understand the true nature of the rings. … At first sight, the lagoon appears to be a solid sheet of pristine ice, but this is an illusion. The surface is constantly shifting, an almost organic, ever-changing raft of thousands of individual icebergs floating on the water. The structure of Saturn’s rings is similar, because despite appearances the rings aren’t solid. Each ring is made up of hundreds of ringlets and each ringlet is made up of billions of separate pieces. Captured by Saturn’s gravity, the ring particles independently orbit the planet in an impossibly thin layer.

Cox goes on to explore other such illuminating parallels, from how Alaska’s Lake Eyak illustrates the methane cycles of the universe to what Hawaii’s Big Island tells us about the forces that keep any planet alive to how the volcanic features of India’s Deccan Traps explain why Venus choked to death. He ends with T. S. Eliot’s timeless verses on the spirit of exploration and echoes Neil deGrasse Tyson’s wisdom on your ego and the cosmic perspective, concluding:

You could take the view that our exploration of the Universe has made us somehow insignificant; one tiny planet around one star amongst hundreds of billions. But I don’t take that view, because we’ve discovered that it takes the rarest combination of chance and the laws of Nature to produce a planet that can support a civilization, that most magnificent structure that allows us to explore and understand the Universe. That’s why, for me, our civilization is the wonder of the Solar System, and if you were to be looking at the Earth from outside the Solar System that much would be obvious. We have written the evidence of our existence onto the surface of our planet. Our civilization has become a beacon that identifies our planet as a home to life.

Originally featured in August — see more here.

11. SAVE OUR SCIENCE

“What is crucial is not that technical ability, but it is imagination in all of its applications,” the great E. O. Wilson offered in his timeless advice to young scientists — a conviction shared by some of history’s greatest scientific minds. And yet it is rote memorization and the unimaginative application of technical skill that our dominant education system prioritizes — so it’s no wonder it is failing to produce the Edisons and Curies of our day. In Save Our Science: How to Inspire a New Generation of Scientists, materials scientist, inventor, and longtime Yale professor Ainissa Ramirez takes on a challenge Isaac Asimov presaged a quarter century ago, advocating for the value of science education and critiquing its present failures, with a hopeful and pragmatic eye toward improving its future. She writes in the introduction:

The 21st century requires a new kind of learner — not someone who can simply churn out answers by rote, as has been done in the past, but a student who can think expansively and solve problems resourcefully.

To do that, she argues, we need to replace the traditional academic skills of “reading, ’riting, and ’rithmetic” with creativity, curiosity, critical thinking, and problem-solving. (Though, as psychology has recently revealed, problem-finding might be the more valuable skill.)

Ainissa Ramirez at TED 2012 (Photograph: James Duncan Davidson for TED)

She begins with the basics:

While the acronym STEM sounds very important, STEM answers just three questions: Why does something happen? How can we apply this knowledge in a practical way? How can we describe what is happening succinctly? Through the questions, STEM becomes a pathway to be curious, to create, and to think and figure things out.

Even for those of us who deem STEAM (wherein the A stands for “arts”) superior to STEM, Ramirez’s insights are razor-sharp and consistent with the oft-affirmed idea that creativity relies heavily upon connecting the seemingly disconnected and aligning the seemingly misaligned:

There are two schools of thought on defining creativity: divergent thinking, which is the formation of a creative idea resulting from generating lots of ideas, and a Janusian approach, which is the act of making links between two remote ideas. The latter takes its name from the two-faced Roman god of beginnings, Janus, who was associated with doorways and the idea of looking forward and backward at the same time. Janusian creativity hinges on the belief that the best ideas come from linking things that previously did not seem linkable. Henri Poincaré, a French mathematician, put it this way: ‘To create consists of making new combinations. … The most fertile will often be those formed of elements drawn from domains which are far apart.’

Another element inherent to the scientific process but hardly rewarded, if not punished, in education is the role of ignorance, or what the poet John Keats has eloquently and timelessly termed “negative capability” — the art of brushing up against the unknown and proceeding anyway. Ramirez writes:

My training as a scientist allows me to stare at an unknown and not run away, because I learned that this melding of uncertainty and curiosity is where innovation and creativity occur.

Yet these very qualities are missing from science education in the United States — and it shows. When the Programme for International Student Assessment (PISA) conducted its 2006 survey, the U.S. ranked 35th in math and 29th in science out of the 40 high-income, developed countries surveyed.

Average PISA scores versus expenditures for selected countries (Source: Organisation for Economic Co-operation and Development)

Ramirez offers historical context: When American universities first took root in the colonial days, their primary role was to educate men for the clergy, so science, technology, and math were not a priority. But then Justin Smith Morrill, a little-known congressman from Vermont who had barely completed his high school education, came along in 1861 and quietly but purposefully sponsored legislation that forever changed American education, resulting in more than 70 new colleges and universities that included STEM subjects in their curricula. This catapulted enrollment rates from the mere 2% of the population who attended higher education prior to the Civil War and greatly increased diversity in academia, with the act’s second revision in 1890 extending education opportunities to women and African-Americans.

The growth of U.S. college enrollment from 1869 to 1994. (Source: S. B. Carter et al., Historical Statistics of the United States)

But what really propelled science education, Ramirez notes, was the competitive spirit of the Space Race:

The mixture of being outdone and humiliated motivated the U.S. to create NASA and bolster the National Science Foundation’s budget to support science research and education. Sputnik forced the U.S. to think about its science position and to look hard into a mirror — and the U.S. did not like what it saw. In 1956, before Sputnik, the National Science Foundation’s budget was a modest $15.9 million. In 1958, it tripled to $49.5 million, and it doubled again in 1959 to $132.9 million. The space race was on. We poured resources, infrastructure, and human capital into putting an American on the moon, and with that goal, STEM education became a top priority.

President John F. Kennedy addresses a crowd of 35,000 at Rice University in 1962, proclaiming again his desire to reach the moon with the words, 'We set sail on this new sea because there is new knowledge to be gained.' Credit: NASA / Public domain

Ramirez argues for returning to that spirit of science education as an investment in national progress:

The U.S. has a history of changing education to meet the nation’s needs. We need similar innovative forward-thinking legislation now, to prepare our children and our country for the 21st century. Looking at our history allows us to see that we have been here before and prevailed. Let’s meet this challenge, for it will, as Kennedy claimed, draw out the very best in all of us.

In confronting the problems that plague science education and the public’s relationship with scientific culture, Ramirez points to the fact that women account for only 26% of STEM bachelor’s degrees and explores the heart of the glaring gender problem:

[There is a] false presumption that girls are not as good as boys in science and math. This message absolutely pervades our national mindset. Even though girls and boys sit next to each other in class, fewer women choose STEM careers than men. This is the equivalent to a farmer sowing seeds and then harvesting only half of the fields.

The precipitous drop in girls’ enrollment in STEM classes. (Source: J. F. Latimer, What’s Happened To Our High Schools)

In turning toward possible solutions, Ramirez calls out the faulty models of standardized testing, which fail to account for more dimensional definitions of intelligence. She writes:

There is a concept in physics that the observer of an experiment can change the results just by the act of observing (this is called, not surprisingly, the observer effect). For example, knowing the required pressure of your tires and observing that they are overinflated dictates that you let some air out, which changes the pressure slightly.

Although this theory is really for electrons and atoms, we also see it at work in schools. Schools are evaluated, by the federal and state governments, by tests. The students are evaluated by tests administered by the teachers. It is the process of testing that has changed the mission of the school from instilling a wide knowledge of the subject matter to acquiring a good score on the tests.

The United States is one of the most test-taking countries in the world, and the standard weapon is the multiple-choice question. Although multiple-choice tests are efficient in schools, they don’t inspire learning. In fact, they do just the opposite. This is hugely problematic in encouraging the skills needed for success in the 21st century. Standardized testing teaches skills that are counter to skills needed for the future, such as curiosity, problem solving, and having a healthy relationship with failure. Standardized tests draw up a fear of failure, since you seek a specific answer and you will be either right or wrong; they kick problem solving in the teeth, since you never need to show your work and never develop a habit of figuring things out; and they slam the doors to curiosity, since only a small selection of the possible answers is laid out before you. These kinds of tests produce thinkers who are unwilling to stretch and take risks and who cannot handle failure. They crush a sense of wonder.

Like Noam Chomsky, who has questioned why schools train for passing tests rather than for creative inquiry, and Sir Ken Robinson, who has eloquently advocated for changing the factory model of education, Ramirez urges:

While scientists passionately explore, reason, discover, synthesize, compare, contrast, and connect the dots, students drudgingly memorize, watch, and passively consume. Students are exercising the wrong muscle. An infusion of STEM taught in compelling ways will give students an opportunity to acquire these active learning skills.

Ramirez goes on to propose a multitude of small changes and larger shifts that communities, educators, cities, institutions, and policy-makers could implement — from neighborhood maker-spaces to wifi hotspots on school buses to university science festivals to new curricula and testing methods — that would begin to bridge the gap between what science education currently is and what scientific culture could and should be. She concludes, echoing Alvin Toffler’s famous words that “the illiterate of the 21st century will not be those who cannot read and write, but those who cannot learn, unlearn, and relearn”:

The skills of the 21st century need us to create scholars who can link the unlinkable. … Nurturing curious, creative problem solvers who can master the art of figuring things out will make them ready for this unknown brave new world. And that is the best legacy we can possibly leave.

Originally featured in February — see more here.

12. THE ELEMENTS OF EUCLID

Almost a century before Mondrian made his iconic red, yellow, and blue geometric compositions, and around the time that Edward Livingston Youmans was creating his stunning chemistry diagrams, an eccentric 19th-century civil engineer and mathematician named Oliver Byrne produced a striking series of vibrant diagrams in primary colors for an 1847 edition of the legendary Greek mathematical treatise Euclid’s Elements. Byrne, a vehement opponent of pseudoscience with an especial distaste for phrenology, was early to the insight that great design and graphic elegance can powerfully aid learning. He explained that in his edition of Euclid, “coloured diagrams and symbols are used instead of letters for the greater ease of learners.” The book, a masterpiece of Victorian printing and graphic design long before “graphic design” existed as a discipline, is celebrated as one of the most unusual and most beautiful books of the 19th century.

Now, the fine folks of Taschen — who have brought us such visual treasures as the best illustrations from 150 years of Hans Christian Andersen, the life and legacy of infographics godfather Fritz Kahn, and the visual history of magic — are resurrecting Byrne’s gem in the lavish tome The First Six Books of the Elements of Euclid (public library), edited by Swiss polymath Werner Oechslin.

Proof of the Pythagorean theorem

A masterwork of art and science in equal measure, this newly rediscovered treasure mesmerizes the eye with its brightly colored circles, squares, and triangles while it tickles the brain with its mathematical magic.

Originally featured in November — see more here.

13. DOES MY GOLDFISH KNOW WHO I AM?

In 2012, I wrote about a lovely book titled Big Questions from Little People & Simple Answers from Great Minds, in which some of today’s greatest scientists, writers, and philosophers answer kids’ most urgent questions, deceptively simple yet profound. It went on to become one of the year’s best books and among readers’ favorites. A few months later, Gemma Elwin Harris, the editor who had envisioned the project, reached out to invite me to participate in the book’s 2013 edition by answering one randomly assigned question from a curious child. Naturally, I was thrilled to do it, and honored to be a part of something as heartening as Does My Goldfish Know Who I Am? (public library), also among the best children’s books of the year — a compendium of primary school children’s funny, poignant, innocent yet insightful questions about science and how life works, answered by such celebrated minds as rockstar physicist Brian Cox, beloved broadcaster and voice-of-nature Sir David Attenborough, legendary linguist Noam Chomsky, science writer extraordinaire Mary Roach, stat-showman Hans Rosling, Beatle Paul McCartney, biologist and Beagle Project director Karen James, and iconic illustrator Sir Quentin Blake. As was the case with last year’s edition, more than half of the proceeds from the book — which features illustrations by the wonderful Andy Smith — are being donated to a children’s charity.

The questions range from what the purpose of science is to why onions make us cry to whether spiders can speak to why we blink when we sneeze. Psychologist and broadcaster Claudia Hammond, who recently explained the fascinating science of why time slows down when we’re afraid, speeds up as we age, and gets all warped while we’re on vacation in one of the best psychology and philosophy books of 2013, answers the most frequently asked question by the surveyed children: Why do we cry?

It’s normal to cry when you feel upset and until the age of twelve boys cry just as often as girls. But when you think about it, it is a bit strange that salty water spills out from the corners of your eyes just because you feel sad.

One professor noticed people often say that, despite their blotchy faces, a good cry makes them feel better. So he did an experiment where people had to breathe in over a blender full of onions that had just been chopped up. Not surprisingly this made their eyes water. He collected the tears and put them in the freezer. Then he got people to sit in front of a very sad film wearing special goggles which had tiny buckets hanging off the bottom, ready to catch their tears if they cried. The people cried, but the buckets didn’t work and in the end he gathered their tears in tiny test tubes instead.

He found that the tears people cried when they were upset contained extra substances, which weren’t in the tears caused by the onions. So he thinks maybe we feel better because we get rid of these substances by crying and that this is the purpose of tears.

But not everyone agrees. Many psychologists think that the reason we cry is to let other people know that we need their sympathy or help. So crying, provided we really mean it, brings comfort because people are nice to us.

Crying when we’re happy is a bit more of a mystery, but strong emotions have a lot in common, whether happy or sad, so they seem to trigger some of the same processes in the body.

(For a deeper dive into the biological mystery of crying, see the science of sobbing and emotional tearing.)

Joshua Foer, who knows a thing or two about superhuman memory and the limits of our mind, explains to 9-year-old Tom how the brain can store so much information despite being that small:

An adult’s brain only weighs about 1.4 kilograms, but it’s made up of about 100 billion microscopic neurons. Each of those neurons looks like a tiny branching tree, whose limbs reach out and touch other neurons. In fact, each neuron can make between 5,000 and 10,000 connections with other neurons — sometimes even more. That’s more than 500 trillion connections! A memory is essentially a pattern of connections between neurons.

Every sensation that you remember, every thought that you think, transforms your brain by altering the connections within that vast network. By the time you get to the end of this sentence, you will have created a new memory, which means your brain will have physically changed.
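Foer’s headline figure is simple arithmetic, and it checks out if you take his rough counts at face value — 100 billion neurons at the lower end of his 5,000-connection range:

```python
# Back-of-the-envelope check of Foer's estimate: ~100 billion neurons,
# each making ~5,000 connections, gives ~500 trillion connections.
neurons = 100 * 10**9            # ~100 billion neurons in an adult brain
connections_per_neuron = 5_000   # lower end of Foer's 5,000-10,000 range
total = neurons * connections_per_neuron
print(total)  # 500000000000000 — i.e., 500 trillion
```

At the upper end of his range (10,000 connections per neuron), the count reaches a quadrillion — which is why he hedges with “more than 500 trillion.”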

Neuroscientist Tali Sharot, who has previously studied why our brains are wired for optimism, answers 8-year-old Maia’s question about why we don’t have memories from the time we were babies and toddlers:

We use our brain for memory. In the first few years of our lives, our brain grows and changes a lot, just like the rest of our body. Scientists think that because the parts of our brain that are important for memory have not fully developed when we are babies, we are unable to store memories in the same way that we do when we are older.

Also, when we are very young we do not know how to speak. This makes it difficult to keep events in your mind and remember them later, because we use language to remember what happened in the past.

In answering 8-year-old Hannah’s question about what newspapers do when there is no news, writer and journalist Oliver Burkeman, author of the excellent The Antidote: Happiness for People Who Can’t Stand Positive Thinking, offers a primer on media literacy — an important caveat on news that even we, as alleged grown-ups, frequently forget:

Newspapers don’t really go out and find the news: they decide what gets to count as news. The same goes for television and radio. And you might disagree with their decisions! (For example, journalists are often accused of focusing on bad news and ignoring the good, making the world seem worse than it is.)

The important thing to remember, whenever you’re reading or watching the news, is that someone decided to tell you those things, while leaving out other things. They’re presenting one particular view of the world — not the only one. There’s always another side to the story.

And my answer, to 9-year-old Ottilie’s question about why we have books:

Some people might tell you that books are no longer necessary now that we have the internet. Don’t believe them. Books help us know other people, know how the world works, and, in the process, know ourselves more deeply in a way that has nothing to do with what you read them on and everything to do with the curiosity, integrity and creative restlessness you bring to them.

Books build bridges to the lives of others, both the characters in them and your countless fellow readers across other lands and other eras, and in doing so elevate you and anchor you more solidly into your own life. They give you a telescope into the minds of others, through which you begin to see with ever greater clarity the starscape of your own mind.

And though the body and form of the book will continue to evolve, its heart and soul never will. Though the telescope might change, the cosmic truths it invites you to peer into remain eternal like the Universe.

In many ways, books are the original internet — each fact, each story, each new bit of information can be a hyperlink to another book, another idea, another gateway into the endlessly whimsical rabbit hole of the written word. Just like the web pages you visit most regularly, your physical bookmarks take you back to those book pages you want to return to again and again, to reabsorb and relive, finding new meaning on each visit — because the landscape of your life is different, new, “reloaded” by the very act of living.

Originally featured in November — read more of the questions and answers here.

HONORABLE MENTIONS

The Space Book: From the Beginning to the End of Time, 250 Milestones in the History of Space & Astronomy by Jim Bell, An Appetite for Wonder: The Making of a Scientist by Richard Dawkins, and The Age of Edison: Electric Light and the Invention of Modern America by Ernest Freeberg.
