

Creative Courage for Young Hearts: 15 Emboldening Picture Books Celebrating the Lives of Great Artists, Writers, and Scientists

Jane Goodall, Julia Child, Pablo Neruda, Marie Curie, E.E. Cummings, Albert Einstein, Ella Fitzgerald, Antoine de Saint-Exupéry, Frida Kahlo, and more.

UPDATE: Also see other recently added picture-book biographies of Maria Mitchell, Ada Lovelace, Louise Bourgeois, Wangari Maathai, Virginia Woolf, Galileo, Nellie Bly, Paul Erdos, Louis Braille, Mary Lou Williams, John Lewis, Muddy Waters, Paul Gauguin, and Jane Jacobs.

Margaret Mead extolled the value of “spiritual and mental ancestors” in how we form our identity — those people to whom we aren’t related but whose values we try to cultivate in ourselves; role models we seek out not from our immediate genetic pool but from the pool of culture that surrounds us, past and present. Seneca saw in reading, one of the oldest and most reliable ways to identify and contact these cultural ancestors, a way of being adopted into the “households of the noblest intellects.” And what better time to meet such admirable models of personhood than in childhood, that fertile seedbed for the flowering of values and identity?

Collected here are fifteen wonderful picture-books celebrating such worthwhile “spiritual and mental ancestors.” It is, of course, an incomplete reading list, yet it is a deliberate one — a great many such books exist, but few feature the trifecta of wonderfulness: a cultural icon notable for his or her lasting contribution to humanity beyond mere fame; an intelligent and nuanced life-story lovingly told; and beautiful, imaginative illustrations rewarding in their own right. Please enjoy.

JANE GOODALL

“One should want only one thing and want it constantly,” young André Gide half-observed, half-resolved in his journal. “Then one is sure of getting it.” More than a century later, Werner Herzog wrote passionately of the “uninvited duty” that a sense of purpose plants in the heart, leaving one with “no choice but to push on.” That combination of desiring something with inextinguishable intensity — which begins with letting your life speak and daring to listen — and pursuing it with steadfast doggedness is perhaps the single common thread in the lives of those we most admire as luminaries of enduring genius. It is also at the heart of what it means to find your purpose and live it.

In Me…Jane (public library), celebrated cartoonist, author, and animal rights advocate Patrick McDonnell chronicles the early life of pioneering primatologist Jane Goodall (b. April 3, 1934) and tells the heartening story of how the seed planted by a childhood dream blossomed, under the generous beams of deep dedication, into the reality of a purposeful life.

McDonnell’s protagonist is not Jane Goodall the widely influential and wildly revered sage of science and the human spirit — one of a handful of people in history to hold both the titles Dame and Doctor — but little Jane, the ten-year-old girl who decided that she was going to work with animals in Africa when she grew up and, despite her family’s poverty, despite living in an era when girls were not encouraged to live a life of science or adventure, despite nearly everyone telling her that it was impossible, turned her dream into reality.

With simple, enormously expressive illustrations and an eloquent economy of words, McDonnell — creator of the beloved MUTTS comic strip — begins at the very beginning: that fateful day when little Jane was given a stuffed monkey named Jubilee.

Jane and Jubilee became inseparable, and she shared with him everything she loved — especially the outdoors. Together, they watched the birds and the spiders and the squirrels fill the backyard with aliveness.

At night, Jane and Jubilee read books to better understand what they saw.

One day, tickled to find out where eggs came from, they snuck into grandma’s chicken coop and observed the miracle of life.

It was a magical world full of joy and wonder, and Jane felt very much a part of it.

Jane liked to climb her beloved beech tree with Jubilee on her back, then sit perched on its branches reading and rereading Tarzan, imagining herself in place of that other Jane, wild and filled with wonder amid the jungles of Africa.

That dream soon became an all-consuming desire not just to go to Africa but to live there, trying to understand the animals and help them.

Every night Jane tucked Jubilee into bed and fell asleep with that dream, until one day — and such is the genius of McDonnell’s elegantly simple message of the dreamer’s doggedness — she awakes in a tent in the Gombe, the seedbed of what would become a remarkable career and an extraordinary life of purpose.

Goodall herself — who founded the heartening youth-led learning and community action initiative Roots & Shoots — writes in the afterword:

We cannot live through a single day without making an impact on the world around us — and we have a choice as to what sort of difference we make… Children are motivated when they can see the positive results their hard work can have.

See more, including a wonderful jazz tribute to Goodall, here.

PABLO NERUDA

Nobel laureate Pablo Neruda was not only one of the greatest poets in human history, but also a man of extraordinary insight into the human experience and the creative impulse — take, for instance, his remarkable reflection on what a childhood encounter taught him about why we make art, quite possibly the most beautiful metaphor for the creative impulse ever committed to paper.

His story and spirit spring alive in Pablo Neruda: Poet of the People (public library) by writer Monica Brown, with absolutely stunning illustrations and hand-lettering by artist Julie Paschkis.

The story begins with the poet’s birth in Chile in 1904 with the given name of Ricardo Eliecer Neftalí Reyes Basoalto — to evade his father’s disapproval of his poetry, he came up with the pen name “Pablo Neruda” at the age of sixteen when he first began publishing his work — and traces his evolution as a writer, his political awakening as an activist, his deep love of people and language and the luminosity of life.

Neftalí wasn’t very good at soccer or at throwing acorns like his friends, but he loved to read and discovered magic between the pages.

Embedded in the story is a sweet reminder of what books do for the soul and a heartening assurance that creative genius isn’t the product of conforming to common standards of excellence but of finding one’s element.

In fact, the book is as much a celebration of Neruda as it is a love letter to language itself — swirling through Paschkis’s vibrant illustrations are words both English and Spanish, beautiful words like “fathom” and “plummet” and “flicker” and “sigh” and “azul.”

Originally featured here.

E.E. CUMMINGS

“In a Cummings poem,” Susan Cheever wrote in her spectacular biography of E. E. Cummings, “the reader must often pick his way toward comprehension, which comes, when it does, in a burst of delight and recognition.” Such a burst is what rewards the reader, whatever his or her age, in Enormous Smallness: A Story of E. E. Cummings (public library) — an uncommonly delightful picture-book celebration of Cummings’s life by Brooklyn-based poet Matthew Burgess, illustrated by Kris Di Giacomo (the artist behind the wonderful alphabet book Take Away the A).

To reimagine the beloved poet’s life in a tango of word and image is quite befitting — unbeknownst to many, Cummings had a passion for drawing and once described himself as “an author of pictures, a draughtsman of words.”

The project comes from Brooklyn-based indie powerhouse Enchanted Lion Books — publisher of some of the most daring and tender children’s books of our time — and was first envisioned by ELB founder Claudia Zoe Bedrick, who approached Burgess about writing a children’s biography of Cummings. Miraculously, Burgess had visited Cummings’s home at 4 Patchin Place in New York City three years earlier, after a serendipitous encounter with the current resident — an experience that had planted a seed of quietly germinating obsession with the legendary poet’s life.

And so the collaboration stretched between them, as Cummings might say, like “a pleasant song” — Burgess and Bedrick worked side by side for four years to bring this wonder of a book to life.

The story begins with Cummings, already known as “E. E.” and living in his New York City home where he spent the last forty years of his life, typing away as the love of his life, the fashion model and photographer Marion Morehouse, summons him to tea-time with an elephant-shaped bell.

From there, Burgess takes the reader on an affectionate biographical detective story, tracing how Edward Estlin became E. E., what brought him to Manhattan from his native Cambridge, and how elephants (and trees, and birds) became his lifelong creative companions in the circus of his imagination.

Young Estlin’s first poem “poured out of his mouth when he was only three.”

With the loving support of the unsung champions with whom the history of creative culture is strewn — the mother who began recording his spontaneous recitations in a little book titled “Estlin’s Original Poems”; the father who stomped on his hands and knees, play-pretending into existence the mighty elephant that was little Estlin’s creative muse; the teacher who encouraged him to pursue his love of words; the uncle who gave him a book on how to write poetry — he eventually made it to Harvard.

There, he came upon the words of his favorite poet, John Keats — “I am certain of nothing but the holiness of the Heart’s affections and the truth of the Imagination” — which awakened young Estlin’s creative courage. After graduation, he began experimenting with poetry and moved to New York City, falling in love with its “irresistibly stupendous newness.”

But then World War I struck and Estlin went to France, volunteering as an ambulance-driver. While working in the French countryside, he was mistaken for a spy and sent to prison for several months.

When the war ended, he wrote a book about his experience, titled The Enormous Room. Estlin was reborn as E. E.

The following year, he published his first book of poems, Tulips & Chimneys.

Burgess writes:

Using a style all his own,
e. e. put lowercase letters where capitals normally go,
and his playful punctuation grabbed readers’ attention.

His poems were alive with experimentation
and surprise!

And because of his love for lowercase letters,
his name began to appear with two little e’s (& a little c, too).

But his expansive experimentation was too much for the small-minded literary pantheon:

Some people criticized him for painting with words.
Others said his poems were
too strange
too small.
Some said they were
no good at all.

And yet Cummings, who viewed society’s criteria for what it means to be a successful artist with mischievous wryness, was undeterred. A century before Neil Gaiman’s memorable advice that the artist’s only appropriate response to criticism is to make good art, Cummings embodied this ethos. Burgess captures this spirit with quiet elegance, weaving one of Cummings’s poems into the story:

But no matter what the world was giving or taking,
E. E. went right on dreaming and making.
For inside, he knew his poems were new and true.

love is a place

love is a place
& through this place of
love move
(with brightness of peace)
all places

yes is a world
& in this world of
yes live
(skillfully curled)
all worlds.

His poems were his way
of saying YES.

YES to the heart
and the roundness of the moon,
to birds, elephants, trees,
and everything he loved.

YES to spring, too
which always brought him back
to childhood, when the first
sign of his favorite season
was the whistling arrival
of the balloon man.

The book’s epigraph is a celebration of this unflinching yes-saying: “It takes courage to grow up and become who you really are.”

With that courage he catapulted himself into the open arms of those who also hungered for beauty and meaning, and became one of the world’s most beloved poets — a capital-A Artist of his own lowercase making.

Originally featured here.

ALBERT EINSTEIN

Albert Einstein (March 14, 1879–April 18, 1955) may have eventually bequeathed some excellent advice on the secret to learning anything, but the great scientist himself didn’t learn one of the most basic human skills — speaking — until he was nearly four years old. On a Beam of Light: A Story of Albert Einstein (public library) by Jennifer Berne, illustrated by Vladimir Radunsky — the talent behind Mark Twain’s irreverent Advice to Little Girls — tells the tale of how an unusual and awkward child blossomed into becoming “the quintessential modern genius” by the sheer virtue of his unrelenting curiosity.

The story begins with Albert’s birth — a beautiful but odd baby boy who turns one and doesn’t say a word, turns two, then three, and nary a word.

Instead, he “just looked around with his big curious eyes,” wondering about the world. His parents worried that there might be something wrong, but loved him unconditionally. And then:

One day, when Albert was sick in bed, his father brought him a compass — a small round case with a magnetic needle inside. No matter which way Albert turned the compass, the needle always pointed north, as if held by an invisible hand. Albert was so amazed his body trembled.

Suddenly, he knew there were mysteries in the world — hidden and silent, unknown and unseen. He wanted, more than anything, to understand those mysteries.

This was that pivotal spark of curiosity that catapulted his young mind into a lifetime of exploring those mysteries. (One can’t help but wonder whether a similar child, today, would have a similar awakening of mind while beholding a smartphone’s fully automated GPS map. But, perhaps, that modern child would be developing a wholly different type of intelligence.)

Young Albert began asking countless questions at home and at school — so much so, that his teachers chastised him for being a disturbance, admonishing the little boy that he would get nowhere in life unless he learned to follow the rules and behave like the other kids. And yet the mysteries of the universe drew Albert deeper into inquiry.

One day, while riding his bicycle, he gazed at the rays of sunlight beaming from the Sun to the Earth and wondered what it would be like to ride on them, transporting himself into that fantasy:

It was the biggest, most exciting thought Albert had ever had. And it filled his mind with questions.

So he set out to answer them by burying himself in books, reading and discovering the poetry of numbers, that special secret language for decoding the mysteries of the universe.

Once he graduated from college, unable to find a teaching position, he settled for a low-key, quiet government job that allowed him to spend plenty of time with his thoughts and his mathematical explorations, pondering the everyday enigmas of life, until his thoughts coalesced into ideas that made sense of it all — ideas about atoms and motion and space and time. Soon, Albert became an internationally celebrated man of genius.

But with that came the necessary amount of eccentricity — or at least what seemed eccentric from the outside, but is in fact a vital part of any creative mind. Albert, for instance, liked to play his violin when he was having a hard time solving a particularly tricky problem — a perfect way to engage the incubation stage of the creative process, wherein the mind, engulfed in unconscious processing, makes “no effort of a direct nature” in order to later arrive at “sudden illumination.”

Some of his habits, however, were decidedly, and charmingly, quirky: He regularly wandered around town eating an ice-cream cone, and he preferred to wear no socks — not because he tried to be a pseudo-nonconformist, but because he “even chose his clothes for thinking,” often clad in his signature “comfy, old saggy-baggy sweaters and pants.”

Still, everywhere he went, he remained mesmerized by the mysteries of the universe, and the echoes of his thoughts framed much of our modern understanding of the world:

Albert’s ideas helped build spaceships and satellites that travel to the moon and beyond. His thinking helped us understand the universe as no one ever had before.

And yet the central message of this altogether wonderful picture-book is that despite his genius — or, perhaps, precisely because of it — Einstein’s greatest legacy to us isn’t all the answers he bequeathed but all the open questions he left for today’s young minds to grow up pondering. Because, after all, it is “thoroughly conscious ignorance” that drives science and our understanding of life.

The final spread, reminiscent of these illustrated morphologies of Susan Sontag’s favorite things and Roland Barthes’s likes and dislikes, captures Einstein’s life in eight essentials:

Originally featured here.

ELLA FITZGERALD

From writer Roxanne Orgill and mixed-media artist Sean Qualls comes Skit-Scat Raggedy Cat: Ella Fitzgerald (public library) — the wonderfully illustrated rags-to-riches story of how The First Lady of Song sang her way from the streets of Yonkers to the cultural hall of fame, with a National Medal of Arts, a Presidential Medal of Freedom, and thirteen Grammys, including one for Lifetime Achievement.

From how she cranked the phonograph as a little girl to hear the Boswell Sisters’ honey-voices to how she saved her nickels to take the train to Harlem “forty-five minutes and a world away” for an audition to how her early passion for dancing became a lifelong love affair with song, the story captures not only her journey to public stardom but also the private gleam of this beautiful soul’s inner starlight.

For a touch of loveliness, interwoven throughout the biographical narrative are snippets of Fitzgerald’s most celebrated songs, extending to kids a warm invitation to discover the wonders of jazz — a modern-day counterpart to Langston Hughes’s vintage treasure The First Book of Jazz.

HENRI MATISSE

At 8PM on the last day of 1869, a little boy named Henri entered the world in a gray textile-mill town in the north of France, in a rundown two-room cottage with a leaky roof. He didn’t have much materially, but he was blessed with perhaps the greatest gift a child could have — an unconditionally loving, relentlessly supportive mother. Like many creative icons whose destinies were shaped by the unflinching encouragement of loved ones, little Henri became the great Henri Matisse thanks to his mother’s staunch support, which began with an unusual ignition spark: At the age of twenty, Henri was hospitalized for appendicitis and his mother brought him a set of art supplies with which to occupy his recovery. “From the moment I held the box of colors in my hands,” Matisse recounted, “I knew this was my life. I threw myself into it like a beast that plunges towards the thing it loves.” And that thing flowed from love, too — it was Matisse’s mother who encouraged her son, as E.E. Cummings encouraged all aspiring artists, to disregard the formal rules of art and instead paint from the heart. “My mother loved everything I did,” he asserted. Decades later, thanks to Gertrude Stein’s patronage, which catalyzed his career and sparked his friendship with Picasso, the world too would come to love what Matisse did.

In The Iridescence of Birds: A Book About Henri Matisse (public library), writer Patricia MacLachlan and illustrator Hadley Hooper tell the heartening story of young Henri’s childhood and how it shaped his artistic path long before he began painting — how his mother, in an attempt to brighten the drab and sunless days, put bright red rugs on the floors and painted colorful plates to hang on the walls, letting little Henri mix the paints; how his father gave him pigeons, whose iridescent plumage the boy observed with endless fascination; how the beautiful silks woven by the townspeople beguiled him with their bright patterns.

With a gentle sidewise gleam, the story offers a nuanced answer to the eternal nature-versus-nurture question of whether genius is born or made. Embedded in it is a wonderful testament to the idea that attentive presence rather than praise is the key to great parenting, especially when it comes to nurturing young talent. (Indeed, such maternal presence is what legendary editor Ursula Nordstrom provided for many of the young authors and artists — including, most notably, Maurice Sendak — whom she nurtured over the course of her reign as the twentieth century’s greatest patron saint of children’s books.)

For a delightful touch of empathy via a twist of perspective, MacLachlan places the reader in little Henri’s shoes:

If you were a boy named Henri Matisse who lived in a dreary town in northern France where the skies were gray

And the days were cold

And you wanted color and light

And sun,

And your mother, to brighten your days,

Painted plates to hang on the walls

With pictures of meadows and trees,

Rivers and birds,

And she let you mix the colors of paint…

… And you raised Pigeons

Watching their sharp eyes
And red feet,

And their colors that changed with the light
As they moved…

… Would it be a surprise that you became
A fine painter who painted
Light
and
Movement

And the iridescence of birds?

Beneath the biographical particulars of the story itself is MacLachlan’s larger inquiry into the enduring question of whether artists draw what they see or what they feel and remember — Matisse’s life, she writes in the afterword, attests to the fact that the two are inextricably entwined: “He painted his feelings and he painted his childhood.”

Hooper’s illustrations are themselves a masterwork of artistry, scholarship, and creative ingenuity. She spent considerable time studying Matisse’s sensibility and colors in reproductions of his drawings, cutouts, and paintings, then researched textile patterns from the era of his childhood and even used Google Maps to picture the actual streets that he walked as a little boy. The result is not imitation but dimensional celebration. Hooper reflects on the unusual and inventive technique she chose:

I decided to try relief printing, which forced me to simplify my shapes and allowed me to focus on the color and composition. I cut the characters and backgrounds out of stiff foam and cardboard, inked them up, made prints, and scanned the results into Photoshop. The approach felt right.

Originally featured here.

MARIE CURIE

Marie Curie (November 7, 1867–July 4, 1934) is one of the most extraordinary figures in the history of science and a tireless champion of curiosity and wonder. A pioneer in researching radioactivity, a field whose very name she coined, she was not only the first woman to win a Nobel Prize but also the first person to win two Nobel Prizes in two different sciences: chemistry and physics. In Radioactive: Marie & Pierre Curie: A Tale of Love and Fallout (public library), artist Lauren Redniss tells the story of Curie through the two invisible but immensely powerful forces that guided her life: radioactivity and love. It’s a turbulent story — a passionate romance with Pierre Curie (honeymoon on bicycles!), the epic discovery of radium and polonium, Pierre’s sudden death in a freak accident in 1906, Marie’s affair with physicist Paul Langevin, her coveted second Nobel Prize — under which lie poignant reflections on the implications of Curie’s work more than a century later, as we face ethically polarized issues like nuclear energy, radiation therapy in medicine, nuclear weapons, and more.

Most remarkable of all, however, is the thoughtfulness with which Redniss tailored her medium to her message, turning the book into a work of art in and of itself, every detail meticulously moulded to fit the essence of the narrative.

To stay true to Curie’s spirit and legacy, Redniss rendered her poetic artwork in cyanotype — a 19th-century cameraless photographic printing process, critical to the discovery of both X-rays and radioactivity itself, in which paper is coated with light-sensitive chemicals that turn a deep blue once exposed to the sun’s UV rays. The text in the book is set in a unique typeface Redniss designed using the title pages of 18th- and 19th-century manuscripts from the New York Public Library archive. She named it Eusapia LR, for the croquet-playing, sexually ravenous Italian Spiritualist medium whose séances the Curies used to attend. The book’s cover is printed in glow-in-the-dark ink.

See more, including a behind-the-scenes look at Redniss’s impressive creative process, here.

HARVEY MILK

“Injustice anywhere is a threat to justice everywhere,” Martin Luther King, Jr. wrote in his indispensable 1963 letter from Birmingham City Jail. “We are caught in an inescapable network of mutuality.” One rainy January Sunday fifteen years later, long before Edie Windsor catalyzed the triumph of marriage equality, Harvey Milk (May 22, 1930–November 27, 1978) was sworn into office on the steps of San Francisco’s City Hall and became the first openly gay elected city official in America. His assassination eleven months later devastated millions and rendered him modernity’s great secular martyr for love. His tenure, however tragically brief, forever changed the landscape of civil rights.

In The Harvey Milk Story (public library) — a wonderful addition to the best LGBT children’s books — writer Kari Krakow and artist David Gardner tell the heartening and heartbreaking story of how a little boy with big ears grew up to hear the cry for social justice and how he answered it with a groundbreaking clarion call for equality in the kingdom of love.

Harvey was born the second child of a middle-class Jewish family in upstate New York. He was a boy at once brimming with joy, frequently entertaining the family by conducting an invisible orchestra in the living room, and full of deep sensitivity to the suffering of others.

He was deeply moved when his mother, Minnie, told him the story of the Warsaw Ghetto Jews who courageously defended themselves even as the Nazis outnumbered them — a story that imprinted him with a profound empathy for the oppressed even before he had a clear sense that he would grow up to be one of them.

Although Harvey was athletic and popular in school, he anguished under the burden of a deep wistfulness — by the time he was fourteen, he knew he was gay, but like many queer people of his time, he kept this centerpiece of identity a closely guarded secret for a great many years to come.

He came of age, after all, in an era when queer couples celebrated their love only in private and when geniuses as vital to humanity as computing pioneer Alan Turing were driven to suicide after being criminally prosecuted by the government for being gay.

After graduating from college, Harvey joined the Navy, becoming an expert deep-sea diver and ascending through the ranks until he came to head a submarine rescue vessel.

When he went to his brother Robert’s wedding, he looked so handsome in his navy uniform that his family and friends all wondered when he would settle down and get married to the “right girl.”

But instead, like the hero of the heartwarming King & King fairy tale, Harvey fell in love and settled down with the right boy, a young man named Joe.

They moved together to a little town in New York, where Harvey became a high school math and science teacher. But after six years, Harvey and Joe separated — as Krakow points out, the pressure to hide their relationship in fear of losing their jobs put an undue strain on their love. Weary of hiding his identity, Harvey moved to San Francisco’s gay-friendly Castro neighborhood — where queer couples walked down the street holding hands like any other couple would in any other city — and he fell in love again.

Together with Scott, his new partner, Harvey opened a small store called Castro Camera, which soon turned into a community center as Harvey became a one-man Craigslist, counseling neighbors on everything from finding apartments to applying for jobs.

The more Harvey listened to the people, the more he sensed that they needed a leader — not only an informal one, but one who fought on their behalf in the eyes of the law, standing up to the police who harassed them constantly and fighting against the daily indignities of discrimination, from which the political system failed to protect them. Harvey saw only one course of action — to run for office. His customers and the community embraced his campaign and volunteered their time.

Eleven-year-old Medora Payne came every day after school to lick envelopes and hand out brochures for Harvey. She organized a fundraiser at her school, earning $39.28 for his campaign.

Bigots believed that it wasn’t right or even possible for an openly gay candidate to be elected. Indeed, Harvey lost three consecutive election cycles between 1973 and 1976, but didn’t lose faith. He remained emboldened by the unflinching conviction that the rights of minorities — not only the LGBT community, but also African Americans, Asian Americans, senior citizens, and the disabled — weren’t adequately represented in and protected by the government. His people loved him for the dedication.

At last, in 1977, he was elected to the city’s Board of Supervisors and sworn into office the following January as Supervisor Milk. He immediately set out to champion greater quality of life for the people of the city — a kind of Robert Moses without the evil genius, bolstering the city’s parks, schools, and police protection. Eventually, he introduced a pioneering gay bill of rights. After ten of the city’s eleven supervisors voted for it, Mayor George Moscone signed it into law, proclaiming with gusto as Milk stood by his side:

I don’t do this enough, taking swift and unambiguous action on a substantial move for civil rights.

It was a historic moment, marked by a moving speech Milk made in front of City Hall, calling for a gay rights march in Washington.

But as the city celebrated, one man sat consumed with hateful bigotry and personal jealousy — Dan White, the only Supervisor who hadn’t voted for Milk’s bill and who had resigned from office in a petty act of protest, only to ask for his job back ten days later. Sensing his ill will, Mayor Moscone had refused to hire him back.

On a gloomy November morning, White crept into City Hall through a basement window, with a loaded gun. He barged into Moscone’s office and shot the mayor, promptly reloading his gun and heading down the hall to Harvey Milk’s office. Five shots echoed through the marble building.

Harvey Milk was dead.

People everywhere were stunned by the news of the double assassination. They left their homes, jobs, and schools to mourn the loss of these two great leaders. Crowds began forming in front of City Hall. By nightfall, thousands filled the mile-long stretch that ran from the Castro to City Hall. They stood in silence, carrying candles. That night the people of San Francisco wept.

Harvey Milk was gone, but his legacy only gained momentum in the fight for civil rights. The following October, a hundred thousand people brought his dream to life and took to the streets of Washington in the capital’s first-ever Gay Pride March, many carrying portraits of the slain San Francisco hero.

Thirty-four years later, one brave woman picked up where he left off and made possible a dream even Milk didn’t dare to dream — one which the president himself proclaimed “a victory for American democracy,” the triumphant road to which Milk had paved.

Originally featured here.

MARIA MERIAN

Inspired children’s books about science are woefully rare in our culture — as rare, perhaps, as are homages to pioneering female scientists and celebrations of the intersection of art and science. The confluence of these three rarities is what makes Summer Birds: The Butterflies of Maria Merian (public library) — a young-readers counterpart to Taschen’s lavish volume Maria Sibylla Merian: Insects of Surinam — so wonderful. Writer Margarita Engle and artist Julie Paschkis tell the story of 17th-century German naturalist and illustrator Maria Merian, whose studies of butterfly metamorphosis are among the most important contributions to the field of entomology in the history of science and forever transformed natural history illustration.

There are many ennobling and empowering threads to the story of Merian’s life — how she began studying insects as a young girl, two centuries before the dawn of science education for women; how she trained tirelessly in art, then brought those skills to illuminating science, all while raising her daughters; how she traveled to South America with her young daughter in an era when women had practically no agency of mobility; how she continued to work even after a stroke left her paralyzed.

But perhaps most pause-giving of all is the reminder of just how much superstition early scientists had to overcome in the service of simple truth: In Merian’s time, people considered insects evil and found the “supernatural” process of metamorphosis particularly ominous, believing it was witchcraft that transformed the insect from one state to another.

By meticulous and attentive observation, Merian proved that the process was very much a natural one, and beautifully so. She was only thirteen. Her groundbreaking work was a prescient testament to Richard Feynman’s famous assertion that science only adds to the mystery and the awe of the natural world.

When people understand the life cycles of creatures that change forms, they will stop calling small animals evil. They will learn, as I have, by seeing a wingless caterpillar turn into a flying summer bird.

On her site, Paschkis shares her research process and offers a fascinating history of insect illustration.

Originally featured here.

ANTOINE DE SAINT-EXUPÉRY

“The Little Prince will shine upon children with a sidewise gleam. It will strike them in some place that is not the mind and glow there until the time comes for them to comprehend it.” So sang a 1943 review of The Little Prince, published the year before the beloved book’s author disappeared over the Mediterranean, never to return. But though it ultimately became the cause of his tragic death, Antoine de Saint-Exupéry’s experience as a pilot also informed the richness of his life and the expansive reach of his spirit, from his reflection on what his time in the Sahara desert taught him about the meaning of life to his beautiful meditation on the life-saving potential of a human smile. It was at the root of his identity and his imagination, and as such inspired the inception of The Little Prince.

That interplay between Saint-Exupéry the pilot and Saint-Exupéry the imaginative creator of a cultural classic is what celebrated Czech-born American children’s book author and illustrator Peter Sís explores in the beautiful graphic biography The Pilot and the Little Prince (public library) — a sensitive account of Saint-Exupéry’s life, underpinned by a fascinating chronicle of how aviation came to change humanity and a poignant undercurrent of political history, absolutely magical in its harmonized entirety.

Saint-Exupéry was born in France in 1900, in a golden age of discovery — the dawn of aviation was just breaking, emanating an exhilarating spirit of exploration and invention. Young Antoine quickly became enchanted with that exhilaration and, at the age of twelve, built a makeshift flying machine.

Sís writes:

It did not take off, but this didn’t discourage him.

That summer, he rode his bike to a nearby airfield every day to watch the pilots test planes. He told them he had permission from his mother to fly, so one pilot took him up in the air. His mother was not happy. Antoine couldn’t wait to go up again.

The obsession had permanently lodged itself into his psyche. When the war came and he was summoned to military duty, young Saint-Exupéry requested the air force but was assigned to the ground crew. Again, he remained unperturbed. Two years later, when he heard about a new airline operated by the postal service to deliver the mail, he got himself hired — first as a mechanic, and soon as a test pilot, eventually learning to fly by accompanying other pilots on mail routes. Sís writes:

One day, he heard the news he had been waiting for: he would fly the mail from France to Spain by himself. Henri Guillaumet, another pilot and later Antoine’s good friend, told him not just to depend on the map but to follow the face of the landscape.

Saint-Exupéry was living his dream, flying in Europe and West Africa. Eventually, the airline assigned him to an airfield in Cape Juby in southern Morocco, and the two years he spent in the desert were among the happiest in his life, a period he would go on to cherish with beautiful and bittersweet wistfulness for the rest of his days. Sís captures the romantic poetics of the experience:

He lived in a wooden shack and had few belongings and fewer visitors. With an ocean on one side and desert everywhere else, it seemed like one of the loneliest places in the world. But he loved the solitude and being under millions of stars.

The locals came to call him Captain of the Birds as he rescued stranded pilots and appeased hostile nomads who had shot down planes and kidnapped flyers. His time in the desert became powerful fuel for his writing and the raw inspiration for The Little Prince. But the skies remained his greatest love. Sís traces the trajectory of Saint-Exupéry’s travels and passions:

Eager to explore other skies, Antoine joined his fellow aviators in creating new mail routes in South America. Nothing could stop them as they crossed glaciers, rain forests, and mountain peaks, battling fierce winds and wild storms.

Antoine spent more time in the air here than anywhere else because the pilots now also flew at night. With stars above and lights below, his world felt both immense and small.

Upon returning to France, Saint-Exupéry fell in love, got married, and reached significant fame as both a pilot and an author. But driven by his chronic adventurer’s restlessness, he continued to dream up expeditions that came to border on stunts. In one, he competed for a prize for the fastest flight between Paris and Saigon, but he and his copilot crashed in North Africa, surviving by a hair and wandering the desert for days before being rescued. In another, he set out to become the first French pilot to fly from New York to the tip of South America. The plane crashed near Guatemala City but, miraculously, he survived once more.

As World War II engulfed Europe, Saint-Exupéry was called for military duty once more, this time as a pilot, observing from high in the skies the atrocities the Germans inflicted all over. Once his war service ended, he decided he couldn’t continue to live in France under German occupation and fled to Portugal on a ship — a trip that would stir the very foundations of his soul and inspire his magnificent Letter to a Hostage — eventually ending up in New York, where he found himself lonesome and alienated.

After writing Flight to Arras and sending a copy to President Roosevelt with the inscription “For President Franklin Roosevelt, whose country is taking on the heavy burden of saving the world,” Saint-Exupéry bought a set of watercolor paints and began working on the illustrations for the story that would become The Little Prince. Sís captures the layered message of the book, informed both by Saint-Exupéry’s passions and his forlorn homesickness, with beautiful simplicity:

He described a planet more innocent than his own, with a boy who ventured far from home, questioned how things worked, and searched for answers.

But the author grew increasingly restless once more. Longing to fly again and to see his family, who had remained in France, he rejoined his old squadron in North Africa, requesting flights that would take him back to France. Sís captures the tragic bluntness of how Saint-Exupéry’s story ended, at once almost sterile in its abruptness and richly poetic in the context of his lifelong obsession:

On July 31, 1944, at 8:45am, he took off from Borgo, Corsica, to photograph enemy positions east of Lyon. It was a beautiful day. He was due back at 12:30.

But he never returned. Some say he forgot his oxygen mask and vanished at sea.

Maybe Antoine found his own glittering planet next to the stars.

Originally featured here.

IBN SINA

Humanity’s millennia-old quest to understand the human body is strewn with medical history milestones, but few individual figures merit as much credit as Persian prodigy-turned-polymath Ibn Sina (c. 980–1037 CE), commonly known in the West as Avicenna — one of the most influential thinkers in our civilization’s unfolding story. He authored 450 known works spanning physics, philosophy, astronomy, mathematics, logic, poetry, and medicine, including the seminal encyclopedia The Canon of Medicine, which forever changed our understanding of the human body and its inner workings. This masterwork of science and philosophy — or metaphysics, as it was then called — remained a centerpiece of medieval medical education for six hundred years after Ibn Sina’s death.

His story comes to life in The Amazing Discoveries of Ibn Sina (public library) by Lebanese writer Fatima Sharafeddine, Iran-based Iraqi illustrator Intelaq Mohammed Ali, and Canadian indie powerhouse Groundwood Books — a fine addition to the loveliest children’s books celebrating science.

In stunning illustrations reminiscent of ancient Islamic manuscript paintings, this lyrical first-person biography traces Ibn Sina’s life from his childhood as a voracious reader to his numerous scientific discoveries to his lifelong project of advancing the art of healing.

A universal celebration of curiosity and the unrelenting pursuit of knowledge, the story is doubly delightful for adding a sorely needed touch of diversity to the homogenous landscape of both science history and contemporary children’s books — here are two Middle Eastern women, telling the story of a pioneering scientist from the Islamic Golden Age.

Originally featured here.

FRIDA KAHLO

Mexican painter Frida Kahlo (July 6, 1907–July 13, 1954) was a woman of vibrantly tenacious spirit who overcame an unfair share of adversity to become one of humanity’s most remarkable artists and a wholehearted human being out of whom poured passionate love letters and compassionate friend-letters.

The polio she contracted as a child left her right leg underdeveloped — an imperfection she’d later come to disguise with her famous colorful skirts. As a teenager, having just become one of only thirty-five female students at Mexico’s prestigious Preparatoria school, Kahlo was in a serious traffic accident that sent an iron rod through her stomach and uterus. She spent three months in a full-body cast and even though the doctors didn’t believe it possible, she willed her way to walking again. Although the remainder of her life was strewn with relapses of extreme pain, frequent hospital visits, and more than thirty operations, that initial recovery period was a crucial part of her creative journey.

True to Roald Dahl’s conviction that illness emboldens creativity, Kahlo made her first strides in painting while bedridden, as a way of occupying herself, painting mostly her own image. Today, she remains best-known for her vibrant self-portraits, which comprise more than a third of her paintings, blending motifs from traditional Mexican art with a surrealist aesthetic. Above all, she became a testament to the notion that we can transcend external limitations to define our scope of possibility.

Kahlo’s singular spirit and story spring to life in the immeasurably wonderful Viva Frida (public library) by writer/illustrator Yuyi Morales and photographer Tim O’Meara.

In simple, lyrical words and enchanting photo-illustrations, this dreamlike bilingual beauty tells the story of an uncommon Alice in a luminous Wonderland of her own making.

Morales, who painstakingly handcrafted all the figurines and props and staged each vignette, writes in the afterword:

When I think of Frida Kahlo, I think of orgullo, pride. Growing up in Mexico, I wanted to know more about this woman with her mustache and unibrow. Who was this artist who had unapologetically filled her paintings with old and new symbols of Mexican culture in order to tell her own story?

I wasn’t always so taken by Frida. When I was younger, I often found her paintings tortuous and difficult to understand. The more I learned about Frida’s life, the more her paintings began to take on new light for me. I finally saw that what had terrified me about Frida’s images was actually her way of expressing the things she felt, feared, and wanted.

[…]

Her work was proud and unafraid and introduced the world to a side of Mexican culture that had been hidden from view.

As a child, while learning to draw, I would often study my own reflection in the mirror and think about Frida. Did she know how many artists she influenced with her courage and her ability to overcome her own limitations?

See more, including a behind-the-scenes look at Morales’s meticulous craftsmanship and creative process, here.

ERNEST SHACKLETON

In August of 1914, legendary British explorer Ernest Shackleton led his brave crew of men and dogs on a journey to the end of the world — the enigmatic continent of Antarctica. That voyage — monumental both historically and scientifically — would become the last expedition of the Heroic Age of Antarctic Exploration, which stretched from 1888 to 1914. From Flying Eye Books — the children’s book imprint of British indie press Nobrow, which gave us Freud’s comic biography, Blexbolex’s brilliant No Man’s Land and some gorgeous illustrated histories of aviation and the Space Race — comes Shackleton’s Journey (public library), a magnificent chronicle by emerging illustrator William Grill, whose affectionate and enchanting colored-pencil drawings bring to life the legendary explorer and his historic expedition.

As Grill tells us in the introduction, Shackleton was a rather extraordinary character:

Shackleton was the second of ten children. From a young age, Shackleton complained about teachers, but he had a keen interest in books, especially poetry — years later, on expeditions, he would read to his crew to lift their spirits. Always restless, the young Ernest left school at 16 to go to sea. After working his way up the ranks, he told his friends, “I think I can do something better, I want to make a name for myself.”

And make it he did. Reflecting on the inescapable allure of exploration, which carried him through his life of adventurous purpose, Shackleton once remarked:

I felt strangely drawn to the mysterious south. I vowed to myself that some day I would go to the region of ice and snow, and go on and on ’til I came to one of the poles of the Earth, the end of the axis on which this great round ball turns.

From the funding and recruitment of the famed expedition, to the pioneering engineering of the Endurance ship, to the taxonomy of crew members, dogs, and supplies, Grill traces Shackleton’s tumultuous journey from the moment the crew set sail to their misfortune-induced change of plans and soul-wrenching isolation “500 miles away from the nearest civilization” to their eventual escape from their icy prison and salvation on Elephant Island.

As a lover of dogs and visual lists, especially illustrated lists and dog-themed illustrations, I was especially taken with Grill’s visual inventories of equipment and dogs:

Despite the gargantuan challenges and life-threatening curveballs, Shackleton’s expedition drew to a heroic close without the loss of a single life. It is a story of unrelenting ambition to change the course of history, unflinching courage in the face of formidable setbacks, and above all optimism against all odds — the same optimism that emanates with incredible warmth from Grill’s tender illustrations.

Years later, Shackleton himself captured the spirit that carried them:

I chose life over death for myself and my friends… I believe it is in our nature to explore, to reach out into the unknown. The only true failure would be not to explore at all.

Originally featured here.

JULIA CHILD

Legendary chef Julia Child (August 15, 1912–August 13, 2004) not only revolutionized the world of cookbooks but was also a remarkable beacon of entrepreneurship and perseverance more than a decade before women started raising their voices in the media world. Her unrelenting spirit and generous heart cast her as one of modern history’s most timeless role models, and that’s precisely what writer and illustrator Jessie Hartland celebrates in the endlessly wonderful Bon Appetit! The Delicious Life of Julia Child (public library) — a heartening illustrated biography of the beloved chef, intended to enchant young readers with her story but certain to delight all of us. Hartland’s vibrant drawings — somewhere between Maira Kalman, Wendy MacNaughton, and Vladimir Radunsky — exude the very charisma that made Child an icon, and infuse her legacy with fresh joy.

Amidst the beautiful illustrations are practical glimpses of Child’s culinary tricks and the context of her recipes:

At the end of the story, as at the end of her life, Child emerges not only as a masterful cook but also as a fierce entrepreneur, a humble human, and a restlessly creative soul.

Originally featured here.

HENRI ROUSSEAU

“People working in the arts engage in street combat with The Fraud Police on a daily basis,” Amanda Palmer wrote in her fantastic manifesto for the creative life, one of the best books of the year, “because much of our work is new and not readily or conventionally categorized.” Few artists in history have lived through this street combat with more dignity and resilience of spirit than French Post-Impressionist painter Henri Rousseau (May 21, 1844–September 2, 1910). Long before history came to celebrate him as one of the greatest artists of his era, long before he was honored with major retrospectives at such iconic institutions as MoMA and the Tate, long before Sylvia Plath began weaving homages to him into her poetry, he spent a lifetime being not merely dismissed but ridiculed. And yet Rousseau — who was born into poverty, began working alongside his plumber father as a young boy, still worked as a toll collector by the age of forty, and was entirely self-taught in painting — withstood the unending barrage of harsh criticism with which his art was met during his entire life, and continued to paint from a deep place of creative conviction, with an irrepressible impulse to make art anyway.

In The Fantastic Jungles of Henri Rousseau (public library), writer Michelle Markel and illustrator Amanda Hall tell an emboldening real-life story, and a stunningly illustrated one, of remarkable resilience and optimism in the face of public criticism; of cultivating a center so solid and a creative vision so unflinching that no outside attack can demolish it or obstruct its transmutation into greatness; of embodying Ray Bradbury’s capacity for weathering the storm of rejection and Picasso’s conviction about never compromising in one’s art.

Henri Rousseau wants to be an artist.
Not a single person has ever told him he is talented.
He’s a toll collector.
He’s forty years old.

But he buys some canvas, paint, and brushes, and starts painting anyway.

Rousseau’s impulse for art sprang from his deep love of nature — a manifestation of the very thing that seventeen-year-old Virginia Woolf intuited when she wrote in her diary that the arts “imitate as far as they can the one great truth that all can see”.

Unable to afford art lessons, Rousseau educated himself by going to the Louvre to study the paintings of his favorite artists and examining photographs, magazines, and catalogs to learn about the anatomy of the human body.

At the age of forty-one, he showed his work as part of a big art exhibition, but his art — vibrant, flat, seemingly childish — was met, as Markel writes, with “only mean things.” Even so, Rousseau saved the reviews and pasted them into his scrapbook.

With his voracious appetite for inspiration, Rousseau visited the World’s Fair, where he was especially enchanted by the exhibits of exotic lands. “They remind him of adventure stories he loved when he was a boy,” Markel writes. The vivid images haunted him for days, until he finally turned to the easel to exorcise his restless imagination.

He holds his paintbrush to the canvas. A tiger crawls out. Lightning strikes, and wind whips the jungle grass.

Sometimes Henri is so startled by what he paints that he has to open the window to let in some air.

But for all his earnest creative exuberance, he is met with derision.

Every year Henri goes back to the art exhibition to show new paintings. He fusses over the canvases and retouches them until the last minute.

And every year the art experts make fun of him. They say it looks like he closed his eyes and painted with his feet.

And yet Rousseau manages to embody Georgia O’Keeffe’s credo that “whether you succeed or not is irrelevant… making your unknown known is the important thing” — he continues to paint, to study nature, and to rejoice in the process itself.

One night, he dreams up a painting of which he is especially proud, depicting a lion looking over a sleeping gypsy with friendly curiosity.

Once again he takes his work to the art show. This time, perhaps, he’ll please the experts. His pulse races.

The experts say he paints like a child. “If you want to have a good laugh,” one of them writes, “go see the paintings by Henri Rousseau.”

By now Henri is used to the nasty critics. He knows his shapes are simpler and flatter than everyone else’s, but he thinks that makes them lovely.

Everything he earns by giving music lessons, he spends on art supplies. But he lives by Thoreau’s definition of success.

His home is a shabby little studio, where one pot of stew must last the whole week. But every morning he wakes up and smiles at his pictures.

At sixty-one, Rousseau is still living in poverty, but happily paints his jubilant junglescapes. He continues to hope for critical acclaim and continues to be denied it, cruelly, by the “experts,” one of whom even says that “only cavemen would be impressed by his art.”

At last, Rousseau, already an old man, gets a break — but the recognition comes from a new generation of younger artists, who befriend him and come to admire his work. More than his talent and his stomach for criticism, however, one comes to admire his immensely kind and generous heart.

Whenever Henri has money to spare, he stages a concert in his little studio, and all the artists come. Along with the grocer, locksmith, and other folks from the neighborhood, they listen to Henri’s students and friends play their musical instruments. Henri gives the shiniest, reddest apples to the children.

Eventually, even Picasso pays heed and throws old Henri a banquet, at which “the old man sits upon a makeshift throne” playing his violin as people dance and celebrate around him, his heart floating “like a hot-air balloon above the fields.”

At the end of his life, Rousseau paints his masterwork “The Dream” and finally becomes successful by a public standard as the critics, at last, grant him acclaim. But the beautiful irony and the ennobling message of the story is that he was successful all along, for he had found his purpose — a feat with which even Van Gogh struggled for years — and filled each day with the invigorating joy of making his unknown known.

A hundred years later, the flowers still blossom, the monkeys still frolic, and the snakes keep slithering through Henri’s hot jungles. His paintings now hang in museums all over the world. And do you think experts call them “foolish,” “clumsy,” or “monstrous”? Mais non! They call them works of art.

By an old man,
by a onetime toll collector,
by one of the most gifted self-taught artists in history:
Henri Rousseau

Originally featured here.

* * *

For a different, more grownup celebration of notable lives, complement these children’s-books treasures with the graphic-novel biographies of Sigmund Freud, Salvador Dalí, Karl Marx, Robert Moses, Andy Warhol, Charles Darwin, Francis Bacon, Richard Feynman, Steve Jobs, and Hunter S. Thompson.


Consolation for Life’s Darkest Hours: 7 Unusual and Wonderful Books that Help Children Grieve and Make Sense of Death

From Japanese pop-up magic to Scandinavian storytelling to Maurice Sendak, a gentle primer on the messiness of mourning and the many faces and phases of grief.

UPDATE: Also see three more recent crown jewels of the genre: Cry, Heart, But Never Break, Michael Rosen’s Sad Book, and Duck, Death and the Tulip.

“If you are protected from dark things,” Neil Gaiman said in the context of his fantastic recent adaptation of the Brothers Grimm, “then you have no protection of, knowledge of, or understanding of dark things when they show up.” Maurice Sendak was equally adamant about not shielding young minds from the dark. Tolkien believed that there is no such thing as “writing for children” and E.B. White admonished that kids shouldn’t be written down to but written up to. In her wise reflection on the difference between myth and deception, Margaret Mead asserted that “children who have been told the truth about birth and death will know … that this is a truth of a different kind.”

And yet we hardly tell children — nor ourselves — those truths. Half a century after children’s literature patron saint Ursula Nordstrom lamented that “some mediocre ladies in influential positions are actually embarrassed by an unusual book,” most books for young readers still struggle to validate children’s darker emotions and make room for difficult, complex, yet inescapable experiences like loss, loneliness, and uncertainty.

* * *

UPDATE: For three more recent additions, see Cry, Heart, But Never Break by Danish duo Glenn Ringtved and Charlotte Pardi, The Heart and the Bottle by Oliver Jeffers, and Michael Rosen’s Sad Book, illustrated by the great Quentin Blake.

Here are some proudly unusual books addressing these all too common yet commonly shirked emotional realities.

1. MY FATHER’S ARMS ARE A BOAT

For more than a decade, Brooklyn-based Enchanted Lion — an independent one-woman children’s book powerhouse — has been churning out some of the bravest and most sensitive picture-books of our time, championing foreign writers and artists who create layered universes of experience outside the unimaginative bounds of the pantheon. Among them is My Father’s Arms Are a Boat (public library) by writer Stein Erik Lunde and illustrator Øyvind Torseter (of The Hole fame), translated by Kari Dickson.

This tender Norwegian gem tells the story of an anxious young boy who climbs into his father’s arms seeking comfort on a cold sleepless night. The two step outside into the winter wonderland as the boy asks questions about the red birds in the spruce tree to be cut down the next morning, about the fox out hunting, about why his mother will never wake up again. With his warm and comforting answers, the father watches his son make sense of this strange world of ours, where love and loss go hand in hand.

Above all, it is a story about the quiet way in which boundless love and unconditional assurance can embolden even the heaviest of spirits to rise from the sinkhole of anxiety and anguish.

Lunde, who also writes lyrics and has translated Bob Dylan into Norwegian, is a masterful storyteller who unfolds incredible richness in few words. Meanwhile, Torseter’s exquisite 2D/3D style combining illustration and paper sculpture, reminiscent of Soyeon Kim’s wonderful You Are Stardust, envelops the story in a sheath of delicate whimsy.

2. THE FLAT RABBIT

When death comes and brings grief with it, as Joan Didion memorably put it, it’s “nothing like we expect it to be.” What we need isn’t so much protection from that engulfing darkness as the shaky comfort of understanding — a sensemaking mechanism for the messiness of loss.

That’s precisely what Faroese children’s book author and artist Bárður Oskarsson does in The Flat Rabbit (public library) — a masterwork of minimalist storytelling that speaks volumes about our eternal tussle with our own impermanence.

The book, translated by Faroese language-lover Marita Thomsen, comes from a long tradition of Scandinavian children’s books with singular sensitivity to such difficult subjects — from Tove Jansson’s vintage parables of uncertainty to Stein Erik Lunde’s Norwegian tale of grief to Øyvind Torseter’s existential meditation on the meaning of something and nothing.

The story, full of quiet wit and wistful wonder, begins with a carefree dog walking down the street. Suddenly, he comes upon a rabbit, lying silently flattened on the road. As the dog, saddened by the sight, wonders what to do, his friend the rat comes by.

“She is totally flat,” said the rat. For a while they just stood there looking at her.

“Do you know her?”

“Well,” said the dog, “I think she’s from number 34. I’ve never talked to her, but I peed on the gate a couple of times, so we’ve definitely met.”

The two agree that “lying there can’t be any fun” and decide to move her, but don’t know where to take her and head to the park to think.

The dog was now so deep in thought that, had you put your ear to his skull, you would have actually heard him racking his brain.

Embedded in the story is a subtle reminder that ideas don’t come to us by force of will but by the power of incubation as everything we’ve unconsciously absorbed clicks together into new combinations in our minds. As the dog sits straining his neurons, we see someone flying a kite behind him — a seeming aside noted only in the visual narrative, but one that becomes the seed for the rabbit solution.

Exclaiming that he has a plan, the dog returns to the scene with the rat. They take the rabbit from the road and work all night on the plan, hammering away in the doghouse.

In the next scene, we see the rabbit lovingly taped to the frame of a kite, which takes the dog and the rat forty-two attempts to fly.

With great simplicity and sensitivity, the story lifts off into a subtle meditation on the spiritual question of an afterlife — there is even the spatial alignment of a proverbial heaven “above.” It suggests — to my mind, at least — that all such notions exist solely for the comfort of the living, for those who survive the dead and who confront their own mortality in that survival, and yet there is peace to be found in such illusory consolations anyway, which alone is reason enough to have them.

Mostly, the story serves as a gentle reminder that we simply don’t have all the answers and that, as John Updike put it, “the mystery of being is a permanent mystery.”

Once the kite was flying, they watched it in silence for a long time.

“Do you think she is having a good time?” the rat finally asked, without looking at the dog.

The dog tried to imagine what the world would look like from up there.

“I don’t know…” he replied slowly. “I don’t know.”

The Flat Rabbit was one of the best children’s books of 2014.

3. DAVEY MCGRAVY

If grief is so Sisyphean a struggle even for grownups, how are tiny humans to handle a weight so monumental once it presses down? Poet David Mason offers an uncommonly comforting answer in Davey McGravy (public library) — a lyrical litany of loss for children of all ages. Across a series of poems, accompanied by early-Sendakesque etchings by artist Grant Silverstein, we meet a little boy named Davey McGravy living in the tall-treed forest with his father and brothers. A few tender verses in, we realize that Davey is caught in the mire of mourning his mother.

Without invalidating the deep melancholy that has set in, Mason makes room for the mystery of life and death, inviting in the miraculous immortality of love. With great gentleness, he reminds us that whenever we grieve for someone we love, we grieve for our entire world; that whenever one grieves, the whole world grieves.

THE KITCHEN

He walked to where his father stood
and hugged him by a leg
and wept like the babe he used to be
in the green house by the lake

He wept for the giants in the woods
for the otter that swam in the waves.
He wept for his mother in the fog
so far away.

And then he felt a hand,
a big hand in his hair.
“It’s Davey McGravy,” his father said.
“I’m glad you’re here.”

“Davey McGravy,” he said again,
“How’s that for a brand new name?
Davey McGravy. Not so bad.
I like a name that rhymes.”

And there was his father on his knees
holding our boy in his arms.
And Davey McGravy felt the scratch
of whiskers and felt warm.

“Nobody else has a name like that.
It’s all your own.
Davey McGravy. Davey McGravy.
You could sing it in a song.”

And then his father kissed him,
ruffled his hair and said,
“Supper time, Davey McGravy.
Then it’s time for bed.”

TO LOVE

May I call you Love?

Very well, then, you are Love,
and this is a tale about a boy
named Davey.

Never mind the rest of his name.
You need only know that he was born
in the land of rain
and the tallest of tall trees —

great shaggy cedars like the boots
of giants covered in green,
and where the giants had gone
no one could ever tell.

Only their boots remained
on the wet green grass,
surrounded by ferns on the shore
of a long, cold, windy lake.

That’s where Davey was born, Love.
That’s where you must imagine him,
a wee squall of tears and swaddling,
a babe, as you were too a babe,

with parents and the whole canoe,
the whole catastrophe
we call a family —
the human zoo.

Only a rare poet can merge the reverence of Thoreau with the irreverence of Zorba the Greek to create something wholly unlike anything else — and that is what Mason accomplishes in Davey McGravy.

4. WE ARE ALL IN THE DUMPS WITH JACK AND GUY

The 1993 masterwork We Are All in the Dumps with Jack and Guy (public library), which I’ve covered extensively here, is the darkest yet most hopeful book Maurice Sendak ever created, as well as one of his most personal. It’s an unusual fusion of two traditional Mother Goose nursery rhymes — “In the Dumps” and “Jack and Gye” — reimagined and interpreted by Sendak’s singular sensibility, and permeated by many layers of cultural and personal subtext.

On a most basic level, the story follows a famished black baby, part of a clan of homeless children dressed in newspaper and living in boxes, kidnapped by a gang of giant rats. Jack and Guy, who are strolling nearby and first brush the homeless kids off, witness the kidnapping and set out to rescue the boy. But the rats challenge them to a rigged game of bridge, with the child as the prize. After a series of challenges that play out across a number of scary scenes, Jack and Guy emerge victorious and save the boy with the help of the omniscient Moon and a mighty white cat that chases the rats away.

Created at the piercing pinnacle of the AIDS plague and amid an epidemic of homelessness, it is a highly symbolic, sensitive tale that reads almost like a cry for mercy, for light, for resurrection of the human spirit at a time of incomprehensible heartbreak and grimness. It is, above all, a living monument to hope — one built not on the denial of hopelessness but on its delicate demolition.

But the book’s true magic lies in its integration of Sendak’s many identities — the son of Holocaust survivors, a gay man witnessing the devastation of AIDS, a deft juggler of darkness and light.

Jack and Guy appear to be a gay couple, and their triumph in rescuing the child resembles an adoption, two decades before that was an acceptable subject for a children’s book. “And we’ll bring him up / As other folk do,” the final pages read — and, once again, a double meaning reveals itself as two characters are depicted with wings on their backs, lifting off into the sky, lending the phrase “we’ll bring him up” an aura of salvation. In the end, the three curl up as a makeshift family amidst a world that is still vastly imperfect but full of love.

We are all in the dumps
For diamonds are thumps
The kittens are gone to St. Paul’s!
The baby is bit
The moon’s in a fit
And the houses are built
Without walls

Jack and Guy
Went out in the Rye
And they found a little boy
With one black eye
Come says Jack let’s knock
Him on the head
No says Guy
Let’s buy him some bread
You buy one loaf
And I’ll buy two
And we’ll bring him up
As other folk do

In many ways, this is Sendak’s most important and most personal book. In fact, Sendak would resurrect the characters of Jack and Guy two decades later in his breathtaking final book, a posthumously published love letter to the world and to his partner of fifty years, Eugene Glynn. Jack and Guy, according to playwright Tony Kushner, a dear friend of Sendak’s, represented the two most important people in the beloved illustrator’s life — Jack was his real-life brother Jack, whose death devastated Sendak, and Guy was Eugene, the love of Sendak’s life, who survived him after half a century of what would have been given the legal dignity of a marriage had Sendak lived to see the dawn of marriage equality. (Sendak died thirteen months before the defeat of DOMA.)

All throughout, the book emanates Sendak’s greatest lifelong influence — like the verses and drawings of William Blake, Sendak’s visual poetry in We Are All in the Dumps with Jack and Guy is deeply concerned with the human spirit and, especially, with the plight of children. See more of it here.

5. LOVE IS FOREVER

In Love is Forever (public library), writer Casey Rislov, who holds a master’s degree in elementary education and has an intense interest in special needs, and artist Rachel Balsaits unpack the complexities of loss with elegant simplicity.

The sweet verses and tender illustrations tell the story of Little Owl, who loves her Grandfather Owl very much. With the help of her parents and baby brother, Little Owl processes the profound sadness over her grandfather’s death by learning to keep his love alive forever.

Our love is a gift, a treasure to hold,
a story in our hearts forevermore.

This gift of love we have been given
is one that is pure, constant and sure.

The final pages feature a short guide for parents and teachers to the basic psychological phenomena that the mourner experiences and how to address them in children.

6. NICOLAS

Nicolas (public library), the debut of Quebecois cartoonist Pascal Girard, is a kind of children’s book for grownups chronicling the many faces and phases of grief in a series of autobiographical sketches that unfold over the decades since the childhood death of Girard’s younger brother, Nicolas. With great subtlety, honesty, and unsentimental sensitivity, he explores the multitude of complex emotions — sadness, numbness, restlessness, anxiety, even boredom, in Kierkegaard’s sense of existential emptiness — and their disorienting nonlinear flow.

From the confusing first days after Nicolas’s death from lactic acidosis in 1990, to Girard’s teenage years awkwardly telling kids in high school about his loss, to life as an adult paralyzed with dread over having a child of his own on account of everything that might go wrong, this moving visual narrative is at once utterly harrowing and tenaciously hopeful, told with gentle humor and great humanity.

Woven throughout the deeply personal story are the common threads of mourning, universal to the human experience — how we cling to the illusion that understanding the details of death would make processing its absoluteness easier, how we channel our restlessness into an impulse to do something (there is Girard as a boy, fundraising for lactic acidosis research in his neighborhood; there he is as a teenager, numbing the unprocessed grief with drugs), how bearing witness to the mourning of others rekindles our own but also makes us more deeply empathetic (Nicolas, one realizes midway through the book, died exactly eleven years before the 9/11 attacks, the news of which resurfaces Girard’s grief as he is bowled over with empathy for the tragedy of others), and most of all how “the people we most love do become a physical part of us, ingrained in our synapses.”

What emerges is the elegant sidewise assurance that while grief never fully leaves us, we can be okay — more than that, in the words of Rilke, we can arrive at the difficult but transformative understanding that “death is our friend precisely because it brings us into absolute and passionate presence with all that is here, that is natural, that is love.”

7. LITTLE TREE

Pop-up books have a singular magic, but even the pioneering vintage “interactive” picture-books of Italian graphic designer Bruno Munari can’t compare to the beauty, subtlety, and exquisite elegance of those by Japanese graphic designer and book artist Katsumi Komagata.

When his daughter was born in 1990, Komagata expanded his graphic design studio, One Stroke, into publishing and began making extraordinary picture-books — including some particularly thoughtful and beguiling masterpieces for children with disabilities, from tactile pop-up gems to sign-language stories.

In 2008, Komagata released Little Tree (public library) — a most unusual and immeasurably wonderful story tracing the life-cycle of a single tree as it explores, with great subtlety and sensitivity, deeper themes of impermanence and the cycle of all life.

I received this delicate treasure as a gift from a dear friend, who had met Komagata at the Guadalajara International Book Fair. The book, she said, was inspired by a young child struggling with making sense of life and death after the loss of a beloved father, one of Komagata’s own dear friends.

On each spread of this whimsical trilingual story — told in Japanese, French, and English — a different stage of the tree’s growth unfolds, beginning with the tiny promise of a seedling poking through the snow.

No one notices such a small presence … be still here in the snow

Slowly, it grows into the recognizable shape of a tree and makes its way through the seasons — shy leaves greet the world in spring, a lush crown bathes in summer’s sunshine and turns a warm yellow, then a glowing red, as autumn embraces it.

A family of birds packs its nest, preparing to fly away for the winter.

When winter descends — that philosophical staple of intelligent children’s books — the mood darkens.

Clouds cover the sky
The wind blows hard, almost breaking the branches
Sheets of rain fill the darkness … be still here in the dark

But spring eventually returns, and the whole cycle repeats and repeats, until the tree grows “tall enough to look around when at the beginning it was too small and everything was big.”

Indeed, the book is very much a study in perspective — the existential through the spatial — as the tree’s height increases and its shadow shifts. With his gentle genius, Komagata casts the shadows of all peripheral characters and objects — a street lamp, a man walking his dog, a bird — not from the perspective of the reader but from that of the tree, appearing upside-down on the page. (To capture Komagata’s intended vignettes, I photographed the book from the top of the page facing down, following the tree’s viewpoint.)

And so the cycle of life continues — a new crow takes the nest built by last year’s bird, and as it observes these rhythms, the tree’s “point of view keeps changing.”

The man who lost a friend lays a flower down
It can’t be helped … be still here

But as wistful as the story is, the book is ultimately optimistic — a beautiful allegory for the same notion found in Rilke’s philosophy of befriending death in order to live more fully. At the end, the seed spurs a new turn of the cycle of life, going back to the beginning.

The seed was carried somewhere unknown
Surely it will exist for someone even though no one notices such a small presence at the beginning

* * *

For a grownup counterpart, see Meghan O’Rourke’s moving memoir of learning to live with loss, Anne Lamott on grief and gratitude, Atul Gawande’s indispensable Being Mortal, and Joanna Macy on how Rilke can help us befriend our mortality.


The 13 Best Science and Technology Books of 2013

The wonders of the gut, why our brains are wired to be social, what poetry and math have in common, swarm intelligence vs. “God,” and more.

On the heels of the year’s best reads in psychology and philosophy, art and design, history and biography, and children’s books, the season’s subjective selection of best-of reading lists continues with the finest science and technology books of 2013. (For more timeless stimulation, revisit the selections for 2012 and 2011.)

1. THIS EXPLAINS EVERYTHING

Every year since 1998, intellectual impresario and Edge editor John Brockman has been posing a single grand question to some of our time’s greatest thinkers across a wide spectrum of disciplines, then collecting the answers in an annual anthology. Last year’s answers to the question “What scientific concept will improve everybody’s cognitive toolkit?” were released in This Will Make You Smarter: New Scientific Concepts to Improve Your Thinking, one of the year’s best psychology and philosophy books.

In 2012, the question Brockman posed, proposed by none other than Steven Pinker, was “What is your favorite deep, elegant, or beautiful explanation?” The answers, representing an eclectic mix of 192 (alas, overwhelmingly male) minds spanning psychology, quantum physics, social science, political theory, philosophy, and more, are collected in the edited compendium This Explains Everything: Deep, Beautiful, and Elegant Theories of How the World Works (UK; public library) and are also available online.

In the introduction preceding the micro-essays, Brockman frames the question and its ultimate objective, adding to history’s most timeless definitions of science:

The ideas presented on Edge are speculative; they represent the frontiers in such areas as evolutionary biology, genetics, computer science, neurophysiology, psychology, cosmology, and physics. Emerging out of these contributions is a new natural philosophy, new ways of understanding physical systems, new ways of thinking that call into question many of our basic assumptions.

[…]

Perhaps the greatest pleasure in science comes from theories that derive the solution to some deep puzzle from a small set of simple principles in a surprising way. These explanations are called ‘beautiful’ or ‘elegant.’

[…]

The contributions presented here embrace scientific thinking in the broadest sense: as the most reliable way of gaining knowledge about anything — including such fields of inquiry as philosophy, mathematics, economics, history, language, and human behavior. The common thread is that a simple and nonobvious idea is proposed as the explanation of a diverse and complicated set of phenomena.

Puffer fish with Akule by photographer Wayne Levin. Click image for details.

Stanford neuroscientist Robert Sapolsky, eloquent as ever, marvels at the wisdom of the crowd and the emergence of swarm intelligence:

Observe a single ant, and it doesn’t make much sense, walking in one direction, suddenly careening in another for no obvious reason, doubling back on itself. Thoroughly unpredictable.

The same happens with two ants, a handful of ants. But a colony of ants makes fantastic sense. Specialized jobs, efficient means of exploiting new food sources, complex underground nests with temperature regulated within a few degrees. And critically, there’s no blueprint or central source of command—each individual ant has algorithms for its behaviors. But this is not wisdom of the crowd, where a bunch of reasonably informed individuals outperform a single expert. The ants aren’t reasonably informed about the big picture. Instead, the behavior algorithms of each ant consist of a few simple rules for interacting with the local environment and local ants. And out of this emerges a highly efficient colony.

Ant colonies excel at generating trails that connect locations in the shortest possible way, accomplished with simple rules about when to lay down a pheromone trail and what to do when encountering someone else’s trail—approximations of optimal solutions to the Traveling Salesman problem. This has useful applications. In “ant-based routing,” simulations using virtual ants with similar rules can generate optimal ways of connecting the nodes in a network, something of great interest to telecommunications companies. It applies to the developing brain, which must wire up vast numbers of neurons with vaster numbers of connections without constructing millions of miles of connecting axons. And migrating fetal neurons generate an efficient solution with a different version of ant-based routing.

A wonderful example is how local rules about attraction and repulsion (i.e., positive and negative charges) allow simple molecules in an organic soup to occasionally form more complex ones. Life may have originated this way without the requirement of bolts of lightning to catalyze the formation of complex molecules.

And why is self-organization so beautiful to my atheistic self? Because if complex, adaptive systems don’t require a blueprint, they don’t require a blueprint maker. If they don’t require lightning bolts, they don’t require Someone hurling lightning bolts.
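To make the mechanism concrete, here is a minimal sketch of the “ant-based routing” idea Sapolsky describes: a toy illustration rather than his work, with a made-up four-node network and hand-picked parameters. Virtual ants wander from nest to food, deposit pheromone in inverse proportion to the length of the path they took, and weight each next step by trail strength, while a little evaporation keeps stale trails from locking in. After a few hundred walks, the strongest trail traces the shortest route, with no blueprint and no blueprint maker.

import random

# Hypothetical toy network: node -> {neighbor: distance}
graph = {
    "nest": {"a": 2.0, "b": 5.0},
    "a": {"food": 2.0, "b": 2.0},
    "b": {"food": 1.0, "a": 2.0},
}

# Every edge starts with the same faint trail.
pheromone = {(u, v): 1.0 for u in graph for v in graph[u]}

def walk(start="nest", goal="food"):
    # One ant: choose each next hop in proportion to pheromone / distance.
    path, node = [start], start
    while node != goal:
        options = [(v, pheromone[(node, v)] / d)
                   for v, d in graph[node].items() if v not in path]
        if not options:  # dead end; abandon this walk
            return None
        nodes, weights = zip(*options)
        node = random.choices(nodes, weights=weights)[0]
        path.append(node)
    return path

def length(path):
    return sum(graph[u][v] for u, v in zip(path, path[1:]))

for _ in range(300):
    p = walk()
    if p is None:
        continue
    for edge in zip(p, p[1:]):
        pheromone[edge] += 1.0 / length(p)  # shorter path, stronger deposit
    for edge in pheromone:
        pheromone[edge] *= 0.99  # evaporation lets the colony forget bad trails

# The heaviest trail should lie on the short nest -> a -> food route.
print(max(pheromone, key=pheromone.get))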

Developmental psychologist Howard Gardner, who famously developed the seminal theory of multiple intelligences, echoes Anaïs Nin in advocating for the role of the individual and Susan Sontag in stressing the impact of individual acts on collective fate. His answer, arguing for the importance of human beings, comes as a welcome antidote to a question that risks being inherently reductionist:

In a planet occupied now by seven billion inhabitants, I am amazed by the difference that one human being can make. Think of classical music without Mozart or Stravinsky; of painting without Caravaggio, Picasso or Pollock; of drama without Shakespeare or Beckett. Think of the incredible contributions of Michelangelo or Leonardo, or, in recent times, the outpouring of deep feeling at the death of Steve Jobs (or, for that matter, Michael Jackson or Princess Diana). Think of human values in the absence of Moses or Christ.

[…]

Despite the laudatory efforts of scientists to ferret out patterns in human behavior, I continue to be struck by the impact of single individuals, or of small groups, working against the odds. As scholars, we cannot and should not sweep these instances under the investigative rug. We should bear in mind anthropologist Margaret Mead’s famous injunction: ‘Never doubt that a small group of thoughtful committed citizens can change the world. It is the only thing that ever has.’

Uber-curator Hans Ulrich Obrist, who also contributed to last year’s volume, considers the parallel role of patterns and chance in the works of iconic composer John Cage and painter Gerhard Richter, and the role of uncertainty in the creative process:

In art, the title of a work can often be its first explanation. And in this context I am thinking especially of the titles of Gerhard Richter. In 2006, when I visited Richter in his studio in Cologne, he had just finished a group of six corresponding abstract paintings which he gave the title Cage.

There are many relations between Richter’s painting and the compositions of John Cage. In a book about the Cage series, Robert Storr has traced them from Richter’s attendance of a Cage performance at the Festum Fluxorum Fluxus in Düsseldorf in 1963 to analogies in their artistic processes. Cage has often applied chance procedures in his compositions, notably with the use of the I Ching. Richter in his abstract paintings also intentionally allows effects of chance. In these paintings, he applies the oil paint on the canvas by means of a large squeegee. He selects the colors on the squeegee, but the factual trace that the paint leaves on the canvas is to a large extent the outcome of chance.

[…]

Richter’s concise title, Cage, can be unfolded into an extensive interpretation of these abstract paintings (and of other works)—but, one can say, the short form already contains everything. The title, like an explanation of a phenomenon, unlocks the works, describing their relation to one of the most important cultural figures of the twentieth century, John Cage, who shares with Richter the great themes of chance and uncertainty.

Writer, artist, and designer Douglas Coupland, whose biography of Marshall McLuhan remains indispensable, offers a lyrical meditation on the peculiar odds behind coincidences and déjà vus:

I take comfort in the fact that there are two human moments that seem to be doled out equally and democratically within the human condition—and that there is no satisfying ultimate explanation for either. One is coincidence, the other is déjà vu. It doesn’t matter if you’re Queen Elizabeth, one of the thirty-three miners rescued in Chile, a South Korean housewife or a migrant herder in Zimbabwe—in the span of 365 days you will pretty much have two déjà vus as well as one coincidence that makes you stop and say, “Wow, that was a coincidence.”

The thing about coincidence is that when you imagine the umpteen trillions of coincidences that can happen at any given moment, the fact is that, in practice, coincidences almost never do occur. Coincidences are actually so rare that when they do occur they are, in fact, memorable. This suggests to me that the universe is designed to ward off coincidence whenever possible—the universe hates coincidence—I don’t know why—it just seems to be true. So when a coincidence happens, that coincidence had to work awfully hard to escape the system. There’s a message there. What is it? Look. Look harder. Mathematicians perhaps have a theorem for this, and if they do, it might, by default, be a theorem for something larger than what they think it is.

What’s both eerie and interesting to me about déjà vus is that they occur almost like metronomes throughout our lives, about one every six months, a poetic timekeeping device that, at the very least, reminds us we are alive. I can safely assume that my thirteen-year-old niece, Stephen Hawking, and someone working in a Beijing luggage-making factory each experience two déjà vus a year. Not one. Not three. Two.

The underlying biodynamics of déjà vus is probably ascribable to some sort of tingling neurons in a certain part of the brain, yet this doesn’t tell us why they exist. They seem to me to be a signal from a larger point of view that wants to remind us that our lives are distinct, that they have meaning, and that they occur throughout a span of time. We are important, and what makes us valuable to the universe is our sentience and our curse and blessing of perpetual self-awareness.

Originally featured in January — read more here.

2. YOU ARE STARDUST

“Everyone you know, everyone you ever heard of, every human being who ever was … lived there — on a mote of dust suspended in a sunbeam,” Carl Sagan famously marveled in his poetic Pale Blue Dot monologue, titled after the iconic 1990 photograph of Earth. The stardust metaphor for our interconnection with the cosmos soon permeated popular culture and became a vehicle for the allure of space exploration. There’s something at once incredibly empowering and incredibly humbling in knowing that the flame in your fireplace came from the sun.

That’s precisely the kind of cosmic awe environmental writer Elin Kelsey and Toronto-based Korean artist Soyeon Kim seek to inspire in kids in You Are Stardust (public library) — an exquisite picture-book that instills that profound sense of connection with the natural world, and also among the best children’s books of the year. Underpinning the narrative is a bold sense of optimism — a refreshing antidote to the fear-appeal strategy plaguing most environmental messages today.

Kim’s breathtaking dioramas, to which this screen does absolutely no justice, mix tactile physical materials with fine drawing techniques and digital compositing to illuminate the relentlessly wondrous realities of our intertwined existence: The water in your sink once quenched the thirst of dinosaurs; with every sneeze, wind blasts out of your nose faster than a cheetah’s sprint; the electricity that powers every thought in your brain is stronger than lightning.

But rather than dry science trivia, the message is carried on the wings of poetic admiration for these intricate relationships:

Be still. Listen.

Like you, the Earth breathes.

Your breath is alive with the promise of flowers.

Each time you blow a kiss to the world, you spread pollen that might grow to be a new plant.

The book is nonetheless grounded in real science. Kelsey notes:

I wrote this book as a celebration — one to honor the extraordinary ways in which all of us simply are nature. Every example in this book is backed by current science. Every day, for instance, you breathe in more than a million pollen grains.

But what makes the project particularly exciting is that, in the face of the devastating gender gap in science education, here is a thoughtful, beautiful piece of early science education presented by two women, the most heartening such example since Lauren Redniss’s Radioactive.

A companion iPad app features sound effects, animation, an original score by Paul Aucoin, behind-the-scenes glimpses of Kim’s process in creating her stunning 3D dioramas, and even build-your-own-diorama adventures.

Originally featured in March — see more here.

3. ON LOOKING

“How we spend our days,” Annie Dillard wrote in her timelessly beautiful meditation on presence over productivity, “is, of course, how we spend our lives.” And nowhere do we fail at the art of presence more miserably and more tragically than in urban life — in the city, high on the cult of productivity, where we float past each other, past the buildings and trees and the little boy in the purple pants, past life itself, cut off from the breathing of the world by iPhone earbuds and solipsism. And yet: “The art of seeing has to be learned,” Marguerite Duras reverberates — and it can be learned, as cognitive scientist Alexandra Horowitz invites us to believe in her breathlessly wonderful On Looking: Eleven Walks with Expert Eyes (public library), also among the best psychology and philosophy books of the year — a record of her quest to walk around a city block with eleven different “experts,” from an artist to a geologist to a dog, and emerge with fresh eyes mesmerized by the previously unseen fascinations of a familiar world. It is undoubtedly one of the most stimulating books of the year, if not the decade, and the most enchanting thing I’ve read in ages. In a way, it’s the opposite but equally delightful mirror image of Christoph Niemann’s Abstract City — a concrete, immersive examination of urbanity — blending the mindfulness of Sherlock Holmes with the expansive sensitivity of Thoreau.

Horowitz begins by pointing our attention to the incompleteness of our experience of what we conveniently call “reality”:

Right now, you are missing the vast majority of what is happening around you. You are missing the events unfolding in your body, in the distance, and right in front of you.

By marshaling your attention to these words, helpfully framed in a distinct border of white, you are ignoring an unthinkably large amount of information that continues to bombard all of your senses: the hum of the fluorescent lights, the ambient noise in a large room, the places your chair presses against your legs or back, your tongue touching the roof of your mouth, the tension you are holding in your shoulders or jaw, the map of the cool and warm places on your body, the constant hum of traffic or a distant lawn-mower, the blurred view of your own shoulders and torso in your peripheral vision, a chirp of a bug or whine of a kitchen appliance.

This adaptive ignorance, she argues, is there for a reason — we celebrate it as “concentration” and welcome its way of easing our cognitive overload by allowing us to conserve our precious mental resources only for the stimuli of immediate and vital importance, and to dismiss or entirely miss all else. (“Attention is an intentional, unapologetic discriminator,” Horowitz tells us. “It asks what is relevant right now, and gears us up to notice only that.”) But while this might make us more efficient in our goal-oriented day-to-day, it also makes us inhabit a largely unlived — and unremembered — life, day in and day out.

For Horowitz, the awakening to this incredible, invisible backdrop of life came thanks to Pumpernickel, her “curly haired, sage mixed breed” (who also inspired Horowitz’s first book, the excellent Inside of a Dog: What Dogs See, Smell, and Know), as she found herself taking countless walks around the block, becoming more and more aware of the dramatically different experiences she and her canine companion were having along the exact same route:

Minor clashes between my dog’s preferences as to where and how a walk should proceed and my own indicated that I was experiencing almost an entirely different block than my dog. I was paying so little attention to most of what was right before us that I had become a sleepwalker on the sidewalk. What I saw and attended to was exactly what I expected to see; what my dog showed me was that my attention invited along attention’s companion: inattention to everything else.

The book was her answer to the disconnect, an effort to “attend to that inattention.” It is not, she warns us, “about how to bring more focus to your reading of Tolstoy or how to listen more carefully to your spouse.” Rather, it is an invitation to the art of observation:

Together, we became investigators of the ordinary, considering the block — the street and everything on it — as a living being that could be observed.

In this way, the familiar becomes unfamiliar, and the old the new.

Her approach is based on two osmotic human tendencies: our shared capacity to truly see what is in front of us, despite our conditioned concentration that obscures it, and the power of individual bias in perception — or what we call “expertise,” acquired by passion or training or both — in bringing attention to elements that elude the rest of us. What follows is a whirlwind of endlessly captivating exercises in attentive bias as Horowitz, with her archetypal New Yorker’s “special fascination with the humming life-form that is an urban street,” and her diverse companions take to the city.

First, she takes a walk all by herself, trying to note everything observable, and we quickly realize that besides her deliciously ravenous intellectual curiosity, Horowitz is a rare magician with language. (“The walkers trod silently; the dogs said nothing. The only sound was the hum of air conditioners,” she beholds her own block; passing a pile of trash bags graced by a stray Q-tip, she ponders parenthetically, “how does a Q-tip escape?”; turning her final corner, she gazes at the entrance of a mansion and “its pair of stone lions waiting patiently for royalty that never arrives.” Stunning.)

But as soon as she joins her experts, Horowitz is faced with the grimacing awareness that despite her best, most Sherlockian efforts, she was “missing pretty much everything.” She arrives at a newfound, profound understanding of what William James meant when he wrote: “My experience is what I agree to attend to. Only those items which I notice shape my mind.”

I would find myself at once alarmed, delighted, and humbled at the limitations of my ordinary looking. My consolation is that this deficiency of mine is quite human. We see, but we do not see: we use our eyes, but our gaze is glancing, frivolously considering its object. We see the signs, but not their meanings. We are not blinded, but we have blinders.

Originally featured in August, with a closer look at the expert insights. For another peek at this gem, which is easily among my top three favorite books of the past decade, learn how to do the step-and-slide.

4. WILD ONES

Wild Ones: A Sometimes Dismaying, Weirdly Reassuring Story About Looking at People Looking at Animals in America (public library) by journalist Jon Mooallem isn’t the typical story designed to make us better by making us feel bad, to scare us into behaving, into environmental empathy; Mooallem’s is not the self-righteous tone of capital-K knowing typical of many environmental activists but the scientist’s disposition of not-knowing, the poet’s penchant for “negative capability.” Rather than ready-bake answers, he offers instead directions of thought and signposts for curiosity and, in the process, somehow gently moves us a little bit closer to our better selves, to a deep sense of, as poet Diane Ackerman beautifully put it in 1974, “the plain everythingness of everything, in cahoots with the everythingness of everything else.”

In the introduction, Mooallem recalls looking at his four-year-old daughter Isla’s menagerie of stuffed animals and the odd cultural disconnect they mime:

[T]hey were foraging on the pages of every bedtime story, and my daughter was sleeping in polar bear pajamas under a butterfly mobile with a downy snow owl clutched to her chin. Her comb handle was a fish. Her toothbrush handle was a whale. She cut her first tooth on a rubber giraffe.

Our world is different, zoologically speaking — less straightforward and more grisly. We are living in the eye of a great storm of extinction, on a planet hemorrhaging living things so fast that half of its nine million species could be gone by the end of the century. At my place, the teddy bears and giggling penguins kept coming. But I didn’t realize the lengths to which humankind now has to go to keep some semblance of actual wildlife in the world. As our own species has taken over, we’ve tried to retain space for at least some of the others being pushed aside, shoring up their chances of survival. But the threats against them keep multiplying and escalating. Gradually, America’s management of its wild animals has evolved, or maybe devolved, into a surreal kind of performance art.

Yet even conservationists’ small successes — crocodile species bouncing back from the brink of extinction, peregrine falcons filling the skies once again — even these pride points demonstrate the degree to which we’ve assumed — usurped, even — a puppeteer role in the theater of organic life. Citing a scientist who lamented that “right now, nature is unable to stand on its own,” Mooallem writes:

We’ve entered what some scientists are calling the Anthropocene — a new geologic epoch in which human activity, more than any other force, steers change on the planet. Just as we’re now causing the vast majority of extinctions, the vast majority of endangered species will only survive if we keep actively rigging the world around them in their favor. … We are gardening the wilderness. The line between conservation and domestication has blurred.

He finds himself uncomfortably straddling these two animal worlds — the idyllic little-kid’s dreamland and the messy, fragile ecosystem of the real world:

Once I started looking around, I noticed the same kind of secondhand fauna that surrounds my daughter embellishing the grown-up world, too — not just the conspicuous bald eagle on flagpoles and currency, or the big-cat and raptor names we give sports teams and computer operating systems, but the whale inexplicably breaching in the life-insurance commercial, the glass dolphin dangling from a rearview mirror, the owl sitting on the rump of a wild boar silk-screened on a hipster’s tote bag. I spotted wolf after wolf airbrushed on the sides of old vans, and another wolf, painted against a full moon on purple velvet, greeting me over the toilet in a Mexican restaurant bathroom. … [But] maybe we never outgrow the imaginary animal kingdom of childhood. Maybe it’s the one we are trying to save.

[…]

From the very beginning, America’s wild animals have inhabited the terrain of our imagination just as much as they’ve inhabited the actual land. They are free-roaming Rorschachs, and we are free to spin whatever stories we want about them. The wild animals always have no comment.

So he sets out to better understand the dynamics of the cultural forces that pull these worlds together with shared abstractions and rip them apart with the brutal realities of environmental collapse. His quest, in which little Isla is a frequent companion, sends him on the trails of three endangered species — a bear, a butterfly, and a bird — which fall on three different points on the spectrum of conservation reliance, relying to various degrees on the mercy of the very humans who first disrupted “the machinery of their wildness.” On the way, he encounters a remarkably vibrant cast of characters — countless passionate citizen scientists, a professional theater actor who, after an HIV diagnosis, became a professional butterfly enthusiast, and even Martha Stewart — and finds in their relationship with the environment “the same creeping disquiet about the future” that Mooallem himself came to know when he became a father. In fact, the entire project was inextricably linked to his sense of fatherly responsibility:

I’m part of a generation that seems especially resigned to watching things we encountered in childhood disappear: landline telephones, newspapers, fossil fuels. But leaving your kids a world without wild animals feels like a special tragedy, even if it’s hard to rationalize why it should.

The truth is that most of us will never experience the Earth’s endangered animals as anything more than beautiful ideas. They are figments of our shared imagination, recognizable from TV, but stalking places — places out there — to which we have no intention of going. I wondered how that imaginative connection to wildlife might fray or recalibrate as we’re forced to take more responsibility for its wildness.

It also occurred to me early on that all three endangered species I was getting to know could be gone by the time Isla is my age. It’s possible that, thirty years from now, they’ll have receded into the realm of dinosaurs, or the realm of Pokémon, for that matter — fantastical creatures whose names and diets little kids memorize from books. And it’s possible, too, I realized, that it might not even make a difference, that there would still be polar bears on footsy pajamas and sea turtle-shaped gummy vitamins — that there could be so much actual destruction without ever meaningfully upsetting the ecosystems in our minds.

Originally featured in May — read more here.

5. THINKING IN NUMBERS

Daniel Tammet was born with an unusual mind — he was diagnosed with high-functioning autistic savant syndrome, which meant his brain’s uniquely wired circuits made possible such extraordinary feats of computation and memory as learning Icelandic in a single week and reciting the number pi up to the 22,514th digit. He is also among the tiny fraction of people diagnosed with synesthesia — that curious crossing of the senses that causes one to “hear” colors, “smell” sounds, or perceive words and numbers in different hues, shapes, and textures. Synesthesia is incredibly rare — Vladimir Nabokov was among its few famous sufferers — which makes it overwhelmingly hard for the majority of us to imagine precisely what it’s like to experience the world through this sensory lens. Luckily, Tammet offers a fascinating first-hand account in Thinking In Numbers: On Life, Love, Meaning, and Math (public library) — a magnificent collection of 25 essays on “the math of life,” celebrating the magic of possibility in all its dimensions. In the process, he also invites us to appreciate the poetics of numbers, particularly of ordered sets — in other words, the very lists that dominate everything from our productivity tools to our creative inventories to the cheapened headlines flooding the internet.

Reflecting on his second book, Embracing the Wide Sky: A Tour Across the Horizons of the Mind, and the overwhelming response from fascinated readers seeking to know what it’s really like to experience words and numbers as colors and textures — to experience the beauty that a poem and a prime number exert on a synesthete in equal measure — Tammet offers an absorbing simulation of the synesthetic mind:

Imagine.

Close your eyes and imagine a space without limits, or the infinitesimal events that can stir up a country’s revolution. Imagine how the perfect game of chess might start and end: a win for white, or black, or a draw? Imagine numbers so vast that they exceed every atom in the universe, counting with eleven or twelve fingers instead of ten, reading a single book in an infinite number of ways.

Such imagination belongs to everyone. It even possesses its own science: mathematics. Ricardo Nemirovsky and Francesca Ferrara, who specialize in the study of mathematical cognition, write that “like literary fiction, mathematical imagination entertains pure possibilities.” This is the distillation of what I take to be interesting and important about the way in which mathematics informs our imaginative life. Often we are barely aware of it, but the play between numerical concepts saturates the way we experience the world.

Sketches from synesthetic artist and musician Michal Levy’s animated visualization of John Coltrane’s ‘Giant Steps.’ Click image for details.

Tammet, above all, is enchanted by the mesmerism of the unknown, which lies at the heart of science and the heart of poetry:

The fact that we have never read an endless book, or counted to infinity (and beyond!) or made contact with an extraterrestrial civilization (all subjects of essays in the book) should not prevent us from wondering: what if? … Literature adds a further dimension to the exploration of those pure possibilities. As Nemirovsky and Ferrara suggest, there are numerous similarities in the patterns of thinking and creating shared by writers and mathematicians (two vocations often considered incomparable).

In fact, this very link between mathematics and fiction, between numbers and storytelling, underpins much of Tammet’s exploration. Growing up as one of nine siblings, he recounts how the oppressive nature of existing as a small number in a large set spurred a profound appreciation of numbers as sensemaking mechanisms for life:

Effaced as individuals, my brothers, sisters, and I existed only in number. The quality of our quantity became something we could not escape. It preceded us everywhere: even in French, whose adjectives almost always follow the noun (but not when it comes to une grande famille). … From my family I learned that numbers belong to life. The majority of my math acumen came not from books but from regular observations and day-to-day interactions. Numerical patterns, I realized, were the matter of our world.

This awareness was the beginning of Tammet’s synesthetic sensibility:

Like colors, the commonest numbers give character, form, and dimension to our world. Of the most frequent — zero and one — we might say that they are like black and white, with the other primary colors — red, blue, and yellow — akin to two, three, and four. Nine, then, might be a sort of cobalt or indigo: in a painting it would contribute shading, rather than shape. We expect to come across samples of nine as we might samples of a color like indigo—only occasionally, and in small and subtle ways. Thus a family of nine children surprises as much as a man or woman with cobalt-colored hair.

Daniel Tammet. Portrait by Jerome Tabet.

Sampling from Jorge Luis Borges’s humorous fictional taxonomy of animals, inspired by the work of nineteenth-century German mathematician Georg Cantor, Tammet points to the deeper insight beneath our efforts to itemize and organize the universe — something Umberto Eco knew when he proclaimed that “the list is the origin of culture” and Susan Sontag intuited when she reflected on why lists appeal to us. Tammet writes:

Borges here also makes several thought-provoking points. First, though a set as familiar to our understanding as that of “animals” implies containment and comprehension, the sheer number of its possible subsets actually swells toward infinity. With their handful of generic labels (“mammal,” “reptile,” “amphibious,” etc.), standard taxonomies conceal this fact. To say, for example, that a flea is tiny, parasitic, and a champion jumper is only to begin to scratch the surface of all its various aspects.

Second, defining a set owes more to art than it does to science. Faced with the problem of a near endless number of potential categories, we are inclined to choose from a few — those most tried and tested within our particular culture. Western descriptions of the set of all elephants privilege subsets like “those that are very large,” and “those possessing tusks,” and even “those possessing an excellent memory,” while excluding other equally legitimate possibilities such as Borges’s “those that at a distance resemble flies,” or the Hindu “those that are considered lucky.”

[…]

Reading Borges invites me to consider the wealth of possible subsets into which my family “set” could be classified, far beyond those that simply point to multiplicity.
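The arithmetic behind that first point is worth pausing on: a set of n members admits 2^n possible subsets, since each member is either in or out of any given subset, so even a modest menagerie swells into more categories than any taxonomy could name. A two-line illustration (mine, not Tammet’s):

for n in (10, 40, 100):
    # each of n members is either in or out of a subset: 2**n possibilities
    print(f"{n} animals -> {2**n:,} possible subsets")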

Tammet circles back to the shared gifts of literature and mathematics, which both help cultivate our capacity for compassion:

Like works of literature, mathematical ideas help expand our circle of empathy, liberating us from the tyranny of a single, parochial point of view. Numbers, properly considered, make us better people.

Originally featured in August — read more here.

6. SMARTER THAN YOU THINK

“The dangerous time when mechanical voices, radios, telephones, take the place of human intimacies, and the concept of being in touch with millions brings a greater and greater poverty in intimacy and human vision,” Anaïs Nin wrote in her diary in 1946, decades before the internet as we know it even existed. Her fear has since been echoed again and again with every incremental advance in technology, often with simplistic arguments about the attrition of attention in the age of digital distraction. But in Smarter Than You Think: How Technology is Changing Our Minds for the Better (public library), Clive Thompson — one of the finest technology writers I know, with regular bylines for Wired and The New York Times — makes a powerful and rigorously thought out counterpoint. He argues that our technological tools — from search engines to status updates to sophisticated artificial intelligence that defeats the world’s best chess players — are now inextricably linked to our minds, working in tandem with them and profoundly changing the way we remember, learn, and “act upon that knowledge emotionally, intellectually, and politically,” and this is a promising rather than perilous thing.

He writes in the introduction:

These tools can make even the amateurs among us radically smarter than we’d be on our own, assuming (and this is a big assumption) we understand how they work. At their best, today’s digital tools help us see more, retain more, communicate more. At their worst, they leave us prey to the manipulation of the toolmakers. But on balance, I’d argue, what is happening is deeply positive. This book is about the transformation.

Page from ‘Charley Harper: An Illustrated Life.’ Click image for details.

But Thompson is nothing if not a dimensional thinker with extraordinary sensitivity to the complexities of cultural phenomena. Rather than revisiting painfully familiar and trite-by-overuse notions like distraction and information overload, he examines the deeper dynamics of how these new tools are affecting the way we make sense of the world and of ourselves. Several decades after Vannevar Bush’s now-legendary meditation on how technology will impact our thinking, Thompson reaches even further into the fringes of our cultural sensibility — past the cheap techno-dystopia, past the pollyannaish techno-utopia, and into that intricate and ever-evolving intersection of technology and psychology.

One of his most fascinating and important points has to do with our outsourcing of memory — or, more specifically, our increasingly deft, search-engine-powered skills of replacing the retention of knowledge in our own brains with the on-demand access to knowledge in the collective brain of the internet. Think, for instance, of those moments when you’re trying to recall the name of a movie but only remember certain fragmentary features — the name of the lead actor, the gist of the plot, a song from the soundtrack. Thompson calls this “tip-of-the-tongue syndrome” and points out that, today, you’ll likely be able to reverse-engineer the name of the movie you don’t remember by plugging into Google what you do remember about it. Thompson contextualizes the phenomenon, which isn’t new, then asks the obvious, important question about our culturally unprecedented solutions to it:

Tip-of-the-tongue syndrome is an experience so common that cultures worldwide have a phrase for it. Cheyenne Indians call it navonotootse’a, which means “I have lost it on my tongue”; in Korean it’s hyeu kkedu-te mam-dol-da, which has an even more gorgeous translation: “sparkling at the end of my tongue.” The phenomenon generally lasts only a minute or so; your brain eventually makes the connection. But … when faced with a tip-of-the-tongue moment, many of us have begun to rely instead on the Internet to locate information on the fly. If lifelogging … stores “episodic,” or personal, memories, Internet search engines do the same for a different sort of memory: “semantic” memory, or factual knowledge about the world. When you visit Paris and have a wonderful time drinking champagne at a café, your personal experience is an episodic memory. Your ability to remember that Paris is a city and that champagne is an alcoholic beverage — that’s semantic memory.

[…]

What’s the line between our own, in-brain knowledge and the sea of information around us? Does it make us smarter when we can dip in so instantly? Or dumber with every search?

Vannevar Bush’s ‘memex’ — short for ‘memory index’ — a primitive vision for a personal hard drive for information storage and management.

That concern, of course, is far from unique to our age — from the invention of writing to Alvin Toffler’s Future Shock, new technology has always been a source of paralyzing resistance and apprehension:

Writing — the original technology for externalizing information — emerged around five thousand years ago, when Mesopotamian merchants began tallying their wares using etchings on clay tablets. It emerged first as an economic tool. As with photography and the telephone and the computer, newfangled technologies for communication nearly always emerge in the world of commerce. The notion of using them for everyday, personal expression seems wasteful, risible, or debased. Then slowly it becomes merely lavish, what “wealthy people” do; then teenagers take over and the technology becomes common to the point of banality.

Thompson reminds us of the anecdote, by now itself familiar “to the point of banality,” about Socrates and his admonition that the “technology” of writing would devastate the Greek tradition of debate and dialectic, and would render people incapable of committing anything to memory because “knowledge stored was not really knowledge at all.” He cites Socrates’s parable of the Egyptian god Theuth and how he invented writing, offering it as a gift to the king of Egypt, Thamus, who met the present with defiant indignation:

This discovery of yours will create forgetfulness in the learners’ souls, because they will not use their memories; they will trust to the external written characters and not remember of themselves. The specific which you have discovered is an aid not to memory, but to reminiscence, and you give your disciples not truth, but only the semblance of truth; they will be hearers of many things and will have learned nothing; they will appear to be omniscient and will generally know nothing; they will be tiresome company, having the show of wisdom without the reality.

That resistance endured as technology changed shape, across the Middle Ages and past Gutenberg’s revolution, but it wasn’t without counter-resistance: Those who recorded their knowledge in writing and, eventually, collected it in the form of books argued that it expanded the scope of their curiosity and the ideas they were able to ponder, whereas the mere act of rote memorization made no guarantees of deeper understanding.

Ultimately, however, Thompson points out that Socrates was both right and wrong: It’s true that, with some deliberately cultivated exceptions and neurological outliers, few thinkers today rely on pure memorization and can recite extensive passages of text from memory. But what Socrates failed to see was the extraordinary dot-connecting enabled by access to knowledge beyond what our own heads can hold — because, as Amanda Palmer poignantly put it, “we can only connect the dots that we collect,” and the outsourcing of memory has exponentially enlarged our dot-collections.

With this in mind, Thompson offers a blueprint to this newly developed system of knowledge management in which access is critical:

If you are going to read widely but often read books only once; if you’re going to tackle the ever-expanding universe of ideas by skimming and glancing as well as reading deeply; then you are going to rely on the semantic-memory version of gisting. By which I mean, you’ll absorb the gist of what you read but rarely retain the specifics. Later, if you want to mull over a detail, you have to be able to refind a book, a passage, a quote, an article, a concept.

But Thompson argues that despite history’s predictable patterns of resistance followed by adoption and adaptation, there’s something immutably different about our own era:

The history of factual memory has been fairly predictable up until now. With each innovation, we’ve outsourced more information, then worked to make searching more efficient. Yet somehow, the Internet age feels different. Quickly pulling up [the answer to a specific esoteric question] on Google seems different from looking up a bit of trivia in an encyclopedia. It’s less like consulting a book than like asking someone a question, consulting a supersmart friend who lurks within our phones.

And therein lies the magic of the internet — that unprecedented access to humanity’s collective brain. Thompson cites the work of Harvard psychologist Daniel Wegner, who first began exploring this notion of collective rather than individual knowledge in the 1980s by observing how partners in long-term relationships often divide and conquer memory tasks in sharing the household’s administrative duties:

Wegner suspected this division of labor takes place because we have pretty good “metamemory.” We’re aware of our mental strengths and limits, and we’re good at intuiting the abilities of others. Hang around a workmate or a romantic partner long enough and you begin to realize that while you’re terrible at remembering your corporate meeting schedule, or current affairs in Europe, or how big a kilometer is relative to a mile, they’re great at it. So you begin to subconsciously delegate the task of remembering that stuff to them, treating them like a notepad or encyclopedia. In many respects, Wegner noted, people are superior to these devices, because what we lose in accuracy we make up in speed.

[…]

Wegner called this phenomenon “transactive” memory: two heads are better than one. We share the work of remembering, Wegner argued, because it makes us collectively smarter — expanding our ability to understand the world around us.

This ability to “google” one another’s memory stores, Thompson argues, is the defining feature of our evolving relationship with information — and it’s profoundly shaping our experience of knowledge:

Transactive memory helps explain how we’re evolving in a world of on-tap information.

He illustrates this by turning to the work of Betsy Sparrow, a graduate student of Wegner’s, who conducted a series of experiments demonstrating that when we know a digital tool will store information for us, we’re far less likely to commit it to memory. On the surface, this may appear to be an evident and worrisome shrinkage of our mental capacity. But there’s a subtler yet enormously important layer that such techno-dystopian simplifications miss: This very outsourcing of memory requires that we learn what the machine knows — a kind of meta-knowledge that enables us to retrieve the information when we need it. And, reflecting on Sparrow’s findings, Thompson points out that this is neither new nor negative:

We’ve been using transactive memory for millennia with other humans. In everyday life, we are only rarely isolated, and for good reason. For many thinking tasks, we’re dumber and less cognitively nimble if we’re not around other people. Not only has transactive memory not hurt us, it’s allowed us to perform at higher levels, accomplishing acts of reasoning that are impossible for us alone. It wasn’t until recently that computer memory became fast enough to be consulted on the fly, but once it did — with search engines boasting that they return results in tenths of a second — our transactive habits adapted.

Thompson’s most important point, however, has to do with the concern that outsourcing our knowledge to digital tools hampers the very process of creative thought, which relies on our ability to connect existing ideas from our mental pool of resources into new combinations, or what the French polymath Henri Poincaré famously termed “sudden illuminations.” Without a mental catalog of materials with which to mull and let incubate in our fringe consciousness, our capacity for such illuminations is greatly deflated. Thompson writes:

These eureka moments are familiar to all of us; they’re why we take a shower or go for a walk when we’re stuck on a problem. But this technique works only if we’ve actually got a lot of knowledge about the problem stored in our brains through long study and focus. … You can’t come to a moment of creative insight if you haven’t got any mental fuel. You can’t be googling the info; it’s got to be inside you.

But while this is a valid concern, Thompson doubts that we’re outsourcing too many bits of knowledge and thus curtailing our creativity. He argues, instead, that we’re mostly employing this newly evolved skill to help us sift the meaningful from the meaningless, but we remain just as capable of absorbing that which truly stimulates us:

Evidence suggests that when it comes to knowledge we’re interested in — anything that truly excites us and has meaning — we don’t turn off our memory. Certainly, we outsource when the details are dull, as we now do with phone numbers. These are inherently meaningless strings of information, which offer little purchase on the mind. … It makes sense that our transactive brains would hand this stuff off to machines. But when information engages us — when we really care about a subject — the evidence suggests we don’t turn off our memory at all.

Originally featured in September — read more here.

7. COSMIC APPRENTICE

As if to define what science is and what philosophy is weren’t hard enough, to delineate how the two fit together appears a formidable task, one that has spurred rather intense opinions. But that’s precisely what Dorion Sagan, who has previously examined the prehistoric history of sex, braves in the introduction to Cosmic Apprentice: Dispatches from the Edges of Science (public library) as he sets out to explore the intricate ways in which the two fields hang “in a kind of odd balance, watching each other, holding hands”:

The difference between science and philosophy is that the scientist learns more and more about less and less until she knows everything about nothing, whereas a philosopher learns less and less about more and more until he knows nothing about everything. There is truth in this clever crack, but, as Niels Bohr impressed, while the opposite of a trivial truth is false, the opposite of a great truth is another great truth.

I would say that applies to the flip side of the above flip takedown: Science’s eye for detail, buttressed by philosophy’s broad view, makes for a kind of alembic, an antidote to both. This intellectual electrum cuts the cloying taste of idealist and propositional philosophy with the sharp nectar of fact yet softens the edges of a technoscience that has arguably lost both its moral and its epistemological compass, the result in part of its being funded by governments and corporations whose relationship to the search for truth and its open dissemination can be considered problematic at best.

Sagan refutes the popular perception of science as rationally objective, a vessel of capital-T Truth, reminding us that every scientific concept and theory was birthed by a subjective, fallible human mind:

All observations are made from distinct places and times, and in science no less than art or philosophy by particular individuals. … Although philosophy isn’t fiction, it can be more personal, creative and open, a kind of counterbalance for science even as it argues that science, with its emphasis on a kind of impersonal materialism, provides a crucial reality check for philosophy and a tendency to overtheorize that [is] inimical to the scientific spirit. Ideally, in the search for truth, science and philosophy, the impersonal and autobiographical, can “keep each other honest,” in a kind of open circuit. Philosophy as the underdog even may have an advantage, because it’s not supposed to be as advanced as science, nor does it enjoy science’s level of institutional support — or the commensurate heightened risks of being beholden to one’s benefactors.

Like Richard Feynman, who argued tirelessly for the scientist’s responsibility to remain unsure, Sagan echoes the idea that willful ignorance is what drives science and the fear of being wrong is one of its greatest hindrances:

Science’s spirit is philosophical. It is the spirit of questioning, of curiosity, of critical inquiry combined with fact-checking. It is the spirit of being able to admit you’re wrong, of appealing to data, not authority, which does not like to admit it is wrong.

Sagan reflects on his father’s conviction that “the effort to popularize science is a crucial one for society,” one he shared with Richard Feynman, and what made Carl’s words echo as profoundly and timelessly as they do:

Science and philosophy both had a reputation for being dry, but my father helped inject life into the former, partly by speaking in plain English and partly by focusing on the science fiction fantasy of discovering extraterrestrial life.

In that respect, science could learn from philosophy’s intellectual disposition:

Philosophy today, not taught in grade school in the United States, is too often merely an academic pursuit, a handmaiden or apologetics of science, or else a kind of existential protest, a trendy avocation of grad students and the dark-clad coffeehouse set. But philosophy, although it historically gives rise to experimental science, sometimes preserves a distinct mode of sustained questioning that sharply distinguishes it from modern science, which can be too quick to provide answers.

[…]

Philosophy is less cocksure, less already-knowing, or should be, than the pundits’ diatribes that relieve us of the difficulties of not knowing, of carefully weighing, of looking at the other side, of having to think things through for ourselves. Dwell in possibility, wrote Emily Dickinson: Philosophy at its best seems a kind of poetry, not an informational delivery but a dwelling, an opening of our thoughts to the world.

Like Buckminster Fuller, who vehemently opposed specialization, Sagan attests to the synergetic value of intellectual cross-pollination, echoing the idea that true breakthroughs in science require cross-disciplinary connections and that originality consists of linking up ideas whose connection was not previously suspected:

It is true that science requires analysis and that it has fractured into microdisciplines. But because of this, more than ever, it requires synthesis. Science is about connections. Nature no more obeys the territorial divisions of scientific academic disciplines than do continents appear from space to be colored to reflect the national divisions of their human inhabitants. For me, the great scientific satoris, epiphanies, eurekas, and aha! moments are characterized by their ability to connect.

“In disputes upon moral or scientific points,” advised Martine in his wonderful 1866 guide to the art of conversation, “ever let your aim be to come at truth, not to conquer your opponent. So you never shall be at a loss in losing the argument, and gaining a new discovery.” Science, Sagan suggests — at least at its most elegant — is a conversation of constant revision, where each dead end brings to life a new fruitful question:

Theories are not only practical, and wielded like intellectual swords to the death … but beautiful. A good one is worth more than all the ill-gotten hedge fund scraps in the world. A good scientific theory shines its light, revealing the world’s fearful symmetry. And its failure is also a success, as it shows us where to look next.

Supporting Neil deGrasse Tyson’s contention that intelligent design is a philosophy of ignorance, Sagan applies this very paradigm of connection-making to the crux of the age-old science vs. religion debate, painting evolution not as a tool of certitude but as a reminder of our connectedness to everything else:

Connecting humanity with other species in a single process was Darwin’s great natural historical accomplishment. It showed that some of the issues relegated to religion really come under the purview of science. More than just a research program for technoscience, it provides a eureka moment, a subject of contemplation open in principle to all thinking minds. Beyond the squabbles over its mechanisms and modes, evolution’s epiphany derives from its widening of vistas, its showing of the depths of our connections to others from whom we’d thought we were separate. Philosophy, too … in its ancient, scientifico-genic spirit of inquiry so different from a mere, let alone peevish, recounting of facts, needs to be reconnected to science for the latter to fulfill its potential not just as something useful but as a source of numinous moments, deep understanding, and indeed, religious-like epiphanies of cosmic comprehension and aesthetic contemplation.

Originally featured in April — see more here.

8. SOCIAL

“Without the sense of fellowship with men of like mind,” Einstein wrote, “life would have seemed to me empty.” It is perhaps unsurprising that the iconic physicist, celebrated as “the quintessential modern genius,” intuited something fundamental about the inner workings of the human mind and soul long before science itself had attempted to concretize it with empirical evidence. Now, it has: In Social: Why Our Brains Are Wired to Connect (public library), neuroscientist Matthew D. Lieberman, director of UCLA’s Social Cognitive Neuroscience lab, sets out to “get clear about ‘who we are’ as social creatures and to reveal how a more accurate understanding of our social nature can improve our lives and our society.”

Lieberman, who has spent the past two decades using tools like fMRI to study how the human brain responds to its social context, has found over and over again that our brains aren’t merely simplistic mechanisms that only respond to pain and pleasure, as the philosopher Jeremy Bentham famously claimed, but are instead wired to connect. At the heart of his inquiry is a simple question: Why do we feel such intense agony when we lose a loved one? He argues that, far from being a design flaw in our neural architecture, our capacity for such overwhelming grief is a vital feature of our evolutionary constitution:

The research my wife and I have done over the past decade shows that this response, far from being an accident, is actually profoundly important to our survival. Our brains evolved to experience threats to our social connections in much the same way they experience physical pain. By activating the same neural circuitry that causes us to feel physical pain, our experience of social pain helps ensure the survival of our children by helping to keep them close to their parents. The neural link between social and physical pain also ensures that staying socially connected will be a lifelong need, like food and warmth. Given the fact that our brains treat social and physical pain similarly, should we as a society treat social pain differently than we do? We don’t expect someone with a broken leg to “just get over it.” And yet when it comes to the pain of social loss, this is a common response. The research that I and others have done using fMRI shows that how we experience social pain is at odds with our perception of ourselves. We intuitively believe social and physical pain are radically different kinds of experiences, yet the way our brains treat them suggests that they are more similar than we imagine.

Citing his research, Lieberman affirms the notion that there is no such thing as a nonconformist, pointing out the social construction of what we call our individual “selves” — empirical evidence for what the novelist William Gibson so eloquently termed one’s “personal micro-culture” — and observes “our socially malleable sense of self”:

The neural basis for our personal beliefs overlaps significantly with one of the regions of the brain primarily responsible for allowing other people’s beliefs to influence our own. The self is more of a superhighway for social influence than it is the impenetrable private fortress we believe it to be.

Contextualizing it in a brief evolutionary history, he argues that this osmosis of sociality and individuality is an essential aid in our evolutionary development rather than an aberrant defect in it:

Our sociality is woven into a series of bets that evolution has laid down again and again throughout mammalian history. These bets come in the form of adaptations that are selected because they promote survival and reproduction. These adaptations intensify the bonds we feel with those around us and increase our capacity to predict what is going on in the minds of others so that we can better coordinate and cooperate with them. The pain of social loss and the ways that an audience’s laughter can influence us are no accidents. To the extent that we can characterize evolution as designing our modern brains, this is what our brains were wired for: reaching out to and interacting with others. These are design features, not flaws. These social adaptations are central to making us the most successful species on earth.

The implications of this span across everything from the intimacy of our personal relationships to the intricacy of organizational management and teamwork. But rather than entrusting a single cognitive “social network” with these vital functions, our brains turn out to host many. Lieberman explains:

Just as there are multiple social networks on the Internet such as Facebook and Twitter, each with its own strengths, there are also multiple social networks in our brains, sets of brain regions that work together to promote our social well-being.

These networks each have their own strengths, and they have emerged at different points in our evolutionary history moving from vertebrates to mammals to primates to us, Homo sapiens. Additionally, these same evolutionary steps are recapitulated in the same order during childhood.

He goes on to explore three major adaptations that have made us so inextricably responsive to the social world:

  • Connection: Long before there were any primates with a neocortex, mammals split off from other vertebrates and evolved the capacity to feel social pains and pleasures, forever linking our well-being to our social connectedness. Infants embody this deep need to stay connected, but it is present through our entire lives.
  • Mindreading: Primates have developed an unparalleled ability to understand the actions and thoughts of those around them, enhancing their ability to stay connected and interact strategically. In the toddler years, forms of social thinking develop that outstrip those seen in the adults of any other species. This capacity allows humans to create groups that can implement nearly any idea and to anticipate the needs and wants of those around us, keeping our groups moving smoothly.
  • Harmonizing: The sense of self is one of the most recent evolutionary gifts we have received. Although the self may appear to be a mechanism for distinguishing us from others and perhaps accentuating our selfishness, the self actually operates as a powerful force for social cohesiveness. During the preteen and teenage years, adolescent brains develop the neural adaptations that allow group beliefs and values to influence our own.

Originally featured in November — see more here, including Lieberman’s fantastic TEDxStLouis talk.

9. GULP

Few writers are able to write about science in a way that’s provocative without being sensationalistic, truthful without being dry, enchanting without being forced — and even fewer are able to do so on subjects that don’t exactly lend themselves to Saganesque whimsy. After all, it’s infinitely easier to inspire awe while discussing the bombastic magnificence of the cosmos than, say, the function of bodily fluids and the structures that secrete them. But Mary Roach is one of those rare writers, and that’s precisely what she proves once more in Gulp: Adventures on the Alimentary Canal (public library) — a fascinating tour of the body’s most private hydraulics.

Roach writes in the introduction:

The early anatomists had that curiosity in spades. They entered the human form like an unexplored continent. Parts were named like elements of geography: the isthmus of the thyroid, the isles of the pancreas, the straits and inlets of the pelvis. The digestive tract was for centuries known as the alimentary canal. How lovely to picture one’s dinner making its way down a tranquil, winding waterway, digestion and excretion no more upsetting or off-putting than a cruise along the Rhine. It’s this mood, these sentiments — the excitement of exploration and the surprises and delights of travel to foreign locales — that I hope to inspire with this book.

It may take some doing. The prevailing attitude is one of disgust. … I remember, for my last book, talking to the public-affairs staff who choose what to stream on NASA TV. The cameras are often parked on the comings and goings of Mission Control. If someone spots a staffer eating lunch at his desk, the camera is quickly repositioned. In a restaurant setting, conviviality distracts us from the biological reality of nutrient intake and oral processing. But a man alone with a sandwich appears as what he is: an organism satisfying a need. As with other bodily imperatives, we’d rather not be watched. Feeding, and even more so its unsavory correlates, are as much taboos as mating and death.

The taboos have worked in my favor. The alimentary recesses hide a lode of unusual stories, mostly unmined. Authors have profiled the brain, the heart, the eyes, the skin, the penis and the female geography, even the hair, but never the gut. The pie hole and the feed chute are mine.

Roach goes on to bring real science to those subjects that make teenagers guffaw and that populate mediocre standup jokes, exploring such bodily mysteries as what flatulence research reveals about death, why tasting has little to do with taste, how thorough chewing can lower the national debt, and why we like the foods we like and loathe the rest.

10. WONDERS OF THE SOLAR SYSTEM

“I know that I am mortal by nature and ephemeral,” ur-astronomer Ptolemy contemplated nearly two millennia ago, “but when I trace at my pleasure the windings to and fro of the heavenly bodies, I no longer touch earth with my feet. I stand in the presence of Zeus himself and take my fill of ambrosia.” But while the cosmos has fascinated humanity since the dawn of time, its mesmerism isn’t that of an abstract other but, rather, the very self-reflexive awareness that Ptolemy attested to, that intimate and inextricable link between the wonders of life here on Earth and the magic we’ve always found in our closest cosmic neighbors.

That’s precisely what modern-day science-enchanter Brian Cox explores in Wonders of the Solar System (public library) — the fantastic and illuminating book based on his BBC series of the same title celebrating the spirit of exploration, and a follow-up to his Wonders of Life and every bit as brimming with his signature blend of enthralling storytelling, scientific brilliance, and contagious conviction.

Cox begins by reminding us that preserving the spirit of exploration is both a joy and a moral obligation — especially at a time when it faces tragic threats of indifference and neglect from the very authorities whose job it is to fuel it, despite a citizenry profoundly in love with the ethos of exploration:

[The spirit of exploration] is desperately relevant, an idea so important that celebration is perhaps too weak a word. It is a plea for the spirit of the navigators of the seas and the pioneers of aviation and spaceflight to be restored and cherished; a case made to the viewer and reader that reaching for worlds beyond our grasp is an essential driver of progress and necessary sustenance for the human spirit. Curiosity is the rocket fuel that powers our civilization. If we deny this innate and powerful urge, perhaps because earthly concerns seem more worthy or pressing, then the borders of our intellectual and physical domain will shrink with our ambitions. We are part of a much wider ecosystem, and our prosperity and even long-term survival are contingent on our understanding of it.

But most revelatory of all is Cox’s gift for illustrating what our Earthly phenomena, right here on our seemingly ordinary planet, reveal about the wonders and workings of the Solar System.

Tornadoes, for instance, tell us how our star system was born — the processes that drive these giant rotating storms obey the same physical laws that caused clumps to form at the center of nebulae five billion years ago, around which the gas cloud collapsed and began spinning ever faster, ordering the chaos, until the early Solar System was churned into existence. This universal principle, known as the conservation of angular momentum, is also what drives a tornado’s destructive spiral.
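To make the principle concrete (a back-of-the-envelope sketch of my own, not Cox’s): consider a point mass $m$ circling at radius $r$ with speed $v$. With no external twisting force acting on it, its angular momentum $L$ cannot change, so any contraction of the orbit must be repaid with faster rotation:

$$L = m v r = \text{const} \quad\Longrightarrow\quad v_2 = v_1 \, \frac{r_1}{r_2}$$

A cloud that shrinks to a tenth of its radius therefore spins ten times faster, the same bookkeeping that turns a lazily drifting nebula into a rapidly rotating disc, or a broad column of air into a tornado’s tight funnel.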

Cox synthesizes:

This is how our Solar System was born: rather than the whole system collapsing into the Sun, a disc of dust and gas extending billions of kilometers into space formed around the new shining star. In just a few hundred million years, pieces of the cloud collapsed to form planets and moons, and so a star system, our Solar System, was formed. The journey from chaos into order had begun.

Then we have Iceland’s icebergs and glacial lagoons, which offer remarkable insight into the nature of Saturn’s rings. Both shine with puzzling brightness — the lagoons, here on Earth, by bringing pure water that is thousands of years old and free of pollutants from the bottom of the seabed to the surface as they rise, forming ice crystals of exceptional vibrance; Saturn’s rings, young and ever-changing, by circling icy ring particles around the planet, constantly crashing them together and breaking them apart, thus exposing bright new facets of ice that catch the sunlight and dazzle amidst a Solar System that is otherwise “a very dirty place.”

Cox explains:

It’s difficult to imagine the scale, beauty and intricacy of Saturn’s rings here on Earth, but the glacial lagoons of Iceland can transport our minds across millions of kilometers of space and help us understand the true nature of the rings. … At first sight, the lagoon appears to be a solid sheet of pristine ice, but this is an illusion. The surface is constantly shifting, an almost organic, ever-changing raft of thousands of individual icebergs floating on the water. The structure of Saturn’s rings is similar, because despite appearances the rings aren’t solid. Each ring is made up of hundreds of ringlets and each ringlet is made up of billions of separate pieces. Captured by Saturn’s gravity, the ring particles independently orbit the planet in an impossibly thin layer.

Cox goes on to explore other such illuminating parallels, from how Alaska’s Lake Eyak illustrates the methane cycles of the universe to what Hawaii’s Big Island tells us about the forces that keep any planet alive to how the volcanic features of India’s Deccan Traps explain why Venus choked to death. He ends with T. S. Eliot’s timeless verses on the spirit of exploration and echoes Neil deGrasse Tyson’s wisdom on your ego and the cosmic perspective, concluding:

You could take the view that our exploration of the Universe has made us somehow insignificant; one tiny planet around one star amongst hundreds of billions. But I don’t take that view, because we’ve discovered that it takes the rarest combination of chance and the laws of Nature to produce a planet that can support a civilization, that most magnificent structure that allows us to explore and understand the Universe. That’s why, for me, our civilization is the wonder of the Solar System, and if you were to be looking at the Earth from outside the Solar System that much would be obvious. We have written the evidence of our existence onto the surface of our planet. Our civilization has become a beacon that identifies our planet as a home to life.

Originally featured in August — see more here.

11. SAVE OUR SCIENCE

“What is crucial is not that technical ability, but it is imagination in all of its applications,” the great E. O. Wilson offered in his timeless advice to young scientists — a conviction shared by some of history’s greatest scientific minds. And yet it is rote memorization and the unimaginative application of technical skill that our dominant education system prioritizes — so it’s no wonder it is failing to produce the Edisons and Curies of our day. In Save Our Science: How to Inspire a New Generation of Scientists, materials scientist, inventor, and longtime Yale professor Ainissa Ramirez takes on a challenge Isaac Asimov presaged a quarter century ago, advocating for the value of science education and critiquing its present failures, with a hopeful and pragmatic eye toward improving its future. She writes in the introduction:

The 21st century requires a new kind of learner — not someone who can simply churn out answers by rote, as has been done in the past, but a student who can think expansively and solve problems resourcefully.

To do that, she argues, we need to replace the traditional academic skills of “reading, ’riting, and ’rithmetic” with creativity, curiosity, critical-thinking, and problem-solving. (Though, as psychology has recently revealed, problem-finding might be the more valuable skill.)

Ainissa Ramirez at TED 2012 (Photograph: James Duncan Davidson for TED)

She begins with the basics:

While the acronym STEM sounds very important, STEM answers just three questions: Why does something happen? How can we apply this knowledge in a practical way? How can we describe what is happening succinctly? Through the questions, STEM becomes a pathway to be curious, to create, and to think and figure things out.

Even for those of us who deem STEAM (wherein the A stands for “arts”) superior to STEM, Ramirez’s insights are razor-sharp and consistent with the oft-affirmed idea that creativity relies heavily upon connecting the seemingly disconnected and aligning the seemingly misaligned:

There are two schools of thought on defining creativity: divergent thinking, which is the formation of a creative idea resulting from generating lots of ideas, and a Janusian approach, which is the act of making links between two remote ideas. The latter takes its name from the two-faced Roman god of beginnings, Janus, who was associated with doorways and the idea of looking forward and backward at the same time. Janusian creativity hinges on the belief that the best ideas come from linking things that previously did not seem linkable. Henri Poincaré, a French mathematician, put it this way: ‘To create consists of making new combinations. … The most fertile will often be those formed of elements drawn from domains which are far apart.’

Another element inherent to the scientific process but hardly rewarded, if not punished, in education is the role of ignorance, or what the poet John Keats has eloquently and timelessly termed “negative capability” — the art of brushing up against the unknown and proceeding anyway. Ramirez writes:

My training as a scientist allows me to stare at an unknown and not run away, because I learned that this melding of uncertainty and curiosity is where innovation and creativity occur.

Yet these very qualities are missing from science education in the United States — and it shows. When the Programme for International Student Assessment (PISA) administered its 2006 survey, the U.S. ranked 35th in math and 29th in science out of the 40 high-income, developed countries surveyed.

Average PISA scores versus expenditures for selected countries (Source: Organisation for Economic Co-operation and Development)

Ramirez offers a historical context: When American universities first took root in the colonial days, their primary role was to educate men for the clergy, so science, technology, and math were not a priority. But then Justin Smith Morrill, a little-known congressman from Vermont who had barely completed his high school education, came along in 1861 and quietly but purposefully sponsored legislation that forever changed American education, resulting in more than 70 new colleges and universities that included STEM subjects in their curricula. This catapulted enrollment rates from the mere 2% of the population who attended higher education prior to the Civil War and greatly increased diversity in academia, with the act’s second revision in 1890 extending education opportunities to women and African-Americans.

The growth of U.S. college enrollment from 1869 to 1994. (Source: S. B. Carter et al., Historical Statistics of the United States)

But what really propelled science education, Ramirez notes, was the competitive spirit of the Space Race:

The mixture of being outdone and humiliated motivated the U.S. to create NASA and bolster the National Science Foundation’s budget to support science research and education. Sputnik forced the U.S. to think about its science position and to look hard into a mirror — and the U.S. did not like what it saw. In 1956, before Sputnik, the National Science Foundation’s budget was a modest $15.9 million. In 1958, it tripled to $49.5 million, and it doubled again in 1959 to $132.9 million. The space race was on. We poured resources, infrastructure, and human capital into putting an American on the moon, and with that goal, STEM education became a top priority.

President John F. Kennedy addresses a crowd of 35,000 at Rice University in 1962, proclaiming again his desire to reach the moon with the words, ‘We set sail on this new sea because there is new knowledge to be gained.’ Credit: NASA / Public domain

Ramirez argues for returning to that spirit of science education as an investment in national progress:

The U.S. has a history of changing education to meet the nation’s needs. We need similar innovative forward-thinking legislation now, to prepare our children and our country for the 21st century. Looking at our history allows us to see that we have been here before and prevailed. Let’s meet this challenge, for it will, as Kennedy claimed, draw out the very best in all of us.

In confronting the problems that plague science education and the public’s relationship with scientific culture, Ramirez points to the fact that women account for only 26% of STEM bachelor’s degrees and explores the heart of the glaring gender problem:

[There is a] false presumption that girls are not as good as boys in science and math. This message absolutely pervades our national mindset. Even though girls and boys sit next to each other in class, fewer women choose STEM careers than men. This is the equivalent to a farmer sowing seeds and then harvesting only half of the fields.

The precipitous drop in girls’ enrollment in STEM classes. (Source: J. F. Latimer, What’s Happened To Our High Schools)

In turning toward possible solutions, Ramirez calls out the faulty models of standardized testing, which fail to account for more dimensional definitions of intelligence. She writes:

There is a concept in physics that the observer of an experiment can change the results just by the act of observing (this is called, not surprisingly, the observer effect). For example, knowing the required pressure of your tires and observing that they are overinflated dictates that you let some air out, which changes the pressure slightly.

Although this theory is really for electrons and atoms, we also see it at work in schools. Schools are evaluated, by the federal and state governments, by tests. The students are evaluated by tests administered by the teachers. It is the process of testing that has changed the mission of the school from instilling a wide knowledge of the subject matter to acquiring a good score on the tests.

The United States is one of the most test-taking countries in the world, and the standard weapon is the multiple-choice question. Although multiple-choice tests are efficient in schools, they don’t inspire learning. In fact, they do just the opposite. This is hugely problematic in encouraging the skills needed for success in the 21st century. Standardized testing teaches skills that are counter to skills needed for the future, such as curiosity, problem solving, and having a healthy relationship with failure. Standardized tests draw up a fear of failure, since you seek a specific answer and you will be either right or wrong; they kick problem solving in the teeth, since you never need to show your work and never develop a habit of figuring things out; and they slam the doors to curiosity, since only a small selection of the possible answers is laid out before you. These kinds of tests produce thinkers who are unwilling to stretch and take risks and who cannot handle failure. They crush a sense of wonder.

Like Noam Chomsky, who has questioned why schools train for passing tests rather than for creative inquiry, and Sir Ken Robinson, who has eloquently advocated for changing the factory model of education, Ramirez urges:

While scientists passionately explore, reason, discover, synthesize, compare, contrast, and connect the dots, students drudgingly memorize, watch, and passively consume. Students are exercising the wrong muscle. An infusion of STEM taught in compelling ways will give students an opportunity to acquire these active learning skills.

Ramirez goes on to propose a multitude of small changes and larger shifts that communities, educators, cities, institutions, and policy-makers could implement — from neighborhood maker-spaces to wifi hotspots on school buses to university science festivals to new curricula and testing methods — that would begin to bridge the gap between what science education currently is and what scientific culture could and should be. She concludes, echoing Alvin Toffler’s famous words that “the illiterate of the 21st century will not be those who cannot read and write, but those who cannot learn, unlearn, and relearn”:

The skills of the 21st century need us to create scholars who can link the unlinkable. … Nurturing curious, creative problem solvers who can master the art of figuring things out will make them ready for this unknown brave new world. And that is the best legacy we can possibly leave.

Originally featured in February — see more here.

12. THE ELEMENTS OF EUCLID

Almost a century before Mondrian made his iconic red, yellow, and blue geometric compositions, and around the time that Edward Livingston Youmans was creating his stunning chemistry diagrams, an eccentric 19th-century civil engineer and mathematician named Oliver Byrne produced a striking series of vibrant diagrams in primary colors for an 1847 edition of the legendary Greek mathematical treatise Euclid’s Elements. Byrne, a vehement opponent of pseudoscience with an especial distaste for phrenology, was early to the insight that great design and graphic elegance can powerfully aid learning. He explained that in his edition of Euclid, “coloured diagrams and symbols are used instead of letters for the greater ease of learners.” The book, a masterpiece of Victorian printing and graphic design long before “graphic design” existed as a discipline, is celebrated as one of the most unusual and most beautiful books of the 19th century.

Now, the fine folks of Taschen — who have brought us such visual treasures as the best illustrations from 150 years of Hans Christian Andersen, the life and legacy of infographics godfather Fritz Kahn, and the visual history of magic — are resurrecting Byrne’s gem in the lavish tome The First Six Books of the Elements of Euclid (public library), edited by Swiss polymath Werner Oechslin.

Proof of the Pythagorean theorem

A masterwork of art and science in equal measure, this newly rediscovered treasure mesmerizes the eye with its brightly colored circles, squares, and triangles while it tickles the brain with its mathematical magic.

Originally featured in November — see more here.

13. DOES MY GOLDFISH KNOW WHO I AM?

In 2012, I wrote about a lovely book titled Big Questions from Little People & Simple Answers from Great Minds, in which some of today’s greatest scientists, writers, and philosophers answer kids’ most urgent questions, deceptively simple yet profound. It went on to become one of the year’s best books and among readers’ favorites. A few months later, Gemma Elwin Harris, the editor who had envisioned the project, reached out to invite me to participate in the book’s 2013 edition by answering one randomly assigned question from a curious child. Naturally, I was thrilled to do it, and honored to be a part of something as heartening as Does My Goldfish Know Who I Am? (public library), also among the best children’s books of the year — a compendium of primary school children’s funny, poignant, innocent yet insightful questions about science and how life works, answered by such celebrated minds as rockstar physicist Brian Cox, beloved broadcaster and voice-of-nature Sir David Attenborough, legendary linguist Noam Chomsky, science writer extraordinaire Mary Roach, stat-showman Hans Rosling, Beatle Paul McCartney, biologist and Beagle Project director Karen James, and iconic illustrator Sir Quentin Blake. As was the case with last year’s edition, more than half of the proceeds from the book — which features illustrations by the wonderful Andy Smith — are being donated to a children’s charity.

The questions range from what the purpose of science is to why onions make us cry to whether spiders can speak to why we blink when we sneeze. Psychologist and broadcaster Claudia Hammond, who recently explained the fascinating science of why time slows down when we’re afraid, speeds up as we age, and gets all warped while we’re on vacation in one of the best psychology and philosophy books of 2013, answers the most frequently asked question by the surveyed children: Why do we cry?

It’s normal to cry when you feel upset and until the age of twelve boys cry just as often as girls. But when you think about it, it is a bit strange that salty water spills out from the corners of your eyes just because you feel sad.

One professor noticed people often say that, despite their blotchy faces, a good cry makes them feel better. So he did an experiment where people had to breathe in over a blender full of onions that had just been chopped up. Not surprisingly this made their eyes water. He collected the tears and put them in the freezer. Then he got people to sit in front of a very sad film wearing special goggles which had tiny buckets hanging off the bottom, ready to catch their tears if they cried. The people cried, but the buckets didn’t work and in the end he gathered their tears in tiny test tubes instead.

He found that the tears people cried when they were upset contained extra substances, which weren’t in the tears caused by the onions. So he thinks maybe we feel better because we get rid of these substances by crying and that this is the purpose of tears.

But not everyone agrees. Many psychologists think that the reason we cry is to let other people know that we need their sympathy or help. So crying, provided we really mean it, brings comfort because people are nice to us.

Crying when we’re happy is a bit more of a mystery, but strong emotions have a lot in common, whether happy or sad, so they seem to trigger some of the same processes in the body.

(For a deeper dive into the biological mystery of crying, see the science of sobbing and emotional tearing.)

Joshua Foer, who knows a thing or two about superhuman memory and the limits of our mind, explains to 9-year-old Tom how the brain can store so much information despite being that small:

An adult’s brain only weighs about 1.4 kilograms, but it’s made up of about 100 billion microscopic neurons. Each of those neurons looks like a tiny branching tree, whose limbs reach out and touch other neurons. In fact, each neuron can make between 5,000 and 10,000 connections with other neurons — sometimes even more. That’s more than 500 trillion connections! A memory is essentially a pattern of connections between neurons.

Every sensation that you remember, every thought that you think, transforms your brain by altering the connections within that vast network. By the time you get to the end of this sentence, you will have created a new memory, which means your brain will have physically changed.
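(A quick sanity check of Foer’s arithmetic, mine rather than the book’s: taking the lower figure of 5,000 connections per neuron, $10^{11} \times 5 \times 10^{3} = 5 \times 10^{14}$, which is exactly 500 trillion, so “more than 500 trillion” follows as soon as the average number of connections exceeds 5,000.)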

Neuroscientist Tali Sharot, who has previously studied why our brains are wired for optimism, answers 8-year-old Maia’s question about why we don’t have memories from the time we were babies and toddlers:

We use our brain for memory. In the first few years of our lives, our brain grows and changes a lot, just like the rest of our body. Scientists think that because the parts of our brain that are important for memory have not fully developed when we are babies, we are unable to store memories in the same way that we do when we are older.

Also, when we are very young we do not know how to speak. This makes it difficult to keep events in your mind and remember them later, because we use language to remember what happened in the past.

In answering 8-year-old Hannah’s question about what newspapers do when there is no news, writer and journalist Oliver Burkeman, author of the excellent The Antidote: Happiness for People Who Can’t Stand Positive Thinking, offers a primer on media literacy — an important caveat on news that even we, as alleged grown-ups, frequently forget:

Newspapers don’t really go out and find the news: they decide what gets to count as news. The same goes for television and radio. And you might disagree with their decisions! (For example, journalists are often accused of focusing on bad news and ignoring the good, making the world seem worse than it is.)

The important thing to remember, whenever you’re reading or watching the news, is that someone decided to tell you those things, while leaving out other things. They’re presenting one particular view of the world — not the only one. There’s always another side to the story.

And my answer, to 9-year-old Ottilie’s question about why we have books:

Some people might tell you that books are no longer necessary now that we have the internet. Don’t believe them. Books help us know other people, know how the world works, and, in the process, know ourselves more deeply in a way that has nothing to do with what you read them on and everything to do with the curiosity, integrity and creative restlessness you bring to them.

Books build bridges to the lives of others, both the characters in them and your countless fellow readers across other lands and other eras, and in doing so elevate you and anchor you more solidly into your own life. They give you a telescope into the minds of others, through which you begin to see with ever greater clarity the starscape of your own mind.

And though the body and form of the book will continue to evolve, its heart and soul never will. Though the telescope might change, the cosmic truths it invites you to peer into remain eternal like the Universe.

In many ways, books are the original internet — each fact, each story, each new bit of information can be a hyperlink to another book, another idea, another gateway into the endlessly whimsical rabbit hole of the written word. Just like the web pages you visit most regularly, your physical bookmarks take you back to those book pages you want to return to again and again, to reabsorb and relive, finding new meaning on each visit — because the landscape of your life is different, new, “reloaded” by the very act of living.

Originally featured in November — read more of the questions and answers here.

HONORABLE MENTIONS

The Space Book: From the Beginning to the End of Time, 250 Milestones in the History of Space & Astronomy by Jim Bell, An Appetite for Wonder: The Making of a Scientist by Richard Dawkins, and The Age of Edison: Electric Light and the Invention of Modern America by Ernest Freeberg.


The 13 Best Psychology and Philosophy Books of 2013

How to think like Sherlock Holmes, make better mistakes, master the pace of productivity, find fulfilling work, stay sane, and more.

After the best biographies, memoirs, and history books of 2013, the season’s subjective selection of best-of reading lists continue with the most stimulating psychology and philosophy books published this year. (Catch up on the 2012 roundup here and 2011’s here.)

1. ON LOOKING: ELEVEN WALKS WITH EXPERT EYES

“How we spend our days,” Annie Dillard wrote in her timelessly beautiful meditation on presence over productivity, “is, of course, how we spend our lives.” And nowhere do we fail at the art of presence more miserably and more tragically than in urban life — in the city, high on the cult of productivity, where we float past each other, past the buildings and trees and the little boy in the purple pants, past life itself, cut off from the breathing of the world by iPhone earbuds and solipsism. And yet: “The art of seeing has to be learned,” Marguerite Duras reverberates — and it can be learned, as cognitive scientist Alexandra Horowitz invites us to believe in her breathlessly wonderful On Looking: Eleven Walks with Expert Eyes (public library) — a record of her quest to walk around a city block with eleven different “experts,” from an artist to a geologist to a dog, and emerge with fresh eyes mesmerized by the previously unseen fascinations of a familiar world. It is undoubtedly one of the most stimulating books of the year, if not the decade, and the most enchanting thing I’ve read in ages. In a way, it’s the opposite but equally delightful mirror image of Christoph Niemann’s Abstract City — a concrete, immersive examination of urbanity — blending the mindfulness of Sherlock Holmes with the expansive sensitivity of Thoreau.

Horowitz begins by pointing our attention to the incompleteness of our experience of what we conveniently call “reality”:

Right now, you are missing the vast majority of what is happening around you. You are missing the events unfolding in your body, in the distance, and right in front of you.

By marshaling your attention to these words, helpfully framed in a distinct border of white, you are ignoring an unthinkably large amount of information that continues to bombard all of your senses: the hum of the fluorescent lights, the ambient noise in a large room, the places your chair presses against your legs or back, your tongue touching the roof of your mouth, the tension you are holding in your shoulders or jaw, the map of the cool and warm places on your body, the constant hum of traffic or a distant lawn-mower, the blurred view of your own shoulders and torso in your peripheral vision, a chirp of a bug or whine of a kitchen appliance.

This adaptive ignorance, she argues, is there for a reason — we celebrate it as “concentration” and welcome its way of easing our cognitive overload by allowing us to conserve our precious mental resources only for the stimuli of immediate and vital importance, and to dismiss or entirely miss all else. (“Attention is an intentional, unapologetic discriminator,” Horowitz tells us. “It asks what is relevant right now, and gears us up to notice only that.”) But while this might make us more efficient in our goal-oriented day-to-day, it also makes us inhabit a largely unlived — and unremembered — life, day in and day out.

For Horowitz, the awakening to this incredible, invisible backdrop of life came thanks to Pumpernickel, her “curly haired, sage mixed breed” (who also inspired Horowitz’s first book, the excellent Inside of a Dog: What Dogs See, Smell, and Know), as she found herself taking countless walks around the block, becoming more and more aware of the dramatically different experiences she and her canine companion were having along the exact same route:

Minor clashes between my dog’s preferences as to where and how a walk should proceed and my own indicated that I was experiencing almost an entirely different block than my dog. I was paying so little attention to most of what was right before us that I had become a sleepwalker on the sidewalk. What I saw and attended to was exactly what I expected to see; what my dog showed me was that my attention invited along attention’s companion: inattention to everything else.

The book was her answer to the disconnect, an effort to “attend to that inattention.” It is not, she warns us, “about how to bring more focus to your reading of Tolstoy or how to listen more carefully to your spouse.” Rather, it is an invitation to the art of observation:

Together, we became investigators of the ordinary, considering the block — the street and everything on it — as a living being that could be observed.

In this way, the familiar becomes unfamiliar, and the old the new.

Her approach is based on two osmotic human tendencies: our shared capacity to truly see what is in front of us, despite our conditioned concentration that obscures it, and the power of individual bias in perception — or what we call “expertise,” acquired by passion or training or both — in bringing attention to elements that elude the rest of us. What follows is a whirlwind of endlessly captivating exercises in attentive bias as Horowitz, with her archetypal New Yorker’s “special fascination with the humming life-form that is an urban street,” and her diverse companions take to the city.

First, she takes a walk all by herself, trying to note everything observable, and we quickly realize that besides her deliciously ravenous intellectual curiosity, Horowitz is a rare magician with language. (“The walkers trod silently; the dogs said nothing. The only sound was the hum of air conditioners,” she beholds her own block; passing a pile of trash bags graced by a stray Q-tip, she ponders parenthetically, “how does a Q-tip escape?”; turning her final corner, she gazes at the entrance of a mansion and “its pair of stone lions waiting patiently for royalty that never arrives.” Stunning.)

But as soon as she joins her experts, Horowitz is faced with the grimacing awareness that despite her best, most Sherlockian efforts, she was “missing pretty much everything.” She arrives at a newfound, profound understanding of what William James meant when he wrote, “My experience is what I agree to attend to. Only those items which I notice shape my mind.”:

I would find myself at once alarmed, delighted, and humbled at the limitations of my ordinary looking. My consolation is that this deficiency of mine is quite human. We see, but we do not see: we use our eyes, but our gaze is glancing, frivolously considering its object. We see the signs, but not their meanings. We are not blinded, but we have blinders.

Originally featured in August, with a closer look at the expert insights. For another peek at this gem, which is easily among my top three favorite books of the past decade, learn how to do the step-and-slide.

2. TIME WARPED

Given my soft spot for famous diaries, it should come as no surprise that I keep one myself. Perhaps the greatest gift of the practice has been the daily habit of reading what I had written on that day a year earlier; not only is it a remarkable tool of introspection and self-awareness, but it also illustrates that our memory “is never a precise duplicate of the original [but] a continuing act of creation” and how flawed our perception of time is — almost everything that occurred a year ago appears as having taken place either significantly further in the past (“a different lifetime,” I’d often marvel at this time-illusion) or significantly more recently (“this feels like just last month!”). Rather than a personal deficiency afflicting those of us prone to this tendency, however, this turns out to be a defining feature of how the human mind works, the science of which is at first unsettling, then strangely comforting, and altogether intensely interesting.

That’s precisely what acclaimed BBC broadcaster and psychology writer Claudia Hammond explores in Time Warped: Unlocking the Mysteries of Time Perception (public library) — a fascinating foray into the idea that our experience of time is actively created by our own minds and how these sensations of what neuroscientists and psychologists call “mind time” are created. As disorienting as the concept might seem — after all, we’ve been nursed on the belief that time is one of those few utterly reliable and objective things in life — it is also strangely empowering to think that the very phenomenon depicted as the unforgiving dictator of life is something we might be able to shape and benefit from. Hammond writes:

We construct the experience of time in our minds, so it follows that we are able to change the elements we find troubling — whether it’s trying to stop the years racing past, or speeding up time when we’re stuck in a queue, trying to live more in the present, or working out how long ago we last saw our old friends. Time can be a friend, but it can also be an enemy. The trick is to harness it, whether at home, at work, or even in social policy, and to work in line with our conception of time. Time perception matters because it is the experience of time that roots us in our mental reality. Time is not only at the heart of the way we organize life, but the way we experience it.

Discus chronologicus, a depiction of time by German engraver Christoph Weigel, published in the early 1720s; from Cartographies of Time.

Among the most intriguing illustrations of “mind time” is the incredible elasticity of how we experience time. (“Where is it, this present?,” William James famously wondered. “It has melted in our grasp, fled ere we could touch it, gone in the instant of becoming.”) For instance, Hammond points out, we slow time down when gripped by mortal fear — the cliché about the slow-motion car crash is, in fact, a cognitive reality. This plays out even in situations that aren’t life-or-death per se but are still associated with strong feelings of fear. Hammond points to a study in which people with arachnophobia were asked to look at spiders — the very object of their intense fear — for 45 seconds, and they overestimated the elapsed time. The same pattern was observed in novice skydivers, who estimated the duration of their peers’ falls as short, whereas their own, from the same altitude, were deemed longer.

Inversely, time seems to speed up as we get older — a phenomenon that competing theories have attempted to explain. One, known as the “proportionality theory,” uses pure mathematics, holding that a year feels faster when you’re 40 than when you’re 8 because it only constitutes one fortieth of your life rather than a whole eighth. Among its famous proponents are Vladimir Nabokov and William James. But Hammond remains unconvinced:

The problem with the proportionality theory is that it fails to account for the way we experience time at any one moment. We don’t judge one day in the context of our whole lives. If we did, then for a 40-year-old every single day should flash by because it is less than one fourteen-thousandth of the life they’ve had so far. It should be fleeting and inconsequential, yet if you have nothing to do or an enforced wait at an airport for example, a day at 40 can still feel long and boring and surely longer than a fun day at the seaside packed with adventure for a child. … It ignores attention and emotion, which … can have a considerable impact on time perception.
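The arithmetic behind the theory, and behind Hammond’s “one fourteen-thousandth,” is easy to check. Here is a minimal sketch (my own illustration in Python, not anything from Hammond’s book) of the ratios the proportionality theory relies on:

```python
# Proportionality theory, as an illustration: a stretch of time is "felt"
# in proportion to the fraction of your life-so-far that it represents.

def felt_fraction(duration_years: float, age_years: float) -> float:
    """Fraction of a life lived so far that a given duration represents."""
    return duration_years / age_years

print(felt_fraction(1, 8))   # 0.125 -> a year at 8 is one eighth of your life
print(felt_fraction(1, 40))  # 0.025 -> a year at 40 is one fortieth

day = 1 / 365.25             # one day, expressed in years
print(1 / felt_fraction(day, 40))  # 14610.0 -> a day at 40 is less than one
                                   # fourteen-thousandth of life so far
```

The numbers bear out Hammond’s figure, and also her objection: nothing in the formula accounts for attention or emotion, which is exactly why a dull day of waiting at the airport can still feel endless at 40.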

Another theory suggests that perhaps it is the tempo of life in general that has accelerated, making things from the past appear slower, including the passage of time itself.

But one definite change does take place with age: As we grow older, we tend to feel like the previous decade elapsed more rapidly, while the earlier decades of our lives seem to have lasted longer. Similarly, we tend to think of events that took place in the past 10 years as having happened more recently than they actually did. (Quick: What year did the devastating Japanese tsunami hit? When did we lose Maurice Sendak?) Conversely, we perceive events that took place more than a decade ago as having happened even longer ago. (When did Princess Diana die? What year was the Chernobyl disaster?) This, Hammond points out, is known as “forward telescoping”:

It is as though time has been compressed and — as if looking through a telescope — things seem closer than they really are. The opposite is called backward or reverse telescoping, also known as time expansion. This is when you guess that events happened longer ago than they really did. This is rare for distant events, but not uncommon for recent weeks.

[…]

The most straightforward explanation for it is called the clarity of memory hypothesis, proposed by the psychologist Norman Bradburn in 1987. This is the simple idea that because we know that memories fade over time, we use the clarity of a memory as a guide to its recency. So if a memory seems unclear we assume it happened longer ago.

Originally featured in July, with a deeper dive into the psychology of why time slows down when we’re afraid, speeds up as we age, and gets warped when we’re on vacation.

3. HOW TO FIND FULFILLING WORK

“If one wanted to crush and destroy a man entirely, to mete out to him the most terrible punishment,” wrote Dostoevsky, “all one would have to do would be to make him do work that was completely and utterly devoid of usefulness and meaning.” Indeed, the quest to avoid meaningless work and instead make a living doing what you love is a constant conundrum of modern life. In How to Find Fulfilling Work (public library) — the latest installment in The School of Life’s wonderful series reclaiming the traditional self-help genre as intelligent, non-self-helpy, yet immensely helpful guides to modern living, which previously gave us Philippa Perry’s How to Stay Sane and Alain de Botton’s How to Think More About Sex — philosopher Roman Krznaric (remember him?) explores the roots of this contemporary quandary and guides us to its fruitful resolution:

The desire for fulfilling work — a job that provides a deep sense of purpose, and reflects our values, passions and personality — is a modern invention. … For centuries, most inhabitants of the Western world were too busy struggling to meet their subsistence needs to worry about whether they had an exciting career that used their talents and nurtured their wellbeing. But today, the spread of material prosperity has freed our minds to expect much more from the adventure of life.

We have entered a new age of fulfillment, in which the great dream is to trade up from money to meaning.

Krznaric goes on to outline two key afflictions of the modern workplace — “a plague of job dissatisfaction” and “uncertainty about how to choose the right career” — and frames the problem:

Never have so many people felt so unfulfilled in their career roles, and been so unsure what to do about it. Most surveys in the West reveal that at least half the workforce are unhappy in their jobs. One cross-European study showed that 60 per cent of workers would choose a different career if they could start again. In the United States, job satisfaction is at its lowest level — 45 per cent — since record-keeping began over two decades ago.

Of course, Krznaric points out, there’s plenty of cynicism and skepticism to go around, with people questioning whether it’s even possible to find a job in which we thrive and feel complete. He offers an antidote to the default thinking:

There are two broad ways of thinking about these questions. The first is the ‘grin and bear it’ approach. This is the view that we should get our expectations under control and recognize that work, for the vast majority of humanity — including ourselves — is mostly drudgery and always will be. Forget the heady dream of fulfillment and remember Mark Twain’s maxim: “Work is a necessary evil to be avoided.” … The history is captured in the word itself. The Latin labor means drudgery or toil, while the French travail derives from the tripalium, an ancient Roman instrument of torture made of three sticks. … The message of the ‘grin and bear it’ school of thought is that we need to accept the inevitable and put up with whatever job we can get, as long as it meets our financial needs and leaves us enough time to pursue our ‘real life’ outside office hours. The best way to protect ourselves from all the optimistic pundits peddling fulfillment is to develop a hardy philosophy of acceptance, even resignation, and not set our hearts on finding a meaningful career.

I am more hopeful than this, and subscribe to a different approach, which is that it is possible to find work that is life-enhancing, that broadens our horizons and makes us feel more human.

[…]

This is a book for those who are looking for a job that is big enough for their spirit, something more than a ‘day job’ whose main function is to pay the bills.

Krznaric considers the five keys to making a career meaningful — earning money, achieving status, making a difference, following our passions, and using our talents — but goes on to demonstrate that they aren’t all created equal. In particular, he echoes 1970s Zen pioneer Alan Watts and modern science in arguing that money alone is a poor motivator:

Schopenhauer may have been right that the desire for money is widespread, but he was wrong on the issue of equating money with happiness. Overwhelming evidence has emerged in the last two decades that the pursuit of wealth is an unlikely path to achieving personal wellbeing — the ancient Greek ideal of eudaimonia or ‘the good life.’ The lack of any clear positive relationship between rising income and rising happiness has become one of the most powerful findings in the modern social sciences. Once our income reaches an amount that covers our basic needs, further increases add little, if anything, to our levels of life satisfaction.

The second false prophet of fulfillment, as Y Combinator founder Paul Graham has poignantly cautioned and Debbie Millman has poetically articulated, is prestige. Krznaric admonishes:

We can easily find ourselves pursuing a career that society considers prestigious, but which we are not intrinsically devoted to ourselves — one that does not fulfill us on a day-to-day basis.

Krznaric offers respect, which he defines as “being appreciated for what we personally bring to a job, and being valued for our individual contribution,” as the positive counterpart to prestige and status, arguing that “in our quest for fulfilling work, we should seek a job that offers not just good status prospects, but good respect prospects.”

Rather than hoping to create a harmonious union between the pursuit of money and values, we might have better luck trying to combine values with talents. This idea comes courtesy of Aristotle, who is attributed with saying, ‘Where the needs of the world and your talents cross, there lies your vocation.’

Originally featured in April — read the full article here.

4. INTUITION PUMPS

“If you are not making mistakes, you’re not taking enough risks,” Debbie Millman counseled. “Make New Mistakes. Make glorious, amazing mistakes. Make mistakes nobody’s ever made before,” Neil Gaiman advised young creators. In Intuition Pumps And Other Tools for Thinking (public library), the inimitable Daniel Dennett, one of our greatest living philosophers, offers a set of thinking tools — “handy prosthetic imagination-extenders and focus holders” — that allow us to “think reliably and even gracefully about really hard questions” and to enhance our cognitive toolkit. He calls these tools “intuition pumps” — thought experiments designed to stir “a heartfelt, table-thumping intuition” (which we know is a pillar of even the most “rational” of science) about the question at hand, a kind of persuasion tool whose reverse-engineering enables us to think better about thinking itself. Intuition, of course, is a domain-specific ability that relies on honed critical thinking rather than a mystical quality bestowed by the gods — but that’s precisely Dennett’s point, and his task is to help us hone it.

Though most of his 77 “intuition pumps” address concrete questions, a dozen are “general-purpose” tools that apply deeply and widely, across just about any domain of thinking. The first of them is also arguably the most useful yet most uncomfortable: making mistakes.

Echoing Dorion Sagan’s case for why science and philosophy need each other, Dennett begins with an astute contribution to the best definitions of philosophy, wrapped in a necessary admonition about the value of history:

The history of philosophy is in large measure the history of very smart people making very tempting mistakes, and if you don’t know the history, you are doomed to making the same darn mistakes all over again. … There is no such thing as philosophy-free science, just science that has been conducted without any consideration of its underlying philosophical assumptions.

He speaks to the generative potential of mistakes and their usefulness as an empirical tool:

Sometimes you don’t just want to risk making mistakes; you actually want to make them — if only to give you something clear and detailed to fix.

Therein lies the power of mistakes as a vehicle for, as Rilke famously put it, “living the questions” and thus advancing knowledge in a way that certainty cannot — for, as Richard Feynman memorably noted, the scientist’s job is to remain unsure, and so, it seems, is the philosopher’s. Dennett writes:

We philosophers are mistake specialists. … While other disciplines specialize in getting the right answers to their defining questions, we philosophers specialize in all the ways there are of getting things so mixed up, so deeply wrong, that nobody is even sure what the right questions are, let alone the answers. Asking the wrong questions risks setting any inquiry off on the wrong foot. Whenever that happens, this is a job for philosophers! Philosophy — in every field of inquiry — is what you have to do until you figure out what questions you should have been asking in the first place.

[…]

Mistakes are not just opportunities for learning; they are, in an important sense, the only opportunity for learning or making something truly new. Before there can be learning, there must be learners. There are only two non-miraculous ways for learners to come into existence: they must either evolve or be designed and built by learners that evolved. Biological evolution proceeds by a grand, inexorable process of trial and error — and without the errors the trials wouldn’t accomplish anything.

Dennett offers a caveat that at once highlights the importance of acquiring knowledge and reminds us of the power of “chance-opportunism”:

Trials can be either blind or foresighted. You, who know a lot, but not the answer to the question at hand, can take leaps — foresighted leaps. You can look before you leap, and hence be somewhat guided from the outset by what you already know. You need not be guessing at random, but don’t look down your nose at random guesses; among its wonderful products is … you!

And since evolution is the epitome of how the process of trial and error drives progress, Dennett makes a case for understanding evolution as a key to understanding everything else we humans value:

Evolution … is the central, enabling process not only of life but also of knowledge and learning and understanding. If you attempt to make sense of the world of ideas and meanings, free will and morality, art and science and even philosophy itself without a sound and quite detailed knowledge of evolution, you have one hand tied behind your back. … For evolution, which knows nothing, the steps into novelty are blindly taken by mutations, which are random copying “errors” in DNA.

Dennett echoes Dostoyevsky (“Above all, don’t lie to yourself. The man who lies to himself and listens to his own lie comes to a point that he cannot distinguish the truth within him, or around him, and so loses all respect for himself and for others.”) and offers the key to making productive mistakes:

The chief trick to making good mistakes is not to hide them — especially not from yourself. Instead of turning away in denial when you make a mistake, you should become a connoisseur of your own mistakes, turning them over in your mind as if they were works of art, which in a way they are. … The trick is to take advantage of the particular details of the mess you’ve made, so that your next attempt will be informed by it and not just another blind stab in the dark.

We have all heard the forlorn refrain “Well, it seemed like a good idea at the time!” This phrase has come to stand for the rueful reflection of an idiot, a sign of stupidity, but in fact we should appreciate it as a pillar of wisdom. Any being, any agent, who can truly say, “Well, it seemed like a good idea at the time!” is standing on the threshold of brilliance.

Originally featured in May — read the full article here.

5. MASTERMIND: HOW TO THINK LIKE SHERLOCK HOLMES

“The habit of mind which leads to a search for relationships between facts,” wrote James Webb Young in his famous 1939 5-step technique for creative problem-solving, “becomes of the highest importance in the production of ideas.” But just how does one acquire those vital cognitive customs? That’s precisely what science writer Maria Konnikova explores in Mastermind: How to Think Like Sherlock Holmes (UK; public library) — an effort to reverse-engineer Holmes’s methodology into actionable insights that help develop “habits of thought that will allow you to engage mindfully with yourself and your world as a matter of course.”

Bridging ample anecdotes from the adventures of Conan Doyle’s beloved detective with psychology studies both classic and cutting-edge, Konnikova builds a compelling case at the intersection of science and secular spirituality, stressing the power of rigorous observation alongside a Buddhist-like, Cageian emphasis on mindfulness. She writes:

The idea of mindfulness itself is by no means a new one. As early as the end of the nineteenth century, William James, the father of modern psychology, wrote that, ‘The faculty of voluntarily bringing back a wandering attention, over and over again, is the very root of judgment, character, and will. … An education which should improve this faculty would be the education par excellence.’ That faculty, at its core, is the very essence of mindfulness. And the education that James proposes, an education in a mindful approach to life and to thought.

[…]

In recent years, studies have shown that meditation-like thought (an exercise in the very attentional control that forms the center of mindfulness), for as little as fifteen minutes a day, can shift frontal brain activity toward a pattern that has been associated with more positive and more approach-oriented emotional states, and that looking at scenes of nature, for even a short while, can help us become more insightful, more creative, and more productive. We also know, more definitively than we ever have, that our brains are not built for multitasking — something that precludes mindfulness altogether. When we are forced to do multiple things at once, not only do we perform worse on all of them but our memory decreases and our general wellbeing suffers a palpable hit.

But for Sherlock Holmes, mindful presence is just a first step. It’s a means to a far larger, far more practical and practically gratifying goal. Holmes provides precisely what William James had prescribed: an education in improving our faculty of mindful thought and in using it in order to accomplish more, think better, and decide more optimally. In its broadest application, it is a means for improving overall decision making and judgment ability, starting from the most basic building block of your own mind.

But mindfulness, and the related mental powers it bestows upon its master, is a skill acquired with grit and practice, rather than an inborn talent or an easy feat attained with a few half-hearted tries:

It is most difficult to apply Holmes’s logic in those moments that matter the most. And so, all we can do is practice, until our habits are such that even the most severe stressors will bring out the very thought patterns that we’ve worked so hard to master.

Echoing Carl Sagan, Konnikova examines the role of intuition — a grab-bag concept embraced by some of history’s greatest scientific minds, cultural icons, and philosophers — as both a helpful directional signpost of intellectual inquiry and a dangerous blind spot:

Our intuition is shaped by context, and that context is deeply informed by the world we live in. It can thus serve as a blinder — or blind spot — of sorts. … With mindfulness, however, we can strive to find a balance between fact-checking our intuitions and remaining open-minded. We can then make our best judgments, with the information we have and no more, but with, as well, the understanding that time may change the shape and color of that information.

“I consider that a man’s brain originally is like a little empty attic, and you have to stock it with such furniture as you choose,” Holmes famously remarked. Indeed, much like the inventor’s mind, the problem-solver’s mind is the product of that very choice: The details and observations we select to include in our “brain attic” shape and filter our perception of reality. Konnikova writes:

Observation with a capital O — the way Holmes uses the word when he gives his new companion a brief history of his life with a single glance — does entail more than, well, observation (the lowercase kind). It’s not just about the passive process of letting objects enter into your visual field. It is about knowing what and how to observe and directing your attention accordingly: what details do you focus on? What details do you omit? And how do you take in and capture those details that you do choose to zoom in on? In other words, how do you maximize your brain attic’s potential? You don’t just throw any old detail up there, if you remember Holmes’s early admonitions; you want to keep it as clean as possible. Everything we choose to notice has the potential to become a future furnishing of our attics — and what’s more, its addition will mean a change in the attic’s landscape that will affect, in turn, each future addition. So we have to choose wisely.

Choosing wisely means being selective. It means not only looking but looking properly, looking with real thought. It means looking with the full knowledge that what you note — and how you note it — will form the basis of any future deductions you might make. It’s about seeing the full picture, noting the details that matter, and understanding how to contextualize those details within a broader framework of thought.

Originally featured in January — read the full article for more, including Konnikova’s four rules for Sherlockian thinking.

6. MAKE GOOD ART

Commencement season is upon us and, after Greil Marcus’s soul-stirring speech on the essence of art at the 2013 School of Visual Arts graduation ceremony, here comes an exceptional adaptation of one of the best commencement addresses ever delivered: In May of 2012, beloved author Neil Gaiman stood up in front of the graduating class at Philadelphia’s University of the Arts and dispensed some timeless advice on the creative life; now, his talk comes to life as a slim but potent book titled Make Good Art (public library).

Best of all, it’s designed by none other than the inimitable Chip Kidd, who has spent the past fifteen years shaping the voice of contemporary cover design with his prolific and consistently stellar output, ranging from bestsellers like cartoonist Chris Ware’s sublime Building Stories and neurologist Oliver Sacks’s The Mind’s Eye to lesser-known gems like The Paris Review‘s Women Writers at Work and The Letter Q, that wonderful anthology of queer writers’ letters to their younger selves. (Fittingly, Kidd also designed the book adaptation of Ann Patchett’s 2006 commencement address.)

Among the book’s most quoted passages is Gaiman’s rallying cry for weathering adversity:

When things get tough, this is what you should do: Make good art. I’m serious. Husband runs off with a politician — make good art. Leg crushed and then eaten by a mutated boa constrictor — make good art. IRS on your trail — make good art. Cat exploded — make good art. Someone on the Internet thinks what you’re doing is stupid or evil or it’s all been done before — make good art. Probably things will work out somehow, eventually time will take the sting away, and that doesn’t even matter. Do what only you can do best: Make good art. Make it on the bad days, make it on the good days, too.

A wise woman once said, “If you are not making mistakes, you’re not taking enough risks.” Gaiman articulates the same sentiment with his own brand of exquisite eloquence:

I hope that in this year to come, you make mistakes.

Because if you are making mistakes, then you are making new things, trying new things, learning, living, pushing yourself, changing yourself, changing your world. You’re doing things you’ve never done before, and more importantly, you’re Doing Something.

So that’s my wish for you, and all of us, and my wish for myself. Make New Mistakes. Make glorious, amazing mistakes. Make mistakes nobody’s ever made before. Don’t freeze, don’t stop, don’t worry that it isn’t good enough, or it isn’t perfect, whatever it is: art, or love, or work or family or life.

Whatever it is you’re scared of doing, Do it.

Make your mistakes, next year and forever.

Originally featured in May — read the full article here, along with a video of Gaiman’s original commencement address.

7. HOW CHILDREN SUCCEED

In How Children Succeed: Grit, Curiosity, and the Hidden Power of Character (public library) — a necessary addition to these fantastic reads on education — Paul Tough, whose writing has appeared in The New Yorker, Slate, Esquire, and The New York Times, sets out to investigate the essential building blocks of character through the findings and practical insight of exceptional educators and bleeding-edge researchers. One of his core arguments is based on the work of pioneering psychologist and 2013 MacArthur “genius” grantee Angela Duckworth, who studied under positive psychology godfather Martin Seligman at my alma mater, the University of Pennsylvania, and has done more than anyone to advance our understanding of how self-control and grit — the relentless work ethic of sustaining your commitments toward a long-term goal — impact success.

Duckworth had come to Penn in 2002, at the age of thirty-two, later in life than a typical graduate student. The daughter of Chinese immigrants, she had been a classic multitasking overachiever in her teens and twenties. After completing her undergraduate degree at Harvard (and starting a summer school for low-income kids in Cambridge in her spare time), she had bounced from one station of the mid-nineties meritocracy to the next: intern in the White House speechwriting office, Marshall scholar at Oxford (where she studied neuroscience), management consultant for McKinsey and Company, charter-school adviser.

Duckworth spent a number of years toying with the idea of starting her own charter school, but eventually concluded that the model didn’t hold much promise for changing the circumstances of children from disadvantaged backgrounds, those whom the education system was failing most tragically. Instead, she decided to pursue a PhD program at Penn. In her application essay, she shared how profoundly the experience of working in schools had changed her view of school reform and wrote:

The problem, I think, is not only the schools but also the students themselves. Here’s why: learning is hard. True, learning is fun, exhilarating and gratifying — but it is also often daunting, exhausting and sometimes discouraging. . . . To help chronically low-performing but intelligent students, educators and parents must first recognize that character is at least as important as intellect.

Duckworth began her graduate work by studying self-discipline. But when she completed her first-year thesis, based on a group of 164 eighth-graders from a Philadelphia middle school, she arrived at a startling discovery that would shape the course of her career: She found that the students’ self-discipline scores were far better predictors of their academic performance than their IQ scores. So she became intensely interested in what strategies and tricks we might develop to maximize our self-control, and whether those strategies can be taught. But self-control, it turned out, was only a good predictor when it came to immediate, concrete goals — like, say, resisting a cookie. Tough writes:

Duckworth finds it useful to divide the mechanics of achievement into two separate dimensions: motivation and volition. Each one, she says, is necessary to achieve long-term goals, but neither is sufficient alone. Most of us are familiar with the experience of possessing motivation but lacking volition: You can be extremely motivated to lose weight, for example, but unless you have the volition — the willpower, the self-control — to put down the cherry Danish and pick up the free weights, you’re not going to succeed. If a child is highly motivated, the self-control techniques and exercises Duckworth tried to teach [the students in her study] might be very helpful. But what if students just aren’t motivated to achieve the goals their teachers or parents want them to achieve? Then, Duckworth acknowledges, all the self-control tricks in the world aren’t going to help.

This is where grit comes in — the X-factor that helps us attain more long-term, abstract goals. To address this, Duckworth and her colleague Chris Peterson developed the Grit Scale — a deceptively simple test on which you evaluate how much each of twelve statements applies to you, from “I am a hard worker” to “New ideas and projects sometimes distract me from previous ones.” The results are profoundly predictive of success in such wide-ranging domains of achievement as the National Spelling Bee and the West Point military academy. Tough describes the surprising power of this seemingly mundane questionnaire:

For each statement, respondents score themselves on a five-point scale, ranging from 5, “very much like me,” to 1, “not like me at all.” The test takes about three minutes to complete, and it relies entirely on self-report — and yet when Duckworth and Peterson took it out into the field, they found it was remarkably predictive of success. Grit, Duckworth discovered, is only faintly related to IQ — there are smart gritty people and dumb gritty people — but at Penn, high grit scores allowed students who had entered college with relatively low college-board scores to nonetheless achieve high GPAs. At the National Spelling Bee, Duckworth found that children with high grit scores were more likely to survive to the later rounds. Most remarkable, Duckworth and Peterson gave their grit test to more than twelve hundred freshman cadets as they entered the military academy at West Point and embarked on the grueling summer training course known as Beast Barracks. The military has developed its own complex evaluation, called the whole candidate score, to judge incoming cadets and predict which of them will survive the demands of West Point; it includes academic grades, a gauge of physical fitness, and a leadership potential score. But the more accurate predictor of which cadets persisted in Beast Barracks and which ones dropped out turned out to be Duckworth’s simple little twelve-item grit questionnaire.

You can take the Grit Scale here (registration is free).
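For a sense of the mechanics, here is a minimal sketch of how a twelve-item, five-point self-report scale of this kind is typically scored. This is my own hypothetical illustration rather than the official instrument; the published scale reverse-codes its “consistency of interest” items, so agreeing with a statement like “New ideas and projects sometimes distract me from previous ones” counts against the total.

```python
# Illustrative scoring of a grit-style questionnaire (hypothetical sketch).
# Each item is rated 1 ("not like me at all") through 5 ("very much like me").

RESPONSES = {
    1: 5, 2: 3, 3: 4, 4: 2, 5: 5, 6: 3,
    7: 4, 8: 2, 9: 5, 10: 3, 11: 4, 12: 2,
}
# Assumed reverse-coded items, e.g. "New ideas ... distract me from previous ones":
REVERSE_CODED = {2, 4, 6, 8, 10, 12}

def grit_score(responses: dict, reverse: set) -> float:
    """Average all items after flipping reverse-coded ones (5 -> 1, 4 -> 2, ...)."""
    total = sum(6 - rating if item in reverse else rating
                for item, rating in responses.items())
    return total / len(responses)

print(grit_score(RESPONSES, REVERSE_CODED))  # 4.0 on a 1-to-5 scale
```

The virtue of this design, exactly as Tough describes, is that it takes about three minutes and relies entirely on self-report, which is what makes its predictive power at Penn, the Spelling Bee, and Beast Barracks so startling.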

8. THINKING: THE NEW SCIENCE OF DECISION-MAKING, PROBLEM-SOLVING AND PREDICTION

Every year, intellectual impresario and Edge editor John Brockman summons some of our era’s greatest thinkers and unleashes them on one provocative question, whether it’s the single most elegant theory of how the world works or the best way to enhance our cognitive toolkit. This year, he sets out on the most ambitious quest yet, a meta-exploration of thought itself: Thinking: The New Science of Decision-Making, Problem-Solving, and Prediction (public library) collects short essays and lecture adaptations from such celebrated and wide-ranging (though not in gender) minds as Daniel Dennett, Jonathan Haidt, Dan Gilbert, and Timothy Wilson, covering subjects as diverse as morality, essentialism, and the adolescent brain.

One of the most provocative contributions comes from Nobel-winning psychologist Daniel Kahneman — author of the indispensable Thinking, Fast and Slow, one of the best psychology books of 2012 — who examines “the marvels and the flaws of intuitive thinking.”

In the 1970s, Kahneman and his colleague Amos Tversky, self-crowned “prophets of irrationality,” began studying what they called “heuristics and biases” — mental shortcuts we take, which frequently result in cognitive errors. Those errors, however, reveal a great deal about how our minds work:

If you want to characterize how something is done, then one of the most powerful ways of characterizing how the mind does anything is by looking at the errors that the mind produces while it’s doing it because the errors tell you what it is doing. Correct performance tells you much less about the procedure than the errors do.

One of the most fascinating examples of heuristics and biases is what we call intuition — a complex cluster of cognitive processes, sometimes helpful but often misleading. Kahneman notes that thoughts come to mind in one of two ways: Either by “orderly computation,” which involves a series of stages of remembering rules and then applying them, or by perception, an evolutionary function that allows us to predict outcomes based on what we’re perceiving. (For instance, seeing a woman’s angry face helps us predict the general sentiment and disposition of what she’s about to say.) It is the latter mode that precipitates intuition. Kahneman explains the interplay:

There is no sharp line between intuition and perception. … Perception is predictive. . . . If you want to understand intuition, it is very useful to understand perception, because so many of the rules that apply to perception apply as well to intuitive thinking. Intuitive thinking is quite different from perception. Intuitive thinking has language. Intuitive thinking has a lot of word knowledge organized in different ways more than mere perception. But some very basic characteristics [of] perception are extended almost directly to intuitive thinking.

He then considers how the two types of mental operations established by modern cognitive science illuminate intuition:

Type 1 is automatic, effortless, often unconscious, and associatively coherent. . . . Type 2 is controlled, effortful, usually conscious, tends to be logically coherent, rule-governed. Perception and intuition are Type 1. … Type 2 is more controlled, slower, is more deliberate. . . . Type 2 is who we think we are. [And yet] if one made a film on this, Type 2 would be a secondary character who thinks that he is the hero because that’s who we think we are, but in fact, it’s Type 1 that does most of the work, and it’s most of the work that is completely hidden from us.

Type 1 also encompasses all of our practiced skills — for instance, driving, speaking, and understanding a language — which after a certain threshold of mastery enter autopilot mode. (Though this presents its own set of problems.) Underpinning that mode of thinking is our associative memory, which Kahneman unpacks:

You have to think of [your associative memory] as a huge repository of ideas, linked to each other in many ways, including causal links and other links, and activation spreading from ideas to other ideas until a small subset of that enormous network is illuminated, and the subset is what’s happening in the mind at the moment. You’re not conscious of it, you’re conscious of very little of it.

The Type 1 modality of thought gives rise to a System 1 of interpretation, which is at the heart of what we call “intuition” — but which is far less accurate and reliable than we like to believe:

System 1 infers and invents causes and intentions. [This] happens automatically. Infants have it. . . . We’re equipped … for the perception of causality.

It neglects ambiguity and suppresses doubt and … exaggerates coherence. Associative coherence [is] in large part where the marvels turn into flaws. We see a world that is vastly more coherent than the world actually is. That’s because of this coherence-creating mechanism that we have. We have a sense-making organ in our heads, and we tend to see things that are emotionally coherent, and that are associatively coherent.

Most treacherous of all is our tendency to use our very confidence — and overconfidence — as evidence itself:

What’s interesting is that many a time people have intuitions that they’re equally confident about except they’re wrong. That happens through the mechanism I call “the mechanism of substitution.” You have been asked a question, and instead you answer another question, but that answer comes by itself with complete confidence, and you’re not aware that you’re doing something that you’re not an expert on because you have one answer. Subjectively, whether it’s right or wrong, it feels exactly the same. Whether it’s based on a lot of information, or a little information, this is something that you may step back and have a look at. But the subjective sense of confidence can be the same for intuition that arrives from expertise, and for intuitions that arise from heuristics. . . .

In other words, intuition, like attention, is “an intentional, unapologetic discriminator [that] asks what is relevant right now, and gears us up to notice only that” — a humbling antidote to our culture’s propensity for self-righteousness, and above all a reminder to allow yourself the uncomfortable luxury of changing your mind.

Originally featured in October — read the full article here.

9. MANAGE YOUR DAY-TO-DAY

We seem to have a strange but all too human cultural fixation on the daily routines and daily rituals of famous creators, from Vonnegut to Burroughs to Darwin — as if a glimpse of their day-to-day would somehow magically infuse ours with equal potency, or replicating it would allow us to replicate their genius in turn. And though much of this is mere cultural voyeurism, there is something to be said for the value of a well-engineered daily routine to anchor the creative process. Manage Your Day-to-Day: Build Your Routine, Find Your Focus, and Sharpen Your Creative Mind (public library), edited by Behance’s 99U editor-in-chief Jocelyn Glei, delves into the secrets of this holy grail of creativity. Twenty of today’s most celebrated thinkers and doers explore such facets of the creative life as optimizing your idea-generation, defying the demons of perfectionism, managing procrastination, and breaking through your creative blocks, with insights from magnificent minds ranging from behavioral economist Dan Ariely to beloved graphic designer Stefan Sagmeister.

In the foreword to the book, Behance founder Scott Belsky, author of the indispensable Making Ideas Happen, points to “reactionary workflow” — our tendency to respond to requests and other stimuli rather than create meaningful work — as today’s biggest problem and propounds a call to arms:

It’s time to stop blaming our surroundings and start taking responsibility. While no workplace is perfect, it turns out that our gravest challenges are a lot more primal and personal. Our individual practices ultimately determine what we do and how well we do it. Specifically, it’s our routine (or lack thereof), our capacity to work proactively rather than reactively, and our ability to systematically optimize our work habits over time that determine our ability to make ideas happen.

[…]

Only by taking charge of your day-to-day can you truly make an impact in what matters most to you. I urge you to build a better routine by stepping outside of it, find your focus by rising above the constant cacophony, and sharpen your creative prowess by analyzing what really matters most when it comes to making your ideas happen.

One of the book’s strongest insights comes from Gretchen Rubin — author of The Happiness Project: Or, Why I Spent a Year Trying to Sing in the Morning, Clean My Closets, Fight Right, Read Aristotle, and Generally Have More Fun, one of these 7 essential books on the art and science of happiness, named after her fantastic blog — who points to frequency as the key to creative accomplishment:

We tend to overestimate what we can do in a short period, and underestimate what we can do over a long period, provided we work slowly and consistently. Anthony Trollope, the nineteenth-century writer who managed to be a prolific novelist while also revolutionizing the British postal system, observed, “A small daily task, if it be really daily, will beat the labours of a spasmodic Hercules.” Over the long run, the unglamorous habit of frequency fosters both productivity and creativity.

Frequency, she argues, helps facilitate what Arthur Koestler has famously termed “bisociation” — the crucial ability to link the seemingly unlinkable, which is the defining characteristic of the creative mind. Rubin writes:

You’re much more likely to spot surprising relationships and to see fresh connections among ideas, if your mind is constantly humming with issues related to your work. When I’m deep in a project, everything I experience seems to relate to it in a way that’s absolutely exhilarating. The entire world becomes more interesting. That’s critical, because I have a voracious need for material, and as I become hyperaware of potential fodder, ideas pour in. By contrast, working sporadically makes it hard to keep your focus. It’s easy to become blocked, confused, or distracted, or to forget what you were aiming to accomplish.

[…]

Creativity arises from a constant churn of ideas, and one of the easiest ways to encourage that fertile froth is to keep your mind engaged with your project. When you work regularly, inspiration strikes regularly.

Echoing Alexander Graham Bell, who memorably wrote that “it is the man who carefully advances step by step … who is bound to succeed in the greatest degree,” and Virginia Woolf, who extolled the creative benefits of keeping a diary, Rubin writes:

Step by step, you make your way forward. That’s why practices such as daily writing exercises or keeping a daily blog can be so helpful. You see yourself do the work, which shows you that you can do the work. Progress is reassuring and inspiring; panic and then despair set in when you find yourself getting nothing done day after day. One of the painful ironies of work life is that the anxiety of procrastination often makes people even less likely to buckle down in the future.

Riffing on wisdom from her latest book, Happier at Home: Kiss More, Jump More, Abandon a Project, Read Samuel Johnson, and My Other Experiments in the Practice of Everyday Life, Rubin offers:

I have a long list of “Secrets of Adulthood,” the lessons I’ve learned as I’ve grown up, such as: “It’s the task that’s never started that’s more tiresome,” “The days are long, but the years are short,” and “Always leave plenty of room in the suitcase.” One of my most helpful Secrets is, “What I do every day matters more than what I do once in a while.”

With a sentiment reminiscent of William James’s timeless words on habit, she concludes:

Day by day, we build our lives, and day by day, we can take steps toward making real the magnificent creations of our imaginations.

Entrepreneurship guru and culture-sage Seth Godin seconds Rubin and admonishes against confusing vacant ritualization with creative rituals that actually spur productivity:

Everybody who does creative work has figured out how to deal with their own demons to get their work done. There is no evidence that setting up your easel like Van Gogh makes you paint better. Tactics are idiosyncratic. But strategies are universal, and there are a lot of talented folks who are not succeeding the way they want to because their strategies are broken.

The strategy is simple, I think. The strategy is to have a practice, and what it means to have a practice is to regularly and reliably do the work in a habitual way.

There are many ways you can signify to yourself that you are doing your practice. For example, some people wear a white lab coat or a particular pair of glasses, or always work in a specific place — in doing these things, they are professionalizing their art.

He echoes Chuck Close (“Inspiration is for amateurs — the rest of us just show up and get to work.”), Tchaikovsky (“a self-respecting artist must not fold his hands on the pretext that he is not in the mood.”), E. B. White (“A writer who waits for ideal conditions under which to work will die without putting a word on paper.”), and Isabel Allende (“Show up, show up, show up, and after a while the muse shows up, too.”), observing:

The notion that I do my work here, now, like this, even when I do not feel like it, and especially when I do not feel like it, is very important. Because lots and lots of people are creative when they feel like it, but you are only going to become a professional if you do it when you don’t feel like it. And that emotional waiver is why this is your work and not your hobby.

Originally featured in May — read the full article here. Also of note: 99U’s sequel, Maximize Your Potential, which collects practical wisdom from 21 celebrated creative entrepreneurs.

10. GIVE AND TAKE

“The principle of give and take; that is diplomacy — give one and take ten,” Mark Twain famously smirked. But for every such cynicism, there’s a heartening meditation on the art of asking and the beautiful osmosis of altruism. “The world is just,” Amelia Barr admonished in her rules for success, “it may, it does, patronize quacks; but it never puts them on a level with true men.” After all, it pays to be nice because, as Austin Kleon put it, “the world is a small town,” right?

Well, maybe — maybe not. However just the world may be, how givers and takers fare in matters of success proves to be more complicated. So argues organizational psychology wunderkind Adam Grant (remember him?), the youngest-tenured and highest-rated Wharton professor at my alma mater, in Give and Take: A Revolutionary Approach to Success (public library).

Grant’s extensive research has shed light on a crucial element of success, debunking some enduring tenets of cultural mythology:

According to conventional wisdom, highly successful people have three things in common: motivation, ability, and opportunity. If we want to succeed, we need a combination of hard work, talent, and luck. [But there is] a fourth ingredient, one that’s critical but often neglected: success depends heavily on how we approach our interactions with other people. Every time we interact with another person at work, we have a choice to make: do we try to claim as much value as we can, or contribute value without worrying about what we receive in return?

At the heart of his insight is a dichotomy of behavioral styles people adopt in pursuing success:

Takers have a distinctive signature: they like to get more than they give. They tilt reciprocity in their own favor, putting their own interests ahead of others’ needs. Takers believe that the world is a competitive, dog-eat-dog place. They feel that to succeed, they need to be better than others. To prove their competence, they self-promote and make sure they get plenty of credit for their efforts. Garden-variety takers aren’t cruel or cutthroat; they’re just cautious and self-protective. “If I don’t look out for myself first,” takers think, “no one will.”

Grant contrasts takers with givers:

In the workplace, givers are a relatively rare breed. They tilt reciprocity in the other direction, preferring to give more than they get. Whereas takers tend to be self-focused, evaluating what other people can offer them, givers are other-focused, paying more attention to what other people need from them. These preferences aren’t about money: givers and takers aren’t distinguished by how much they donate to charity or the compensation that they command from their employers. Rather, givers and takers differ in their attitudes and actions toward other people. If you’re a taker, you help others strategically, when the benefits to you outweigh the personal costs. If you’re a giver, you might use a different cost-benefit analysis: you help whenever the benefits to others exceed the personal costs. Alternatively, you might not think about the personal costs at all, helping others without expecting anything in return. If you’re a giver at work, you simply strive to be generous in sharing your time, energy, knowledge, skills, ideas, and connections with other people who can benefit from them.

Outside the workplace, Grant argues by citing Yale psychologist Margaret Clark’s research, most of us are givers in close relationships like marriages and friendships, contributing without preoccupation with keeping score. In the workplace, however, few of us are purely givers or takers — rather, what dominates is a third style:

We become matchers, striving to preserve an equal balance of giving and getting. Matchers operate on the principle of fairness: when they help others, they protect themselves by seeking reciprocity. If you’re a matcher, you believe in tit for tat, and your relationships are governed by even exchanges of favors.

True to psychologists’ repeated insistence that personality is fluid rather than fixed, Grant notes:

Giving, taking, and matching are three fundamental styles of social interaction, but the lines between them aren’t hard and fast. You might find that you shift from one reciprocity style to another as you travel across different work roles and relationships. It wouldn’t be surprising if you act like a taker when negotiating your salary, a giver when mentoring someone with less experience than you, and a matcher when sharing expertise with a colleague. But evidence shows that at work, the vast majority of people develop a primary reciprocity style, which captures how they approach most of the people most of the time. And this primary style can play as much of a role in our success as hard work, talent, and luck.

Originally featured in April — for a closer look at Grant’s findings on the science of success, read the full article here.

11. THE EXAMINED LIFE

Despite ample evidence and countless testaments to the contrary, there persists a toxic cultural mythology that creative and intellectual excellence comes from a passive gift bestowed upon the fortunate few by the gods of genius, rather than being the product of the active application and consistent cultivation of skill. So what might the root of that stubborn fallacy be? Childhood and upbringing, it turns out, might have a lot to do with it.

In The Examined Life: How We Lose and Find Ourselves (public library), psychoanalyst and University College London professor Stephen Grosz builds on more than 50,000 hours of conversation from his quarter-century experience as a practicing psychoanalyst to explore the machinery of our inner life, with insights that are invariably profound and often provocative — for instance, a section titled “How praise can cause a loss of confidence,” in which Grosz writes:

Nowadays, we lavish praise on our children. Praise, self-confidence and academic performance, it is commonly believed, rise and fall together. But current research suggests otherwise — over the past decade, a number of studies on self-esteem have come to the conclusion that praising a child as ‘clever’ may not help her at school. In fact, it might cause her to under-perform. Often a child will react to praise by quitting — why make a new drawing if you have already made ‘the best’? Or a child may simply repeat the same work — why draw something new, or in a new way, if the old way always gets applause?

Grosz cites psychologists Carol Dweck and Claudia Mueller’s famous 1998 study, which divided 128 children ages 10 and 11 into two groups. All were asked to solve mathematical problems, but one group was praised for intellect (“You did really well, you’re so clever.”) while the other was praised for effort (“You did really well, you must have tried really hard.”). The kids were then given more complex problems, which those previously praised for their hard work approached with dramatically greater resilience and willingness to try different approaches whenever they reached a dead end. By contrast, those who had been praised for their cleverness were much more anxious about failure, stuck with tasks they had already mastered, and dwindled in tenacity in the face of new problems. Grosz summarizes the now-legendary findings:

Ultimately, the thrill created by being told ‘You’re so clever’ gave way to an increase in anxiety and a drop in self-esteem, motivation and performance. When asked by the researchers to write to children in another school, recounting their experience, some of the ‘clever’ children lied, inflating their scores. In short, all it took to knock these youngsters’ confidence, to make them so unhappy that they lied, was one sentence of praise.

He goes on to admonish against today’s culture of excessive parental praise, which he argues does more for lifting the self-esteem of the parents than for cultivating a healthy one in their children:

Admiring our children may temporarily lift our self-esteem by signaling to those around us what fantastic parents we are and what terrific kids we have — but it isn’t doing much for a child’s sense of self. In trying so hard to be different from our parents, we’re actually doing much the same thing — doling out empty praise the way an earlier generation doled out thoughtless criticism. If we do it to avoid thinking about our child and her world, and about what our child feels, then praise, just like criticism, is ultimately expressing our indifference.

To explore what the healthier substitute for praise might be, he recounts his observations of Charlotte Stiglitz — an eighty-year-old remedial reading teacher and mother of the Nobel Prize-winning economist Joseph Stiglitz — who told him of her teaching methodology:

‘I don’t praise a small child for doing what they ought to be able to do,’ she told me. ‘I praise them when they do something really difficult — like sharing a toy or showing patience. I also think it is important to say “thank you”. When I’m slow in getting a snack for a child, or slow to help them and they have been patient, I thank them. But I wouldn’t praise a child who is playing or reading.’

Rather than utilizing the familiar mechanisms of reward and punishment, Grosz observed, Charlotte’s method relied on keen attentiveness to “what a child did and how that child did it.” Presence, he argues, helps build the child’s confidence by indicating that she is worthy of the observer’s thoughts and attention — its absence, on the other hand, divorces, in the child’s mind, the journey from the destination by instilling a sense that the activity itself is worthless unless it’s a means to obtaining praise. Grosz reminds us how this plays out for all of us, and why it matters throughout life:

Being present, whether with children, with friends, or even with oneself, is always hard work. But isn’t this attentiveness — the feeling that someone is trying to think about us — something we want more than praise?

Originally featured in May — read the full article here.

12. TO SELL IS HUMAN

Whether it’s “selling” your ideas, your writing, or yourself to a potential mate, the art of the sell is crucial to your fulfillment in life, both personal and professional. So argues Dan Pink in To Sell Is Human: The Surprising Truth About Moving Others (public library; UK) — a provocative anatomy of the art-science of “selling” in the broadest possible sense of the word, substantiated by ample research spanning psychology, behavioral economics, and the social sciences.

Pink, wary of the disagreeable twinges accompanying the claim that everyone should self-identify as a salesperson, preemptively counters in the introduction:

I’m convinced we’ve gotten it wrong.

This is a book about sales. But it is unlike any book about sales you have read (or ignored) before. That’s because selling in all its dimensions — whether pushing Buicks on a lot or pitching ideas in a meeting — has changed more in the last ten years than it did over the previous hundred. Most of what we think we understand about selling is constructed atop a foundation of assumptions that have crumbled.

[…]

Selling, I’ve grown to understand, is more urgent, more important, and, in its own sweet way, more beautiful than we realize. The ability to move others to exchange what they have for what we have is crucial to our survival and our happiness. It has helped our species evolve, lifted our living standards, and enhanced our daily lives. The capacity to sell isn’t some unnatural adaptation to the merciless world of commerce. It is part of who we are.

One of Pink’s most fascinating arguments echoes artist Chuck Close, who famously noted that “our whole society is much too problem-solving oriented. It is far more interesting to [participate in] ‘problem creation.’” Pink cites the research of celebrated social scientists Jacob Getzels and Mihaly Csikszentmihalyi, who in the 1960s recruited three dozen fourth-year art students for an experiment. They brought the young artists into a studio with two large tables. The first table displayed 27 eclectic objects that the school used in its drawing classes. The students were instructed to select one or more objects, then arrange a still life on the second table and draw it. What happened next reveals an essential pattern about how creativity works:

The young artists approached their task in two distinct ways. Some examined relatively few objects, outlined their idea swiftly, and moved quickly to draw their still life. Others took their time. They handled more objects, turned them this way and that, rearranged them several times, and needed much longer to complete the drawing. As Csikszentmihalyi saw it, the first group was trying to solve a problem: How can I produce a good drawing? The second was trying to find a problem: What good drawing can I produce?

When Csikszentmihalyi assembled a group of art experts to evaluate the resulting works, he found that the problem-finders’ drawings had been ranked much higher in creativity than the problem-solvers’. Ten years later, the researchers tracked down these art students, who by then were working for a living, and found that about half had left the art world, while the other half had gone on to become professional artists. That latter group was composed almost entirely of problem-finders. Another decade later, the researchers checked in again and discovered that the problem-finders were “significantly more successful — by the standards of the artistic community — than their peers.” Getzels concluded:

It is in fact the discovery and creation of problems rather than any superior knowledge, technical skill, or craftsmanship that often sets the creative person apart from others in his field.

Pink summarizes:

The more compelling view of the nature of problems has enormous implications for the new world of selling. Today, both sales and non-sales selling depend more on the creative, heuristic, problem-finding skills of artists than on the reductive, algorithmic, problem-solving skills of technicians.

Another fascinating chapter reveals counterintuitive insights about the competitive advantages of introversion versus extraversion. While newer theories extol the power of introverts just as traditional business wisdom exalted extraversion, the truth turns out to be quite different. Pink turns to the research of social psychologist Adam Grant, management professor at the University of Pennsylvania’s Wharton School (my alma mater).

Grant measured where a sample of call center sales representatives fell on the introversion-extraversion spectrum, then correlated that with their actual sales figures. Unsurprisingly, Grant found that extraverts averaged $125 per hour in revenue, exceeding introverts’ $120. His most surprising finding, however, was that “ambiverts” — those who fell in the middle of the spectrum, “not too hot, not too cold” — performed best of all, with an hourly average of $155. The outliers who brought in an astounding $208 per hour scored a solid 4 on the 1-7 introversion-extraversion scale.

Pink synthesizes the findings into an everyday insight for the rest of us:

The best approach is for the people on the ends to emulate those in the center. As some have noted, introverts are ‘geared to inspect,’ while extraverts are ‘geared to respond.’ Selling of any sort — whether traditional sales or non-sales selling — requires a delicate balance of inspecting and responding. Ambiverts can find that balance. They know when to speak and when to shut up. Their wider repertoires allow them to achieve harmony with a broader range of people and a more varied set of circumstances. Ambiverts are the best movers because they’re the most skilled attuners.

Pink goes on to outline “the new ABCs of moving others” — attunement (“the ability to bring one’s actions and outlook into harmony with other people and with the context you’re in”), buoyancy (a trifecta of “interrogative self-talk” that moves from making statements to asking questions, contagious “positivity,” and an optimistic “explanatory style” of explaining negative events to yourself), and clarity (“the capacity to help others see their situations in fresh and more revealing ways and to identify problems they didn’t realize they had”).

Originally featured in February — read the full article here, where you can watch the charming companion video.

13. HOW TO STAY SANE

“I pray to Jesus to preserve my sanity,” Jack Kerouac professed in discussing his writing routine. But those of us who fall on the more secular end of the spectrum might need a slightly more potent sanity-preservation tool than prayer. That’s precisely what writer and psychotherapist Philippa Perry offers in How To Stay Sane (public library; UK), part of The School of Life’s wonderful series reclaiming the traditional self-help genre as intelligent, non-self-helpy, yet immensely helpful guides to modern living.

At the heart of Perry’s argument — in line with neurologist Oliver Sacks’s recent meditation on memory and how “narrative truth,” rather than “historical truth,” shapes our impression of the world — is the recognition that stories make us human and that learning to reframe our interpretations of reality is key to our experience of life:

Our stories give shape to our inchoate, disparate, fleeting impressions of everyday life. They bring together the past and the future into the present to provide us with structures for working towards our goals. They give us a sense of identity and, most importantly, serve to integrate the feelings of our right brain with the language of our left.

[…]

We are primed to use stories. Part of our survival as a species depended upon listening to the stories of our tribal elders as they shared parables and passed down their experience and the wisdom of those who went before. As we get older it is our short-term memory that fades rather than our long-term memory. Perhaps we have evolved like this so that we are able to tell the younger generation about the stories and experiences that have formed us which may be important to subsequent generations if they are to thrive.

I worry, though, about what might happen to our minds if most of the stories we hear are about greed, war and atrocity.

Perry goes on to cite research indicating that people who watch television for more than four hours a day see themselves as far more likely to fall victim to a violent incident in the forthcoming week than their peers who watch less than two hours a day. Just as E. B. White advocated for the responsibility of the writer “to lift people up, not lower them down,” so too is it our responsibility, as the writers of our own life-stories, to avoid the well-documented negativity bias of modern media — because, as artist Austin Kleon wisely put it, “you are a mashup of what you let into your life.” Perry writes:

Be careful which stories you expose yourself to.

[…]

The meanings you find, and the stories you hear, will have an impact on how optimistic you are: it’s how we evolved. … If you do not know how to draw positive meaning from what happens in life, the neural pathways you need to appreciate good news will never fire up.

[…]

The trouble is, if we do not have a mind that is used to hearing good news, we do not have the neural pathways to process such news.

Yet despite the adaptive optimism bias of the human brain, Perry argues that a positive outlook is a practice — one that requires mastering the art of vulnerability and increasing our essential tolerance for uncertainty:

You may find that you have been telling yourself that practicing optimism is a risk, as though, somehow, a positive attitude will invite disaster and so if you practice optimism it may increase your feelings of vulnerability. The trick is to increase your tolerance for vulnerable feelings, rather than avoid them altogether.

[…]

Optimism does not mean continual happiness, glazed eyes and a fixed grin. When I talk about the desirability of optimism I do not mean that we should delude ourselves about reality. But practicing optimism does mean focusing more on the positive fall-out of an event than on the negative. … I am not advocating the kind of optimism that means you blow all your savings on a horse running at a hundred to one; I am talking about being optimistic enough to sow some seeds in the hope that some of them will germinate and grow into flowers.

Another key obstruction to our sanity is our chronic aversion to being wrong, entwined with our damaging fear of the unfamiliar. Perry cautions:

We all like to think we keep an open mind and can change our opinions in the light of new evidence, but most of us seem to be geared to making up our minds very quickly. Then we process further evidence not with an open mind but with a filter, only acknowledging the evidence that backs up our original impression. It is too easy for us to fall into the trap of believing that being right is more important than being open to what might be.

If we practice detachment from our thoughts we learn to observe them as though we are taking a bird’s eye view of our own thinking. When we do this, we might find that our thinking belongs to an older, and different, story to the one we are now living.

Perry concludes:

We need to look at the repetitions in the stories we tell ourselves [and] at the process of the stories rather than merely their surface content. Then we can begin to experiment with changing the filter through which we look at the world, start to edit the story and thus regain flexibility where we have been getting stuck.

Originally featured in February — read the full article here.

