16 Overall Favorite Books of 2016

From loneliness to love to black holes, by way of Neil Gaiman, Annie Dillard, and Mary Oliver.

To look back on any period of reading with the intention of selecting one’s favorite books is a curious two-way time machine — one must scoop the memory of a past and filter it through the sieve of an indefinite future in an effort to discern which books have left a mark on one’s conscience deep enough to last a lifetime. Of the many books I read in 2016, these are the sixteen that moved me most deeply and memorably. And since I stand with Susan Sontag, who considered reading an act of rebirth, I invite you to revisit the annual favorites for 2015, 2014, and 2013.

THE LONELY CITY

“You are born alone. You die alone. The value of the space in between is trust and love,” artist Louise Bourgeois wrote in her diary at the end of a long and illustrious life as she contemplated how solitude enriches creative work. It’s a lovely sentiment, but as empowering as it may be to those willing to embrace solitude, it can be tremendously lonesome-making to those for whom loneliness has contracted the space of trust and love into a suffocating penitentiary. For if in solitude, as Wendell Berry memorably wrote, “one’s inner voices become audible [and] one responds more clearly to other lives,” in loneliness one’s inner scream becomes deafening, deadening, severing any thread of connection to other lives.

How to break free of that prison and reinhabit the space of trust and love is what Olivia Laing explores in The Lonely City: Adventures in the Art of Being Alone (public library) — an extraordinary more-than-memoir; a sort of memoir-plus-plus, partway between Helen Macdonald’s H Is for Hawk and the diary of Virginia Woolf; a lyrical account of wading through a period of self-expatriation, both physical and psychological, in which Laing paints an intimate portrait of loneliness as “a populated place: a city in itself.”

After the sudden collapse of a romance marked by extreme elation, Laing left her native England and took her shattered heart to New York, “that teeming island of gneiss and concrete and glass.” The daily, bone-deep loneliness she experienced there was both paralyzing in its all-consuming potency and, paradoxically, a strange invitation to aliveness. Indeed, her choice to leave home and wander a foreign city is itself a rich metaphor for the paradoxical nature of loneliness, animated by equal parts restlessness and stupor, capable of turning one into a voluntary vagabond and a catatonic recluse all at once, yet somehow a vitalizing laboratory for self-discovery. The pit of loneliness, she found, could “drive one to consider some of the larger questions of what it is to be alive.”

She writes:

There were things that burned away at me, not only as a private individual, but also as a citizen of our century, our pixelated age. What does it mean to be lonely? How do we live, if we’re not intimately engaged with another human being? How do we connect with other people, particularly if we don’t find speaking easy? Is sex a cure for loneliness, and if it is, what happens if our body or sexuality is considered deviant or damaged, if we are ill or unblessed with beauty? And is technology helping with these things? Does it draw us closer together, or trap us behind screens?

Bedeviled by this acute emotional anguish, Laing seeks consolation in the great patron saints of loneliness in twentieth-century creative culture. From this eclectic tribe of the lonesome — including Jean-Michel Basquiat, Alfred Hitchcock, Peter Hujar, Billie Holiday, and Nan Goldin — Laing chooses four artists as her companions charting the terra incognita of loneliness: Edward Hopper, Andy Warhol, Henry Darger, and David Wojnarowicz, who had all “grappled in their lives as well as work with loneliness and its attendant issues.”

Art by Isol from Daytime Visions

Laing examines the particular, pervasive form of loneliness in the eye of a city aswirl with humanity:

Imagine standing by a window at night, on the sixth or seventeenth or forty-third floor of a building. The city reveals itself as a set of cells, a hundred thousand windows, some darkened and some flooded with green or white or golden light. Inside, strangers swim to and fro, attending to the business of their private hours. You can see them, but you can’t reach them, and so this commonplace urban phenomenon, available in any city of the world on any night, conveys to even the most social a tremor of loneliness, its uneasy combination of separation and exposure.

You can be lonely anywhere, but there is a particular flavour to the loneliness that comes from living in a city, surrounded by millions of people. One might think this state was antithetical to urban living, to the massed presence of other human beings, and yet mere physical proximity is not enough to dispel a sense of internal isolation. It’s possible – easy, even – to feel desolate and unfrequented in oneself while living cheek by jowl with others. Cities can be lonely places, and in admitting this we see that loneliness doesn’t necessarily require physical solitude, but rather an absence or paucity of connection, closeness, kinship: an inability, for one reason or another, to find as much intimacy as is desired. Unhappy, as the dictionary has it, as a result of being without the companionship of others. Hardly any wonder, then, that it can reach its apotheosis in a crowd.

There is, of course, a universe of difference between solitude and loneliness — two radically different interior orientations toward the same exterior circumstance of lacking companionship. We speak of “fertile solitude” as a developmental achievement essential for our creative capacity, but loneliness is barren and destructive; it cottons in apathy the will to create. More than that, it seems to signal an existential failing — a social stigma the nuances of which Laing addresses beautifully:

Loneliness is difficult to confess; difficult too to categorise. Like depression, a state with which it often intersects, it can run deep in the fabric of a person, as much a part of one’s being as laughing easily or having red hair. Then again, it can be transient, lapping in and out in reaction to external circumstance, like the loneliness that follows on the heels of a bereavement, break-up or change in social circles.

Like depression, like melancholy or restlessness, it is subject too to pathologisation, to being considered a disease. It has been said emphatically that loneliness serves no purpose… Perhaps I’m wrong, but I don’t think any experience so much a part of our common shared lives can be entirely devoid of meaning, without a richness and a value of some kind.

Dive deeper here.

HOPE IN THE DARK

I think a great deal about what it means to live with hope and sincerity in the age of cynicism, about how we can continue standing at the gates of hope as we’re being bombarded with news of hopeless acts of violence, as we’re confronted daily with what Marcus Aurelius called the “meddling, ungrateful, arrogant, dishonest, jealous, and surly.”

I’ve found no more lucid and luminous a defense of hope than the one Rebecca Solnit launches in Hope in the Dark: Untold Histories, Wild Possibilities (public library) — a slim, potent book that has grown only more relevant and poignant in the decade since its original publication in the wake of the Bush administration’s invasion of Iraq, recently reissued with a new introduction by Solnit.

Rebecca Solnit (Photograph: Sallie Dean Shatz)

We lose hope, Solnit suggests, because we lose perspective — we lose sight of the “accretion of incremental, imperceptible changes” which constitute progress and which render our era dramatically different from the past, a contrast obscured by the undramatic nature of gradual transformation punctuated by occasional tumult. She writes:

There are times when it seems as though not only the future but the present is dark: few recognize what a radically transformed world we live in, one that has been transformed not only by such nightmares as global warming and global capital, but by dreams of freedom and of justice — and transformed by things we could not have dreamed of… We need to hope for the realization of our own dreams, but also to recognize a world that will remain wilder than our imaginations.

Solnit — one of the most singular, civically significant, and poetically potent voices of our time, emanating echoes of Virginia Woolf’s luminous prose and Adrienne Rich’s unflinching political conviction — looks back on the seemingly distant past as she peers forward into the near future:

The moment passed long ago, but despair, defeatism, cynicism, and the amnesia and assumptions from which they often arise have not dispersed, even as the most wildly, unimaginably magnificent things came to pass. There is a lot of evidence for the defense… Progressive, populist, and grassroots constituencies have had many victories. Popular power has continued to be a profound force for change. And the changes we’ve undergone, both wonderful and terrible, are astonishing.

[…]

This is an extraordinary time full of vital, transformative movements that could not be foreseen. It’s also a nightmarish time. Full engagement requires the ability to perceive both.

Engage more fully here.

UPSTREAM

To read Mary Oliver is to be read by her — to be made real by her words, to have the richest subterranean truths of your own experience mirrored back to you with tenfold the luminosity. Her prose collection Upstream: Selected Essays (public library) is a book of uncommon enchantment, containing Oliver’s largehearted wisdom on writing, creative work, and the art of life.

In one particularly satisfying piece from the volume, titled “Of Power and Time,” Oliver writes:

The working, concentrating artist is an adult who refuses interruption from himself, who remains absorbed and energized in and by the work — who is thus responsible to the work… Serious interruptions to work, therefore, are never the inopportune, cheerful, even loving interruptions which come to us from another.

[…]

It is six A.M., and I am working. I am absentminded, reckless, heedless of social obligations, etc. It is as it must be. The tire goes flat, the tooth falls out, there will be a hundred meals without mustard. The poem gets written. I have wrestled with the angel and I am stained with light and I have no shame. Neither do I have guilt. My responsibility is not to the ordinary, or the timely. It does not include mustard, or teeth. It does not extend to the lost button, or the beans in the pot. My loyalty is to the inner vision, whenever and howsoever it may arrive. If I have a meeting with you at three o’clock, rejoice if I am late. Rejoice even more if I do not arrive at all.

There is no other way work of artistic worth can be done. And the occasional success, to the striver, is worth everything. The most regretful people on earth are those who felt the call to creative work, who felt their own creative power restive and uprising, and gave to it neither power nor time.

For a richer taste of this feast for the mind, heart, and spirit, see Oliver on how books saved her life, time and the artist’s task, and the central commitment of the creative life.

BLACK HOLE BLUES

In Black Hole Blues and Other Songs from Outer Space (public library), which crowns the year’s finest science books, cosmologist and novelist Janna Levin tells the story of the century-long vision, originated by Einstein, and the half-century experimental quest to hear the sound of spacetime by detecting a gravitational wave. This book remains one of the most intensely interesting and beautifully written I’ve ever encountered — the kind that comes about once a generation if we’re lucky.

Everything we know about the universe so far comes from four centuries of sight — from peering into space with our eyes and their prosthetic extension, the telescope. Now commences a new mode of knowing the cosmos through sound. The detection of gravitational waves is one of the most significant discoveries in the entire history of physics, marking the dawn of a new era as we begin listening to the sound of space — the probable portal to mysteries as unimaginable to us today as galaxies and nebulae and pulsars and other cosmic wonders were to the first astronomers. Gravitational astronomy, as Levin elegantly puts it, promises a “score to accompany the silent movie humanity has compiled of the history of the universe from still images of the sky, a series of frozen snapshots captured over the past four hundred years since Galileo first pointed a crude telescope at the Sun.”

Astonishingly enough, Levin wrote the book before the Laser Interferometer Gravitational-Wave Observatory (LIGO) — the monumental instrument at the center of the story, decades in the making — made the actual detection of a ripple in the fabric of spacetime caused by the collision of two black holes in the autumn of 2015, exactly a century after Einstein first envisioned the possibility of gravitational waves. So the story she tells is not that of the triumph but that of the climb, which renders it all the more enchanting — because it is ultimately a story about the human spirit and its incredible tenacity, about why human beings choose to devote their entire lives to pursuits strewn with unimaginable obstacles and bedeviled by frequent failure, uncertain rewards, and meager public recognition.

Indeed, what makes the book interesting is that it tells the story of this monumental discovery, but what makes it enchanting is that Levin comes at it from a rather unusual perspective. She is a working astrophysicist who studies black holes, but she is also an incredibly gifted novelist — an artist whose medium is language and thought itself. This is no popular science book but something many orders of magnitude higher in its artistic vision, the impeccable craftsmanship of language, and the sheer pleasure of the prose. The story is structured almost as a series of short, integrated novels, with each chapter devoted to one of the key scientists involved in LIGO. With Dostoyevskian insight and nuance, Levin paints a psychological, even philosophical portrait of each protagonist, revealing how intricately interwoven the genius and the foibles are in the fabric of personhood and what a profoundly human endeavor science ultimately is.

She writes:

Scientists are like those levers or knobs or those boulders helpfully screwed into a climbing wall. Like the wall is some cemented material made by mixing knowledge, which is a purely human construct, with reality, which we can only access through the filter of our minds. There’s an important pursuit of objectivity in science and nature and mathematics, but still the only way up the wall is through the individual people, and they come in specifics… So the climb is personal, a truly human endeavor, and the real expedition pixelates into individuals, not Platonic forms.

For a taste of this uncategorizably wonderful book, see Levin on the story of the tragic hero who pioneered gravitational astronomy and how astronomer Jocelyn Bell discovered pulsars.

TIME TRAVEL

Time Travel: A History (public library) by science historian and writer extraordinaire James Gleick, another rare enchanter of science, is not a “science book” per se, in that although it draws heavily on the history of twentieth-century science and quantum physics in particular (as well as on millennia of philosophy), it is a decidedly literary inquiry into our temporal imagination — why we think about time, why its directionality troubles us so, and what asking these questions at all reveals about the deepest mysteries of our consciousness. I consider it a grand thought experiment, using physics and philosophy as the active agents, and literature as the catalyst.

Gleick, who examined the origin of our modern anxiety about time with remarkable prescience nearly two decades ago, traces the invention of the notion of time travel to H.G. Wells’s 1895 masterpiece The Time Machine. Although Wells — like Gleick, like any reputable physicist — knew that time travel was a scientific impossibility, he created an aesthetic of thought which never previously existed and which has since shaped the modern consciousness. Gleick argues that the art this aesthetic produced — an entire canon of time travel literature and film — not only permeated popular culture but even influenced some of the greatest scientific minds of the past century, including Stephen Hawking, who once cleverly hosted a party for time travelers and when no one showed up considered the impossibility of time travel proven, and John Archibald Wheeler, who popularized the term “black hole” and coined “wormhole,” both key tropes of time travel literature.

Gleick considers how a scientific impossibility can become such fertile ground for the artistic imagination:

Why do we need time travel, when we already travel through space so far and fast? For history. For mystery. For nostalgia. For hope. To examine our potential and explore our memories. To counter regret for the life we lived, the only life, one dimension, beginning to end.

Wells’s Time Machine revealed a turning in the road, an alteration in the human relationship with time. New technologies and ideas reinforced one another: the electric telegraph, the steam railroad, the earth science of Lyell and the life science of Darwin, the rise of archeology out of antiquarianism, and the perfection of clocks. When the nineteenth century turned to the twentieth, scientists and philosophers were primed to understand time in a new way. And so were we all. Time travel bloomed in the culture, its loops and twists and paradoxes.

I wrote about Gleick’s uncommonly pleasurable book at length here.

THE VIEW FROM THE CHEAP SEATS

Neil Gaiman is one of the most beloved storytellers of our time, unequaled at his singular brand of darkly delightful fantasy. His long-awaited nonfiction collection The View from the Cheap Seats (public library) celebrates a different side of Gaiman. Here stands a writer of firm conviction and porous curiosity, an idealist amid our morass of cynicism who, in revealing who he is, reveals who we are and who we can be if we only tried a little bit harder to wrest more goodness out of our imperfect humanity. An evangelist for the righteous without a shred of our culture’s pathological self-righteousness, Gaiman jolts us out of our collective amnesia and reminds us again and again what matters: ideas over ideologies, public libraries, the integrity of children’s inner lives, the stories we choose to tell of why the world is the way it is, the moral obligation to imagine better stories — and, oh, the sheer fun of it all.

Neil Gaiman (Photograph: Amanda Palmer)

Among the many gems in the collection, which include Gaiman’s meditations on why we read and the power of cautionary questions, is a particularly timely short piece titled “Credo,” in which Gaiman writes:

I believe that it is difficult to kill an idea because ideas are invisible and contagious, and they move fast.

I believe that you can set your own ideas against ideas you dislike. That you should be free to argue, explain, clarify, debate, offend, insult, rage, mock, sing, dramatize, and deny.

I do not believe that burning, murdering, exploding people, smashing their heads with rocks (to let the bad ideas out), drowning them or even defeating them will work to contain ideas you do not like. Ideas spring up where you do not expect them, like weeds, and are as difficult to control.

I believe that repressing ideas spreads ideas.

Read more here.

HOLD STILL

“Memory is never a precise duplicate of the original… it is a continuing act of creation,” pioneering researcher Rosalind Cartwright wrote in distilling the science of the unconscious mind.

Although I lack early childhood memories, I do have one rather eidetic recollection: I remember standing before the barren elephant yard at the Sofia Zoo in Bulgaria, at age three or so, clad in a cotton polka-dot jumper. I remember squinting into a scowl as the malnourished elephant behind me swirls dirt into the air in front of her communism-stamped concrete edifice. I don’t remember the temperature, though I deduce from the memory of my outfit that it must have been summer. I don’t remember the smell of the elephant or the touch of the blown dirt on my skin, though I remember my grimace.

For most of my life, I held onto that memory as the sole surviving mnemonic fragment of my early childhood self. And then, one day in my late twenties, I discovered an old photo album tucked into the back of my grandmother’s cabinet in Bulgaria. It contained dozens of photographs of me, from birth until around age four, including one depicting that very vignette — down to the minutest detail of what I believed was my memory of that moment. There I was, scowling in my polka-dot jumper with the elephant and the cloud of dust behind me. In an instant, I realized that I had been holding onto a prosthetic memory — what I remembered was the photograph from that day, which I must have been shown at some point, and not the day itself, of which I have no other recollection. The question — and what a Borgesian question — remains whether one should prefer having such a prosthetic memory, constructed entirely of photographs stitched together into artificial cohesion, to having no memory at all.

That confounding parallax of personal history is what photographer Sally Mann explores throughout Hold Still: A Memoir with Photographs (public library) — a lyrical yet unsentimental meditation on art, mortality, and the lacuna between memory and myth, undergirded by what Mann calls her “long preoccupation with the treachery of memory” and “memory’s truth, which is to scientific, objective truth as a pearl is to a piece of sand.”

Sally Mann as a child

In a sentiment that calls to mind Oliver Sacks’s exquisite elucidation of how memory works, Mann writes:

Whatever of my memories hadn’t crumbled into dust must surely by now have been altered by the passage of time. I tend to agree with the theory that if you want to keep a memory pristine, you must not call upon it too often, for each time it is revisited, you alter it irrevocably, remembering not the original impression left by experience but the last time you recalled it. With tiny differences creeping in at each cycle, the exercise of our memory does not bring us closer to the past but draws us farther away.

I had learned over time to meekly accept whatever betrayals memory pulled over on me, allowing my mind to polish its own beautiful lie. In distorting the information it’s supposed to be keeping safe, the brain, to its credit, will often bow to some instinctive aesthetic wisdom, imparting to our life’s events a coherence, logic, and symbolic elegance that’s not present or not so obvious in the improbable, disheveled sloppiness of what we’ve actually been through.

Photograph: Sally Mann

Nearly half a century after Italo Calvino observed that “the life that you live in order to photograph it is already, at the outset, a commemoration of itself,” Mann traces this cultural pathology — now a full epidemic with the rise of the photo-driven social web — to the dawn of the medium itself. Reflecting on the discovery of a box of old photographs in her own family’s attic, she echoes Teju Cole’s assertion that “photography is at the nerve center of our paradoxical memorial impulses” and writes:

As far back as 1901 Émile Zola telegraphed the threat of this relatively new medium, remarking that you cannot claim to have really seen something until you have photographed it. What Zola perhaps also knew or intuited was that once photographed, whatever you had “really seen” would never be seen by the eye of memory again. It would forever be cut from the continuum of being, a mere sliver, a slight, translucent paring from the fat life of time; elegiac, one-dimensional, immediately assuming the amber quality of nostalgia: an instantaneous memento mori. Photography would seem to preserve our past and make it invulnerable to the distortions of repeated memorial superimpositions, but I think that is a fallacy: photographs supplant and corrupt the past, all the while creating their own memories. As I held my childhood pictures in my hands, in the tenderness of my “remembering,” I also knew that with each photograph I was forgetting.

Read more here.

ANGER AND FORGIVENESS

“We’ve got to be as clear-headed about human beings as possible, because we are still each other’s only hope,” James Baldwin told Margaret Mead in their terrific forgotten conversation about forgiveness and the difference between guilt and responsibility. “To forgive is to assume a larger identity than the person who was first hurt,” philosopher David Whyte echoed half a century later in contemplating anger, forgiveness, and what maturity really means. And yet the dance of anger and forgiveness, performed to the uncontrollable rhythm of trust, is perhaps the most difficult in human life, as well as one of the oldest.

The moral choreography of that dance is what philosopher Martha Nussbaum explores in Anger and Forgiveness: Resentment, Generosity, Justice (public library).

Martha Nussbaum

Nussbaum, who has previously examined the intelligence of the emotions and whom I consider the most incisive philosopher of our time, argues that despite anger’s long cultural history of being seen as morally justifiable and as a useful signal that wrongdoing has taken place, it is a normatively faulty response that masks deeper, more difficult emotions and stands in the way of resolving them. Consequently, forgiveness — which Nussbaum defines as “a change of heart on the part of the victim, who gives up anger and resentment in response to the offender’s confession and contrition” — is also warped into a transactional proposition wherein the wrongdoer must earn, through confession and apology, the wronged person’s morally superior grace.

Nussbaum outlines the core characteristics and paradoxes of anger:

Anger is an unusually complex emotion, since it involves both pain and pleasure [because] the prospect of retribution is pleasant… Anger also involves a double reference—to a person or people and to an act… The focus of anger is an act imputed to the target, which is taken to be a wrongful damage.

Injuries may be the focus in grief as well. But whereas grief focuses on the loss or damage itself, and lacks a target (unless it is the lost person, as in “I am grieving for so-and-so”), anger starts with the act that inflicted the damage, seeing it as intentionally inflicted by the target — and then, as a result, one becomes angry, and one’s anger is aimed at the target. Anger, then, requires causal thinking, and some grasp of right and wrong.

[…]

Notoriously, however, people sometimes get angry when they are frustrated by inanimate objects, which presumably cannot act wrongfully… In 1988, the Journal of the American Medical Association published an article on “vending machine rage”: fifteen injuries, three of them fatal, as a result of angry men kicking or rocking machines that had taken their money without dispensing the drink. (The fatal injuries were caused by machines falling over on the men and crushing them.)

Beneath this tragicomic response lies a combination of personal insecurity, vulnerability, and what Nussbaum calls status-injury (or what Aristotle called down-ranking) — the perception that the wrongdoer has lowered the social status of the wronged — conspiring to produce a state of exasperating helplessness. Anger, Nussbaum argues, is how we seek to create an illusion of control where we feel none.

Art by JooHee Yoon from The Tiger Who Would Be King, James Thurber’s parable of the destructiveness of status-seeking

She writes:

Anger is not always, but very often, about status-injury. And status-injury has a narcissistic flavor: rather than focusing on the wrongfulness of the act as such, a focus that might lead to concern for wrongful acts of the same type more generally, the status-angry person focuses obsessively on herself and her standing vis-à-vis others.

[…]

We are prone to anger to the extent that we feel insecure or lacking control with respect to the aspect of our goals that has been assailed — and to the extent that we expect or desire control. Anger aims at restoring lost control and often achieves at least an illusion of it. To the extent that a culture encourages people to feel vulnerable to affront and down-ranking in a wide variety of situations, it encourages the roots of status-focused anger.

Nowhere is anger more acute, nor more damaging, than in intimate relationships, where the stakes are impossibly high. Because they are so central to our flourishing and because our personal investment in them is at its deepest, the potential for betrayal there is enormous and therefore enormously vulnerable-making. Crucially, Nussbaum argues, intimate relationships involve trust, which is predicated on inevitable vulnerability. She considers what trust actually means:

Trust … is different from mere reliance. One may rely on an alarm clock, and to that extent be disappointed if it fails to do its job, but one does not feel deeply vulnerable, or profoundly invaded by the failure. Similarly, one may rely on a dishonest colleague to continue lying and cheating, but this is reason, precisely, not to trust that person; instead, one will try to protect oneself from damage. Trust, by contrast, involves opening oneself to the possibility of betrayal, hence to a very deep form of harm. It means relaxing the self-protective strategies with which we usually go through life, attaching great importance to actions by the other over which one has little control. It means, then, living with a certain degree of helplessness.

Is trust a matter of belief or emotion? Both, in complexly related ways. Trusting someone, one believes that she will keep her commitments, and at the same time one appraises those commitments as very important for one’s own flourishing. But that latter appraisal is a key constituent part of a number of emotions, including hope, fear, and, if things go wrong, deep grief and loss. Trust is probably not identical to those emotions, but under normal circumstances of life it often proves sufficient for them. One also typically has other related emotions toward a person whom one trusts, such as love and concern. Although one typically does not decide to trust in a deliberate way, the willingness to be in someone else’s hands is a kind of choice, since one can certainly live without that type of dependency… Living with trust involves profound vulnerability and some helplessness, which may easily be deflected into anger.

Read more here.

UNFORBIDDEN PLEASURES

The English psychoanalytical writer Adam Phillips has written with beguiling nuance about such variousness of our psychic experience as the importance of “fertile solitude,” the value of missing out, and the rewards of being out of balance. In Unforbidden Pleasures (public library), he explores our paradoxical desires and the topsy-turvy ways we go about pursuing pleasure and avoiding pain.

In the collection’s standout essay, titled “Against Self-Criticism,” Phillips reaches across the space-time of culture to both revolt against and pay homage to Susan Sontag’s masterwork Against Interpretation, and examines how “our virulent, predatory self-criticism [has] become one of our greatest pleasures.” He writes:

In broaching the possibility of being, in some way, against self-criticism, we have to imagine a world in which celebration is less suspect than criticism; in which the alternatives of celebration and criticism are seen as a determined narrowing of the repertoire; and in which we praise whatever we can.

But we have become so indoctrinated in this conscience of self-criticism, both collectively and individually, that we’ve grown reflexively suspicious of that alternative possibility. (Kafka, the great patron-martyr of self-criticism, captured this pathology perfectly: “There’s only one thing certain. That is one’s own inadequacy.”) Phillips writes:

Self-criticism, and the self as critical, are essential to our sense, our picture, of our so-called selves.

[…]

Nothing makes us more critical, more confounded — more suspicious, or appalled, or even mildly amused — than the suggestion that we should drop all this relentless criticism; that we should be less impressed by it. Or at least that self-criticism should cease to have the hold over us that it does.

Read more here.

THE COURSE OF LOVE

“Nothing awakens us to the reality of life so much as a true love,” Vincent van Gogh wrote to his brother. “Why is love rich beyond all other possible human experiences and a sweet burden to those seized in its grasp?” philosopher Martin Heidegger asked in his electrifying love letters to Hannah Arendt. “Because we become what we love and yet remain ourselves.” Still, nearly every anguishing aspect of love arises from the inescapable tension between this longing for transformative awakening and the sleepwalking selfhood of our habitual patterns. True as it may be that frustration is a prerequisite for satisfaction in romance, how are we to reconcile the sundering frustration of these polar pulls?

The multiple sharp-edged facets of this question are what Alain de Botton explores in The Course of Love (public library) — a meditation on the beautiful, tragic tendernesses and fragilities of the human heart, at once unnerving and assuring in its psychological insightfulness. At its heart is a lamentation of — or, perhaps, an admonition against — how the classic Romantic model has sold us on a number of self-defeating beliefs about the most essential and nuanced experiences of human life: love, infatuation, marriage, sex, children, infidelity, trust.

Alain de Botton

A sequel of sorts to his 1993 novel On Love, the book is a bold bending of form that fuses fiction and De Botton’s supreme forte, the essay — twined with the narrative thread of the romance between the two protagonists are astute observations at the meeting point of psychology and philosophy, spinning out from the particular problems of the couple to unravel broader insight into the universal complexities of the human heart.

In fact, as the book progresses, one gets the distinct and surprisingly pleasurable sense that De Botton has sculpted the love story around the robust armature of these philosophical meditations; that the essay is the raison d’être for the fiction.

In one of these contemplative interstitials, De Botton writes:

Maturity begins with the capacity to sense and, in good time and without defensiveness, admit to our own craziness. If we are not regularly deeply embarrassed by who we are, the journey to self-knowledge hasn’t begun.

For a richer taste of the book, devour these portions exploring why our partners drive us mad, what makes a good communicator, and the paradox of sulking.

THE GUTSY GIRL

In 1885, a young woman sent the editor of her hometown newspaper a brilliant response to a letter by a patronizing chauvinist, which the paper had published under the title “What Girls Are Good For.” The woman, known today as Nellie Bly, so impressed the editor that she was hired at the paper and went on to become a trailblazing journalist, circumnavigating the globe in 72 days with only a duffle bag and risking her life to write a seminal exposé of asylum abuse, which forever changed legal protections for the mentally ill. But Bly’s courage says as much about her triumphant character as it does about the tragedies of her culture — she is celebrated as a hero in large part because she defied and transcended the limiting gender norms of the Victorian era, which reserved courageous and adventurous feats for men, while raising women to be diffident, perfect, and perfectly pretty instead.

Writer Caroline Paul, one of the first women on San Francisco’s firefighting force and an experimental plane pilot, believes that not much has changed in the century since — that beneath the surface progress, our culture still nurses girls on “the insidious language of fear” and boys on that of bravery and resilience. She offers an intelligent and imaginative antidote in The Gutsy Girl: Escapades for Your Life of Epic Adventure (public library) — part memoir, part manifesto, part aspirational workbook, aimed at tween girls but speaking to the ageless, ungendered spirit of adventure in all of us, exploring what it means to be brave, to persevere, to break the tyranny of perfection, and to laugh at oneself while setting out to do the seemingly impossible.

Illustrated by Paul’s partner (and my frequent collaborator), artist and graphic journalist Wendy MacNaughton, the book features sidebar celebrations of diverse “girl heroes” of nearly every imaginable background, ranging from famous pioneers like Nellie Bly and astronaut Mae Jemison to little-known adventurers like canopy-climbing botanist Marie Antoine, prodigy rock-climber Ashima Shiraishi, and barnstorming pilot and parachutist Bessie “Queen Bess” Coleman.

A masterful memoirist who has previously written about what a lost cat taught her about finding human love and what it’s like to be a twin, Paul structures each chapter as a thrilling micro-memoir of a particular adventure from her own life — building a milk carton pirate ship as a teenager and sinking it triumphantly into the rapids, mastering a challenging type of paragliding as a young woman, climbing and nearly dying on the formidable Denali as an adult.

Let me make one thing clear: Throughout the book, Paul does a remarkably thoughtful job of pointing out the line between adventurousness and recklessness. Her brushes with disaster, rather than lionizing heedlessness, are the book’s greatest gift precisely because they decondition the notion that an adventure is the same thing as an achievement — that one must be perfect and error-proof in every way in order to live a daring and courageous life. Instead, by chronicling her many missteps along the running starts of her leaps, she assures the young reader over and over that owning up to mistakes isn’t an attrition of one’s courage but an essential building block of it. After all, the fear of humiliation is perhaps what undergirds all fear, and in our culture of stubborn self-righteousness, there are few things we resist more staunchly, to the detriment of our own growth, than looking foolish for being wrong. The courageous, Paul reminds us, trip and fall, often in public, but get right back up and leap again.

Indeed, the book is a lived and living testament to psychologist Carol Dweck’s seminal work on the “fixed” vs. “growth” mindsets — life-tested evidence that courage is the fruit not of perfection but of doggedness in the face of fallibility, fertilized by the choice (and it is a choice, Paul reminds us over and over) to get up and dust yourself off each time.

But Paul wasn’t always an adventurer. She reflects:

I had been a shy and fearful kid. Many things had scared me. Bigger kids. Second grade. The elderly woman across the street. Being called on in class. The book Where the Wild Things Are. Woods at dusk. The way the bones in my hand crisscrossed.

Being scared was a terrible feeling, like sinking in quicksand. My stomach would drop, my feet would feel heavy, my head would prickle. Fear was an all-body experience. For a shy kid like me it was overwhelming.

Let me pause here to note that Caroline Paul is one of the most extraordinary human beings I know — a modern-day Amazon, Shackleton, Amelia Earhart, and Hedy Lamarr rolled into one — and since she is also a brilliant writer, the self-deprecating humor permeating the book serves a deliberate purpose: to assure us that no one is born a modern-day Amazon, Shackleton, Amelia Earhart, and Hedy Lamarr rolled into one, but the determined can become it by taking on challenges, conceding the possibility of imperfection and embarrassment, and seeing those outcomes as part of the adventure rather than as failure at achievement.

That’s exactly what Paul does in the adventures she chronicles. It’s time, after all, to replace that woeful Victorian map of woman’s heart with a modern map of the gutsy girl spirit.

Read and see more here.

HIDDEN FIGURES

“No woman should say, ‘I am but a woman!’ But a woman! What more can you ask to be?” astronomer Maria Mitchell, who paved the way for women in American science, admonished the first class of female astronomers at Vassar in 1876. By the middle of the next century, a team of unheralded women scientists and engineers were powering space exploration at NASA’s Jet Propulsion Laboratory.

Meanwhile, across the continent and in what was practically another country, a parallel but very different revolution was taking place: In the segregated South, a growing number of black female mathematicians, scientists, and engineers were steering early space exploration and helping America win the Cold War at NASA’s Langley Research Center in Hampton, Virginia.

Long before the term “computer” came to signify the machine that dictates our lives, these remarkable women were working as human “computers” — highly skilled professional reckoners, who thought mathematically and computationally for their living and for their country. When Neil Armstrong set his foot on the moon, his “giant leap for mankind” had been powered by womankind, particularly by Katherine Johnson — the “computer” who calculated Apollo 11’s launch windows and who was awarded the Presidential Medal of Freedom by President Obama at age 97 in 2015, three years after the accolade was conferred upon John Glenn, the astronaut whose flight trajectory Johnson had made possible.

Katherine Johnson at her Langley desk with a globe, or “Celestial Training Device,” 1960 (Photographs: NASA)

In Hidden Figures: The Story of the African-American Women Who Helped Win the Space Race (public library), Margot Lee Shetterly tells the untold story of these brilliant women, once on the frontlines of our cultural leaps and since sidelined by the selective collective memory we call history.

She writes:

Just as islands — isolated places with unique, rich biodiversity — have relevance for the ecosystems everywhere, so does studying seemingly isolated or overlooked people and events from the past turn up unexpected connections and insights to modern life.

Against a sobering cultural backdrop, Shetterly captures the enormous cognitive dissonance the very notion of these black female mathematicians evokes:

Before a computer became an inanimate object, and before Mission Control landed in Houston; before Sputnik changed the course of history, and before the NACA became NASA; before the Supreme Court case Brown v. Board of Education of Topeka established that separate was in fact not equal, and before the poetry of Martin Luther King Jr.’s “I Have a Dream” speech rang out over the steps of the Lincoln Memorial, Langley’s West Computers were helping America dominate aeronautics, space research, and computer technology, carving out a place for themselves as female mathematicians who were also black, black mathematicians who were also female.

Shetterly herself grew up in Hampton, which dubbed itself “Spacetown USA,” amid this archipelago of women who were her neighbors and teachers. Her father, who had built his first rocket in his early teens after seeing the Sputnik launch, was one of Langley’s African American scientists in an era when words we now shudder to hear were used instead of “African American.” Like him, the first five black women who joined Langley’s research staff in 1943 entered a segregated NACA — even though, as Shetterly points out, the space agency was among the most inclusive workplaces in the country, with a percentage of black scientists and engineers more than four times the national average.

Over the next forty years, the number of these trailblazing black women mushroomed to more than fifty, revealing the mycelia of a significant groundswell. Shetterly’s favorite Sunday school teacher had been one of the early computers — a retired NASA mathematician named Kathleen Land. And so Shetterly, who considers herself “as much a product of NASA as the Moon landing,” grew up believing that black women simply belonged in science and space exploration as a matter of course — after all, they populated her father’s workplace and her town, a town whose church “abounded with mathematicians.”

Embodying astronomer Vera Rubin’s wisdom on how modeling expands children’s scope of possibility, Shetterly reflects on this normalizing and rousing power of example:

Building 1236, my father’s daily destination, contained a byzantine complex of government-gray cubicles, perfumed with the grown-up smells of coffee and stale cigarette smoke. His engineering colleagues with their rumpled style and distracted manner seemed like exotic birds in a sanctuary. They gave us kids stacks of discarded 11×14 continuous-form computer paper, printed on one side with cryptic arrays of numbers, the blank side a canvas for crayon masterpieces. Women occupied many of the cubicles; they answered phones and sat in front of typewriters, but they also made hieroglyphic marks on transparent slides and conferred with my father and other men in the office on the stacks of documents that littered their desks. That so many of them were African American, many of them my grandmother’s age, struck me as simply a part of the natural order of things: growing up in Hampton, the face of science was brown like mine.

[…]

The community certainly included black English professors, like my mother, as well as black doctors and dentists, black mechanics, janitors, and contractors, black cobblers, wedding planners, real estate agents, and undertakers, several black lawyers, and a handful of black Mary Kay salespeople. As a child, however, I knew so many African Americans working in science, math, and engineering that I thought that’s just what black folks did.

Katherine Johnson, age 98 (Photograph: Annie Leibovitz for Vanity Fair)

Read more here.

BECOMING WISE

“Words are events, they do things, change things,” Ursula K. Le Guin wrote in her beautiful meditation on the power and magic of real human conversation. “They transform both speaker and hearer; they feed energy back and forth and amplify it. They feed understanding or emotion back and forth and amplify it.” Hardly anyone in our time has been a greater amplifier of spirits than longtime journalist, On Being host, and patron saint of nuance Krista Tippett — a modern-day Simone Weil who has been fusing spiritual life and secular culture with remarkable virtuosity through her conversations with physicists and poets, neuroscientists and novelists, biologists and Benedictine monks, united by the quality of heart and mind that Einstein so beautifully termed “spiritual genius.”

In her interviews with the great spiritual geniuses of our time, Tippett has cultivated a rare space for reflection and redemption amid our reactionary culture — a space framed by her generous questions exploring the life of meaning. In Becoming Wise: An Inquiry into the Mystery and Art of Living (public library), Tippett distills more than a decade of these conversations across disciplines and denominations into a wellspring of wisdom on the most elemental questions of being human — questions about happiness, morality, justice, wellbeing, and love — reanimated with a fresh vitality of insight.

Krista Tippett

At the core of Tippett’s inquiry is the notion of virtue — not in the limiting, prescriptive sense with which scripture has imbued it, but in the expansive, empowering sense of a psychological, emotional, and spiritual technology that allows us to first fully inhabit, then conscientiously close the gap between who we are and who we aspire to be.

She explores five primary fertilizers of virtue: words — the language we use to tell the stories we tell about who we are and how the world works; flesh — the body as the birthplace of every virtue, rooted in the idea that “how we inhabit our senses tests the mettle of our souls”; love — a word so overused that it has been emptied of meaning yet one that gives meaning to our existence, both in our most private selves and in the fabric of public life; faith — Tippett left a successful career as a political journalist in divided Berlin in the 1980s to study theology not in order to be ordained but in order to question power structures and examine the grounds of moral imagination through the spiritual wisdom of the ages; and hope — an orientation of the mind and spirit predicated not on the blinders of optimism but on a lucid lens on the possible furnished by an active, unflinching reach for it.

Tippett, who has spent more than a decade cross-pollinating spirituality, science, and the human spirit and was awarded the National Humanities Medal for it, considers the raw material of her work — the power of questions “as social art and civic tools”:

If I’ve learned nothing else, I’ve learned this: a question is a powerful thing, a mighty use of words. Questions elicit answers in their likeness. Answers mirror the questions they rise, or fall, to meet. So while a simple question can be precisely what’s needed to drive to the heart of the matter, it’s hard to meet a simplistic question with anything but a simplistic answer. It’s hard to transcend a combative question. But it’s hard to resist a generous question. We all have it in us to formulate questions that invite honesty, dignity, and revelation. There is something redemptive and life-giving about asking better questions.

Read more here.

THE ABUNDANCE

For decades, Annie Dillard has beguiled those in search of truth and beauty in the written word with the lyrical splendor and wakeful sagacity of her prose. The Abundance: Narrative Essays Old and New (public library) collects her finest work, spanning such varied subjects as writing, the consecrating art of attention, and the surreal exhilaration of witnessing a total solar eclipse.

In a beautiful 1989 piece titled “A Writer in the World,” Dillard writes:

People love pretty much the same things best. A writer, though, looking for subjects asks not after what he loves best, but what he alone loves at all… Why do you never find anything written about that idiosyncratic thought you advert to, about your fascination with something no one else understands? Because it is up to you. There is something you find interesting, for a reason hard to explain because you have never read it on any page; there you begin. You were made and set here to give voice to this, your own astonishment.

And yet this singular voice is refined not by the stubborn flight from all that has been said before but by a deliberate immersion in the very best of it. Like Hemingway, who insisted that aspiring writers should metabolize a certain set of essential books, Dillard counsels:

The writer studies literature, not the world. He lives in the world; he cannot miss it. If he has ever bought a hamburger, or taken a commercial airplane flight, he spares his readers a report of his experience. He is careful of what he reads, for that is what he will write. He is careful of what he learns, because that is what he will know.

The writer as a consequence reads outside his time and place.

The most significant animating force of great art, Dillard argues, is the artist’s willingness to hold nothing back and to create, always, with an unflappable generosity of spirit:

One of the few things I know about writing is this: Spend it all, shoot it, play it, lose it, all, right away, every time. Don’t hoard what seems good for a later place in the book, or for another book; give it, give it all, give it now. The very impulse to save something good for a better place later is the signal to spend it now. Something more will arise for later, something better. These things fill from behind, from beneath, like well water. Similarly, the impulse to keep to yourself what you have learned is not only shameful; it is destructive. Anything you do not give freely and abundantly becomes lost to you. You open your safe and find ashes.

Read more here.

WHEN BREATH BECOMES AIR

All life is lived in the shadow of its own finitude, of which we are always aware — an awareness we systematically blunt through the daily distraction of living. But when this finitude is made imminent, one suddenly collides with an awareness so acute that it leaves no choice but to fill the shadow with as much light as a human being can generate — the sort of inner illumination we call meaning: the meaning of life.

That tumultuous turning point is what neurosurgeon Paul Kalanithi chronicles in When Breath Becomes Air (public library), also among the year’s best science books — his piercing memoir of being diagnosed with terminal cancer at the peak of a career bursting with potential and a life exploding with aliveness. Partway between Montaigne and Oliver Sacks, Kalanithi weaves together philosophical reflections on his personal journey with stories of his patients to illuminate the only thing we have in common — our mortality — and how it spurs all of us, in ways both minute and monumental, to pursue a life of meaning.

What emerges is an uncommonly insightful, sincere, and sobering revelation of how much our sense of self is tied up with our sense of potential and possibility — the selves we would like to become, those we work tirelessly toward becoming. Who are we, then, and what remains of “us” when that possibility is suddenly snipped?

Paul Kalanithi in 2014 (Photograph: Norbert von der Groeben/Stanford Hospital and Clinics)

A generation after surgeon Sherwin Nuland’s foundational text on confronting the meaning of life while dying, Kalanithi sets out to answer these questions and their myriad fractal implications. He writes:

At age thirty-six, I had reached the mountaintop; I could see the Promised Land, from Gilead to Jericho to the Mediterranean Sea. I could see a nice catamaran on that sea that Lucy, our hypothetical children, and I would take out on weekends. I could see the tension in my back unwinding as my work schedule eased and life became more manageable. I could see myself finally becoming the husband I’d promised to be.

And then the unthinkable happens. He recounts one of the first incidents in which his former identity and his future fate collided with jarring violence:

My back stiffened terribly during the flight, and by the time I made it to Grand Central to catch a train to my friends’ place upstate, my body was rippling with pain. Over the past few months, I’d had back spasms of varying ferocity, from simple ignorable pain, to pain that made me forsake speech to grind my teeth, to pain so severe I curled up on the floor, screaming. This pain was toward the more severe end of the spectrum. I lay down on a hard bench in the waiting area, feeling my back muscles contort, breathing to control the pain — the ibuprofen wasn’t touching this — and naming each muscle as it spasmed to stave off tears: erector spinae, rhomboid, latissimus, piriformis…

A security guard approached. “Sir, you can’t lie down here.”

“I’m sorry,” I said, gasping out the words. “Bad … back … spasms.”

“You still can’t lie down here.”

[…]

I pulled myself up and hobbled to the platform.

Like the book itself, the anecdote speaks to something larger and far more powerful than the particular story — in this case, our cultural attitude toward what we consider the failings of our bodies: pain and, in the ultimate extreme, death. We try to dictate the terms on which these perceived failings may occur; to make them conform to wished-for realities; to subvert them by will and witless denial. All this we do because, at bottom, we deem them impermissible — in ourselves and in each other.

Read more here.

PINOCCHIO

“Myths are made for the imagination to breathe life into them,” Albert Camus wrote. Ada Lovelace, the world’s first computer programmer, observed a century earlier as she contemplated the nature of the imagination and its three core faculties: “Imagination is the Discovering Faculty, pre-eminently… that which penetrates into the unseen worlds around us.”

This “discovering faculty” of the imagination, which breathes life into both the most captivating myths and the deepest layers of reality, is what animated Italian artist Alessandro Sanna one winter afternoon when he glimpsed a most unusual tree branch from the window of a moving train — a branch that looked like a sensitive human silhouette, mid-fall or mid-embrace.

As Sanna cradled the enchanting image in his mind and began sketching it, he realized that something about the “body language” of the branch reminded him of a small, delicate, terminally ill child he’d gotten to know during his visits to Turin’s Pediatric Hospital. In beholding this common ground of tender fragility, Sanna’s imagination leapt to a foundational myth of his nation’s storytelling — the Pinocchio story.

In the astonishingly beautiful and tenderhearted Pinocchio: The Origin Story (public library), also among the year’s loveliest picture-books, Sanna imagines an alternative prequel to the beloved story, a wordless genesis myth of the wood that became Pinocchio, radiating a larger cosmogony of life, death, and the transcendent continuity between the two.

A fitting follow-up to The River — Sanna’s exquisite visual memoir of life on the Po River in Northern Italy, reflecting on the seasonality of human existence — this imaginative masterwork dances with the cosmic unknowns that eclipse human life and the human mind with their enormity: questions like what life is, how it began, and what happens when it ends.

Origin myths have been our oldest sensemaking mechanism for wresting meaning out of these as-yet-unanswered, perhaps unanswerable questions. But rather than an argument with science and our secular sensibility, Sanna’s lyrical celebration of myth embodies Margaret Mead’s insistence on the importance of poetic truth in the age of facts.

The tree is an organic choice for this unusual cosmogony — after all, trees have inspired centuries of folk tales around the world; a 17th-century English gardener marveled at how they “speak to the mind, and tell us many things, and teach us many good lessons” and Hermann Hesse called them “the most penetrating of preachers.”

It is both a pity and a strange comfort that Sanna’s luminous, buoyant watercolors and his masterful subtlety of scale don’t fully translate onto this screen — his analog and deeply humane art is of a different order, almost of a different time, and yet woven of the timeless and the eternal.

See more here.

The Greatest Science Books of 2016

From the sound of spacetime to time travel to the microbiome, by way of polar bears, dogs, and trees.

I have long believed that E.B. White’s abiding wisdom on children’s books — “Anyone who writes down to children is simply wasting his time. You have to write up, not down.” — is equally true of science books. The question of what makes a great book of any kind is, of course, a slippery one, but I recently endeavored to synthesize my intuitive system for assessing science books that write up to the reader in a taxonomy of explanation, elucidation, and enchantment.

Gathered here are exceptional books that accomplish at least two of the three, assembled in the spirit of my annual best-of reading lists, which I continue to consider Old Year’s resolutions in reverse — not a list of priorities for the year ahead, but a reflection on the reading most worth prioritizing in the year being left behind.

BLACK HOLE BLUES

In Black Hole Blues and Other Songs from Outer Space (public library), cosmologist, novelist, and unparalleled enchanter of science Janna Levin tells the story of the century-long vision, originated by Einstein, and half-century experimental quest to hear the sound of spacetime by detecting a gravitational wave. This book remains one of the most intensely interesting and beautifully written I’ve ever encountered — the kind that comes about once a generation if we’re lucky.

Everything we know about the universe so far comes from four centuries of sight — from peering into space with our eyes and their prosthetic extension, the telescope. Now commences a new mode of knowing the cosmos through sound. The detection of gravitational waves is one of the most significant discoveries in the entire history of physics, marking the dawn of a new era as we begin listening to the sound of space — the probable portal to mysteries as unimaginable to us today as galaxies and nebulae and pulsars and other cosmic wonders were to the first astronomers. Gravitational astronomy, as Levin elegantly puts it, promises a “score to accompany the silent movie humanity has compiled of the history of the universe from still images of the sky, a series of frozen snapshots captured over the past four hundred years since Galileo first pointed a crude telescope at the Sun.”

Astonishingly enough, Levin wrote the book before the Laser Interferometer Gravitational-Wave Observatory (LIGO) — the monumental instrument at the center of the story, decades in the making — made the actual detection of a ripple in the fabric of spacetime caused by the collision of two black holes in the autumn of 2015, exactly a century after Einstein first envisioned the possibility of gravitational waves. So the story she tells is not that of the triumph but that of the climb, which renders it all the more enchanting — because it is ultimately a story about the human spirit and its incredible tenacity, about why human beings choose to devote their entire lives to pursuits strewn with unimaginable obstacles and bedeviled by frequent failure, uncertain rewards, and meager public recognition.

Indeed, what makes the book interesting is that it tells the story of this monumental discovery, but what makes it enchanting is that Levin comes at it from a rather unusual perspective. She is a working astrophysicist who studies black holes, but she is also an incredibly gifted novelist — an artist whose medium is language and thought itself. This is no popular science book but something many orders of magnitude higher in its artistic vision, the impeccable craftsmanship of language, and the sheer pleasure of the prose. The story is structured almost as a series of short, integrated novels, with each chapter devoted to one of the key scientists involved in LIGO. With Dostoyevskian insight and nuance, Levin paints a psychological, even philosophical portrait of each protagonist, revealing how intricately interwoven the genius and the foibles are in the fabric of personhood and what a profoundly human endeavor science ultimately is.

She writes:

Scientists are like those levers or knobs or those boulders helpfully screwed into a climbing wall. Like the wall is some cemented material made by mixing knowledge, which is a purely human construct, with reality, which we can only access through the filter of our minds. There’s an important pursuit of objectivity in science and nature and mathematics, but still the only way up the wall is through the individual people, and they come in specifics… So the climb is personal, a truly human endeavor, and the real expedition pixelates into individuals, not Platonic forms.

For a taste of this uncategorizably wonderful book, see Levin on the story of the tragic hero who pioneered gravitational astronomy and how astronomer Jocelyn Bell discovered pulsars.

TIME TRAVEL

Time Travel: A History (public library) by science historian and writer extraordinaire James Gleick, another rare enchanter of science, is not a “science book” per se, in that although it draws heavily on the history of twentieth-century science and quantum physics in particular (as well as on millennia of philosophy), it is a decidedly literary inquiry into our temporal imagination — why we think about time, why its directionality troubles us so, and what asking these questions at all reveals about the deepest mysteries of our consciousness. I consider it a grand thought experiment, using physics and philosophy as the active agents, and literature as the catalyst.

Gleick, who examined the origin of our modern anxiety about time with remarkable prescience nearly two decades ago, traces the invention of the notion of time travel to H.G. Wells’s 1895 masterpiece The Time Machine. Although Wells — like Gleick, like any reputable physicist — knew that time travel was a scientific impossibility, he created an aesthetic of thought which never previously existed and which has since shaped the modern consciousness. Gleick argues that the art this aesthetic produced — an entire canon of time travel literature and film — not only permeated popular culture but even influenced some of the greatest scientific minds of the past century, including Stephen Hawking, who once cleverly hosted a party for time travelers and, when no one showed up, considered the impossibility of time travel proven; and John Archibald Wheeler, who popularized the term “black hole” and coined “wormhole,” both key tropes of time travel literature.

Gleick considers how a scientific impossibility can become such fertile ground for the artistic imagination:

Why do we need time travel, when we already travel through space so far and fast? For history. For mystery. For nostalgia. For hope. To examine our potential and explore our memories. To counter regret for the life we lived, the only life, one dimension, beginning to end.

Wells’s Time Machine revealed a turning in the road, an alteration in the human relationship with time. New technologies and ideas reinforced one another: the electric telegraph, the steam railroad, the earth science of Lyell and the life science of Darwin, the rise of archeology out of antiquarianism, and the perfection of clocks. When the nineteenth century turned to the twentieth, scientists and philosophers were primed to understand time in a new way. And so were we all. Time travel bloomed in the culture, its loops and twists and paradoxes.

I wrote about Gleick’s uncommonly pleasurable book at length here.

FELT TIME

A very different take on time, not as cultural phenomenon but as individual psychological interiority, comes from German psychologist Marc Wittmann in Felt Time: The Psychology of How We Perceive Time (public library) — a fascinating inquiry into how our subjective experience of time’s passage shapes everything from our emotional memory to our sense of self. Bridging disciplines as wide-ranging as neuroscience and philosophy, Wittmann examines questions of consciousness, identity, happiness, boredom, money, and aging, exposing the centrality of time in each of them. What emerges is the disorienting sense that time isn’t something which happens to us — rather, we are time.

One of Wittmann’s most pause-giving points has to do with how temporality mediates the mind-body problem. He writes:

Presence means becoming aware of a physical and psychic self that is temporally extended. To be self-conscious is to recognize oneself as something that persists through time and is embodied.

In a sense, time is a construction of our consciousness. Two generations after Hannah Arendt observed in her brilliant meditation on time that “it is the insertion of man with his limited life span that transforms the continuously flowing stream of sheer change … into time as we know it,” Wittmann writes:

Self-consciousness — achieving awareness of one’s own self — emerges on the basis of temporally enduring perception of bodily states that are tied to neural activity in the brain’s insular lobe. The self and time prove to be especially present in boredom. They go missing in the hustle and bustle of everyday life, which results from the acceleration of social processes. Through mindfulness and emotional control, the tempo of life that we experience can be reduced, and we can regain time for ourselves and others.

Perception necessarily encompasses the individual who is doing the perceiving. It is I who perceives. This might seem self-evident. Perception of myself, my ego, occurs naturally when I consider myself. I “feel” and think about myself. But who is the subject if I am the object of my own attention? When I observe myself, after all, I become the object of observation. Clearly, this intangibility of the subject as a subject — and not an object — poses a philosophical problem: as soon as I observe myself, I have already become the object of my observation.

More here.

WHEN BREATH BECOMES AIR

All life is lived in the shadow of its own finitude, of which we are always aware — an awareness we systematically blunt through the daily distraction of living. But when this finitude is made imminent, one suddenly collides with an awareness so acute that it leaves no choice but to fill the shadow with as much light as a human being can generate — the sort of inner illumination we call meaning: the meaning of life.

That tumultuous turning point is what neurosurgeon Paul Kalanithi chronicles in When Breath Becomes Air (public library) — his piercing memoir of being diagnosed with terminal cancer at the peak of a career bursting with potential and a life exploding with aliveness. Partway between Montaigne and Oliver Sacks, Kalanithi weaves together philosophical reflections on his personal journey with stories of his patients to illuminate the only thing we have in common — our mortality — and how it spurs all of us, in ways both minute and monumental, to pursue a life of meaning.

What emerges is an uncommonly insightful, sincere, and sobering revelation of how much our sense of self is tied up with our sense of potential and possibility — the selves we would like to become, those we work tirelessly toward becoming. Who are we, then, and what remains of “us” when that possibility is suddenly snipped?

Paul Kalanithi in 2014 (Photograph: Norbert von der Groeben/Stanford Hospital and Clinics)

A generation after surgeon Sherwin Nuland’s foundational text on confronting the meaning of life while dying, Kalanithi sets out to answer these questions and their myriad fractal implications. He writes:

At age thirty-six, I had reached the mountaintop; I could see the Promised Land, from Gilead to Jericho to the Mediterranean Sea. I could see a nice catamaran on that sea that Lucy, our hypothetical children, and I would take out on weekends. I could see the tension in my back unwinding as my work schedule eased and life became more manageable. I could see myself finally becoming the husband I’d promised to be.

And then the unthinkable happens. He recounts one of the first incidents in which his former identity and his future fate collided with jarring violence:

My back stiffened terribly during the flight, and by the time I made it to Grand Central to catch a train to my friends’ place upstate, my body was rippling with pain. Over the past few months, I’d had back spasms of varying ferocity, from simple ignorable pain, to pain that made me forsake speech to grind my teeth, to pain so severe I curled up on the floor, screaming. This pain was toward the more severe end of the spectrum. I lay down on a hard bench in the waiting area, feeling my back muscles contort, breathing to control the pain — the ibuprofen wasn’t touching this — and naming each muscle as it spasmed to stave off tears: erector spinae, rhomboid, latissimus, piriformis…

A security guard approached. “Sir, you can’t lie down here.”

“I’m sorry,” I said, gasping out the words. “Bad … back … spasms.”

“You still can’t lie down here.”

[…]

I pulled myself up and hobbled to the platform.

Like the book itself, the anecdote speaks to something larger and far more powerful than the particular story — in this case, our cultural attitude toward what we consider the failings of our bodies: pain and, in the ultimate extreme, death. We try to dictate the terms on which these perceived failings may occur; to make them conform to wished-for realities; to subvert them by will and witless denial. All this we do because, at bottom, we deem them impermissible — in ourselves and in each other.

I wrote about the book at length here.

THE CONFIDENCE GAME

“Try not to get overly attached to a hypothesis just because it’s yours,” Carl Sagan urged in his excellent Baloney Detection Kit — and yet our tendency is to do just that, becoming increasingly attached to what we’ve come to believe because the belief has sprung from our own glorious, brilliant, fool-proof minds. How con artists take advantage of this human hubris is what New Yorker columnist and psychology writer Maria Konnikova explores in The Confidence Game: Why We Fall for It … Every Time (public library) — a thrilling psychological detective story investigating how con artists, the supreme masterminds of malevolent reality-manipulation, prey on our hopes, our fears, and our propensity for believing what we wish were true. Through a tapestry of riveting real-life con artist profiles interwoven with decades of psychology experiments, Konnikova illuminates the inner workings of trust and deception in our everyday lives.

She writes:

It’s the oldest story ever told. The story of belief — of the basic, irresistible, universal human need to believe in something that gives life meaning, something that reaffirms our view of ourselves, the world, and our place in it… For our minds are built for stories. We crave them, and, when there aren’t ready ones available, we create them. Stories about our origins. Our purpose. The reasons the world is the way it is. Human beings don’t like to exist in a state of uncertainty or ambiguity. When something doesn’t make sense, we want to supply the missing link. When we don’t understand what or why or how something happened, we want to find the explanation. A confidence artist is only too happy to comply — and the well-crafted narrative is his absolute forte.

Konnikova describes the basic elements of the con and the psychological susceptibility into which each of them plays:

The confidence game starts with basic human psychology. From the artist’s perspective, it’s a question of identifying the victim (the put-up): who is he, what does he want, and how can I play on that desire to achieve what I want? It requires the creation of empathy and rapport (the play): an emotional foundation must be laid before any scheme is proposed, any game set in motion. Only then does it move to logic and persuasion (the rope): the scheme (the tale), the evidence and the way it will work to your benefit (the convincer), the show of actual profits. And like a fly caught in a spider’s web, the more we struggle, the less able to extricate ourselves we become (the breakdown). By the time things begin to look dicey, we tend to be so invested, emotionally and often physically, that we do most of the persuasion ourselves. We may even choose to up our involvement ourselves, even as things turn south (the send), so that by the time we’re completely fleeced (the touch), we don’t quite know what hit us. The con artist may not even need to convince us to stay quiet (the blow-off and fix); we are more likely than not to do so ourselves. We are, after all, the best deceivers of our own minds. At each step of the game, con artists draw from a seemingly endless toolbox of ways to manipulate our belief. And as we become more committed, with every step we give them more psychological material to work with.
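Konnikova’s anatomy of the con amounts to a fixed pipeline of named stages. For readers who think in structures, here is that sequence laid out as a small Python sketch (the arrangement is mine, not the book’s; the one-line glosses paraphrase the passage above):

# The stages of the confidence game, in the order Konnikova names them.
con_stages = [
    ("put-up", "identify the victim and what he or she wants"),
    ("play", "build empathy and rapport"),
    ("rope", "shift to logic and persuasion"),
    ("tale", "propose the scheme itself"),
    ("convincer", "offer evidence and early profits"),
    ("breakdown", "the victim struggles, and sinks deeper"),
    ("send", "the victim ups his or her own involvement"),
    ("touch", "the victim is completely fleeced"),
    ("blow-off", "the con artist eases out of the game"),
    ("fix", "complaints are forestalled, often by the victim's own silence"),
]

for stage, gloss in con_stages:
    print(f"{stage:>9}: {gloss}")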

Needless to say, the book bears remarkable relevance to the recent turn of events in American politics and its ripples in the mass manipulation machine known as the media.

More here.

THE GENE

“This is the entire essence of life: Who are you? What are you?” young Leo Tolstoy wrote in his diary. For Tolstoy, this was a philosophical inquiry — or a metaphysical one, as it would have been called in his day. But between his time and ours, science has unraveled the inescapable physical dimensions of this elemental question, rendering the already disorienting attempt at an answer all the more complex and confounding.

In The Gene: An Intimate History (public library), physician and Pulitzer-winning author Siddhartha Mukherjee offers a rigorously researched, beautifully written detective story about the genetic components of what we experience as the self, rooted in Mukherjee’s own painful family history of mental illness and radiating a larger inquiry into how genetics illuminates the future of our species.

Mukherjee writes:

Three profoundly destabilizing scientific ideas ricochet through the twentieth century, trisecting it into three unequal parts: the atom, the byte, the gene. Each is foreshadowed by an earlier century, but dazzles into full prominence in the twentieth. Each begins its life as a rather abstract scientific concept, but grows to invade multiple human discourses — thereby transforming culture, society, politics, and language. But the most crucial parallel between the three ideas, by far, is conceptual: each represents the irreducible unit — the building block, the basic organizational unit — of a larger whole: the atom, of matter; the byte (or “bit”), of digitized information; the gene, of heredity and biological information.

Why does this property — being the least divisible unit of a larger form — imbue these particular ideas with such potency and force? The simple answer is that matter, information, and biology are inherently hierarchically organized: understanding that smallest part is crucial to understanding the whole.

Among the book’s most fascinating threads is Mukherjee’s nuanced, necessary discussion of intelligence and the dark side of IQ.

THE POLAR BEAR

“In wildness is the preservation of the world,” Thoreau wrote 150 years ago in his ode to the spirit of sauntering. But in a world increasingly unwild, where we are in touch with nature only occasionally and only in fragments, how are we to nurture the preservation of our Pale Blue Dot?

That’s what London-based illustrator and Sendak Fellow Jenni Desmond explores in The Polar Bear (public library) — the follow-up to Desmond’s serenade to the science and life of Earth’s largest-hearted creature, The Blue Whale, which was among the best science books of 2015.

The story follows a little girl who, in a delightful meta-touch, pulls this very book off the bookshelf and begins learning about the strange and wonderful world of the polar bear, its life, and the science behind it — its love of solitude, the black skin that hides beneath its yellowish-white fur, the built-in sunglasses protecting its eyes from the harsh Arctic light, why it evolved to have an unusually long neck and slightly inward paws, how it maintains the same temperature as us despite living in such extreme cold, why it doesn’t hibernate.

Beyond its sheer loveliness, the book is suddenly imbued with a new layer of urgency. At a time when we can no longer count on politicians to protect the planet and educate the next generations about preserving it, the task falls solely on parents and educators. Desmond’s wonderful project alleviates that task by offering a warm, empathic invitation to care about, which is the gateway to caring for, one of the creatures most vulnerable to our changing climate and most needful of our protection.

Look closer here.

THE BIG PICTURE

“We are — as far as we know — the only part of the universe that’s self-conscious,” the poet Mark Strand marveled in his beautiful meditation on the artist’s task to bear witness to existence, adding: “We could even be the universe’s form of consciousness. We might have come along so that the universe could look at itself… It’s such a lucky accident, having been born, that we’re almost obliged to pay attention.” Scientists are rightfully reluctant to ascribe a purpose or meaning to the universe itself but, as physicist Lisa Randall has pointed out, “an unconcerned universe is not a bad thing — or a good one for that matter.” Where poets and scientists converge is the idea that while the universe itself isn’t inherently imbued with meaning, it is in this self-conscious human act of paying attention that meaning arises.

Physicist Sean Carroll terms this view poetic naturalism and examines its rewards in The Big Picture: On the Origins of Life, Meaning, and the Universe Itself (public library) — a nuanced inquiry into “how our desire to matter fits in with the nature of reality at its deepest levels,” in which Carroll offers an assuring dose of what he calls “existential therapy” reconciling the various and often seemingly contradictory dimensions of our experience.

With an eye to his life’s work of studying the nature of the universe — an expanse of space and time against the incomprehensibly enormous backdrop of which the dramas of a single human life claim no more than a photon of the spotlight — Carroll offers a counterpoint to our intuitive cowering before such magnitudes of matter and mattering:

I like to think that our lives do matter, even if the universe would trundle along without us.

[…]

I want to argue that, though we are part of a universe that runs according to impersonal underlying laws, we nevertheless matter. This isn’t a scientific question — there isn’t data we can collect by doing experiments that could possibly measure the extent to which a life matters. It’s at heart a philosophical problem, one that demands that we discard the way that we’ve been thinking about our lives and their meaning for thousands of years. By the old way of thinking, human life couldn’t possibly be meaningful if we are “just” collections of atoms moving around in accordance with the laws of physics. That’s exactly what we are, but it’s not the only way of thinking about what we are. We are collections of atoms, operating independently of any immaterial spirits or influences, and we are thinking and feeling people who bring meaning into existence by the way we live our lives.

Carroll’s captivating term poetic naturalism builds on a worldview that has been around for centuries, dating back at least to the Scottish philosopher David Hume. It fuses naturalism — the idea that the reality of the natural world is the only reality, that it operates according to consistent patterns, and that those patterns can be studied — with the poetic notion that there are multiple ways of talking about the world and of framing the questions that arise from nature’s elemental laws.

I wrote about the book at length here.

THE HIDDEN LIFE OF TREES

Trees dominate the ranks of the world’s oldest living organisms. Since the dawn of our species, they have been our silent companions, permeating our most enduring tales and never ceasing to inspire fantastical cosmogonies. Hermann Hesse called them “the most penetrating of preachers.” A forgotten seventeenth-century English gardener wrote of how they “speak to the mind, and tell us many things, and teach us many good lessons.”

But trees might be among our lushest metaphors and sensemaking frameworks for knowledge precisely because the richness of what they say is more than metaphorical — they speak a sophisticated silent language, communicating complex information via smell, taste, and electrical impulses. This fascinating secret world of signals is what German forester Peter Wohlleben explores in The Hidden Life of Trees: What They Feel, How They Communicate (public library).

Illustration by Arthur Rackham for a rare 1917 edition of the Brothers Grimm fairy tales

Wohlleben chronicles what his own experience of managing a forest in the Eifel mountains in Germany has taught him about the astonishing language of trees and how trailblazing arboreal research from scientists around the world reveals “the role forests play in making our world the kind of place where we want to live.” As we’re only just beginning to understand nonhuman consciousnesses, what emerges from Wohlleben’s revelatory reframing of our oldest companions is an invitation to see anew what we have spent eons taking for granted and, in this act of seeing, to care more deeply about these remarkable beings that make life on this planet we call home not only infinitely more pleasurable, but possible at all.

Read more here.

BEING A DOG

“The act of smelling something, anything, is remarkably like the act of thinking itself,” the great science storyteller Lewis Thomas wrote in his beautiful 1985 meditation on the poetics of smell as a mode of knowledge. But, like the conditioned consciousness out of which our thoughts arise, our olfactory perception is beholden to our cognitive, cultural, and biological limitations. The 438 cubic feet of air we inhale each day are loaded with an extraordinary richness of information, but we are able to access and decipher only a fraction. And yet we know, on some deep creaturely level, just how powerful and enlivening the world of smell is, how intimately connected with our ability to savor life. “Get a life in which you notice the smell of salt water pushing itself on a breeze over the dunes,” Anna Quindlen advised in her indispensable Short Guide to a Happy Life — but the noticing eclipses the getting, for the salt water breeze is lost on any life devoid of this sensorial perception.

Dogs, who “see” the world through smell, can teach us a great deal about that springlike sensorial aliveness which E.E. Cummings termed “smelloftheworld.” So argues cognitive scientist and writer Alexandra Horowitz, director of the Dog Cognition Lab at Barnard College, in Being a Dog: Following the Dog Into a World of Smell (public library) — a fascinating tour of what Horowitz calls the “surprising and sometimes alarming feats of olfactory perception” that dogs perform daily, and what they can teach us about swinging open the doors of our own perception by relearning some of our long-lost olfactory skills that grant us access to hidden layers of reality.

Art by Maira Kalman from Beloved Dog

The book is a natural extension of Horowitz’s two previous books, exploring the subjective reality of the dog and how our human perceptions shape our own subjective reality. She writes:

I am besotted with dogs, and to know a dog is to be interested in what it’s like to be a dog. And that all begins with the nose.

What the dog sees and knows comes through his nose, and the information that every dog — the tracking dog, of course, but also the dog lying next to you, snoring, on the couch — has about the world based on smell is unthinkably rich. It is rich in a way we humans once knew about, once acted on, but have since neglected.

Savor more of the wonderland of canine olfaction here.

I CONTAIN MULTITUDES

“I have observed many tiny animals with great admiration,” Galileo marveled as he peered through his microscope — a tool that, like the telescope, he didn’t invent himself but used in such a visionary way as to render it revolutionary. The revelatory discoveries he made in the universe within the cell are increasingly proving to be as significant as his telescopic discoveries in the universe without — a significance humanity has been even slower and more reluctant to accept than his radical revision of the cosmos.

That multilayered significance is what English science writer and microbiology elucidator Ed Yong explores in I Contain Multitudes: The Microbes Within Us and a Grander View of Life (public library) — a book so fascinating and elegantly written as to be worthy of its Whitman reference, in which Yong peels the veneer of the visible to reveal the astonishing complexity of life thriving beneath and within the crude confines of our perception.

Early-twentieth-century drawing of Radiolarians, some of the first microorganisms, by Ernst Haeckel

Artist Agnes Martin memorably observed that “the best things in life happen to you when you’re alone,” but Yong offers a biopoetic counterpoint in the fact that we are never truly alone. He writes:

Even when we are alone, we are never alone. We exist in symbiosis — a wonderful term that refers to different organisms living together. Some animals are colonised by microbes while they are still unfertilised eggs; others pick up their first partners at the moment of birth. We then proceed through our lives in their presence. When we eat, so do they. When we travel, they come along. When we die, they consume us. Every one of us is a zoo in our own right — a colony enclosed within a single body. A multi-species collective. An entire world.

[…]

All zoology is really ecology. We cannot fully understand the lives of animals without understanding our microbes and our symbioses with them. And we cannot fully appreciate our own microbiome without appreciating how those of our fellow species enrich and influence their lives. We need to zoom out to the entire animal kingdom, while zooming in to see the hidden ecosystems that exist in every creature. When we look at beetles and elephants, sea urchins and earthworms, parents and friends, we see individuals, working their way through life as a bunch of cells in a single body, driven by a single brain, and operating with a single genome. This is a pleasant fiction. In fact, we are legion, each and every one of us. Always a “we” and never a “me.”

There are ample reasons to admire and appreciate microbes, well beyond the already impressive facts that they ruled “our” Earth for the vast majority of its 4.54-billion-year history and that we ourselves evolved from them. By pioneering photosynthesis, they became the first organisms capable of making their own food. They dictate the planet’s carbon, nitrogen, sulphur, and phosphorus cycles. They can survive anywhere and populate just about every corner of the Earth, from the hydrothermal vents at the bottom of the ocean to the loftiest clouds. They are so diverse that the microbes on your left hand are different from those on your right.

But perhaps most impressively — for we are, after all, the solipsistic species — they influence innumerable aspects of our biological and even psychological lives. Yong offers a cross-section of this microbial dominion:

The microbiome is infinitely more versatile than any of our familiar body parts. Your cells carry between 20,000 and 25,000 genes, but it is estimated that the microbes inside you wield around 500 times more. This genetic wealth, combined with their rapid evolution, makes them virtuosos of biochemistry, able to adapt to any possible challenge. They help to digest our food, releasing otherwise inaccessible nutrients. They produce vitamins and minerals that are missing from our diet. They break down toxins and hazardous chemicals. They protect us from disease by crowding out more dangerous microbes or killing them directly with antimicrobial chemicals. They produce substances that affect the way we smell. They are such an inevitable presence that we have outsourced surprising aspects of our lives to them. They guide the construction of our bodies, releasing molecules and signals that steer the growth of our organs. They educate our immune system, teaching it to tell friend from foe. They affect the development of the nervous system, and perhaps even influence our behaviour. They contribute to our lives in profound and wide-ranging ways; no corner of our biology is untouched. If we ignore them, we are looking at our lives through a keyhole.
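The arithmetic in that passage is worth making explicit. A back-of-the-envelope computation in Python, using only the figures quoted above (a sketch, not a calculation from the book itself):

# Figures quoted in the passage above.
human_genes_low, human_genes_high = 20_000, 25_000
microbial_multiplier = 500  # microbial genes estimated at roughly 500 times our own

print(f"{human_genes_low * microbial_multiplier:,}")   # 10,000,000
print(f"{human_genes_high * microbial_multiplier:,}")  # 12,500,000

By the book’s own numbers, then, the microbes within us wield on the order of ten million genes.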

In August, I wrote about one particularly fascinating aspect of Yong’s book — the relationship between mental health, free will, and your microbiome.

HIDDEN FIGURES

“No woman should say, ‘I am but a woman!’ But a woman! What more can you ask to be?” astronomer Maria Mitchell, who paved the way for women in American science, admonished the first class of female astronomers at Vassar in 1876. By the middle of the next century, a team of unheralded women scientists and engineers were powering space exploration at NASA’s Jet Propulsion Laboratory.

Meanwhile, across the continent and in what was practically another country, a parallel but very different revolution was taking place: In the segregated South, a growing number of black female mathematicians, scientists, and engineers were steering early space exploration and helping America win the Cold War at NASA’s Langley Research Center in Hampton, Virginia.

Long before the term “computer” came to signify the machine that dictates our lives, these remarkable women were working as human “computers” — highly skilled professional reckoners, who thought mathematically and computationally for their living and for their country. When Neil Armstrong set foot on the moon, his “giant leap for mankind” had been powered by womankind, particularly by Katherine Johnson — the “computer” who calculated Apollo 11’s launch windows and who was awarded the Presidential Medal of Freedom by President Obama at age 97 in 2015, three years after the accolade was conferred upon John Glenn, the astronaut whose flight trajectory Johnson had made possible.

Katherine Johnson at her Langley desk with a globe, or “Celestial Training Device,” 1960 (Photographs: NASA)

In Hidden Figures: The Story of the African-American Women Who Helped Win the Space Race (public library), Margot Lee Shetterly tells the untold story of these brilliant women, once on the frontlines of our cultural leaps and since sidelined by the selective collective memory we call history.

She writes:

Just as islands — isolated places with unique, rich biodiversity — have relevance for the ecosystems everywhere, so does studying seemingly isolated or overlooked people and events from the past turn up unexpected connections and insights to modern life.

Against a sobering cultural backdrop, Shetterly captures the enormous cognitive dissonance the very notion of these black female mathematicians evokes:

Before a computer became an inanimate object, and before Mission Control landed in Houston; before Sputnik changed the course of history, and before the NACA became NASA; before the Supreme Court case Brown v. Board of Education of Topeka established that separate was in fact not equal, and before the poetry of Martin Luther King Jr.’s “I Have a Dream” speech rang out over the steps of the Lincoln Memorial, Langley’s West Computers were helping America dominate aeronautics, space research, and computer technology, carving out a place for themselves as female mathematicians who were also black, black mathematicians who were also female.

Shetterly herself grew up in Hampton, which dubbed itself “Spacetown USA,” amid this archipelago of women who were her neighbors and teachers. Her father, who had built his first rocket in his early teens after seeing the Sputnik launch, was one of Langley’s African American scientists in an era when words we now shudder to hear were used instead of “African American.” Like him, the first five black women who joined Langley’s research staff in 1943 entered a segregated NASA — even though, as Shetterly points out, the space agency was among the most inclusive workplaces in the country, with more than four times the percentage of black scientists and engineers found in the workforce nationally.

Over the next forty years, the number of these trailblazing black women mushroomed to more than fifty, revealing the mycelia of a significant groundswell. Shetterly’s favorite Sunday school teacher had been one of the early computers — a retired NASA mathematician named Kathleen Land. And so Shetterly, who considers herself “as much a product of NASA as the Moon landing,” grew up believing that black women simply belonged in science and space exploration as a matter of course — after all, they populated her father’s workplace and her town, a town whose church “abounded with mathematicians.”

Embodying astronomer Vera Rubin’s wisdom on how modeling expands children’s scope of possibility, Shetterly reflects on this normalizing and rousing power of example:

Building 1236, my father’s daily destination, contained a byzantine complex of government-gray cubicles, perfumed with the grown-up smells of coffee and stale cigarette smoke. His engineering colleagues with their rumpled style and distracted manner seemed like exotic birds in a sanctuary. They gave us kids stacks of discarded 11×14 continuous-form computer paper, printed on one side with cryptic arrays of numbers, the blank side a canvas for crayon masterpieces. Women occupied many of the cubicles; they answered phones and sat in front of typewriters, but they also made hieroglyphic marks on transparent slides and conferred with my father and other men in the office on the stacks of documents that littered their desks. That so many of them were African American, many of them my grandmother’s age, struck me as simply a part of the natural order of things: growing up in Hampton, the face of science was brown like mine.

[…]

The community certainly included black English professors, like my mother, as well as black doctors and dentists, black mechanics, janitors, and contractors, black cobblers, wedding planners, real estate agents, and undertakers, several black lawyers, and a handful of black Mary Kay salespeople. As a child, however, I knew so many African Americans working in science, math, and engineering that I thought that’s just what black folks did.

Katherine Johnson, age 98 (Photograph: Annie Leibovitz for Vanity Fair)

But despite the opportunities at NASA, almost countercultural in their contrast to the norms of the time, life for these courageous and brilliant women was no idyll — persons and polities are invariably products of their time and place. Shetterly captures the sundering paradoxes of the early computers’ experience:

I interviewed Mrs. Land about the early days of Langley’s computing pool, when part of her job responsibility was knowing which bathroom was marked for “colored” employees. And less than a week later I was sitting on the couch in Katherine Johnson’s living room, under a framed American flag that had been to the Moon, listening to a ninety-three-year-old with a memory sharper than mine recall segregated buses, years of teaching and raising a family, and working out the trajectory for John Glenn’s spaceflight. I listened to Christine Darden’s stories of long years spent as a data analyst, waiting for the chance to prove herself as an engineer. Even as a professional in an integrated world, I had been the only black woman in enough drawing rooms and boardrooms to have an inkling of the chutzpah it took for an African American woman in a segregated southern workplace to tell her bosses she was sure her calculations would put a man on the Moon.

[…]

And while the black women are the most hidden of the mathematicians who worked at the NACA, the National Advisory Committee for Aeronautics, and later at NASA, they were not sitting alone in the shadows: the white women who made up the majority of Langley’s computing workforce over the years have hardly been recognized for their contributions to the agency’s long-term success. Virginia Biggins worked the Langley beat for the Daily Press newspaper, covering the space program starting in 1958. “Everyone said, ‘This is a scientist, this is an engineer,’ and it was always a man,” she said in a 1990 panel on Langley’s human computers. She never got to meet any of the women. “I just assumed they were all secretaries,” she said.

These women’s often impossible dual task of preserving their own sanity and dignity while pushing culture forward is perhaps best captured in the words of African American NASA mathematician Dorothy Vaughan:

What I changed, I could; what I couldn’t, I endured.

Dive in here.

THE GLASS UNIVERSE

Predating NASA’s women mathematicians by more than half a century was a devoted team of female amateur astronomers — “amateur” being a reflection not of their skill but of the dearth of academic accreditation available to women at the time — who came together at the Harvard Observatory at the end of the nineteenth century around an unprecedented quest to catalog the cosmos by classifying the stars and their spectra.

Decades before they were allowed to vote, these women, who came to be known as the “Harvard computers,” classified hundreds of thousands of stars according to a system they invented, which astronomers continue to use today. Their calculations became the basis for the discovery that the universe is expanding. Their spirit of selfless pursuit of truth and knowledge stands as a timeless testament to pioneering physicist Lise Meitner’s definition of the true scientist.

The “Harvard computers” at work, circa 1890.

Science historian Dava Sobel, author of Galileo’s Daughter, chronicles their unsung story and lasting legacy in The Glass Universe: How the Ladies of the Harvard Observatory Took the Measure of the Stars (public library).

Sobel, who takes on the role of rigorous reporter and storyteller bent on preserving the unvarnished historical integrity of the story, paints the backdrop:

A little piece of heaven. That was one way to look at the sheet of glass propped up in front of her. It measured about the same dimensions as a picture frame, eight inches by ten, and no thicker than a windowpane. It was coated on one side with a fine layer of photographic emulsion, which now held several thousand stars fixed in place, like tiny insects trapped in amber. One of the men had stood outside all night, guiding the telescope to capture this image, along with another dozen in the pile of glass plates that awaited her when she reached the observatory at 9 a.m. Warm and dry indoors in her long woolen dress, she threaded her way among the stars. She ascertained their positions on the dome of the sky, gauged their relative brightness, studied their light for changes over time, extracted clues to their chemical content, and occasionally made a discovery that got touted in the press. Seated all around her, another twenty women did the same.

The “computers” working at the Harvard Observatory, with Williamina Fleming (standing) supervising. (Harvard University Archives)

Among the “Harvard computers” were Antonia Maury, who had graduated from Maria Mitchell’s program at Vassar; Annie Jump Cannon, who catalogued more than 20,000 variable stars in a short period after joining the observatory; Henrietta Swan Leavitt, a Radcliffe alumna whose discoveries later became the basis for Hubble’s Law demonstrating the expansion of the universe and whose work was so valued that she was paid 30 cents an hour, five cents over the standard salary of the computers; and Cecilia Helena Payne-Gaposchkin, who became not only the first woman but the first person of any gender to earn a Ph.D. in astronomy at Radcliffe, for work done at the Harvard Observatory.

Helming the team was Williamina Fleming — a Scotswoman whom Edward Charles Pickering, the thirty-something director of the observatory, first hired as a second maid at his residence in 1879 before recognizing her mathematical talents and assigning her the role of part-time computer.

Dive into their story here.

WOMEN IN SCIENCE

For a lighter companion to the two books above, one aimed at younger readers, artist and author Rachel Ignotofsky offers Women in Science: 50 Fearless Pioneers Who Changed the World (public library) — an illustrated encyclopedia of fifty influential and inspiring women in STEM since long before we acronymized the conquest of curiosity through discovery and invention, ranging from the ancient astronomer, mathematician, and philosopher Hypatia in the fourth century to Iranian mathematician Maryam Mirzakhani, born in 1977.

True as it may be that being an outsider is an advantage in science and life, modeling furnishes young hearts with the assurance that people who are in some way like them can belong and shine in fields composed primarily of people drastically unlike them. It is this ethos that Ignotofsky embraces by deliberately ensuring that the scientists included come from a vast variety of ethnic backgrounds, nationalities, orientations, and cultural traditions.

There are the expected trailblazers who have stood as beacons of possibility for decades, even centuries: Ada Lovelace, who became the world’s first de facto computer programmer; Marie Curie, the first woman to win a Nobel Prize and to this day the only person awarded a Nobel in two different sciences; Jocelyn Bell Burnell, who once elicited the exclamation “Miss Bell, you have made the greatest astronomical discovery of the twentieth century!” (and was subsequently excluded from the Nobel she deserved); Maria Sibylla Merian, the 17th-century German naturalist whose studies of butterfly metamorphosis revolutionized entomology and natural history illustration; and Jane Goodall — another pioneer who turned her childhood dream into reality against tremendous odds and went on to do more for the understanding of nonhuman consciousness than any scientist before or since.

Take a closer look here.

* * *

On December 2, I joined Science Friday alongside Scientific American editor Lee Billings to discuss some of our favorite science books of 2016.

Step into the cultural time machine with selections for the best science books of 2015, 2014, 2013, 2012, and 2011.

Genes and the Holy G: Siddhartha Mukherjee on the Dark Cultural History of IQ and Why We Can’t Measure Intelligence

“If the history of medical genetics teaches us one lesson, it is to be wary of precisely such slips between biology and culture… Genes cannot tell us how to categorize or comprehend human diversity; environments can, cultures can, geographies can, histories can.”

Intelligence, Simone de Beauvoir argued, is not a ready-made quality “but a way of casting oneself into the world and of disclosing being.” Like the rest of de Beauvoir’s socially wakeful ideas, this was a courageously countercultural proposition — she lived in the heyday of the IQ craze, which sought to codify into static and measurable components the complex and dynamic mode of being we call “intelligence.” Even today, as we contemplate the nebulous future of artificial intelligence, we find ourselves stymied by the same core problem — how are we to synthesize and engineer intelligence if we are unable to even define it in its full dimension?

How the emergence of IQ tests contracted our understanding of intelligence rather than expanding it and what we can do to transcend their perilous cultural legacy is what practicing physician, research scientist, and Pulitzer-winning author Siddhartha Mukherjee explores throughout The Gene: An Intimate History (public library) — a rigorously researched, beautifully written detective story about the genetic components of what we experience as the self, rooted in Mukherjee’s own painful family history of mental illness and radiating a larger inquiry into how genetics illuminates the future of our species.

Siddhartha Mukherjee (Photograph: Deborah Feingold)

A crucial agent in our limiting definition of intelligence, which has a dark heritage in nineteenth-century biometrics and eugenics, was the British psychologist and statistician Charles Spearman, who noticed that an individual’s performances on tests assessing very different mental abilities tended to correlate strongly. He surmised that human intelligence is a function not of specific knowledge but of the individual’s ability to manipulate abstract knowledge across a variety of domains. Spearman called this ability “general intelligence,” shorthanded g. Mukherjee chronicles the monumental and rather grim impact of this theory on modern society:

By the early twentieth century, g had caught the imagination of the public. First, it captivated early eugenicists. In 1916, the Stanford psychologist Lewis Terman, an avid supporter of the American eugenics movement, created a standardized test to rapidly and quantitatively assess general intelligence, hoping to use the test to select more intelligent humans for eugenic breeding. Recognizing that this measurement varied with age during childhood development, Terman advocated a new metric to quantify age-specific intelligence. If a subject’s “mental age” was the same as his or her physical age, their “intelligence quotient,” or IQ, was defined as exactly 100. If a subject lagged in mental age compared to physical age, the IQ was less than a hundred; if she was more mentally advanced, she was assigned an IQ above 100.

A numerical measure of intelligence was also particularly suited to the demands of the First and Second World Wars, during which recruits had to be assigned to wartime tasks requiring diverse skills based on rapid, quantitative assessments. When veterans returned to civilian life after the wars, they found their lives dominated by intelligence testing.
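Terman’s “intelligence quotient” is, mechanically, nothing more than a ratio scaled to 100. A minimal Python sketch of the arithmetic as Mukherjee describes it (the function name is my own, purely illustrative):

def ratio_iq(mental_age: float, chronological_age: float) -> float:
    # Terman-style ratio IQ: mental age over physical age, scaled so that
    # a subject whose mental age matches her physical age scores exactly 100.
    return 100 * mental_age / chronological_age

print(ratio_iq(10, 10))  # mental age matches physical age -> 100.0
print(ratio_iq(8, 10))   # mental age lags behind -> 80.0
print(ratio_iq(12, 10))  # mental age runs ahead -> 120.0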

Illustration by Emily Hughes from Wild

Because categories, measurements, and labels help us navigate the world and, in Umberto Eco’s undying words, “make infinity comprehensible,” IQ metrics enchanted the popular imagination with the convenient illusion of neat categorization. Like any fad that offers a shortcut for something difficult to achieve, they spread like wildfire across the societal landscape. Mukherjee writes:

By the early 1940s, such tests had become accepted as an inherent part of American culture. IQ tests were used to rank job applicants, place children in school, and recruit agents for the Secret Service. In the 1950s, Americans commonly listed their IQs on their résumés, submitted the results of a test for a job application, or even chose their spouses based on the test. IQ scores were pinned on the babies who were on display in Better Babies contests (although how IQ was measured in a two-year-old remained mysterious).

These rhetorical and historical shifts in the concept of intelligence are worth noting, for we will return to them in a few paragraphs. General intelligence (g) originated as a statistical correlation between tests given under particular circumstances to particular individuals. It morphed into the notion of “general intelligence” because of a hypothesis concerning the nature of human knowledge acquisition. And it was codified into “IQ” to serve the particular exigencies of war. In a cultural sense, the definition of g was an exquisitely self-reinforcing phenomenon: those who possessed it, anointed as “intelligent” and given the arbitration of the quality, had every incentive in the world to propagate its definition.

With an eye to evolutionary biologist Richard Dawkins’s culture-shaping coinage of the word “meme” (“Just as genes propagate themselves in the gene pool by leaping from body to body via sperms or eggs,” Dawkins wrote in his 1976 classic The Selfish Gene, “so memes propagate themselves in the meme pool by leaping from brain to brain”), Mukherjee argues that g became a self-propagating unit worthy of being thought of as “selfish g.” He writes:

It takes counterculture to counter culture — and it was only inevitable, perhaps, that the sweeping political movements that gripped America in the 1960s and 1970s would shake the notions of general intelligence and IQ by their very roots. As the civil rights movement and feminism highlighted chronic political and social inequalities in America, it became evident that biological and psychological features were not just inborn but likely to be deeply influenced by context and environment. The dogma of a single form of intelligence was also challenged by scientific evidence.

Illustration by Vladimir Radunsky for On a Beam of Light: A Story of Albert Einstein by Jennifer Berne

Along came social scientists like Howard Gardner, whose germinal 1983 Theory of Multiple Intelligences set out to upend the tyranny of “selfish g” by demonstrating that human acumen exists along varied dimensions, subtler and more context-specific, not necessarily correlated with one another — those who score high on logical/mathematical intelligence, for instance, may not score high on bodily/kinesthetic intelligence, and vice versa. Mukherjee considers the layered implications for g and its active agents:

Is g heritable? In a certain sense, yes. In the 1950s, a series of reports suggested a strong genetic component. Of these, twin studies were the most definitive. When identical twins who had been reared together — i.e., with shared genes and shared environments — were tested in the early fifties, psychologists had found a striking degree of concordance in their IQs, with a correlation value of 0.86. In the late eighties, when identical twins who were separated at birth and reared separately were tested, the correlation fell to 0.74 — still a striking number.

But the heritability of a trait, no matter how strong, may be the result of multiple genes, each exerting a relatively minor effect. If that was the case, identical twins would show strong correlations in g, but parents and children would be far less concordant. IQ followed this pattern. The correlation between parents and children living together, for instance, fell to 0.42. With parents and children living apart, the correlation collapsed to 0.22. Whatever the IQ test was measuring, it was a heritable factor, but one also influenced by many genes and possibly strongly modified by environment — part nature and part nurture.
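
These figures fit the polygenic picture remarkably well. As a rough back-of-the-envelope check (mine, not Mukherjee’s), assume a purely additive model and set environment aside: a parent and child share, on average, half of the relevant gene variants, while identical twins share all of them, so the parent-child correlation should be roughly half the identical-twin value: 0.86 ÷ 2 ≈ 0.43, strikingly close to the observed 0.42. The further collapse to 0.22 when parent and child live apart is then largely the signature of environment.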

The most logical conclusion from these facts is that while some combination of genes and environments can strongly influence g, this combination will rarely be passed, intact, from parents to their children. Mendel’s laws virtually guarantee that the particular permutation of genes will scatter apart in every generation. And environmental interactions are so difficult to capture and predict that they cannot be reproduced over time. Intelligence, in short, is heritable (i.e., influenced by genes), but not easily inheritable (i.e., moved down intact from one generation to the next).

And yet the quest for the mythic holy grail of general intelligence persisted and took us down paths not only questionable but morally abhorrent by our present standards. In the 1980s, scientists conducted numerous studies demonstrating a discrepancy in IQ across the races, with white children scoring higher than their black peers. While the controversial results initially provided ample fodder for racists, they also gave scientists incentive to do what scientists must — question the validity of their own methods. In a testament to trailblazing philosopher Susanne Langer’s assertion that the way we frame our questions shapes our answers, it soon became clear that these IQ tests weren’t measuring the mythic g but, rather, reflecting the effects of contextual circumstances like poverty, illness, hunger, and educational opportunity. Mukherjee explains:

It is easy to demonstrate an analogous effect in a lab: If you raise two plant strains — one tall and one short — in undernourished circumstances, then both plants grow short regardless of intrinsic genetic drive. In contrast, when nutrients are no longer limiting, the tall plant grows to its full height. Whether genes or environment — nature or nurture — dominates in influence depends on context. When environments are constraining, they exert a disproportionate influence. When the constraints are removed, genes become ascendant.

[…]

If the history of medical genetics teaches us one lesson, it is to be wary of precisely such slips between biology and culture. Humans, we now know, are largely similar in genetic terms — but with enough variation within us to represent true diversity. Or, perhaps more accurately, we are culturally or biologically inclined to magnify variations, even if they are minor in the larger scheme of the genome. Tests that are explicitly designed to capture variances in abilities will likely capture variances in abilities — and these variations may well track along racial lines. But to call the score in such a test “intelligence,” especially when the score is uniquely sensitive to the configuration of the test, is to insult the very quality it sets out to measure.

Genes cannot tell us how to categorize or comprehend human diversity; environments can, cultures can, geographies can, histories can. Our language sputters in its attempt to capture this slip. When a genetic variation is statistically the most common, we call it normal — a word that implies not just superior statistical representation but qualitative or even moral superiority… When the variation is rare, it is termed a mutant — a word that implies not just statistical uncommonness, but qualitative inferiority, or even moral repugnance.

And so it goes, interposing linguistic discrimination on genetic variation, mixing biology and desire.

Illustration by Lisbeth Zwerger for a special edition of the fairy tales of the Brothers Grimm

Intelligence, it turns out, is as integrated and indivisible as what we call identity, which the great Lebanese-born French writer Amin Maalouf likened to an intricate pattern drawn on a tightly stretched drumhead. “Touch just one part of it, just one allegiance,” he wrote, “and the whole person will react, the whole drum will sound.” Indeed, it is to identity that Mukherjee points as an object of inquiry far apter than intelligence in understanding personhood. In a passage emblematic of the elegance with which he fuses science, cultural history, and lyrical prose, Mukherjee writes:

Like the English novel, or the face, say, the human genome can be lumped or split in a million different ways. But whether to split or lump, to categorize or synthesize, is a choice. When a distinct, heritable biological feature, such as a genetic illness (e.g., sickle-cell anemia), is the ascendant concern, then examining the genome to identify the locus of that feature makes absolute sense. The narrower the definition of the heritable feature or the trait, the more likely we will find a genetic locus for that trait, and the more likely that the trait will segregate within some human subpopulation (Ashkenazi Jews in the case of Tay-Sachs disease, or Afro-Caribbeans for sickle-cell anemia). There’s a reason that marathon running, for instance, is becoming a genetic sport: runners from Kenya and Ethiopia, a narrow eastern wedge of one continent, dominate the race not just because of talent and training, but also because the marathon is a narrowly defined test for a certain form of extreme fortitude. Genes that enable this fortitude (e.g., particular combinations of gene variants that produce distinct forms of anatomy, physiology, and metabolism) will be naturally selected.

Conversely, the more we widen the definition of a feature or trait (say, intelligence, or temperament), the less likely that the trait will correlate with single genes — and, by extension, with races, tribes, or subpopulations. Intelligence and temperament are not marathon races: there are no fixed criteria for success, no start or finish lines — and running sideways or backward might secure victory. The narrowness, or breadth, of the definition of a feature is, in fact, a question of identity — i.e., how we define, categorize, and understand humans (ourselves) in a cultural, social, and political sense. The crucial missing element in our blurred conversation on the definition of race, then, is a conversation on the definition of identity.

Complement this particular portion of the wholly fascinating The Gene with young Barack Obama on identity and the search for a coherent self and Mark Twain on intelligence vs. morality, then revisit Schopenhauer on what makes a genius.

Finding Poetry in Other Lives: James Baldwin on Shakespeare, Language as a Tool of Love, and the Poet’s Responsibility to a Divided Society

“The greatest poet in the English language found his poetry where poetry is found: in the lives of the people. He could have done this only through love — by knowing… that whatever was happening to anyone was happening to him.”

“You have to tell your own story simultaneously as you hear and respond to the stories of others,” the poet Elizabeth Alexander wrote in contemplating power, possibility, and language as a tool of transformation. A year later, she became the fourth poet in history to read at a U.S. presidential inauguration when she welcomed Barack Obama to the presidency with her poem “Praise Song for the Day.”

But where do we turn when the day is unpraisable? When we can’t find the kinds of words that Ursula K. Le Guin celebrated as able to “transform both speaker and hearer, [to] feed understanding or emotion back and forth and amplify it”? When we can no longer respond but merely react to the stories of others, and can no longer sing?

Leonard Cohen, the great poet of redemption, called for “a revelation in the heart rather than a confrontation or a call-to-arms or a defense” in considering what is needed for healing the divides that rip democracy asunder. How to do that is what James Baldwin (August 2, 1924–December 1, 1987) explored a generation earlier in a spectacular and acutely timely 1964 essay titled “Why I Stopped Hating Shakespeare,” found in The Cross of Redemption: Uncollected Writings (public library) — the indispensable anthology that gave us Baldwin on the artist’s role in society.

James Baldwin with Shakespeare, 1969 (Photograph: Allan Warren)

Baldwin writes:

Every writer in the English language, I should imagine, has at some point hated Shakespeare, has turned away from that monstrous achievement with a kind of sick envy. In my most anti-English days I condemned him as a chauvinist (“this England” indeed!) and because I felt it so bitterly anomalous that a black man should be forced to deal with the English language at all — should be forced to assault the English language in order to be able to speak — I condemned him as one of the authors and architects of my oppression.

Leaning on the scale of life-sobered hindsight with which one weighs the hubrises of one’s youth, Baldwin notes that he “was young and missed the point entirely.” He recounts the moment in which the point revealed itself to him:

I still remember my shock when I finally heard these lines from the murder scene in Julius Caesar. The assassins are washing their hands in Caesar’s blood. Cassius says:

Stoop then, and wash. — How many ages hence
Shall this our lofty scene be acted over,
In states unborn and accents yet unknown!

In a passage of piercing prescience given the political situation in America today, Baldwin reflects on the revelation of this verse:

What I suddenly heard, for the first time, was manifold. It was the voice of lonely, dedicated, deluded Cassius, whose life had never been real for me before — I suddenly seemed to know what this moment meant to him. But beneath and beyond that voice I also heard a note yet more rigorous and impersonal — and contemporary: that “lofty scene,” in all its blood and necessary folly, its blind and necessary pain, was thrown into a perspective which has never left my mind. Just so, indeed, is the heedless State overthrown by men, who, in order to overthrow it, have had to achieve a desperate single-mindedness. And this single-mindedness, which we think of (why?) as ennobling, also operates, and much more surely, to distort and diminish a man — to distort and diminish us all, even, or perhaps especially, those whose needs and whose energy made the overthrow of the State inevitable, necessary, and just.

[…]

Once one has begun to suspect this much about the world — once one has begun to suspect, that is, that one is not, and never will be, innocent, for the reason that no one is — some of the self-protective veils between oneself and reality begin to fall away.

Art by Wendy MacNaughton for Brain Pickings

With an eye to the two “mighty witnesses” of his life in language — his black ancestors, “who evolved the sorrow songs, the blues, and jazz, and created an entirely new idiom in an overwhelmingly hostile place,” and Shakespeare, whom he calls “the last bawdy writer in the English language” — Baldwin considers how language can become a tool of love and a curative force for our alienation from the world’s otherness:

My quarrel with the English language has been that the language reflected none of my experience. But now I began to see the matter in quite another way. If the language was not my own, it might be the fault of the language; but it might also be my fault. Perhaps the language was not my own because I had never attempted to use it, had only learned to imitate it. If this were so, then it might be made to bear the burden of my experience if I could find the stamina to challenge it, and me, to such a test.

[…]

What I began to see — especially since, as I say, I was living and speaking in French — is that it is experience which shapes a language; and it is language which controls an experience.

In a passage that calls to mind Leonard Cohen’s notion of “a revelation in the heart,” Baldwin adds:

My relationship, then, to the language of Shakespeare revealed itself as nothing less than my relationship to myself and my past. Under this light, this revelation, both myself and my past began slowly to open, perhaps the way a flower opens at morning, but more probably the way an atrophied muscle begins to function, or frozen fingers to thaw.

With this, Baldwin returns to Shakespeare as a lens on the ultimate purpose of the poet as a vehicle for love and mutual understanding in a society woven of otherness — a purpose all the more vital and vitalizing in our troubled and troubling times:

The greatest poet in the English language found his poetry where poetry is found: in the lives of the people. He could have done this only through love — by knowing, which is not the same thing as understanding, that whatever was happening to anyone was happening to him. It is said that his time was easier than ours, but I doubt it — no time can be easy if one is living through it. I think it is simply that he walked his streets and saw them, and tried not to lie about what he saw: his public streets and his private streets, which are always so mysteriously and inexorably connected; but he trusted that connection. And, though I, and many of us, have bitterly bewailed (and will again) the lot of an American writer — to be part of a people who have ears to hear and hear not, who have eyes to see and see not — I am sure that Shakespeare did the same. Only, he saw, as I think we must, that the people who produce the poet are not responsible to him: he is responsible to them.

That is why he is called a poet. And his responsibility, which is also his joy and his strength and his life, is to defeat all labels and complicate all battles by insisting on the human riddle, to bear witness, as long as breath is in him, to that mighty, unnameable, transfiguring force which lives in the soul of man, and to aspire to do his work so well that when the breath has left him, the people — all people! — who search in the rubble for a sign or a witness will be able to find him there.

Complement this particular portion of Baldwin’s increasingly timely and necessary The Cross of Redemption with Carl Sagan on moving beyond us vs. them and Jeanette Winterson on language and how art creates a sanctified space for the human spirit, then revisit Baldwin on the artist’s responsibility to her society, what it means to be truly empowered, and his forgotten, astonishingly timely conversations with Margaret Mead about identity, race, power, and forgiveness.
