The Marginalian

The 13 Best Books of 2013: The Definitive Annual Reading List of Overall Favorites

All gratifying things must come to an end: The season’s subjective selection of best-of reading lists — which covered writing and creativity, photography, psychology and philosophy, art and design, history and biography, science and technology, children’s literature, and pets and animals — comes full-circle with this final omnibus of the year’s most indiscriminately wonderful reads, a set of overall favorites that spill across multiple disciplines, cross-pollinate subjects, and defy categorization in the most stimulating of ways. (Revisit last year’s selection here.)

1. ON LOOKING

“How we spend our days,” Annie Dillard wrote in her timelessly beautiful meditation on presence over productivity, “is, of course, how we spend our lives.” And nowhere do we fail at the art of presence more miserably and more tragically than in urban life — in the city, high on the cult of productivity, where we float past each other, past the buildings and trees and the little boy in the purple pants, past life itself, cut off from the breathing of the world by iPhone earbuds and solipsism. And yet: “The art of seeing has to be learned,” Marguerite Duras reverberates — and it can be learned, as cognitive scientist Alexandra Horowitz invites us to believe in her breathlessly wonderful On Looking: Eleven Walks with Expert Eyes (public library), also among the year’s best psychology books — a record of her quest to walk around a city block with eleven different “experts,” from an artist to a geologist to a dog, and emerge with fresh eyes mesmerized by the previously unseen fascinations of a familiar world. It is undoubtedly one of the most stimulating books of the year, if not the decade, and the most enchanting thing I’ve read in ages. In a way, it’s the opposite but equally delightful mirror image of Christoph Niemann’s Abstract City — a concrete, immersive examination of urbanity — blending the mindfulness of Sherlock Holmes with the expansive sensitivity of Thoreau.

Horowitz begins by pointing our attention to the incompleteness of our experience of what we conveniently call “reality”:

Right now, you are missing the vast majority of what is happening around you. You are missing the events unfolding in your body, in the distance, and right in front of you.

By marshaling your attention to these words, helpfully framed in a distinct border of white, you are ignoring an unthinkably large amount of information that continues to bombard all of your senses: the hum of the fluorescent lights, the ambient noise in a large room, the places your chair presses against your legs or back, your tongue touching the roof of your mouth, the tension you are holding in your shoulders or jaw, the map of the cool and warm places on your body, the constant hum of traffic or a distant lawn-mower, the blurred view of your own shoulders and torso in your peripheral vision, a chirp of a bug or whine of a kitchen appliance.

This adaptive ignorance, she argues, is there for a reason — we celebrate it as “concentration” and welcome its way of easing our cognitive overload by allowing us to conserve our precious mental resources only for the stimuli of immediate and vital importance, and to dismiss or entirely miss all else. (“Attention is an intentional, unapologetic discriminator,” Horowitz tells us. “It asks what is relevant right now, and gears us up to notice only that.”) But while this might make us more efficient in our goal-oriented day-to-day, it also makes us inhabit a largely unlived — and unremembered — life, day in and day out.

For Horowitz, the awakening to this incredible, invisible backdrop of life came thanks to Pumpernickel, her “curly haired, sage mixed breed” (who also inspired Horowitz’s first book, the excellent Inside of a Dog: What Dogs See, Smell, and Know), as she found herself taking countless walks around the block, becoming more and more aware of the dramatically different experiences she and her canine companion were having along the exact same route:

Minor clashes between my dog’s preferences as to where and how a walk should proceed and my own indicated that I was experiencing almost an entirely different block than my dog. I was paying so little attention to most of what was right before us that I had become a sleepwalker on the sidewalk. What I saw and attended to was exactly what I expected to see; what my dog showed me was that my attention invited along attention’s companion: inattention to everything else.

The book was her answer to the disconnect, an effort to “attend to that inattention.” It is not, she warns us, “about how to bring more focus to your reading of Tolstoy or how to listen more carefully to your spouse.” Rather, it is an invitation to the art of observation:

Together, we became investigators of the ordinary, considering the block — the street and everything on it—as a living being that could be observed.

In this way, the familiar becomes unfamiliar, and the old the new.

Horowitz’s approach is based on two osmotic human tendencies: our shared capacity to truly see what is in front of us, despite our conditioned concentration that obscures it, and the power of individual bias in perception — or what we call “expertise,” acquired by passion or training or both — in bringing attention to elements that elude the rest of us. What follows is a whirlwind of endlessly captivating exercises in attentive bias as Horowitz, with her archetypal New Yorker’s “special fascination with the humming life-form that is an urban street,” and her diverse companions take to the city.

Art by Maira Kalman from ‘On Looking: Eleven Walks with Expert Eyes’

First, she takes a walk all by herself, trying to note everything observable, and we quickly realize that besides her deliciously ravenous intellectual curiosity, Horowitz is a rare magician with language. (“The walkers trod silently; the dogs said nothing. The only sound was the hum of air conditioners,” she beholds her own block; passing a pile of trash bags graced by a stray Q-tip, she ponders parenthetically, “how does a Q-tip escape?”; turning her final corner, she gazes at the entrance of a mansion and “its pair of stone lions waiting patiently for royalty that never arrives.” Stunning.)

But as soon as she joins her experts, Horowitz is faced with the grimacing awareness that despite her best, most Sherlockian efforts, she was “missing pretty much everything.” She arrives at a newfound, profound understanding of what William James meant when he wrote, “My experience is what I agree to attend to. Only those items which I notice shape my mind”:

I would find myself at once alarmed, delighted, and humbled at the limitations of my ordinary looking. My consolation is that this deficiency of mine is quite human. We see, but we do not see: we use our eyes, but our gaze is glancing, frivolously considering its object. We see the signs, but not their meanings. We are not blinded, but we have blinders.

Originally featured in August, with a closer look at the expert insights. For another peek at this gem, which is easily among my top three favorite books of the past decade, learn how to do the step-and-slide.

2. ADVICE TO LITTLE GIRLS

In 1865, when he was only thirty, Mark Twain penned a playful short story mischievously encouraging girls to think independently rather than blindly obey rules and social mores. In the summer of 2011, I chanced upon and fell in love with a lovely Italian edition of this little-known gem with Victorian-scrapbook-inspired artwork by celebrated Russian-born children’s book illustrator Vladimir Radunsky. I knew the book had to come to life in English, so I partnered with the wonderful Claudia Zoe Bedrick of Brooklyn-based indie publishing house Enchanted Lion, maker of extraordinarily beautiful picture-books, and we spent the next two years bringing Advice to Little Girls (public library) to life in America — a true labor-of-love project full of so much delight for readers of all ages. (And how joyous to learn that it was also selected among NPR’s best books of 2013!)

While frolicsome in tone and full of wink, the story is colored with subtle hues of grown-up philosophy on the human condition, exploring all the deft ways in which we creatively rationalize our wrongdoing and reconcile the good and evil we each embody.

Good little girls ought not to make mouths at their teachers for every trifling offense. This retaliation should only be resorted to under peculiarly aggravated circumstances.

If you have nothing but a rag-doll stuffed with sawdust, while one of your more fortunate little playmates has a costly China one, you should treat her with a show of kindness nevertheless. And you ought not to attempt to make a forcible swap with her unless your conscience would justify you in it, and you know you are able to do it.

One can’t help but wonder whether this particular bit may have in part inspired the irreverent 1964 anthology Beastly Boys and Ghastly Girls and its mischievous advice on brother-sister relations:

If at any time you find it necessary to correct your brother, do not correct him with mud — never, on any account, throw mud at him, because it will spoil his clothes. It is better to scald him a little, for then you obtain desirable results. You secure his immediate attention to the lessons you are inculcating, and at the same time your hot water will have a tendency to remove impurities from his person, and possibly the skin, in spots.

If your mother tells you to do a thing, it is wrong to reply that you won’t. It is better and more becoming to intimate that you will do as she bids you, and then afterward act quietly in the matter according to the dictates of your best judgment.

Good little girls always show marked deference for the aged. You ought never to ‘sass’ old people unless they ‘sass’ you first.

Originally featured in April — see more spreads, as well as the story behind the project, here.

3. THIS EXPLAINS EVERYTHING

Every year since 1998, intellectual impresario and Edge editor John Brockman has been posing a single grand question to some of our time’s greatest thinkers across a wide spectrum of disciplines, then collecting the answers in an annual anthology. Last year’s answers to the question “What scientific concept will improve everybody’s cognitive toolkit?” were released in This Will Make You Smarter: New Scientific Concepts to Improve Your Thinking, one of the year’s best psychology and philosophy books.

In 2012, the question Brockman posed, proposed by none other than Steven Pinker, was “What is your favorite deep, elegant, or beautiful explanation?” The answers, representing an eclectic mix of 192 (alas, overwhelmingly male) minds spanning psychology, quantum physics, social science, political theory, philosophy, and more, are collected in the edited compendium This Explains Everything: Deep, Beautiful, and Elegant Theories of How the World Works (UK; public library) and are also available online.

In the introduction preceding the micro-essays, Brockman frames the question and its ultimate objective, adding to history’s most timeless definitions of science:

The ideas presented on Edge are speculative; they represent the frontiers in such areas as evolutionary biology, genetics, computer science, neurophysiology, psychology, cosmology, and physics. Emerging out of these contributions is a new natural philosophy, new ways of understanding physical systems, new ways of thinking that call into question many of our basic assumptions.

[…]

Perhaps the greatest pleasure in science comes from theories that derive the solution to some deep puzzle from a small set of simple principles in a surprising way. These explanations are called ‘beautiful’ or ‘elegant.’

[…]

The contributions presented here embrace scientific thinking in the broadest sense: as the most reliable way of gaining knowledge about anything — including such fields of inquiry as philosophy, mathematics, economics, history, language, and human behavior. The common thread is that a simple and nonobvious idea is proposed as the explanation of a diverse and complicated set of phenomena.

Puffer fish with Akule by photographer Wayne Levin.

Stanford neuroscientist Robert Sapolsky, eloquent as ever, marvels at the wisdom of the crowd and the emergence of swarm intelligence:

Observe a single ant, and it doesn’t make much sense, walking in one direction, suddenly careening in another for no obvious reason, doubling back on itself. Thoroughly unpredictable.

The same happens with two ants, a handful of ants. But a colony of ants makes fantastic sense. Specialized jobs, efficient means of exploiting new food sources, complex underground nests with temperature regulated within a few degrees. And critically, there’s no blueprint or central source of command—each individual ant has algorithms for its behaviors. But this is not wisdom of the crowd, where a bunch of reasonably informed individuals outperform a single expert. The ants aren’t reasonably informed about the big picture. Instead, the behavior algorithms of each ant consist of a few simple rules for interacting with the local environment and local ants. And out of this emerges a highly efficient colony.

Ant colonies excel at generating trails that connect locations in the shortest possible way, accomplished with simple rules about when to lay down a pheromone trail and what to do when encountering someone else’s trail—approximations of optimal solutions to the Traveling Salesman problem. This has useful applications. In “ant-based routing,” simulations using virtual ants with similar rules can generate optimal ways of connecting the nodes in a network, something of great interest to telecommunications companies. It applies to the developing brain, which must wire up vast numbers of neurons with vaster numbers of connections without constructing millions of miles of connecting axons. And migrating fetal neurons generate an efficient solution with a different version of ant-based routing.

A wonderful example is how local rules about attraction and repulsion (i.e., positive and negative charges) allow simple molecules in an organic soup to occasionally form more complex ones. Life may have originated this way without the requirement of bolts of lightning to catalyze the formation of complex molecules.

And why is self-organization so beautiful to my atheistic self? Because if complex, adaptive systems don’t require a blueprint, they don’t require a blueprint maker. If they don’t require lightning bolts, they don’t require Someone hurling lightning bolts.
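
The local-rules logic Sapolsky describes is concrete enough to simulate in a few lines. Below is a minimal, hypothetical sketch (mine, not the book’s) of pheromone-based route choice between two trails: ants pick a route in proportion to trail strength, every trail evaporates a little each step, and shorter trips deposit more pheromone per edge, so the short path comes to dominate without any central command. The toy graph, rates, and names (GRAPH, ROUTES, one_ant_trip) are invented for illustration.

```python
import random

# Hypothetical toy graph: two routes from nest to food, one twice as long.
GRAPH = {
    ("nest", "A"): 1.0, ("A", "food"): 1.0,   # short route, total length 2
    ("nest", "B"): 2.0, ("B", "food"): 2.0,   # long route, total length 4
}
ROUTES = [
    [("nest", "A"), ("A", "food")],
    [("nest", "B"), ("B", "food")],
]
pheromone = {edge: 1.0 for edge in GRAPH}  # every trail starts out equal

def one_ant_trip():
    """One ant picks a route with probability proportional to the
    pheromone on its first edge: a purely local decision."""
    weights = [pheromone[route[0]] for route in ROUTES]
    route = random.choices(ROUTES, weights=weights, k=1)[0]
    return route, sum(GRAPH[edge] for edge in route)

for _ in range(2000):
    route, length = one_ant_trip()
    for edge in pheromone:            # evaporation: unused trails fade
        pheromone[edge] *= 0.99
    for edge in route:                # deposit: shorter trips reinforce more
        pheromone[edge] += 1.0 / length

# After many trips, the short nest -> A -> food trail dominates.
for edge, strength in sorted(pheromone.items(), key=lambda kv: -kv[1]):
    print(edge, round(strength, 2))
```

Run it and the two edges of the short route end up holding several times the pheromone of the long route’s edges, a colony-level “decision” no individual ant ever made.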

Developmental psychologist Howard Gardner, who famously developed the seminal theory of multiple intelligences, echoes Anaïs Nin in advocating for the role of the individual and Susan Sontag in stressing the impact of individual acts on collective fate. His answer, arguing for the importance of human beings, comes as a welcome antidote to a question that suffers the danger of being inherently reductionist:

In a planet occupied now by seven billion inhabitants, I am amazed by the difference that one human being can make. Think of classical music without Mozart or Stravinsky; of painting without Caravaggio, Picasso or Pollock; of drama without Shakespeare or Beckett. Think of the incredible contributions of Michelangelo or Leonardo, or, in recent times, the outpouring of deep feeling at the death of Steve Jobs (or, for that matter, Michael Jackson or Princess Diana). Think of human values in the absence of Moses or Christ.

[…]

Despite the laudatory efforts of scientists to ferret out patterns in human behavior, I continue to be struck by the impact of single individuals, or of small groups, working against the odds. As scholars, we cannot and should not sweep these instances under the investigative rug. We should bear in mind anthropologist Margaret Mead’s famous injunction: ‘Never doubt that a small group of thoughtful committed citizens can change the world. It is the only thing that ever has.’

Uber-curator Hans Ulrich Obrist, who also contributed to last year’s volume, considers the parallel role of patterns and chance in the works of iconic composer John Cage and painter Gerhard Richter, and the role of uncertainty in the creative process:

In art, the title of a work can often be its first explanation. And in this context I am thinking especially of the titles of Gerhard Richter. In 2006, when I visited Richter in his studio in Cologne, he had just finished a group of six corresponding abstract paintings which he gave the title Cage.

There are many relations between Richter’s painting and the compositions of John Cage. In a book about the Cage series, Robert Storr has traced them from Richter’s attendance of a Cage performance at the Festum Fluxorum Fluxus in Düsseldorf in 1963 to analogies in their artistic processes. Cage has often applied chance procedures in his compositions, notably with the use of the I Ching. Richter in his abstract paintings also intentionally allows effects of chance. In these paintings, he applies the oil paint on the canvas by means of a large squeegee. He selects the colors on the squeegee, but the factual trace that the paint leaves on the canvas is to a large extent the outcome of chance.

[…]

Richter’s concise title, Cage, can be unfolded into an extensive interpretation of these abstract paintings (and of other works)—but, one can say, the short form already contains everything. The title, like an explanation of a phenomenon, unlocks the works, describing their relation to one of the most important cultural figures of the twentieth century, John Cage, who shares with Richter the great themes of chance and uncertainty.

Writer, artist, and designer Douglas Coupland, whose biography of Marshall McLuhan remains indispensable, offers a lyrical meditation on the peculiar odds behind coincidences and déjà vus:

I take comfort in the fact that there are two human moments that seem to be doled out equally and democratically within the human condition—and that there is no satisfying ultimate explanation for either. One is coincidence, the other is déjà vu. It doesn’t matter if you’re Queen Elizabeth, one of the thirty-three miners rescued in Chile, a South Korean housewife or a migrant herder in Zimbabwe—in the span of 365 days you will pretty much have two déjà vus as well as one coincidence that makes you stop and say, “Wow, that was a coincidence.”

The thing about coincidence is that when you imagine the umpteen trillions of coincidences that can happen at any given moment, the fact is, that in practice, coincidences almost never do occur. Coincidences are actually so rare that when they do occur they are, in fact, memorable. This suggests to me that the universe is designed to ward off coincidence whenever possible—the universe hates coincidence—I don’t know why—it just seems to be true. So when a coincidence happens, that coincidence had to work awfully hard to escape the system. There’s a message there. What is it? Look. Look harder. Mathematicians perhaps have a theorem for this, and if they do, it might, by default, be a theorem for something larger than what they think it is.

What’s both eerie and interesting to me about déjà vus is that they occur almost like metronomes throughout our lives, about one every six months, a poetic timekeeping device that, at the very least, reminds us we are alive. I can safely assume that my thirteen-year-old niece, Stephen Hawking and someone working in a Beijing luggage-making factory each experience two déjà vus a year. Not one. Not three. Two.

The underlying biodynamics of déjà vus is probably ascribable to some sort of tingling neurons in a certain part of the brain, yet this doesn’t tell us why they exist. They seem to me to be a signal from a larger point of view that wants to remind us that our lives are distinct, that they have meaning, and that they occur throughout a span of time. We are important, and what makes us valuable to the universe is our sentience and our curse and blessing of perpetual self-awareness.

Originally featured in January — read more here.

4. TIME WARPED

Given my soft spot for famous diaries, it should come as no surprise that I keep one myself. Perhaps the greatest gift of the practice has been the daily habit of reading what I had written on that day a year earlier; not only is it a remarkable tool of introspection and self-awareness, but it also illustrates that our memory “is never a precise duplicate of the original [but] a continuing act of creation” and how flawed our perception of time is — almost everything that occurred a year ago appears as having taken place either significantly further in the past (“a different lifetime,” I’d often marvel at this time-illusion) or significantly more recently (“this feels like just last month!”). Rather than a personal deficiency of those of us afflicted by this tendency, however, it turns out to be a defining feature of how the human mind works, the science of which is at first unsettling, then strangely comforting, and altogether intensely interesting.

That’s precisely what acclaimed BBC broadcaster and psychology writer Claudia Hammond explores in Time Warped: Unlocking the Mysteries of Time Perception (public library) — a fascinating foray into the idea that our experience of time is actively created by our own minds, and into how these sensations of what neuroscientists and psychologists call “mind time” arise; it is also among the year’s best psychology books. As disorienting as the concept might seem — after all, we’ve been nursed on the belief that time is one of those few utterly reliable and objective things in life — it is also strangely empowering to think that the very phenomenon depicted as the unforgiving dictator of life is something we might be able to shape and benefit from. Hammond writes:

We construct the experience of time in our minds, so it follows that we are able to change the elements we find troubling — whether it’s trying to stop the years racing past, or speeding up time when we’re stuck in a queue, trying to live more in the present, or working out how long ago we last saw our old friends. Time can be a friend, but it can also be an enemy. The trick is to harness it, whether at home, at work, or even in social policy, and to work in line with our conception of time. Time perception matters because it is the experience of time that roots us in our mental reality. Time is not only at the heart of the way we organize life, but the way we experience it.

Discus chronologicus, a depiction of time by German engraver Christoph Weigel, published in the early 1720s; from Cartographies of Time.

Among the most intriguing illustrations of “mind time” is the incredible elasticity of how we experience time. (“Where is it, this present?” William James famously wondered. “It has melted in our grasp, fled ere we could touch it, gone in the instant of becoming.”) For instance, Hammond points out, we slow time down when gripped by mortal fear — the cliché about the slow-motion car crash is, in fact, a cognitive reality. This plays out even in situations that aren’t life-or-death per se but are still associated with strong feelings of fear. Hammond points to a study in which people with arachnophobia were asked to look at spiders — the very object of their intense fear — for 45 seconds and they overestimated the elapsed time. The same pattern was observed in novice skydivers, who estimated the duration of their peers’ falls as short, whereas their own, from the same altitude, were deemed longer.

Inversely, time seems to speed up as we get older — a phenomenon of which competing theories have attempted to make sense. One, known as the “proportionality theory,” uses pure mathematics, holding that a year feels faster when you’re 40 than when you’re 8 because it only constitutes one fortieth of your life rather than a whole eighth. Among its famous proponents are Vladimir Nabokov and William James. But Hammond remains unconvinced:

The problem with the proportionality theory is that it fails to account for the way we experience time at any one moment. We don’t judge one day in the context of our whole lives. If we did, then for a 40-year-old every single day should flash by because it is less than one fourteen-thousandth of the life they’ve had so far. It should be fleeting and inconsequential, yet if you have nothing to do or an enforced wait at an airport for example, a day at 40 can still feel long and boring and surely longer than a fun day at the seaside packed with adventure for a child. … It ignores attention and emotion, which … can have a considerable impact on time perception.
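
As a quick check on the numbers in this passage (my arithmetic, not Hammond’s notation), the proportionality claim and her “one fourteen-thousandth” figure work out as follows:

```latex
% A year as a fraction of life lived, per the proportionality theory:
\frac{1\ \text{year}}{8\ \text{years}} = 12.5\%
\qquad\text{vs.}\qquad
\frac{1\ \text{year}}{40\ \text{years}} = 2.5\%

% A single day for a 40-year-old, per Hammond's rebuttal:
\frac{1\ \text{day}}{40 \times 365.25\ \text{days}} = \frac{1}{14{,}610} < \frac{1}{14{,}000}
```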

Another theory suggests that perhaps it is the tempo of life in general that has accelerated, making things from the past appear as slower, including the passage of time itself.

But one definite change does take place with age: As we grow older, we tend to feel like the previous decade elapsed more rapidly, while the earlier decades of our lives seem to have lasted longer. Similarly, we tend to think of events that took place in the past 10 years as having happened more recently than they actually did. (Quick: What year did the devastating Japanese tsunami hit? When did we lose Maurice Sendak?) Conversely, we perceive events that took place more than a decade ago as having happened even longer ago. (When did Princess Diana die? What year was the Chernobyl disaster?) This, Hammond points out, is known as “forward telescoping”:

It is as though time has been compressed and — as if looking through a telescope — things seem closer than they really are. The opposite is called backward or reverse telescoping, also known as time expansion. This is when you guess that events happened longer ago than they really did. This is rare for distant events, but not uncommon for recent weeks.

[…]

The most straightforward explanation for it is called the clarity of memory hypothesis, proposed by the psychologist Norman Bradburn in 1987. This is the simple idea that because we know that memories fade over time, we use the clarity of a memory as a guide to its recency. So if a memory seems unclear we assume it happened longer ago.

Originally featured in July, with a deeper dive into the psychology of why time slows down when we’re afraid, speeds up as we age, and gets warped when we’re on vacation.

5. SELF-PORTRAIT AS YOUR TRAITOR

“Still this childish fascination with my handwriting,” young Susan Sontag wrote in her diary in 1949. “To think that I always have this sensuous potentiality glowing within my fingers.” This is the sort of sensuous potentiality that comes aglow in Self-Portrait as Your Traitor (public library) — the magnificent collection of hand-lettered poems and illustrated essays by friend-of-Brain-Pickings and frequent contributor Debbie Millman. In the introduction, design legend Paula Scher aptly describes this singular visual form as a “21st-century illuminated manuscript.” Personal bias aside, these moving, lovingly crafted poems and essays — some handwritten, some drawn with colored pencils, some typeset in felt on felt — vibrate at that fertile intersection of the deeply personal and the universally profound.

In “Fail Safe,” her widely read essay-turned-commencement-address on creative courage and embracing the unknown from the 2009 anthology Look Both Ways, Millman wrote:

John Maeda once explained, “The computer will do anything within its abilities, but it will do nothing unless commanded to do so.” I think people are the same — we like to operate within our abilities. But whereas the computer has a fixed code, our abilities are limited only by our perceptions. Two decades since determining my code, and after 15 years of working in the world of branding, I am now in the process of rewriting the possibilities of what comes next. I don’t know exactly what I will become; it is not something I can describe scientifically or artistically. Perhaps it is a “code in progress.”

Self-Portrait as Your Traitor, a glorious large-format tome full of textured colors to which the screen does absolutely no justice, is the result of this progress — a brave and heartening embodiment of what it truly means, as Rilke put it, to live the questions; the stunning record of one woman’s personal and artistic code-rewriting, brimming with wisdom on life and art for all.

Originally featured in November. See an exclusive excerpt here, then take a peek at Debbie’s creative process here.

6. SUSAN SONTAG: THE COMPLETE ROLLING STONE INTERVIEW

In 1978, Rolling Stone contributing editor Jonathan Cott interviewed Susan Sontag in twelve hours of conversation, beginning in Paris and continuing in New York, only a third of which was published in the magazine. More than three decades later and almost a decade after Sontag’s death, the full, wide-ranging magnificence of their tête-à-tête, spanning literature, philosophy, illness, mental health, music, art, and much more, is at last released in Susan Sontag: The Complete Rolling Stone Interview (public library) — a rare glimpse of one of modern history’s greatest minds in her element.

Cott marvels at what made the dialogue especially extraordinary:

Unlike almost any other person whom I’ve ever interviewed — the pianist Glenn Gould is the one other exception — Susan spoke not in sentences but in measured and expansive paragraphs. And what seemed most striking to me was the exactitude and “moral and linguistic fine-tuning” — as she once described Henry James’s writing style — with which she framed and elaborated her thoughts, precisely calibrating her intended meanings with parenthetical remarks and qualifying words (“sometimes,” “occasionally,” “usually,” “for the most part,” “in almost all cases”), the munificence and fluency of her conversation manifesting what the French refer to as an ivresse du discours — an inebriation with the spoken word. “I am hooked on talk as a creative dialogue,” she once remarked in her journals, and added: “For me, it’s the principal medium of my salvation.”

In one segment of the conversation, Sontag discusses how the false divide between “high” and pop culture impoverishes our lives. In another, she makes a beautiful case for the value of history:

I really believe in history, and that’s something people don’t believe in anymore. I know that what we do and think is a historical creation. I have very few beliefs, but this is certainly a real belief: that most everything we think of as natural is historical and has roots — specifically in the late eighteenth and early nineteenth centuries, the so-called Romantic revolutionary period — and we’re essentially still dealing with expectations and feelings that were formulated at that time, like ideas about happiness, individuality, radical social change, and pleasure. We were given a vocabulary that came into existence at a particular historical moment. So when I go to a Patti Smith concert at CBGB, I enjoy, participate, appreciate, and am tuned in better because I’ve read Nietzsche.

In another meditation, she argues for the existential and creative value of presence:

What I want is to be fully present in my life — to be really where you are, contemporary with yourself in your life, giving full attention to the world, which includes you. You are not the world, the world is not identical to you, but you’re in it and paying attention to it. That’s what a writer does — a writer pays attention to the world. Because I’m very against this solipsistic notion that you find it all in your head. You don’t, there really is a world that’s there whether you’re in it or not.

In another passage, she considers how taking responsibility empowers rather than disempowers us:

I want to feel as responsible as I possibly can. As I told you before, I hate feeling like a victim, which not only gives me no pleasure but also makes me feel very uncomfortable. Insofar as it’s possible, and not crazy, I want to enlarge to the furthest extent possible my sense of my own autonomy, so that in friendship and love relationships I’m eager to take responsibility for both the good and the bad things. I don’t want this attitude of “I was so wonderful and that person did me in.” Even when it’s sometimes true, I’ve managed to convince myself that I was at least co-responsible for bad things that have happened to me, because it actually makes me feel stronger and makes me feel that things could perhaps be different.

The conversation, in which Sontag reaches unprecedented depths of self-revelation, also debunks some misconceptions about her public image as an intellectual in the dry, scholarly sense of the term:

Most of what I do, contrary to what people think, is so intuitive and unpremeditated and not at all that kind of cerebral, calculating thing people imagine it to be. I’m just following my instincts and intuitions. […] An argument appears to me much more like the spokes of a wheel than the links of a chain.

In one of her most poignant insights, Sontag admonishes against our culture’s dangerous polarities and imprisoning stereotypes:

A lot of our ideas about what we can do at different ages and what age means are so arbitrary — as arbitrary as sexual stereotypes. I think that the young-old polarization and the male-female polarization are perhaps the two leading stereotypes that imprison people. The values associated with youth and with masculinity are considered to be the human norms, and anything else is taken to be at least less worthwhile or inferior. Old people have a terrific sense of inferiority. They’re embarrassed to be old. What you can do when you’re young and what you can do when you’re old is as arbitrary and without much basis as what you can do if you’re a woman or what you can do if you’re a man.

Originally featured in November — take a closer look here and here.

7. LETTERS OF NOTE

As a hopeless lover of letters, I was thrilled for the release of Letters of Note: Correspondence Deserving of a Wider Audience (public library) — the aptly titled, superb collection based on Shaun Usher’s indispensable website of the same name, which stands as a heartening exemplar of independent online scholarship and journalism at the intersection of the editorial and the curatorial and features timeless treasures from such diverse icons and Brain Pickings favorites as E. B. White, Virginia Woolf, Ursula Nordstrom, Nick Cave, Ray Bradbury, Amelia Earhart, Galileo Galilei, and more.

One of the most beautiful letters in the collection comes from Hunter S. Thompson — gonzo journalism godfather, pundit of media politics, dark philosopher. The letter, which Thompson sent to his friend Hume Logan in 1958, makes for an exquisite addition to luminaries’ reflections on the meaning of life, speaking to what it really means to find your purpose.

Cautious that “all advice can only be a product of the man who gives it” — a caveat other literary legends have stressed with varying degrees of irreverence — Thompson begins with a necessary disclaimer about the very notion of advice-giving:

To give advice to a man who asks what to do with his life implies something very close to egomania. To presume to point a man to the right and ultimate goal — to point with a trembling finger in the RIGHT direction is something only a fool would take upon himself.

And yet he honors his friend’s request, turning to Shakespeare for an anchor of his own advice:

“To be, or not to be: that is the question: Whether ’tis nobler in the mind to suffer the slings and arrows of outrageous fortune, or to take arms against a sea of troubles…”

And indeed, that IS the question: whether to float with the tide, or to swim for a goal. It is a choice we must all make consciously or unconsciously at one time in our lives. So few people understand this! Think of any decision you’ve ever made which had a bearing on your future: I may be wrong, but I don’t see how it could have been anything but a choice — however indirect — between the two things I’ve mentioned: the floating or the swimming.

He acknowledges the obvious question of why not take the path of least resistance and float aimlessly, then counters it:

The answer — and, in a sense, the tragedy of life — is that we seek to understand the goal and not the man. We set up a goal which demands of us certain things: and we do these things. We adjust to the demands of a concept which CANNOT be valid. When you were young, let us say that you wanted to be a fireman. I feel reasonably safe in saying that you no longer want to be a fireman. Why? Because your perspective has changed. It’s not the fireman who has changed, but you.

Touching on the same notion that William Gibson termed “personal micro-culture,” Austin Kleon captured in asserting that “you are the mashup of what you let into your life,” and Paula Scher articulated so succinctly in speaking of the combinatorial nature of our creativity, Thompson writes:

Every man is the sum total of his reactions to experience. As your experiences differ and multiply, you become a different man, and hence your perspective changes. This goes on and on. Every reaction is a learning process; every significant experience alters your perspective.

So it would seem foolish, would it not, to adjust our lives to the demands of a goal we see from a different angle every day? How could we ever hope to accomplish anything other than galloping neurosis?

The answer, then, must not deal with goals at all, or not with tangible goals, anyway. It would take reams of paper to develop this subject to fulfillment. God only knows how many books have been written on “the meaning of man” and that sort of thing, and god only knows how many people have pondered the subject. (I use the term “god only knows” purely as an expression.)* There’s very little sense in my trying to give it up to you in the proverbial nutshell, because I’m the first to admit my absolute lack of qualifications for reducing the meaning of life to one or two paragraphs.

Resolving to steer clear of the word “existentialism,” Thompson nonetheless strongly urges his friend to read Sartre’s Being and Nothingness and the anthology Existentialism: From Dostoyevsky to Sartre, then admonishes against succumbing to faulty definitions of success at the expense of finding one’s own purpose:

To put our faith in tangible goals would seem to be, at best, unwise. So we do not strive to be firemen, we do not strive to be bankers, nor policemen, nor doctors. WE STRIVE TO BE OURSELVES.

But don’t misunderstand me. I don’t mean that we can’t BE firemen, bankers, or doctors—but that we must make the goal conform to the individual, rather than make the individual conform to the goal. In every man, heredity and environment have combined to produce a creature of certain abilities and desires—including a deeply ingrained need to function in such a way that his life will be MEANINGFUL. A man has to BE something; he has to matter.

As I see it then, the formula runs something like this: a man must choose a path which will let his ABILITIES function at maximum efficiency toward the gratification of his DESIRES. In doing this, he is fulfilling a need (giving himself identity by functioning in a set pattern toward a set goal), he avoids frustrating his potential (choosing a path which puts no limit on his self-development), and he avoids the terror of seeing his goal wilt or lose its charm as he draws closer to it (rather than bending himself to meet the demands of that which he seeks, he has bent his goal to conform to his own abilities and desires).

In short, he has not dedicated his life to reaching a pre-defined goal, but he has rather chosen a way of life he KNOWS he will enjoy. The goal is absolutely secondary: it is the functioning toward the goal which is important. And it seems almost ridiculous to say that a man MUST function in a pattern of his own choosing; for to let another man define your own goals is to give up one of the most meaningful aspects of life — the definitive act of will which makes a man an individual.

Noting that his friend had thus far lived “a vertical rather than horizontal existence,” Thompson acknowledges the challenge of this choice but admonishes that however difficult, the choice must be made or else it melts away into those default modes of society:

A man who procrastinates in his CHOOSING will inevitably have his choice made for him by circumstance. So if you now number yourself among the disenchanted, then you have no choice but to accept things as they are, or to seriously seek something else. But beware of looking for goals: look for a way of life. Decide how you want to live and then see what you can do to make a living WITHIN that way of life. But you say, “I don’t know where to look; I don’t know what to look for.”

And there’s the crux. Is it worth giving up what I have to look for something better? I don’t know — is it? Who can make that decision but you? But even by DECIDING TO LOOK, you go a long way toward making the choice.

He ends by returning to his original disclaimer, reiterating that rather than a prescription for living, his “advice” is merely a reminder that how and what we choose — choices we’re in danger of forgetting even exist — shapes the course and experience of our lives:

I’m not trying to send you out “on the road” in search of Valhalla, but merely pointing out that it is not necessary to accept the choices handed down to you by life as you know it. There is more to it than that — no one HAS to do something he doesn’t want to do for the rest of his life.

Originally featured in November.

8. INTUITION PUMPS

“If you are not making mistakes, you’re not taking enough risks,” Debbie Millman counseled. “Make New Mistakes. Make glorious, amazing mistakes. Make mistakes nobody’s ever made before,” Neil Gaiman advised young creators. In Intuition Pumps And Other Tools for Thinking (public library), also among the year’s best psychology books, the inimitable Daniel Dennett, one of our greatest living philosophers, offers a set of thinking tools — “handy prosthetic imagination-extenders and focus holders” — that allow us to “think reliably and even gracefully about really hard questions” and to enhance our cognitive toolkit. He calls these tools “intuition pumps” — thought experiments designed to stir “a heartfelt, table-thumping intuition” (which we know is a pillar of even the most “rational” of science) about the question at hand, a kind of persuasion tool whose reverse-engineering enables us to think better about thinking itself. Intuition, of course, is a domain-specific ability that relies on honed critical thinking rather than a mystical quality bestowed by the gods — but that’s precisely Dennett’s point, and his task is to help us hone it.

Though most of his 77 “intuition pumps” address concrete questions, a dozen are “general-purpose” tools that apply deeply and widely, across just about any domain of thinking. The first of them is also arguably the most useful yet most uncomfortable: making mistakes.

Echoing Dorion Sagan’s case for why science and philosophy need each other, Dennett begins with an astute contribution to the best definitions of philosophy, wrapped in a necessary admonition about the value of history:

The history of philosophy is in large measure the history of very smart people making very tempting mistakes, and if you don’t know the history, you are doomed to making the same darn mistakes all over again. … There is no such thing as philosophy-free science, just science that has been conducted without any consideration of its underlying philosophical assumptions.

He speaks to the generative potential of mistakes and their usefulness as an empirical tool:

Sometimes you don’t just want to risk making mistakes; you actually want to make them — if only to give you something clear and detailed to fix.

Therein lies the power of mistakes as a vehicle for, as Rilke famously put it, “living the questions” and thus advancing knowledge in a way that certainty cannot — for, as Richard Feynman memorably noted, the scientist’s job is to remain unsure, and so seems the philosopher’s. Dennett writes:

We philosophers are mistake specialists. … While other disciplines specialize in getting the right answers to their defining questions, we philosophers specialize in all the ways there are of getting things so mixed up, so deeply wrong, that nobody is even sure what the right questions are, let alone the answers. Asking the wrong questions risks setting any inquiry off on the wrong foot. Whenever that happens, this is a job for philosophers! Philosophy — in every field of inquiry — is what you have to do until you figure out what questions you should have been asking in the first place.

[…]

Mistakes are not just opportunities for learning; they are, in an important sense, the only opportunity for learning or making something truly new. Before there can be learning, there must be learners. There are only two non-miraculous ways for learners to come into existence: they must either evolve or be designed and built by learners that evolved. Biological evolution proceeds by a grand, inexorable process of trial and error — and without the errors the trials wouldn’t accomplish anything.

Dennett offers a caveat that at once highlights the importance of acquiring knowledge and reminds us of the power of “chance-opportunism”:

Trials can be either blind or foresighted. You, who know a lot, but not the answer to the question at hand, can take leaps — foresighted leaps. You can look before you leap, and hence be somewhat guided from the outset by what you already know. You need not be guessing at random, but don’t look down your nose at random guesses; among its wonderful products is … you!

And since evolution is the highest epitome of how the process of trial and error drives progress, Dennett makes a case for understanding evolution as a key to understanding everything else we humans value:

Evolution … is the central, enabling process not only of life but also of knowledge and learning and understanding. If you attempt to make sense of the world of ideas and meanings, free will and morality, art and science and even philosophy itself without a sound and quite detailed knowledge of evolution, you have one hand tied behind your back. … For evolution, which knows nothing, the steps into novelty are blindly taken by mutations, which are random copying “errors” in DNA.

Dennett echoes Dostoyevsky (“Above all, don’t lie to yourself. The man who lies to himself and listens to his own lie comes to a point that he cannot distinguish the truth within him, or around him, and so loses all respect for himself and for others.”) and offers the key to making productive mistakes:

The chief trick to making good mistakes is not to hide them — especially not from yourself. Instead of turning away in denial when you make a mistake, you should become a connoisseur of your own mistakes, turning them over in your mind as if they were works of art, which in a way they are. … The trick is to take advantage of the particular details of the mess you’ve made, so that your next attempt will be informed by it and not just another blind stab in the dark.

We have all heard the forlorn refrain “Well, it seemed like a good idea at the time!” This phrase has come to stand for the rueful reflection of an idiot, a sign of stupidity, but in fact we should appreciate it as a pillar of wisdom. Any being, any agent, who can truly say, “Well, it seemed like a good idea at the time!” is standing on the threshold of brilliance.

Originally featured in May — read the full article here.

9. LOST CAT

“Dogs are not about something else. Dogs are about dogs,” Malcolm Gladwell asserted indignantly in the introduction to The Big New Yorker Book of Dogs. Though hailed as memetic rulers of the internet, cats have also enjoyed a long history as artistic and literary muses, but never have they been at once more about cats and more about something else than in Lost Cat: A True Story of Love, Desperation, and GPS Technology (public library) by firefighter-turned-writer Caroline Paul and illustrator extraordinaire Wendy MacNaughton, she of many wonderful collaborations — a tender, imaginative memoir, among the year’s best biographies and memoirs and best books on pets and animals, infused with equal parts humor and humanity. Though “about” a cat, this heartwarming and heartbreaking tale is really about what it means to be human — about the osmosis of hollowing loneliness and profound attachment, the oscillation between boundless affection and paralyzing fear of abandonment, the unfair promise of loss implicit to every possibility of love.

After Caroline crashes an experimental plane she was piloting, she finds herself severely injured and spiraling into the depths of depression. It both helps and doesn’t that Caroline and Wendy have just fallen in love, soaring in the butterfly heights of new romance, “the phase of love that didn’t obey any known rules of physics,” until the crash pulls them into a place that would challenge even the most seasoned and grounded of relationships. And yet they persevere as Wendy patiently and lovingly takes care of Caroline.

When Caroline returns from the hospital with a shattered ankle, her two thirteen-year-old tabbies — the shy, anxious Tibby (short for Tibia, affectionately — and, in these circumstances, ironically — named after the shinbone) and the sociable, amicable Fibby (short for Fibula, after the calf bone on the lateral side of the tibia) — are, short of Wendy, her only joy and comfort:

Tibia and Fibula meowed happily when I arrived. They were undaunted by my ensuing stupor. In fact they were delighted; suddenly I had become a human who didn’t shout into a small rectangle of lights and plastic in her hand, peer at a computer, or get up and disappear from the vicinity, only to reappear through the front door hours later. Instead, I was completely available to them at all times. Amazed by their good luck, they took full feline advantage. They asked for ear scratches and chin rubs. They rubbed their whiskers along my face. They purred in response to my slurred, affectionate baby talk. But mostly they just settled in and went to sleep. Fibby snored into my neck. Tibby snored on the rug nearby. Meanwhile I lay awake, circling the deep dark hole of depression.

Without my cats, I would have fallen right in.

And then, one day, Tibby disappears.

Wendy and Caroline proceed to flyer the neighborhood, visit every animal shelter in the vicinity, and even, in their desperation, enlist the help of a psychic who specializes in lost pets — but to no avail. Heartbroken, they begin to mourn Tibby’s loss.

And then, one day five weeks later, Tibby reappears. But once the initial elation of the recovery has worn off, Caroline begins to wonder where he’d been and why he’d left. He is now no longer eating at home and regularly leaves the house for extended periods of time — Tibby clearly has a secret place he now returns to. Even more worrisomely, he’s no longer the shy, anxious tabby he’d been for thirteen years — instead, he’s a half pound heavier, chirpy, with “a youthful spring in his step.” But why would a happy cat abandon his loving lifelong companion and find comfort — find himself, even — elsewhere?

When the relief that my cat was safe began to fade, and the joy of his prone, snoring form — sprawled like an athlete after a celebratory night of boozing — started to wear thin, I was left with darker emotions. Confusion. Jealousy. Betrayal. I thought I’d known my cat of thirteen years. But that cat had been anxious and shy. This cat was a swashbuckling adventurer back from the high seas. What siren call could have lured him away? Was he still going to this gilded place, with its overflowing food bowls and endless treats?

There was only one obvious thing left to do: track Tibby on his escapades. So Caroline, despite Wendy’s lovingly suppressed skepticism, heads to a spy store — yes, those exist — and purchases a real-time GPS tracker, complete with a camera that they program to take snapshots every few minutes, which they then attach to Tibby’s collar.

What follows is a wild, hilarious, and sweet tale of tinkering, tracking, and tenderness. Underpinning the obsessive quest is the subtle yet palpable subplot of Wendy and Caroline’s growing love for each other, the deepening of trust and affection that happens when two people share in a special kind of insanity.

“Every quest is a journey, every journey a story. Every story, in turn, has a moral,” writes Caroline in the final chapter, then offers several “possible morals” for the story, the last two of which embody everything that makes Lost Cat an absolute treat from cover to cover:

6. You can never know your cat. In fact, you can never know anyone as completely as you want.

7. But that’s okay, love is better.

Take a closer look here, then hear MacNaughton and Paul in conversation about combining creative collaboration with a romantic relationship.

10. CODEX SERAPHINIANUS

In 1976, Italian artist, architect, and designer Luigi Serafini, only 27 at the time, set out to create an elaborate encyclopedia of imaginary objects and creatures that fell somewhere between Edward Gorey’s cryptic alphabets, Albertus Seba’s cabinet of curiosities, the book of surrealist games, and Alice in Wonderland. What’s more, it wasn’t written in any ordinary language but in an unintelligible alphabet that appeared to be a conlang — an undertaking so complex it constitutes one of the highest feats of cryptography. It took him nearly three years to complete the project, and three more to publish it, but when it was finally released, the book — a weird and wonderful masterpiece of art and philosophical provocation on the precipice of the information age — attracted a growing following that continued to gather momentum even as the original edition went out of print.

Now, for the first time in more than thirty years, Codex Seraphinianus (public library) is resurrected in a lavish new edition by Rizzoli — who have a penchant for excavating forgotten gems — featuring a new chapter by Serafini, now in his 60s, and a gorgeous signed print with each deluxe tome. Besides being a visual masterwork, it’s also a timeless meditation on what “reality” really is, one all the timelier in today’s age of such seemingly surrealist feats as bioengineering whole new lifeforms, hurling subatomic particles at each other at nearly the speed of light, and encoding an entire book onto a DNA molecule.

In an interview for Wired Italy, Serafini aptly captures the subtle similarity to children’s books in how the Codex bewitches our grown-up fancy with its bizarre beauty:

What I want my alphabet to convey to the reader is the sensation that children feel in front of books they cannot yet understand. I used it to describe analytically an imaginary world and give a coherent framework. The images originate from the clash between this fantasy vocabulary and the real world. … The Codex became so popular because it makes you feel more comfortable with your fantasies. Another world is not possible, but a fantasy one maybe is.

Playfully addressing the book’s towering price point, Serafini makes a more serious point about how it bespeaks the seductive selectiveness of our attention:

The [new] edition is very rich and also pricey, I know, but it’s just like psychoanalysis: Money matters and the fee is part of the process of healing. At the end of the day, the Codex is similar to the Rorschach inkblot test. You see what you want to see. You might think it’s speaking to you, but it’s just your imagination.

Originally featured in October — see more here.

11. MY BROTHER’S BOOK

For those of us who loved legendary children’s book author Maurice Sendak — famed creator of wild things, little-known illustrator of velveteen rabbits, infinitely warm heart, infinitely witty mind — his death in 2012 was one of the year’s greatest heartaches. Now, half a century after his iconic Where the Wild Things Are, comes My Brother’s Book (public library), one of the year’s best children’s books — a bittersweet posthumous farewell to the world, illustrated in vibrant, dreamsome watercolors and written in verse inspired by some of Sendak’s lifelong influences: Shakespeare, Blake, Keats, and the music of Mozart. In fact, a foreword by Shakespeare scholar Stephen Greenblatt reveals the book is based on the Bard’s “The Winter’s Tale.”

It tells the story of two brothers, Jack and Guy, torn asunder when a falling star crashes onto Earth. Though on the surface about the beloved author’s own brother Jack, who died 18 years ago, the story is also about the love of Sendak’s life and his partner of fifty years, psychoanalyst Eugene Glynn, whose prolonged illness and eventual loss in 2007 devastated Sendak — the character of Guy reads like a poetic fusion of Sendak and Glynn. And while the story might be a universal “love letter to those who have gone before,” as NPR’s Renee Montagne suggests in Morning Edition, it is in equal measure a private love letter to Glynn. (Sendak passed away the day before President Obama announced his support for same-sex marriage, but Sendak fans were quick to honor both historic moments with a bittersweet homage.)

Indeed, the theme of all-consuming love manifests viscerally in Sendak’s books. Playwright Tony Kushner, a longtime close friend of Sendak’s and one of his most heartfelt mourners, tells NPR:

There’s a lot of consuming and devouring and eating in Maurice’s books. And I think that when people play with kids, there’s a lot of fake ferocity and threats of, you know, devouring — because love is so enormous, the only thing you can think of doing is swallowing the person that you love entirely.

My Brother’s Book ends on a soul-stirring note, tender and poignant in its posthumous light:

And Jack slept safe
Enfolded in his brother’s arms
And Guy whispered ‘Good night
And you will dream of me.’

Originally featured in February.

12. EIGHTY DAYS

“Anything one man can imagine, other men can make real,” science fiction godfather Jules Verne famously proclaimed. He was right about the general sentiment but oh how very wrong about its gendered language: Sixteen years after Verne’s classic novel Around the World in Eighty Days, his vision for speed-circumnavigation would be made real — but by a woman. On the morning of November 14, 1889, Nellie Bly, an audacious newspaper reporter, set out to outpace Verne’s fictional itinerary by circumnavigating the globe in seventy-five days — a journey that would set the real-world record for the fastest trip around the world. In Eighty Days: Nellie Bly and Elizabeth Bisland’s History-Making Race Around the World (public library), also among the year’s best history books, Matthew Goodman traces the groundbreaking adventure, beginning with a backdrop of Bly’s remarkable journalistic fortitude and her contribution to defying our stubbornly enduring biases about women writers:

No female reporter before her had ever seemed quite so audacious, so willing to risk personal safety in pursuit of a story. In her first exposé for The World, Bly had gone undercover … feigning insanity so that she might report firsthand on the mistreatment of the female patients of the Blackwell’s Island Insane Asylum. … Bly trained with the boxing champion John L. Sullivan; she performed, with cheerfulness but not much success, as a chorus girl at the Academy of Music (forgetting the cue to exit, she momentarily found herself all alone onstage). She visited with a remarkable deaf, dumb, and blind nine-year-old girl in Boston by the name of Helen Keller. Once, to expose the workings of New York’s white slave trade, she even bought a baby. Her articles were by turns lighthearted and scolding and indignant, some meant to edify and some merely to entertain, but all were shot through with Bly’s unmistakable passion for a good story and her uncanny ability to capture the public’s imagination, the sheer force of her personality demanding that attention be paid to the plight of the unfortunate, and, not incidentally, to herself.

For all her extraordinary talent and work ethic, Bly’s appearance was decidedly unremarkable — a fact that shouldn’t matter, but one that would be repeatedly remarked upon by her critics and commentators, something we’ve made sad little progress on in discussing women’s professional, intellectual, and creative merit more than a century later. Goodman paints a portrait of Bly:

She was a young woman in a plaid coat and cap, neither tall nor short, dark nor fair, not quite pretty enough to turn a head: the sort of woman who could, if necessary, lose herself in a crowd.

[…]

Her voice rang with the lilt of the hill towns of western Pennsylvania; there was an unusual rising inflection at the ends of her sentences, the vestige of an Elizabethan dialect that had still been spoken in the hills when she was a girl. She had piercing gray eyes, though sometimes they were called green, or blue-green, or hazel. Her nose was broad at its base and delicately upturned at the end — the papers liked to refer to it as a “retroussé” nose — and it was the only feature about which she was at all self-conscious. She had brown hair that she wore in bangs across her forehead. Most of those who knew her considered her pretty, although this was a subject that in the coming months would be hotly debated in the press.

But, as if the ambitious adventure weren’t scintillating enough, the story takes an unexpected turn: That fateful November morning, as Bly was making her way to the journey’s outset at the Hoboken docks, a man named John Brisben Walker passed her on a ferry headed in the opposite direction, traveling from Jersey City to Lower Manhattan. He was the publisher of a high-brow magazine titled The Cosmopolitan, the same publication that decades later, under the new ownership of William Randolph Hearst, would take a dive for the commercially low-brow. On his ferry ride, Walker skimmed that morning’s edition of The World and paused over the front-page feature announcing Bly’s planned adventure around the world. A seasoned manipulator of the public’s voracious appetite for drama, he instantly hit upon an idea that would seize this unique publicity opportunity: The Cosmopolitan would send another circumnavigator to race against Bly. To keep things equal, it would have to be a woman. To keep them interesting, she’d travel in the opposite direction.

And so it went:

Elizabeth Bisland was twenty-eight years old, and after nearly a decade of freelance writing she had recently obtained a job as literary editor of The Cosmopolitan, for which she wrote a monthly review of recently published books entitled “In the Library.” Born into a Louisiana plantation family ruined by the Civil War and its aftermath, at the age of twenty she had moved to New Orleans and then, a few years later, to New York, where she contributed to a variety of magazines and was regularly referred to as the most beautiful woman in metropolitan journalism. Bisland was tall, with an elegant, almost imperious bearing that accentuated her height; she had large dark eyes and luminous pale skin and spoke in a low, gentle voice. She reveled in gracious hospitality and smart conversation, both of which were regularly on display in the literary salon that she hosted in the little apartment she shared with her sister on Fourth Avenue, where members of New York’s creative set, writers and painters and actors, gathered to discuss the artistic issues of the day. Bisland’s particular combination of beauty, charm, and erudition seems to have been nothing short of bewitching.

But Bisland was no mere literary bombshell. Wary of beauty’s fleeting and superficial nature — she once lamented, “After the period of sex-attraction has passed, women have no power in America” — she blended Edison’s circadian relentlessness with Tchaikovsky’s work ethic:

She took pride in the fact that she had arrived in New York with only fifty dollars in her pocket, and that the thousands of dollars now in her bank account had come by virtue of her own pen. Capable of working for eighteen hours at a stretch, she wrote book reviews, essays, feature articles, and poetry in the classical vein. She was a believer, more than anything else, in the joys of literature, which she had first experienced as a girl in ancient volumes of Shakespeare and Cervantes that she found in the library of her family’s plantation house. (She taught herself French while she churned butter, so that she might read Rousseau’s Confessions in the original — a book, as it turned out, that she hated.) She cared nothing for fame, and indeed found the prospect of it distasteful.

And yet, despite their competitive circumstances and seemingly divergent dispositions, something greater bound the two women together, some ineffable force of culture that quietly united them in a bold defiance of their era’s normative biases:

On the surface the two women … were about as different as could be: one woman a Northerner, the other from the South; one a scrappy, hard-driving crusader, the other priding herself on her gentility; one seeking out the most sensational of news stories, the other preferring novels and poetry and disdaining much newspaper writing as “a wild, crooked, shrieking hodge-podge,” a “caricature of life.” Elizabeth Bisland hosted tea parties; Nellie Bly was known to frequent O’Rourke’s saloon on the Bowery. But each of them was acutely conscious of the unequal position of women in America. Each had grown up without much money and had come to New York to make a place for herself in big-city journalism, achieving a hard-won success in what was still, unquestionably, a man’s world.

Originally featured in May — read the full article, including Bly’s entertaining illustrated packing list, here.

13. DON’T GO BACK TO SCHOOL

“The present education system is the trampling of the herd,” legendary architect Frank Lloyd Wright lamented in 1956. Half a century later, I started Brain Pickings in large part out of frustration and disappointment with my trampling experience of our culturally fetishized “Ivy League education.” I found myself intellectually and creatively unstimulated by the industrialized model of the large lecture hall, the PowerPoint presentations, the standardized tests assessing my rote memorization of facts rather than my ability to transmute that factual knowledge into a pattern-recognition mechanism that connects different disciplines to cultivate wisdom about how the world works and a moral lens on how it should work. So Brain Pickings became the record of my alternative learning, of that cross-disciplinary curiosity that took me from art to psychology to history to science, by way of the myriad pieces of knowledge I discovered — and connected — on my own. I didn’t live up to the entrepreneurial ideal of the college dropout and begrudgingly graduated “with honors,” but refused to go to my own graduation and decided never to go back to school. Years later, I’ve learned more in the course of writing and researching the thousands of articles to date than in all the years of my formal education combined.

So, in 2012, when I found out that writer Kio Stark was crowdfunding a book that would serve as a manifesto for learning outside formal education, I eagerly chipped in. Now, Don’t Go Back to School: A Handbook for Learning Anything is out, and it is everything I could’ve wished for when I was in college: an essential piece of cultural literacy, an at once tantalizing and practically grounded assurance that success doesn’t lie at the end of a single highway but is sprinkled along a thousand alternative paths. Stark describes it as “a radical project, the opposite of reform … not about fixing school [but] about transforming learning — and making traditional school one among many options rather than the only option.” Through a series of interviews with independent learners who have reached success and happiness in fields as diverse as journalism, illustration, and molecular biology, Stark — who herself dropped out of a graduate program at Yale, despite being offered a prestigious fellowship — cracks open the secret to defining your own success and finding your purpose outside the factory model of formal education. She notes the patterns that emerge:

People who forgo school build their own infrastructures. They create and borrow and reinvent the best that formal schooling has to offer, and they leave the worst behind. That buys them the freedom to learn on their own terms.

[…]

From their stories, you’ll see that when you step away from the prepackaged structure of traditional education, you’ll discover that there are many more ways to learn outside school than within.

Reflecting on her own exit from academia, Stark articulates a much more broadly applicable insight:

A gracefully executed quit is a beautiful thing, opening up more doors than it closes.

But despite discovering, in dismay, that “liberal arts graduate school is professional school for professors” — a profession she had no interest in — Stark did learn something immensely valuable from her third year of independent study, during which she read about 200 books of her own choosing:

I learned how to teach myself. I had to make my own reading lists for the exams, which meant I learned how to take a subject I was interested in and make myself a map for learning it.

The interviews revealed four common threads: learning is collaborative rather than solitary; the importance of academic credentials in many professions is declining; the most fulfilling learning tends to take place outside of school; and those happiest about learning are those who learn out of intrinsic motivation rather than in pursuit of extrinsic rewards. The first of these insights, of course, appears on the surface to contradict the very notion of “independent learning,” but Stark offers an eloquent semantic caveat:

Independent learning suggests ideas such as “self-taught,” or “autodidact.” These imply that independence means working solo. But that’s just not how it happens. People don’t learn in isolation. When I talk about independent learners, I don’t mean people learning alone. I’m talking about learning that happens independent of schools.

[…]

Anyone who really wants to learn without school has to find other people to learn with and from. That’s the open secret of learning outside of school. It’s a social act. Learning is something we do together.

Independent learners are interdependent learners.

Much of the argument for formal education rests on statistics indicating that people with college and graduate degrees earn more. But those statistics, Stark notes, suffer from an important and rarely heeded bias:

The problem is that this statistic is based on long-term data, gathered from a period of moderate loan debt, easy employability, and annual increases in the value of a college degree. These conditions have been the case for college grads for decades. Given the dramatically changed circumstances grads today face, we already know that the trends for debt, employability, and the value of a degree have all degraded, and we cannot assume the trend toward greater lifetime earnings will hold true for the current generation. This is a critical omission from media coverage. The fact is we do not know. There’s absolutely no guarantee it will hold true.

Some heartening evidence suggests the blind reliance on degrees might be beginning to change. Stark cites Zappos CEO Tony Hsieh:

I haven’t looked at a résumé in years. I hire people based on their skills and whether or not they are going to fit our culture.

Another common argument for formal education extols the alleged advantages of its structure, proposing that homework assignments, reading schedules, and regular standardized testing motivate you to learn with greater rigor. But, as Daniel Pink has written about the psychology of motivation, in school, as in work, intrinsic drives far outweigh the extrinsic carrot-and-stick paradigm of reward and punishment, rendering this argument unsound. Stark writes:

Learning outside school is necessarily driven by an internal engine. … [I]ndependent learners stick with the reading, thinking, making, and experimenting by which they learn because they do it for love, to scratch an itch, to satisfy curiosity, following the compass of passion and wonder about the world.

So how can you best fuel that internal engine of learning outside the depot of formal education? Stark offers an essential insight, which places self-discovery at the heart of acquiring external knowledge:

Learning your own way means finding the methods that work best for you and creating conditions that support sustained motivation. Perseverance, pleasure, and the ability to retain what you learn are among the wonderful byproducts of getting to learn using methods that suit you best and in contexts that keep you going. Figuring out your personal approach to each of these takes trial and error.

[…]

For independent learners, it’s essential to find the process and methods that match your instinctual tendencies as a learner. Everyone I talked to went through a period of experimenting and sorting out what works for them, and they’ve become highly aware of their own preferences. They’re clear that learning by methods that don’t suit them shuts down their drive and diminishes their enjoyment of learning. Independent learners also find that their preferred methods are different for different areas. So one of the keys to success and enjoyment as an independent learner is to discover how you learn.

[…]

School isn’t very good at dealing with the multiplicity of individual learning preferences, and it’s not very good at helping you figure out what works for you.

Echoing Neil deGrasse Tyson, who has argued that “every child is a scientist” since curiosity is coded into our DNA, and Sir Ken Robinson, who has lamented that the industrial model of education schools us out of our inborn curiosity, Stark observes:

Any young child you observe displays these traits. But passion and curiosity can be easily lost. School itself can be a primary cause; arbitrary motivators such as grades leave little room for variation in students’ abilities and interests, and fail to reward curiosity itself. There are also significant social factors working against children’s natural curiosity and capacity for learning, such as family support or the lack of it, or a degree of poverty that puts families in survival mode with little room to nurture curiosity.

Stark returns to the question of motivators that do work, once again calling to mind Pink’s advocacy of autonomy, mastery, and purpose as the trifecta of success. She writes:

[T]hree broadly defined elements of the learning experience support internal motivation and the persistence it enables. Internal motivation relies on learners having autonomy in their learning, a progressing sense of competence in their skills and knowledge, and the ability to learn in a concrete or “real world” context rather than in the abstract. These are mostly absent from classroom learning. Autonomy is rare, useful context is absent, and school’s means for affirming competence often feel so arbitrary as to be almost without use — and are sometimes actively demotivating. … [A]utonomy means that you follow your own path. You learn what you want to learn, when and how you want to learn it, for your own reasons. Your impetus to learn comes from within because you control the conditions of your learning rather than working within a structure that’s pre-made and inflexible.

The second thing you need to stick with learning independently is to set your own goals toward an increasing sense of competence. You need to create a feedback loop that confirms your work is worth it and keeps you moving forward. In school this is provided by advancing through the steps of the linear path within an individual class or a set curriculum, as well as from feedback from grades and praise.

But Stark found that outside of school, those most successful at learning sought their sense of competence through alternative sources. Many, as James Mangan advised in his 1936 blueprint for acquiring knowledge, solidified their learning by teaching it to other people, increasing their own sense of mastery and deepening their understanding. Others centered their learning around specific projects, which made their progress more modular and thus more attainable. Another cohort cited failure as an essential part of the road to mastery. Stark continues:

The third thing [that] can make or break your ability to sustain internal motivation … is to situate what you’re learning in a context that matters to you. In some cases, the context is a specific project you want to accomplish, which … also functions to support your sense of progress.

She sums up the failings of the establishment:

School is not designed to offer these three conditions; autonomy and context are sorely lacking in classrooms. School can provide a sense of increasing mastery, via grades and moving from introductory classes to harder ones. But a sense of true competence is harder to come by in a school environment. Fortunately, there are professors in higher education who are working to change the motivational structures that underlie their curricula.

The interviews, to be sure, offer a remarkably diverse array of callings, underpinned by a number of shared values and common characteristics. Computational biologist Florian Wagner, for instance, echoes Steve Jobs’s famous words on the secret of life in articulating a sentiment shared by many of the other interviewees:

There is something really special about when you first realize you can figure out really cool things completely on your own. That alone is a valuable lesson in life.

Investigative journalist Quinn Norton subscribes to Mangan’s prescription for learning by teaching:

I ended up teaching [my] knowledge to others at the school. That’s one of my most effective ways to learn, by teaching; you just have to stay a week ahead of your students. … Everything I learned, I immediately turned around and taught to others.

She also used the gift of ignorance to proactively drive her knowledge forward:

When I wanted to learn something new as a professional writer, I’d pitch a story on it. I was interested in neurology, and I figured, why don’t I start interviewing neurologists? The great thing about being a journalist is that you can pick up the phone and talk to anybody. It was just like what I found out about learning from experts on mailing lists. People like to talk about what they know.

Norton speaks to the usefulness of useless knowledge, not only in one’s own intellectual development but also as social currency:

I’m stuffed with trivial, useless knowledge, on a panoply of bizarre topics, so I can find something that they’re interested in that I know something about. Being able to do that is tremendously socially valuable. The exchange of knowledge is a very human way to learn. I try never to walk into a room where I want to get information without knowing what I’m bringing to the other person.

[…]

I think part of the problem with the usual mindset of the student is that it’s like being a sponge. It’s passive. It’s not about having something to bring to the interaction. People who are experts in things are experts because they like learning.

Software engineer, artist, and University of Texas molecular biologist Zack Booth Simpson speaks to the value of cultivating what William Gibson has called “a personal micro-culture” and learning from the people with whom you surround yourself:

In a way, the best education you can get is just talking with people who are really smart and interested in things, and you can get that for the cost of lunch.

Artist Molly Crabapple, who inked this beautiful illustration of Salvador Dalí’s creative credo and live-sketched Susan Cain’s talk on the power of introverts, recalls how self-initiated reading shaped her life:

I was … a constant reader. At home, I lived next to this thrift store that sold paperbacks for 10¢ apiece so I would go and buy massive stacks of paperback books on everything. Everything from trashy 1970s romance novels to Plato. When I went to Europe, I brought with me every single book that I didn’t think I would read voluntarily, because I figured if I was on a bus ride, I would read them. So I read Plato and Dante’s Inferno, and all types of literature. I got my education on the bus.

Originally featured in May — read more here.

* * *

For a subject-specific lens on the year’s finest reading, revisit the season’s best-of reading lists, which covered writing and creativity, photography, psychology and philosophy, art and design, history and biography, science and technology, children’s literature, and pets and animals.


Published December 23, 2013

https://www.themarginalian.org/2013/12/23/best-books-of-2013/
