The human mind is obviously vaster and more powerful than any other animal mind, and that’s something people throughout all human history couldn’t help but notice. You probably considered this the last time you visited the zoo or watched a dog battle its own hind legs. Your kind seems the absolute pinnacle of what evolution can produce, maybe even the apex and final beautiful result of the universe unfolding itself. It is a delectable idea to entertain. Even before we had roller skates and Salvador Dalí, it was a conviction in which great thinkers liked to wallow. Of course, as soon as you settle into that thought, you’ll accidentally send an e-mail to your boss meant for your proctologist, or you’ll read a news story about how hot dog-stuffed pizza is now the most popular food in the country. It’s always true that whenever you look at the human condition and get a case of the smugs, a nice heaping helping of ridiculousness plops in your lap and remedies the matter.
This tendency of ours is known as “naïve realism” — the assertion that we see the world as it actually is and our impression of it is an objective, accurate representation of “reality” — a concept that comes from ancient philosophy and has since been amply debunked by modern science. McRaney writes:
The last one hundred years of research suggest that you, and everyone else, still believe in a form of naïve realism. You still believe that although your inputs may not be perfect, once you get to thinking and feeling, those thoughts and feelings are reliable and predictable. We now know that there is no way you can ever know an “objective” reality, and we know that you can never know how much of subjective reality is a fabrication, because you never experience anything other than the output of your mind. Everything that’s ever happened to you has happened inside your skull.
In sum, we are excellent at deluding ourselves, and terrible at recognizing when our own perceptions, attitudes, impressions, and opinions about the external world are altered from within. And one of the most remarkable manifestations of this is the Benjamin Franklin Effect, which McRaney examines in the third chapter. The self-delusion in question is that we do nice things for people we like and bad things to those we dislike. But the psychology behind the effect reveals quite the opposite: a reverse-engineering of attitudes that takes place as we grow to like people for whom we do nice things and to dislike those to whom we are unkind.
This curious effect is named after a specific incident early in the Founding Father’s political career. Franklin, born one of seventeen children to poor parents, entered this world — his parents’ and society’s priorities favoring his siblings over him — with very low odds of becoming an educated scientist, gentleman, scholar, entrepreneur, and, perhaps most of all, a man of significant political power. To compensate for his unfavorable givens, he quickly learned formidable people skills and became “a master of the game of personal politics.” McRaney writes:
Like many people full of drive and intelligence born into a low station, Franklin developed strong people skills and social powers. All else denied, the analytical mind will pick apart behavior, and Franklin became adroit at human relations. From an early age, he was a talker and a schemer, a man capable of guile, cunning, and persuasive charm. He stockpiled a cache of secret weapons, one of which was the Benjamin Franklin effect, a tool as useful today as it was in the 1730s and still just as counterintuitive.
At age twenty-one, he formed a “club of mutual improvement” called the Junto. It was a grand scheme to gobble up knowledge. He invited working-class polymaths like him to have the chance to pool together their books and trade thoughts and knowledge of the world on a regular basis. They wrote and recited essays, held debates, and devised ways to acquire currency. Franklin used the Junto as a private consulting firm, a think tank, and he bounced ideas off the other members so he could write and print better pamphlets. Franklin eventually founded the first subscription library in America, writing that it would make “the common tradesman and farmers as intelligent as most gentlemen from other countries,” not to mention give him access to whatever books he wanted to buy.
This is where his eponymous effect comes into play: When Franklin ran for his second term as a clerk, a peer whose name he never mentions in his autobiography delivered a long election speech censuring Franklin and tarnishing his reputation. Although Franklin won, he was furious with his opponent and, observing that this was “a gentleman of fortune and education” who might one day come to hold great power in government, grew rather concerned about future frictions with him.
The troll had to be tamed, and tamed shrewdly. McRaney writes:
Franklin set out to turn his hater into a fan, but he wanted to do it without “paying any servile respect to him.” Franklin’s reputation as a book collector and library founder gave him a standing as a man of discerning literary tastes, so Franklin sent a letter to the hater asking if he could borrow a specific selection from his library, one that was a “very scarce and curious book.” The rival, flattered, sent it right away. Franklin sent it back a week later with a thank-you note. Mission accomplished. The next time the legislature met, the man approached Franklin and spoke to him in person for the first time. Franklin said the man “ever after manifested a readiness to serve me on all occasions, so that we became great friends, and our friendship continued to his death.”
Instant pause-giver: In what universe does inducing an opponent to do you a favor magically turn him into a supporter? This, it turns out, shares a psychological basis with the reason why the art of asking is the art of cultivating community — and, McRaney explains, it has a lot to do with the psychology of attitudes, those clusters of convictions about and emotional impressions of a person or a situation:
For many things, your attitudes came from actions that led to observations that led to explanations that led to beliefs. Your actions tend to chisel away at the raw marble of your persona, carving into being the self you experience from day to day. It doesn’t feel that way, though. To conscious experience, it feels as if you were the one holding the chisel, motivated by existing thoughts and beliefs. It feels as though the person wearing your pants performed actions consistent with your established character, yet there is plenty of research suggesting otherwise. The things you do often create the things you believe.
At the lowest level, behavior-into-attitude conversion begins with impression management theory, which says you present to your peers the person you wish to be. You engage in something economists call signaling by buying and displaying to your peers the sorts of things that give you social capital… Whatever are the easiest-to-obtain, loudest forms of the ideals you aspire to portray become the things you own, such as bumper stickers signaling to the world you are in one group and not another. These things then influence you to become the sort of person who owns them.
Anxiety over being ostracized, over being an outsider, has driven the behavior of billions for millions of years. Impression management theory says you are always thinking about how you appear to others, even when there are no others around. In the absence of onlookers, deep in your mind a mirror reflects back that which you have done, and when you see a person who has behaved in a way that could get you booted from your in-group, the anxiety drives you to seek a realignment.
This brings us to the chicken-or-the-egg question of whether the belief or the display came first. According to self-perception theory, we are both observers and narrators of our own experience — we see ourselves do something and, unable to pin down our motive, we try to make sense of it by constructing a plausible story. We then form beliefs about ourselves based on observing our actions, as narrated by that story, which of course is based on our existing beliefs in the first place. This is what happened to Franklin’s nemesis: He observed himself performing an act of kindness toward Franklin, which he explained to himself by constructing the most plausible story — that he did so willfully, because he liked Franklin after all.
This, as we’ve previously seen in the way we rationalize our dishonesty, is an example of cognitive dissonance, a mental affliction that befalls us all as we struggle to reconcile conflicting ideas about ourselves, others, or a situation. McRaney points to the empirical evidence:
You can see the proof in an MRI scan of someone presented with political opinions that conflict with her own. The brain scans of a person shown statements that oppose her political stance show that the highest areas of the cortex, the portions responsible for providing rational thought, get less blood until another statement is presented that confirms her beliefs. Your brain literally begins to shut down when you feel your ideology is threatened.
One of the most vivid examples of this process in action comes from a Stanford study:
Students … signed up for a two-hour experiment called “Measures of Performance” as a requirement to pass a class. Researchers divided them into two groups. One was told they would receive $1 (about $8 in today’s money). The other group was told they would receive $20 (about $150 in today’s money). The scientists then explained that the students would be helping improve the research department by evaluating a new experiment. They were then led into a room where they had to use one hand to place wooden spools into a tray and remove them over and over again. A half hour later, the task changed to turning square pegs clockwise on a flat board one-quarter spin at a time for half an hour. All the while, an experimenter watched and scribbled. It was one hour of torturous tedium, with a guy watching and taking notes. After the hour was up, the researcher asked the student if he could do the school a favor on his way out by telling the next student scheduled to perform the tasks, who was waiting outside, that the experiment was fun and interesting. Finally, after lying, people in both groups — one with one dollar in their pocket and one with twenty dollars — filled out a survey in which they were asked their true feelings about the study.
Something extraordinary and baffling had happened: The students who were paid $20 lied to their peers but reported in the survey, as expected, that they’d just endured an hour of mind-numbing tedium. But those who were only paid a dollar completely internalized the lie, reporting even in the survey that they found the task stimulating. The first group, the researchers concluded, were able to justify both the tedium and the lie with the dollar amount of their compensation, but the second group, having been paid hardly anything, had no external justification and instead had to assuage their mental unease by convincing themselves that it was all inherently worth it. McRaney extends the insight to the broader question of volunteerism:
This is why volunteering feels good and unpaid interns work so hard. Without an obvious outside reward you create an internal one. That’s the cycle of cognitive dissonance; a painful confusion about who you are gets resolved by seeing the world in a more satisfying way.
This dynamic plays out in reverse as well — as the infamous Stanford Prison Experiment demonstrated, being induced to perform unkind behaviors makes us develop unkind attitudes. It all brings us back to Franklin’s foe-turned-friend:
When you feel anxiety over your actions, you will seek to lower the anxiety by creating a fantasy world in which your anxiety can’t exist, and then you come to believe the fantasy is reality, just as Benjamin Franklin’s rival did. He couldn’t possibly have lent a rare book to a guy he didn’t like, so he must actually like him. Problem solved.
The Benjamin Franklin effect is the result of your concept of self coming under attack. Every person develops a persona, and that persona persists because inconsistencies in your personal narrative get rewritten, redacted, and misinterpreted. If you are like most people, you have high self-esteem and tend to believe you are above average in just about every way. It keeps you going, keeps your head above water, so when the source of your own behavior is mysterious you will confabulate a story that paints you in a positive light. If you are on the other end of the self-esteem spectrum and tend to see yourself as undeserving and unworthy, [you] will rewrite nebulous behavior as the result of attitudes consistent with the persona of an incompetent person, deviant, or whatever flavor of loser you believe yourself to be. Successes will make you uncomfortable, so you will dismiss them as flukes. If people are nice to you, you will assume they have ulterior motives or are mistaken. Whether you love or hate your persona, you protect the self with which you’ve become comfortable. When you observe your own behavior, or feel the gaze of an outsider, you manipulate the facts so they match your expectations.
Indeed, Franklin noted in his autobiography: “He that has once done you a kindness will be more ready to do you another, than he whom you yourself have obliged.” McRaney leaves us with some grounding advice:
Pay attention to when the cart is getting before the horse. Notice when a painful initiation leads to irrational devotion, or when unsatisfying jobs start to seem worthwhile. Remind yourself pledges and promises have power, as do uniforms and parades. Remember in the absence of extrinsic rewards you will seek out or create intrinsic ones. Take into account [that] the higher the price you pay for your decisions the more you value them. See that ambivalence becomes certainty with time. Realize that lukewarm feelings become stronger once you commit to a group, club, or product. Be wary of the roles you play and the acts you put on, because you tend to fulfill the labels you accept. Above all, remember the more harm you cause, the more hate you feel. The more kindness you express, the more you come to love those you help.
You Are Now Less Dumb is excellent in its entirety, exploring such facets of our self-delusion as why we see patterns where there aren’t any, how we often confuse the origin of our emotional states, and more. Complement it with its prequel, then treat yourself to McRaney’s excellent podcast.
“How we spend our days,” Annie Dillard wrote in her timelessly beautiful meditation on presence over productivity, “is, of course, how we spend our lives.” And nowhere do we fail at the art of presence more miserably and more tragically than in urban life — in the city, high on the cult of productivity, where we float past each other, past the buildings and trees and the little boy in the purple pants, past life itself, cut off from the breathing of the world by iPhone earbuds and solipsism. And yet: “The art of seeing has to be learned,” Marguerite Duras reverberates — and it can be learned, as cognitive scientist Alexandra Horowitz invites us to believe in her breathlessly wonderful On Looking: Eleven Walks with Expert Eyes (public library), also among the year’s best psychology books — a record of her quest to walk around a city block with eleven different “experts,” from an artist to a geologist to a dog, and emerge with fresh eyes mesmerized by the previously unseen fascinations of a familiar world. It is undoubtedly one of the most stimulating books of the year, if not the decade, and the most enchanting thing I’ve read in ages. In a way, it’s the opposite but equally delightful mirror image of Christoph Niemann’s Abstract City — a concrete, immersive examination of urbanity — blending the mindfulness of Sherlock Holmes with the expansive sensitivity of Thoreau.
Horowitz begins by pointing our attention to the incompleteness of our experience of what we conveniently call “reality”:
Right now, you are missing the vast majority of what is happening around you. You are missing the events unfolding in your body, in the distance, and right in front of you.
By marshaling your attention to these words, helpfully framed in a distinct border of white, you are ignoring an unthinkably large amount of information that continues to bombard all of your senses: the hum of the fluorescent lights, the ambient noise in a large room, the places your chair presses against your legs or back, your tongue touching the roof of your mouth, the tension you are holding in your shoulders or jaw, the map of the cool and warm places on your body, the constant hum of traffic or a distant lawn-mower, the blurred view of your own shoulders and torso in your peripheral vision, a chirp of a bug or whine of a kitchen appliance.
This adaptive ignorance, she argues, is there for a reason — we celebrate it as “concentration” and welcome its way of easing our cognitive overload by allowing us to conserve our precious mental resources only for the stimuli of immediate and vital importance, and to dismiss or entirely miss all else. (“Attention is an intentional, unapologetic discriminator,” Horowitz tells us. “It asks what is relevant right now, and gears us up to notice only that.”) But while this might make us more efficient in our goal-oriented day-to-day, it also makes us inhabit a largely unlived — and unremembered — life, day in and day out.
For Horowitz, the awakening to this incredible, invisible backdrop of life came thanks to Pumpernickel, her “curly haired, sage mixed breed” (who also inspired Horowitz’s first book, the excellent Inside of a Dog: What Dogs See, Smell, and Know), as she found herself taking countless walks around the block, becoming more and more aware of the dramatically different experiences she and her canine companion were having along the exact same route:
Minor clashes between my dog’s preferences as to where and how a walk should proceed and my own indicated that I was experiencing almost an entirely different block than my dog. I was paying so little attention to most of what was right before us that I had become a sleepwalker on the sidewalk. What I saw and attended to was exactly what I expected to see; what my dog showed me was that my attention invited along attention’s companion: inattention to everything else.
The book was her answer to the disconnect, an effort to “attend to that inattention.” It is not, she warns us, “about how to bring more focus to your reading of Tolstoy or how to listen more carefully to your spouse.” Rather, it is an invitation to the art of observation:
Together, we became investigators of the ordinary, considering the block — the street and everything on it—as a living being that could be observed.
In this way, the familiar becomes unfamiliar, and the old the new.
Horowitz’s approach is based on two osmotic human tendencies: our shared capacity to truly see what is in front of us, despite our conditioned concentration that obscures it, and the power of individual bias in perception — or what we call “expertise,” acquired by passion or training or both — in bringing attention to elements that elude the rest of us. What follows is a whirlwind of endlessly captivating exercises in attentive bias as Horowitz, with her archetypal New Yorker’s “special fascination with the humming life-form that is an urban street,” and her diverse companions take to the city.
First, she takes a walk all by herself, trying to note everything observable, and we quickly realize that besides her deliciously ravenous intellectual curiosity, Horowitz is a rare magician with language. (“The walkers trod silently; the dogs said nothing. The only sound was the hum of air conditioners,” she beholds her own block; passing a pile of trash bags graced by a stray Q-tip, she ponders parenthetically, “how does a Q-tip escape?”; turning her final corner, she gazes at the entrance of a mansion and “its pair of stone lions waiting patiently for royalty that never arrives.” Stunning.)
I would find myself at once alarmed, delighted, and humbled at the limitations of my ordinary looking. My consolation is that this deficiency of mine is quite human. We see, but we do not see: we use our eyes, but our gaze is glancing, frivolously considering its object. We see the signs, but not their meanings. We are not blinded, but we have blinders.
In 1865, when he was only thirty, Mark Twain penned a playful short story mischievously encouraging girls to think independently rather than blindly obey rules and social mores. In the summer of 2011, I chanced upon and fell in love with a lovely Italian edition of this little-known gem with Victorian-scrapbook-inspired artwork by celebrated Russian-born children’s book illustrator Vladimir Radunsky. I knew the book had to come to life in English, so I partnered with the wonderful Claudia Zoe Bedrick of Brooklyn-based indie publishing house Enchanted Lion, maker of extraordinarily beautiful picture-books, and we spent the next two years bringing Advice to Little Girls (public library) to life in America — a true labor-of-love project full of so much delight for readers of all ages. (And how joyous to learn that it was also selected among NPR’s best books of 2013!)
Good little girls ought not to make mouths at their teachers for every trifling offense. This retaliation should only be resorted to under peculiarly aggravated circumstances.
If you have nothing but a rag-doll stuffed with sawdust, while one of your more fortunate little playmates has a costly China one, you should treat her with a show of kindness nevertheless. And you ought not to attempt to make a forcible swap with her unless your conscience would justify you in it, and you know you are able to do it.
One can’t help but wonder whether this particular bit may have in part inspired the irreverent 1964 anthology Beastly Boys and Ghastly Girls and its mischievous advice on brother-sister relations:
If at any time you find it necessary to correct your brother, do not correct him with mud — never, on any account, throw mud at him, because it will spoil his clothes. It is better to scald him a little, for then you obtain desirable results. You secure his immediate attention to the lessons you are inculcating, and at the same time your hot water will have a tendency to remove impurities from his person, and possibly the skin, in spots.
If your mother tells you to do a thing, it is wrong to reply that you won’t. It is better and more becoming to intimate that you will do as she bids you, and then afterward act quietly in the matter according to the dictates of your best judgment.
Good little girls always show marked deference for the aged. You ought never to ‘sass’ old people unless they ‘sass’ you first.
Originally featured in April — see more spreads, as well as the story behind the project, here.
The ideas presented on Edge are speculative; they represent the frontiers in such areas as evolutionary biology, genetics, computer science, neurophysiology, psychology, cosmology, and physics. Emerging out of these contributions is a new natural philosophy, new ways of understanding physical systems, new ways of thinking that call into question many of our basic assumptions.
Perhaps the greatest pleasure in science comes from theories that derive the solution to some deep puzzle from a small set of simple principles in a surprising way. These explanations are called ‘beautiful’ or ‘elegant.’
The contributions presented here embrace scientific thinking in the broadest sense: as the most reliable way of gaining knowledge about anything — including such fields of inquiry as philosophy, mathematics, economics, history, language, and human behavior. The common thread is that a simple and nonobvious idea is proposed as the explanation of a diverse and complicated set of phenomena.
Observe a single ant, and it doesn’t make much sense, walking in one direction, suddenly careening in another for no obvious reason, doubling back on itself. Thoroughly unpredictable.
The same happens with two ants, a handful of ants. But a colony of ants makes fantastic sense. Specialized jobs, efficient means of exploiting new food sources, complex underground nests with temperature regulated within a few degrees. And critically, there’s no blueprint or central source of command—each individual ant has algorithms for its behaviors. But this is not wisdom of the crowd, where a bunch of reasonably informed individuals outperform a single expert. The ants aren’t reasonably informed about the big picture. Instead, the behavior algorithms of each ant consist of a few simple rules for interacting with the local environment and local ants. And out of this emerges a highly efficient colony.
Ant colonies excel at generating trails that connect locations in the shortest possible way, accomplished with simple rules about when to lay down a pheromone trail and what to do when encountering someone else’s trail—approximations of optimal solutions to the Traveling Salesman problem. This has useful applications. In “ant-based routing,” simulations using virtual ants with similar rules can generate optimal ways of connecting the nodes in a network, something of great interest to telecommunications companies. It applies to the developing brain, which must wire up vast numbers of neurons with vaster numbers of connections without constructing millions of miles of connecting axons. And migrating fetal neurons generate an efficient solution with a different version of ant-based routing.
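To make the mechanism concrete, here is a minimal sketch of pheromone-trail routing in Python. It is a toy under stated assumptions: the four-node graph, the evaporation rate, and the deposit rule are all made up for illustration, not taken from any telecom system or from the book. Each virtual ant follows the same two local rules the passage describes: prefer edges with strong pheromone relative to their length, and reinforce whatever path it ultimately took, more strongly the shorter it was.

```python
# A toy "ant-based routing" sketch: virtual ants find the short route A-B-D
# in a hypothetical graph using only local pheromone rules.
import random

# Undirected graph as {node: {neighbor: distance}} (illustrative data only)
GRAPH = {
    "A": {"B": 1.0, "C": 4.0},
    "B": {"A": 1.0, "D": 1.5},
    "C": {"A": 4.0, "D": 1.0},
    "D": {"B": 1.5, "C": 1.0},
}

# Every directed edge starts with the same small amount of pheromone
pheromone = {(u, v): 1.0 for u in GRAPH for v in GRAPH[u]}

def walk(src, dst):
    """One ant walks src -> dst, weighting each step by pheromone / distance."""
    path, node = [src], src
    while node != dst:
        choices = [n for n in GRAPH[node] if n not in path]  # no revisits
        if not choices:
            return None  # dead end; this ant's walk is discarded
        weights = [pheromone[(node, n)] / GRAPH[node][n] for n in choices]
        node = random.choices(choices, weights)[0]
        path.append(node)
    return path

def length(path):
    return sum(GRAPH[a][b] for a, b in zip(path, path[1:]))

for _ in range(200):  # colony iterations
    paths = [p for p in (walk("A", "D") for _ in range(10)) if p]
    for edge in pheromone:   # evaporation: stale trails fade away
        pheromone[edge] *= 0.9
    for p in paths:          # deposit: shorter paths earn more pheromone
        for a, b in zip(p, p[1:]):
            pheromone[(a, b)] += 1.0 / length(p)
            pheromone[(b, a)] += 1.0 / length(p)

# The strongest surviving trail should lie on the short route A-B-D
print(max(pheromone, key=pheromone.get))
```

No individual ant knows the topology of the graph; yet after a few hundred iterations the pheromone concentrates on the shortest route, the same local-rules-to-global-efficiency dynamic the colony exhibits.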
A wonderful example is how local rules about attraction and repulsion (i.e., positive and negative charges) allow simple molecules in an organic soup to occasionally form more complex ones. Life may have originated this way without the requirement of bolts of lightning to catalyze the formation of complex molecules.
And why is self-organization so beautiful to my atheistic self? Because if complex, adaptive systems don’t require a blueprint, they don’t require a blueprint maker. If they don’t require lightning bolts, they don’t require Someone hurling lightning bolts.
On a planet occupied now by seven billion inhabitants, I am amazed by the difference that one human being can make. Think of classical music without Mozart or Stravinsky; of painting without Caravaggio, Picasso or Pollock; of drama without Shakespeare or Beckett. Think of the incredible contributions of Michelangelo or Leonardo, or, in recent times, the outpouring of deep feeling at the death of Steve Jobs (or, for that matter, Michael Jackson or Princess Diana). Think of human values in the absence of Moses or Christ.
Despite the laudable efforts of scientists to ferret out patterns in human behavior, I continue to be struck by the impact of single individuals, or of small groups, working against the odds. As scholars, we cannot and should not sweep these instances under the investigative rug. We should bear in mind anthropologist Margaret Mead’s famous injunction: ‘Never doubt that a small group of thoughtful committed citizens can change the world. It is the only thing that ever has.’
In art, the title of a work can often be its first explanation. And in this context I am thinking especially of the titles of Gerhard Richter. In 2006, when I visited Richter in his studio in Cologne, he had just finished a group of six corresponding abstract paintings which he gave the title Cage.
There are many relations between Richter’s painting and the compositions of John Cage. In a book about the Cage series, Robert Storr has traced them from Richter’s attendance at a Cage performance at the Festum Fluxorum Fluxus in Düsseldorf in 1963 to analogies in their artistic processes. Cage often applied chance procedures in his compositions, notably with the use of the I Ching. Richter in his abstract paintings also intentionally allows effects of chance. In these paintings, he applies the oil paint on the canvas by means of a large squeegee. He selects the colors on the squeegee, but the factual trace that the paint leaves on the canvas is to a large extent the outcome of chance.
Richter’s concise title, Cage, can be unfolded into an extensive interpretation of these abstract paintings (and of other works)—but, one can say, the short form already contains everything. The title, like an explanation of a phenomenon, unlocks the works, describing their relation to one of the most important cultural figures of the twentieth century, John Cage, who shares with Richter the great themes of chance and uncertainty.
Writer, artist, and designer Douglas Coupland, whose biography of Marshall McLuhan remains indispensable, offers a lyrical meditation on the peculiar odds behind coincidences and déjà vus:
I take comfort in the fact that there are two human moments that seem to be doled out equally and democratically within the human condition—and that there is no satisfying ultimate explanation for either. One is coincidence, the other is déjà vu. It doesn’t matter if you’re Queen Elizabeth, one of the thirty-three miners rescued in Chile, a South Korean housewife or a migrant herder in Zimbabwe—in the span of 365 days you will pretty much have two déjà vus as well as one coincidence that makes you stop and say, “Wow, that was a coincidence.”
The thing about coincidence is that when you imagine the umpteen trillions of coincidences that can happen at any given moment, the fact is, that in practice, coincidences almost never do occur. Coincidences are actually so rare that when they do occur they are, in fact, memorable. This suggests to me that the universe is designed to ward off coincidence whenever possible—the universe hates coincidence—I don’t know why—it just seems to be true. So when a coincidence happens, that coincidence had to work awfully hard to escape the system. There’s a message there. What is it? Look. Look harder. Mathematicians perhaps have a theorem for this, and if they do, it might, by default, be a theorem for something larger than what they think it is.
What’s both eerie and interesting to me about déjà vus is that they occur almost like metronomes throughout our lives, about one every six months, a poetic timekeeping device that, at the very least, reminds us we are alive. I can safely assume that my thirteen-year-old niece, Stephen Hawking, and someone working in a Beijing luggage-making factory each experience two déjà vus a year. Not one. Not three. Two.
The underlying biodynamics of déjà vus is probably ascribable to some sort of tingling neurons in a certain part of the brain, yet this doesn’t tell us why they exist. They seem to me to be a signal from a larger point of view that wants to remind us that our lives are distinct, that they have meaning, and that they occur throughout a span of time. We are important, and what makes us valuable to the universe is our sentience and our curse and blessing of perpetual self-awareness.
Given my soft spot for famous diaries, it should come as no surprise that I keep one myself. Perhaps the greatest gift of the practice has been the daily habit of reading what I had written on that day a year earlier; not only is it a remarkable tool of introspection and self-awareness, but it also illustrates that our memory “is never a precise duplicate of the original [but] a continuing act of creation” and how flawed our perception of time is — almost everything that occurred a year ago appears as having taken place either significantly further in the past (“a different lifetime,” I’d often marvel at this time-illusion) or significantly more recently (“this feels like just last month!”). Rather than a personal deficiency of those of us befallen by this tendency, however, it turns out to be a defining feature of how the human mind works, the science of which is at first unsettling, then strangely comforting, and altogether intensely interesting.
That’s precisely what acclaimed BBC broadcaster and psychology writer Claudia Hammond explores in Time Warped: Unlocking the Mysteries of Time Perception (public library) — a fascinating foray into the idea that our experience of time is actively created by our own minds and how these sensations of what neuroscientists and psychologists call “mind time” are created, and also among the year’s best psychology books. As disorienting as the concept might seem — after all, we’ve been nursed on the belief that time is one of those few utterly reliable and objective things in life — it is also strangely empowering to think that the very phenomenon depicted as the unforgiving dictator of life is something we might be able to shape and benefit from. Hammond writes:
We construct the experience of time in our minds, so it follows that we are able to change the elements we find troubling — whether it’s trying to stop the years racing past, or speeding up time when we’re stuck in a queue, trying to live more in the present, or working out how long ago we last saw our old friends. Time can be a friend, but it can also be an enemy. The trick is to harness it, whether at home, at work, or even in social policy, and to work in line with our conception of time. Time perception matters because it is the experience of time that roots us in our mental reality. Time is not only at the heart of the way we organize life, but the way we experience it.
Among the most intriguing illustrations of “mind time” is the incredible elasticity of how we experience time. (“Where is it, this present?,” William James famously wondered. “It has melted in our grasp, fled ere we could touch it, gone in the instant of becoming.”) For instance, Hammond points out, we slow time down when gripped by mortal fear — the cliché about the slow-motion car crash is, in fact, a cognitive reality. This plays out even in situations that aren’t life-or-death per se but are still associated with strong feelings of fear. Hammond points to a study in which people with arachnophobia were asked to look at spiders — the very object of their intense fear — for 45 seconds and they overestimated the elapsed time. The same pattern was observed in novice skydivers, who estimated the duration of their peers’ falls as short, whereas their own, from the same altitude, were deemed longer.
Inversely, time seems to speed up as we get older — a phenomenon that competing theories have attempted to make sense of. One, known as the “proportionality theory,” uses pure mathematics, holding that a year feels faster when you’re 40 than when you’re 8 because it only constitutes one fortieth of your life rather than a whole eighth. Among its famous proponents are Vladimir Nabokov and William James. But Hammond remains unconvinced:
The problem with the proportionality theory is that it fails to account for the way we experience time at any one moment. We don’t judge one day in the context of our whole lives. If we did, then for a 40-year-old every single day should flash by because it is less than one fourteen-thousandth of the life they’ve had so far. It should be fleeting and inconsequential, yet if you have nothing to do or an enforced wait at an airport for example, a day at 40 can still feel long and boring and surely longer than a fun day at the seaside packed with adventure for a child. … It ignores attention and emotion, which … can have a considerable impact on time perception.
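For concreteness, the arithmetic both the theory and Hammond’s rebuttal invoke reduces to a single ratio. This is a rough formalization of the proportionality claim, not notation Hammond herself uses: the felt weight of an interval is its length divided by the age of the person experiencing it.

```latex
w(\Delta t, T) = \frac{\Delta t}{T}
\qquad\Longrightarrow\qquad
w(1\,\text{yr},\, 8) = \tfrac{1}{8},\quad
w(1\,\text{yr},\, 40) = \tfrac{1}{40},\quad
w(1\,\text{day},\, 40) \approx \tfrac{1}{40 \times 365.25} \approx \tfrac{1}{14{,}610}
```

The same ratio yields both the theory’s one-fortieth for a year at 40 and the one-fourteen-thousandth for a single day that Hammond wields against it.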
Another theory suggests that perhaps it is the tempo of life in general that has accelerated, making things from the past appear slower, including the passage of time itself.
But one definite change does take place with age: As we grow older, we tend to feel like the previous decade elapsed more rapidly, while the earlier decades of our lives seem to have lasted longer. Similarly, we tend to think of events that took place in the past 10 years as having happened more recently than they actually did. (Quick: What year did the devastating Japanese tsunami hit? When did we lose Maurice Sendak?) Conversely, we perceive events that took place more than a decade ago as having happened even longer ago. (When did Princess Diana die? What year was the Chernobyl disaster?) This, Hammond points out, is known as “forward telescoping”:
It is as though time has been compressed and — as if looking through a telescope — things seem closer than they really are. The opposite is called backward or reverse telescoping, also known as time expansion. This is when you guess that events happened longer ago than they really did. This is rare for distant events, but not uncommon for recent weeks.
The most straightforward explanation for it is called the clarity of memory hypothesis, proposed by the psychologist Norman Bradburn in 1987. This is the simple idea that because we know that memories fade over time, we use the clarity of a memory as a guide to its recency. So if a memory seems unclear we assume it happened longer ago.
“Still this childish fascination with my handwriting,” young Susan Sontag wrote in her diary in 1949. “To think that I always have this sensuous potentiality glowing within my fingers.” This is the sort of sensuous potentiality that comes aglow in Self-Portrait as Your Traitor (public library) — the magnificent collection of hand-lettered poems and illustrated essays by friend-of-Brain-Pickings and frequent contributor Debbie Millman. In the introduction, design legend Paula Scher aptly describes this singular visual form as a “21st-century illuminated manuscript.” Personal bias aside, these moving, lovingly crafted poems and essays — some handwritten, some drawn with colored pencils, some typeset in felt on felt — vibrate at that fertile intersection of the deeply personal and the universally profound.
John Maeda once explained, “The computer will do anything within its abilities, but it will do nothing unless commanded to do so.” I think people are the same — we like to operate within our abilities. But whereas the computer has a fixed code, our abilities are limited only by our perceptions. Two decades since determining my code, and after 15 years of working in the world of branding, I am now in the process of rewriting the possibilities of what comes next. I don’t know exactly what I will become; it is not something I can describe scientifically or artistically. Perhaps it is a “code in progress.”
Self-Portrait as Your Traitor, a glorious large-format tome full of textured colors to which the screen does absolutely no justice, is the result of this progress — a brave and heartening embodiment of what it truly means, as Rilke put it, to live the questions; the stunning record of one woman’s personal and artistic code-rewriting, brimming with wisdom on life and art for all.
Originally featured in November. See an exclusive excerpt here, then take a peek at Debbie’s creative process here.
6. SUSAN SONTAG: THE COMPLETE ROLLING STONE INTERVIEW
In 1978, Rolling Stone contributing editor Jonathan Cott interviewed Susan Sontag in twelve hours of conversation, beginning in Paris and continuing in New York, only a third of which was published in the magazine. More than three decades later and almost a decade after Sontag’s death, the full, wide-ranging magnificence of their tête-à-tête, spanning literature, philosophy, illness, mental health, music, art, and much more, is at last released in Susan Sontag: The Complete Rolling Stone Interview (public library) — a rare glimpse of one of modern history’s greatest minds in her element.
Cott marvels at what made the dialogue especially extraordinary:
Unlike almost any other person whom I’ve ever interviewed — the pianist Glenn Gould is the one other exception — Susan spoke not in sentences but in measured and expansive paragraphs. And what seemed most striking to me was the exactitude and “moral and linguistic fine-tuning” — as she once described Henry James’s writing style — with which she framed and elaborated her thoughts, precisely calibrating her intended meanings with parenthetical remarks and qualifying words (“sometimes,” “occasionally,” “usually,” “for the most part,” “in almost all cases”), the munificence and fluency of her conversation manifesting what the French refer to as an ivresse du discours — an inebriation with the spoken word. “I am hooked on talk as a creative dialogue,” she once remarked in her journals, and added: “For me, it’s the principal medium of my salvation.”
I really believe in history, and that’s something people don’t believe in anymore. I know that what we do and think is a historical creation. I have very few beliefs, but this is certainly a real belief: that most everything we think of as natural is historical and has roots — specifically in the late eighteenth and early nineteenth centuries, the so-called Romantic revolutionary period — and we’re essentially still dealing with expectations and feelings that were formulated at that time, like ideas about happiness, individuality, radical social change, and pleasure. We were given a vocabulary that came into existence at a particular historical moment. So when I go to a Patti Smith concert at CBGB, I enjoy, participate, appreciate, and am tuned in better because I’ve read Nietzsche.
What I want is to be fully present in my life — to be really where you are, contemporary with yourself in your life, giving full attention to the world, which includes you. You are not the world, the world is not identical to you, but you’re in it and paying attention to it. That’s what a writer does — a writer pays attention to the world. Because I’m very against this solipsistic notion that you find it all in your head. You don’t, there really is a world that’s there whether you’re in it or not.
I want to feel as responsible as I possibly can. As I told you before, I hate feeling like a victim, which not only gives me no pleasure but also makes me feel very uncomfortable. Insofar as it’s possible, and not crazy, I want to enlarge to the furthest extent possible my sense of my own autonomy, so that in friendship and love relationships I’m eager to take responsibility for both the good and the bad things. I don’t want this attitude of “I was so wonderful and that person did me in.” Even when it’s sometimes true, I’ve managed to convince myself that I was at least co-responsible for bad things that have happened to me, because it actually makes me feel stronger and makes me feel that things could perhaps be different.
The conversation, in which Sontag reaches unprecedented depths of self-revelation, also debunks some misconceptions about her public image as an intellectual in the dry, scholarly sense of the term:
Most of what I do, contrary to what people think, is so intuitive and unpremeditated and not at all that kind of cerebral, calculating thing people imagine it to be. I’m just following my instincts and intuitions. […] An argument appears to me much more like the spokes of a wheel than the links of a chain.
A lot of our ideas about what we can do at different ages and what age means are so arbitrary — as arbitrary as sexual stereotypes. I think that the young-old polarization and the male-female polarization are perhaps the two leading stereotypes that imprison people. The values associated with youth and with masculinity are considered to be the human norms, and anything else is taken to be at least less worthwhile or inferior. Old people have a terrific sense of inferiority. They’re embarrassed to be old. What you can do when you’re young and what you can do when you’re old is as arbitrary and without much basis as what you can do if you’re a woman or what you can do if you’re a man.
Originally featured in November — take a closer look here and here.
Cautious that “all advice can only be a product of the man who gives it” — a caveat other literary legends have stressed with varying degrees of irreverence — Thompson begins with a necessary disclaimer about the very notion of advice-giving:
To give advice to a man who asks what to do with his life implies something very close to egomania. To presume to point a man to the right and ultimate goal — to point with a trembling finger in the RIGHT direction is something only a fool would take upon himself.
And yet he honors his friend’s request, turning to Shakespeare for an anchor of his own advice:
“To be, or not to be: that is the question: Whether ’tis nobler in the mind to suffer the slings and arrows of outrageous fortune, or to take arms against a sea of troubles…”
And indeed, that IS the question: whether to float with the tide, or to swim for a goal. It is a choice we must all make consciously or unconsciously at one time in our lives. So few people understand this! Think of any decision you’ve ever made which had a bearing on your future: I may be wrong, but I don’t see how it could have been anything but a choice — however indirect — between the two things I’ve mentioned: the floating or the swimming.
He acknowledges the obvious question of why not take the path of least resistance and float aimlessly, then counters it:
The answer — and, in a sense, the tragedy of life — is that we seek to understand the goal and not the man. We set up a goal which demands of us certain things: and we do these things. We adjust to the demands of a concept which CANNOT be valid. When you were young, let us say that you wanted to be a fireman. I feel reasonably safe in saying that you no longer want to be a fireman. Why? Because your perspective has changed. It’s not the fireman who has changed, but you.
Every man is the sum total of his reactions to experience. As your experiences differ and multiply, you become a different man, and hence your perspective changes. This goes on and on. Every reaction is a learning process; every significant experience alters your perspective.
So it would seem foolish, would it not, to adjust our lives to the demands of a goal we see from a different angle every day? How could we ever hope to accomplish anything other than galloping neurosis?
The answer, then, must not deal with goals at all, or not with tangible goals, anyway. It would take reams of paper to develop this subject to fulfillment. God only knows how many books have been written on “the meaning of man” and that sort of thing, and god only knows how many people have pondered the subject. (I use the term “god only knows” purely as an expression.) There’s very little sense in my trying to give it to you in the proverbial nutshell, because I’m the first to admit my absolute lack of qualifications for reducing the meaning of life to one or two paragraphs.
To put our faith in tangible goals would seem to be, at best, unwise. So we do not strive to be firemen, we do not strive to be bankers, nor policemen, nor doctors. WE STRIVE TO BE OURSELVES.
But don’t misunderstand me. I don’t mean that we can’t BE firemen, bankers, or doctors—but that we must make the goal conform to the individual, rather than make the individual conform to the goal. In every man, heredity and environment have combined to produce a creature of certain abilities and desires—including a deeply ingrained need to function in such a way that his life will be MEANINGFUL. A man has to BE something; he has to matter.
As I see it then, the formula runs something like this: a man must choose a path which will let his ABILITIES function at maximum efficiency toward the gratification of his DESIRES. In doing this, he is fulfilling a need (giving himself identity by functioning in a set pattern toward a set goal), he avoids frustrating his potential (choosing a path which puts no limit on his self-development), and he avoids the terror of seeing his goal wilt or lose its charm as he draws closer to it (rather than bending himself to meet the demands of that which he seeks, he has bent his goal to conform to his own abilities and desires).
In short, he has not dedicated his life to reaching a pre-defined goal, but he has rather chosen a way of life he KNOWS he will enjoy. The goal is absolutely secondary: it is the functioning toward the goal which is important. And it seems almost ridiculous to say that a man MUST function in a pattern of his own choosing; for to let another man define your own goals is to give up one of the most meaningful aspects of life — the definitive act of will which makes a man an individual.
Noting that his friend had thus far lived “a vertical rather than horizontal existence,” Thompson acknowledges the challenge of this choice but admonishes that however difficult, the choice must be made, lest it dissolve into society’s default modes:
A man who procrastinates in his CHOOSING will inevitably have his choice made for him by circumstance. So if you now number yourself among the disenchanted, then you have no choice but to accept things as they are, or to seriously seek something else. But beware of looking for goals: look for a way of life. Decide how you want to live and then see what you can do to make a living WITHIN that way of life. But you say, “I don’t know where to look; I don’t know what to look for.”
And there’s the crux. Is it worth giving up what I have to look for something better? I don’t know — is it? Who can make that decision but you? But even by DECIDING TO LOOK, you go a long way toward making the choice.
He ends by returning to his original disclaimer, reiterating that rather than a prescription for living, his “advice” is merely a reminder that how and what we choose — choices we’re in danger of forgetting even exist — shapes the course and experience of our lives:
I’m not trying to send you out “on the road” in search of Valhalla, but merely pointing out that it is not necessary to accept the choices handed down to you by life as you know it. There is more to it than that — no one HAS to do something he doesn’t want to do for the rest of his life.
Though most of philosopher Daniel Dennett’s 77 “intuition pumps” address concrete questions, a dozen are “general-purpose” tools that apply deeply and widely, across just about any domain of thinking. The first of them is also arguably the most useful yet most uncomfortable: making mistakes.
The history of philosophy is in large measure the history of very smart people making very tempting mistakes, and if you don’t know the history, you are doomed to making the same darn mistakes all over again. … There is no such thing as philosophy-free science, just science that has been conducted without any consideration of its underlying philosophical assumptions.
He speaks for the generative potential of mistakes and their usefulness as an empirical tool:
Sometimes you don’t just want to risk making mistakes; you actually want to make them — if only to give you something clear and detailed to fix.
Therein lies the power of mistakes as a vehicle for, as Rilke famously put it, “living the questions” and thus advancing knowledge in a way that certainty cannot — for, as Richard Feynman memorably noted, the scientist’s job is to remain unsure, and so seems the philosopher’s. Dennett writes:
We philosophers are mistake specialists. … While other disciplines specialize in getting the right answers to their defining questions, we philosophers specialize in all the ways there are of getting things so mixed up, so deeply wrong, that nobody is even sure what the right questions are, let alone the answers. Asking the wrong questions risks setting any inquiry off on the wrong foot. Whenever that happens, this is a job for philosophers! Philosophy — in every field of inquiry — is what you have to do until you figure out what questions you should have been asking in the first place.
Mistakes are not just opportunities for learning; they are, in an important sense, the only opportunity for learning or making something truly new. Before there can be learning, there must be learners. There are only two non-miraculous ways for learners to come into existence: they must either evolve or be designed and built by learners that evolved. Biological evolution proceeds by a grand, inexorable process of trial and error — and without the errors the trials wouldn’t accomplish anything.
Trials can be either blind or foresighted. You, who know a lot, but not the answer to the question at hand, can take leaps — foresighted leaps. You can look before you leap, and hence be somewhat guided from the outset by what you already know. You need not be guessing at random, but don’t look down your nose at random guesses; among its wonderful products is … you!
And since evolution is the highest epitome of how the process of trial and error drives progress, Dennett makes a case for understanding evolution as a key to understanding everything else we humans value:
Evolution … is the central, enabling process not only of life but also of knowledge and learning and understanding. If you attempt to make sense of the world of ideas and meanings, free will and morality, art and science and even philosophy itself without a sound and quite detailed knowledge of evolution, you have one hand tied behind your back. … For evolution, which knows nothing, the steps into novelty are blindly taken by mutations, which are random copying “errors” in DNA.
Dennett echoes Dostoyevsky (“Above all, don’t lie to yourself. The man who lies to himself and listens to his own lie comes to a point that he cannot distinguish the truth within him, or around him, and so loses all respect for himself and for others.”) and offers the key to making productive mistakes:
The chief trick to making good mistakes is not to hide them — especially not from yourself. Instead of turning away in denial when you make a mistake, you should become a connoisseur of your own mistakes, turning them over in your mind as if they were works of art, which in a way they are. … The trick is to take advantage of the particular details of the mess you’ve made, so that your next attempt will be informed by it and not just another blind stab in the dark.
We have all heard the forlorn refrain “Well, it seemed like a good idea at the time!” This phrase has come to stand for the rueful reflection of an idiot, a sign of stupidity, but in fact we should appreciate it as a pillar of wisdom. Any being, any agent, who can truly say, “Well, it seemed like a good idea at the time!” is standing on the threshold of brilliance.
After Caroline crashes an experimental plane she is piloting, she finds herself severely injured and spiraling into the depths of depression. It both helps and doesn’t that Caroline and Wendy have just fallen in love, soaring in the butterfly heights of new romance, “the phase of love that didn’t obey any known rules of physics,” until the crash pulls them into a place that would challenge even the most seasoned and grounded of relationships. And yet they persevere as Wendy patiently and lovingly takes care of Caroline.
When Caroline returns from the hospital with a shattered ankle, her two thirteen-year-old tabbies — the shy, anxious Tibby (short for Tibia, affectionately — and, in these circumstances, ironically — named after the shinbone) and the sociable, amicable Fibby (short for Fibula, after the calf bone on the lateral side of the tibia) — are, short of Wendy, her only joy and comfort:
Tibia and Fibula meowed happily when I arrived. They were undaunted by my ensuing stupor. In fact they were delighted; suddenly I had become a human who didn’t shout into a small rectangle of lights and plastic in her hand, peer at a computer, or get up and disappear from the vicinity, only to reappear through the front door hours later. Instead, I was completely available to them at all times. Amazed by their good luck, they took full feline advantage. They asked for ear scratches and chin rubs. They rubbed their whiskers along my face. They purred in response to my slurred, affectionate baby talk. But mostly they just settled in and went to sleep. Fibby snored into my neck. Tibby snored on the rug nearby. Meanwhile I lay awake, circling the deep dark hole of depression.
Without my cats, I would have fallen right in.
And then, one day, Tibby disappears.
Wendy and Caroline proceed to flyer the neighborhood, visit every animal shelter in the vicinity, and even, in their desperation, enlist the help of a psychic who specializes in lost pets — but to no avail. Heartbroken, they begin to mourn Tibby’s loss.
And then, one day five weeks later, Tibby reappears. But once the initial elation of the recovery has worn off, Caroline begins to wonder where he’d been and why he’d left. He is now no longer eating at home and regularly leaves the house for extended periods of time — Tibby clearly has a secret place he now returns to. Even more worrisomely, he’s no longer the shy, anxious tabby he’d been for thirteen years — instead, he’s a half pound heavier, chirpy, with “a youthful spring in his step.” But why would a happy cat abandon his loving lifelong companion and find comfort — find himself, even — elsewhere?
When the relief that my cat was safe began to fade, and the joy of his prone, snoring form — sprawled like an athlete after a celebratory night of boozing — started to wear thin, I was left with darker emotions. Confusion. Jealousy. Betrayal. I thought I’d known my cat of thirteen years. But that cat had been anxious and shy. This cat was a swashbuckling adventurer back from the high seas. What siren call could have lured him away? Was he still going to this gilded place, with its overflowing food bowls and endless treats?
There was only one obvious thing left to do: Track Tibby on his escapades. So Caroline, despite Wendy’s lovingly suppressed skepticism, heads to a spy store — yes, those exist — and purchases a real-time GPS tracker, complete with a camera that they program to take snapshots every few minutes, which they then attach to Tibby’s collar.
What follows is a wild, hilarious, and sweet tale of tinkering, tracking, and tenderness. Underpinning the obsessive quest is the subtle yet palpable subplot of Wendy and Caroline’s growing love for each other, the deepening of trust and affection that happens when two people share in a special kind of insanity.
“Every quest is a journey, every journey a story. Every story, in turn, has a moral,” writes Caroline in the final chapter, then offers several “possible morals” for the story, the last of which embodies everything that makes Lost Cat an absolute treat from cover to cover:
6. You can never know your cat. In fact, you can never know anyone as completely as you want.
In 1976, Italian artist, architect, and designer Luigi Serafini, only 27 at the time, set out to create an elaborate encyclopedia of imaginary objects and creatures that fell somewhere between Edward Gorey’s cryptic alphabets, Albertus Seba’s cabinet of curiosities, the book of surrealist games, and Alice in Wonderland. What’s more, it wasn’t written in any ordinary language but in an unintelligible alphabet of Serafini’s own invention — a script so intricate and internally consistent that it has resisted every attempt at decipherment. It took him nearly three years to complete the project, and three more to publish it, but when it was finally released, the book — a weird and wonderful masterpiece of art and philosophical provocation on the precipice of the information age — attracted a growing following that continued to gather momentum even as the original edition went out of print.
Now, for the first time in more than thirty years, Codex Seraphinianus (public library) is resurrected in a lavish new edition by Rizzoli — who have a penchant for excavating forgotten gems — featuring a new chapter by Serafini, now in his 60s, and a gorgeous signed print with each deluxe tome. Besides being a visual masterwork, it’s also a timeless meditation on what “reality” really is, one all the timelier in today’s age of such seemingly surrealist feats as bioengineering whole new lifeforms, hurling subatomic particles at each other at nearly the speed of light, and encoding an entire book onto a DNA molecule.
In an interview for Wired Italy, Serafini aptly captures the subtle similarity to children’s books in how the Codex bewitches our grown-up fancy with its bizarre beauty:
What I want my alphabet to convey to the reader is the sensation that children feel in front of books they cannot yet understand. I used it to describe analytically an imaginary world and give a coherent framework. The images originate from the clash between this fantasy vocabulary and the real world. … The Codex became so popular because it makes you feel more comfortable with your fantasies. Another world is not possible, but a fantasy one maybe is.
The [new] edition is very rich and also pricey, I know, but it’s just like psychoanalysis: Money matters and the fee is part of the process of healing. At the end of the day, the Codex is similar to the Rorschach inkblot test. You see what you want to see. You might think it’s speaking to you, but it’s just your imagination.
It tells the story of two brothers, Jack and Guy, torn asunder when a falling star crashes onto Earth. Though on the surface about the beloved author’s own brother Jack, who died 18 years ago, the story is also about the love of Sendak’s life and his partner of fifty years, psychoanalyst Eugene Glynn, whose prolonged illness and eventual loss in 2007 devastated Sendak — the character of Guy reads like a poetic fusion of Sendak and Glynn. And while the story might be a universal “love letter to those who have gone before,” as NPR’s Renee Montagne suggests in Morning Edition, it is in equal measure a private love letter to Glynn. (Sendak passed away the day before President Obama announced his support for same-sex marriage, but Sendak fans were quick to honor both historic moments with a bittersweet homage.)
Indeed, the theme of all-consuming love manifests viscerally in Sendak’s books. Playwright Tony Kushner, a longtime close friend of Sendak’s and one of his most heartfelt mourners, tells NPR:
There’s a lot of consuming and devouring and eating in Maurice’s books. And I think that when people play with kids, there’s a lot of fake ferocity and threats of, you know, devouring — because love is so enormous, the only thing you can think of doing is swallowing the person that you love entirely.
My Brother’s Book ends on a soul-stirring note, tender and poignant in its posthumous light:
And Jack slept safe
Enfolded in his brother’s arms
And Guy whispered ‘Good night
And you will dream of me.’
No female reporter before her had ever seemed quite so audacious, so willing to risk personal safety in pursuit of a story. In her first exposé for The World, Bly had gone undercover … feigning insanity so that she might report firsthand on the mistreatment of the female patients of the Blackwell’s Island Insane Asylum. … Bly trained with the boxing champion John L. Sullivan; she performed, with cheerfulness but not much success, as a chorus girl at the Academy of Music (forgetting the cue to exit, she momentarily found herself all alone onstage). She visited with a remarkable deaf, dumb, and blind nine-year-old girl in Boston by the name of Helen Keller. Once, to expose the workings of New York’s white slave trade, she even bought a baby. Her articles were by turns lighthearted and scolding and indignant, some meant to edify and some merely to entertain, but all were shot through with Bly’s unmistakable passion for a good story and her uncanny ability to capture the public’s imagination, the sheer force of her personality demanding that attention be paid to the plight of the unfortunate, and, not incidentally, to herself.
For all her extraordinary talent and work ethic, Bly’s appearance was decidedly unremarkable — a fact that shouldn’t matter, but one that would be repeatedly remarked upon by her critics and commentators, a sign of how sadly little progress we’ve made, more than a century later, in discussing women’s professional, intellectual, and creative merit. Goodman paints a portrait of Bly:
She was a young woman in a plaid coat and cap, neither tall nor short, dark nor fair, not quite pretty enough to turn a head: the sort of woman who could, if necessary, lose herself in a crowd.
Her voice rang with the lilt of the hill towns of western Pennsylvania; there was an unusual rising inflection at the ends of her sentences, the vestige of an Elizabethan dialect that had still been spoken in the hills when she was a girl. She had piercing gray eyes, though sometimes they were called green, or blue-green, or hazel. Her nose was broad at its base and delicately upturned at the end — the papers liked to refer to it as a “retroussé” nose — and it was the only feature about which she was at all self-conscious. She had brown hair that she wore in bangs across her forehead. Most of those who knew her considered her pretty, although this was a subject that in the coming months would be hotly debated in the press.
But, as if the ambitious adventure weren’t scintillating enough, the story takes an unexpected turn: That fateful November morning, as Bly was making her way to the journey’s outset at the Hoboken docks, a man named John Brisben Walker passed her on a ferry in the opposite direction, traveling from Jersey City to Lower Manhattan. He was the publisher of a high-brow magazine titled The Cosmopolitan, the same publication that decades later, under the new ownership of William Randolph Hearst, would take a dive for the commercially low-brow. On his ferry ride, Walker skimmed that morning’s edition of The World and paused over the front-page feature announcing Bly’s planned adventure around the world. A seasoned media manipulator of the public’s voracious appetite for drama, he instantly birthed an idea that would seize upon a unique publicity opportunity — The Cosmopolitan would send another circumnavigator to race against Bly. To keep things equal, it would have to be a woman. To keep them interesting, she’d travel in the opposite direction.
And so it went:
Elizabeth Bisland was twenty-eight years old, and after nearly a decade of freelance writing she had recently obtained a job as literary editor of The Cosmopolitan, for which she wrote a monthly review of recently published books entitled “In the Library.” Born into a Louisiana plantation family ruined by the Civil War and its aftermath, at the age of twenty she had moved to New Orleans and then, a few years later, to New York, where she contributed to a variety of magazines and was regularly referred to as the most beautiful woman in metropolitan journalism. Bisland was tall, with an elegant, almost imperious bearing that accentuated her height; she had large dark eyes and luminous pale skin and spoke in a low, gentle voice. She reveled in gracious hospitality and smart conversation, both of which were regularly on display in the literary salon that she hosted in the little apartment she shared with her sister on Fourth Avenue, where members of New York’s creative set, writers and painters and actors, gathered to discuss the artistic issues of the day. Bisland’s particular combination of beauty, charm, and erudition seems to have been nothing short of bewitching.
She took pride in the fact that she had arrived in New York with only fifty dollars in her pocket, and that the thousands of dollars now in her bank account had come by virtue of her own pen. Capable of working for eighteen hours at a stretch, she wrote book reviews, essays, feature articles, and poetry in the classical vein. She was a believer, more than anything else, in the joys of literature, which she had first experienced as a girl in ancient volumes of Shakespeare and Cervantes that she found in the library of her family’s plantation house. (She taught herself French while she churned butter, so that she might read Rousseau’s Confessions in the original — a book, as it turned out, that she hated.) She cared nothing for fame, and indeed found the prospect of it distasteful.
And yet, despite their competitive circumstances and seemingly divergent dispositions, something greater bound the two women together, some ineffable force of culture that quietly united them in a bold defiance of their era’s normative biases:
On the surface the two women … were about as different as could be: one woman a Northerner, the other from the South; one a scrappy, hard-driving crusader, the other priding herself on her gentility; one seeking out the most sensational of news stories, the other preferring novels and poetry and disdaining much newspaper writing as “a wild, crooked, shrieking hodge-podge,” a “caricature of life.” Elizabeth Bisland hosted tea parties; Nellie Bly was known to frequent O’Rourke’s saloon on the Bowery. But each of them was acutely conscious of the unequal position of women in America. Each had grown up without much money and had come to New York to make a place for herself in big-city journalism, achieving a hard-won success in what was still, unquestionably, a man’s world.
Originally featured in May — read the full article, including Bly’s entertaining illustrated packing list, here.
13. DON’T GO BACK TO SCHOOL
“The present education system is the trampling of the herd,” legendary architect Frank Lloyd Wright lamented in 1956. Half a century later, I started Brain Pickings in large part out of frustration and disappointment with my trampling experience of our culturally fetishized “Ivy League education.” I found myself intellectually and creatively unstimulated by the industrialized model of the large lecture hall, the PowerPoint presentations, the standardized tests assessing my rote memorization of facts rather than my ability to transmute that factual knowledge into a pattern-recognition mechanism that connects different disciplines to cultivate wisdom about how the world works and a moral lens on how it should work. So Brain Pickings became the record of my alternative learning, of that cross-disciplinary curiosity that took me from art to psychology to history to science, by way of the myriad pieces of knowledge I discovered — and connected — on my own. I didn’t live up to the entrepreneurial ideal of the college drop-out and begrudgingly graduated “with honors,” but refused to go to my own graduation and decided never to go back to school. Years later, I’ve learned more in the course of writing and researching the thousands of articles to date than in all the years of my formal education combined.
So, in 2012, when I found out that writer Kio Stark was crowdfunding a book that would serve as a manifesto for learning outside formal education, I eagerly chipped in. Now, Don’t Go Back to School: A Handbook for Learning Anything is out, and it is everything I could’ve wished for when I was in college: an essential piece of cultural literacy, at once a tantalizing and a practically grounded assurance that success doesn’t lie at the end of a single highway but is sprinkled along a thousand alternative paths. Stark describes it as “a radical project, the opposite of reform … not about fixing school [but] about transforming learning — and making traditional school one among many options rather than the only option.” Through a series of interviews with independent learners who have reached success and happiness in fields as diverse as journalism, illustration, and molecular biology, Stark — who herself dropped out of a graduate program at Yale, despite being offered a prestigious fellowship — cracks open the secret to defining your own success and finding your purpose outside the factory model of formal education. She notes the patterns that emerge:
People who forgo school build their own infrastructures. They create and borrow and reinvent the best that formal schooling has to offer, and they leave the worst behind. That buys them the freedom to learn on their own terms.
From their stories, you’ll see that when you step away from the prepackaged structure of traditional education, you’ll discover that there are many more ways to learn outside school than within.
Reflecting on her own exit from academia, Stark articulates a much more broadly applicable insight:
A gracefully executed quit is a beautiful thing, opening up more doors than it closes.
But despite discovering in dismay that “liberal arts graduate school is professional school for professors,” which she had no interest in becoming, Stark did learn something immensely valuable from her third year of independent study, during which she read about 200 books of her own choosing:
I learned how to teach myself. I had to make my own reading lists for the exams, which meant I learned how to take a subject I was interested in and make myself a map for learning it.
The interviews revealed four key common threads: learning is collaborative rather than done alone; the importance of academic credentials in many professions is declining; the most fulfilling learning tends to take place outside of school; and those happiest about learning are those who learn out of intrinsic motivation rather than in pursuit of extrinsic rewards. The first of these insights, of course, appears on the surface to contradict the very notion of “independent learning,” but Stark offers an eloquent semantic caveat:
Independent learning suggests ideas such as “self-taught,” or “autodidact.” These imply that independence means working solo. But that’s just not how it happens. People don’t learn in isolation. When I talk about independent learners, I don’t mean people learning alone. I’m talking about learning that happens independent of schools.
Anyone who really wants to learn without school has to find other people to learn with and from. That’s the open secret of learning outside of school. It’s a social act. Learning is something we do together.
Independent learners are interdependent learners.
Much of the argument for formal education rests on statistics indicating that people with college and graduate degrees earn more. But those statistics, Stark notes, suffer an important and rarely heeded bias:
The problem is that this statistic is based on long-term data, gathered from a period of moderate loan debt, easy employability, and annual increases in the value of a college degree. These conditions have been the case for college grads for decades. Given the dramatically changed circumstances grads today face, we already know that the trends for debt, employability, and the value of a degree have all degraded, and we cannot assume the trend toward greater lifetime earnings will hold true for the current generation. This is a critical omission from media coverage. The fact is we do not know. There’s absolutely no guarantee it will hold true.
Some heartening evidence suggests the blind reliance on degrees might be beginning to change. Stark cites Zappos CEO Tony Hsieh:
I haven’t looked at a résumé in years. I hire people based on their skills and whether or not they are going to fit our culture.
Another common argument for formal education extols the alleged advantages of its structure, proposing that homework assignments, reading schedules, and regular standardized testing would motivate you to learn with greater rigor. But, as Daniel Pink has written about the psychology of motivation, in school, as in work, intrinsic drives far outweigh extrinsic, carrot-and-stick paradigms of reward and punishment, rendering this argument unsound. Stark writes:
Learning outside school is necessarily driven by an internal engine. … [I]ndependent learners stick with the reading, thinking, making, and experimenting by which they learn because they do it for love, to scratch an itch, to satisfy curiosity, following the compass of passion and wonder about the world.
So how can you best fuel that internal engine of learning outside the depot of formal education? Stark offers an essential insight, which places self-discovery at the heart of acquiring external knowledge:
Learning your own way means finding the methods that work best for you and creating conditions that support sustained motivation. Perseverance, pleasure, and the ability to retain what you learn are among the wonderful byproducts of getting to learn using methods that suit you best and in contexts that keep you going. Figuring out your personal approach to each of these takes trial and error.
For independent learners, it’s essential to find the process and methods that match your instinctual tendencies as a learner. Everyone I talked to went through a period of experimenting and sorting out what works for them, and they’ve become highly aware of their own preferences. They’re clear that learning by methods that don’t suit them shuts down their drive and diminishes their enjoyment of learning. Independent learners also find that their preferred methods are different for different areas. So one of the keys to success and enjoyment as an independent learner is to discover how you learn.
School isn’t very good at dealing with the multiplicity of individual learning preferences, and it’s not very good at helping you figure out what works for you.
Any young child you observe displays these traits. But passion and curiosity can be easily lost. School itself can be a primary cause; arbitrary motivators such as grades leave little room for variation in students’ abilities and interests, and fail to reward curiosity itself. There are also significant social factors working against children’s natural curiosity and capacity for learning, such as family support or the lack of it, or a degree of poverty that puts families in survival mode with little room to nurture curiosity.
Stark returns to the question of motivators that do work, once again calling to mind Pink’s advocacy of autonomy, mastery, and purpose as the trifecta of success. She writes:
[T]hree broadly defined elements of the learning experience support internal motivation and the persistence it enables. Internal motivation relies on learners having autonomy in their learning, a progressing sense of competence in their skills and knowledge, and the ability to learn in a concrete or “real world” context rather than in the abstract. These are mostly absent from classroom learning. Autonomy is rare, useful context is absent, and school’s means for affirming competence often feel so arbitrary as to be almost without use — and are sometimes actively demotivating. . . . [A]utonomy means that you follow your own path. You learn what you want to learn, when and how you want to learn it, for your own reasons. Your impetus to learn comes from within because you control the conditions of your learning rather than working within a structure that’s pre-made and inflexible.
The second thing you need to stick with learning independently is to set your own goals toward an increasing sense of competence. You need to create a feedback loop that confirms your work is worth it and keeps you moving forward. In school this is provided by advancing through the steps of the linear path within an individual class or a set curriculum, as well as from feedback from grades and praise.
But Stark found that outside of school, those most successful at learning sought their sense of competence through alternative sources. Many, as James Mangan advised in his 1936 blueprint for acquiring knowledge, solidified their learning by teaching it to other people, increasing their own sense of mastery and deepening their understanding. Others centered their learning around specific projects, which made their progress more modular and thus more attainable. Another cohort cited failure as an essential part of the road to mastery. Stark continues:
The third thing [that] can make or break your ability to sustain internal motivation … is to situate what you’re learning in a context that matters to you. In some cases, the context is a specific project you want to accomplish, which … also functions to support your sense of progress.
She sums up the failings of the establishment:
School is not designed to offer these three conditions; autonomy and context are sorely lacking in classrooms. School can provide a sense of increasing mastery, via grades and moving from introductory classes to harder ones. But a sense of true competence is harder to come by in a school environment. Fortunately, there are professors in higher education who are working to change the motivational structures that underlie their curricula.
The interviews, to be sure, offer a remarkably diverse array of callings, underpinned by a number of shared values and common characteristics. Computational biologist Florian Wagner, for instance, echoes Steve Jobs’s famous words on the secret of life in articulating a sentiment shared by many of the other interviewees:
There is something really special about when you first realize you can figure out really cool things completely on your own. That alone is a valuable lesson in life.
Investigative journalist Quinn Norton subscribes to Mangan’s prescription for learning by teaching:
I ended up teaching [my] knowledge to others at the school. That’s one of my most effective ways to learn, by teaching; you just have to stay a week ahead of your students. … Everything I learned, I immediately turned around and taught to others.
When I wanted to learn something new as a professional writer, I’d pitch a story on it. I was interested in neurology, and I figured, why don’t I start interviewing neurologists? The great thing about being a journalist is that you can pick up the phone and talk to anybody. It was just like what I found out about learning from experts on mailing lists. People like to talk about what they know.
I’m stuffed with trivial, useless knowledge, on a panoply of bizarre topics, so I can find something that they’re interested in that I know something about. Being able to do that is tremendously socially valuable. The exchange of knowledge is a very human way to learn. I try never to walk into a room where I want to get information without knowing what I’m bringing to the other person.
I think part of the problem with the usual mindset of the student is that it’s like being a sponge. It’s passive. It’s not about having something to bring to the interaction. People who are experts in things are experts because they like learning.
Software engineer, artist, and University of Texas molecular biologist Zack Booth Simpson speaks to the value of cultivating what William Gibson has called “a personal micro-culture” and learning from the people with whom you surround yourself:
In a way, the best education you can get is just talking with people who are really smart and interested in things, and you can get that for the cost of lunch.
I was … a constant reader. At home, I lived next to this thrift store that sold paperbacks for 10¢ apiece so I would go and buy massive stacks of paperback books on everything. Everything from trashy 1970s romance novels to Plato. When I went to Europe, I brought with me every single book that I didn’t think I would read voluntarily, because I figured if I was on a bus ride, I would read them. So I read Plato and Dante’s Inferno, and all types of literature. I got my education on the bus.
I need to tell a story. It’s an obsession. Each story is a seed inside of me that starts to grow and grow, like a tumor, and I have to deal with it sooner or later. Why a particular story? I don’t know when I begin. That I learn much later. Over the years I’ve discovered that all the stories I’ve told, all the stories I will ever tell, are connected to me in some way. If I’m talking about a woman in Victorian times who leaves the safety of her home and comes to the Gold Rush in California, I’m really talking about feminism, about liberation, about the process I’ve gone through in my own life, escaping from a Chilean, Catholic, patriarchal, conservative, Victorian family and going out into the world.
I start all my books on January eighth. Can you imagine January seventh? It’s hell. Every year on January seventh, I prepare my physical space. I clean up everything from my other books. I just leave my dictionaries, and my first editions, and the research materials for the new one. And then on January eighth I walk seventeen steps from the kitchen to the little pool house that is my office. It’s like a journey to another world. It’s winter, it’s raining usually. I go with my umbrella and the dog following me. From those seventeen steps on, I am in another world and I am another person. I go there scared. And excited. And disappointed — because I have a sort of idea that isn’t really an idea. The first two, three, four weeks are wasted. I just show up in front of the computer. Show up, show up, show up, and after a while the muse shows up, too. If she doesn’t show up invited, eventually she just shows up.
She offers three pieces of advice for aspiring writers:
It’s worth the work to find the precise word that will create a feeling or describe a situation. Use a thesaurus, use your imagination, scratch your head until it comes to you, but find the right word.
When you feel the story is beginning to pick up rhythm—the characters are shaping up, you can see them, you can hear their voices, and they do things that you haven’t planned, things you couldn’t have imagined—then you know the book is somewhere, and you just have to find it, and bring it, word by word, into this world.
When you tell a story in the kitchen to a friend, it’s full of mistakes and repetitions. It’s good to avoid that in literature, but still, a story should feel like a conversation. It’s not a lecture.
Celebrated journalist and New Yorker staff writer Susan Orlean considers the critical difference between fiction and nonfiction, exploring the osmotic balance of escapism and inner stillness:
When it comes to nonfiction, it’s important to note the very significant difference between the two stages of the work. Stage one is reporting. Stage two is writing.
Reporting is like being the new kid in school. You’re scrambling to learn something very quickly, being a detective, figuring out who the people are, dissecting the social structure of the community you’re writing about. Emotionally, it puts you in the place that everybody dreads. You’re the outsider. You can’t give in to your natural impulse to run away from situations and people you don’t know. You can’t retreat to the familiar.
Writing is exactly the opposite. It’s private. The energy of it is so intense and internal, it sometimes makes you feel like you’re going to crumple. A lot of it happens invisibly. When you’re sitting at your desk, it looks like you’re just sitting there, doing nothing.
A necessary antidote to the tortured-genius cultural mythology of the writer, Orlean, like Ray Bradbury, conceives of writing as a source of joy, even when challenging:
Writing gives me great feelings of pleasure. There’s a marvelous sense of mastery that comes with writing a sentence that sounds exactly as you want it to. It’s like trying to write a song, making tiny tweaks, reading it out loud, shifting things to make it sound a certain way. It’s very physical. I get antsy. I jiggle my feet a lot, get up a lot, tap my fingers on the keyboard, check my e-mail. Sometimes it feels like digging out of a hole, but sometimes it feels like flying. When it’s working and the rhythm’s there, it does feel like magic to me.
She ends with four pieces of wisdom for writers:
You have to simply love writing, and you have to remind yourself often that you love it.
You should read as much as possible. That’s the best way to learn how to write.
You have to appreciate the spiritual component of having an opportunity to do something as wondrous as writing. You should be practical and smart and you should have a good agent and you should work really, really hard. But you should also be filled with awe and gratitude about this amazing way to be in the world.
Don’t be ashamed to use the thesaurus. I could spend all day reading Roget’s! There’s nothing better when you’re in a hurry and you need the right word right now.
Before I wrote my first book in 1989, the sum total of my earnings as a writer, over four years of freelancing, was about three thousand bucks. So it did appear to be financial suicide when I quit my job at Salomon Brothers — where I’d been working for a couple of years, and where I’d just gotten a bonus of $225,000, which they promised they’d double the following year—to take a $40,000 book advance for a book that took a year and a half to write.
My father thought I was crazy. I was twenty-seven years old, and they were throwing all this money at me, and it was going to be an easy career. He said, “Do it another ten years, then you can be a writer.” But I looked around at the people on Wall Street who were ten years older than me, and I didn’t see anyone who could have left. You get trapped by the money. Something dies inside. It’s very hard to preserve the quality in a kid that makes him jump out of a high-paying job to go write a book.
More than a living, Lewis found in writing a true calling — the kind of deep flow that fully absorbs the mind and soul:
There’s no simple explanation for why I write. It changes over time. There’s no hole inside me to fill or anything like that, but once I started doing it, I couldn’t imagine wanting to do anything else for a living. I noticed very quickly that writing was the only way for me to lose track of the time.
I used to get the total immersion feeling by writing at midnight. The day is not structured to write, and so I unplug the phones. I pull down the blinds. I put my headset on and play the same soundtrack of twenty songs over and over and I don’t hear them. It shuts everything else out. So I don’t hear myself as I’m writing and laughing and talking to myself. I’m not even aware I’m making noise. I’m having a physical reaction to a very engaging experience. It is not a detached process.
“Art suffers the moment other people start paying for it,” Hugh MacLeod famously wrote. It might be an overly cynical notion, one that perpetuates the unjustified yet deep-seated cultural guilt over simultaneously doing good and doing well, but Lewis echoes the sentiment:
Once you have a career, and once you have an audience, once you have paying customers, the motives for doing it just change.
And yet Lewis approaches the friction between intrinsic and extrinsic motivation — one experienced by anyone who loves what they do and takes pride in clarity of editorial vision, but has an audience whose approval or disapproval becomes increasingly challenging to tune out — with extraordinary candor and insight:
Commercial success makes writing books a lot easier to do, and it also creates pressure to be more of a commercial success. If you sold a million books once, your publisher really, really thinks you might sell a million books again. And they really want you to do it.
That dynamic has the possibility of constraining the imagination. There are invisible pressures. There’s a huge incentive to write about things that you know will sell. But I don’t find myself thinking, “I can’t write about that because it won’t sell.” It’s such a pain in the ass to write a book, I can’t imagine writing one if I’m not interested in the subject.
And yet his clarity of vision is still what guides the best of his work:
Those are the best moments, when I’ve got the whale on the line, when I see exactly what it is I’ve got to do.
After that moment there’s always misery. It never goes quite like you think, but that moment is a touchstone, a place to come back to. It gives you a kind of compass to guide you through the story.
That feeling has never done me wrong. Sometimes you don’t understand the misery it will lead to, but it’s always been right to feel it. And it’s a great feeling.
It’s always good to have a motive to get you in the chair. If your motive is money, find another one.
I took my biggest risk when I walked away from a lucrative job at age twenty-seven to be a writer. I’m glad I was too young to realize what a dumb decision it seemed to be, because it was the right decision for me.
A lot of my best decisions were made in a state of self-delusion. When you’re trying to create a career as a writer, a little delusional thinking goes a long way.
We seem to have a strange but all too human cultural fixation on the daily routines and daily rituals of famous creators, from Vonnegut to Burroughs to Darwin — as if a glimpse of their day-to-day would somehow magically infuse ours with equal potency, or replicating it would allow us to replicate their genius in turn. And though much of this is mere cultural voyeurism, there is something to be said for the value of a well-engineered daily routine to anchor the creative process. Manage Your Day-to-Day: Build Your Routine, Find Your Focus, and Sharpen Your Creative Mind (public library), edited by Behance’s 99U editor-in-chief Jocelyn Glei and featuring contributions from twenty of today’s most celebrated thinkers and doers, delves into the secrets of this holy grail of creativity.
Reflecting Thomas Edison’s oft-cited proclamation that “genius is one percent inspiration, ninety-nine percent perspiration,” after which 99U is named, the crucial importance of consistent application is a running theme. (Though I prefer to paraphrase Edison to “Genius is one percent inspiration, ninety-nine percent aspiration” — since true aspiration produces effort that feels gratifying rather than merely grueling, enhancing the grit of perspiration with the gift of gratification.)
We tend to overestimate what we can do in a short period, and underestimate what we can do over a long period, provided we work slowly and consistently. Anthony Trollope, the nineteenth-century writer who managed to be a prolific novelist while also revolutionizing the British postal system, observed, “A small daily task, if it be really daily, will beat the labours of a spasmodic Hercules.” Over the long run, the unglamorous habit of frequency fosters both productivity and creativity.
You’re much more likely to spot surprising relationships and to see fresh connections among ideas, if your mind is constantly humming with issues related to your work. When I’m deep in a project, everything I experience seems to relate to it in a way that’s absolutely exhilarating. The entire world becomes more interesting. That’s critical, because I have a voracious need for material, and as I become hyperaware of potential fodder, ideas pour in. By contrast, working sporadically makes it hard to keep your focus. It’s easy to become blocked, confused, or distracted, or to forget what you were aiming to accomplish.
Creativity arises from a constant churn of ideas, and one of the easiest ways to encourage that fertile froth is to keep your mind engaged with your project. When you work regularly, inspiration strikes regularly.
Step by step, you make your way forward. That’s why practices such as daily writing exercises or keeping a daily blog can be so helpful. You see yourself do the work, which shows you that you can do the work. Progress is reassuring and inspiring; panic and then despair set in when you find yourself getting nothing done day after day. One of the painful ironies of work life is that the anxiety of procrastination often makes people even less likely to buckle down in the future.
I have a long list of “Secrets of Adulthood,” the lessons I’ve learned as I’ve grown up, such as: “It’s the task that’s never started that’s more tiresome,” “The days are long, but the years are short,” and “Always leave plenty of room in the suitcase.” One of my most helpful Secrets is, “What I do every day matters more than what I do once in a while.”
Day by day, we build our lives, and day by day, we can take steps toward making real the magnificent creations of our imaginations.
Entrepreneurship guru and culture-sage Seth Godin seconds Rubin and admonishes against confusing vacant ritualization with creative rituals that actually spur productivity:
Everybody who does creative work has figured out how to deal with their own demons to get their work done. There is no evidence that setting up your easel like Van Gogh makes you paint better. Tactics are idiosyncratic. But strategies are universal, and there are a lot of talented folks who are not succeeding the way they want to because their strategies are broken.
The strategy is simple, I think. The strategy is to have a practice, and what it means to have a practice is to regularly and reliably do the work in a habitual way.
There are many ways you can signify to yourself that you are doing your practice. For example, some people wear a white lab coat or a particular pair of glasses, or always work in a specific place — in doing these things, they are professionalizing their art.
The notion that I do my work here, now, like this, even when I do not feel like it, and especially when I do not feel like it, is very important. Because lots and lots of people are creative when they feel like it, but you are only going to become a professional if you do it when you don’t feel like it. And that emotional waiver is why this is your work and not your hobby.
“At its best, the sensation of writing is that of any unmerited grace,” Annie Dillard famously observed, adding the quintessential caveat, “It is handed to you, but only if you look for it. You search, you break your heart, your back, your brain, and then — and only then — it is handed to you.” And yet, Zadie Smith admonished in her 10 rules of writing, it’s perilous to romanticize the “vocation of writing”: “There is no ‘writer’s lifestyle.’ All that matters is what you leave on the page.”
Still, surely there must be more to it than that — whole worlds rise and fall, entire universes blossom and die daily in that enchanted space between the writer’s sensation of writing and the word’s destiny of being written on a page. For all that’s been mulled about the writing life and its perpetual osmosis of everyday triumphs and tragedies, its existential feats and failures, at its heart remains an immutable mystery — how can a calling be at once so transcendent and so soul-crushing, and what is it that enthralls so many souls into its paradoxical grip, into feeling compelled to write “not because they can but because they have to”? That, and oh so much more, is what Dani Shapiro explores in Still Writing: The Pleasures and Perils of a Creative Life (public library) — her magnificent memoir of the writing life, at once disarmingly personal and brimming with widely resonant wisdom on the most universal challenges and joys of writing.
Shapiro opens with the kind of crisp conviction that underpins the entire book:
Everything you need to know about life can be learned from a genuine and ongoing attempt to write.
Far from a lazy aphorism, however, this proclamation comes from her own hard-earned experience — fragments of which resonate deeply with most of us, on one level or another — that Shapiro synthesizes beautifully:
When I wasn’t writing, I was reading. And when I wasn’t writing or reading, I was staring out the window, lost in thought. Life was elsewhere — I was sure of it—and writing was what took me there. In my notebooks, I escaped an unhappy and lonely childhood. I tried to make sense of myself. I had no intention of becoming a writer. I didn’t know that becoming a writer was possible. Still, writing was what saved me. It presented me with a window into the infinite. It allowed me to create order out of chaos.
The writing life requires courage, patience, persistence, empathy, openness, and the ability to deal with rejection. It requires the willingness to be alone with oneself. To be gentle with oneself. To look at the world without blinders on. To observe and withstand what one sees. To be disciplined, and at the same time, take risks. To be willing to fail — not just once, but again and again, over the course of a lifetime. “Ever tried, ever failed,” Samuel Beckett once wrote. “No matter. Try again. Fail again. Fail better.” It requires what the great editor Ted Solotaroff once called endurability.
We are all unsure of ourselves. Every one of us walking the planet wonders, secretly, if we are getting it wrong. We stumble along. We love and we lose. At times, we find unexpected strength, and at other times, we succumb to our fears. We are impatient. We want to know what’s around the corner, and the writing life won’t offer us this. It forces us into the here and now. There is only this moment, when we put pen to page.
The page is your mirror. What happens inside you is reflected back. You come face-to-face with your own resistance, lack of balance, self-loathing, and insatiable ego—and also with your singular vision, guts, and fortitude. No matter what you’ve achieved the day before, you begin each day at the bottom of the mountain. … Life is usually right there, though, ready to knock us over when we get too sure of ourselves. Fortunately, if we have learned the lessons that years of practice have taught us, when this happens, we endure. We fail better. We sit up, dust ourselves off, and begin again.
What is it about writing that makes it—for some of us — as necessary as breathing? It is in the thousands of days of trying, failing, sitting, thinking, resisting, dreaming, raveling, unraveling that we are at our most engaged, alert, and alive. Time slips away. The body becomes irrelevant. We are as close to consciousness itself as we will ever be. This begins in the darkness. Beneath the frozen ground, buried deep below anything we can see, something may be taking root. Stay there, if you can. Don’t resist. Don’t force it, but don’t run away. Endure. Be patient. The rewards cannot be measured. Not now. But whatever happens, any writer will tell you: This is the best part.
If I dismiss the ordinary — waiting for the special, the extreme, the extraordinary to happen — I may just miss my life.
To allow ourselves to spend afternoons watching dancers rehearse, or sit on a stone wall and watch the sunset, or spend the whole weekend rereading Chekhov stories—to know that we are doing what we’re supposed to be doing — is the deepest form of permission in our creative lives. The British author and psychologist Adam Phillips has noted, “When we are inspired, rather like when we are in love, we can feel both unintelligible to ourselves and most truly ourselves.” This is the feeling I think we all yearn for, a kind of hyperreal dream state. We read Emily Dickinson. We watch the dancers. We research a little known piece of history obsessively. We fall in love. We don’t know why, and yet these moments form the source from which all our words will spring.
As curious as these habits are, however, Johnson reminds us that public intellectuals often engineer their own myths, which means the quirky behaviors recorded in history’s annals should be taken with a grain of Salinger salt. She offers a necessary disclaimer, enveloped in a thoughtful meta-disclaimer:
One must always keep in mind that these writers and the people around them may have, at some point, embellished the facts. Quirks are great fodder for gossip and can morph into gross exaggeration when passed from one person to the next. There’s also no way to escape the self-mythologizing, particularly when dealing with some of the greatest storytellers that ever lived. Yet even when authors stretch the truth, they reveal something about themselves, whether it is the desire to project a certain image or the need to shy away from one.
Mode and medium of writing seem to be a recurring theme of personal idiosyncrasy. Wallace Stevens composed his poetry on slips of paper while walking — an activity he, like Maira Kalman, saw as a creative stimulant — then handed them to his secretary to type up. Edgar Allan Poe, champion of marginalia, wrote his final drafts on separate pieces of paper attached into a running scroll with sealing wax. Jack Kerouac was especially partial to scrolling: in 1951, having planned the book for years and amassed ample notes in his journals, he wrote On The Road in one feverish burst, letting it pour onto pages taped together into one enormously long strip of paper — a format he thought lent itself particularly well to his project, since it allowed him to maintain his rapid pace without pausing to reload the typewriter at the end of each page. When he was done, he marched into his editor Robert Giroux’s office and proudly spun out the scroll across the floor. The result, however, was equal parts comical and tragic:
To [Kerouac’s] dismay, Giroux focused on the unusual packaging. He asked, “But Jack, how can you make corrections on a manuscript like that?” Giroux recalled saying, “Jack, you know you have to cut this up. It has to be edited.” Kerouac left the office in a rage. It took several years for Kerouac’s agent, Sterling Lord, to finally find a home for the book, at the Viking Press.
James Joyce wrote lying on his stomach in bed, with a large blue pencil, clad in a white coat, and composed most of Finnegans Wake with crayon pieces on cardboard. But this was a matter more of pragmatism than of superstition or vain idiosyncrasy. Of the many outrageously misguided myths surrounding the celebrated author of Ulysses and wordsmith of little-known children’s books, one was actually true: he was nearly blind. His childhood myopia developed into severe eye problems by his twenties. To make matters worse, he developed rheumatic fever when he was twenty-five, which resulted in a painful eye condition called iritis. By 1930, he had undergone twenty-five eye surgeries, none of which improved his sight. The large crayons thus helped him see what he was writing, and the white coat helped reflect more light onto the page at night. (As someone partial to black bedding, not for aesthetic reasons but because I believe it provides a deeper dark at night, I can certainly relate to Joyce’s seemingly arbitrary but actually physics-driven attire choice.)
Virginia Woolf was as opinionated about the right way to write as she was about the right way to read. In her twenties, she spent two and a half hours every morning writing at a three-and-a-half-foot-tall desk with an angled top that allowed her to look at her work both up close and from afar. But according to her nephew and irreverent collaborator, Quentin Bell, Woolf’s prescient version of today’s trendy standing desk was less a practical matter than a symptom of her sibling rivalry with her sister, the Bloomsbury artist Vanessa Bell — the same sibling rivalry that would later inspire a charming picture-book: Vanessa painted standing, and Virginia didn’t want to be outdone by her sister. Johnson cites Quentin, who was known for his wry family humor:
This led Virginia to feel that her own pursuit might appear less arduous than that of her sister unless she set matters on a footing of equality.
Many authors held their output to uncompromisingly quantitative metrics like daily word quotas. Jack London wrote 1,000 words a day every single day of his career, and William Golding once declared at a party that he wrote 3,000 words daily, a number Norman Mailer and Arthur Conan Doyle shared. Raymond Chandler, a man of strong opinions on the craft of writing, didn’t subscribe to a specific daily quota, but was known to write up to 5,000 words a day at his most productive. Anthony Trollope, who began his day promptly at 5:30 A.M., disciplined himself to write 250 words every 15 minutes, pacing himself with a watch. Stephen King does whatever it takes to reach his daily quota of 2,000 adverbless words, and Thomas Wolfe keeps his at 1,800, not letting himself stop until he has reached it.
We already know how much famous authors loved their pets, but for many their non-human companions were essential to the creative process. Edgar Allan Poe considered his darling tabby Catterina his literary guardian, who “purred as if in complacent approval of the work proceeding under [her] supervision.” Flannery O’Connor developed an early affection for domestic poultry, from her childhood chicken (which, curiously enough, could walk backwards and once ended up in a newsreel clip) to her growing collection of pheasants, ducks, turkeys, and quail. Most famously, however, twenty-something O’Connor mail-ordered six peacocks, a peahen, and four peachicks, which later populated her fiction. But by far the most bizarre pet-related habit comes from Colette, who enlisted her dog in a questionable procrastination mechanism:
Colette would study the fur of her French bulldog, Souci, with a discerning eye. Then she’d pluck a flea from Souci’s back and would continue the hunt until she was ready to write.
But arguably the strangest habit of all comes from Friedrich Schiller, relayed by his friend Goethe:
[Goethe] had dropped by Schiller’s home and, after finding that his friend was out, decided to wait for him to return. Rather than wasting a few spare moments, the productive poet sat down at Schiller’s desk to jot down a few notes. Then a peculiar stench prompted Goethe to pause. Somehow, an oppressive odor had infiltrated the room.
Goethe followed the odor to its origin, which was actually right by where he sat. It was emanating from a drawer in Schiller’s desk. Goethe leaned down, opened the drawer, and found a pile of rotten apples. The smell was so overpowering that he became light-headed. He walked to the window and breathed in a few good doses of fresh air. Goethe was naturally curious about the trove of trash, though Schiller’s wife, Charlotte, could only offer the strange truth: Schiller had deliberately let the apples spoil. The aroma, somehow, inspired him, and according to his spouse, he “could not live or work without it.”
Then there was the color-coding of the muses. In addition to his surprising gastronome streak, Alexandre Dumas was also an aesthete: For decades, he penned all of his fiction on a particular shade of blue paper, his poetry on yellow, and his articles on pink; on one occasion, while traveling in Europe, he ran out of his precious blue paper and was forced to write on a cream-colored pad, which he was convinced made his fiction suffer. Charles Dickens was partial to blue ink, but not for superstitious reasons — because it dried faster than other colors, it allowed him to pen his fiction and letters without the drudgery of blotting. Virginia Woolf used different-colored inks in her pens — greens, blues, and purples. Purple was her favorite, reserved for letters (including her love letters to Vita Sackville-West), diary entries, and manuscript drafts. Lewis Carroll also preferred purple ink (and shared with Woolf a penchant for standing desks), but for much more pragmatic reasons: During his years teaching mathematics at Oxford, teachers were expected to use purple ink to correct students’ work — a habit that carried over to Carroll’s fiction.
But lest we hastily surmise that writing in a white coat would make us a Joyce or drowning pages in purple ink a Woolf, Johnson prefaces her exploration with another important, beautifully phrased disclaimer:
That power to mesmerize has an intangible, almost magical quality, one I wouldn’t dare to try to meddle with by attempting to define it. It was never my goal as I wrote this book to discover what made literary geniuses tick. The nuances of any mind are impossible to pinpoint.
You could adopt one of these practices or, more ambitiously, combine several of them, and chances are you still wouldn’t invoke genius. These tales don’t hold a secret formula for writing a great novel. Rather, the authors in the book prove that the path to great literature is paved with one’s own eccentricities rather than someone else’s.
Originally featured in September — for more quirky habits, read the original article here.
If the twentieth-century career was a ladder that we climbed from one predictable rung to the next, the twenty-first-century career is more like a broad rock face that we are all free-climbing. There’s no defined route, and we must use our own ingenuity, training, and strength to rise to the top. We must make our own luck.
Lucky people take advantage of chance occurrences that come their way. Instead of going through life on cruise control, they pay attention to what’s happening around them and, therefore, are able to extract greater value from each situation… Lucky people are also open to novel opportunities and willing to try things outside of their usual experiences. They’re more inclined to pick up a book on an unfamiliar subject, to travel to less familiar destinations, and to interact with people who are different than themselves.
If you think hard about it, you’ll notice just how many “automatic” decisions you make each day. But these habits aren’t always as trivial as what you eat for breakfast. Your health, your productivity, and the growth of your career are all shaped by the things you do each day — most by habit, not by choice.
Even the choices you do make consciously are heavily influenced by automatic patterns. Researchers have found that our conscious mind is better understood as an explainer of our actions, not the cause of them. Instead of triggering the action itself, our consciousness tries to explain why we took the action after the fact, with varying degrees of success. This means that even the choices we do appear to make intentionally are at least somewhat influenced by unconscious patterns.
Given this, what you do every day is best seen as an iceberg, with a small fraction of conscious decision sitting atop a much larger foundation of habits and behaviors.
We can’t, however, simply will ourselves into better habits. Since willpower is a limited resource, whenever we’ve overexerted our self-discipline in one domain, a concept known as “ego depletion” kicks in and renders us mindless automata in another. Instead, Young suggests, the key to changing a habit is to invest heavily in the early stages of habit-formation so that the behavior becomes automated and we later default into it rather than exhausting our willpower wrestling with it. Young also cautions that it’s a self-defeating strategy to try changing several habits at once. Rather, he advises, spend one month on each habit alone before moving on to the next — a method reminiscent of the cognitive strategy of “chunking” that allows our brains to commit more new information to memory.
The chapter, penned by Steven Kramer and Teresa Amabile of the Harvard Business School, co-authors of The Progress Principle, along with 13-year IDEO veteran Ela Ben-Ur, frames the primary benefit of a diary as a purely pragmatic record of your workday productivity and progress — though most dedicated diarists would counter that the core benefits are spiritual and psychoemotional. Even so, it offers some valuable insight into the psychology of how journaling elevates our experience of everyday life:
This is one of the most important reasons to keep a diary: it can make you more aware of your own progress, thus becoming a wellspring of joy in your workday.
Citing their research into the journals of more than two hundred creative professionals, the authors point to a pattern that reveals the single most important motivator: palpable progress on meaningful work:
On the days when these professionals saw themselves moving forward on something they cared about — even if the progress was a seemingly incremental “small win” — they were more likely to be happy and deeply engaged in their work. And, being happier and more deeply engaged, they were more likely to come up with new ideas and solve problems creatively.
Even more importantly, however, they argue that a diary offers an invaluable feedback loop:
Although the act of reflecting and writing, in itself, can be beneficial, you’ll multiply the power of your diary if you review it regularly — if you listen to what your life has been telling you. Periodically, maybe once a month, set aside time to get comfortable and read back through your entries. And, on New Year’s Day, make an annual ritual of reading through the previous year.
This, they suggest, can yield profound insights into the inner workings of your own mind — especially if you look for specific clues and patterns, trying to identify the richest sources of meaning in your work and the types of projects that truly make your heart sing. Once you understand what motivates you most powerfully, you’ll be able to prioritize this type of work going forward. Just as important, however, is cultivating a gratitude practice and acknowledging your own accomplishments in the diary:
This is your life; savor it. Hold on to the threads across days that, when woven together, reveal the rich tapestry of what you are achieving and who you are becoming. The best part is that, seeing the story line appearing, you can actively create what it — and you — will become.
Every creative endeavor, from writing a book to designing a brand to launching a company, follows what’s known as an Uncertainty Curve. The beginning of a project is defined by maximum freedom, very little constraint, and high levels of uncertainty. Everything is possible; options, paths, ideas, variations, and directions are all on the table. At the same time, nobody knows exactly what the final output or outcome will be. And, at times, even whether it will be. Which is exactly the way it should be.
Those who are doggedly attached to the idea they began with may well execute on that idea. And do it well and fast. But along the way, they often miss so many unanticipated possibilities, options, alternatives, and paths that would’ve taken them away from that linear focus on executing on the vision, and sent them back into a place of creative dissonance and uncertainty, but also very likely yielded something orders of magnitude better.
All creators need to be able to live in the shade of the big questions long enough for truly revolutionary ideas and insights to emerge. They need to stay and act in that place relentlessly through the first, most obvious wave of ideas.
Fields argues that if we move along the Uncertainty Curve either too fast or too slowly, we risk robbing the project of its creative potential and ending up in mediocrity. Instead, becoming mindful of the psychology of that process allows us to pace ourselves better and master that vital osmosis between freedom and constraint. He sums up both the promise and the peril of this delicate dance beautifully:
Nothing truly innovative, nothing that has advanced art, business, design, or humanity, was ever created in the face of genuine certainty or perfect information. Because the only way to be certain before you begin is if the thing you seek to do has already been done.
The story begins with a skit titled “Business, Business,” which Henson performed on The Ed Sullivan Show in 1968. It tells the story of two conflicting sets of creatures — the slot-machine-eyed, cash-register-voiced corporate heads who talk in business-ese, and the naïve, light-bulb-headed softies who talk of love, joy, and beauty:
“Business, Business” implies that business and idealism are diametrically opposed. The idealist is attacked not just by the establishment, but also from within, where greed starts to change one’s motives.
For the most part, money is the enemy of art. … Put simply, great art wants quality, whereas good business wants profit. Quality requires many man-hours to produce, which any accountant will tell you cuts significantly into your profit. Great artists fight for such expenditures, whereas successful businessmen fight against them.
And yet, like most dogmatic dichotomies — take, for instance, science and spirituality — this, too, is invariably reductionistic. Henson’s life and legacy, Stevens argues, are proof that art and business can be — and inherently are — complementary rather than contradictory. Produced only six months after the Summer of Love, “Business, Business” straddled a profound cultural shift as a new generation of “light-bulb idealists” — baby boomers, flower children, and hippies who lived in youth collectives, listened to rock, and championed free love — rejected the material ideals of their parents and embraced the philosophy of Alan Watts. And yet Henson himself was an odd hybrid of these two worlds. When he made “Business, Business,” he was thirty-one, which placed him squarely between the boomers and their parents, and he lived comfortably in New York City with his wife, having made hundreds of television commercials for everything from lunch meats to computers. In his heart, however, Henson, a self-described Mississippi Tom Sawyer who often went barefoot, was an artist — and he was ready to defend this conviction with the choices he made.
Henson was already a capitalist when he made “Business, Business.” And we could even conclude that the skit describes his own conversion from idealism to capitalism. In 1968, he had an agent who got him TV appearances on Ed Sullivan and freelance commercial gigs hawking products as unhippielike as IBM computers and Getty oil.
Yet Jim Henson’s business wasn’t oil — it was art. While today, most artists are too timid to admit it, Henson freely referred to himself as an “artist,” and his agent went even further, calling him “artsy-craftsy.” Henson may have worked in show business, but he’d also traveled in Europe as a young man, sketching pictures of its architecture. He owned a business, but his business rested on the ideas the idealists were shouting—brotherhood, joy, and love. He wore a beard. Biographers would say it was to cover acne scars, but in the context of the late sixties, it aligns Henson with a category of people that is unmistakable. Though a capitalist, he was also a staunch artist.
“It seems to be difficult for any one to take in the idea that two truths cannot conflict,” pioneering astronomer Maria Mitchell wrote in her diaries. And yet what Henson’s case tells us, Stevens suggests in returning to “Business, Business,” is that the very notion of “selling out” is one big non-truth that pits two parallel possibilities against each other:
If art and money are at odds, which side was Jim Henson really on? If you watch the skit, the clue is in the characters’ voices. Of the Slinky-necked business-heads and idealist-heads, Henson was really both and neither, because in “Business, Business,” he parodies both. Locked in conflict, they sound like blowhards and twerps, respectively, but they were both facets of his life. As an employer to two other men, Henson was the boss man — the suit, cash register, and slot machine — who wrote the checks. But he also got together with his friends to sing, laugh, and play with puppets in the kind of collectivism that hippies celebrated.
Today — especially with Generation X and Millennials — serious artists often refuse contact with business. Large numbers of liberal arts graduates bristle when presented with the corporate world, rejecting its values to protect their ideals. Devoted artists move home to a parent’s basement to complete their masterpieces, while the more pragmatic artists live in cloistered “Neverland” artist collectives, grant-funded arts colonies, and university faculty lounges.
What is a human being? Complex to the point of absurdity, a whole person is both greedy and generous. It is foolish to think we can’t be both artists and entrepreneurs, especially when Henson was so wildly successful in both categories.
Since he was in college, Jim Henson was a natural capitalist. He owned a printmaking business and made commercials for lunchmeats. In the 1970s, he became a merchandizing millionaire and made Hollywood movies. By 1987, he had shows on all three major networks plus HBO and PBS. … Of course, Henson was not just another Trump. Believe the beard.
When Henson joined on to the experimental PBS show Sesame Street in 1968, he was underpaid for his services creating Big Bird and Oscar. Yet he spent his free nights in his basement, shooting stop-motion films that taught kids to count. If you watch these counting films, the spirit of Henson’s gift shines through. I think any struggling artist today could count Henson among their ilk. He had all the makings of a tragic starving artist. The only difference between him and us is that he made peace with money. He found a way to make art and money dance.
The key, of course, is to master this dance with equal parts determination and grace. Riffing off Lewis Hyde’s famous meditation on gift economies in The Gift, where he argues that the artist must first cultivate a protected gift-sphere for making pure art and then make contact with the market, Stevens offers a blueprint:
The dance involves art and money, but not at the same time. In the first stage, it is paramount that the artist “reserves a protected gift-sphere in which the art is created.” He keeps money out of it. But in the next two phases, they can dance. The way I see it, Hyde’s dance steps go a little something like this:
Make art make money.
Make money make art.
It is the last step that turns this dance into a waltz — something cyclical so that the money is not the real end. Truly, for Jim Henson, money was a fuel that fed art.
To write well about the elegant world you have to know it and experience it to the depths of your being just as Proust, Radiguet and Fitzgerald did: what matters is not whether you love it or hate it, but only to be quite clear about your position regarding it.
In another, he considers the secret of living well:
The inferno of the living is not something that will be; if there is one, it is what is already here, the inferno where we live every day, that we form by being together. There are two ways to escape suffering it. The first is easy for many: accept the inferno and become such a part of it that you can no longer see it. The second is risky and demands constant vigilance and apprehension: seek and learn to recognize who and what, in the midst of inferno, are not inferno, then make them endure, give them space.
In a lengthy letter to literary critic Mario Motta dated January 16, 1950, Calvino addresses the alleged death of the novel, a death knell still nervously resounding today:
There have been so many debates on the novel in the last thirty years, both by those who claimed it was dead and by those who wanted it to be alive in a certain way, that if one conducts the debate without serious preliminary work to establish the terms of the question as it has to be set up and as it has never been set up before, we’ll end up saying and making others say a lot of commonplaces.
The fact is that I already feel I am a prisoner of a kind of style and it is essential that I escape from it at all costs: I’m now trying to write a totally different book, but it’s damned difficult; I’m trying to break up the rhythms, the echoes which I feel the sentences I write eventually slide into, as into pre-existing molds, I try to see facts and things and people in the round instead of being drawn in colors that have no shading. For that reason the book I’m going to write interests me infinitely more than the other one.
As dangerous as the blind adhesion to a style, Calvino writes in a May 1959 letter, is the blind reliance on tools, the cult of medium over message — but harnessing the power of tools is one of the craft’s greatest arts:
One should never have taboos about the tools we use: as long as the thought or images or style one wants to put forward do not become deformed by the medium, one must on the contrary try to make use of the most powerful and most efficient of those tools.
Several years later, Calvino returns to his conception of fiction, this time with more dimension and more sensitivity to the inherent contradictions of literature:
One cannot construct in fiction a harmonious language to express something that is not yet harmonious. We live in a cultural ambience where many different languages and levels of knowledge intersect and contradict each other.
When things get tough, this is what you should do: Make good art. I’m serious. Husband runs off with a politician — make good art. Leg crushed and then eaten by a mutated boa constrictor — make good art. IRS on your trail — make good art. Cat exploded — make good art. Someone on the Internet thinks what you’re doing is stupid or evil or it’s all been done before — make good art. Probably things will work out somehow, eventually time will take the sting away, and that doesn’t even matter. Do what only you can do best: Make good art. Make it on the bad days, make it on the good days, too.
A wise woman once said, “If you are not making mistakes, you’re not taking enough risks.” Gaiman articulates the same sentiment with his own brand of exquisite eloquence:
I hope that in this year to come, you make mistakes.
Because if you are making mistakes, then you are making new things, trying new things, learning, living, pushing yourself, changing yourself, changing your world. You’re doing things you’ve never done before, and more importantly, you’re Doing Something.
So that’s my wish for you, and all of us, and my wish for myself. Make New Mistakes. Make glorious, amazing mistakes. Make mistakes nobody’s ever made before. Don’t freeze, don’t stop, don’t worry that it isn’t good enough, or it isn’t perfect, whatever it is: art, or love, or work or family or life.
“Whether or not there is life on Mars now, there WILL be by the end of this century,” Arthur C. Clarke predicted in 1971 while contemplating humanity’s quest to conquer the Red Planet. “Whatever the reason you’re on Mars is, I’m glad you’re there. And I wish I was with you,” Carl Sagan said a quarter century later in his bittersweet message to future Mars explorers shortly before his death. Sagan, of course, has always been with us — especially as we fulfill, at least partially, Clarke’s prophecy: On March 10, 2006, we put a proxy of human life on, or at least very near, Mars — NASA’s Mars Reconnaissance Orbiter, with its powerful HiRISE telescope, arrived in the Red Planet’s orbit and began mapping its surface in unprecedented detail.
This Is Mars (public library) — a lavish visual atlas by French photographer, graphic designer and editor Xavier Barral, featuring 150 glorious ultra-high-resolution black-and-white images culled from the 30,000 photographs taken by NASA’s MRO, alongside essays by HiRISE principal investigator Alfred S. McEwen, astrophysicist Francis Rocard, and geophysicist Nicolas Mangold — offers an unparalleled glimpse of those mesmerizing visions of otherworldly landscapes beamed back by the MRO in all their romantic granularity, making the ever-enthralling Red Planet feel at once more palpable and more mysterious than ever. At the intersection of art and science, these images belong somewhere between Berenice Abbott’s vintage science photography, the most enchanting aerial photography of Earth, and the NASA Art Project.
At the end of this voyage, I have gathered here the most endemic landscapes. They send us back to Earth, to the genesis of geological forms, and, at the same time, they upend our reference points: dunes that are made of black sand, ice that sublimates. These places and reliefs can be read as a series of hieroglyphs that take us back to our origins.
In the summer of 2010, Brandon Stanton — one of the warmest, most talented and most generous humans I know — lost his job as a bond trader in Chicago and was forced to reinvent his life. Having recently gotten his first camera and fallen in love with photography, he decided to follow that fertile combination of necessity and passion, and, to his parents’ terror and dismay, set out to pursue photography as a hobby-turned-vocation. (For his mother, who saw bond trading as a reputable occupation, photography “seemed like a thinly veiled attempt to avoid employment.”) Brandon recalls:
I had enjoyed my time as a trader. The job was challenging and stimulating. And I’d obsessed over markets in the same way I’d later obsess over photography. But the end goal of trading was always money. Two years of my life were spent obsessing over money, and in the end I had nothing to show for it. I wanted to spend the next phase of my life doing work that I valued as much as the reward.
In photography, he found that rewarding obsession. Approaching it with the priceless freshness of Beginner’s Mind, he brought to his new calling the gift of ignorance and an art of seeing untainted by the arrogance of expertise, hungry to make sense of the world through his lens as he made sense of his own life. And make sense he did: Brandon, who quickly realized that “the best way to become a photographer was to start photographing,” set out on a photo tour across several major American cities, beginning in Pittsburgh and ending up in New York City, where he had only planned to spend a week but where he found both his new home and his new calling.
And so, in a beautiful embodiment of how to find your purpose and do what you love, Brandon’s now-legendary online project documenting Gotham’s living fabric was born — at first a humble Facebook page, which blossomed into one of today’s most popular photojournalism blogs with millions of monthly readers. Now, his photographic census of the world’s most vibrant city spills into the eponymous offline masterpiece Humans of New York (public library) — a magnificent mosaic of lives constructed through four hundred of Brandon’s expressive and captivating photos, many never before featured online.
These portraits — poignant, poetic, playful, heartbreaking, heartening — dance across the entire spectrum of the human condition not with the mockingly complacent lens of a freak-show gawker but with the affectionate admiration and profound respect that one human holds for another.
In the age of the aesthetic consumerism of visual culture online, HONY stands as a warm beacon of humanity, gently reminding us that every image is not a disposable artifact to be used as social currency but a heart that beat in the blink of the shutter, one that will continue to beat with its private turbulence of daily triumphs and tribulations even as we move away from the screen or the page to resume our own lives.
The captions, some based on Brandon’s interviews with the subjects and others an unfiltered record of his own observations, add a layer of thought to the visual story: One photograph, depicting two elderly gentlemen intimately leaning into each other on a park bench, reads: “It takes a lot of disquiet to achieve this sort of quiet comfort.” Another, portraying a very old gentleman in a wheelchair with matching yellow sneakers, shorts, and baseball cap, surprises us by revealing that this is Banana George, world record-holder as the oldest barefoot water-skier.
Some are full of humor:
Damn liberal arts degree.
Others are hopelessly charming:
When I walked by, she was really moving to the music — hands up, head nodding, shoulders swinging. I really wanted to take her photo, so I walked up to the nearest adult and asked: “Does she belong to you?”
Suddenly the music stopped, and I heard: “I belong to myself!”
Others still are humbling and soul-stirring:
My wife passed away a few years back. Her name was Barbara, I used to call her Ba. My name was Lawrence, she used to call me La. When she died, I changed my name to Bala.
I stepped inside an Upper West Side nursing home, and met this man in the lobby. He was on his way to deliver a yellow teddy bear to his wife. “I visit her every day,” he said. “Even when the mind is gone, the heart shows through.”
Then there are the city’s favorite tropes: Its dogs…
…and its bikes…
I’m ninety years old and I ride this thing around everywhere. I don’t see why more people don’t use them. I carry my cane in the basket, I get all my shopping done. I can go everywhere. I’ve never hit anyone and never been hit. Of course, I ride on the sidewalk, which I don’t think I’m supposed to do, but still…
…and the double delight of dogs on bikes:
Above all, however, there is something especially magical about framing these moments of stillness and of absolute attention to the individual amidst this bustling city of millions, a city that never sleeps and never stops.
Whatever your geographic givens, Humans of New York is an absolute masterpiece of cultural celebration, both as vibrant visual anthropology and as a meta-testament, by way of Brandon’s own story, to the heartening notion that this is indeed a glorious age in which we can make our own luck and make a living doing what we love.
For nearly three decades, photographer and visual artist David Maisel — whose gloriously haunting Library of Dust project you might recall from a few years back — has been transforming landscape photography with his stunning aerial images exploring the relationship between Earth and humanity. Now, the best of them are collected in the magnificent monograph Black Maps: American Landscape and the Apocalyptic Sublime (public library) — a lavish large-format tome featuring more than 100 of Maisel’s surreally entrancing portraits of our worldly reality, at once beautiful and tragic. From cyanide-leaching ponds to open-pit mines to the sprawl of urbanization, Maisel’s mesmerizing photographs — which, without context, could be mistaken as much for abstract expressionism as they could for cellular microscopy — capture fragments of the landscape that “correspond to the structure of human thought and feeling.”
Among the biographical sketches is also the story of Lange’s most iconic and infinitely expressive photograph — Migrant Mother, depicting an agricultural worker named Florence Owens Thompson with her children — which came to capture the harrowing realities of the Great Depression not merely as an economic phenomenon but as a human tragedy.
In 1935, Lange and her second husband, the Berkeley economics professor and self-taught photographer Paul Taylor, were transferred to the Resettlement Administration (RA), one of Roosevelt’s New Deal programs designed to help the country recover from the Depression. Lange began working as a Field Investigator and Photographer under Roy Stryker, head of the Information Division.
In early February of 1936, while living in a small two-bedroom house in California with Taylor and her two step-children, Lange received an assignment to photograph California’s rural and urban slums and farmworkers. She was supposed to spend a month on the road, but severe weather along the coast delayed her departure. When she finally set out for Los Angeles, the first destination on her route, she wrote in a letter to Stryker:
Tried to work in the pea camps in heavy rain from the back of the station wagon. I doubt that I got anything. . . . Made other mistakes too. . . . I make the most mistakes on subject matter that I get excited about and enthusiastic. In other words, the worse the work, the richer the material was.
It was in the pea camps that she captured her most iconic image less than two weeks later — an image that, due to its unshakable grip of empathy, would transcend the status of mere visual icon and effect critical cultural awareness on both a social and political level. Partridge writes:
Two weeks of sleet and steady rain had caused a rust blight, destroying the pea crop. There was no work, no money to buy food. Dorothea approached “the hungry and desperate mother,” huddled under a torn canvas tent with her children. The family had been living on frozen vegetables they’d gleaned from the fields and birds the children killed. Working quickly, Dorothea made just a few exposures, climbed back in her car, and drove home.
Dorothea knew the starving pea pickers couldn’t wait for someone in Washington, DC to act. They needed help immediately. She developed the negatives of the stranded family, and rushed several photographs to the San Francisco News. Two of her images accompanied an article on March 10th as the federal government rushed twenty thousand pounds of food to the migrants.
The most remarkable part of the story, however, is that this was an image Lange almost didn’t take: At the end of that cold and wretched winter, she had been on the road for almost a month, with only the insufficient protection of her camera lens between her and the desperate, soul-crushing living and working conditions of California’s migratory farm workers. Downhearted and weary, both physically and psychologically, she decided she had seen and captured enough, packed up her clunky camera equipment, and headed north on Highway 101, bickering with herself in her notebook: “Haven’t you plenty of negatives already on the subject? Isn’t this just one more of the same?” But then something happened — a fleeting glance, one of those pivotal chance encounters that shape lives. Partridge transports us to that fateful March day:
The cold, wet conditions of Northern California gave way to sweltering heat in Los Angeles, a “vile town,” Dorothea wrote. By the beginning of March she was headed home, exhausted, her camera bags packed on the front seat beside her.
Hours later, the hand-lettered “Pea pickers camp” sign flashed by her. Did she have it in her to try one more time?
The long, hard rains that had delayed Dorothea at the outset of her journey had deluged the Nipomo pea pickers. And even as Dorothea drove north and homeward, the camp was still floundering in water and mud. Not long before Dorothea arrived, Florence Thompson and four of her six children, along with some of the other stranded migrants, had moved to a higher, sandy location nearby. Thompson left word at the first camp for her partner, Jim Hill, on where to find them. Earlier in the day he’d set off walking with Thompson’s two sons to find parts for their broken-down car.
The sandy camp in front of a windbreak of eucalyptus trees is where Dorothea pulled in and found Florence Thompson and her children. They were waiting for Hill and the boys to show up, for the ground to dry, for crops to ripen for harvesting. They were waiting for their luck to change.
In minutes, Dorothea took the photograph that would become the definitive icon of the Great Depression, intuitively conveying the migrants’ perilous predicament in the frame of her camera.
In the late 1990s, photographer Jimmy Nelson became fascinated by Earth’s last living indigenous tribes. It took him a decade to begin documenting their fascinating lives, but once he did, what came out of his 4×5 camera was nothing short of mesmerizing — a glimpse of what feels like a parallel universe, or rather parallel multiverses, to our Western eyes, yet one full of our immutable shared humanity. The magnificent results are now gathered in Before They Pass Away (public library) — a lavish large-format tome featuring 500 of Nelson’s striking photographs, standing somewhere between Jeroen Toirkens’s visual catalog of Earth’s last nomads and Rachel Sussman’s photographic record of the oldest living things in the world.
The journey took Nelson all over the world, from the deserts of Africa to the steppes of Siberia. He writes:
I wanted to create an ambitious aesthetic photographic document that would stand the test of time. A body of work that would be an irreplaceable ethnographic record of a fast disappearing world.
The semi-nomadic Kazakhs, descended from the Huns, have been herding in the valleys of Mongolia since the 19th century and take great pride in their ancient art of eagle-hunting.
The Huli of Papua New Guinea migrated to the island about 45,000 years ago. Today, the remaining tribes often fight with one another for resources — land, livestock, women. To intimidate the enemy, the largest tribe, the Huli wigmen, continue the ancient tradition of painting their faces in yellow, red and white and making elaborate wigs of their own hair.
Though the Gauchos of South America might appear more “modern” than other indigenous tribes, these free-spirited nomadic horsemen have remained a self-contained culture since they first started roaming the prairies in the 1700s.
A distinct ethnic group and even more distinct cultural collective, Tibetans, descended from aboriginal and nomadic Qiang tribes, are known for their prayer flags, sky burials, spirit traps, and festival devil dances, which encapsulate their history and beliefs.
The Maasai endure as one of the oldest and greatest warrior cultures. As they migrated from the Sudan in the 15th century, they took possession of the local tribes’ cattle and conquered much of the Rift Valley. To this day, they depend on the natural cycles of rainfall and drought for their cattle, which remain their core source of sustenance.
The reindeer-herding Nenets of northern Arctic Russia have thrived for over a millennium at temperatures ranging from 58°F below zero in the winter to 95°F in the summer, migrating more than 620 miles each year, 30 of which are the grueling crossing of the frozen Ob River.
Originally featured in November — see more here, including Nelson’s entertaining and moving TEDxAmsterdam talk.
How do people come to feel so passionately about fairness and freedom that they will risk their livelihoods, even their lives, to pursue justice? A few years ago, I became fascinated by such people—people for whom the “rule of law” is no mere abstraction, for whom human rights is a fiercely urgent concern. I wanted to give a face to social justice by making portraits of human rights pioneers. I am a photographer. I understand by seeing. Peering through the camera lens, I hoped to gain an understanding of how they become so devoted to the rights and dignity of others.
Accompanying each portrait is a micro-essay exploring the life, legacy, and singular spirit of its subject.
In 2007, 26-year-old amateur historian and collector John Maloof wandered into the auction house across from his home and won, for $380, a box of 30,000 extraordinary negatives by an unknown artist whose street photographs of mid-century Chicago and New York rivaled those of Berenice Abbott and predated modern fixtures like Humans of New York by decades. They turned out to be the work of a mysterious nanny named Vivian Maier, who made a living by raising wealthy suburbanites’ children and made her life by capturing the world around her in exquisite detail and striking composition. Mesmerized, Maloof began tracking down more of Maier’s work and amassed more than 100,000 negatives, thousands of prints, 700 rolls of undeveloped color film, home movies, audio interviews, and even her original cameras. Only after Maier’s death in 2009 did her remarkable work gain international acclaim — exhibitions were staged all over the world, a magnificent monograph of her photographs was published, and a documentary was made.
As secretive as Vivian Maier was in life, in death her mystery has only deepened. Without the creator to reveal her motives and her craft, we are left to piece together the life and intent of an artist based on scraps of evidence, with no way to gain definitive answers.
There is, however, something fundamentally unsettling about this proposition — after all, a human being is a constantly evolving open question rather than a definitive answer, a fluid self trapped only by the labels applied from without. And so even though Maloof argues that the book answers “the nagging question of who Vivian Maier really was” by revealing her true self through her self-portraits, what it really does — and what its greatest, most enchanting gift is — is take us along as silent companions on a complex woman’s journey of self-knowledge and creative exploration, a journey without a definitive destination but one that is its own reward.
It’s also, however, hopelessly human to try to interpret others and assign them into categories based on the “scraps of evidence” they bequeath. I was certainly not immune to this tendency, as I began to suspect Maier was a queer woman who found in her art a vehicle for connection, for belonging, for feeling at once a part of the society she documented and an onlooker forever separated by her lens. Because we know so little about Maier’s life, this remains nothing more than intuitive speculation — but one I find increasingly hard to dismiss as her self-portraits peel off another layer of guarded intimacy.
The beauty and magnetism of Vivian Maier: Self-Portraits is that it leaves you with your own interpretations, not with definitive answers but with crystalline awareness of Maier’s elusive selfhood.