For double delight, the recipes — ranging from Willy Wonka’s Nutty Crunch Surprise to Lickable Wallpaper — are garnished with illustrations by the great Sir Quentin Blake, who had previously illustrated most of Dahl’s stories (as well as Sylvia Plath’s little-known children’s book and the first Dr. Seuss book not illustrated by Geisel himself).
The concept for the cookbook came to the Dahls shortly before Roald’s death in 1990, as they were writing a memoir of sorts about the foods they loved. Friends kept suggesting that they should consider writing a recipe book for children, based on the many fanciful edibles in Dahl’s books. But whenever the idea resurfaced, Roald would bury his face in his hands and gasp to his wife, “Oh no, Liccy, the work! The thought daunts me!”
A few weeks after his death, as Mrs. Dahl was making her way through the grief, she noticed a neat pile of papers in the corner of her desk. Listed on the sheets was every single food ever consumed in Roald’s books. Atop the pile was a note in her husband’s handwriting: “It’s a great idea, but God knows how you will do it.”
Treats were an essential part of Roald’s life — never too many, never too few, and always perfectly timed. He made you feel like a king receiving the finest gift in the land.
A treat could be a wine gum lifted silently in the middle of the night out of a large sweet jar kept permanently by his bedside. It could be a lobster and oyster feast placed on the table after a secret visit to the fishmonger, his favorite shop. It could be the first new potato, broad bean, or lettuce from the garden, a basket of field mushrooms, or a superb conker. A different kind of treat would be an unannounced visit to a school, causing chaos to teachers and, I suspect, a great deal of fun for the children.
Here is but a sampler taste of the full spread of delights:
8×10 inch shallow pan
7 ounces semisweet chocolate, broken into small pieces
4 tablespoons (½ stick) unsalted butter
5 tablespoons light corn syrup
3 ounces slivered almonds
6 plain vanilla cookies (Rich Tea biscuits are good) or graham crackers, finely crushed
1 ounce Rice Krispies
a few drops of vanilla extract
For the nutty crunch:
2 tablespoons water
½ cup sugar
2 ounces slivered almonds, finely chopped
For the chocolate coating:
7 ounces milk chocolate, broken into small pieces.
Put the semisweet chocolate, butter, and corn syrup in a Pyrex bowl and place in a saucepan of simmering water. Stir occasionally until melted. (Or place the bowl in a microwave oven and cook on high for about 1½ minutes.)
Add the almonds, crushed cookies, Rice Krispies, and vanilla extract and mix well.
Spoon the mixture into a shallow pan lined with wax paper. Press the mixture down firmly with the back of a fork, creating a level surface.
Refrigerate until cool, then cut into bars.
Once the bars are ready, make the nutty crunch. Begin by placing the water and sugar in a small saucepan. Cook over low heat until the sugar has dissolved. Do not stir, but occasionally swirl the pan around gently. Once the sugar has dissolved, increase the heat and stir constantly until the sugar caramelizes and turns golden brown, about 2 to 3 minutes.
Remove from the heat. Working quickly, add the chopped almonds, stir thoroughly, and dip one end of each bar in the mixture. Place the bars on a sheet of buttered wax paper to set.
Melt the milk chocolate in a Pyrex bowl set in a saucepan of simmering water, or microwave as above. Once it has melted, remove from the heat and dip the other end of each bar in the chocolate.
food processor (optional)
two round cookie cutters, 1 ¼ inches and 2 ½ inches
½ cup light brown sugar, firmly packed
4 tablespoons unsalted butter
1 egg, lightly beaten
1 pound all-purpose flour
½ tablespoon baking powder
½ tablespoon cinnamon
a large pinch of salt
2 tablespoons hot water
¼ teaspoon vanilla extract
½ cup milk
vegetable oil for deep frying
sugar for coating
These are best eaten warm. The dough needs to be made and refrigerated for at least two hours before cooking, and will keep overnight in the refrigerator.
Cream the brown sugar and butter until pale and creamy — this can be done using a food processor.
Gradually add the egg until blended.
Add the remaining ingredients. The dough should be fairly stiff but smooth.
Cover with plastic wrap and refrigerate for at least 2 hours.
Divide the dough in half and return one half to the refrigerator.
On a floured surface, roll out the other half of the dough to a quarter inch thick. With the cutters, cut out as many doughnuts as possible, using the large one to cut the doughnut shape and the smaller one to make the hole.
Gather up the scraps and roll and cut out as many additional doughnuts as possible. Repeat the rolling and cutting with the remaining half of the dough.
Heat the vegetable oil to 375ºF.
Fry the doughnuts in small batches, turning once, until deep golden brown.
Drain on paper towels.
Put the sugar in a bowl and add a few doughnuts at a time, shaking them in the sugar until coated. Serve immediately.
blackbird (a black pastry funnel found in specialty cooks’ shops and mail order catalogs)
9-inch pie dish
¼ cup pearl barley
2 tablespoons unsalted butter
1 onion, finely chopped
1 pound turkey breast, cut into thin strips
12 ounces pork sausage meat
2 tablespoons chopped fresh sage (optional)
5 ounces sour cream
5 ounces plain yogurt
1 level teaspoon cornstarch, mixed with 1 teaspoon cold water
½ cup chicken stock
2 eggs, one beaten, one hard-boiled and chopped
salt and pepper
2 ounces ham, chopped
9 ounces ready-made puff pastry or instant biscuit dough
1 egg yolk
8 parsley sprigs with the leaves pinched off or colored pipe cleaners
Simmer the pearl barley in water for about 20 minutes, or until soft.
In a large saucepan melt the butter and gently fry the onion until soft. Add the turkey strips and fry quickly until golden.
Remove the saucepan from the heat and add the sausage meat. Mix well.
Add the sage (if using), sour cream, yogurt, cornstarch mixture, chicken stock, and beaten egg. Season to taste with salt and pepper and mix thoroughly.
Place the blackbird in the middle of the pie dish. Surround with the turkey mixture. Sprinkle on the chopped ham, followed by the chopped egg.
Preheat the oven to 400ºF.
Roll out the pastry to a circle 1/8 inch thick. Make sure it is at least one inch wider than the pie dish all the way around.
Cut the extra one inch from the pastry in one long circular strip (it should be slightly larger than the rim of the pie dish). Brush the pie dish rim with egg yolk, press the pastry strip down onto the rim, and brush the strip with egg yolk.
Lift the remaining pastry carefully (you can drape it over the rolling pin) and lay it over the turkey mixture. Cut a slit in the center and ease the blackbird’s beak through the pastry, taking care not to stretch it. Press the pastry down firmly along the rim and cut away any excess. Use a fork to crimp the edge.
Glaze the pastry with egg yolk and scatter the pearl barley on top. Form a “worm” out of a strip of pastry, glaze it with egg yolk, and place it inside the bird’s beak.
Refrigerate the pie for ten minutes.
Bake for 30 to 40 minutes, or until the pastry is well risen and golden brown.
Stick the stripped parsley stalks, or folded pipe cleaners, in pairs into the pastry crust to look like birds’ legs. If you like, singe the ends to look like toes.
The project is in many ways an organic extension of the Little Golden Book ethos, which has sustained generations through troubled times with creative nourishment for young souls. This compendium offers heartening solace for those weary of the hardships our world is currently facing. Diane Muldrow, longtime editor of the beloved children’s series, writes in the introduction:
We’ve been forced to look at ourselves and how we’re living our lives. Ironically, in this health-conscious, ecologically aware age of information, many of us have overborrowed, overspent, overeaten, and generally overdosed on habits or ways of life that aren’t good for us — or for our world. The chickens have come home to roost, and their names are Debt, Depression, and Diabetes.
How did we get here? How, like Tootle the Train, did we get so off track? Perhaps it’s time to revisit these beloved stories and start all over again. Trying to figure out where you belong, like Scuffy the Tugboat? Maybe, as time marches on, you’re beginning to feel that you resemble the Saggy Baggy Elephant.
Or perhaps your problems are more sweeping. Like the Poky Little Puppy, do you seem to be getting into trouble rather often and missing out on the strawberry shortcake in life? Maybe this book can help you! After all, Little Golden Books were first published during the dark days of World War II, and they’ve been comforting people during trying times ever since — while gently teaching us a thing or two. And they remind us that we’ve had the potential to be wise and content all along.
“How we spend our days,” Annie Dillard wrote in her timelessly beautiful meditation on presence over productivity, “is, of course, how we spend our lives.” And nowhere do we fail at the art of presence more miserably and more tragically than in urban life — in the city, high on the cult of productivity, where we float past each other, past the buildings and trees and the little boy in the purple pants, past life itself, cut off from the breathing of the world by iPhone earbuds and solipsism. And yet: “The art of seeing has to be learned,” Marguerite Duras reverberates — and it can be learned, as cognitive scientist Alexandra Horowitz invites us to believe in her breathlessly wonderful On Looking: Eleven Walks with Expert Eyes (public library), also among the year’s best psychology books — a record of her quest to walk around a city block with eleven different “experts,” from an artist to a geologist to a dog, and emerge with fresh eyes mesmerized by the previously unseen fascinations of a familiar world. It is undoubtedly one of the most stimulating books of the year, if not the decade, and the most enchanting thing I’ve read in ages. In a way, it’s the opposite but equally delightful mirror image of Christoph Niemann’s Abstract City — a concrete, immersive examination of urbanity — blending the mindfulness of Sherlock Holmes with the expansive sensitivity of Thoreau.
Horowitz begins by pointing our attention to the incompleteness of our experience of what we conveniently call “reality”:
Right now, you are missing the vast majority of what is happening around you. You are missing the events unfolding in your body, in the distance, and right in front of you.
By marshaling your attention to these words, helpfully framed in a distinct border of white, you are ignoring an unthinkably large amount of information that continues to bombard all of your senses: the hum of the fluorescent lights, the ambient noise in a large room, the places your chair presses against your legs or back, your tongue touching the roof of your mouth, the tension you are holding in your shoulders or jaw, the map of the cool and warm places on your body, the constant hum of traffic or a distant lawn-mower, the blurred view of your own shoulders and torso in your peripheral vision, a chirp of a bug or whine of a kitchen appliance.
This adaptive ignorance, she argues, is there for a reason — we celebrate it as “concentration” and welcome its way of easing our cognitive overload by allowing us to conserve our precious mental resources only for the stimuli of immediate and vital importance, and to dismiss or entirely miss all else. (“Attention is an intentional, unapologetic discriminator,” Horowitz tells us. “It asks what is relevant right now, and gears us up to notice only that.”) But while this might make us more efficient in our goal-oriented day-to-day, it also makes us inhabit a largely unlived — and unremembered — life, day in and day out.
For Horowitz, the awakening to this incredible, invisible backdrop of life came thanks to Pumpernickel, her “curly haired, sage mixed breed” (who also inspired Horowitz’s first book, the excellent Inside of a Dog: What Dogs See, Smell, and Know), as she found herself taking countless walks around the block, becoming more and more aware of the dramatically different experiences she and her canine companion were having along the exact same route:
Minor clashes between my dog’s preferences as to where and how a walk should proceed and my own indicated that I was experiencing almost an entirely different block than my dog. I was paying so little attention to most of what was right before us that I had become a sleepwalker on the sidewalk. What I saw and attended to was exactly what I expected to see; what my dog showed me was that my attention invited along attention’s companion: inattention to everything else.
The book was her answer to the disconnect, an effort to “attend to that inattention.” It is not, she warns us, “about how to bring more focus to your reading of Tolstoy or how to listen more carefully to your spouse.” Rather, it is an invitation to the art of observation:
Together, we became investigators of the ordinary, considering the block — the street and everything on it—as a living being that could be observed.
In this way, the familiar becomes unfamiliar, and the old the new.
Horowitz’s approach is based on two osmotic human tendencies: our shared capacity to truly see what is in front of us, despite our conditioned concentration that obscures it, and the power of individual bias in perception — or what we call “expertise,” acquired by passion or training or both — in bringing attention to elements that elude the rest of us. What follows is a whirlwind of endlessly captivating exercises in attentive bias as Horowitz, with her archetypal New Yorker’s “special fascination with the humming life-form that is an urban street,” and her diverse companions take to the city.
First, she takes a walk all by herself, trying to note everything observable, and we quickly realize that besides her deliciously ravenous intellectual curiosity, Horowitz is a rare magician with language. (“The walkers trod silently; the dogs said nothing. The only sound was the hum of air conditioners,” she beholds her own block; passing a pile of trash bags graced by a stray Q-tip, she ponders parenthetically, “how does a Q-tip escape?”; turning her final corner, she gazes at the entrance of a mansion and “its pair of stone lions waiting patiently for royalty that never arrives.” Stunning.)
I would find myself at once alarmed, delighted, and humbled at the limitations of my ordinary looking. My consolation is that this deficiency of mine is quite human. We see, but we do not see: we use our eyes, but our gaze is glancing, frivolously considering its object. We see the signs, but not their meanings. We are not blinded, but we have blinders.
In 1865, when he was only thirty, Mark Twain penned a playful short story mischievously encouraging girls to think independently rather than blindly obey rules and social mores. In the summer of 2011, I chanced upon and fell in love with a lovely Italian edition of this little-known gem with Victorian-scrapbook-inspired artwork by celebrated Russian-born children’s book illustrator Vladimir Radunsky. I knew the book had to come to life in English, so I partnered with the wonderful Claudia Zoe Bedrick of Brooklyn-based indie publishing house Enchanted Lion, maker of extraordinarily beautiful picture-books, and we spent the next two years bringing Advice to Little Girls (public library) to life in America — a true labor-of-love project full of so much delight for readers of all ages. (And how joyous to learn that it was also selected among NPR’s best books of 2013!)
Good little girls ought not to make mouths at their teachers for every trifling offense. This retaliation should only be resorted to under peculiarly aggravated circumstances.
If you have nothing but a rag-doll stuffed with sawdust, while one of your more fortunate little playmates has a costly China one, you should treat her with a show of kindness nevertheless. And you ought not to attempt to make a forcible swap with her unless your conscience would justify you in it, and you know you are able to do it.
One can’t help but wonder whether this particular bit may have in part inspired the irreverent 1964 anthology Beastly Boys and Ghastly Girls and its mischievous advice on brother-sister relations:
If at any time you find it necessary to correct your brother, do not correct him with mud — never, on any account, throw mud at him, because it will spoil his clothes. It is better to scald him a little, for then you obtain desirable results. You secure his immediate attention to the lessons you are inculcating, and at the same time your hot water will have a tendency to move impurities from his person, and possibly the skin, in spots.
If your mother tells you to do a thing, it is wrong to reply that you won’t. It is better and more becoming to intimate that you will do as she bids you, and then afterward act quietly in the matter according to the dictates of your best judgment.
Good little girls always show marked deference for the aged. You ought never to ‘sass’ old people unless they ‘sass’ you first.
Originally featured in April — see more spreads, as well as the story behind the project, here.
The ideas presented on Edge are speculative; they represent the frontiers in such areas as evolutionary biology, genetics, computer science, neurophysiology, psychology, cosmology, and physics. Emerging out of these contributions is a new natural philosophy, new ways of understanding physical systems, new ways of thinking that call into question many of our basic assumptions.
Perhaps the greatest pleasure in science comes from theories that derive the solution to some deep puzzle from a small set of simple principles in a surprising way. These explanations are called ‘beautiful’ or ‘elegant.’
The contributions presented here embrace scientific thinking in the broadest sense: as the most reliable way of gaining knowledge about anything — including such fields of inquiry as philosophy, mathematics, economics, history, language, and human behavior. The common thread is that a simple and nonobvious idea is proposed as the explanation of a diverse and complicated set of phenomena.
Observe a single ant, and it doesn’t make much sense, walking in one direction, suddenly careening in another for no obvious reason, doubling back on itself. Thoroughly unpredictable.
The same happens with two ants, a handful of ants. But a colony of ants makes fantastic sense. Specialized jobs, efficient means of exploiting new food sources, complex underground nests with temperature regulated within a few degrees. And critically, there’s no blueprint or central source of command — each individual ant has algorithms for its behaviors. But this is not wisdom of the crowd, where a bunch of reasonably informed individuals outperform a single expert. The ants aren’t reasonably informed about the big picture. Instead, the behavior algorithms of each ant consist of a few simple rules for interacting with the local environment and local ants. And out of this emerges a highly efficient colony.
Ant colonies excel at generating trails that connect locations in the shortest possible way, accomplished with simple rules about when to lay down a pheromone trail and what to do when encountering someone else’s trail — approximations of optimal solutions to the Traveling Salesman problem. This has useful applications. In “ant-based routing,” simulations using virtual ants with similar rules can generate optimal ways of connecting the nodes in a network, something of great interest to telecommunications companies. It applies to the developing brain, which must wire up vast numbers of neurons with vaster numbers of connections without constructing millions of miles of connecting axons. And migrating fetal neurons generate an efficient solution with a different version of ant-based routing.
A wonderful example is how local rules about attraction and repulsion (i.e., positive and negative charges) allow simple molecules in an organic soup to occasionally form more complex ones. Life may have originated this way without the requirement of bolts of lightning to catalyze the formation of complex molecules.
And why is self-organization so beautiful to my atheistic self? Because if complex, adaptive systems don’t require a blueprint, they don’t require a blueprint maker. If they don’t require lightning bolts, they don’t require Someone hurling lightning bolts.
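The local pheromone rules the essay describes can be made concrete in a few lines of code. The following is a purely illustrative toy sketch (the graph, parameters, and function name are all invented here, not drawn from the book): each virtual ant walks a weighted graph choosing edges with probability proportional to pheromone strength divided by edge length, and shorter completed walks deposit more pheromone, so the colony gradually concentrates on the shortest route with no central planner.

```python
import random

def ant_shortest_path(graph, start, end, n_ants=200, evaporation=0.5, seed=0):
    """Toy ant-based routing: ants walk from start to end choosing edges
    with probability proportional to pheromone / distance; shorter walks
    deposit more pheromone, which concentrates on the best route."""
    rng = random.Random(seed)
    # begin with a small, equal amount of pheromone on every directed edge
    pheromone = {(a, b): 1.0 for a in graph for b in graph[a]}
    best_path, best_len = None, float("inf")
    for _ in range(n_ants):
        node, path, length = start, [start], 0.0
        while node != end:
            # local rule: prefer strong pheromone and short edges,
            # avoiding nodes already visited on this walk
            choices = [n for n in graph[node] if n not in path]
            if not choices:            # dead end: abandon this ant
                length = float("inf")
                break
            weights = [pheromone[(node, n)] / graph[node][n] for n in choices]
            nxt = rng.choices(choices, weights=weights)[0]
            length += graph[node][nxt]
            path.append(nxt)
            node = nxt
        if length < best_len:
            best_path, best_len = path, length
        # evaporate everywhere, then deposit in proportion to walk quality
        for edge in pheromone:
            pheromone[edge] = pheromone[edge] * (1 - evaporation) + 1e-6
        if length < float("inf"):
            for a, b in zip(path, path[1:]):
                pheromone[(a, b)] += 1.0 / length

    return best_path, best_len

# A four-node network: the direct-looking routes are longer than A→B→C→D.
graph = {
    "A": {"B": 1, "C": 4},
    "B": {"A": 1, "C": 1, "D": 5},
    "C": {"A": 4, "B": 1, "D": 1},
    "D": {"B": 5, "C": 1},
}
path, dist = ant_shortest_path(graph, "A", "D")
print(path, dist)
```

No individual ant knows the topology; the shortest route emerges from evaporation and deposition alone, which is the core of the "no blueprint, no blueprint maker" point.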
In a planet occupied now by seven billion inhabitants, I am amazed by the difference that one human being can make. Think of classical music without Mozart or Stravinsky; of painting without Caravaggio, Picasso or Pollock; of drama without Shakespeare or Beckett. Think of the incredible contributions of Michelangelo or Leonardo, or, in recent times, the outpouring of deep feeling at the death of Steve Jobs (or, for that matter, Michael Jackson or Princess Diana). Think of human values in the absence of Moses or Christ.
Despite the laudatory efforts of scientists to ferret out patterns in human behavior, I continue to be struck by the impact of single individuals, or of small groups, working against the odds. As scholars, we cannot and should not sweep these instances under the investigative rug. We should bear in mind anthropologist Margaret Mead’s famous injunction: ‘Never doubt that a small group of thoughtful committed citizens can change the world. It is the only thing that ever has.’
In art, the title of a work can often be its first explanation. And in this context I am thinking especially of the titles of Gerhard Richter. In 2006, when I visited Richter in his studio in Cologne, he had just finished a group of six corresponding abstract paintings which he gave the title Cage.
There are many relations between Richter’s painting and the compositions of John Cage. In a book about the Cage series, Robert Storr has traced them from Richter’s attendance of a Cage performance at the Festum Fluxorum Fluxus in Düsseldorf in 1963 to analogies in their artistic processes. Cage has often applied chance procedures in his compositions, notably with the use of the I Ching. Richter in his abstract paintings also intentionally allows effects of chance. In these paintings, he applies the oil paint on the canvas by means of a large squeegee. He selects the colors on the squeegee, but the factual trace that the paint leaves on the canvas is to a large extent the outcome of chance.
Richter’s concise title, Cage, can be unfolded into an extensive interpretation of these abstract paintings (and of other works) — but, one can say, the short form already contains everything. The title, like an explanation of a phenomenon, unlocks the works, describing their relation to one of the most important cultural figures of the twentieth century, John Cage, who shares with Richter the great themes of chance and uncertainty.
Writer, artist, and designer Douglas Coupland, whose biography of Marshall McLuhan remains indispensable, offers a lyrical meditation on the peculiar odds behind coincidences and déjà vus:
I take comfort in the fact that there are two human moments that seem to be doled out equally and democratically within the human condition — and that there is no satisfying ultimate explanation for either. One is coincidence, the other is déjà vu. It doesn’t matter if you’re Queen Elizabeth, one of the thirty-three miners rescued in Chile, a South Korean housewife or a migrant herder in Zimbabwe — in the span of 365 days you will pretty much have two déjà vus as well as one coincidence that makes you stop and say, “Wow, that was a coincidence.”
The thing about coincidence is that when you imagine the umpteen trillions of coincidences that can happen at any given moment, the fact is, that in practice, coincidences almost never do occur. Coincidences are actually so rare that when they do occur they are, in fact, memorable. This suggests to me that the universe is designed to ward off coincidence whenever possible — the universe hates coincidence — I don’t know why — it just seems to be true. So when a coincidence happens, that coincidence had to work awfully hard to escape the system. There’s a message there. What is it? Look. Look harder. Mathematicians perhaps have a theorem for this, and if they do, it might, by default, be a theorem for something larger than what they think it is.
What’s both eerie and interesting to me about déjà vus is that they occur almost like metronomes throughout our lives, about one every six months, a poetic timekeeping device that, at the very least, reminds us we are alive. I can safely assume that my thirteen-year-old niece, Stephen Hawking, and someone working in a Beijing luggage-making factory each experience two déjà vus a year. Not one. Not three. Two.
The underlying biodynamics of déjà vus is probably ascribable to some sort of tingling neurons in a certain part of the brain, yet this doesn’t tell us why they exist. They seem to me to be a signal from a larger point of view that wants to remind us that our lives are distinct, that they have meaning, and that they occur throughout a span of time. We are important, and what makes us valuable to the universe is our sentience and our curse and blessing of perpetual self-awareness.
Given my soft spot for famous diaries, it should come as no surprise that I keep one myself. Perhaps the greatest gift of the practice has been the daily habit of reading what I had written on that day a year earlier; not only is it a remarkable tool of introspection and self-awareness, but it also illustrates that our memory “is never a precise duplicate of the original [but] a continuing act of creation” and how flawed our perception of time is — almost everything that occurred a year ago appears as having taken place either significantly further in the past (“a different lifetime,” I’d often marvel at this time-illusion) or significantly more recently (“this feels like just last month!”). Rather than a personal deficiency of those of us befallen by this tendency, however, it turns out to be a defining feature of how the human mind works, the science of which is at first unsettling, then strangely comforting, and altogether intensely interesting.
That’s precisely what acclaimed BBC broadcaster and psychology writer Claudia Hammond explores in Time Warped: Unlocking the Mysteries of Time Perception (public library) — a fascinating foray into the idea that our experience of time is actively constructed by our own minds, and into how these sensations of what neuroscientists and psychologists call “mind time” arise. It is also among the year’s best psychology books. As disorienting as the concept might seem — after all, we’ve been nursed on the belief that time is one of those few utterly reliable and objective things in life — it is also strangely empowering to think that the very phenomenon depicted as the unforgiving dictator of life is something we might be able to shape and benefit from. Hammond writes:
We construct the experience of time in our minds, so it follows that we are able to change the elements we find troubling — whether it’s trying to stop the years racing past, or speeding up time when we’re stuck in a queue, trying to live more in the present, or working out how long ago we last saw our old friends. Time can be a friend, but it can also be an enemy. The trick is to harness it, whether at home, at work, or even in social policy, and to work in line with our conception of time. Time perception matters because it is the experience of time that roots us in our mental reality. Time is not only at the heart of the way we organize life, but the way we experience it.
Among the most intriguing illustrations of “mind time” is the incredible elasticity of how we experience time. (“Where is it, this present?” William James famously wondered. “It has melted in our grasp, fled ere we could touch it, gone in the instant of becoming.”) For instance, Hammond points out, we slow time down when gripped by mortal fear — the cliché about the slow-motion car crash is, in fact, a cognitive reality. This plays out even in situations that aren’t life-or-death per se but are still associated with strong feelings of fear. Hammond points to a study in which people with arachnophobia were asked to look at spiders — the very object of their intense fear — for 45 seconds and they overestimated the elapsed time. The same pattern was observed in novice skydivers, who estimated the duration of their peers’ falls as short, whereas their own, from the same altitude, were deemed longer.
Inversely, time seems to speed up as we get older — a phenomenon that competing theories have attempted to explain. One, known as the “proportionality theory,” uses pure mathematics, holding that a year feels faster when you’re 40 than when you’re 8 because it only constitutes one fortieth of your life rather than a whole eighth. Among its famous proponents are Vladimir Nabokov and William James. But Hammond remains unconvinced:
The problem with the proportionality theory is that it fails to account for the way we experience time at any one moment. We don’t judge one day in the context of our whole lives. If we did, then for a 40-year-old every single day should flash by because it is less than one fourteen-thousandth of the life they’ve had so far. It should be fleeting and inconsequential, yet if you have nothing to do or an enforced wait at an airport for example, a day at 40 can still feel long and boring and surely longer than a fun day at the seaside packed with adventure for a child. … It ignores attention and emotion, which … can have a considerable impact on time perception.
Another theory suggests that perhaps it is the tempo of life in general that has accelerated, making things from the past appear as slower, including the passage of time itself.
But one definite change does take place with age: As we grow older, we tend to feel like the previous decade elapsed more rapidly, while the earlier decades of our lives seem to have lasted longer. Similarly, we tend to think of events that took place in the past 10 years as having happened more recently than they actually did. (Quick: What year did the devastating Japanese tsunami hit? When did we lose Maurice Sendak?) Conversely, we perceive events that took place more than a decade ago as having happened even longer ago. (When did Princess Diana die? What year was the Chernobyl disaster?) This, Hammond points out, is known as “forward telescoping”:
It is as though time has been compressed and — as if looking through a telescope — things seem closer than they really are. The opposite is called backward or reverse telescoping, also known as time expansion. This is when you guess that events happened longer ago than they really did. This is rare for distant events, but not uncommon for recent weeks.
The most straightforward explanation for it is called the clarity of memory hypothesis, proposed by the psychologist Norman Bradburn in 1987. This is the simple idea that because we know that memories fade over time, we use the clarity of a memory as a guide to its recency. So if a memory seems unclear we assume it happened longer ago.
“Still this childish fascination with my handwriting,” young Susan Sontag wrote in her diary in 1949. “To think that I always have this sensuous potentiality glowing within my fingers.” This is the sort of sensuous potentiality that comes aglow in Self-Portrait as Your Traitor (public library) — the magnificent collection of hand-lettered poems and illustrated essays by friend-of-Brain-Pickings and frequent contributor Debbie Millman. In the introduction, design legend Paula Scher aptly describes this singular visual form as a “21st-century illuminated manuscript.” Personal bias aside, these moving, lovingly crafted poems and essays — some handwritten, some drawn with colored pencils, some typeset in felt on felt — vibrate at that fertile intersection of the deeply personal and the universally profound.
John Maeda once explained, “The computer will do anything within its abilities, but it will do nothing unless commanded to do so.” I think people are the same — we like to operate within our abilities. But whereas the computer has a fixed code, our abilities are limited only by our perceptions. Two decades since determining my code, and after 15 years of working in the world of branding, I am now in the process of rewriting the possibilities of what comes next. I don’t know exactly what I will become; it is not something I can describe scientifically or artistically. Perhaps it is a “code in progress.”
Self-Portrait as Your Traitor, a glorious large-format tome full of textured colors to which the screen does absolutely no justice, is the result of this progress — a brave and heartening embodiment of what it truly means, as Rilke put it, to live the questions; the stunning record of one woman’s personal and artistic code-rewriting, brimming with wisdom on life and art for all.
Originally featured in November. See an exclusive excerpt here, then take a peek at Debbie’s creative process here.
6. SUSAN SONTAG: THE COMPLETE ROLLING STONE INTERVIEW
In 1978, Rolling Stone contributing editor Jonathan Cott interviewed Susan Sontag in twelve hours of conversation, beginning in Paris and continuing in New York, only a third of which was published in the magazine. More than three decades later and almost a decade after Sontag’s death, the full, wide-ranging magnificence of their tête-à-tête, spanning literature, philosophy, illness, mental health, music, art, and much more, is at last released in Susan Sontag: The Complete Rolling Stone Interview (public library) — a rare glimpse of one of modern history’s greatest minds in her element.
Cott marvels at what made the dialogue especially extraordinary:
Unlike almost any other person whom I’ve ever interviewed — the pianist Glenn Gould is the one other exception — Susan spoke not in sentences but in measured and expansive paragraphs. And what seemed most striking to me was the exactitude and “moral and linguistic fine-tuning” — as she once described Henry James’s writing style — with which she framed and elaborated her thoughts, precisely calibrating her intended meanings with parenthetical remarks and qualifying words (“sometimes,” “occasionally,” “usually,” “for the most part,” “in almost all cases”), the munificence and fluency of her conversation manifesting what the French refer to as an ivresse du discours — an inebriation with the spoken word. “I am hooked on talk as a creative dialogue,” she once remarked in her journals, and added: “For me, it’s the principal medium of my salvation.”
I really believe in history, and that’s something people don’t believe in anymore. I know that what we do and think is a historical creation. I have very few beliefs, but this is certainly a real belief: that most everything we think of as natural is historical and has roots — specifically in the late eighteenth and early nineteenth centuries, the so-called Romantic revolutionary period — and we’re essentially still dealing with expectations and feelings that were formulated at that time, like ideas about happiness, individuality, radical social change, and pleasure. We were given a vocabulary that came into existence at a particular historical moment. So when I go to a Patti Smith concert at CBGB, I enjoy, participate, appreciate, and am tuned in better because I’ve read Nietzsche.
What I want is to be fully present in my life — to be really where you are, contemporary with yourself in your life, giving full attention to the world, which includes you. You are not the world, the world is not identical to you, but you’re in it and paying attention to it. That’s what a writer does — a writer pays attention to the world. Because I’m very against this solipsistic notion that you find it all in your head. You don’t, there really is a world that’s there whether you’re in it or not.
I want to feel as responsible as I possibly can. As I told you before, I hate feeling like a victim, which not only gives me no pleasure but also makes me feel very uncomfortable. Insofar as it’s possible, and not crazy, I want to enlarge to the furthest extent possible my sense of my own autonomy, so that in friendship and love relationships I’m eager to take responsibility for both the good and the bad things. I don’t want this attitude of “I was so wonderful and that person did me in.” Even when it’s sometimes true, I’ve managed to convince myself that I was at least co-responsible for bad things that have happened to me, because it actually makes me feel stronger and makes me feel that things could perhaps be different.
The conversation, in which Sontag reaches unprecedented depths of self-revelation, also debunks some misconceptions about her public image as an intellectual in the dry, scholarly sense of the term:
Most of what I do, contrary to what people think, is so intuitive and unpremeditated and not at all that kind of cerebral, calculating thing people imagine it to be. I’m just following my instincts and intuitions. […] An argument appears to me much more like the spokes of a wheel than the links of a chain.
A lot of our ideas about what we can do at different ages and what age means are so arbitrary — as arbitrary as sexual stereotypes. I think that the young-old polarization and the male-female polarization are perhaps the two leading stereotypes that imprison people. The values associated with youth and with masculinity are considered to be the human norms, and anything else is taken to be at least less worthwhile or inferior. Old people have a terrific sense of inferiority. They’re embarrassed to be old. What you can do when you’re young and what you can do when you’re old is as arbitrary and without much basis as what you can do if you’re a woman or what you can do if you’re a man.
Originally featured in November — take a closer look here and here.
Cautious that “all advice can only be a product of the man who gives it” — a caveat other literary legends have stressed with varying degrees of irreverence — Thompson begins with a necessary disclaimer about the very notion of advice-giving:
To give advice to a man who asks what to do with his life implies something very close to egomania. To presume to point a man to the right and ultimate goal — to point with a trembling finger in the RIGHT direction is something only a fool would take upon himself.
And yet he honors his friend’s request, turning to Shakespeare for an anchor of his own advice:
“To be, or not to be: that is the question: Whether ’tis nobler in the mind to suffer the slings and arrows of outrageous fortune, or to take arms against a sea of troubles…”
And indeed, that IS the question: whether to float with the tide, or to swim for a goal. It is a choice we must all make consciously or unconsciously at one time in our lives. So few people understand this! Think of any decision you’ve ever made which had a bearing on your future: I may be wrong, but I don’t see how it could have been anything but a choice however indirect — between the two things I’ve mentioned: the floating or the swimming.
He acknowledges the obvious question of why not take the path of least resistance and float aimlessly, then counters it:
The answer — and, in a sense, the tragedy of life — is that we seek to understand the goal and not the man. We set up a goal which demands of us certain things: and we do these things. We adjust to the demands of a concept which CANNOT be valid. When you were young, let us say that you wanted to be a fireman. I feel reasonably safe in saying that you no longer want to be a fireman. Why? Because your perspective has changed. It’s not the fireman who has changed, but you.
Every man is the sum total of his reactions to experience. As your experiences differ and multiply, you become a different man, and hence your perspective changes. This goes on and on. Every reaction is a learning process; every significant experience alters your perspective.
So it would seem foolish, would it not, to adjust our lives to the demands of a goal we see from a different angle every day? How could we ever hope to accomplish anything other than galloping neurosis?
The answer, then, must not deal with goals at all, or not with tangible goals, anyway. It would take reams of paper to develop this subject to fulfillment. God only knows how many books have been written on “the meaning of man” and that sort of thing, and god only knows how many people have pondered the subject. (I use the term “god only knows” purely as an expression.)* There’s very little sense in my trying to give it up to you in the proverbial nutshell, because I’m the first to admit my absolute lack of qualifications for reducing the meaning of life to one or two paragraphs.
To put our faith in tangible goals would seem to be, at best, unwise. So we do not strive to be firemen, we do not strive to be bankers, nor policemen, nor doctors. WE STRIVE TO BE OURSELVES.
But don’t misunderstand me. I don’t mean that we can’t BE firemen, bankers, or doctors—but that we must make the goal conform to the individual, rather than make the individual conform to the goal. In every man, heredity and environment have combined to produce a creature of certain abilities and desires—including a deeply ingrained need to function in such a way that his life will be MEANINGFUL. A man has to BE something; he has to matter.
As I see it then, the formula runs something like this: a man must choose a path which will let his ABILITIES function at maximum efficiency toward the gratification of his DESIRES. In doing this, he is fulfilling a need (giving himself identity by functioning in a set pattern toward a set goal) he avoids frustrating his potential (choosing a path which puts no limit on his self-development), and he avoids the terror of seeing his goal wilt or lose its charm as he draws closer to it (rather than bending himself to meet the demands of that which he seeks, he has bent his goal to conform to his own abilities and desires).
In short, he has not dedicated his life to reaching a pre-defined goal, but he has rather chosen a way of life he KNOWS he will enjoy. The goal is absolutely secondary: it is the functioning toward the goal which is important. And it seems almost ridiculous to say that a man MUST function in a pattern of his own choosing; for to let another man define your own goals is to give up one of the most meaningful aspects of life — the definitive act of will which makes a man an individual.
Noting that his friend had thus far lived “a vertical rather than horizontal existence,” Thompson acknowledges the challenge of this choice but admonishes that however difficult, the choice must be made or else it melts away into those default modes of society:
A man who procrastinates in his CHOOSING will inevitably have his choice made for him by circumstance. So if you now number yourself among the disenchanted, then you have no choice but to accept things as they are, or to seriously seek something else. But beware of looking for goals: look for a way of life. Decide how you want to live and then see what you can do to make a living WITHIN that way of life. But you say, “I don’t know where to look; I don’t know what to look for.”
And there’s the crux. Is it worth giving up what I have to look for something better? I don’t know — is it? Who can make that decision but you? But even by DECIDING TO LOOK, you go a long way toward making the choice.
He ends by returning to his original disclaimer by reiterating that rather than a prescription for living, his “advice” is merely a reminder that how and what we choose — choices we’re in danger of forgetting even exist — shapes the course and experience of our lives:
I’m not trying to send you out “on the road” in search of Valhalla, but merely pointing out that it is not necessary to accept the choices handed down to you by life as you know it. There is more to it than that — no one HAS to do something he doesn’t want to do for the rest of his life.
Though most of his 77 “intuition pumps” address concrete questions, a dozen are “general-purpose” tools that apply deeply and widely, across just about any domain of thinking. The first of them is also arguably the most useful yet most uncomfortable: making mistakes.
The history of philosophy is in large measure the history of very smart people making very tempting mistakes, and if you don’t know the history, you are doomed to making the same darn mistakes all over again. … There is no such thing as philosophy-free science, just science that has been conducted without any consideration of its underlying philosophical assumptions.
He speaks for the generative potential of mistakes and their usefulness as an empirical tool:
Sometimes you don’t just want to risk making mistakes; you actually want to make them — if only to give you something clear and detailed to fix.
Therein lies the power of mistakes as a vehicle for, as Rilke famously put it, “living the questions” and thus advancing knowledge in a way that certainty cannot — for, as Richard Feynman memorably noted, the scientist’s job is to remain unsure, and so seems the philosopher’s. Dennett writes:
We philosophers are mistake specialists. … While other disciplines specialize in getting the right answers to their defining questions, we philosophers specialize in all the ways there are of getting things so mixed up, so deeply wrong, that nobody is even sure what the right questions are, let alone the answers. Asking the wrong questions risks setting any inquiry off on the wrong foot. Whenever that happens, this is a job for philosophers! Philosophy — in every field of inquiry — is what you have to do until you figure out what questions you should have been asking in the first place.
Mistakes are not just opportunities for learning; they are, in an important sense, the only opportunity for learning or making something truly new. Before there can be learning, there must be learners. There are only two non-miraculous ways for learners to come into existence: they must either evolve or be designed and built by learners that evolved. Biological evolution proceeds by a grand, inexorable process of trial and error — and without the errors the trials wouldn’t accomplish anything.
Trials can be either blind or foresighted. You, who know a lot, but not the answer to the question at hand, can take leaps — foresighted leaps. You can look before you leap, and hence be somewhat guided from the outset by what you already know. You need not be guessing at random, but don’t look down your nose at random guesses; among its wonderful products is … you!
And since evolution is the highest epitome of how the process of trial and error drives progress, Dennett makes a case for understanding evolution as a key to understanding everything else we humans value:
Evolution … is the central, enabling process not only of life but also of knowledge and learning and understanding. If you attempt to make sense of the world of ideas and meanings, free will and morality, art and science and even philosophy itself without a sound and quite detailed knowledge of evolution, you have one hand tied behind your back. … For evolution, which knows nothing, the steps into novelty are blindly taken by mutations, which are random copying “errors” in DNA.
Dennett echoes Dostoyevsky (“Above all, don’t lie to yourself. The man who lies to himself and listens to his own lie comes to a point that he cannot distinguish the truth within him, or around him, and so loses all respect for himself and for others.”) and offers the key to making productive mistakes:
The chief trick to making good mistakes is not to hide them — especially not from yourself. Instead of turning away in denial when you make a mistake, you should become a connoisseur of your own mistakes, turning them over in your mind as if they were works of art, which in a way they are. … The trick is to take advantage of the particular details of the mess you’ve made, so that your next attempt will be informed by it and not just another blind stab in the dark.
We have all heard the forlorn refrain “Well, it seemed like a good idea at the time!” This phrase has come to stand for the rueful reflection of an idiot, a sign of stupidity, but in fact we should appreciate it as a pillar of wisdom. Any being, any agent, who can truly say, “Well, it seemed like a good idea at the time!” is standing on the threshold of brilliance.
After Caroline crashes an experimental plane she was piloting, she finds herself severely injured and spiraling into the depths of depression. It both helps and doesn’t that Caroline and Wendy have just fallen in love, soaring in the butterfly heights of new romance, “the phase of love that didn’t obey any known rules of physics,” until the crash pulls them into a place that would challenge even the most seasoned and grounded of relationships. And yet they persevere as Wendy patiently and lovingly takes care of Caroline.
When Caroline returns from the hospital with a shattered ankle, her two thirteen-year-old tabbies — the shy, anxious Tibby (short for Tibia, affectionately — and, in these circumstances, ironically — named after the shinbone) and the sociable, amicable Fibby (short for Fibula, after the calf bone on the lateral side of the tibia) — are, short of Wendy, her only joy and comfort:
Tibia and Fibula meowed happily when I arrived. They were undaunted by my ensuing stupor. In fact they were delighted; suddenly I had become a human who didn’t shout into a small rectangle of lights and plastic in her hand, peer at a computer, or get up and disappear from the vicinity, only to reappear through the front door hours later. Instead, I was completely available to them at all times. Amazed by their good luck, they took full feline advantage. They asked for ear scratches and chin rubs. They rubbed their whiskers along my face. They purred in response to my slurred, affectionate baby talk. But mostly they just settled in and went to sleep. Fibby snored into my neck. Tibby snored on the rug nearby. Meanwhile I lay awake, circling the deep dark hole of depression.
Without my cats, I would have fallen right in.
And then, one day, Tibby disappears.
Wendy and Caroline proceed to flyer the neighborhood, visit every animal shelter in the vicinity, and even, in their desperation, enlist the help of a psychic who specializes in lost pets — but to no avail. Heartbroken, they begin to mourn Tibby’s loss.
And then, one day five weeks later, Tibby reappears. But once the initial elation of the recovery has worn off, Caroline begins to wonder where he’d been and why he’d left. He is now no longer eating at home and regularly leaves the house for extended periods of time — Tibby clearly has a secret place he now returns to. Even more worrisomely, he’s no longer the shy, anxious tabby he’d been for thirteen years — instead, he’s a half pound heavier, chirpy, with “a youthful spring in his step.” But why would a happy cat abandon his loving lifelong companion and find comfort — find himself, even — elsewhere?
When the relief that my cat was safe began to fade, and the joy of his prone, snoring form — sprawled like an athlete after a celebratory night of boozing — started to wear thin, I was left with darker emotions. Confusion. Jealousy. Betrayal. I thought I’d known my cat of thirteen years. But that cat had been anxious and shy. This cat was a swashbuckling adventurer back from the high seas. What siren call could have lured him away? Was he still going to this gilded place, with its overflowing food bowls and endless treats?
There is only one obvious thing left to do: Track Tibby on his escapades. So Caroline, despite Wendy’s lovingly suppressed skepticism, heads to a spy store — yes, those exist — and purchases a real-time GPS tracker, complete with a camera that they program to take snapshots every few minutes, which they then attach to Tibby’s collar.
What follows is a wild, hilarious, and sweet tale of tinkering, tracking, and tenderness. Underpinning the obsessive quest is the subtle yet palpable subplot of Wendy and Caroline’s growing love for each other, the deepening of trust and affection that happens when two people share in a special kind of insanity.
“Every quest is a journey, every journey a story. Every story, in turn, has a moral,” writes Caroline in the final chapter, then offers several “possible morals” for the story, the last of which embodies everything that makes Lost Cat an absolute treat from cover to cover:
6. You can never know your cat. In fact, you can never know anyone as completely as you want.
In 1976, Italian artist, architect, and designer Luigi Serafini, only 27 at the time, set out to create an elaborate encyclopedia of imaginary objects and creatures that fell somewhere between Edward Gorey’s cryptic alphabets, Albertus Seba’s cabinet of curiosities, the Book of Surrealist Games, and Alice in Wonderland. What’s more, it wasn’t written in any ordinary language but in an unintelligible alphabet that appeared to be a conlang — an undertaking so complex it constitutes one of the highest feats of cryptography. It took him nearly three years to complete the project, and three more to publish it, but when it was finally released, the book — a weird and wonderful masterpiece of art and philosophical provocation on the precipice of the information age — attracted a growing following that continued to gather momentum even as the original edition went out of print.
Now, for the first time in more than thirty years, Codex Seraphinianus (public library) is resurrected in a lavish new edition by Rizzoli — who have a penchant for excavating forgotten gems — featuring a new chapter by Serafini, now in his 60s, and a gorgeous signed print with each deluxe tome. Besides being a visual masterwork, it’s also a timeless meditation on what “reality” really is, one all the timelier in today’s age of such seemingly surrealist feats as bioengineering whole new lifeforms, hurling subatomic particles at each other at nearly the speed of light, and encoding an entire book onto a DNA molecule.
In an interview for Wired Italy, Serafini aptly captures the subtle similarity to children’s books in how the Codex bewitches our grown-up fancy with its bizarre beauty:
What I want my alphabet to convey to the reader is the sensation that children feel in front of books they cannot yet understand. I used it to describe analytically an imaginary world and give a coherent framework. The images originate from the clash between this fantasy vocabulary and the real world. … The Codex became so popular because it makes you feel more comfortable with your fantasies. Another world is not possible, but a fantasy one maybe is.
The [new] edition is very rich and also pricey, I know, but it’s just like psychoanalysis: Money matters and the fee is part of the process of healing. At the end of the day, the Codex is similar to the Rorschach inkblot test. You see what you want to see. You might think it’s speaking to you, but it’s just your imagination.
It tells the story of two brothers, Jack and Guy, torn asunder when a falling star crashes onto Earth. Though on the surface about the beloved author’s own brother Jack, who died 18 years ago, the story is also about the love of Sendak’s life and his partner of fifty years, psychoanalyst Eugene Glynn, whose prolonged illness and eventual loss in 2007 devastated Sendak — the character of Guy reads like a poetic fusion of Sendak and Glynn. And while the story might be a universal “love letter to those who have gone before,” as NPR’s Renee Montagne suggests in Morning Edition, it is in equal measure a private love letter to Glynn. (Sendak passed away the day before President Obama announced his support for same-sex marriage, but Sendak fans were quick to honor both historic moments with a bittersweet homage.)
Indeed, the theme of all-consuming love manifests viscerally in Sendak’s books. Playwright Tony Kushner, a longtime close friend of Sendak’s and one of his most heartfelt mourners, tells NPR:
There’s a lot of consuming and devouring and eating in Maurice’s books. And I think that when people play with kids, there’s a lot of fake ferocity and threats of, you know, devouring — because love is so enormous, the only thing you can think of doing is swallowing the person that you love entirely.
My Brother’s Book ends on a soul-stirring note, tender and poignant in its posthumous light:
And Jack slept safe
Enfolded in his brother’s arms
And Guy whispered ‘Good night
And you will dream of me.’
No female reporter before her had ever seemed quite so audacious, so willing to risk personal safety in pursuit of a story. In her first exposé for The World, Bly had gone undercover … feigning insanity so that she might report firsthand on the mistreatment of the female patients of the Blackwell’s Island Insane Asylum. … Bly trained with the boxing champion John L. Sullivan; she performed, with cheerfulness but not much success, as a chorus girl at the Academy of Music (forgetting the cue to exit, she momentarily found herself all alone onstage). She visited with a remarkable deaf, dumb, and blind nine-year-old girl in Boston by the name of Helen Keller. Once, to expose the workings of New York’s white slave trade, she even bought a baby. Her articles were by turns lighthearted and scolding and indignant, some meant to edify and some merely to entertain, but all were shot through with Bly’s unmistakable passion for a good story and her uncanny ability to capture the public’s imagination, the sheer force of her personality demanding that attention be paid to the plight of the unfortunate, and, not incidentally, to herself.
For all her extraordinary talent and work ethic, Bly’s appearance was decidedly unremarkable — a fact that shouldn’t matter, but one that would be repeatedly remarked upon by her critics and commentators, something we’ve made sadly little progress on in discussing women’s professional, intellectual, and creative merit more than a century later. Goodman paints a portrait of Bly:
She was a young woman in a plaid coat and cap, neither tall nor short, dark nor fair, not quite pretty enough to turn a head: the sort of woman who could, if necessary, lose herself in a crowd.
Her voice rang with the lilt of the hill towns of western Pennsylvania; there was an unusual rising inflection at the ends of her sentences, the vestige of an Elizabethan dialect that had still been spoken in the hills when she was a girl. She had piercing gray eyes, though sometimes they were called green, or blue-green, or hazel. Her nose was broad at its base and delicately upturned at the end — the papers liked to refer to it as a “retroussé” nose — and it was the only feature about which she was at all self-conscious. She had brown hair that she wore in bangs across her forehead. Most of those who knew her considered her pretty, although this was a subject that in the coming months would be hotly debated in the press.
But, as if the ambitious adventure weren’t scintillating enough, the story takes an unexpected turn: That fateful November morning, as Bly was making her way to the journey’s outset at the Hoboken docks, a man named John Brisben Walker passed her on a ferry in the opposite direction, traveling from Jersey City to Lower Manhattan. He was the publisher of a high-brow magazine titled The Cosmopolitan, the same publication that decades later, under the new ownership of William Randolph Hearst, would take a dive for the commercially low-brow. On his ferry ride, Walker skimmed that morning’s edition of The World and paused over the front-page feature announcing Bly’s planned adventure around the world. A seasoned media manipulator of the public’s voracious appetite for drama, he instantly birthed an idea that would seize upon a unique publicity opportunity — The Cosmopolitan would send another circumnavigator to race against Bly. To keep things equal, it would have to be a woman. To keep them interesting, she’d travel in the opposite direction.
And so it went:
Elizabeth Bisland was twenty-eight years old, and after nearly a decade of freelance writing she had recently obtained a job as literary editor of The Cosmopolitan, for which she wrote a monthly review of recently published books entitled “In the Library.” Born into a Louisiana plantation family ruined by the Civil War and its aftermath, at the age of twenty she had moved to New Orleans and then, a few years later, to New York, where she contributed to a variety of magazines and was regularly referred to as the most beautiful woman in metropolitan journalism. Bisland was tall, with an elegant, almost imperious bearing that accentuated her height; she had large dark eyes and luminous pale skin and spoke in a low, gentle voice. She reveled in gracious hospitality and smart conversation, both of which were regularly on display in the literary salon that she hosted in the little apartment she shared with her sister on Fourth Avenue, where members of New York’s creative set, writers and painters and actors, gathered to discuss the artistic issues of the day. Bisland’s particular combination of beauty, charm, and erudition seems to have been nothing short of bewitching.
She took pride in the fact that she had arrived in New York with only fifty dollars in her pocket, and that the thousands of dollars now in her bank account had come by virtue of her own pen. Capable of working for eighteen hours at a stretch, she wrote book reviews, essays, feature articles, and poetry in the classical vein. She was a believer, more than anything else, in the joys of literature, which she had first experienced as a girl in ancient volumes of Shakespeare and Cervantes that she found in the library of her family’s plantation house. (She taught herself French while she churned butter, so that she might read Rousseau’s Confessions in the original — a book, as it turned out, that she hated.) She cared nothing for fame, and indeed found the prospect of it distasteful.
And yet, despite their competitive circumstances and seemingly divergent dispositions, something greater bound the two women together, some ineffable force of culture that quietly united them in a bold defiance of their era’s normative biases:
On the surface the two women … were about as different as could be: one woman a Northerner, the other from the South; one a scrappy, hard-driving crusader, the other priding herself on her gentility; one seeking out the most sensational of news stories, the other preferring novels and poetry and disdaining much newspaper writing as “a wild, crooked, shrieking hodge-podge,” a “caricature of life.” Elizabeth Bisland hosted tea parties; Nellie Bly was known to frequent O’Rourke’s saloon on the Bowery. But each of them was acutely conscious of the unequal position of women in America. Each had grown up without much money and had come to New York to make a place for herself in big-city journalism, achieving a hard-won success in what was still, unquestionably, a man’s world.
Originally featured in May — read the full article, including Bly’s entertaining illustrated packing list, here.
13. DON’T GO BACK TO SCHOOL
“The present education system is the trampling of the herd,” legendary architect Frank Lloyd Wright lamented in 1956. Half a century later, I started Brain Pickings in large part out of frustration and disappointment with my trampling experience of our culturally fetishized “Ivy League education.” I found myself intellectually and creatively unstimulated by the industrialized model of the large lecture hall, the PowerPoint presentations, the standardized tests assessing my rote memorization of facts rather than my ability to transmute that factual knowledge into a pattern-recognition mechanism that connects different disciplines to cultivate wisdom about how the world works and a moral lens on how it should work. So Brain Pickings became the record of my alternative learning, of that cross-disciplinary curiosity that took me from art to psychology to history to science, by way of the myriad pieces of knowledge I discovered — and connected — on my own. I didn’t live up to the entrepreneurial ideal of the college drop-out and begrudgingly graduated “with honors,” but refused to go to my own graduation and decided never to go back to school. Years later, I’ve learned more in the course of writing and researching the thousands of articles to date than in all the years of my formal education combined.
So, in 2012, when I found out that writer Kio Stark was crowdfunding a book that would serve as a manifesto for learning outside formal education, I eagerly chipped in. Now, Don’t Go Back to School: A Handbook for Learning Anything is out and is everything I could’ve wished for when I was in college — an essential piece of cultural literacy, at once a tantalizing and practically grounded assurance that success doesn’t lie at the end of a single highway but is sprinkled along a thousand alternative paths. Stark describes it as “a radical project, the opposite of reform … not about fixing school [but] about transforming learning — and making traditional school one among many options rather than the only option.” Through a series of interviews with independent learners who have reached success and happiness in fields as diverse as journalism, illustration, and molecular biology, Stark — who herself dropped out of a graduate program at Yale, despite being offered a prestigious fellowship — cracks open the secret to defining your own success and finding your purpose outside the factory model of formal education. She notes the patterns that emerge:
People who forgo school build their own infrastructures. They create and borrow and reinvent the best that formal schooling has to offer, and they leave the worst behind. That buys them the freedom to learn on their own terms.
From their stories, you’ll see that when you step away from the prepackaged structure of traditional education, you’ll discover that there are many more ways to learn outside school than within.
Reflecting on her own exit from academia, Stark articulates a much more broadly applicable insight:
A gracefully executed quit is a beautiful thing, opening up more doors than it closes.
But despite discovering in dismay that “liberal arts graduate school is professional school for professors,” which she had no interest in becoming, Stark did learn something immensely valuable from her third year of independent study, during which she read about 200 books of her own choosing:
I learned how to teach myself. I had to make my own reading lists for the exams, which meant I learned how to take a subject I was interested in and make myself a map for learning it.
The interviews revealed four common threads: learning is collaborative rather than done alone; the importance of academic credentials in many professions is declining; the most fulfilling learning tends to take place outside of school; and those happiest about learning are those who learn out of intrinsic motivation rather than in pursuit of extrinsic rewards. The first of these insights, of course, appears on the surface to contradict the very notion of “independent learning,” but Stark offers an eloquent semantic caveat:
Independent learning suggests ideas such as “self-taught,” or “autodidact.” These imply that independence means working solo. But that’s just not how it happens. People don’t learn in isolation. When I talk about independent learners, I don’t mean people learning alone. I’m talking about learning that happens independent of schools.
Anyone who really wants to learn without school has to find other people to learn with and from. That’s the open secret of learning outside of school. It’s a social act. Learning is something we do together.
Independent learners are interdependent learners.
Much of the argument for formal education rests on statistics indicating that people with college and graduate degrees earn more. But those statistics, Stark notes, suffer an important and rarely heeded bias:
The problem is that this statistic is based on long-term data, gathered from a period of moderate loan debt, easy employability, and annual increases in the value of a college degree. These conditions have been the case for college grads for decades. Given the dramatically changed circumstances grads today face, we already know that the trends for debt, employability, and the value of a degree have all degraded, and we cannot assume the trend toward greater lifetime earnings will hold true for the current generation. This is a critical omission from media coverage. The fact is we do not know. There’s absolutely no guarantee it will hold true.
Some heartening evidence suggests the blind reliance on degrees might be beginning to change. Stark cites Zappos CEO Tony Hsieh:
I haven’t looked at a résumé in years. I hire people based on their skills and whether or not they are going to fit our culture.
Another common argument for formal education extols the alleged advantages of its structure, proposing that homework assignments, reading schedules, and regular standardized testing would motivate you to learn with greater rigor. But, as Daniel Pink has written about the psychology of motivation, in school, as in work, intrinsic drives far outweigh extrinsic, carrots-and-sticks paradigms of reward and punishment, rendering this argument unsound. Stark writes:
Learning outside school is necessarily driven by an internal engine. … [I]ndependent learners stick with the reading, thinking, making, and experimenting by which they learn because they do it for love, to scratch an itch, to satisfy curiosity, following the compass of passion and wonder about the world.
So how can you best fuel that internal engine of learning outside the depot of formal education? Stark offers an essential insight, which places self-discovery at the heart of acquiring external knowledge:
Learning your own way means finding the methods that work best for you and creating conditions that support sustained motivation. Perseverance, pleasure, and the ability to retain what you learn are among the wonderful byproducts of getting to learn using methods that suit you best and in contexts that keep you going. Figuring out your personal approach to each of these takes trial and error.
For independent learners, it’s essential to find the process and methods that match your instinctual tendencies as a learner. Everyone I talked to went through a period of experimenting and sorting out what works for them, and they’ve become highly aware of their own preferences. They’re clear that learning by methods that don’t suit them shuts down their drive and diminishes their enjoyment of learning. Independent learners also find that their preferred methods are different for different areas. So one of the keys to success and enjoyment as an independent learner is to discover how you learn.
School isn’t very good at dealing with the multiplicity of individual learning preferences, and it’s not very good at helping you figure out what works for you.
Any young child you observe displays these traits. But passion and curiosity can be easily lost. School itself can be a primary cause; arbitrary motivators such as grades leave little room for variation in students’ abilities and interests, and fail to reward curiosity itself. There are also significant social factors working against children’s natural curiosity and capacity for learning, such as family support or the lack of it, or a degree of poverty that puts families in survival mode with little room to nurture curiosity.
Stark returns to the question of motivators that do work, once again calling to mind Pink’s advocacy of autonomy, mastery, and purpose as the trifecta of success. She writes:
[T]hree broadly defined elements of the learning experience support internal motivation and the persistence it enables. Internal motivation relies on learners having autonomy in their learning, a progressing sense of competence in their skills and knowledge, and the ability to learn in a concrete or “real world” context rather than in the abstract. These are mostly absent from classroom learning. Autonomy is rare, useful context is absent, and school’s means for affirming competence often feel so arbitrary as to be almost without use — and are sometimes actively demotivating. … [A]utonomy means that you follow your own path. You learn what you want to learn, when and how you want to learn it, for your own reasons. Your impetus to learn comes from within because you control the conditions of your learning rather than working within a structure that’s pre-made and inflexible.
The second thing you need to stick with learning independently is to set your own goals toward an increasing sense of competence. You need to create a feedback loop that confirms your work is worth it and keeps you moving forward. In school this is provided by advancing through the steps of the linear path within an individual class or a set curriculum, as well as from feedback from grades and praise.
But Stark found that outside of school, those most successful at learning sought their sense of competence through alternative sources. Many, like James Mangan advised in his 1936 blueprint to acquiring knowledge, solidified their learning by teaching it to other people, increasing their own sense of mastery and deepening their understanding. Others centered their learning around specific projects, which enabled them to make progress more modular and thus more attainable. Another cohort cited failure as an essential part of the road to mastery. Stark continues:
The third thing [that] can make or break your ability to sustain internal motivation … is to situate what you’re learning in a context that matters to you. In some cases, the context is a specific project you want to accomplish, which … also functions to support your sense of progress.
She sums up the failings of the establishment:
School is not designed to offer these three conditions; autonomy and context are sorely lacking in classrooms. School can provide a sense of increasing mastery, via grades and moving from introductory classes to harder ones. But a sense of true competence is harder to come by in a school environment. Fortunately, there are professors in higher education who are working to change the motivational structures that underlie their curricula.
The interviews, to be sure, offer a remarkably diverse array of callings, underpinned by a number of shared values and common characteristics. Computational biologist Florian Wagner, for instance, echoes Steve Jobs’s famous words on the secret of life in articulating a sentiment shared by many of the other interviewees:
There is something really special about when you first realize you can figure out really cool things completely on your own. That alone is a valuable lesson in life.
Investigative journalist Quinn Norton subscribes to Mangan’s prescription for learning by teaching:
I ended up teaching [my] knowledge to others at the school. That’s one of my most effective ways to learn, by teaching; you just have to stay a week ahead of your students. … Everything I learned, I immediately turned around and taught to others.
When I wanted to learn something new as a professional writer, I’d pitch a story on it. I was interested in neurology, and I figured, why don’t I start interviewing neurologists? The great thing about being a journalist is that you can pick up the phone and talk to anybody. It was just like what I found out about learning from experts on mailing lists. People like to talk about what they know.
I’m stuffed with trivial, useless knowledge, on a panoply of bizarre topics, so I can find something that they’re interested in that I know something about. Being able to do that is tremendously socially valuable. The exchange of knowledge is a very human way to learn. I try never to walk into a room where I want to get information without knowing what I’m bringing to the other person.
I think part of the problem with the usual mindset of the student is that it’s like being a sponge. It’s passive. It’s not about having something to bring to the interaction. People who are experts in things are experts because they like learning.
Software engineer, artist, and University of Texas molecular biologist Zack Booth Simpson speaks to the value of cultivating what William Gibson has called “a personal micro-culture” and learning from the people with whom you surround yourself:
In a way, the best education you can get is just talking with people who are really smart and interested in things, and you can get that for the cost of lunch.
I was … a constant reader. At home, I lived next to this thrift store that sold paperbacks for 10¢ apiece so I would go and buy massive stacks of paperback books on everything. Everything from trashy 1970s romance novels to Plato. When I went to Europe, I brought with me every single book that I didn’t think I would read voluntarily, because I figured if I was on a bus ride, I would read them. So I read Plato and Dante’s Inferno, and all types of literature. I got my education on the bus.
I need to tell a story. It’s an obsession. Each story is a seed inside of me that starts to grow and grow, like a tumor, and I have to deal with it sooner or later. Why a particular story? I don’t know when I begin. That I learn much later. Over the years I’ve discovered that all the stories I’ve told, all the stories I will ever tell, are connected to me in some way. If I’m talking about a woman in Victorian times who leaves the safety of her home and comes to the Gold Rush in California, I’m really talking about feminism, about liberation, about the process I’ve gone through in my own life, escaping from a Chilean, Catholic, patriarchal, conservative, Victorian family and going out into the world.
I start all my books on January eighth. Can you imagine January seventh? It’s hell. Every year on January seventh, I prepare my physical space. I clean up everything from my other books. I just leave my dictionaries, and my first editions, and the research materials for the new one. And then on January eighth I walk seventeen steps from the kitchen to the little pool house that is my office. It’s like a journey to another world. It’s winter, it’s raining usually. I go with my umbrella and the dog following me. From those seventeen steps on, I am in another world and I am another person. I go there scared. And excited. And disappointed — because I have a sort of idea that isn’t really an idea. The first two, three, four weeks are wasted. I just show up in front of the computer. Show up, show up, show up, and after a while the muse shows up, too. If she doesn’t show up invited, eventually she just shows up.
She offers three pieces of advice for aspiring writers:
It’s worth the work to find the precise word that will create a feeling or describe a situation. Use a thesaurus, use your imagination, scratch your head until it comes to you, but find the right word.
When you feel the story is beginning to pick up rhythm—the characters are shaping up, you can see them, you can hear their voices, and they do things that you haven’t planned, things you couldn’t have imagined—then you know the book is somewhere, and you just have to find it, and bring it, word by word, into this world.
When you tell a story in the kitchen to a friend, it’s full of mistakes and repetitions. It’s good to avoid that in literature, but still, a story should feel like a conversation. It’s not a lecture.
Celebrated journalist and New Yorker staff writer Susan Orlean considers the critical difference between fiction and nonfiction, exploring the osmotic balance of escapism and inner stillness:
When it comes to nonfiction, it’s important to note the very significant difference between the two stages of the work. Stage one is reporting. Stage two is writing.
Reporting is like being the new kid in school. You’re scrambling to learn something very quickly, being a detective, figuring out who the people are, dissecting the social structure of the community you’re writing about. Emotionally, it puts you in the place that everybody dreads. You’re the outsider. You can’t give in to your natural impulse to run away from situations and people you don’t know. You can’t retreat to the familiar.
Writing is exactly the opposite. It’s private. The energy of it is so intense and internal, it sometimes makes you feel like you’re going to crumple. A lot of it happens invisibly. When you’re sitting at your desk, it looks like you’re just sitting there, doing nothing.
A necessary antidote to the tortured-genius cultural mythology of the writer, Orlean, like Ray Bradbury, conceives of writing as a source of joy, even when challenging:
Writing gives me great feelings of pleasure. There’s a marvelous sense of mastery that comes with writing a sentence that sounds exactly as you want it to. It’s like trying to write a song, making tiny tweaks, reading it out loud, shifting things to make it sound a certain way. It’s very physical. I get antsy. I jiggle my feet a lot, get up a lot, tap my fingers on the keyboard, check my e-mail. Sometimes it feels like digging out of a hole, but sometimes it feels like flying. When it’s working and the rhythm’s there, it does feel like magic to me.
She ends with four pieces of wisdom for writers:
You have to simply love writing, and you have to remind yourself often that you love it.
You should read as much as possible. That’s the best way to learn how to write.
You have to appreciate the spiritual component of having an opportunity to do something as wondrous as writing. You should be practical and smart and you should have a good agent and you should work really, really hard. But you should also be filled with awe and gratitude about this amazing way to be in the world.
Don’t be ashamed to use the thesaurus. I could spend all day reading Roget’s! There’s nothing better when you’re in a hurry and you need the right word right now.
Before I wrote my first book in 1989, the sum total of my earnings as a writer, over four years of freelancing, was about three thousand bucks. So it did appear to be financial suicide when I quit my job at Salomon Brothers — where I’d been working for a couple of years, and where I’d just gotten a bonus of $225,000, which they promised they’d double the following year — to take a $40,000 book advance for a book that took a year and a half to write.
My father thought I was crazy. I was twenty-seven years old, and they were throwing all this money at me, and it was going to be an easy career. He said, “Do it another ten years, then you can be a writer.” But I looked around at the people on Wall Street who were ten years older than me, and I didn’t see anyone who could have left. You get trapped by the money. Something dies inside. It’s very hard to preserve the quality in a kid that makes him jump out of a high-paying job to go write a book.
More than a living, Lewis found in writing a true calling — the kind of deep flow that fully absorbs the mind and soul:
There’s no simple explanation for why I write. It changes over time. There’s no hole inside me to fill or anything like that, but once I started doing it, I couldn’t imagine wanting to do anything else for a living. I noticed very quickly that writing was the only way for me to lose track of the time.
I used to get the total immersion feeling by writing at midnight. The day is not structured to write, and so I unplug the phones. I pull down the blinds. I put my headset on and play the same soundtrack of twenty songs over and over and I don’t hear them. It shuts everything else out. So I don’t hear myself as I’m writing and laughing and talking to myself. I’m not even aware I’m making noise. I’m having a physical reaction to a very engaging experience. It is not a detached process.
“Art suffers the moment other people start paying for it,” Hugh MacLeod famously wrote. It might be an overly cynical notion, one that perpetuates the unjustified yet deep-seated cultural guilt over simultaneously doing good and doing well, but Lewis echoes the sentiment:
Once you have a career, and once you have an audience, once you have paying customers, the motives for doing it just change.
And yet Lewis approaches the friction between intrinsic and extrinsic motivation — one experienced by anyone who loves what they do and takes pride in clarity of editorial vision, but has an audience whose approval or disapproval becomes increasingly challenging to tune out — with extraordinary candor and insight:
Commercial success makes writing books a lot easier to do, and it also creates pressure to be more of a commercial success. If you sold a million books once, your publisher really, really thinks you might sell a million books again. And they really want you to do it.
That dynamic has the possibility of constraining the imagination. There are invisible pressures. There’s a huge incentive to write about things that you know will sell. But I don’t find myself thinking, “I can’t write about that because it won’t sell.” It’s such a pain in the ass to write a book, I can’t imagine writing one if I’m not interested in the subject.
And yet his clarity of vision is still what guides the best of his work:
Those are the best moments, when I’ve got the whale on the line, when I see exactly what it is I’ve got to do.
After that moment there’s always misery. It never goes quite like you think, but that moment is a touchstone, a place to come back to. It gives you a kind of compass to guide you through the story.
That feeling has never done me wrong. Sometimes you don’t understand the misery it will lead to, but it’s always been right to feel it. And it’s a great feeling.
It’s always good to have a motive to get you in the chair. If your motive is money, find another one.
I took my biggest risk when I walked away from a lucrative job at age twenty-seven to be a writer. I’m glad I was too young to realize what a dumb decision it seemed to be, because it was the right decision for me.
A lot of my best decisions were made in a state of self-delusion. When you’re trying to create a career as a writer, a little delusional thinking goes a long way.
We seem to have a strange but all too human cultural fixation on the daily routines and daily rituals of famous creators, from Vonnegut to Burroughs to Darwin — as if a glimpse of their day-to-day would somehow magically infuse ours with equal potency, or replicating it would allow us to replicate their genius in turn. And though much of this is mere cultural voyeurism, there is something to be said for the value of a well-engineered daily routine to anchor the creative process. Manage Your Day-to-Day: Build Your Routine, Find Your Focus, and Sharpen Your Creative Mind (public library), edited by Behance’s 99U editor-in-chief Jocelyn Glei and featuring contributions from twenty of today’s most celebrated thinkers and doers, delves into the secrets of this holy grail of creativity.
Reflecting Thomas Edison’s oft-cited proclamation that “genius is one percent inspiration, ninety-nine percent perspiration,” after which 99U is named, the crucial importance of consistent application is a running theme. (Though I prefer to paraphrase Edison to “Genius is one percent inspiration, ninety-nine percent aspiration” — since true aspiration produces effort that feels gratifying rather than merely grueling, enhancing the grit of perspiration with the gift of gratification.)
We tend to overestimate what we can do in a short period, and underestimate what we can do over a long period, provided we work slowly and consistently. Anthony Trollope, the nineteenth-century writer who managed to be a prolific novelist while also revolutionizing the British postal system, observed, “A small daily task, if it be really daily, will beat the labours of a spasmodic Hercules.” Over the long run, the unglamorous habit of frequency fosters both productivity and creativity.
You’re much more likely to spot surprising relationships and to see fresh connections among ideas, if your mind is constantly humming with issues related to your work. When I’m deep in a project, everything I experience seems to relate to it in a way that’s absolutely exhilarating. The entire world becomes more interesting. That’s critical, because I have a voracious need for material, and as I become hyperaware of potential fodder, ideas pour in. By contrast, working sporadically makes it hard to keep your focus. It’s easy to become blocked, confused, or distracted, or to forget what you were aiming to accomplish.
Creativity arises from a constant churn of ideas, and one of the easiest ways to encourage that fertile froth is to keep your mind engaged with your project. When you work regularly, inspiration strikes regularly.
Step by step, you make your way forward. That’s why practices such as daily writing exercises or keeping a daily blog can be so helpful. You see yourself do the work, which shows you that you can do the work. Progress is reassuring and inspiring; panic and then despair set in when you find yourself getting nothing done day after day. One of the painful ironies of work life is that the anxiety of procrastination often makes people even less likely to buckle down in the future.
I have a long list of “Secrets of Adulthood,” the lessons I’ve learned as I’ve grown up, such as: “It’s the task that’s never started that’s more tiresome,” “The days are long, but the years are short,” and “Always leave plenty of room in the suitcase.” One of my most helpful Secrets is, “What I do every day matters more than what I do once in a while.”
Day by day, we build our lives, and day by day, we can take steps toward making real the magnificent creations of our imaginations.
Entrepreneurship guru and culture sage Seth Godin seconds Rubin and admonishes against confusing vacant ritualization with creative rituals that actually spur productivity:
Everybody who does creative work has figured out how to deal with their own demons to get their work done. There is no evidence that setting up your easel like Van Gogh makes you paint better. Tactics are idiosyncratic. But strategies are universal, and there are a lot of talented folks who are not succeeding the way they want to because their strategies are broken.
The strategy is simple, I think. The strategy is to have a practice, and what it means to have a practice is to regularly and reliably do the work in a habitual way.
There are many ways you can signify to yourself that you are doing your practice. For example, some people wear a white lab coat or a particular pair of glasses, or always work in a specific place — in doing these things, they are professionalizing their art.
The notion that I do my work here, now, like this, even when I do not feel like it, and especially when I do not feel like it, is very important. Because lots and lots of people are creative when they feel like it, but you are only going to become a professional if you do it when you don’t feel like it. And that emotional waiver is why this is your work and not your hobby.
“At its best, the sensation of writing is that of any unmerited grace,” Annie Dillard famously observed, adding the quintessential caveat, “It is handed to you, but only if you look for it. You search, you break your heart, your back, your brain, and then — and only then — it is handed to you.” And yet, Zadie Smith admonished in her 10 rules of writing, it’s perilous to romanticize the “vocation of writing”: “There is no ‘writer’s lifestyle.’ All that matters is what you leave on the page.”
Still, surely there must be more to it than that — whole worlds rise and fall, entire universes blossom and die daily in that enchanted space between the writer’s sensation of writing and the word’s destiny of being written on a page. For all that’s been mulled about the writing life and its perpetual osmosis of everyday triumphs and tragedies, its existential feats and failures, at its heart remains an immutable mystery — how can a calling be at once so transcendent and so soul-crushing, and what is it that enthralls so many souls into its paradoxical grip, into feeling compelled to write “not because they can but because they have to”? That, and oh so much more, is what Dani Shapiro explores in Still Writing: The Pleasures and Perils of a Creative Life (public library) — her magnificent memoir of the writing life, at once disarmingly personal and brimming with widely resonant wisdom on the most universal challenges and joys of writing.
Shapiro opens with the kind of crisp conviction that underpins the entire book:
Everything you need to know about life can be learned from a genuine and ongoing attempt to write.
Far from a lazy aphorism, however, this proclamation comes from her own hard-earned experience — fragments of which resonate deeply with most of us, on one level or another — that Shapiro synthesizes beautifully:
When I wasn’t writing, I was reading. And when I wasn’t writing or reading, I was staring out the window, lost in thought. Life was elsewhere — I was sure of it — and writing was what took me there. In my notebooks, I escaped an unhappy and lonely childhood. I tried to make sense of myself. I had no intention of becoming a writer. I didn’t know that becoming a writer was possible. Still, writing was what saved me. It presented me with a window into the infinite. It allowed me to create order out of chaos.
The writing life requires courage, patience, persistence, empathy, openness, and the ability to deal with rejection. It requires the willingness to be alone with oneself. To be gentle with oneself. To look at the world without blinders on. To observe and withstand what one sees. To be disciplined, and at the same time, take risks. To be willing to fail — not just once, but again and again, over the course of a lifetime. “Ever tried, ever failed,” Samuel Beckett once wrote. “No matter. Try again. Fail again. Fail better.” It requires what the great editor Ted Solotoroff once called endurability.
We are all unsure of ourselves. Every one of us walking the planet wonders, secretly, if we are getting it wrong. We stumble along. We love and we lose. At times, we find unexpected strength, and at other times, we succumb to our fears. We are impatient. We want to know what’s around the corner, and the writing life won’t offer us this. It forces us into the here and now. There is only this moment, when we put pen to page.
The page is your mirror. What happens inside you is reflected back. You come face-to-face with your own resistance, lack of balance, self-loathing, and insatiable ego—and also with your singular vision, guts, and fortitude. No matter what you’ve achieved the day before, you begin each day at the bottom of the mountain. … Life is usually right there, though, ready to knock us over when we get too sure of ourselves. Fortunately, if we have learned the lessons that years of practice have taught us, when this happens, we endure. We fail better. We sit up, dust ourselves off, and begin again.
What is it about writing that makes it—for some of us — as necessary as breathing? It is in the thousands of days of trying, failing, sitting, thinking, resisting, dreaming, raveling, unraveling that we are at our most engaged, alert, and alive. Time slips away. The body becomes irrelevant. We are as close to consciousness itself as we will ever be. This begins in the darkness. Beneath the frozen ground, buried deep below anything we can see, something may be taking root. Stay there, if you can. Don’t resist. Don’t force it, but don’t run away. Endure. Be patient. The rewards cannot be measured. Not now. But whatever happens, any writer will tell you: This is the best part.
If I dismiss the ordinary — waiting for the special, the extreme, the extraordinary to happen — I may just miss my life.
To allow ourselves to spend afternoons watching dancers rehearse, or sit on a stone wall and watch the sunset, or spend the whole weekend rereading Chekhov stories—to know that we are doing what we’re supposed to be doing — is the deepest form of permission in our creative lives. The British author and psychologist Adam Phillips has noted, “When we are inspired, rather like when we are in love, we can feel both unintelligible to ourselves and most truly ourselves.” This is the feeling I think we all yearn for, a kind of hyperreal dream state. We read Emily Dickinson. We watch the dancers. We research a little known piece of history obsessively. We fall in love. We don’t know why, and yet these moments form the source from which all our words will spring.
As curious as these habits are, however, Johnson reminds us that public intellectuals often engineer their own myths, which means the quirky behaviors recorded in history’s annals should be taken with a grain of Salinger salt. She offers a necessary disclaimer, enveloped in a thoughtful meta-disclaimer:
One must always keep in mind that these writers and the people around them may have, at some point, embellished the facts. Quirks are great fodder for gossip and can morph into gross exaggeration when passed from one person to the next. There’s also no way to escape the self-mythologizing, particularly when dealing with some of the greatest storytellers that ever lived. Yet even when authors stretch the truth, they reveal something about themselves, whether it is the desire to project a certain image or the need to shy away from one.
Mode and medium of writing seem to be a recurring theme of personal idiosyncrasy. Wallace Stevens composed his poetry on slips of paper while walking — an activity he, like Maira Kalman, saw as a creative stimulant — then handed them to his secretary to type up. Edgar Allan Poe, champion of marginalia, wrote his final drafts on separate pieces of paper attached into a running scroll with sealing wax. Jack Kerouac was especially partial to scrolling: In 1951, after years of planning the book and amassing ample notes in his journals, he wrote On The Road in one feverish burst, letting it pour onto pages taped together into one enormously long strip of paper — a format he thought lent itself particularly well to his project, since it allowed him to maintain his rapid pace without pausing to reload the typewriter at the end of each page. When he was done, he marched into his editor Robert Giroux’s office and proudly spun out the scroll across the floor. The result, however, was equal parts comical and tragic:
To [Kerouac’s] dismay, Giroux focused on the unusual packaging. He asked, “But Jack, how can you make corrections on a manuscript like that?” Giroux recalled saying, “Jack, you know you have to cut this up. It has to be edited.” Kerouac left the office in a rage. It took several years for Kerouac’s agent, Sterling Lord, to finally find a home for the book, at the Viking Press.
James Joyce wrote lying on his stomach in bed, with a large blue pencil, clad in a white coat, and composed most of Finnegans Wake with crayon pieces on cardboard. But this was a matter more of pragmatism than of superstition or vain idiosyncrasy: Of the many outrageously misguided myths about the celebrated author of Ulysses and wordsmith of little-known children’s books, one was actually right: he was nearly blind. His childhood myopia developed into severe eye problems by his twenties. To make matters worse, he developed rheumatic fever when he was twenty-five, which resulted in a painful eye condition called iritis. By 1930, he had undergone twenty-five eye surgeries, none of which improved his sight. The large crayons thus helped him see what he was writing, and the white coat helped reflect more light onto the page at night. (As someone partial to black bedding, not for aesthetic reasons but because I believe it provides a deeper dark at night, I can certainly relate to Joyce’s seemingly arbitrary but actually physics-driven attire choice.)
Virginia Woolf was as opinionated about the right way to write as she was about the right way to read. In her twenties, she spent two and a half hours every morning writing, at a three-and-a-half-foot-tall desk with an angled top that allowed her to look at her work both up close and from afar. But according to her nephew and irreverent collaborator, Quentin Bell, Woolf’s prescient version of today’s trendy standing desk was less a practical matter than a symptom of her sibling rivalry with her sister, the Bloomsbury artist Vanessa Bell — the same sibling rivalry that would later inspire a charming picture-book: Vanessa painted standing, and Virginia didn’t want to be outdone by her sister. Johnson cites Quentin, who was known for his wry family humor:
This led Virginia to feel that her own pursuit might appear less arduous than that of her sister unless she set matters on a footing of equality.
Many authors measured the quality of their output by uncompromisingly quantitative metrics like daily word quotas. Jack London wrote 1,000 words a day every single day of his career, and William Golding once declared at a party that he wrote 3,000 words daily, a number Norman Mailer and Arthur Conan Doyle shared. Raymond Chandler, a man of strong opinions on the craft of writing, didn’t subscribe to a specific daily quota, but was known to write up to 5,000 words a day at his most productive. Anthony Trollope, who began promptly at 5:30 every morning, disciplined himself to write 250 words every 15 minutes, pacing himself with a watch. Stephen King does whatever it takes to reach his daily quota of 2,000 adverbless words, and Thomas Wolfe kept his at 1,800, not letting himself stop until he had reached it.
We already know how much famous authors loved their pets, but for many their non-human companions were essential to the creative process. Edgar Allan Poe considered his darling tabby named Catterina his literary guardian who “purred as if in complacent approval of the world proceeding under [her] supervision.” Flannery O’Connor developed an early affection for domestic poultry, from her childhood chicken (which, curiously enough, could walk backwards and once ended up in a newsreel clip) to her growing collection of pheasants, ducks, turkeys, and quail. Most famously, however, twenty-something O’Connor mail-ordered six peacocks, a peahen, and four peachicks, which later populated her fiction. But by far the most bizarre pet-related habit comes from Colette, who enlisted her dog in a questionable procrastination mechanism:
Colette would study the fur of her French bulldog, Souci, with a discerning eye. Then she’d pluck a flea from Souci’s back and would continue the hunt until she was ready to write.
But arguably the strangest habit of all comes from Friedrich Schiller, relayed by his friend Goethe:
[Goethe] had dropped by Schiller’s home and, after finding that his friend was out, decided to wait for him to return. Rather than wasting a few spare moments, the productive poet sat down at Schiller’s desk to jot down a few notes. Then a peculiar stench prompted Goethe to pause. Somehow, an oppressive odor had infiltrated the room.
Goethe followed the odor to its origin, which was actually right by where he sat. It was emanating from a drawer in Schiller’s desk. Goethe leaned down, opened the drawer, and found a pile of rotten apples. The smell was so overpowering that he became light-headed. He walked to the window and breathed in a few good doses of fresh air. Goethe was naturally curious about the trove of trash, though Schiller’s wife, Charlotte, could only offer the strange truth: Schiller had deliberately let the apples spoil. The aroma, somehow, inspired him, and according to his spouse, he “could not live or work without it.”
Then there was the color-coding of the muses: In addition to his surprising gastronome streak, Alexandre Dumas was also an aesthete. For decades, he penned all of his fiction on a particular shade of blue paper, his poetry on yellow, and his articles on pink; on one occasion, while traveling in Europe, he ran out of his precious blue paper and was forced to write on a cream-colored pad, which he was convinced made his fiction suffer. Charles Dickens was partial to blue ink, but not for superstitious reasons — because it dried faster than other colors, it allowed him to pen his fiction and letters without the drudgery of blotting. Virginia Woolf used different-colored inks in her pens — greens, blues, and purples. Purple was her favorite, reserved for letters (including her love letters to Vita Sackville-West), diary entries, and manuscript drafts. Lewis Carroll also preferred purple ink (and shared with Woolf a penchant for standing desks), but for much more pragmatic reasons: During his years teaching mathematics at Oxford, he, like all teachers there, was expected to use purple ink to correct students’ work — a habit that carried over to Carroll’s fiction.
But lest we hastily surmise that writing in a white coat would make us a Joyce or drowning pages in purple ink a Woolf, Johnson prefaces her exploration with another important, beautifully phrased disclaimer:
That power to mesmerize has an intangible, almost magical quality, one I wouldn’t dare to try to meddle with by attempting to define it. It was never my goal as I wrote this book to discover what made literary geniuses tick. The nuances of any mind are impossible to pinpoint.
You could adopt one of these practices or, more ambitiously, combine several of them, and chances are you still wouldn’t invoke genius. These tales don’t hold a secret formula for writing a great novel. Rather, the authors in the book prove that the path to great literature is paved with one’s own eccentricities rather than someone else’s.
Originally featured in September — for more quirky habits, read the original article here.
If the twentieth-century career was a ladder that we climbed from one predictable rung to the next, the twenty-first-century career is more like a broad rock face that we are all free-climbing. There’s no defined route, and we must use our own ingenuity, training, and strength to rise to the top. We must make our own luck.
Lucky people take advantage of chance occurrences that come their way. Instead of going through life on cruise control, they pay attention to what’s happening around them and, therefore, are able to extract greater value from each situation… Lucky people are also open to novel opportunities and willing to try things outside of their usual experiences. They’re more inclined to pick up a book on an unfamiliar subject, to travel to less familiar destinations, and to interact with people who are different than themselves.
If you think hard about it, you’ll notice just how many “automatic” decisions you make each day. But these habits aren’t always as trivial as what you eat for breakfast. Your health, your productivity, and the growth of your career are all shaped by the things you do each day — most by habit, not by choice.
Even the choices you do make consciously are heavily influenced by automatic patterns. Researchers have found that our conscious mind is better understood as an explainer of our actions, not the cause of them. Instead of triggering the action itself, our consciousness tries to explain why we took the action after the fact, with varying degrees of success. This means that even the choices we do appear to make intentionally are at least somewhat influenced by unconscious patterns.
Given this, what you do every day is best seen as an iceberg, with a small fraction of conscious decision sitting atop a much larger foundation of habits and behaviors.
We can’t, however, simply will ourselves into better habits. Since willpower is a limited resource, whenever we’ve overexerted our self-discipline in one domain, a concept known as “ego depletion” kicks in and renders us mindless automata in another. Instead, Young suggests, the key to changing a habit is to invest heavily in the early stages of habit-formation so that the behavior becomes automated and we later default into it rather than exhausting our willpower wrestling with it. Young also cautions that it’s a self-defeating strategy to try changing several habits at once. Rather, he advises, spend one month on each habit alone before moving on to the next — a method reminiscent of the cognitive strategy of “chunking” that allows our brains to commit more new information to memory.
Though the chapter, penned by Steven Kramer and Teresa Amabile of the Harvard Business School, co-authors of The Progress Principle, along with 13-year IDEO veteran Ela Ben-Ur, frames the primary benefit of a diary as a purely pragmatic record of your workday productivity and progress — while most dedicated diarists would counter that the core benefits are spiritual and psychoemotional — it does offer some valuable insight into the psychology of how journaling elevates our experience of everyday life:
This is one of the most important reasons to keep a diary: it can make you more aware of your own progress, thus becoming a wellspring of joy in your workday.
Citing their research into the journals of more than two hundred creative professionals, the authors point to a pattern that reveals the single most important motivator: palpable progress on meaningful work:
On the days when these professionals saw themselves moving forward on something they cared about — even if the progress was a seemingly incremental “small win” — they were more likely to be happy and deeply engaged in their work. And, being happier and more deeply engaged, they were more likely to come up with new ideas and solve problems creatively.
Even more importantly, however, they argue that a diary offers an invaluable feedback loop:
Although the act of reflecting and writing, in itself, can be beneficial, you’ll multiply the power of your diary if you review it regularly — if you listen to what your life has been telling you. Periodically, maybe once a month, set aside time to get comfortable and read back through your entries. And, on New Year’s Day, make an annual ritual of reading through the previous year.
This, they suggest, can yield profound insights into the inner workings of your own mind — especially if you look for specific clues and patterns, trying to identify the richest sources of meaning in your work and the types of projects that truly make your heart sing. Once you understand what motivates you most powerfully, you’ll be able to prioritize this type of work going forward. Just as important, however, is cultivating a gratitude practice and acknowledging your own accomplishments in the diary:
This is your life; savor it. Hold on to the threads across days that, when woven together, reveal the rich tapestry of what you are achieving and who you are becoming. The best part is that, seeing the story line appearing, you can actively create what it — and you — will become.
Every creative endeavor, from writing a book to designing a brand to launching a company, follows what’s known as an Uncertainty Curve. The beginning of a project is defined by maximum freedom, very little constraint, and high levels of uncertainty. Everything is possible; options, paths, ideas, variations, and directions are all on the table. At the same time, nobody knows exactly what the final output or outcome will be. And, at times, even whether it will be. Which is exactly the way it should be.
Those who are doggedly attached to the idea they began with may well execute on that idea. And do it well and fast. But along the way, they often miss so many unanticipated possibilities, options, alternatives, and paths that would’ve taken them away from that linear focus on executing on the vision, and sent them back into a place of creative dissonance and uncertainty, but also very likely yielded something orders of magnitude better.
All creators need to be able to live in the shade of the big questions long enough for truly revolutionary ideas and insights to emerge. They need to stay and act in that place relentlessly through the first, most obvious wave of ideas.
Fields argues that if we move along the Uncertainty Curve either too fast or too slowly, we risk robbing the project of its creative potential and ending up in mediocrity. Instead, becoming mindful of the psychology of that process allows us to pace ourselves better and master that vital osmosis between freedom and constraint. He sums up both the promise and the peril of this delicate dance beautifully:
Nothing truly innovative, nothing that has advanced art, business, design, or humanity, was ever created in the face of genuine certainty or perfect information. Because the only way to be certain before you begin is if the thing you seek to do has already been done.
The story begins with a skit titled “Business, Business,” which Henson performed on The Ed Sullivan Show in 1968. It tells the story of two conflicting sets of creatures — the slot-machine-eyed, cash-register-voiced corporate heads who talk in business-ese, and the naïve, light-bulb-headed softies who talk of love, joy, and beauty:
“Business, Business” implies that business and idealism are diametrically opposed. The idealist is attacked not just by the establishment, but also from within, where greed starts to change one’s motives.
For the most part, money is the enemy of art. … Put simply, great art wants quality, whereas good business wants profit. Quality requires many man-hours to produce, which any accountant will tell you cuts significantly into your profit. Great artists fight for such expenditures, whereas successful businessmen fight against them.
And yet, like most dogmatic dichotomies — take, for instance, science and spirituality — this, too, is invariably reductionistic. Henson’s life and legacy, Stevens argues, are proof that art and business can be — and inherently are — complementary rather than contradictory. Produced only six months after the Summer of Love, “Business, Business” straddled a profound cultural shift as a new generation of “light-bulb idealists” — baby boomers, flower children, and hippies who lived in youth collectives, listened to rock, and championed free love — rejected the material ideals of their parents and embraced the philosophy of Alan Watts. And yet Henson himself was an odd hybrid of these two worlds. When he made “Business, Business,” he was thirty-one, which placed him squarely between the boomers and their parents, and he lived comfortably in New York City with his wife, having made hundreds of television commercials for everything from lunch meats to computers. In his heart, however, Henson, a self-described Mississippi Tom Sawyer who often went barefoot, was an artist — and he was ready to defend this conviction with the choices he made.
Henson was already a capitalist when he made “Business, Business.” And we could even conclude that the skit describes his own conversion from idealism to capitalism. In 1968, he had an agent who got him TV appearances on Ed Sullivan and freelance commercial gigs hawking products as unhippielike as IBM computers and Getty oil.
Yet Jim Henson’s business wasn’t oil — it was art. While today, most artists are too timid to admit it, Henson freely referred to himself as an “artist,” and his agent went even further, calling him “artsy-craftsy.” Henson may have worked in show business, but he’d also traveled in Europe as a young man, sketching pictures of its architecture. He owned a business, but his business rested on the ideas the idealists were shouting—brotherhood, joy, and love. He wore a beard. Biographers would say it was to cover acne scars, but in the context of the late sixties, it aligns Henson with a category of people that is unmistakable. Though a capitalist, he was also a staunch artist.
“It seems to be difficult for any one to take in the idea that two truths cannot conflict,” pioneering astronomer Maria Mitchell wrote in her diaries. And yet what Henson’s case tells us, Stevens suggests in returning to “Business, Business,” is that the very notion of “selling out” is one big non-truth that pits two parallel possibilities against each other:
If art and money are at odds, which side was Jim Henson really on? If you watch the skit, the clue is in the characters’ voices. Of the Slinky-necked business-heads and idealist-heads, Henson was really both and neither, because in “Business, Business,” he parodies both. Locked in conflict, they sound like blowhards and twerps, respectively, but they were both facets of his life. As an employer to two other men, Henson was the boss man — the suit, cash register, and slot machine — who wrote the checks. But he also got together with his friends to sing, laugh, and play with puppets in the kind of collectivism that hippies celebrated.
Today — especially with Generation X and Millennials — serious artists often refuse contact with business. Large numbers of liberal arts graduates bristle when presented with the corporate world, rejecting its values to protect their ideals. Devoted artists move home to a parent’s basement to complete their masterpieces, while the more pragmatic artists live in cloistered “Neverland” artist collectives, grant-funded arts colonies, and university faculty lounges.
What is a human being? Complex to the point of absurdity, a whole person is both greedy and generous. It is foolish to think we can’t be both artists and entrepreneurs, especially when Henson was so wildly successful in both categories.
Since he was in college, Jim Henson was a natural capitalist. He owned a printmaking business and made commercials for lunchmeats. In the 1970s, he became a merchandizing millionaire and made Hollywood movies. By 1987, he had shows on all three major networks plus HBO and PBS. … Of course, Henson was not just another Trump. Believe the beard.
When Henson joined on to the experimental PBS show Sesame Street in 1968, he was underpaid for his services creating Big Bird and Oscar. Yet he spent his free nights in his basement, shooting stop-motion films that taught kids to count. If you watch these counting films, the spirit of Henson’s gift shines through. I think any struggling artist today could count Henson among their ilk. He had all the makings of a tragic starving artist. The only difference between him and us is that he made peace with money. He found a way to make art and money dance.
The key, of course, is to master this dance with equal parts determination and grace. Riffing off Lewis Hyde’s famous meditation on gift economies in The Gift, where he argues that the artist must first cultivate a protected gift-sphere for making pure art and then make contact with the market, Stevens offers a blueprint:
The dance involves art and money, but not at the same time. In the first stage, it is paramount that the artist “reserves a protected gift-sphere in which the art is created.” He keeps money out of it. But in the next two phases, they can dance. The way I see it, Hyde’s dance steps go a little something like this:
Make art make money.
Make money make art.
It is the last step that turns this dance into a waltz — something cyclical so that the money is not the real end. Truly, for Jim Henson, money was a fuel that fed art.
To write well about the elegant world you have to know it and experience it to the depths of your being just as Proust, Radiguet and Fitzgerald did: what matters is not whether you love it or hate it, but only to be quite clear about your position regarding it.
In another, he considers the secret of living well:
The inferno of the living is not something that will be; if there is one, it is what is already here, the inferno where we live every day, that we form by being together. There are two ways to escape suffering it. The first is easy for many: accept the inferno and become such a part of it that you can no longer see it. The second is risky and demands constant vigilance and apprehension: seek and learn to recognize who and what, in the midst of inferno, are not inferno, then make them endure, give them space.
In a lengthy letter to literary critic Mario Motta dated January 16, 1950, Calvino addresses the alleged death of the novel, a death knell still nervously resounding today:
There have been so many debates on the novel in the last thirty years, both by those who claimed it was dead and by those who wanted it to be alive in a certain way, that if one conducts the debate without serious preliminary work to establish the terms of the question as it has to be set up and as it has never been set up before, we’ll end up saying and making others say a lot of commonplaces.
The fact is that I already feel I am a prisoner of a kind of style and it is essential that I escape from it at all costs: I’m now trying to write a totally different book, but it’s damned difficult; I’m trying to break up the rhythms, the echoes which I feel the sentences I write eventually slide into, as into pre-existing molds, I try to see facts and things and people in the round instead of being drawn in colors that have no shading. For that reason the book I’m going to write interests me infinitely more than the other one.
As dangerous as the blind adherence to a style, Calvino writes in a May 1959 letter, is the blind reliance on tools, the cult of medium over message — but harnessing the power of tools is one of the craft’s greatest arts:
One should never have taboos about the tools we use, that as long as the thought or images or style one wants to put forward do not become deformed by the medium, one must on the contrary try to make use of the most powerful and most efficient of those tools.
Several years later, Calvino returns to his conception of fiction, this time with more dimension and more sensitivity to the inherent contradictions of literature:
One cannot construct in fiction a harmonious language to express something that is not yet harmonious. We live in a cultural ambience where many different languages and levels of knowledge intersect and contradict each other.
When things get tough, this is what you should do: Make good art. I’m serious. Husband runs off with a politician — make good art. Leg crushed and then eaten by a mutated boa constrictor — make good art. IRS on your trail — make good art. Cat exploded — make good art. Someone on the Internet thinks what you’re doing is stupid or evil or it’s all been done before — make good art. Probably things will work out somehow, eventually time will take the sting away, and that doesn’t even matter. Do what only you can do best: Make good art. Make it on the bad days, make it on the good days, too.
A wise woman once said, “If you are not making mistakes, you’re not taking enough risks.” Gaiman articulates the same sentiment with his own brand of exquisite eloquence:
I hope that in this year to come, you make mistakes.
Because if you are making mistakes, then you are making new things, trying new things, learning, living, pushing yourself, changing yourself, changing your world. You’re doing things you’ve never done before, and more importantly, you’re Doing Something.
So that’s my wish for you, and all of us, and my wish for myself. Make New Mistakes. Make glorious, amazing mistakes. Make mistakes nobody’s ever made before. Don’t freeze, don’t stop, don’t worry that it isn’t good enough, or it isn’t perfect, whatever it is: art, or love, or work or family or life.