Search results for “seven word biography”

How Diego Rivera Met the Fierce Teenage Frida Kahlo and Fell in Love with Her Years Later

“I did not know it then, but Frida had already become the most important fact in my life. And she would continue to be, up to the moment she died…”

There is something singularly mesmerizing about the fateful encounters that sparked epic, often turbulent, lifelong love affairs — take, for instance, Gertrude Stein and Alice B. Toklas or Sylvia Plath and Ted Hughes. But one of modern history’s most vibrant, passionate, and tumultuous loves is that between legendary artists Frida Kahlo and Diego Rivera, the unusual and enchanting beginning of which is recounted first-hand in My Art, My Life: An Autobiography (public library) — a rare glimpse of Rivera’s inner life posthumously published in 1960, based on the interviews Gladys March conducted with the artist while shadowing him between 1944 and his death in 1957. March describes the book as “Rivera’s apologia: a self-portrait of a complex and controversial personality, and a key to the work of perhaps the greatest artist the Americas have yet produced.”

In a section titled An Apparition of Frida, Rivera describes his first encounter with the fierce teenage Kahlo while painting his first significant mural, Creation, at the Bolívar Auditorium of the National Preparatory School in Mexico City in 1922. Kahlo was one of only thirty-five female students at the prestigious institution.

Diego Rivera and Frida Kahlo in Mexico, 1933 (Photograph by Martin Munkácsi)

While painting, I suddenly heard, from behind one of the colonial pillars in the spacious room, the voice of an unseen girl.

Teasingly, she shouted, “On guard, Diego, Nahui is coming!”

Nahui was the Indian name of a talented woman painter who was posing for one of the auditorium figures.

The invisible voice continued to play pranks on Rivera until it finally presented itself in the mischievous flesh: One night, as Rivera was painting up on the scaffolding and his then-wife Guadalupe “Lupe” Marín was working below, they heard a loud commotion coming from a group of students pushing against the auditorium door. Rivera describes the moment, which he would only in hindsight come to recognize as pivotal in his life:

All at once the door flew open, and a girl who seemed to be no more than ten or twelve* was propelled inside.

She was dressed like any other high school student but her manner immediately set her apart. She had unusual dignity and self-assurance, and there was a strange fire in her eyes. Her beauty was that of a child, yet her breasts were well developed.

She looked straight up at me. “Would it cause you any annoyance if I watched you at work?” she asked.

“No, young lady, I’d be charmed,” I said.

She sat down and watched me silently, her eyes riveted on every move of my paint brush. After a few hours, Lupe’s jealousy was aroused, and she began to insult the girl. But the girl paid no attention to her. This, of course, enraged Lupe the more. Hands on hips, Lupe walked toward the girl and confronted her belligerently. The girl merely stiffened and returned Lupe’s stare without a word.

Visibly amazed, Lupe glared at her a long time, then smiled, and in a tone of grudging admiration, said to me, “Look at that girl! Small as she is, she does not fear a tall, strong woman like me. I really like her.”

The girl stayed about three hours. When she left, she said only, “Good night.” A year later I learned that she was the hidden owner of the voice which had come from behind the pillar and that her name was Frida Kahlo. But I had no idea that she would one day be my wife.

‘Nothing compares to your hands, nothing like the green-gold of your eyes. My body is filled with you for days and days…’ A page from Kahlo’s handwritten love letters to Rivera. Click image for more.

It wasn’t until several years later that the two crossed paths again. In another section of the book, titled Frida Becomes My Wife, Rivera recounts how their passionate love affair, epic in the true sense of the word, began:

I was at work on one of the uppermost frescoes at the Ministry of Education building one day, when I heard a girl shouting up to me, “Diego, please come down from there! I have something important to discuss with you!”

I turned my head and looked down from my scaffold.

On the ground beneath me stood a girl of about eighteen. She had a fine nervous body, topped by a delicate face. Her hair was long; dark and thick eyebrows met above her nose. They seemed like the wings of a blackbird, their black arches framing two extraordinary brown eyes.

As Rivera climbed down the scaffolding, Frida made no attempt to conceal her spirited attitude and fierce ambition, telling him:

I didn’t come here for fun. I have to work to earn my livelihood. I have done some paintings which I want you to look over professionally. I want an absolutely straightforward opinion, because I cannot afford to go on just to appease my vanity. I want you to tell me whether you think I can become a good enough artist to make it worth my while to go on. I’ve brought three of my paintings here. Will you come and look at them?

Rivera agrees and follows her into a cubicle under the staircase, where she has stowed her paintings. His recollection captures the ineffable magic of a rare occurrence — that priceless precipice of creative communion where one artist is humbled by another, a recognition that inevitably blossoms into love:

She turned each of them, leaning against the wall, to face me. They were all three portraits of women. As I looked at them, one by one, I was immediately impressed. The canvases revealed an unusual energy of expression, precise delineation of character, and true severity. They showed none of the tricks in the name of originality that usually mark the work of ambitious beginners. They had a fundamental plastic honesty, and an artistic personality of their own. They communicated a vital sensuality, complemented by a merciless yet sensitive power of observation. It was obvious to me that this girl was an authentic artist.

Kahlo, however, having been warned of Rivera’s reputation as a ladies’ man, is skeptical of the noticeable enthusiasm in his face and immediately scolds him “in a harshly defensive tone”:

I have not come to you looking for compliments. I want the criticism of a serious man. I’m neither an art lover nor an amateur. I’m simply a girl who must work for her living.

Rivera is smitten — intellectually, creatively and, though he doesn’t yet realize it, romantically. He simply notes:

I felt deeply moved by admiration for this girl.

‘Only one mountain can know the core of another mountain…’ A page from Kahlo’s handwritten love letters to Rivera. Click image for more.

So when she insists on his honest opinion regarding whether she has what it takes to become a professional artist or whether she should pursue another line of work, he answers resolutely:

In my opinion, no matter how difficult it is for you, you must continue to paint.

Vowing to follow his advice, Kahlo asks him one last favor — to come to her place the following Sunday and look at the rest of her paintings. It is only after providing her address that she tells Rivera her name — a revelation that stops him dead in his tracks as he remembers two important pieces of information about how he had come to know that name. The first was relayed to him by a good friend, then-director of the National Preparatory School where Kahlo went, who had identified her as the leader of a “band of juvenile delinquents” and had even considered quitting his job out of frustration with Kahlo’s mischief. The second is the memory of their first encounter at the Creation mural years earlier, that brave twelve-year-old girl who had stood up for herself without a shadow of fear or self-doubt. Rivera describes the exhilarating exchange that followed:

I said, “But you are…”

She stopped me quickly, almost putting her hand on my mouth in her anxiety. Her eyes acquired a devilish brilliancy.

Threateningly, she said, “Yes, so what? I was the girl in the auditorium, but that has absolutely nothing to do with now. You still want to come Sunday?”

I had great difficulty not answering, “More than ever!” But if I showed my excitement she might not let me come at all. So I only answered, “Yes.”

Then, after refusing my help in carrying her paintings, Frida departed, the big canvases jiggling under her arms.

The following Sunday, Rivera promptly presented himself at the address Kahlo had given him, only to find her engaged in an appropriately fearless and mischievous activity:

In the top of a high tree, I saw Frida in overalls, starting to climb down. Laughing gaily, she took my hand and ushered me through the house, which seemed to be empty, and into her room. Then she paraded all her paintings before me. These, her room, her sparkling presence, filled me with a wonderful joy.

I did not know it then, but Frida had already become the most important fact in my life. And she would continue to be, up to the moment she died, twenty-seven years later.

‘I ask for it, I get it, I sing, sang, I’ll sing from now on our magic — love…’ A page from Kahlo’s handwritten love letters to Rivera. Click image for more.

A few days later, the two kissed for the first time and Rivera “began courting her in earnest.” Although she was eighteen and he twice her age, neither of them “felt the least bit awkward.” Four years later, on August 21, 1929, they were married in a civil ceremony by the Mayor of Coyoacán, one of Mexico City’s sixteen boroughs, who proclaimed the merger “an historical event.” Kahlo was 22 and Rivera 42. They remained together until Kahlo’s death in July of 1954. Despite having an open marriage in which each had multiple affairs — for the bisexual Kahlo, the most notable were those with French singer, dancer, and actress Josephine Baker and Russian Marxist theorist Leon Trotsky — each maintained that the other was the love of their life.

My Art, My Life: An Autobiography is a fascinating read in its candid, often scandalous entirety. Complement it with Kahlo’s exquisite handwritten love letters to Rivera.

* In actuality, Kahlo was two weeks shy of her fifteenth birthday.


From the Gold Rush to Silicon Valley: How Mark Twain Became the Steve Jobs of His Day

The power of mischief, timing, and typography.

Mark Twain has no shortage of cultural credits — celebrated humorist, irreverent adviser to little girls, opinionated critic and cultural commentator, underappreciated poet, recipient of some outrageous requests from his fans. But perhaps his greatest feat was his own becoming — how he transformed Samuel Clemens into Mark Twain, “the Lincoln of Literature.”

In The Bohemians: Mark Twain and the San Francisco Writers Who Reinvented American Literature (public library), Ben Tarnoff chronicles that becoming alongside the rise of the singular “vanguard of democracy” that shaped the course of the Western written word, a course largely steered by Twain. Embedded in the altogether fascinating story, on a closer read, is a testament to history’s cyclical nature as a parallel emerges between San Francisco in the Gold Rush era and Silicon Valley today, and between the respective patron saints of those worlds — Mark Twain and Steve Jobs.

Born in 1835, Twain came of age “at the best possible time,” as the country was on the cusp of a remarkable cultural change, driven in large part by the discovery of gold in California in 1848 — the spark for the famous Gold Rush, which drew risk-takers, pioneers, and enterprising vagabonds from all over the world and elevated San Francisco into the gateway to the era’s El Dorado. Twain, born and raised in Missouri, found in San Francisco what many entrepreneurs today find in Silicon Valley — like many other young men, he “hadn’t come to stay, but to get rich and get out.” The confluence of all these cultural and personal factors created a unique backdrop:

They erected tents and wooden hovels, makeshift structures that made easy kindling for the city’s frequent fires. They built gambling dens and saloons and brothels. They lived among the cultures of five continents, often condensed into the space of a single street: Cantonese stir-fry competing with German wurst, Chilean whores with Australian. On the far margin of the continent, they created a complex urban society virtually overnight.

By the time Twain got there, San Francisco still roared. It was densely urban, yet unmistakably western; isolated yet cosmopolitan; crude yet cultured. The city craved spectacle, whether on the gaslit stages of its many theaters or in the ornately costumed pageantry of its streets. Its wide-open atmosphere endeared it to the young and the odd, to anyone seeking refuge from the overcivilized East. It had an acute sense of its own history, and a paganish appetite for mythmaking and ritual.

Mark Twain in 1863, taken on his first visit to San Francisco. He was twenty-seven.

It’s unsurprising, then, that the city quickly sprouted a thriving literary scene — a “band of outsiders” known as the Bohemians — for writers are a culture’s foremost mythmakers. Tarnoff captures their spirit:

The Bohemians were nonconformists by choice or by circumstance, and they eased their isolation by forming intense friendships with one another. San Francisco was where their story began, but it would continue in Boston, New York, and London; in the palace and the poorhouse; in success and humiliation, fame and poverty.

Two concurrent and seemingly opposite cultural forces contributed to the rise of the Bohemians: On the one hand, the Civil War disrupted America’s moral and aesthetic tradition, creating “rifts in the culture wide enough for new voices to be heard”; on the other, the technological advances brought on by the war had also shrunk the country, bringing East and West closer together with the advent of the railroad and the telegraph. San Francisco was in a unique position to reap the fruits of both developments, and “its writers found a wider readership at a moment when the nation sorely needed new storytellers.”

In joining the Bohemians, Twain forever changed the course of both his own life and American literature. Tarnoff writes:

No Bohemian made better art than Twain. San Francisco gave him his education as a writer, nurturing the literary powers he would later use to transform American literature. He would help steer the country through its newfangled nationhood, and become the supreme cultural icon of the postwar age. But first, he would spend his formative years on the Far Western fringe, in the company of other young Bohemians struggling to reinvent American writing.

Artwork by Debbie Millman. Click image for more.

Tarnoff’s enchanting portrait of young Twain is remarkable in several ways — it is exquisitely written, it paints a somewhat timeless picture of the eccentric entrepreneur archetype, but perhaps most of all it reveals details about the beloved author of which most of us are unaware, for seemingly trivial reasons related to the trajectory of recording technology: left with only black-and-white photographs as sensory documentation of his life, we are deaf-blind to details like his striking carrot-colored hair or the peculiar drawl and even more peculiar rhythm of his speech. Tarnoff brings young Twain to life:

What people remembered best about him, aside from his brambly red brows and rambling gait, was his strange way of speaking: a drawl that spun syllables slowly, like fallen branches on the surface of a stream. Printers transcribed it with hyphens and dashes, trying to render rhythms so complex they could’ve been scored as sheet music. He rasped and droned, lapsed into long silences, soared in the swaying tenor inherited from the slave songs of his childhood. He made people laugh while remaining dreadfully, imperially serious. He mixed the sincere and the satiric, the factual and the fictitious, in proportions too obscure for even his closest friends to decipher. He was prickly, irreverent, ambitious, vindictive — a personality as impenetrably vast as the American West, and as prone to seismic outbursts. He was Samuel Clemens before he became Mark Twain, and in the spring of 1863, he made a decision that brought him one step closer to the fame he craved.

Cover of ‘Advice to Little Girls,’ which Twain wrote at the age of thirty in 1865. Click image for more.

So on May 2, 1863, Samuel Clemens boarded a stagecoach for the rocky two-hundred-mile journey to San Francisco, the road that would carry him toward becoming Mark Twain. At age 27, he was already extraordinary before he had even arrived:

The young Twain … already had more interesting memories than most men twice his age. He had piloted steamboats on the Mississippi, roamed his native Missouri with a band of Confederate guerrillas, and as the Civil War began in earnest, taken the overland route to the Territory of Nevada — or Washoe, as westerners called it, after a local Indian tribe.

Originally, Twain hadn’t planned to stay long, but he found himself entranced by the city’s perpetual cycle of eating, drinking, sailing, and socializing. In a letter to his mother and sister from mid-May, he announced he was only going to stay a little while longer — ten days or so, and no more than two weeks. But San Francisco’s bountiful buffet of saloons, dance halls, and gambling dens sang to him a siren song he could not resist. By June, he was still there, living “to the hilt” (to borrow Anne Sexton’s term) and estimating that he knew at least a thousand of San Francisco’s 115,000 citizens — “knew” in the pre-Facebook sense, which makes the scale of his social life all the more impressive.

He eventually left in July, but the spell had been cast:

Over the course of the next year he would find many reasons to return: first to visit, then to live. He would chronicle its quirks, and hurt the feelings of not a few of its citizens. In exchange, San Francisco would mold him to literary maturity. It would inspire his evolution from a provincial scribbler into a great American writer, from Hannibal’s Samuel Clemens into America’s Mark Twain.

One particular detail about Twain made me consider a curious parallel between him and Steve Jobs beyond their similar cultural legacy of revolutionizing their respective fields: Twain, like Jobs, was a disobedient boy (his mother described him as “very wild and mischievous”) who hated school. But perhaps most importantly, he was a typesetter by trade and dropped out of school (as did Jobs) to become “a printer’s devil,” as type apprentices were called at the time. It was through typesetting that he entered into literature, and through typesetting that he found his irreverent voice. Tarnoff writes:

The shop became his schoolroom. He put other people’s lines into print and composed a few of his own. He learned to think of words as things, as slivers of ink-stained metal that, if strung in the right sequence, could make more mischief than any schoolboy prank. At fifteen he began typesetting for his brother Orion’s newspaper, the Western Union, and wrote the occasional sketch. When Orion left on a business trip and put his sibling in charge, the teenager lost no time in testing the incendiary potential of the medium. He ignited a feud with the editor of a rival newspaper, scorching the poor man so thoroughly that when Orion returned, he was forced to run an apology.

This sounds remarkably like the story of Steve Jobs, whose era-defining design vision was first tickled by a typography class that he serendipitously wandered into, out of academic mischief and boredom with his prescribed course. Type led Jobs to design innovation and Twain to literary innovation. Both men also came of age at a time of incredible technological progress — for Jobs, the golden age of modern computing, and for Twain, the momentum of the Industrial Revolution. As a young man, Twain saw steamboats and trains and telegraphs transform the diffusion of the printed word across the country. He witnessed America’s booming love affair with newspapers — a century before he was born, the country had 37 newspapers. By the time he entered typesetting, there were more than a thousand.

Tarnoff contextualizes the significance of this shift, and the particular fertility of Twain’s locale of choice:

The newspaper revolution created America’s first popular culture. Twain belonged wholly to this revolution, and the world he discovered in the Far West was its most fertile staging ground… By 1870, California had one of the highest literacy rates in the nation: only 7.3 percent of its residents over the age of ten couldn’t write, compared with 20 percent nationwide. The region’s wealth financed a range of publications and gave people the leisure to read them. As Twain observed, there was no surer sign of “flush times” in a Far Western boomtown than the founding of a “literary paper.” Poetry and fiction mattered to miners and farmers, merchants and bankers. For them the printed word wasn’t a luxury — it was a lifeline. It fostered a sense of place, a feeling of community, in a frontier far from home.

The Bohemians is a fantastic read in its entirety. Complement it with Twain on religion and our human egotism, morality vs. intelligence, and his mischievous advice to little girls.


The Benjamin Franklin Effect: The Surprising Psychology of How to Handle Haters

“He that has once done you a kindness will be more ready to do you another, than he whom you yourself have obliged.”

“We are what we pretend to be,” Kurt Vonnegut famously wrote, “so we must be careful about what we pretend to be.” But given how much our minds mislead us, what if we don’t realize when we’re pretending — who are we then? That’s precisely what David McRaney explores in You Are Now Less Dumb: How to Conquer Mob Mentality, How to Buy Happiness, and All the Other Ways to Outsmart Yourself (public library) — a “book about self-delusion, but also a celebration of it,” a fascinating and pleasantly uncomfortable-making look at why “self-delusion is as much a part of the human condition as fingers and toes,” and the follow-up to McRaney’s You Are Not So Smart, one of the best psychology books of 2011. McRaney, with his signature fusion of intelligent irreverence and irreverent intelligence, writes in the introduction:

The human mind is obviously vaster and more powerful than any other animal mind, and that’s something people throughout all human history couldn’t help but notice. You probably considered this the last time you visited the zoo or watched a dog battle its own hind legs. Your kind seems the absolute pinnacle of what evolution can produce, maybe even the apex and final beautiful result of the universe unfolding itself. It is a delectable idea to entertain. Even before we had roller skates and Salvador Dalí, it was a conviction in which great thinkers liked to wallow. Of course, as soon as you settle into that thought, you’ll accidentally send an e-mail to your boss meant for your proctologist, or you’ll read a news story about how hot dog-stuffed pizza is now the most popular food in the country. It’s always true that whenever you look at the human condition and get a case of the smugs, a nice heaping helping of ridiculousness plops in your lap and remedies the matter.

This tendency of ours is known as “naïve realism” — the assertion that we see the world as it actually is and our impression of it is an objective, accurate representation of “reality” — a concept that comes from ancient philosophy and has since been amply debunked by modern science. McRaney writes:

The last one hundred years of research suggest that you, and everyone else, still believe in a form of naïve realism. You still believe that although your inputs may not be perfect, once you get to thinking and feeling, those thoughts and feelings are reliable and predictable. We now know that there is no way you can ever know an “objective” reality, and we know that you can never know how much of subjective reality is a fabrication, because you never experience anything other than the output of your mind. Everything that’s ever happened to you has happened inside your skull.

In sum, we are excellent at deluding ourselves, and terrible at recognizing when our own perceptions, attitudes, impressions, and opinions about the external world are altered from within. And one of the most remarkable manifestations of this is the Benjamin Franklin Effect, which McRaney examines in the third chapter. The self-delusion in question is that we do nice things to people we like and bad things to those we dislike. But what the psychology behind the effect reveals is quite the opposite: a reverse-engineering of attitudes that takes place as we grow to like people for whom we do nice things and to dislike those to whom we are unkind.

Benjamin Franklin

This curious effect is named after a specific incident early in the Founding Father’s political career. Franklin, born one of seventeen children to poor parents, entered this world — given his family’s circumstances and society’s priorities — with very low odds of becoming an educated scientist, gentleman, scholar, entrepreneur, and, perhaps most of all, a man of significant political power. To compensate for his unfavorable givens, he quickly learned formidable people skills and became “a master of the game of personal politics.” McRaney writes:

Like many people full of drive and intelligence born into a low station, Franklin developed strong people skills and social powers. All else denied, the analytical mind will pick apart behavior, and Franklin became adroit at human relations. From an early age, he was a talker and a schemer, a man capable of guile, cunning, and persuasive charm. He stockpiled a cache of secret weapons, one of which was the Benjamin Franklin effect, a tool as useful today as it was in the 1730s and still just as counterintuitive.

[…]

At age twenty-one, he formed a “club of mutual improvement” called the Junto. It was a grand scheme to gobble up knowledge. He invited working-class polymaths like him to have the chance to pool together their books and trade thoughts and knowledge of the world on a regular basis. They wrote and recited essays, held debates, and devised ways to acquire currency. Franklin used the Junto as a private consulting firm, a think tank, and he bounced ideas off the other members so he could write and print better pamphlets. Franklin eventually founded the first subscription library in America, writing that it would make “the common tradesman and farmers as intelligent as most gentlemen from other countries,” not to mention give him access to whatever books he wanted to buy.

This is where his eponymous effect comes into play: When Franklin ran for his second term as clerk, a peer whose name he never mentions in his autobiography delivered a long election speech censuring Franklin and tarnishing his reputation. Although Franklin won, he was furious with his opponent and, observing that this was “a gentleman of fortune and education” who might one day come to hold great power in government, grew concerned about future frictions with him.

The troll had to be tamed, and tamed shrewdly. McRaney writes:

Franklin set out to turn his hater into a fan, but he wanted to do it without “paying any servile respect to him.” Franklin’s reputation as a book collector and library founder gave him a standing as a man of discerning literary tastes, so Franklin sent a letter to the hater asking if he could borrow a specific selection from his library, one that was a “very scarce and curious book.” The rival, flattered, sent it right away. Franklin sent it back a week later with a thank-you note. Mission accomplished. The next time the legislature met, the man approached Franklin and spoke to him in person for the first time. Franklin said the man “ever after manifested a readiness to serve me on all occasions, so that we became great friends, and our friendship continued to his death.”

Instant pause-giver: In what universe does inducing an opponent to do you a favor magically turn him into a supporter? This, it turns out, shares a psychological basis with the reason why the art of asking is the art of cultivating community — and, McRaney explains, it has a lot to do with the psychology of attitudes, those clusters of convictions about and emotional impressions of a person or a situation:

For many things, your attitudes came from actions that led to observations that led to explanations that led to beliefs. Your actions tend to chisel away at the raw marble of your persona, carving into being the self you experience from day to day. It doesn’t feel that way, though. To conscious experience, it feels as if you were the one holding the chisel, motivated by existing thoughts and beliefs. It feels as though the person wearing your pants performed actions consistent with your established character, yet there is plenty of research suggesting otherwise. The things you do often create the things you believe.

Indeed, this is what Gandhi touched on when he observed that our thoughts become our words, our words become our actions, our actions become our character, our character becomes our destiny, and it’s also the foundation of Cognitive Behavioral Therapy, which aims to change how we think by first changing what we do, until we internalize a set of beliefs about how those actions define who we are. McRaney explains how this works:

At the lowest level, behavior-into-attitude conversion begins with impression management theory, which says you present to your peers the person you wish to be. You engage in something economists call signaling by buying and displaying to your peers the sorts of things that give you social capital… Whatever are the easiest-to-obtain, loudest forms of the ideals you aspire to portray become the things you own, such as bumper stickers signaling to the world you are in one group and not another. These things then influence you to become the sort of person who owns them.

[…]

Anxiety over being ostracized, over being an outsider, has driven the behavior of billions for millions of years. Impression management theory says you are always thinking about how you appear to others, even when there are no others around. In the absence of onlookers, deep in your mind a mirror reflects back that which you have done, and when you see a person who has behaved in a way that could get you booted from your in-group, the anxiety drives you to seek a realignment.

This brings us to the chicken-or-the-egg question of whether the belief or the display came first. According to self-perception theory, we are both observers and narrators of our own experience — we see ourselves do something and, unable to pin down our motive, we try to make sense of it by constructing a plausible story. We then form beliefs about ourselves based on observing our actions, as narrated by that story, which of course is based on our existing beliefs in the first place. This is what happened to Franklin’s nemesis: He observed himself performing an act of kindness toward Franklin, which he explained to himself by constructing the most plausible story — that he did so willfully, because he liked Franklin after all.

This, as we’ve previously seen in the way we rationalize our dishonesty, is an example of cognitive dissonance, a mental affliction that befalls us all as we struggle to reconcile conflicting ideas about ourselves, others, or a situation. McRaney points to the empirical evidence:

You can see the proof in an MRI scan of someone presented with political opinions that conflict with her own. The brain scans of a person shown statements that oppose her political stance show that the highest areas of the cortex, the portions responsible for providing rational thought, get less blood until another statement is presented that confirms her beliefs. Your brain literally begins to shut down when you feel your ideology is threatened.

One of the most vivid examples of this process in action comes from a Stanford study:

Students … signed up for a two-hour experiment called “Measures of Performance” as a requirement to pass a class. Researchers divided them into two groups. One was told they would receive $1 (about $8 in today’s money). The other group was told they would receive $20 (about $150 in today’s money). The scientists then explained that the students would be helping improve the research department by evaluating a new experiment. They were then led into a room where they had to use one hand to place wooden spools into a tray and remove them over and over again. A half hour later, the task changed to turning square pegs clockwise on a flat board one-quarter spin at a time for half an hour. All the while, an experimenter watched and scribbled. It was one hour of torturous tedium, with a guy watching and taking notes. After the hour was up, the researcher asked the student if he could do the school a favor on his way out by telling the next student scheduled to perform the tasks, who was waiting outside, that the experiment was fun and interesting. Finally, after lying, people in both groups — one with one dollar in their pocket and one with twenty dollars — filled out a survey in which they were asked their true feelings about the study.

Something extraordinary and baffling had happened: The students who were paid $20 lied to their peers but reported in the survey, as expected, that they’d just endured two hours of mind-numbing tedium. But those who were only paid a dollar completely internalized the lie, reporting even in the survey that they found the task stimulating. The first group, the researchers concluded, were able to justify both the tedium and the lie with the dollar amount of their compensation, but the second group, having been paid hardly anything, had no external justification and instead had to assuage their mental unease by convincing themselves that it was all inherently worth it. McRaney extends the insight to the broader question of volunteerism:

This is why volunteering feels good and unpaid interns work so hard. Without an obvious outside reward you create an internal one. That’s the cycle of cognitive dissonance; a painful confusion about who you are gets resolved by seeing the world in a more satisfying way.

This dynamic plays out in reverse as well — as the infamous Stanford Prison Experiment demonstrated, being induced to perform unkind behaviors makes us develop unkind attitudes. It all brings us back to Franklin’s foe-turned-friend:

When you feel anxiety over your actions, you will seek to lower the anxiety by creating a fantasy world in which your anxiety can’t exist, and then you come to believe the fantasy is reality, just as Benjamin Franklin’s rival did. He couldn’t possibly have lent a rare book to a guy he didn’t like, so he must actually like him. Problem solved.

[…]

The Benjamin Franklin effect is the result of your concept of self coming under attack. Every person develops a persona, and that persona persists because inconsistencies in your personal narrative get rewritten, redacted, and misinterpreted. If you are like most people, you have high self-esteem and tend to believe you are above average in just about every way. It keeps you going, keeps your head above water, so when the source of your own behavior is mysterious you will confabulate a story that paints you in a positive light. If you are on the other end of the self-esteem spectrum and tend to see yourself as undeserving and unworthy, [you] will rewrite nebulous behavior as the result of attitudes consistent with the persona of an incompetent person, deviant, or whatever flavor of loser you believe yourself to be. Successes will make you uncomfortable, so you will dismiss them as flukes. If people are nice to you, you will assume they have ulterior motives or are mistaken. Whether you love or hate your persona, you protect the self with which you’ve become comfortable. When you observe your own behavior, or feel the gaze of an outsider, you manipulate the facts so they match your expectations.

Indeed, Franklin noted in his autobiography: “He that has once done you a kindness will be more ready to do you another, than he whom you yourself have obliged.” McRaney leaves us with some grounding advice:

Pay attention to when the cart is getting before the horse. Notice when a painful initiation leads to irrational devotion, or when unsatisfying jobs start to seem worthwhile. Remind yourself pledges and promises have power, as do uniforms and parades. Remember in the absence of extrinsic rewards you will seek out or create intrinsic ones. Take into account [that] the higher the price you pay for your decisions the more you value them. See that ambivalence becomes certainty with time. Realize that lukewarm feelings become stronger once you commit to a group, club, or product. Be wary of the roles you play and the acts you put on, because you tend to fulfill the labels you accept. Above all, remember the more harm you cause, the more hate you feel. The more kindness you express, the more you come to love those you help.

So Milton Glaser was right after all when he observed, “If you perceive the universe as being a universe of abundance, then it will be. If you think of the universe as one of scarcity, then it will be.”

You Are Now Less Dumb is excellent in its entirety, exploring such facets of our self-delusion as why we see patterns where there aren’t any, how we often confuse the origin of our emotional states, and more. Complement it with its prequel, then treat yourself to McRaney’s excellent podcast.


The 13 Best Books of 2013: The Definitive Annual Reading List of Overall Favorites

Soul-stirring, brain-expanding reads on intuition, love, grief, attention, education, and the meaning of life.

All gratifying things must come to an end: The season’s subjective selection of best-of reading lists — which covered writing and creativity, photography, psychology and philosophy, art and design, history and biography, science and technology, children’s literature, and pets and animals — comes full-circle with this final omnibus of the year’s most indiscriminately wonderful reads, a set of overall favorites that spill across multiple disciplines, cross-pollinate subjects, and defy categorization in the most stimulating of ways. (Revisit last year’s selection here.)

1. ON LOOKING

“How we spend our days,” Annie Dillard wrote in her timelessly beautiful meditation on presence over productivity, “is, of course, how we spend our lives.” And nowhere do we fail at the art of presence more miserably and more tragically than in urban life — in the city, high on the cult of productivity, where we float past each other, past the buildings and trees and the little boy in the purple pants, past life itself, cut off from the breathing of the world by iPhone earbuds and solipsism. And yet: “The art of seeing has to be learned,” Marguerite Duras reverberates — and it can be learned, as cognitive scientist Alexandra Horowitz invites us to believe in her breathlessly wonderful On Looking: Eleven Walks with Expert Eyes (public library), also among the year’s best psychology books — a record of her quest to walk around a city block with eleven different “experts,” from an artist to a geologist to a dog, and emerge with fresh eyes mesmerized by the previously unseen fascinations of a familiar world. It is undoubtedly one of the most stimulating books of the year, if not the decade, and the most enchanting thing I’ve read in ages. In a way, it’s the opposite but equally delightful mirror image of Christoph Niemann’s Abstract City — a concrete, immersive examination of urbanity — blending the mindfulness of Sherlock Holmes with the expansive sensitivity of Thoreau.

Horowitz begins by pointing our attention to the incompleteness of our experience of what we conveniently call “reality”:

Right now, you are missing the vast majority of what is happening around you. You are missing the events unfolding in your body, in the distance, and right in front of you.

By marshaling your attention to these words, helpfully framed in a distinct border of white, you are ignoring an unthinkably large amount of information that continues to bombard all of your senses: the hum of the fluorescent lights, the ambient noise in a large room, the places your chair presses against your legs or back, your tongue touching the roof of your mouth, the tension you are holding in your shoulders or jaw, the map of the cool and warm places on your body, the constant hum of traffic or a distant lawn-mower, the blurred view of your own shoulders and torso in your peripheral vision, a chirp of a bug or whine of a kitchen appliance.

This adaptive ignorance, she argues, is there for a reason — we celebrate it as “concentration” and welcome its way of easing our cognitive overload by allowing us to conserve our precious mental resources only for the stimuli of immediate and vital importance, and to dismiss or entirely miss all else. (“Attention is an intentional, unapologetic discriminator,” Horowitz tells us. “It asks what is relevant right now, and gears us up to notice only that.”) But while this might make us more efficient in our goal-oriented day-to-day, it also makes us inhabit a largely unlived — and unremembered — life, day in and day out.

For Horowitz, the awakening to this incredible, invisible backdrop of life came thanks to Pumpernickel, her “curly haired, sage mixed breed” (who also inspired Horowitz’s first book, the excellent Inside of a Dog: What Dogs See, Smell, and Know), as she found herself taking countless walks around the block, becoming more and more aware of the dramatically different experiences she and her canine companion were having along the exact same route:

Minor clashes between my dog’s preferences as to where and how a walk should proceed and my own indicated that I was experiencing almost an entirely different block than my dog. I was paying so little attention to most of what was right before us that I had become a sleepwalker on the sidewalk. What I saw and attended to was exactly what I expected to see; what my dog showed me was that my attention invited along attention’s companion: inattention to everything else.

The book was her answer to the disconnect, an effort to “attend to that inattention.” It is not, she warns us, “about how to bring more focus to your reading of Tolstoy or how to listen more carefully to your spouse.” Rather, it is an invitation to the art of observation:

Together, we became investigators of the ordinary, considering the block — the street and everything on it — as a living being that could be observed.

In this way, the familiar becomes unfamiliar, and the old the new.

Horowitz’s approach is based on two osmotic human tendencies: our shared capacity to truly see what is in front of us, despite our conditioned concentration that obscures it, and the power of individual bias in perception — or what we call “expertise,” acquired by passion or training or both — in bringing attention to elements that elude the rest of us. What follows is a whirlwind of endlessly captivating exercises in attentive bias as Horowitz, with her archetypal New Yorker’s “special fascination with the humming life-form that is an urban street,” and her diverse companions take to the city.

Art by Maira Kalman from ‘On Looking: Eleven Walks with Expert Eyes’

First, she takes a walk all by herself, trying to note everything observable, and we quickly realize that besides her deliciously ravenous intellectual curiosity, Horowitz is a rare magician with language. (“The walkers trod silently; the dogs said nothing. The only sound was the hum of air conditioners,” she beholds her own block; passing a pile of trash bags graced by a stray Q-tip, she ponders parenthetically, “how does a Q-tip escape?”; turning her final corner, she gazes at the entrance of a mansion and “its pair of stone lions waiting patiently for royalty that never arrives.” Stunning.)

But as soon as she joins her experts, Horowitz is faced with the grimacing awareness that despite her best, most Sherlockian efforts, she was “missing pretty much everything.” She arrives at a newfound, profound understanding of what William James meant when he wrote, “My experience is what I agree to attend to. Only those items which I notice shape my mind.”:

I would find myself at once alarmed, delighted, and humbled at the limitations of my ordinary looking. My consolation is that this deficiency of mine is quite human. We see, but we do not see: we use our eyes, but our gaze is glancing, frivolously considering its object. We see the signs, but not their meanings. We are not blinded, but we have blinders.

Originally featured in August, with a closer look at the expert insights. For another peek at this gem, which is easily among my top three favorite books of the past decade, learn how to do the step-and-slide.

2. ADVICE TO LITTLE GIRLS

In 1865, when he was only thirty, Mark Twain penned a playful short story mischievously encouraging girls to think independently rather than blindly obey rules and social mores. In the summer of 2011, I chanced upon and fell in love with a lovely Italian edition of this little-known gem with Victorian-scrapbook-inspired artwork by celebrated Russian-born children’s book illustrator Vladimir Radunsky. I knew the book had to come to life in English, so I partnered with the wonderful Claudia Zoe Bedrick of Brooklyn-based indie publishing house Enchanted Lion, maker of extraordinarily beautiful picture-books, and we spent the next two years bringing Advice to Little Girls (public library) to life in America — a true labor-of-love project full of so much delight for readers of all ages. (And how joyous to learn that it was also selected among NPR’s best books of 2013!)

While frolicsome in tone and full of wink, the story is colored with subtle hues of grown-up philosophy on the human condition, exploring all the deft ways in which we creatively rationalize our wrongdoing and reconcile the good and evil we each embody.

Good little girls ought not to make mouths at their teachers for every trifling offense. This retaliation should only be resorted to under peculiarly aggravated circumstances.

If you have nothing but a rag-doll stuffed with sawdust, while one of your more fortunate little playmates has a costly China one, you should treat her with a show of kindness nevertheless. And you ought not to attempt to make a forcible swap with her unless your conscience would justify you in it, and you know you are able to do it.

One can’t help but wonder whether this particular bit may have in part inspired the irreverent 1964 anthology Beastly Boys and Ghastly Girls and its mischievous advice on brother-sister relations:

If at any time you find it necessary to correct your brother, do not correct him with mud — never, on any account, throw mud at him, because it will spoil his clothes. It is better to scald him a little, for then you obtain desirable results. You secure his immediate attention to the lessons you are inculcating, and at the same time your hot water will have a tendency to move impurities from his person, and possibly the skin, in spots.

If your mother tells you to do a thing, it is wrong to reply that you won’t. It is better and more becoming to intimate that you will do as she bids you, and then afterward act quietly in the matter according to the dictates of your best judgment.

Good little girls always show marked deference for the aged. You ought never to ‘sass’ old people unless they ‘sass’ you first.

Originally featured in April — see more spreads, as well as the story behind the project, here.

3. THIS EXPLAINS EVERYTHING

Every year since 1998, intellectual impresario and Edge editor John Brockman has been posing a single grand question to some of our time’s greatest thinkers across a wide spectrum of disciplines, then collecting the answers in an annual anthology. Last year’s answers to the question “What scientific concept will improve everybody’s cognitive toolkit?” were released in This Will Make You Smarter: New Scientific Concepts to Improve Your Thinking, one of the year’s best psychology and philosophy books.

In 2012, the question Brockman posed, proposed by none other than Steven Pinker, was “What is your favorite deep, elegant, or beautiful explanation?” The answers, representing an eclectic mix of 192 (alas, overwhelmingly male) minds spanning psychology, quantum physics, social science, political theory, philosophy, and more, are collected in the edited compendium This Explains Everything: Deep, Beautiful, and Elegant Theories of How the World Works (UK; public library) and are also available online.

In the introduction preceding the micro-essays, Brockman frames the question and its ultimate objective, adding to history’s most timeless definitions of science:

The ideas presented on Edge are speculative; they represent the frontiers in such areas as evolutionary biology, genetics, computer science, neurophysiology, psychology, cosmology, and physics. Emerging out of these contributions is a new natural philosophy, new ways of understanding physical systems, new ways of thinking that call into question many of our basic assumptions.

[…]

Perhaps the greatest pleasure in science comes from theories that derive the solution to some deep puzzle from a small set of simple principles in a surprising way. These explanations are called ‘beautiful’ or ‘elegant.’

[…]

The contributions presented here embrace scientific thinking in the broadest sense: as the most reliable way of gaining knowledge about anything — including such fields of inquiry as philosophy, mathematics, economics, history, language, and human behavior. The common thread is that a simple and nonobvious idea is proposed as the explanation of a diverse and complicated set of phenomena.

Puffer fish with Akule by photographer Wayne Levin. Click image for details.

Stanford neuroscientist Robert Sapolsky, eloquent as ever, marvels at the wisdom of the crowd and the emergence of swarm intelligence:

Observe a single ant, and it doesn’t make much sense, walking in one direction, suddenly careening in another for no obvious reason, doubling back on itself. Thoroughly unpredictable.

The same happens with two ants, a handful of ants. But a colony of ants makes fantastic sense. Specialized jobs, efficient means of exploiting new food sources, complex underground nests with temperature regulated within a few degrees. And critically, there’s no blueprint or central source of command—each individual ant has algorithms for its behaviors. But this is not wisdom of the crowd, where a bunch of reasonably informed individuals outperform a single expert. The ants aren’t reasonably informed about the big picture. Instead, the behavior algorithms of each ant consist of a few simple rules for interacting with the local environment and local ants. And out of this emerges a highly efficient colony.

Ant colonies excel at generating trails that connect locations in the shortest possible way, accomplished with simple rules about when to lay down a pheromone trail and what to do when encountering someone else’s trail—approximations of optimal solutions to the Traveling Salesman problem. This has useful applications. In “ant-based routing,” simulations using virtual ants with similar rules can generate optimal ways of connecting the nodes in a network, something of great interest to telecommunications companies. It applies to the developing brain, which must wire up vast numbers of neurons with vaster numbers of connections without constructing millions of miles of connecting axons. And migrating fetal neurons generate an efficient solution with a different version of ant-based routing.

A wonderful example is how local rules about attraction and repulsion (i.e., positive and negative charges) allow simple molecules in an organic soup to occasionally form more complex ones. Life may have originated this way without the requirement of bolts of lightning to catalyze the formation of complex molecules.

And why is self-organization so beautiful to my atheistic self? Because if complex, adaptive systems don’t require a blueprint, they don’t require a blueprint maker. If they don’t require lightning bolts, they don’t require Someone hurtling lightning bolts.
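
(A brief aside for the technically curious: the “ant-based routing” Sapolsky mentions is easy to see in miniature. The sketch below is an illustration of the idea, not anything from the book or from his essay; it encodes his two local rules, preferring edges that carry stronger pheromone and laying down a trail whose strength is inversely proportional to the length of the completed path, on a tiny invented network. The node names, distances, and parameters are all assumptions made up for the example.)

    import random

    # Hypothetical weighted network: node -> {neighbor: distance}. All values invented.
    graph = {
        "A": {"B": 1, "C": 4},
        "B": {"A": 1, "C": 1, "D": 5},
        "C": {"A": 4, "B": 1, "D": 1},
        "D": {"B": 5, "C": 1},
    }
    pheromone = {(u, v): 1.0 for u in graph for v in graph[u]}

    def walk(start, goal):
        """One ant wanders toward the goal, picking edges in proportion to pheromone."""
        node, path = start, [start]
        while node != goal and len(path) < 20:
            options = [n for n in graph[node] if n not in path] or list(graph[node])
            weights = [pheromone[(node, n)] for n in options]
            node = random.choices(options, weights=weights)[0]
            path.append(node)
        return path

    def length(path):
        return sum(graph[a][b] for a, b in zip(path, path[1:]))

    for _ in range(300):
        path = walk("A", "D")
        if path[-1] != "D":
            continue                      # this ant got lost and lays no trail
        deposit = 1.0 / length(path)      # shorter completed paths leave stronger trails
        for a, b in zip(path, path[1:]):
            pheromone[(a, b)] += deposit
            pheromone[(b, a)] += deposit
        for edge in pheromone:            # evaporation lets early bad trails fade
            pheromone[edge] *= 0.98

    print(walk("A", "D"))  # usually the shortest route: ['A', 'B', 'C', 'D']

Run a few times, the printed route is usually the short A-B-C-D path: simple local rules plus evaporation tend to converge on the network’s shortest connection, which is the behavior Sapolsky credits to real colonies and to telecom routing simulations.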

Developmental psychologist Howard Gardner, who famously coined the seminal theory of multiple intelligences, echoes Anaïs Nin in advocating for the role of the individual and Susan Sontag in stressing the impact of individual acts on collective fate. His answer, arguing for the importance of human beings, comes as a welcome antidote to a question that suffers the danger of being inherently reductionist:

In a planet occupied now by seven billion inhabitants, I am amazed by the difference that one human being can make. Think of classical music without Mozart or Stravinsky; of painting without Caravaggio, Picasso or Pollock; of drama without Shakespeare or Beckett. Think of the incredible contributions of Michelangelo or Leonardo, or, in recent times, the outpouring of deep feeling at the death of Steve Jobs (or, for that matter, Michael Jackson or Princess Diana). Think of human values in the absence of Moses or Christ.

[…]

Despite the laudatory efforts of scientists to ferret out patterns in human behavior, I continue to be struck by the impact of single individuals, or of small groups, working against the odds. As scholars, we cannot and should not sweep these instances under the investigative rug. We should bear in mind anthropologist Margaret Mead’s famous injunction: ‘Never doubt that a small group of thoughtful committed citizens can change the world. It is the only thing that ever has.’

Uber-curator Hans Ulrich Obrist, who also contributed to last year’s volume, considers the parallel role of patterns and chance in the works of iconic composer John Cage and painter Gerhard Richter, and the role of uncertainty in the creative process:

In art, the title of a work can often be its first explanation. And in this context I am thinking especially of the titles of Gerhard Richter. In 2006, when I visited Richter in his studio in Cologne, he had just finished a group of six corresponding abstract paintings which he gave the title Cage.

There are many relations between Richter’s painting and the compositions of John Cage. In a book about the Cage series, Robert Storr has traced them from Richter’s attendance of a Cage performance at the Festum Fluxorum Fluxus in Düsseldorf in 1963 to analogies in their artistic processes. Cage has often applied chance procedures in his compositions, notably with the use of the I Ching. Richter in his abstract paintings also intentionally allows effects of chance. In these paintings, he applies the oil paint on the canvas by means of a large squeegee. He selects the colors on the squeegee, but the factual trace that the paint leaves on the canvas is to a large extent the outcome of chance.

[…]

Richter’s concise title, Cage, can be unfolded into an extensive interpretation of these abstract paintings (and of other works)—but, one can say, the short form already contains everything. The title, like an explanation of a phenomenon, unlocks the works, describing their relation to one of the most important cultural figures of the twentieth century, John Cage, who shares with Richter the great themes of chance and uncertainty.

Writer, artist, and designer Douglas Coupland, whose biography of Marshall McLuhan remains indispensable, offers a lyrical meditation on the peculiar odds behind coincidences and déjà vus:

I take comfort in the fact that there are two human moments that seem to be doled out equally and democratically within the human condition—and that there is no satisfying ultimate explanation for either. One is coincidence, the other is déjà vu. It doesn’t matter if you’re Queen Elizabeth, one of the thirty-three miners rescued in Chile, a South Korean housewife or a migrant herder in Zimbabwe—in the span of 365 days you will pretty much have two déjà vus as well as one coincidence that makes you stop and say, “Wow, that was a coincidence.”

The thing about coincidence is that when you imagine the umpteen trillions of coincidences that can happen at any given moment, the fact is, that in practice, coincidences almost never do occur. Coincidences are actually so rare that when they do occur they are, in fact memorable. This suggests to me that the universe is designed to ward off coincidence whenever possible—the universe hates coincidence—I don’t know why—it just seems to be true. So when a coincidence happens, that coincidence had to work awfully hard to escape the system. There’s a message there. What is it? Look. Look harder. Mathematicians perhaps have a theorem for this, and if they do, it might, by default be a theorem for something larger than what they think it is.

What’s both eerie and interesting to me about déjà vus is that they occur almost like metronomes throughout our lives, about one every six months, a poetic timekeeping device that, at the very least, reminds us we are alive. I can safely assume that my thirteen-year-old niece, Stephen Hawking, and someone working in a Beijing luggage-making factory each experience two déjà vus a year. Not one. Not three. Two.

The underlying biodynamics of déjà vus is probably ascribable to some sort of tingling neurons in a certain part of the brain, yet this doesn’t tell us why they exist. They seem to me to be a signal from a larger point of view that wants to remind us that our lives are distinct, that they have meaning, and that they occur throughout a span of time. We are important, and what makes us valuable to the universe is our sentience and our curse and blessing of perpetual self-awareness.

Originally featured in January — read more here.

4. TIME WARPED

Given my soft spot for famous diaries, it should come as no surprise that I keep one myself. Perhaps the greatest gift of the practice has been the daily habit of reading what I had written on that day a year earlier; not only is it a remarkable tool of introspection and self-awareness, but it also illustrates that our memory “is never a precise duplicate of the original [but] a continuing act of creation” and how flawed our perception of time is — almost everything that occurred a year ago appears as having taken place either significantly further in the past (“a different lifetime,” I’d often marvel at this time-illusion) or significantly more recently (“this feels like just last month!”). Rather than a personal deficiency of those of us befallen by this tendency, however, it turns out to be a defining feature of how the human mind works, the science of which is at first unsettling, then strangely comforting, and altogether intensely interesting.

That’s precisely what acclaimed BBC broadcaster and psychology writer Claudia Hammond explores in Time Warped: Unlocking the Mysteries of Time Perception (public library) — a fascinating foray into the idea that our experience of time is actively created by our own minds and into how these sensations of what neuroscientists and psychologists call “mind time” arise; it is also among the year’s best psychology books. As disorienting as the concept might seem — after all, we’ve been nursed on the belief that time is one of those few utterly reliable and objective things in life — it is also strangely empowering to think that the very phenomenon depicted as the unforgiving dictator of life is something we might be able to shape and benefit from. Hammond writes:

We construct the experience of time in our minds, so it follows that we are able to change the elements we find troubling — whether it’s trying to stop the years racing past, or speeding up time when we’re stuck in a queue, trying to live more in the present, or working out how long ago we last saw our old friends. Time can be a friend, but it can also be an enemy. The trick is to harness it, whether at home, at work, or even in social policy, and to work in line with our conception of time. Time perception matters because it is the experience of time that roots us in our mental reality. Time is not only at the heart of the way we organize life, but the way we experience it.

Discus chronologicus, a depiction of time by German engraver Christoph Weigel, published in the early 1720s; from Cartographies of Time. (Click for details)

Among the most intriguing illustrations of “mind time” is the incredible elasticity of how we experience time. (“Where is it, this present?” William James famously wondered. “It has melted in our grasp, fled ere we could touch it, gone in the instant of becoming.”) For instance, Hammond points out, we slow time down when gripped by mortal fear — the cliché about the slow-motion car crash is, in fact, a cognitive reality. This plays out even in situations that aren’t life-or-death per se but are still associated with strong feelings of fear. Hammond points to a study in which people with arachnophobia were asked to look at spiders — the very object of their intense fear — for 45 seconds, and they overestimated the elapsed time. The same pattern was observed in novice skydivers, who estimated the duration of their peers’ falls as short, whereas their own, from the same altitude, were deemed longer.

Inversely, time seems to speed up as we get older — a phenomenon that competing theories have attempted to explain. One, known as the “proportionality theory,” uses pure mathematics, holding that a year feels faster when you’re 40 than when you’re 8 because it only constitutes one fortieth of your life rather than a whole eighth. Among its famous proponents are Vladimir Nabokov and William James. But Hammond remains unconvinced:

The problem with the proportionality theory is that it fails to account for the way we experience time at any one moment. We don’t judge one day in the context of our whole lives. If we did, then for a 40-year-old every single day should flash by because it is less than one fourteen-thousandth of the life they’ve had so far. It should be fleeting and inconsequential, yet if you have nothing to do or an enforced wait at an airport, for example, a day at 40 can still feel long and boring and surely longer than a fun day at the seaside packed with adventure for a child. … It ignores attention and emotion, which … can have a considerable impact on time perception.

Another theory suggests that perhaps it is the tempo of life in general that has accelerated, making things from the past appear as slower, including the passage of time itself.

But one definite change does take place with age: As we grow older, we tend to feel like the previous decade elapsed more rapidly, while the earlier decades of our lives seem to have lasted longer. Similarly, we tend to think of events that took place in the past 10 years as having happened more recently than they actually did. (Quick: What year did the devastating Japanese tsunami hit? When did we lose Maurice Sendak?) Conversely, we perceive events that took place more than a decade ago as having happened even longer ago. (When did Princess Diana die? What year was the Chernobyl disaster?) This, Hammond points out, is known as “forward telescoping”:

It is as though time has been compressed and — as if looking through a telescope — things seem closer than they really are. The opposite is called backward or reverse telescoping, also known as time expansion. This is when you guess that events happened longer ago than they really did. This is rare for distant events, but not uncommon for recent weeks.

[…]

The most straightforward explanation for it is called the clarity of memory hypothesis, proposed by the psychologist Norman Bradburn in 1987. This is the simple idea that because we know that memories fade over time, we use the clarity of a memory as a guide to its recency. So if a memory seems unclear we assume it happened longer ago.

Originally featured in July, with a deeper dive into the psychology of why time slows down when we’re afraid, speeds up as we age, and gets warped when we’re on vacation.

5. SELF-PORTRAIT AS YOUR TRAITOR

“Still this childish fascination with my handwriting,” young Susan Sontag wrote in her diary in 1949. “To think that I always have this sensuous potentiality glowing within my fingers.” This is the sort of sensuous potentiality that comes aglow in Self-Portrait as Your Traitor (public library) — the magnificent collection of hand-lettered poems and illustrated essays by friend-of-Brain-Pickings and frequent contributor Debbie Millman. In the introduction, design legend Paula Scher aptly describes this singular visual form as a “21st-century illuminated manuscript.” Personal bias aside, these moving, lovingly crafted poems and essays — some handwritten, some drawn with colored pencils, some typeset in felt on felt — vibrate at that fertile intersection of the deeply personal and the universally profound.

In “Fail Safe,” her widely read essay-turned-commencement-address on creative courage and embracing the unknown from the 2009 anthology Look Both Ways, Millman wrote:

John Maeda once explained, “The computer will do anything within its abilities, but it will do nothing unless commanded to do so.” I think people are the same — we like to operate within our abilities. But whereas the computer has a fixed code, our abilities are limited only by our perceptions. Two decades since determining my code, and after 15 years of working in the world of branding, I am now in the process of rewriting the possibilities of what comes next. I don’t know exactly what I will become; it is not something I can describe scientifically or artistically. Perhaps it is a “code in progress.”

Self-Portrait as Your Traitor, a glorious large-format tome full of textured colors to which the screen does absolutely no justice, is the result of this progress — a brave and heartening embodiment of what it truly means, as Rilke put it, to live the questions; the stunning record of one woman’s personal and artistic code-rewriting, brimming with wisdom on life and art for all.

Originally featured in November. See an exclusive excerpt here, then take a peek at Debbie’s creative process here.

6. SUSAN SONTAG: THE COMPLETE ROLLING STONE INTERVIEW

In 1978, Rolling Stone contributing editor Jonathan Cott interviewed Susan Sontag in twelve hours of conversation, beginning in Paris and continuing in New York, only a third of which was published in the magazine. More than three decades later and almost a decade after Sontag’s death, the full, wide-ranging magnificence of their tête-à-tête, spanning literature, philosophy, illness, mental health, music, art, and much more, is at last released in Susan Sontag: The Complete Rolling Stone Interview (public library) — a rare glimpse of one of modern history’s greatest minds in her element.

Cott marvels at what made the dialogue especially extraordinary:

Unlike almost any other person whom I’ve ever interviewed — the pianist Glenn Gould is the one other exception — Susan spoke not in sentences but in measured and expansive paragraphs. And what seemed most striking to me was the exactitude and “moral and linguistic fine-tuning” — as she once described Henry James’s writing style — with which she framed and elaborated her thoughts, precisely calibrating her intended meanings with parenthetical remarks and qualifying words (“sometimes,” “occasionally,” “usually,” “for the most part,” “in almost all cases”), the munificence and fluency of her conversation manifesting what the French refer to as an ivresse du discours — an inebriation with the spoken word. “I am hooked on talk as a creative dialogue,” she once remarked in her journals, and added: “For me, it’s the principal medium of my salvation.”

In one segment of the conversation, Sontag discusses how the false divide between “high” and pop culture impoverishes our lives. In another, she makes a beautiful case for the value of history:

I really believe in history, and that’s something people don’t believe in anymore. I know that what we do and think is a historical creation. I have very few beliefs, but this is certainly a real belief: that most everything we think of as natural is historical and has roots — specifically in the late eighteenth and early nineteenth centuries, the so-called Romantic revolutionary period — and we’re essentially still dealing with expectations and feelings that were formulated at that time, like ideas about happiness, individuality, radical social change, and pleasure. We were given a vocabulary that came into existence at a particular historical moment. So when I go to a Patti Smith concert at CBGB, I enjoy, participate, appreciate, and am tuned in better because I’ve read Nietzsche.

In another meditation, she argues for the existential and creative value of presence:

What I want is to be fully present in my life — to be really where you are, contemporary with yourself in your life, giving full attention to the world, which includes you. You are not the world, the world is not identical to you, but you’re in it and paying attention to it. That’s what a writer does — a writer pays attention to the world. Because I’m very against this solipsistic notion that you find it all in your head. You don’t, there really is a world that’s there whether you’re in it or not.

In another passage, she considers how taking responsibility empowers rather than disempowers us:

I want to feel as responsible as I possibly can. As I told you before, I hate feeling like a victim, which not only gives me no pleasure but also makes me feel very uncomfortable. Insofar as it’s possible, and not crazy, I want to enlarge to the furthest extent possible my sense of my own autonomy, so that in friendship and love relationships I’m eager to take responsibility for both the good and the bad things. I don’t want this attitude of “I was so wonderful and that person did me in.” Even when it’s sometimes true, I’ve managed to convince myself that I was at least co-responsible for bad things that have happened to me, because it actually makes me feel stronger and makes me feel that things could perhaps be different.

The conversation, in which Sontag reaches unprecedented depths of self-revelation, also debunks some misconceptions about her public image as an intellectual in the dry, scholarly sense of the term:

Most of what I do, contrary to what people think, is so intuitive and unpremeditated and not at all that kind of cerebral, calculating thing people imagine it to be. I’m just following my instincts and intuitions. […] An argument appears to me much more like the spokes of a wheel than the links of a chain.

In one of her most poignant insights, Sontag admonishes against our culture’s dangerous polarities and imprisoning stereotypes:

A lot of our ideas about what we can do at different ages and what age means are so arbitrary — as arbitrary as sexual stereotypes. I think that the young-old polarization and the male-female polarization are perhaps the two leading stereotypes that imprison people. The values associated with youth and with masculinity are considered to be the human norms, and anything else is taken to be at least less worthwhile or inferior. Old people have a terrific sense of inferiority. They’re embarrassed to be old. What you can do when you’re young and what you can do when you’re old is as arbitrary and without much basis as what you can do if you’re a woman or what you can do if you’re a man.

Originally featured in November — take a closer look here and here.

7. LETTERS OF NOTE

As a hopeless lover of letters, I was thrilled by the release of Letters of Note: Correspondence Deserving of a Wider Audience (public library) — the aptly titled, superb collection based on Shaun Usher’s indispensable website of the same name, which stands as a heartening exemplar of independent online scholarship and journalism at the intersection of the editorial and the curatorial, and features timeless treasures from such diverse icons and Brain Pickings favorites as E. B. White, Virginia Woolf, Ursula Nordstrom, Nick Cave, Ray Bradbury, Amelia Earhart, Galileo Galilei, and more.

One of the most beautiful letters in the collection comes from Hunter S. Thompson — gonzo journalism godfather, pundit of media politics, dark philosopher. The letter, which Thompson sent to his friend Hume Logan in 1958, makes for an exquisite addition to luminaries’ reflections on the meaning of life, speaking to what it really means to find your purpose.

Cautious that “all advice can only be a product of the man who gives it” — a caveat other literary legends have stressed with varying degrees of irreverence — Thompson begins with a necessary disclaimer about the very notion of advice-giving:

To give advice to a man who asks what to do with his life implies something very close to egomania. To presume to point a man to the right and ultimate goal — to point with a trembling finger in the RIGHT direction is something only a fool would take upon himself.

And yet he honors his friend’s request, turning to Shakespeare for an anchor of his own advice:

“To be, or not to be: that is the question: Whether ’tis nobler in the mind to suffer the slings and arrows of outrageous fortune, or to take arms against a sea of troubles…”

And indeed, that IS the question: whether to float with the tide, or to swim for a goal. It is a choice we must all make consciously or unconsciously at one time in our lives. So few people understand this! Think of any decision you’ve ever made which had a bearing on your future: I may be wrong, but I don’t see how it could have been anything but a choice — however indirect — between the two things I’ve mentioned: the floating or the swimming.

He acknowledges the obvious question of why not take the path of least resistance and float aimlessly, then counters it:

The answer — and, in a sense, the tragedy of life — is that we seek to understand the goal and not the man. We set up a goal which demands of us certain things: and we do these things. We adjust to the demands of a concept which CANNOT be valid. When you were young, let us say that you wanted to be a fireman. I feel reasonably safe in saying that you no longer want to be a fireman. Why? Because your perspective has changed. It’s not the fireman who has changed, but you.

Touching on the same notion that William Gibson termed “personal micro-culture,” Austin Kleon captured in asserting that “you are the mashup of what you let into your life,” and Paula Scher articulated so succinctly in speaking of the combinatorial nature of our creativity, Thompson writes:

Every man is the sum total of his reactions to experience. As your experiences differ and multiply, you become a different man, and hence your perspective changes. This goes on and on. Every reaction is a learning process; every significant experience alters your perspective.

So it would seem foolish, would it not, to adjust our lives to the demands of a goal we see from a different angle every day? How could we ever hope to accomplish anything other than galloping neurosis?

The answer, then, must not deal with goals at all, or not with tangible goals, anyway. It would take reams of paper to develop this subject to fulfillment. God only knows how many books have been written on “the meaning of man” and that sort of thing, and god only knows how many people have pondered the subject. (I use the term “god only knows” purely as an expression.)* There’s very little sense in my trying to give it up to you in the proverbial nutshell, because I’m the first to admit my absolute lack of qualifications for reducing the meaning of life to one or two paragraphs.

Resolving to steer clear of the word “existentialism,” Thompson nonetheless strongly urges his friend to read Sartre’s Being and Nothingness and the anthology Existentialism: From Dostoyevsky to Sartre, then admonishes against succumbing to faulty definitions of success at the expense of finding one’s own purpose:

To put our faith in tangible goals would seem to be, at best, unwise. So we do not strive to be firemen, we do not strive to be bankers, nor policemen, nor doctors. WE STRIVE TO BE OURSELVES.

But don’t misunderstand me. I don’t mean that we can’t BE firemen, bankers, or doctors—but that we must make the goal conform to the individual, rather than make the individual conform to the goal. In every man, heredity and environment have combined to produce a creature of certain abilities and desires—including a deeply ingrained need to function in such a way that his life will be MEANINGFUL. A man has to BE something; he has to matter.

As I see it then, the formula runs something like this: a man must choose a path which will let his ABILITIES function at maximum efficiency toward the gratification of his DESIRES. In doing this, he is fulfilling a need (giving himself identity by functioning in a set pattern toward a set goal), he avoids frustrating his potential (choosing a path which puts no limit on his self-development), and he avoids the terror of seeing his goal wilt or lose its charm as he draws closer to it (rather than bending himself to meet the demands of that which he seeks, he has bent his goal to conform to his own abilities and desires).

In short, he has not dedicated his life to reaching a pre-defined goal, but he has rather chosen a way of life he KNOWS he will enjoy. The goal is absolutely secondary: it is the functioning toward the goal which is important. And it seems almost ridiculous to say that a man MUST function in a pattern of his own choosing; for to let another man define your own goals is to give up one of the most meaningful aspects of life — the definitive act of will which makes a man an individual.

Noting that his friend had thus far lived “a vertical rather than horizontal existence,” Thompson acknowledges the challenge of this choice but admonishes that, however difficult, the choice must be made or else it melts away into the default modes of society:

A man who procrastinates in his CHOOSING will inevitably have his choice made for him by circumstance. So if you now number yourself among the disenchanted, then you have no choice but to accept things as they are, or to seriously seek something else. But beware of looking for goals: look for a way of life. Decide how you want to live and then see what you can do to make a living WITHIN that way of life. But you say, “I don’t know where to look; I don’t know what to look for.”

And there’s the crux. Is it worth giving up what I have to look for something better? I don’t know — is it? Who can make that decision but you? But even by DECIDING TO LOOK, you go a long way toward making the choice.

He ends by returning to his original disclaimer, reiterating that, rather than a prescription for living, his “advice” is merely a reminder that how and what we choose — choices we’re in danger of forgetting even exist — shapes the course and experience of our lives:

I’m not trying to send you out “on the road” in search of Valhalla, but merely pointing out that it is not necessary to accept the choices handed down to you by life as you know it. There is more to it than that — no one HAS to do something he doesn’t want to do for the rest of his life.

Originally featured in November.

8. INTUITION PUMPS

“If you are not making mistakes, you’re not taking enough risks,” Debbie Millman counseled. “Make New Mistakes. Make glorious, amazing mistakes. Make mistakes nobody’s ever made before,” Neil Gaiman advised young creators. In Intuition Pumps And Other Tools for Thinking (public library), also among the year’s best psychology books, the inimitable Daniel Dennett, one of our greatest living philosophers, offers a set of thinking tools — “handy prosthetic imagination-extenders and focus holders” — that allow us to “think reliably and even gracefully about really hard questions” and enhance our cognitive toolkit. He calls these tools “intuition pumps” — thought experiments designed to stir “a heartfelt, table-thumping intuition” (which we know is a pillar of even the most “rational” of science) about the question at hand, a kind of persuasion tool whose reverse-engineering enables us to think better about thinking itself. Intuition, of course, is a domain-specific ability that relies on honed critical thinking rather than a mystical quality bestowed by the gods — but that’s precisely Dennett’s point, and his task is to help us hone it.

Though most of his 77 “intuition pumps” address concrete questions, a dozen are “general-purpose” tools that apply deeply and widely, across just about any domain of thinking. The first of them is also arguably the most useful yet most uncomfortable: making mistakes.

Echoing Dorion Sagan’s case for why science and philosophy need each other, Dennett begins with an astute contribution to the best definitions of philosophy, wrapped in a necessary admonition about the value of history:

The history of philosophy is in large measure the history of very smart people making very tempting mistakes, and if you don’t know the history, you are doomed to making the same darn mistakes all over again. … There is no such thing as philosophy-free science, just science that has been conducted without any consideration of its underlying philosophical assumptions.

He speaks to the generative potential of mistakes and their usefulness as an empirical tool:

Sometimes you don’t just want to risk making mistakes; you actually want to make them — if only to give you something clear and detailed to fix.

Therein lies the power of mistakes as a vehicle for, as Rilke famously put it, “living the questions” and thus advancing knowledge in a way that certainty cannot — for, as Richard Feynman memorably noted, the scientist’s job is to remain unsure, and so seems the philosopher’s. Dennett writes:

We philosophers are mistake specialists. … While other disciplines specialize in getting the right answers to their defining questions, we philosophers specialize in all the ways there are of getting things so mixed up, so deeply wrong, that nobody is even sure what the right questions are, let alone the answers. Asking the wrong questions risks setting any inquiry off on the wrong foot. Whenever that happens, this is a job for philosophers! Philosophy — in every field of inquiry — is what you have to do until you figure out what questions you should have been asking in the first place.

[…]

Mistakes are not just opportunities for learning; they are, in an important sense, the only opportunity for learning or making something truly new. Before there can be learning, there must be learners. There are only two non-miraculous ways for learners to come into existence: they must either evolve or be designed and built by learners that evolved. Biological evolution proceeds by a grand, inexorable process of trial and error — and without the errors the trials wouldn’t accomplish anything.

Dennett offers a caveat that at once highlights the importance of acquiring knowledge and reminds us of the power of “chance-opportunism”:

Trials can be either blind or foresighted. You, who know a lot, but not the answer to the question at hand, can take leaps — foresighted leaps. You can look before you leap, and hence be somewhat guided from the outset by what you already know. You need not be guessing at random, but don’t look down your nose at random guesses; among its wonderful products is … you!

And since evolution is the highest epitome of how the process of trial and error drives progress, Dennett makes a case for understanding evolution as a key to understanding everything else we humans value:

Evolution … is the central, enabling process not only of life but also of knowledge and learning and understanding. If you attempt to make sense of the world of ideas and meanings, free will and morality, art and science and even philosophy itself without a sound and quite detailed knowledge of evolution, you have one hand tied behind your back. … For evolution, which knows nothing, the steps into novelty are blindly taken by mutations, which are random copying “errors” in DNA.

Dennett echoes Dostoyevsky (“Above all, don’t lie to yourself. The man who lies to himself and listens to his own lie comes to a point that he cannot distinguish the truth within him, or around him, and so loses all respect for himself and for others.”) and offers the key to making productive mistakes:

The chief trick to making good mistakes is not to hide them — especially not from yourself. Instead of turning away in denial when you make a mistake, you should become a connoisseur of your own mistakes, turning them over in your mind as if they were works of art, which in a way they are. … The trick is to take advantage of the particular details of the mess you’ve made, so that your next attempt will be informed by it and not just another blind stab in the dark.

We have all heard the forlorn refrain “Well, it seemed like a good idea at the time!” This phrase has come to stand for the rueful reflection of an idiot, a sign of stupidity, but in fact we should appreciate it as a pillar of wisdom. Any being, any agent, who can truly say, “Well, it seemed like a good idea at the time!” is standing on the threshold of brilliance.

Originally featured in May — read the full article here.

9. LOST CAT

“Dogs are not about something else. Dogs are about dogs,” Malcolm Gladwell asserted indignantly in the introduction to The Big New Yorker Book of Dogs. Though hailed as memetic rulers of the internet, cats have also enjoyed a long history as artistic and literary muses, but never have they been at once more about cats and more about something else than in Lost Cat: A True Story of Love, Desperation, and GPS Technology (public library) by firefighter-turned-writer Caroline Paul and illustrator extraordinaire Wendy MacNaughton, she of many wonderful collaborations — a tender, imaginative memoir, among the year’s best biographies and memoirs and best books on pets and animals, infused with equal parts humor and humanity. Though “about” a cat, this heartwarming and heartbreaking tale is really about what it means to be human — about the osmosis of hollowing loneliness and profound attachment, the oscillation between boundless affection and paralyzing fear of abandonment, the unfair promise of loss implicit to every possibility of love.

After Caroline crashes an experimental plane she is piloting, she finds herself severely injured and spiraling into the depths of depression. It both helps and doesn’t that Caroline and Wendy have just fallen in love, soaring in the butterfly heights of new romance, “the phase of love that didn’t obey any known rules of physics,” until the crash pulls them into a place that would challenge even the most seasoned and grounded of relationships. And yet they persevere as Wendy patiently and lovingly takes care of Caroline.

When Caroline returns from the hospital with a shattered ankle, her two thirteen-year-old tabbies — the shy, anxious Tibby (short for Tibia, affectionately — and, in these circumstances, ironically — named after the shinbone) and the sociable, amicable Fibby (short for Fibula, after the calf bone on the lateral side of the tibia) — are, short of Wendy, her only joy and comfort:

Tibia and Fibula meowed happily when I arrived. They were undaunted by my ensuing stupor. In fact they were delighted; suddenly I had become a human who didn’t shout into a small rectangle of lights and plastic in her hand, peer at a computer, or get up and disappear from the vicinity, only to reappear through the front door hours later. Instead, I was completely available to them at all times. Amazed by their good luck, they took full feline advantage. They asked for ear scratches and chin rubs. They rubbed their whiskers along my face. They purred in response to my slurred, affectionate baby talk. But mostly they just settled in and went to sleep. Fibby snored into my neck. Tibby snored on the rug nearby. Meanwhile I lay awake, circling the deep dark hole of depression.

Without my cats, I would have fallen right in.

And then, one day, Tibby disappears.

Wendy and Caroline proceed to flyer the neighborhood, visit every animal shelter in the vicinity, and even, in their desperation, enlist the help of a psychic who specializes in lost pets — but to no avail. Heartbroken, they begin to mourn Tibby’s loss.

And then, one day five weeks later, Tibby reappears. But once the initial elation of the recovery has worn off, Caroline begins to wonder where he’d been and why he’d left. He no longer eats at home and regularly leaves the house for extended periods of time — Tibby clearly has a secret place he now returns to. Even more worrisomely, he’s no longer the shy, anxious tabby he’d been for thirteen years — instead, he’s a half pound heavier, chirpy, with “a youthful spring in his step.” But why would a happy cat abandon his loving lifelong companion and find comfort — find himself, even — elsewhere?

When the relief that my cat was safe began to fade, and the joy of his prone, snoring form — sprawled like an athlete after a celebratory night of boozing — started to wear thin, I was left with darker emotions. Confusion. Jealousy. Betrayal. I thought I’d known my cat of thirteen years. But that cat had been anxious and shy. This cat was a swashbuckling adventurer back from the high seas. What siren call could have lured him away? Was he still going to this gilded place, with its overflowing food bowls and endless treats?

There is only one obvious thing left to do: Track Tibby on his escapades. So Caroline, despite Wendy’s lovingly suppressed skepticism, heads to a spy store — yes, those exist — and purchases a real-time GPS tracker, complete with a camera that they program to take snapshots every few minutes, which they then attach to Tibby’s collar.

What follows is a wild, hilarious, and sweet tale of tinkering, tracking, and tenderness. Underpinning the obsessive quest is the subtle yet palpable subplot of Wendy and Caroline’s growing love for each other, the deepening of trust and affection that happens when two people share in a special kind of insanity.

“Every quest is a journey, every journey a story. Every story, in turn, has a moral,” writes Caroline in the final chapter, then offers several “possible morals” for the story, the last of which embody everything that makes Lost Cat an absolute treat from cover to cover:

6. You can never know your cat. In fact, you can never know anyone as completely as you want.

7. But that’s okay, love is better.

Take a closer look here, then hear MacNaughton and Paul in conversation about combining creative collaboration with a romantic relationship.

10. CODEX SERAPHINIANUS

In 1976, Italian artist, architect, and designer Luigi Serafini, only 27 at the time, set out to create an elaborate encyclopedia of imaginary objects and creatures that fell somewhere between Edward Gorey’s cryptic alphabets, Albertus Seba’s cabinet of curiosities, the book of surrealist games, and Alice in Wonderland. What’s more, it wasn’t written in any ordinary language but in an unintelligible alphabet that appeared to be a conlang — an undertaking so complex it constitutes one of the highest feats of cryptography. It took him nearly three years to complete the project, and three more to publish it, but when it was finally released, the book — a weird and wonderful masterpiece of art and philosophical provocation on the precipice of the information age — attracted a growing following that continued to gather momentum even as the original edition went out of print.

Now, for the first time in more than thirty years, Codex Seraphinianus (public library) is resurrected in a lavish new edition by Rizzoli — who have a penchant for excavating forgotten gems — featuring a new chapter by Serafini, now in his 60s, and a gorgeous signed print with each deluxe tome. Besides being a visual masterwork, it’s also a timeless meditation on what “reality” really is, one all the timelier in today’s age of such seemingly surrealist feats as bioengineering whole new lifeforms, hurling subatomic particles at each other at nearly the speed of light, and encoding an entire book onto a DNA molecule.

In an interview for Wired Italy, Serafini aptly captures the subtle similarity to children’s books in how the Codex bewitches our grown-up fancy with its bizarre beauty:

What I want my alphabet to convey to the reader is the sensation that children feel in front of books they cannot yet understand. I used it to describe analytically an imaginary world and give a coherent framework. The images originate from the clash between this fantasy vocabulary and the real world. … The Codex became so popular because it makes you feel more comfortable with your fantasies. Another world is not possible, but a fantasy one maybe is.

Playfully addressing the book’s towering price point, Serafini makes a more serious point about how it bespeaks the seductive selectiveness of our attention:

The [new] edition is very rich and also pricey, I know, but it’s just like psychoanalysis: Money matters and the fee is part of the process of healing. At the end of the day, the Codex is similar to the Rorschach inkblot test. You see what you want to see. You might think it’s speaking to you, but it’s just your imagination.

Originally featured in October — see more here.

11. MY BROTHER’S BOOK

For those of us who loved legendary children’s book author Maurice Sendak — famed creator of wild things, little-known illustrator of velveteen rabbits, infinitely warm heart, infinitely witty mind — his death in 2012 was one of the year’s greatest heartaches. Now, half a century after his iconic Where the Wild Things Are, comes My Brother’s Book (public library), one of the year’s best children’s books — a bittersweet posthumous farewell to the world, illustrated in vibrant, dreamsome watercolors and written in verse inspired by some of Sendak’s lifelong influences: Shakespeare, Blake, Keats, and the music of Mozart. In fact, a foreword by Shakespeare scholar Stephen Greenblatt reveals the book is based on the Bard’s “The Winter’s Tale.”

It tells the story of two brothers, Jack and Guy, torn asunder when a falling star crashes onto Earth. Though on the surface about the beloved author’s own brother Jack, who died 18 years ago, the story is also about the love of Sendak’s life and his partner of fifty years, psychoanalyst Eugene Glynn, whose prolonged illness and eventual loss in 2007 devastated Sendak — the character of Guy reads like a poetic fusion of Sendak and Glynn. And while the story might be a universal “love letter to those who have gone before,” as NPR’s Renée Montagne suggests in Morning Edition, it is in equal measure a private love letter to Glynn. (Sendak passed away the day before President Obama announced his support for same-sex marriage, but Sendak fans were quick to honor both historic moments with a bittersweet homage.)

Indeed, the theme of all-consuming love manifests viscerally in Sendak’s books. Playwright Tony Kushner, a longtime close friend of Sendak’s and one of his most heartfelt mourners, tells NPR:

There’s a lot of consuming and devouring and eating in Maurice’s books. And I think that when people play with kids, there’s a lot of fake ferocity and threats of, you know, devouring — because love is so enormous, the only thing you can think of doing is swallowing the person that you love entirely.

My Brother’s Book ends on a soul-stirring note, tender and poignant in its posthumous light:

And Jack slept safe
Enfolded in his brother’s arms
And Guy whispered ‘Good night
And you will dream of me.’

Originally featured in February.

12. EIGHTY DAYS

“Anything one man can imagine, other men can make real,” science fiction godfather Jules Verne famously proclaimed. He was right about the general sentiment but oh how very wrong about its gendered language: Sixteen years after Verne’s classic novel Around the World in Eighty Days, his vision for speed-circumnavigation would be made real — but by a woman. On the morning of November 14, 1889, Nellie Bly, an audacious newspaper reporter, set out to outpace Verne’s fictional itinerary by circumnavigating the globe in seventy-five days, thus setting the real-world record for the fastest trip around the world. In Eighty Days: Nellie Bly and Elizabeth Bisland’s History-Making Race Around the World (public library), also among the year’s best history books, Matthew Goodman traces the groundbreaking adventure, beginning with a backdrop of Bly’s remarkable journalistic fortitude and contribution to defying our stubbornly enduring biases about women writers:

No female reporter before her had ever seemed quite so audacious, so willing to risk personal safety in pursuit of a story. In her first exposé for The World, Bly had gone undercover … feigning insanity so that she might report firsthand on the mistreatment of the female patients of the Blackwell’s Island Insane Asylum. … Bly trained with the boxing champion John L. Sullivan; she performed, with cheerfulness but not much success, as a chorus girl at the Academy of Music (forgetting the cue to exit, she momentarily found herself all alone onstage). She visited with a remarkable deaf, dumb, and blind nine-year-old girl in Boston by the name of Helen Keller. Once, to expose the workings of New York’s white slave trade, she even bought a baby. Her articles were by turns lighthearted and scolding and indignant, some meant to edify and some merely to entertain, but all were shot through with Bly’s unmistakable passion for a good story and her uncanny ability to capture the public’s imagination, the sheer force of her personality demanding that attention be paid to the plight of the unfortunate, and, not incidentally, to herself.

For all her extraordinary talent and work ethic, Bly’s appearance was decidedly unremarkable — a fact that shouldn’t matter, but one that would be repeatedly remarked upon by her critics and commentators, something we’ve made sad little progress on in discussing women’s professional, intellectual, and creative merit more than a century later. Goodman paints a portrait of Bly:

She was a young woman in a plaid coat and cap, neither tall nor short, dark nor fair, not quite pretty enough to turn a head: the sort of woman who could, if necessary, lose herself in a crowd.

[…]

Her voice rang with the lilt of the hill towns of western Pennsylvania; there was an unusual rising inflection at the ends of her sentences, the vestige of an Elizabethan dialect that had still been spoken in the hills when she was a girl. She had piercing gray eyes, though sometimes they were called green, or blue-green, or hazel. Her nose was broad at its base and delicately upturned at the end — the papers liked to refer to it as a “retroussé” nose — and it was the only feature about which she was at all self-conscious. She had brown hair that she wore in bangs across her forehead. Most of those who knew her considered her pretty, although this was a subject that in the coming months would be hotly debated in the press.

But, as if the ambitious adventure weren’t scintillating enough, the story takes an unexpected turn: That fateful November morning, as Bly was making her way to the journey’s outset at the Hoboken docks, a man named John Brisben Walker passed her on a ferry in the opposite direction, traveling from Jersey City to Lower Manhattan. He was the publisher of a high-brow magazine titled The Cosmopolitan, the same publication that decades later, under the new ownership of William Randolph Hearst, would take a dive for the commercially low-brow. On his ferry ride, Walker skimmed that morning’s edition of The World and paused over the front-page feature announcing Bly’s planned adventure around the world. A seasoned manipulator of the public’s voracious appetite for drama, he instantly birthed an idea that would seize upon a unique publicity opportunity — The Cosmopolitan would send another circumnavigator to race against Bly. To keep things equal, it would have to be a woman. To keep them interesting, she’d travel in the opposite direction.

And so it went:

Elizabeth Bisland was twenty-eight years old, and after nearly a decade of freelance writing she had recently obtained a job as literary editor of The Cosmopolitan, for which she wrote a monthly review of recently published books entitled “In the Library.” Born into a Louisiana plantation family ruined by the Civil War and its aftermath, at the age of twenty she had moved to New Orleans and then, a few years later, to New York, where she contributed to a variety of magazines and was regularly referred to as the most beautiful woman in metropolitan journalism. Bisland was tall, with an elegant, almost imperious bearing that accentuated her height; she had large dark eyes and luminous pale skin and spoke in a low, gentle voice. She reveled in gracious hospitality and smart conversation, both of which were regularly on display in the literary salon that she hosted in the little apartment she shared with her sister on Fourth Avenue, where members of New York’s creative set, writers and painters and actors, gathered to discuss the artistic issues of the day. Bisland’s particular combination of beauty, charm, and erudition seems to have been nothing short of bewitching.

But Bisland was no mere literary bombshell. Wary of beauty’s fleeting and superficial nature — she once lamented, “After the period of sex-attraction has passed, women have no power in America” — she blended Edison’s circadian relentlessness and Tchaikovsky’s work ethic:

She took pride in the fact that she had arrived in New York with only fifty dollars in her pocket, and that the thousands of dollars now in her bank account had come by virtue of her own pen. Capable of working for eighteen hours at a stretch, she wrote book reviews, essays, feature articles, and poetry in the classical vein. She was a believer, more than anything else, in the joys of literature, which she had first experienced as a girl in ancient volumes of Shakespeare and Cervantes that she found in the library of her family’s plantation house. (She taught herself French while she churned butter, so that she might read Rousseau’s Confessions in the original — a book, as it turned out, that she hated.) She cared nothing for fame, and indeed found the prospect of it distasteful.

And yet, despite their competitive circumstances and seemingly divergent dispositions, something greater bound the two women together, some ineffable force of culture that quietly united them in a bold defiance of their era’s normative biases:

On the surface the two women … were about as different as could be: one woman a Northerner, the other from the South; one a scrappy, hard-driving crusader, the other priding herself on her gentility; one seeking out the most sensational of news stories, the other preferring novels and poetry and disdaining much newspaper writing as “a wild, crooked, shrieking hodge-podge,” a “caricature of life.” Elizabeth Bisland hosted tea parties; Nellie Bly was known to frequent O’Rourke’s saloon on the Bowery. But each of them was acutely conscious of the unequal position of women in America. Each had grown up without much money and had come to New York to make a place for herself in big-city journalism, achieving a hard-won success in what was still, unquestionably, a man’s world.

Originally featured in May — read the full article, including Bly’s entertaining illustrated packing list, here.

13. DON’T GO BACK TO SCHOOL

“The present education system is the trampling of the herd,” legendary architect Frank Lloyd Wright lamented in 1956. Half a century later, I started Brain Pickings in large part out of frustration and disappointment with my trampling experience of our culturally fetishized “Ivy League education.” I found myself intellectually and creatively unstimulated by the industrialized model of the large lecture hall, the PowerPoint presentations, the standardized tests assessing my rote memorization of facts rather than my ability to transmute that factual knowledge into a pattern-recognition mechanism that connects different disciplines to cultivate wisdom about how the world works and a moral lens on how it should work. So Brain Pickings became the record of my alternative learning, of that cross-disciplinary curiosity that took me from art to psychology to history to science, by way of the myriad pieces of knowledge I discovered — and connected — on my own. I didn’t live up to the entrepreneurial ideal of the college drop-out and begrudgingly graduated “with honors,” but refused to go to my own graduation and decided never to go back to school. Years later, I’ve learned more in the course of writing and researching the thousands of articles to date than in all the years of my formal education combined.

So, in 2012, when I found out that writer Kio Stark was crowdfunding a book that would serve as a manifesto for learning outside formal education, I eagerly chipped in. Now, Don’t Go Back to School: A Handbook for Learning Anything is out, and it is everything I could’ve wished for when I was in college: an essential piece of cultural literacy, an at once tantalizing and practically grounded assurance that success doesn’t lie at the end of a single highway but is sprinkled along a thousand alternative paths. Stark describes it as “a radical project, the opposite of reform … not about fixing school [but] about transforming learning — and making traditional school one among many options rather than the only option.” Through a series of interviews with independent learners who have reached success and happiness in fields as diverse as journalism, illustration, and molecular biology, Stark — who herself dropped out of a graduate program at Yale, despite being offered a prestigious fellowship — cracks open the secret to defining your own success and finding your purpose outside the factory model of formal education. She notes the patterns that emerge:

People who forgo school build their own infrastructures. They create and borrow and reinvent the best that formal schooling has to offer, and they leave the worst behind. That buys them the freedom to learn on their own terms.

[…]

From their stories, you’ll see that when you step away from the prepackaged structure of traditional education, you’ll discover that there are many more ways to learn outside school than within.

Reflecting on her own exit from academia, Stark articulates a much more broadly applicable insight:

A gracefully executed quit is a beautiful thing, opening up more doors than it closes.

But despite discovering in dismay that “liberal arts graduate school is professional school for professors,” which she had no interest in becoming, Stark did learn something immensely valuable from her third year of independent study, during which she read about 200 books of her own choosing:

I learned how to teach myself. I had to make my own reading lists for the exams, which meant I learned how to take a subject I was interested in and make myself a map for learning it.

The interviews revealed four key common threads: learning is collaborative rather than done alone; the importance of academic credentials in many professions is declining; the most fulfilling learning tends to take place outside of school; and those happiest about learning are those who learn out of intrinsic motivation rather than in pursuit of extrinsic rewards. The first of these insights, of course, appears on the surface to contradict the very notion of “independent learning,” but Stark offers an eloquent semantic caveat:

Independent learning suggests ideas such as “self-taught,” or “autodidact.” These imply that independence means working solo. But that’s just not how it happens. People don’t learn in isolation. When I talk about independent learners, I don’t mean people learning alone. I’m talking about learning that happens independent of schools.

[…]

Anyone who really wants to learn without school has to find other people to learn with and from. That’s the open secret of learning outside of school. It’s a social act. Learning is something we do together.

Independent learners are interdependent learners.

Much of the argument for formal education rests on statistics indicating that people with college and graduate degrees earn more. But those statistics, Stark notes, suffer an important and rarely heeded bias:

The problem is that this statistic is based on long-term data, gathered from a period of moderate loan debt, easy employability, and annual increases in the value of a college degree. These conditions have been the case for college grads for decades. Given the dramatically changed circumstances grads today face, we already know that the trends for debt, employability, and the value of a degree have all degraded, and we cannot assume the trend toward greater lifetime earnings will hold true for the current generation. This is a critical omission from media coverage. The fact is we do not know. There’s absolutely no guarantee it will hold true.

Some heartening evidence suggests the blind reliance on degrees might be beginning to change. Stark cites Zappos CEO Tony Hsieh:

I haven’t looked at a résumé in years. I hire people based on their skills and whether or not they are going to fit our culture.

Another common argument for formal education extols the alleged advantages of its structure, proposing that homework assignments, reading schedules, and regular standardized testing would motivate you to learn with greater rigor. But, as Daniel Pink has written about the psychology of motivation, in school, as in work, intrinsic drives far outweigh extrinsic, carrot-and-stick paradigms of reward and punishment, rendering this argument unsound. Stark writes:

Learning outside school is necessarily driven by an internal engine. … [I]ndependent learners stick with the reading, thinking, making, and experimenting by which they learn because they do it for love, to scratch an itch, to satisfy curiosity, following the compass of passion and wonder about the world.

So how can you best fuel that internal engine of learning outside the depot of formal education? Stark offers an essential insight, which places self-discovery at the heart of acquiring external knowledge:

Learning your own way means finding the methods that work best for you and creating conditions that support sustained motivation. Perseverance, pleasure, and the ability to retain what you learn are among the wonderful byproducts of getting to learn using methods that suit you best and in contexts that keep you going. Figuring out your personal approach to each of these takes trial and error.

[…]

For independent learners, it’s essential to find the process and methods that match your instinctual tendencies as a learner. Everyone I talked to went through a period of experimenting and sorting out what works for them, and they’ve become highly aware of their own preferences. They’re clear that learning by methods that don’t suit them shuts down their drive and diminishes their enjoyment of learning. Independent learners also find that their preferred methods are different for different areas. So one of the keys to success and enjoyment as an independent learner is to discover how you learn.

[…]

School isn’t very good at dealing with the multiplicity of individual learning preferences, and it’s not very good at helping you figure out what works for you.

Echoing Neil deGrasse Tyson, who has argued that “every child is a scientist” since curiosity is coded into our DNA, and Sir Ken Robinson, who has lamented that the industrial model of education schools us out of our inborn curiosity, Stark observes:

Any young child you observe displays these traits. But passion and curiosity can be easily lost. School itself can be a primary cause; arbitrary motivators such as grades leave little room for variation in students’ abilities and interests, and fail to reward curiosity itself. There are also significant social factors working against children’s natural curiosity and capacity for learning, such as family support or the lack of it, or a degree of poverty that puts families in survival mode with little room to nurture curiosity.

Stark returns to the question of motivators that do work, once again calling to mind Pink’s advocacy of autonomy, mastery, and purpose as the trifecta of success. She writes:

[T]hree broadly defined elements of the learning experience support internal motivation and the persistence it enables. Internal motivation relies on learners having autonomy in their learning, a progressing sense of competence in their skills and knowledge, and the ability to learn in a concrete or “real world” context rather than in the abstract. These are mostly absent from classroom learning. Autonomy is rare, useful context is absent, and school’s means for affirming competence often feel so arbitrary as to be almost without use — and are sometimes actively demotivating. . . . [A]utonomy means that you follow your own path. You learn what you want to learn, when and how you want to learn it, for your own reasons. Your impetus to learn comes from within because you control the conditions of your learning rather than working within a structure that’s pre-made and inflexible.

The second thing you need to stick with learning independently is to set your own goals toward an increasing sense of competence. You need to create a feedback loop that confirms your work is worth it and keeps you moving forward. In school this is provided by advancing through the steps of the linear path within an individual class or a set curriculum, as well as from feedback from grades and praise.

But Stark found that outside of school, those most successful at learning sought their sense of competence through alternative sources. Many, as James Mangan advised in his 1936 blueprint for acquiring knowledge, solidified their learning by teaching it to other people, increasing their own sense of mastery and deepening their understanding. Others centered their learning around specific projects, which made their progress more modular and thus more attainable. Another cohort cited failure as an essential part of the road to mastery. Stark continues:

The third thing [that] can make or break your ability to sustain internal motivation … is to situate what you’re learning in a context that matters to you. In some cases, the context is a specific project you want to accomplish, which … also functions to support your sense of progress.

She sums up the failings of the establishment:

School is not designed to offer these three conditions; autonomy and context are sorely lacking in classrooms. School can provide a sense of increasing mastery, via grades and moving from introductory classes to harder ones. But a sense of true competence is harder to come by in a school environment. Fortunately, there are professors in higher education who are working to change the motivational structures that underlie their curricula.

The interviews, to be sure, offer a remarkably diverse array of callings, underpinned by a number of shared values and common characteristics. Computational biologist Florian Wagner, for instance, echoes Steve Jobs’s famous words on the secret of life in articulating a sentiment shared by many of the other interviewees:

There is something really special about when you first realize you can figure out really cool things completely on your own. That alone is a valuable lesson in life.

Investigative journalist Quinn Norton subscribes to Mangan’s prescription for learning by teaching:

I ended up teaching [my] knowledge to others at the school. That’s one of my most effective ways to learn, by teaching; you just have to stay a week ahead of your students. … Everything I learned, I immediately turned around and taught to others.

She also used the gift of ignorance to proactively drive her knowledge forward:

When I wanted to learn something new as a professional writer, I’d pitch a story on it. I was interested in neurology, and I figured, why don’t I start interviewing neurologists? The great thing about being a journalist is that you can pick up the phone and talk to anybody. It was just like what I found out about learning from experts on mailing lists. People like to talk about what they know.

Norton speaks to the usefulness of useless knowledge, not only in one’s own intellectual development but also as social currency:

I’m stuffed with trivial, useless knowledge, on a panoply of bizarre topics, so I can find something that they’re interested in that I know something about. Being able to do that is tremendously socially valuable. The exchange of knowledge is a very human way to learn. I try never to walk into a room where I want to get information without knowing what I’m bringing to the other person.

[…]

I think part of the problem with the usual mindset of the student is that it’s like being a sponge. It’s passive. It’s not about having something to bring to the interaction. People who are experts in things are experts because they like learning.

Software engineer, artist, and University of Texas molecular biologist Zack Booth Simpson speaks to the value of cultivating what William Gibson has called “a personal micro-culture” and learning from the people with whom you surround yourself:

In a way, the best education you can get is just talking with people who are really smart and interested in things, and you can get that for the cost of lunch.

Artist Molly Crabapple, who inked this beautiful illustration of Salvador Dalí’s creative credo and live-sketched Susan Cain’s talk on the power of introverts, recalls how self-initiated reading shaped her life:

I was … a constant reader. At home, I lived next to this thrift store that sold paperbacks for 10¢ apiece so I would go and buy massive stacks of paperback books on everything. Everything from trashy 1970s romance novels to Plato. When I went to Europe, I brought with me every single book that I didn’t think I would read voluntarily, because I figured if I was on a bus ride, I would read them. So I read Plato and Dante’s Inferno, and all types of literature. I got my education on the bus.

Originally featured in May — read more here.

* * *

For a subject-specific lens on the year's finest reading, revisit the year's best-of reading lists, which covered writing and creativity, photography, psychology and philosophy, art and design, history and biography, science and technology, children's literature, and pets and animals.
