
The World’s First Children’s Book about a Two-Mom Family

A pioneering picture-book with an enduring message of equality.

“Many homosexuals live together in stable relationships. The time will come when homosexual marriages are recognized,” two Danish psychologists predicted in their honest, controversial, and now-iconic guide to teenage sexuality in 1969. But decades would pass before their prognosis would slowly, painfully begin to come true. In the meantime, those “stable relationships” were denied the dignity of being called a family and forced to conform to the mainstream-normative narratives of what a family actually is.

In the 1980s, writer Lesléa Newman began noticing that same-sex couples were having kids like everybody else, but had no children’s books to read to them portraying non-traditional family units. At that point, women had been “marrying” one another for ages, but true marriage equality in the eyes of the law and the general public was still two decades away, as were children’s books offering alternate narratives on what makes a family. So Newman enacted the idea that the best way to complain is to make things and penned Heather Has Two Mommies (public library) — a sweet, straightforward picture-book illustrated by Diana Souza, telling the story of a warm and accepting playground discussion of little Heather’s life with Mama Kate, a doctor, and Mama Jane, a carpenter.

Heather’s favorite number is two. She has two arms, two legs, two eyes, two ears, two hands, and two feet. She also has two pets: a ginger-colored cat named Gingersnap and a big black dog named Midnight.

Heather also has two mommies: Mama Jane and Mama Kate.

The book, which predated even Maurice Sendak’s controversial children’s story grazing the subject, was unflinchingly pioneering — with the proper social outrage to attest to this status. Not only did it rank number 11 on the American Library Association’s chart of America’s most frequently challenged books in the 1990s, but its impact continued for decades — comedian Bill Hicks, an eloquent champion of free speech, paid homage to it in his final act on Letterman in October of 1993 and it was even parodied in a 2006 episode of The Simpsons titled “Bart Has Two Mommies.”

Despite that, or perhaps precisely because of it, the book lives on as a bold embodiment of Bertrand Russell’s famous proclamation: “Do not fear to be eccentric in opinion, for every opinion now accepted was once eccentric.”

Twenty years later, Newman followed up with the board books Mommy, Mama, and Me and Daddy, Papa, and Me, affectionately illustrated by artist Carol Thompson.

Complement Heather Has Two Mommies with Andrew Solomon’s remarkable Far From the Tree: Parents, Children and the Search for Identity, a moving meditation on how love both changes us and makes us more ourselves, and the impossibly charming And Tango Makes Three, an allegorical marriage equality primer telling the true story of Central Park Zoo’s gay penguin family.


Worn Stories: Playful and Poignant Tales of Clothes That Encode Life’s Most Meaningful Memories

Wearable emotional memories from John Hodgman, Marina Abramovic, Piper Kerman, Pat Mahoney, Debbie Millman, Paola Antonelli, Kenneth Goldsmith, Meghan O’Rourke, Rosanne Cash, and more.

One of the most extraordinary things about human beings is that we weave our lives out of stories, stories woven of sentimental memories, which we can’t help but attach to our physical environment — from where we walk, creating emotional place-memory maps of a city, to how smell transports us across space and time, to what we wear.

For artist and editor Emily Spivack, clothes can be an “evolving archive of experiences, adventures, and memories” and a powerful storytelling device. Since 2010, she has been meticulously curating a remarkable catalog of such wearable personal histories from the living archives of some of the most interesting minds of our time — artists and Holocaust survivors, writers and renegades, hip-hop legends and public radio personalities. In Worn Stories (public library), published by Princeton Architectural Press, Spivack shares the best of these stories — some poignant, some funny, all imbued with disarming humanity and surprising vulnerability — from an impressive roster of contributors, including performance artist Marina Abramovic, writer Susan Orlean, comedian John Hodgman, fashion designer Cynthia Rowley, Orange Is the New Black memoirist Piper Kerman, artist Maira Kalman, MoMA curator Paola Antonelli, and artist, writer, and educator Debbie Millman.

The stories span a remarkable range — a traditional Indian shirt worn during a spiritual Hindu gathering turned kidnapping; the shoes in which Marina Abramovic walked the Great Wall of China while saying farewell to a soulmate; an oddly uncharacteristic purple silk tuxedo shirt that belonged to Johnny Cash, preserved by his daughter; and, among myriad other shreds and threads of the human experience, various mementos from the “soul loss” — as one contributor puts it — of love affairs ending.

Spivack writes in the introduction:

The clothes that protect us, that make us laugh, that serve as a uniform, that help us assert our identity or aspirations, that we wear to remember someone — in all of these are encoded the stories of our lives. We all have a memoir in miniature living in a garment we’ve worn.

Piper Kerman

Piper Kerman selects an outfit she wore at a key moment in the memoir-turned-TV-hit Orange Is the New Black — a vintage suit that was among the three outfits she packed for her final court appearance and sentencing after taking a plea deal (which, she explains, 95% of criminal defendants do):

As your case wends through the system, you barely speak in court; the prosecutor and defense attorney do most of the talking. Unlike 80 percent of criminal defendants, I could afford to hire a lawyer, and I was lucky that he was a very good and experienced one. He had advocated long and hard with the prosecutor on my behalf, and then the day came where his work and my case would be decided by the judge, a Reagan appointee to the federal bench.

Most criminal defendants wear whatever they are given by their attorney or family to their sentencing; a lot of people are too poor to afford bail, and so they have been wearing jailhouse orange for many months before ever getting their day in court. I was much more fortunate; when I flew to Chicago to be sentenced to prison, I had three choices of court attire in my suitcase. A cadet-blue pantsuit, a very severe navy coatdress, and a wild card I had packed at the last minute: a vintage fifties pencil-skirt suit I had bought on eBay, in a coffee and cream tweed with a subtle sky blue check. It looked like something a Hitchcock heroine would have worn.

“That’s the one,” said my lawyer, pointing to the skirt suit. “We want the judge to be reminded of his own daughter or niece or neighbor when he looks at you.”

For someone standing for judgment, the importance of being seen as a complete human being, someone who is more than just the contents of the file folders that rest on the bench in front of His or Her Honor, cannot be overstated.

Despite the dramatic circumstances, Kerman’s experience captures something central to Spivack’s project — something fundamental about how we use clothing as this paradoxical combination of camouflage and self-revelation, a shield for and a stripping to our basic humanity.

Simon Doonan

Simon Doonan selects a pair of decidedly eighties Lycra cycle tights with orange-and-black graffiti writing and shares the touching story behind the seemingly silly garment:

One by one my roommates, friends, and boyfriends in Los Angeles started getting sick from AIDS. It was very early on in the epidemic and when you went to the doctor, they couldn’t refer you to an expert. They asked you if you were religious, meaning, you were going to die.

I decided to join a gym with a friend who had been diagnosed with AIDS. At least we could be healthy, we thought… I went every day. In an attempt to do “healthy” things, I became addicted to the lights, the music, the endorphins. It was a very showbiz-y way to keep in shape, and many actresses would go to the class, like Madonna when she was starting to become well known.

[…]

The cult of aerobics was waning by the time I moved to New York in 1985, but with so many people getting sick, for a couple of years it was an antidote to this incredible malaise of melancholy that had been blanketing L.A.

Debbie Millman

Debbie Millman recounts the story of a peculiar yellow coat from the era in her life when she was standing on the precipice of her creative journey, long before she was a successful artist, prolific author, and award-winning interviewer. She recounts one July afternoon in her late twenties when she, broke and lusting after a glamorous life, ended up at the Hermès store on Madison Avenue after a months-long quest to track down the mysterious, enchanting perfume she had smelled on an exceptionally elegant woman. A uniformed man opened the gates to an unfamiliar world, “the most elegantly expensive environment” she had ever entered, where people very much unlike her — people “very, very rich” — were browsing $200 scarves.

Just then, a kindly saleslady — one imagines a character like Cinderella’s fairy godmother — took pity on Millman and whispered a secret in her ear: they were having a sale upstairs. Millman was thrilled, but it didn’t take her long to realize that, even with the markdowns, she couldn’t afford anything — until she spotted “the softest, most luxurious, ultra-bright lemony-yellow cashmere coat ever made.” Certain it would cost thousands of dollars, she apprehensively searched for the price, which revealed itself like a miracle — the original $2,200 was crossed out, and a hopeful $400 was written in its place.

Millman writes:

I calculated what the expense would mean to my budget. Undeterred, I tried the coat on. It was at least one size too big. None of this mattered to me. I felt glamorous and beautiful. As the clerk wrapped up the coat in the biggest orange box I had ever seen, I knew this wasn’t a mistake. I would wear this coat forever.

And wear it I did! I wore it every day from September until March. I wore it to work, I wore it every weekend, I wore it on vacation in Vermont, and I wore it traveling to the West Coast. The only time I wished for a warmer coat was en route to a client’s office on Fifth Avenue one blustery subzero February afternoon. I was chewing a large piece of purple bubble gum and realized I’d have to get rid of it before my meeting. It was so cold I didn’t want to take my gloves off to take the gum out of my mouth. Perhaps the temperature affected my judgment, or perhaps I was lazy, but suddenly I did something I had never, ever done before: I raised my chin, puckered up my lips, and let my gum fly. As it descended onto the sidewalk, I saw that a man walking toward me was about to collide with the arc of its fall. I made eye contact with him as the sticky mass fell at his feet. Horrified, I instantly realized I was face-to-face with Woody Allen. Mercifully, he sidestepped the gum. But his outrage was palpable. He shook his head in disgust and passed me by. I was too embarrassed and frightened to even say I was sorry.

Two days later I went out with my friend Ellen. She had snagged a reservation at the newly reopened Le Cirque and we got all dolled up for the occasion. I, of course, wore my yellow coat. We were seated between the coat check and the front door, and since New York City was still in a deep freeze, I decided to keep my coat wrapped around me.

Then I saw him. He was approaching the coat check with his wife, fumbling for his ticket. Wildly, I looked around for a place to hide. Ellen asked me if I was okay and I hissed, no. I motioned with my eyes. Ellen squealed in delight, and he looked over at us. Once again, in the span of forty-eight hours, I was face-to-face with Woody Allen.

Our eyes locked and I saw him recognize my unmistakable ultra-bright yellow coat and the same frightened face. He grimaced. “You!” he said, as his wife pulled on his arm. I felt myself turn white and then red, as everyone turned to stare.

Two and a half decades later, I still have my beloved coat. It’s lost its belt and much of its lemony sheen, and it hasn’t left its special place in my closet in a long time. Maybe I’ll wear it again one day. As Woody Allen famously said, “Eternal nothingness is fine if you happen to be dressed for it.” I’ll remind him of that if I ever bump into him again.

Paola Antonelli

MoMA curator extraordinaire Paola Antonelli selects a pair of aviator glasses that capture the strange blend of terror and optimism of growing up amidst Milan’s political unrest in the 1970s. Class was often interrupted by bomb threats. Her daily morning walk to school took her, always scared, through a contentious urban borderland that divided the peacoat-clad, anti-authoritarian leftists and the Sanbabilini — the “gun-toting neo-fascists from wealthy Milanese families who shared responsibility for much of the violence around Italy at that time” — whose distinctive look included fitted shirts, trench coats, and Ray-Ban aviator glasses. She tells Spivack:

Sometime in the late 1970s, during this time of upheaval, my father came home from his first trip to the United States with a pair of Ray-Ban Aviators he’d bought for me. He had not thought of the political implications; he had just wanted a gift that embodied “America.” Even though it was only a pair of sunglasses, it was like holding a bomb in my hands. I couldn’t wear them.

Along the way, the first pair of Aviators disappeared, and I decided to buy myself another pair. They never looked good on me, but it was a sort of exorcism. Even then, years later, it felt almost like they were burning in my hands; they transport me to a moment that was formative, but one that I also want to forget.

I compare notes with friends who grew up in Israel or Beirut, for example, and I realize we all went through something similar — living despite the bombs going off, despite the fact that it was almost a war zone. Little details, like scents or sounds or a piece of clothing, bring back the violence, and that’s what these Aviators do for me.

Maira Kalman

Maira Kalman, a woman of unparalleled creative vision and extraordinary wisdom, selects an apple-green sweater that belonged to her mother. She writes:

It is my lucky sweater because I always need luck. And the feeling of being lucky, which is ridiculous and elusive, is still a pleasant one.

Margaret D. Stetz

Then there are the bunny ears, which turn out to belong not to a Playmate but to Harvard Ph.D. Margaret D. Stetz — a self-described “middle-aged professor of women’s studies and literature” and a Beatrix Potter scholar, who wears the ears while lecturing about Potter’s iconic Peter Rabbit character. Stetz writes:

The notion of dressing women as “Bunnies” was, of course, the invention of Hugh Hefner’s Playboy Clubs. When she was a young journalist, Ms. magazine founder Gloria Steinem famously went undercover as a Bunny in 1963 to expose the harassment and miserable working conditions of women who wore the Clubs’ uniforms. Today, there’s something satisfying about taking these symbols of sexual availability and servility and flipping their meaning. By combining bunny ears with a tailored jacket and skirt on the lecture platform at a university, museum, or other cultural institution, I’m doing something subversive. No longer do they signify that women are merely “Playmates.” Conversely, this is also my way of suggesting that women don’t have to be wholly serious to be feminists.

Emily Spivack

Tucked midway through the book is Spivack’s own story about a pair of cheap black flip-flops her grandmother bought for her nearly twenty years ago off the Delaware boardwalk. In a way, these unassuming essentials capture the essence of the project — a seemingly ordinary object of clothing imbued with immeasurable sentimental value, amplified over a lifetime. Spivack writes:

Over time, these flip-flops — plain, pulled from a rack without a thought, manufactured to be disposable but apparently indestructible — have become such a lasting fixture in my life. Precisely, perhaps, because they are so ordinary: you don’t even notice them casually accumulating the years, like the shops along Rehoboth Avenue, like grandmothers, like everything.

Susan Orlean

Susan Orlean, sage of the written word, recounts her “uniform fixation” — the lifelong quest to find the ideal (and, as it turns out, mythic) outfit that would capture her personality perfectly and could therefore be bought in multiples and worn forever. She captures this cyclical infatuation elegantly:

It’s a temporary delusion that comes over me with regularity — a belief that by wearing this perfect thing, I will look right and feel good no matter what. Like, “How did I not know that I’m an agnes b. T-shirt and denim skirt kind of person? Now, I’m going to order ten of each and I never have to buy clothes again.” When I’m in it, I totally believe I have found my look, my personal style.

It’s cultish and my own particular mania. Each time I start over again, I think, “those were false gods — I have now found the true God.” I even observe myself doing it. I understand that fashion, by definition, is a changing thing, and so is one’s body. I try to talk myself out of my own crazy conviction that I’ve finally solved the puzzle — and yet I can’t do it.

I guess it’s probably safer to be this way about clothing than men or religion or something that could be really dangerous.

Ross Intelisano

Ross Intelisano picks a tie once made by his beloved immigrant grandmother, Anna, who tailored all of his clothes growing up, worked until she was 78, and lived to be 95. Two weeks after her death, Hurricane Sandy devastated the Rockaways, where Intelisano’s family lived. The house was condemned and all access was denied, but a family friend bravely ventured in to retrieve a few surviving valuables, including Anna’s ties. Intelisano writes:

That day, my father came over to my house, smiling for what seemed like the first time since the storm. He proudly presented me with two of Anna’s ties. I wear them all the time. I like handling the silk as I knot the ties.

Kenneth Goldsmith

Kenneth Goldsmith turns his penchant for subverting literature to fashion and recounts wearing an over-the-top paisley suit by avant-garde designer Thom Browne to the White House, where Goldsmith was invited to read some of his poetry to President Obama. Browne had designed the suit under his Brooks Brothers-owned Black Fleece label, taking Brooks Brothers’ signature patterns to an intentional extreme. Goldsmith recounts his exchange with the President that night, who was, coincidentally, wearing a conventional Brooks Brothers suit himself:

Upon our introduction, the first thing the President said to me was, “That’s a great suit! You know? I’d wear a suit like that. But my staff would never let me.” To which I replied, “Mr. President, this is one instance where it’s better being an artist than being the President of the United States: artists can wear anything they want.” And then he glanced down at my saddle shoes and exclaimed, “You’re wearing golf shoes!” Which in part was true, that being the genius of Thom Browne, to take something familiar and recontextualize it to the point of it being “wrong.” And that is exactly what I aimed to do with my performance: to straddle tradition and radicality, being both and, at the same time, being neither; to embrace contradiction, keep people guessing.

David Carr

The New York Times’ David Carr reminisces about searching for a cheap t-shirt in a classic New York moment of sweltering need, when you’re forced to choose between suffering your wholly indiscreet sweat stains or buying one of those ubiquitous “I♥NY” touristy shirts. (“You can’t wear a shirt like that ironically,” Carr notes, “unless, say, you hate New York, which I do not.”) Instead, he chanced upon a miraculous find at one of Manhattan’s myriad souvenir shops — a defective t-shirt, with the “New York” script printed upside-down. Noting the kink, Carr offered the shopkeeper $3 for the oddball shirt and gleefully walked off with his find. Except for the occasional compliment from a hipster on the subway, people notice the shirt but say nothing, about which Carr remarks:

I like that about my shirt: it is something that is intuitively understood in the City, as we insufferable locals call it, and is baffling to others, akin to many other aspects of living or working in New York.

Pat Mahoney

Some of the stories remind us that the assumptions we make — in this case, assumptions about what people seek to signal with their clothing choices — are often light-years away from the truth. Take, for instance, the flesh-colored American Apparel nylon shorts selected by LCD Soundsystem founding member and drummer Pat Mahoney. One might write them off as a hipster or counter-hipster joke, but they actually represent a curious combination of extreme practicality and creative ritualization. Mahoney, who admits to sweating a great deal during his “epic sets” on stage, would regularly end up “drenched to the bone” after a show, his jeans left to dry in the tour bus until they smelled, much to his fellow bandmates’ dismay, “like rotten cotton.” He needed something quick-drying and light to wear onstage, so he bought the shorts on the road, fully aware of their laughable connotations. But over time, they came to serve a deeper psychological function — like a number of famous creators known for maintaining various odd habits and rituals to keep their creative flow flowing, Mahoney overcomes his chronic stage fright by employing “elaborate juju” — a compulsion to have everything just right, from the way he ties his sneakers to how he positions his drums on the rug — to get himself in a more secure mindset. Wearing the shorts became part of that creative ritual — part of the behavioral and environmental cues that psychology suggests help put us in a state of creative flow.

John Hodgman

“I have a dress and I have worn it many times,” comedian John Hodgman opens with a bang. That dress, it turns out, got its start when Hodgman was invited to impersonate Ayn Rand on the Dead Authors podcast. He writes:

I’d been fascinated with Rand since I’d written a story in the New York Times magazine about a competitive championship tournament bridge player who was also an active objectivist and Rand devotee. I had read half of Atlas Shrugged before I got the gist of my role. I really enjoyed the book because of its absurdly reductive philosophy that inadvertently plays on adolescent male narcissism like a jazz saxophone — to draw a connection to the famous Randian saxophonist and economist Alan Greenspan — but it also spoke directly to the adolescent male fantasy of “I’m the only smart one. Everyone is leeching off of me and I’d rather destroy my work than compromise my integrity by being nice to others.” Her moral severity came as a tonic to my cultural relativist upbringing.

The Rand impersonation eventually became a part of his stand-up routine and the dress was worn many times. Hodgman reflects on its allure:

Even though I’m imitating, in a ridiculous fashion, an exaggerated version of Ayn Rand, what precedes the moment of putting on the dress is an utter nudity of self, about as close as I’ll ever get.

Dorothy Finger

But undoubtedly the most moving story comes from Holocaust survivor Dorothy Finger, who was a child when the Nazis invaded her native Poland. Her family went into hiding, but inevitably succumbed to the tragic fate of so many Jews at the time. Her father was the first one killed, “almost beaten to death and then sent to an extermination camp.” Soon, her mother was shot. Dorothy herself was sent to a labor camp, where she was subjected to grueling toil and regularly beaten by the Nazis. On July 27, 1943, she escaped into the nearby forest with her aunt and two cousins as machine guns fired after them. They survived for a few months, into the middle of winter, huddling together to keep from freezing. But the Nazis eventually went looking for them, killing Dorothy’s aunt and her 17-year-old male cousin. Finger writes:

I was shot in the ear and I fainted. It just grazed my ear, but the impact of the explosion threw me on the earth and I was unconscious. I swear I saw my soul go to heaven, white angels and things like that. I thought I was dead and that when you’re dead you see yourself go to heaven. Of course, I understand now that I was not conscious. When I came to, I was even more upset. “God, why didn’t you finish me off, why didn’t you kill me, rather than slowly starve me to death? I have nobody. No parents.” I just had my second cousin in the forest with me.

The Nazis came back a second time. We heard shots coming from one side and we ran the other way. I fell through some ice into a body of water that wasn’t very deep. My instinct to live was so great that I could still think about how to survive.

I covered myself up with branches. I could hear the Ukrainians saying to the Nazis, “Somebody must have been running through there. I can see footprints.” And the Nazis said, “I don’t want to go that way or we’ll fall into the ice too. We’ll catch him the next time.” My heart stopped beating. I stopped breathing. I waited until I couldn’t hear their boots on the ice. I came out of the water with everything frozen on me, including the little dress that I wore until the day we were liberated from the forest.

In the forest, Finger got typhus and sank into a delirious fever. She lost all her hair and was so sick that she lay curled in the fetal position for so long that she became unable to walk, let alone run. At that point, she knew that if the Nazis came back, she’d be killed. She writes:

I don’t know what was worse — the fear, the hunger, the lice, or the humiliation.

Springtime came, and then summer, and it was warmer — although to this day, I am still cold. I have not overcome it.

The shooting started and it was coming from both sides. I still couldn’t run because I hadn’t completely recovered from the typhus. “I do not want to see the face of the Nazis that will shoot me,” I thought. I slowly moved from my back to my stomach. “Let them just shoot me in my back or my head and then it’ll be over.” The shooting stopped and I heard tanks coming into the forest, and I didn’t know if they were German or Russian. They were Russian tanks and they had come to liberate us, exactly one year from the day I entered the forest, July 27, 1944.

Young Dorothy eventually made it back to her hometown, but her few surviving neighbors had assumed she, like her parents, was dead, so they had discarded the family’s remaining possessions. The only thing she could recover was a piece of wool fabric from her family’s department store. She saved that, then wrote to an aunt and uncle in Delaware, who were eventually able to bring her to the United States. When she moved, her entire luggage — all her earthly possessions — consisted of two dresses, including the one she had worn that gruesome year in the forest, and the wool fabric. She made it to America, enrolled in high school, and graduated a year and a half later. Upon graduation, another uncle gave her $25 — a fortune in 1949 and in the context of her life — which she used to have a suit made from the wool fabric that was her only link to her family and her past life. Finger, now in her eighties, writes:

I always figured I’d be buried in it. But if people can learn from this suit and its history, what difference does it make what I’m buried in?

Worn Stories is absolutely remarkable in its entirety — a true labor of love that weaves this common thread of intensely personal, courageously vulnerable sartorial memories into a colorful tapestry of the human experience.

Photographs by Ally Lindsay; courtesy of Emily Spivack / Princeton Architectural Press


The Greatest Commencement Addresses of All Time

Kurt Vonnegut, J.K. Rowling, David Foster Wallace, Patti Smith, Anna Quindlen, Steve Jobs, and more.

The commencement address is the secular sermon of our time — a packet of timeless advice on life, dispensed by a podium-perched patronly or matronly shaman of wisdom to a congregation of eager young minds about to enter the so-called “real world.” But the genre’s finest specimens speak to all of us looking for some guidance on the path to the Good Life, transcending boundaries of age or occupation or life-stage. The best commencement speeches are also masterworks of paradox: On the one hand, they gently remind us that what we think we know, we don’t; on the other, they urge us to trust our deepest intuitions about confidence, kindness, integrity, and all those embarrassingly elemental truths which, in all other contexts, our culturally conditioned cynicism leads us to dismiss as tired truisms. But not here — the commencement address is society’s most potent mechanism for clearing the clouds of our cynicism just long enough to allow a few rays of receptivity to shine through, long enough to hang our beliefs and vulnerabilities and hopes on something solid and soul-affirming, and to do so in a non-ironic way.

Gathered in this ongoing archive are the best commencement addresses I’ve encountered over the years — words of wisdom that offer such rare respite, a source of sincere solace for us cynical moderns. Please enjoy.

  1. Joseph Brodsky on winning the game of life (University of Michigan, 1988)
    “Of all the parts of your body, be most vigilant over your index finger, for it is blame-thirsty. A pointed finger is a victim’s logo.”
  2. David Foster Wallace on life (Kenyon College, 2005)
    Revisiting the tragic literary hero’s only public insights on life.
  3. Kurt Vonnegut on kindness, technology, community, and the power of great teachers (Agnes Scott College, 1999)
    “Teaching, may I say, is the noblest profession of all in a democracy.”
  4. Bill Watterson on life and creative integrity (Kenyon College, 1990)
    “The truth is, most of us discover where we are headed when we arrive.”
  5. Anna Quindlen on the secret to a happy life (Villanova, 2000 / undelivered)
    “You cannot be really first-rate at your work if your work is all you are.”
  6. George Saunders on the power of kindness (Syracuse, 2013)
    “What I regret most in my life are failures of kindness.”
  7. Patti Smith on life and making a name for yourself (Pratt, 2010)
    How dental care protects our inner Pinocchio.
  8. Greil Marcus on the toxic division of high vs. low culture (School of Visual Arts, 2013)
    “What art does … is tell us, make us feel that what we think we know, we don’t.”
  9. Joss Whedon on embracing our inner contradictions (Wesleyan, 2013)
    “Identity is something that you are constantly earning. It is a process that you must be active in.”
  10. Neil Gaiman on mistakes and the creative life (Philadelphia University of the Arts, 2012)
    “Make New Mistakes. Make glorious, amazing mistakes. Make mistakes nobody’s ever made before.”
  11. Ann Patchett on writing and life (Sarah Lawrence College, 2006)
    “Coming back is the thing that enables you to see how all the dots in your life are connected.”
  12. Judith Butler on the value of the humanities and why we read (McGill, 2013)
    “We lose ourselves in what we read, only to return to ourselves, transformed and part of a more expansive world.”
  13. Kurt Vonnegut on reading, boredom, belonging, and hate (Fredonia, 1978)
    “Hate, in the long run, is about as nourishing as cyanide.”
  14. Ellen DeGeneres on success and following your own path (Tulane, 2009)
    “Never follow anyone else’s path, unless you’re in the woods and you’re lost and you see a path, and by all means you should follow that.”
  15. Aaron Sorkin on trusting your compass (Syracuse, 2012)
    “Take risks, dare to fail, remember the first person through the wall always gets hurt.”
  16. Barack Obama on the life of service and the impulse to change the world (Wesleyan, 2008)
    “All it takes is one act of service — one blow against injustice — to send forth what Robert Kennedy called that tiny ripple of hope. That’s what changes the world. That one act.”
  17. Conan O’Brien on disappointment and what defines us (Dartmouth, 2011)
    “Whether you fear it or not, disappointment will come. The beauty is that through disappointment you can gain clarity, and with clarity comes conviction and true originality.”
  18. J.K. Rowling on defining failure for ourselves (Harvard, 2008)
    “Climbing out of poverty by your own efforts, that is something on which to pride yourself. But poverty itself is romanticized only by fools.”
  19. Robert Krulwich on friends in low places (Berkeley School of Journalism, 2011)
    “This is the era of Friends in Low Places. The ones you meet now, who will notice you, challenge you, work with you, and watch your back. Maybe they will be your strength.”
  20. Meryl Streep on change and making our own “normal” (Barnard, 2010)
    “Really, there is no ‘normal.’ There’s only change, and resistance to it, and then more change.”
  21. Jeff Bezos on cleverness vs. kindness (Princeton, 2010)
    “Cleverness is a gift, kindness is a choice. Gifts are easy — they’re given after all. Choices can be hard. You can seduce yourself with your gifts if you’re not careful, and if you do, it’ll probably be to the detriment of your choices.”
  22. Oprah Winfrey on failure and maxing out our humanity (Harvard, 2013)
    “The key to life is to develop an internal moral, emotional GPS that can tell you which way to go.”
  23. Adrienne Rich on why an education is something we claim, not something we receive (Douglass College, 1977)
    “Responsibility to yourself means that you don’t fall for shallow and easy solutions.”
  24. Steve Jobs on serendipity and connecting the dots of life (Stanford, 2005)
    “You can’t connect the dots looking forward; you can only connect them looking backwards.”
  25. Debbie Millman on courage and the creative life (San Jose State University, 2013)
    “Imagine immensities, don’t compromise, and don’t waste time.”
  26. Richard Feynman on integrity (Caltech, 1974)
    “The first principle is that you must not fool yourself.”
  27. Daniel Pink on why the best roadmap to an interesting life is the one you make up as you go along (Weinberg College, 2014)
    “Sometimes, the only way to discover who you are or what life you should lead is to do less PLANNING and more LIVING — to burst the double bubble of comfort and convention and just DO stuff.”
  28. Teresita Fernández on What It Really Takes to Be an Artist
    “Being an artist is not just about what happens when you are in the studio. The way you live, the people you choose to love and the way you love them, the way you vote, the words that come out of your mouth… will also become the raw material for the art you make.”
  29. Tom Wolfe on the rise of the pseudo-intellectual (Boston University, 2000)
    “We live in an age in which ideas, important ideas, are worn like articles of fashion.”
  30. John Waters on Creative Rebellion and the Artist’s Task to Cause Constructive Chaos (RISD, 2015)
    “Refuse to isolate yourself. Separatism is for losers.”
  31. Toni Morrison on How to Be Your Own Story and Reap the Rewards of Adulthood in a Culture That Fetishizes Youth (Wellesley, 2004)
    “There is nothing, believe me, more satisfying, more gratifying than true adulthood… Its achievement is a difficult beauty, an intensely hard won glory, which commercial forces and cultural vapidity should not be permitted to deprive you of.”
  32. Parker Palmer on the Six Pillars of the Examined Life (Naropa University, 2015)
    “Take everything that’s bright and beautiful in you and introduce it to the shadow side of yourself… When you are able to say, ‘I am … my shadow as well as my light,’ the shadow’s power is put in service of the good.”

The 13 Best Books of 2013: The Definitive Annual Reading List of Overall Favorites

Soul-stirring, brain-expanding reads on intuition, love, grief, attention, education, and the meaning of life.

All gratifying things must come to an end: The season’s subjective selection of best-of reading lists — which covered writing and creativity, photography, psychology and philosophy, art and design, history and biography, science and technology, children’s literature, and pets and animals — comes full-circle with this final omnibus of the year’s most indiscriminately wonderful reads, a set of overall favorites that spill across multiple disciplines, cross-pollinate subjects, and defy categorization in the most stimulating of ways. (Revisit last year’s selection here.)

1. ON LOOKING

“How we spend our days,” Annie Dillard wrote in her timelessly beautiful meditation on presence over productivity, “is, of course, how we spend our lives.” And nowhere do we fail at the art of presence more miserably and more tragically than in urban life — in the city, high on the cult of productivity, where we float past each other, past the buildings and trees and the little boy in the purple pants, past life itself, cut off from the breathing of the world by iPhone earbuds and solipsism. And yet: “The art of seeing has to be learned,” Marguerite Duras reverberates — and it can be learned, as cognitive scientist Alexandra Horowitz invites us to believe in her breathlessly wonderful On Looking: Eleven Walks with Expert Eyes (public library), also among the year’s best psychology books — a record of her quest to walk around a city block with eleven different “experts,” from an artist to a geologist to a dog, and emerge with fresh eyes mesmerized by the previously unseen fascinations of a familiar world. It is undoubtedly one of the most stimulating books of the year, if not the decade, and the most enchanting thing I’ve read in ages. In a way, it’s the opposite but equally delightful mirror image of Christoph Niemann’s Abstract City — a concrete, immersive examination of urbanity — blending the mindfulness of Sherlock Holmes with the expansive sensitivity of Thoreau.

Horowitz begins by pointing our attention to the incompleteness of our experience of what we conveniently call “reality”:

Right now, you are missing the vast majority of what is happening around you. You are missing the events unfolding in your body, in the distance, and right in front of you.

By marshaling your attention to these words, helpfully framed in a distinct border of white, you are ignoring an unthinkably large amount of information that continues to bombard all of your senses: the hum of the fluorescent lights, the ambient noise in a large room, the places your chair presses against your legs or back, your tongue touching the roof of your mouth, the tension you are holding in your shoulders or jaw, the map of the cool and warm places on your body, the constant hum of traffic or a distant lawn-mower, the blurred view of your own shoulders and torso in your peripheral vision, a chirp of a bug or whine of a kitchen appliance.

This adaptive ignorance, she argues, is there for a reason — we celebrate it as “concentration” and welcome its way of easing our cognitive overload by allowing us to conserve our precious mental resources only for the stimuli of immediate and vital importance, and to dismiss or entirely miss all else. (“Attention is an intentional, unapologetic discriminator,” Horowitz tells us. “It asks what is relevant right now, and gears us up to notice only that.”) But while this might make us more efficient in our goal-oriented day-to-day, it also makes us inhabit a largely unlived — and unremembered — life, day in and day out.

For Horowitz, the awakening to this incredible, invisible backdrop of life came thanks to Pumpernickel, her “curly haired, sage mixed breed” (who also inspired Horowitz’s first book, the excellent Inside of a Dog: What Dogs See, Smell, and Know), as she found herself taking countless walks around the block, becoming more and more aware of the dramatically different experiences she and her canine companion were having along the exact same route:

Minor clashes between my dog’s preferences as to where and how a walk should proceed and my own indicated that I was experiencing almost an entirely different block than my dog. I was paying so little attention to most of what was right before us that I had become a sleepwalker on the sidewalk. What I saw and attended to was exactly what I expected to see; what my dog showed me was that my attention invited along attention’s companion: inattention to everything else.

The book was her answer to the disconnect, an effort to “attend to that inattention.” It is not, she warns us, “about how to bring more focus to your reading of Tolstoy or how to listen more carefully to your spouse.” Rather, it is an invitation to the art of observation:

Together, we became investigators of the ordinary, considering the block — the street and everything on it — as a living being that could be observed.

In this way, the familiar becomes unfamiliar, and the old the new.

Horowitz’s approach is based on two osmotic human tendencies: our shared capacity to truly see what is in front of us, despite our conditioned concentration that obscures it, and the power of individual bias in perception — or what we call “expertise,” acquired by passion or training or both — in bringing attention to elements that elude the rest of us. What follows is a whirlwind of endlessly captivating exercises in attentive bias as Horowitz, with her archetypal New Yorker’s “special fascination with the humming life-form that is an urban street,” and her diverse companions take to the city.

Art by Maira Kalman from ‘On Looking: Eleven Walks with Expert Eyes’

First, she takes a walk all by herself, trying to note everything observable, and we quickly realize that besides her deliciously ravenous intellectual curiosity, Horowitz is a rare magician with language. (“The walkers trod silently; the dogs said nothing. The only sound was the hum of air conditioners,” she beholds her own block; passing a pile of trash bags graced by a stray Q-tip, she ponders parenthetically, “how does a Q-tip escape?”; turning her final corner, she gazes at the entrance of a mansion and “its pair of stone lions waiting patiently for royalty that never arrives.” Stunning.)

But as soon as she joins her experts, Horowitz is faced with the grimacing awareness that despite her best, most Sherlockian efforts, she was “missing pretty much everything.” She arrives at a newfound, profound understanding of what William James meant when he wrote, “My experience is what I agree to attend to. Only those items which I notice shape my mind”:

I would find myself at once alarmed, delighted, and humbled at the limitations of my ordinary looking. My consolation is that this deficiency of mine is quite human. We see, but we do not see: we use our eyes, but our gaze is glancing, frivolously considering its object. We see the signs, but not their meanings. We are not blinded, but we have blinders.

Originally featured in August, with a closer look at the expert insights. For another peek at this gem, which is easily among my top three favorite books of the past decade, learn how to do the step-and-slide.

2. ADVICE TO LITTLE GIRLS

In 1865, when he was only thirty, Mark Twain penned a playful short story mischievously encouraging girls to think independently rather than blindly obey rules and social mores. In the summer of 2011, I chanced upon and fell in love with a lovely Italian edition of this little-known gem with Victorian-scrapbook-inspired artwork by celebrated Russian-born children’s book illustrator Vladimir Radunsky. I knew the book had to come to life in English, so I partnered with the wonderful Claudia Zoe Bedrick of Brooklyn-based indie publishing house Enchanted Lion, maker of extraordinarily beautiful picture-books, and we spent the next two years bringing Advice to Little Girls (public library) to life in America — a true labor-of-love project full of so much delight for readers of all ages. (And how joyous to learn that it was also selected among NPR’s best books of 2013!)

While frolicsome in tone and full of wink, the story is colored with subtle hues of grown-up philosophy on the human condition, exploring all the deft ways in which we creatively rationalize our wrongdoing and reconcile the good and evil we each embody.

Good little girls ought not to make mouths at their teachers for every trifling offense. This retaliation should only be resorted to under peculiarly aggravated circumstances.

If you have nothing but a rag-doll stuffed with sawdust, while one of your more fortunate little playmates has a costly China one, you should treat her with a show of kindness nevertheless. And you ought not to attempt to make a forcible swap with her unless your conscience would justify you in it, and you know you are able to do it.

One can’t help but wonder whether this particular bit may have in part inspired the irreverent 1964 anthology Beastly Boys and Ghastly Girls and its mischievous advice on brother-sister relations:

If at any time you find it necessary to correct your brother, do not correct him with mud — never, on any account, throw mud at him, because it will spoil his clothes. It is better to scald him a little, for then you obtain desirable results. You secure his immediate attention to the lessons you are inculcating, and at the same time your hot water will have a tendency to move impurities from his person, and possibly the skin, in spots.

If your mother tells you to do a thing, it is wrong to reply that you won’t. It is better and more becoming to intimate that you will do as she bids you, and then afterward act quietly in the matter according to the dictates of your best judgment.

Good little girls always show marked deference for the aged. You ought never to ‘sass’ old people unless they ‘sass’ you first.

Originally featured in April — see more spreads, as well as the story behind the project, here.

3. THIS EXPLAINS EVERYTHING

Every year since 1998, intellectual impresario and Edge editor John Brockman has been posing a single grand question to some of our time’s greatest thinkers across a wide spectrum of disciplines, then collecting the answers in an annual anthology. Last year’s answers to the question “What scientific concept will improve everybody’s cognitive toolkit?” were released in This Will Make You Smarter: New Scientific Concepts to Improve Your Thinking, one of the year’s best psychology and philosophy books.

In 2012, the question Brockman posed, suggested by none other than Steven Pinker, was “What is your favorite deep, elegant, or beautiful explanation?” The answers, representing an eclectic mix of 192 (alas, overwhelmingly male) minds spanning psychology, quantum physics, social science, political theory, philosophy, and more, are collected in the edited compendium This Explains Everything: Deep, Beautiful, and Elegant Theories of How the World Works (UK; public library) and are also available online.

In the introduction preceding the micro-essays, Brockman frames the question and its ultimate objective, adding to history’s most timeless definitions of science:

The ideas presented on Edge are speculative; they represent the frontiers in such areas as evolutionary biology, genetics, computer science, neurophysiology, psychology, cosmology, and physics. Emerging out of these contributions is a new natural philosophy, new ways of understanding physical systems, new ways of thinking that call into question many of our basic assumptions.

[…]

Perhaps the greatest pleasure in science comes from theories that derive the solution to some deep puzzle from a small set of simple principles in a surprising way. These explanations are called ‘beautiful’ or ‘elegant.’

[…]

The contributions presented here embrace scientific thinking in the broadest sense: as the most reliable way of gaining knowledge about anything — including such fields of inquiry as philosophy, mathematics, economics, history, language, and human behavior. The common thread is that a simple and nonobvious idea is proposed as the explanation of a diverse and complicated set of phenomena.

Puffer fish with Akule by photographer Wayne Levin.

Stanford neuroscientist Robert Sapolsky, eloquent as ever, marvels at the wisdom of the crowd and the emergence of swarm intelligence:

Observe a single ant, and it doesn’t make much sense, walking in one direction, suddenly careening in another for no obvious reason, doubling back on itself. Thoroughly unpredictable.

The same happens with two ants, a handful of ants. But a colony of ants makes fantastic sense. Specialized jobs, efficient means of exploiting new food sources, complex underground nests with temperature regulated within a few degrees. And critically, there’s no blueprint or central source of command—each individual ant has algorithms for their behaviors. But this is not wisdom of the crowd, where a bunch of reasonably informed individuals outperform a single expert. The ants aren’t reasonably informed about the big picture. Instead, the behavior algorithms of each ant consist of a few simple rules for interacting with the local environment and local ants. And out of this emerges a highly efficient colony.

Ant colonies excel at generating trails that connect locations in the shortest possible way, accomplished with simple rules about when to lay down a pheromone trail and what to do when encountering someone else’s trail—approximations of optimal solutions to the Traveling Salesman problem. This has useful applications. In “ant-based routing,” simulations using virtual ants with similar rules can generate optimal ways of connecting the nodes in a network, something of great interest to telecommunications companies. It applies to the developing brain, which must wire up vast numbers of neurons with vaster numbers of connections without constructing millions of miles of connecting axons. And migrating fetal neurons generate an efficient solution with a different version of ant-based routing.

A wonderful example is how local rules about attraction and repulsion (i.e., positive and negative charges) allow simple molecules in an organic soup to occasionally form more complex ones. Life may have originated this way without the requirement of bolts of lightning to catalyze the formation of complex molecules.

And why is self-organization so beautiful to my atheistic self? Because if complex, adaptive systems don’t require a blueprint, they don’t require a blueprint maker. If they don’t require lightning bolts, they don’t require Someone hurling lightning bolts.
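
The two local rules Sapolsky describes (prefer trails that already carry more pheromone, and reinforce the edges of shorter trips) are concrete enough to simulate. Purely as an illustrative aside, and not anything drawn from the book or from Sapolsky’s essay, here is a minimal Python sketch of the ant-based routing idea; the toy network, parameters, and update rules are assumptions chosen only to show how a shortest path can emerge from local behavior.

```python
# Minimal, illustrative ant-based routing sketch (assumptions, not from the book):
# ants repeatedly walk from A to D, choosing each hop in proportion to pheromone;
# shorter trips deposit more pheromone, and all trails slowly evaporate.
import random

edges = {("A", "B"): 1.0, ("B", "D"): 1.0,   # short route A-B-D, total length 2
         ("A", "C"): 2.0, ("C", "D"): 2.0}   # long route A-C-D, total length 4
neighbors = {"A": ["B", "C"], "B": ["D"], "C": ["D"]}
pheromone = {e: 1.0 for e in edges}          # start every edge equally attractive

def walk(start="A", goal="D"):
    """One ant walks from start to goal, picking hops in proportion to pheromone."""
    node, path, length = start, [], 0.0
    while node != goal:
        options = [(node, n) for n in neighbors[node]]
        edge = random.choices(options, weights=[pheromone[e] for e in options])[0]
        path.append(edge)
        length += edges[edge]
        node = edge[1]
    return path, length

for _ in range(200):
    path, length = walk()
    for e in pheromone:                      # evaporation: old information fades
        pheromone[e] *= 0.95
    for e in path:                           # reinforcement: shorter trips count more
        pheromone[e] += 1.0 / length

print({e: round(p, 2) for e, p in pheromone.items()})
# After a few hundred walks, most pheromone sits on A-B and B-D: the "colony"
# has settled on the shorter route without any single ant knowing the whole map.
```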

Developmental psychologist Howard Gardner, who famously developed the seminal theory of multiple intelligences, echoes Anaïs Nin in advocating for the role of the individual and Susan Sontag in stressing the impact of individual acts on collective fate. His answer, arguing for the importance of human beings, comes as a welcome antidote to a question that suffers the danger of being inherently reductionist:

In a planet occupied now by seven billion inhabitants, I am amazed by the difference that one human being can make. Think of classical music without Mozart or Stravinsky; of painting without Caravaggio, Picasso or Pollock; of drama without Shakespeare or Beckett. Think of the incredible contributions of Michelangelo or Leonardo, or, in recent times, the outpouring of deep feeling at the death of Steve Jobs (or, for that matter, Michael Jackson or Princess Diana). Think of human values in the absence of Moses or Christ.

[…]

Despite the laudatory efforts of scientists to ferret out patterns in human behavior, I continue to be struck by the impact of single individuals, or of small groups, working against the odds. As scholars, we cannot and should not sweep these instances under the investigative rug. We should bear in mind anthropologist Margaret Mead’s famous injunction: ‘Never doubt that a small group of thoughtful committed citizens can change the world. It is the only thing that ever has.’

Uber-curator Hans Ulrich Obrist, who also contributed to last year’s volume, considers the parallel role of patterns and chance in the works of iconic composer John Cage and painter Gerhard Richter, and the role of uncertainty in the creative process:

In art, the title of a work can often be its first explanation. And in this context I am thinking especially of the titles of Gerhard Richter. In 2006, when I visited Richter in his studio in Cologne, he had just finished a group of six corresponding abstract paintings which he gave the title Cage.

There are many relations between Richter’s painting and the compositions of John Cage. In a book about the Cage series, Robert Storr has traced them from Richter’s attendance of a Cage performance at the Festum Fluxorum Fluxus in Düsseldorf 1963 to analogies in their artistic processes. Cage has often applied chance procedures in his compositions, notably with the use of the I Ching. Richter in his abstract paintings also intentionally allows effects of chance. In these paintings, he applies the oil paint on the canvas by means of a large squeegee. He selects the colors on the squeegee, but the factual trace that the paint leaves on the canvas is to a large extent the outcome of chance.

[…]

Richter’s concise title, Cage, can be unfolded into an extensive interpretation of these abstract paintings (and of other works)—but, one can say, the short form already contains everything. The title, like an explanation of a phenomenon, unlocks the works, describing their relation to one of the most important cultural figures of the twentieth century, John Cage, who shares with Richter the great themes of chance and uncertainty.

Writer, artist, and designer Douglas Coupland, whose biography of Marshall McLuhan remains indispensable, offers a lyrical meditation on the peculiar odds behind coincidences and déjà vus:

I take comfort in the fact that there are two human moments that seem to be doled out equally and democratically within the human condition—and that there is no satisfying ultimate explanation for either. One is coincidence, the other is déjà vu. It doesn’t matter if you’re Queen Elizabeth, one of the thirty-three miners rescued in Chile, a South Korean housewife or a migrant herder in Zimbabwe—in the span of 365 days you will pretty much have two déjà vus as well as one coincidence that makes you stop and say, “Wow, that was a coincidence.”

The thing about coincidence is that when you imagine the umpteen trillions of coincidences that can happen at any given moment, the fact is, that in practice, coincidences almost never do occur. Coincidences are actually so rare that when they do occur they are, in fact, memorable. This suggests to me that the universe is designed to ward off coincidence whenever possible—the universe hates coincidence—I don’t know why—it just seems to be true. So when a coincidence happens, that coincidence had to work awfully hard to escape the system. There’s a message there. What is it? Look. Look harder. Mathematicians perhaps have a theorem for this, and if they do, it might, by default, be a theorem for something larger than what they think it is.

What’s both eerie and interesting to me about déjà vus is that they occur almost like metronomes throughout our lives, about one every six months, a poetic timekeeping device that, at the very least, reminds us we are alive. I can safely assume that my thirteen-year-old niece, Stephen Hawking and someone working in a Beijing luggage-making factory each experience two déjà vus a year. Not one. Not three. Two.

The underlying biodynamics of déjà vus is probably ascribable to some sort of tingling neurons in a certain part of the brain, yet this doesn’t tell us why they exist. They seem to me to be a signal from a larger point of view that wants to remind us that our lives are distinct, that they have meaning, and that they occur throughout a span of time. We are important, and what makes us valuable to the universe is our sentience and our curse and blessing of perpetual self-awareness.

Originally featured in January — read more here.

4. TIME WARPED

Given my soft spot for famous diaries, it should come as no surprise that I keep one myself. Perhaps the greatest gift of the practice has been the daily habit of reading what I had written on that day a year earlier; not only is it a remarkable tool of introspection and self-awareness, but it also illustrates that our memory “is never a precise duplicate of the original [but] a continuing act of creation” and how flawed our perception of time is — almost everything that occurred a year ago appears as having taken place either significantly further in the past (“a different lifetime,” I’d often marvel at this time-illusion) or significantly more recently (“this feels like just last month!”). Rather than a personal deficiency of those of us befallen by this tendency, however, it turns out to be a defining feature of how the human mind works, the science of which is at first unsettling, then strangely comforting, and altogether intensely interesting.

That’s precisely what acclaimed BBC broadcaster and psychology writer Claudia Hammond explores in Time Warped: Unlocking the Mysteries of Time Perception (public library), also among the year’s best psychology books — a fascinating foray into the idea that our experience of time is actively created by our own minds and into how these sensations of what neuroscientists and psychologists call “mind time” are constructed. As disorienting as the concept might seem — after all, we’ve been nursed on the belief that time is one of those few utterly reliable and objective things in life — it is also strangely empowering to think that the very phenomenon depicted as the unforgiving dictator of life is something we might be able to shape and benefit from. Hammond writes:

We construct the experience of time in our minds, so it follows that we are able to change the elements we find troubling — whether it’s trying to stop the years racing past, or speeding up time when we’re stuck in a queue, trying to live more in the present, or working out how long ago we last saw our old friends. Time can be a friend, but it can also be an enemy. The trick is to harness it, whether at home, at work, or even in social policy, and to work in line with our conception of time. Time perception matters because it is the experience of time that roots us in our mental reality. Time is not only at the heart of the way we organize life, but the way we experience it.

Discus chronologicus, a depiction of time by German engraver Christoph Weigel, published in the early 1720s; from Cartographies of Time.

Among the most intriguing illustrations of “mind time” is the incredible elasticity of how we experience time. (“Where is it, this present?” William James famously wondered. “It has melted in our grasp, fled ere we could touch it, gone in the instant of becoming.”) For instance, Hammond points out, we slow time down when gripped by mortal fear — the cliché about the slow-motion car crash is, in fact, a cognitive reality. This plays out even in situations that aren’t life-or-death per se but are still associated with strong feelings of fear. Hammond points to a study in which people with arachnophobia were asked to look at spiders — the very object of their intense fear — for 45 seconds, and they overestimated the elapsed time. The same pattern was observed in novice skydivers, who estimated the duration of their peers’ falls as short, whereas their own, from the same altitude, were deemed longer.

Inversely, time seems to speed up as we get older — a phenomenon that competing theories have attempted to explain. One, known as the “proportionality theory,” uses pure mathematics, holding that a year feels faster when you’re 40 than when you’re 8 because it constitutes only one fortieth of your life rather than a whole eighth. Among its famous proponents are Vladimir Nabokov and William James. But Hammond remains unconvinced:

The problem with the proportionality theory is that it fails to account for the way we experience time at any one moment. We don’t judge one day in the context of our whole lives. If we did, then for a 40-year-old every single day should flash by because it is less than one fourteen-thousandth of the life they’ve had so far. It should be fleeting and inconsequential, yet if you have nothing to do or an enforced wait at an airport for example, a day at 40 can still feel long and boring and surely longer than a fun day at the seaside packed with adventure for a child. … It ignores attention and emotion, which … can have a considerable impact on time perception.

Another theory suggests that perhaps it is the tempo of life in general that has accelerated, making things from the past, including the passage of time itself, appear slower by comparison.

But one definite change does take place with age: As we grow older, we tend to feel like the previous decade elapsed more rapidly, while the earlier decades of our lives seem to have lasted longer. Similarly, we tend to think of events that took place in the past 10 years as having happened more recently than they actually did. (Quick: What year did the devastating Japanese tsunami hit? When did we lose Maurice Sendak?) Conversely, we perceive events that took place more than a decade ago as having happened even longer ago. (When did Princess Diana die? What year was the Chernobyl disaster?) This, Hammond points out, is known as “forward telescoping”:

It is as though time has been compressed and — as if looking through a telescope — things seem closer than they really are. The opposite is called backward or reverse telescoping, also known as time expansion. This is when you guess that events happened longer ago than they really did. This is rare for distant events, but not uncommon for recent weeks.

[…]

The most straightforward explanation for it is called the clarity of memory hypothesis, proposed by the psychologist Norman Bradburn in 1987. This is the simple idea that because we know that memories fade over time, we use the clarity of a memory as a guide to its recency. So if a memory seems unclear we assume it happened longer ago.

Originally featured in July, with a deeper dive into the psychology of why time slows down when we’re afraid, speeds up as we age, and gets warped when we’re on vacation.

5. SELF-PORTRAIT AS YOUR TRAITOR

“Still this childish fascination with my handwriting,” young Susan Sontag wrote in her diary in 1949. “To think that I always have this sensuous potentiality glowing within my fingers.” This is the sort of sensuous potentiality that comes aglow in Self-Portrait as Your Traitor (public library) — the magnificent collection of hand-lettered poems and illustrated essays by friend-of-Brain-Pickings and frequent contributor Debbie Millman. In the introduction, design legend Paula Scher aptly describes this singular visual form as a “21st-century illuminated manuscript.” Personal bias aside, these moving, lovingly crafted poems and essays — some handwritten, some drawn with colored pencils, some typeset in felt on felt — vibrate at that fertile intersection of the deeply personal and the universally profound.

In “Fail Safe,” her widely read essay-turned-commencement-address on creative courage and embracing the unknown from the 2009 anthology Look Both Ways, Millman wrote:

John Maeda once explained, “The computer will do anything within its abilities, but it will do nothing unless commanded to do so.” I think people are the same — we like to operate within our abilities. But whereas the computer has a fixed code, our abilities are limited only by our perceptions. Two decades since determining my code, and after 15 years of working in the world of branding, I am now in the process of rewriting the possibilities of what comes next. I don’t know exactly what I will become; it is not something I can describe scientifically or artistically. Perhaps it is a “code in progress.”

Self-Portrait as Your Traitor, a glorious large-format tome full of textured colors to which the screen does absolutely no justice, is the result of this progress — a brave and heartening embodiment of what it truly means, as Rilke put it, to live the questions; the stunning record of one woman’s personal and artistic code-rewriting, brimming with wisdom on life and art for all.

Originally featured in November. See an exclusive excerpt here, then take a peek at Debbie’s creative process here.

6. SUSAN SONTAG: THE COMPLETE ROLLING STONE INTERVIEW

In 1978, Rolling Stone contributing editor Jonathan Cott interviewed Susan Sontag in twelve hours of conversation, beginning in Paris and continuing in New York, only a third of which was published in the magazine. More than three decades later and almost a decade after Sontag’s death, the full, wide-ranging magnificence of their tête-à-tête, spanning literature, philosophy, illness, mental health, music, art, and much more, is at last released in Susan Sontag: The Complete Rolling Stone Interview (public library) — a rare glimpse of one of modern history’s greatest minds in her element.

Cott marvels at what made the dialogue especially extraordinary:

Unlike almost any other person whom I’ve ever interviewed — the pianist Glenn Gould is the one other exception — Susan spoke not in sentences but in measured and expansive paragraphs. And what seemed most striking to me was the exactitude and “moral and linguistic fine-tuning” — as she once described Henry James’s writing style — with which she framed and elaborated her thoughts, precisely calibrating her intended meanings with parenthetical remarks and qualifying words (“sometimes,” “occasionally,” “usually,” “for the most part,” “in almost all cases”), the munificence and fluency of her conversation manifesting what the French refer to as an ivresse du discours — an inebriation with the spoken word. “I am hooked on talk as a creative dialogue,” she once remarked in her journals, and added: “For me, it’s the principal medium of my salvation.”

In one segment of the conversation, Sontag discusses how the false divide between “high” and pop culture impoverishes our lives. In another, she makes a beautiful case for the value of history:

I really believe in history, and that’s something people don’t believe in anymore. I know that what we do and think is a historical creation. I have very few beliefs, but this is certainly a real belief: that most everything we think of as natural is historical and has roots — specifically in the late eighteenth and early nineteenth centuries, the so-called Romantic revolutionary period — and we’re essentially still dealing with expectations and feelings that were formulated at that time, like ideas about happiness, individuality, radical social change, and pleasure. We were given a vocabulary that came into existence at a particular historical moment. So when I go to a Patti Smith concert at CBGB, I enjoy, participate, appreciate, and am tuned in better because I’ve read Nietzsche.

In another meditation, she argues for the existential and creative value of presence:

What I want is to be fully present in my life — to be really where you are, contemporary with yourself in your life, giving full attention to the world, which includes you. You are not the world, the world is not identical to you, but you’re in it and paying attention to it. That’s what a writer does — a writer pays attention to the world. Because I’m very against this solipsistic notion that you find it all in your head. You don’t, there really is a world that’s there whether you’re in it or not.

In another passage, she considers how taking responsibility empowers rather than disempowers us:

I want to feel as responsible as I possibly can. As I told you before, I hate feeling like a victim, which not only gives me no pleasure but also makes me feel very uncomfortable. Insofar as it’s possible, and not crazy, I want to enlarge to the furthest extent possible my sense of my own autonomy, so that in friendship and love relationships I’m eager to take responsibility for both the good and the bad things. I don’t want this attitude of “I was so wonderful and that person did me in.” Even when it’s sometimes true, I’ve managed to convince myself that I was at least co-responsible for bad things that have happened to me, because it actually makes me feel stronger and makes me feel that things could perhaps be different.

The conversation, in which Sontag reaches unprecedented depths of self-revelation, also debunks some misconceptions about her public image as an intellectual in the dry, scholarly sense of the term:

Most of what I do, contrary to what people think, is so intuitive and unpremeditated and not at all that kind of cerebral, calculating thing people imagine it to be. I’m just following my instincts and intuitions. […] An argument appears to me much more like the spokes of a wheel than the links of a chain.

In one of her most poignant insights, Sontag admonishes against our culture’s dangerous polarities and imprisoning stereotypes:

A lot of our ideas about what we can do at different ages and what age means are so arbitrary — as arbitrary as sexual stereotypes. I think that the young-old polarization and the male-female polarization are perhaps the two leading stereotypes that imprison people. The values associated with youth and with masculinity are considered to be the human norms, and anything else is taken to be at least less worthwhile or inferior. Old people have a terrific sense of inferiority. They’re embarrassed to be old. What you can do when you’re young and what you can do when you’re old is as arbitrary and without much basis as what you can do if you’re a woman or what you can do if you’re a man.

Originally featured in November — take a closer look here and here.

7. LETTERS OF NOTE

As a hopeless lover of letters, I was thrilled by the release of Letters of Note: Correspondence Deserving of a Wider Audience (public library) — the aptly titled, superb collection based on Shaun Usher’s indispensable website of the same name, which stands as a heartening exemplar of independent online scholarship and journalism at the intersection of the editorial and the curatorial, and which features timeless treasures from such diverse icons and Brain Pickings favorites as E. B. White, Virginia Woolf, Ursula Nordstrom, Nick Cave, Ray Bradbury, Amelia Earhart, Galileo Galilei, and more.

One of the most beautiful letters in the collection comes from Hunter S. Thompson: gonzo journalism godfather, pundit of media politics, dark philosopher. The letter, which Thompson sent to his friend Hume Logan in 1958, makes for an exquisite addition to luminaries’ reflections on the meaning of life, speaking to what it really means to find your purpose.

Cautious that “all advice can only be a product of the man who gives it” — a caveat other literary legends have stressed with varying degrees of irreverence — Thompson begins with a necessary disclaimer about the very notion of advice-giving:

To give advice to a man who asks what to do with his life implies something very close to egomania. To presume to point a man to the right and ultimate goal — to point with a trembling finger in the RIGHT direction is something only a fool would take upon himself.

And yet he honors his friend’s request, turning to Shakespeare for an anchor of his own advice:

“To be, or not to be: that is the question: Whether ’tis nobler in the mind to suffer the slings and arrows of outrageous fortune, or to take arms against a sea of troubles…”

And indeed, that IS the question: whether to float with the tide, or to swim for a goal. It is a choice we must all make consciously or unconsciously at one time in our lives. So few people understand this! Think of any decision you’ve ever made which had a bearing on your future: I may be wrong, but I don’t see how it could have been anything but a choice however indirect — between the two things I’ve mentioned: the floating or the swimming.

He acknowledges the obvious question of why not take the path of least resistance and float aimlessly, then counters it:

The answer — and, in a sense, the tragedy of life — is that we seek to understand the goal and not the man. We set up a goal which demands of us certain things: and we do these things. We adjust to the demands of a concept which CANNOT be valid. When you were young, let us say that you wanted to be a fireman. I feel reasonably safe in saying that you no longer want to be a fireman. Why? Because your perspective has changed. It’s not the fireman who has changed, but you.

Touching on the same notion that William Gibson termed “personal micro-culture,” Austin Kleon captured in asserting that “you are the mashup of what you let into your life,” and Paula Scher articulated so succinctly in speaking of the combinatorial nature of our creativity, Thompson writes:

Every man is the sum total of his reactions to experience. As your experiences differ and multiply, you become a different man, and hence your perspective changes. This goes on and on. Every reaction is a learning process; every significant experience alters your perspective.

So it would seem foolish, would it not, to adjust our lives to the demands of a goal we see from a different angle every day? How could we ever hope to accomplish anything other than galloping neurosis?

The answer, then, must not deal with goals at all, or not with tangible goals, anyway. It would take reams of paper to develop this subject to fulfillment. God only knows how many books have been written on “the meaning of man” and that sort of thing, and god only knows how many people have pondered the subject. (I use the term “god only knows” purely as an expression.)* There’s very little sense in my trying to give it up to you in the proverbial nutshell, because I’m the first to admit my absolute lack of qualifications for reducing the meaning of life to one or two paragraphs.

Resolving to steer clear of the word “existentialism,” Thompson nonetheless strongly urges his friend to read Sartre’s Being and Nothingness and the anthology Existentialism: From Dostoyevsky to Sartre, then admonishes against succumbing to faulty definitions of success at the expense of finding one’s own purpose:

To put our faith in tangible goals would seem to be, at best, unwise. So we do not strive to be firemen, we do not strive to be bankers, nor policemen, nor doctors. WE STRIVE TO BE OURSELVES.

But don’t misunderstand me. I don’t mean that we can’t BE firemen, bankers, or doctors—but that we must make the goal conform to the individual, rather than make the individual conform to the goal. In every man, heredity and environment have combined to produce a creature of certain abilities and desires—including a deeply ingrained need to function in such a way that his life will be MEANINGFUL. A man has to BE something; he has to matter.

As I see it then, the formula runs something like this: a man must choose a path which will let his ABILITIES function at maximum efficiency toward the gratification of his DESIRES. In doing this, he is fulfilling a need (giving himself identity by functioning in a set pattern toward a set goal) he avoids frustrating his potential (choosing a path which puts no limit on his self-development), and he avoids the terror of seeing his goal wilt or lose its charm as he draws closer to it (rather than bending himself to meet the demands of that which he seeks, he has bent his goal to conform to his own abilities and desires).

In short, he has not dedicated his life to reaching a pre-defined goal, but he has rather chosen a way of life he KNOWS he will enjoy. The goal is absolutely secondary: it is the functioning toward the goal which is important. And it seems almost ridiculous to say that a man MUST function in a pattern of his own choosing; for to let another man define your own goals is to give up one of the most meaningful aspects of life — the definitive act of will which makes a man an individual.

Noting that his friend had thus far lived “a vertical rather than horizontal existence,” Thompson acknowledges the difficulty of this choice but admonishes that, however hard, the choice must be made, lest it be made for us by the default modes of society:

A man who procrastinates in his CHOOSING will inevitably have his choice made for him by circumstance. So if you now number yourself among the disenchanted, then you have no choice but to accept things as they are, or to seriously seek something else. But beware of looking for goals: look for a way of life. Decide how you want to live and then see what you can do to make a living WITHIN that way of life. But you say, “I don’t know where to look; I don’t know what to look for.”

And there’s the crux. Is it worth giving up what I have to look for something better? I don’t know — is it? Who can make that decision but you? But even by DECIDING TO LOOK, you go a long way toward making the choice.

He ends by returning to his original disclaimer, reiterating that rather than a prescription for living, his “advice” is merely a reminder that how and what we choose — choices we’re in danger of forgetting even exist — shapes the course and experience of our lives:

I’m not trying to send you out “on the road” in search of Valhalla, but merely pointing out that it is not necessary to accept the choices handed down to you by life as you know it. There is more to it than that — no one HAS to do something he doesn’t want to do for the rest of his life.

Originally featured in November.

8. INTUITION PUMPS

“If you are not making mistakes, you’re not taking enough risks,” Debbie Millman counseled. “Make New Mistakes. Make glorious, amazing mistakes. Make mistakes nobody’s ever made before,” Neil Gaiman advised young creators. In Intuition Pumps and Other Tools for Thinking (public library), also among the year’s best psychology books, the inimitable Daniel Dennett, one of our greatest living philosophers, offers a set of thinking tools — “handy prosthetic imagination-extenders and focus holders” — that allow us to “think reliably and even gracefully about really hard questions” and enhance our cognitive toolkit. He calls these tools “intuition pumps” — thought experiments designed to stir “a heartfelt, table-thumping intuition” (which we know is a pillar of even the most “rational” of science) about the question at hand, a kind of persuasion tool whose reverse-engineering enables us to think better about thinking itself. Intuition, of course, is a domain-specific ability that relies on honed critical thinking rather than a mystical quality bestowed by the gods — but that’s precisely Dennett’s point, and his task is to help us hone it.

Though most of his 77 “intuition pumps” address concrete questions, a dozen are “general-purpose” tools that apply deeply and widely, across just about any domain of thinking. The first of them is also arguably the most useful yet most uncomfortable: making mistakes.

Echoing Dorion Sagan’s case for why science and philosophy need each other, Dennett begins with an astute contribution to the best definitions of philosophy, wrapped in a necessary admonition about the value of history:

The history of philosophy is in large measure the history of very smart people making very tempting mistakes, and if you don’t know the history, you are doomed to making the same darn mistakes all over again. … There is no such thing as philosophy-free science, just science that has been conducted without any consideration of its underlying philosophical assumptions.

He speaks for the generative potential of mistakes and their usefulness as an empirical tool:

Sometimes you don’t just want to risk making mistakes; you actually want to make them — if only to give you something clear and detailed to fix.

Therein lies the power of mistakes as a vehicle for, as Rilke famously put it, “living the questions” and thus advancing knowledge in a way that certainty cannot — for, as Richard Feynman memorably noted, the scientist’s job is to remain unsure, and so seems the philosopher’s. Dennett writes:

We philosophers are mistake specialists. … While other disciplines specialize in getting the right answers to their defining questions, we philosophers specialize in all the ways there are of getting things so mixed up, so deeply wrong, that nobody is even sure what the right questions are, let alone the answers. Asking the wrong questions risks setting any inquiry off on the wrong foot. Whenever that happens, this is a job for philosophers! Philosophy — in every field of inquiry — is what you have to do until you figure out what questions you should have been asking in the first place.

[…]

Mistakes are not just opportunities for learning; they are, in an important sense, the only opportunity for learning or making something truly new. Before there can be learning, there must be learners. There are only two non-miraculous ways for learners to come into existence: they must either evolve or be designed and built by learners that evolved. Biological evolution proceeds by a grand, inexorable process of trial and error — and without the errors the trials wouldn’t accomplish anything.

Dennett offers a caveat that at once highlights the importance of acquiring knowledge and reminds us of the power of “chance-opportunism”:

Trials can be either blind or foresighted. You, who know a lot, but not the answer to the question at hand, can take leaps — foresighted leaps. You can look before you leap, and hence be somewhat guided from the outset by what you already know. You need not be guessing at random, but don’t look down your nose at random guesses; among its wonderful products is … you!

And since evolution is the highest epitome of how the process of trial and error drives progress, Dennett makes a case for understanding evolution as a key to understanding everything else we humans value:

Evolution … is the central, enabling process not only of life but also of knowledge and learning and understanding. If you attempt to make sense of the world of ideas and meanings, free will and morality, art and science and even philosophy itself without a sound and quite detailed knowledge of evolution, you have one hand tied behind your back. … For evolution, which knows nothing, the steps into novelty are blindly taken by mutations, which are random copying “errors” in DNA.

Dennett echoes Dostoyevsky (“Above all, don’t lie to yourself. The man who lies to himself and listens to his own lie comes to a point that he cannot distinguish the truth within him, or around him, and so loses all respect for himself and for others.”) and offers the key to making productive mistakes:

The chief trick to making good mistakes is not to hide them — especially not from yourself. Instead of turning away in denial when you make a mistake, you should become a connoisseur of your own mistakes, turning them over in your mind as if they were works of art, which in a way they are. … The trick is to take advantage of the particular details of the mess you’ve made, so that your next attempt will be informed by it and not just another blind stab in the dark.

We have all heard the forlorn refrain “Well, it seemed like a good idea at the time!” This phrase has come to stand for the rueful reflection of an idiot, a sign of stupidity, but in fact we should appreciate it as a pillar of wisdom. Any being, any agent, who can truly say, “Well, it seemed like a good idea at the time!” is standing on the threshold of brilliance.

Originally featured in May — read the full article here.

9. LOST CAT

“Dogs are not about something else. Dogs are about dogs,” Malcolm Gladwell asserted indignantly in the introduction to The Big New Yorker Book of Dogs. Though hailed as memetic rulers of the internet, cats have also enjoyed a long history as artistic and literary muses, but never have they been at once more about cats and more about something else than in Lost Cat: A True Story of Love, Desperation, and GPS Technology (public library) by firefighter-turned-writer Caroline Paul and illustrator extraordinaire Wendy MacNaughton, she of many wonderful collaborations — a tender, imaginative memoir, among the year’s best biographies and memoirs and best books on pets and animals, infused with equal parts humor and humanity. Though “about” a cat, this heartwarming and heartbreaking tale is really about what it means to be human — about the osmosis of hollowing loneliness and profound attachment, the oscillation between boundless affection and paralyzing fear of abandonment, the unfair promise of loss implicit to every possibility of love.

After Caroline crashes an experimental plane she was piloting, she finds herself severely injured and spiraling into the depths of depression. It both helps and doesn’t that Caroline and Wendy have just fallen in love, soaring in the butterfly heights of new romance, “the phase of love that didn’t obey any known rules of physics,” until the crash pulls them into a place that would challenge even the most seasoned and grounded of relationships. And yet they persevere as Wendy patiently and lovingly takes care of Caroline.

When Caroline returns from the hospital with a shattered ankle, her two thirteen-year-old tabbies — the shy, anxious Tibby (short for Tibia, affectionately — and, in these circumstances, ironically — named after the shinbone) and the sociable, amicable Fibby (short for Fibula, after the calf bone on the lateral side of the tibia) — are, short of Wendy, her only joy and comfort:

Tibia and Fibula meowed happily when I arrived. They were undaunted by my ensuing stupor. In fact they were delighted; suddenly I had become a human who didn’t shout into a small rectangle of lights and plastic in her hand, peer at a computer, or get up and disappear from the vicinity, only to reappear through the front door hours later. Instead, I was completely available to them at all times. Amazed by their good luck, they took full feline advantage. They asked for ear scratches and chin rubs. They rubbed their whiskers along my face. They purred in response to my slurred, affectionate baby talk. But mostly they just settled in and went to sleep. Fibby snored into my neck. Tibby snored on the rug nearby. Meanwhile I lay awake, circling the deep dark hole of depression.

Without my cats, I would have fallen right in.

And then, one day, Tibby disappears.

Wendy and Caroline proceed to flyer the neighborhood, visit every animal shelter in the vicinity, and even, in their desperation, enlist the help of a psychic who specializes in lost pets — but to no avail. Heartbroken, they begin to mourn Tibby’s loss.

And then, one day five weeks later, Tibby reappears. But once the initial elation of the recovery has worn off, Caroline begins to wonder where he’d been and why he’d left. He is now no longer eating at home and regularly leaves the house for extended periods of time — Tibby clearly has a secret place he now returns to. Even more worrisomely, he’s no longer the shy, anxious tabby he’d been for thirteen years — instead, he’s a half pound heavier, chirpy, with “a youthful spring in his step.” But why would a happy cat abandon his loving lifelong companion and find comfort — find himself, even — elsewhere?

When the relief that my cat was safe began to fade, and the joy of his prone, snoring form — sprawled like an athlete after a celebratory night of boozing — started to wear thin, I was left with darker emotions. Confusion. Jealousy. Betrayal. I thought I’d known my cat of thirteen years. But that cat had been anxious and shy. This cat was a swashbuckling adventurer back from the high seas. What siren call could have lured him away? Was he still going to this gilded place, with its overflowing food bowls and endless treats?

There was only one obvious thing left to do: track Tibby on his escapades. So Caroline, despite Wendy’s lovingly suppressed skepticism, heads to a spy store — yes, those exist — and purchases a real-time GPS tracker, complete with a camera programmed to take snapshots every few minutes, which they then attach to Tibby’s collar.

What follows is a wild, hilarious, and sweet tale of tinkering, tracking, and tenderness. Underpinning the obsessive quest is the subtle yet palpable subplot of Wendy and Caroline’s growing love for each other, the deepening of trust and affection that happens when two people share in a special kind of insanity.

“Every quest is a journey, every journey a story. Every story, in turn, has a moral,” writes Caroline in the final chapter, then offers several “possible morals” for the story, the last two of which embody everything that makes Lost Cat an absolute treat from cover to cover:

6. You can never know your cat. In fact, you can never know anyone as completely as you want.

7. But that’s okay, love is better.

Take a closer look here, then hear MacNaughton and Paul in conversation about combining creative collaboration with a romantic relationship.

10. CODEX SERAPHINIANUS

In 1976, Italian artist, architect, and designer Luigi Serafini, only 27 at the time, set out to create an elaborate encyclopedia of imaginary objects and creatures that fell somewhere between Edward Gorey’s cryptic alphabets, Albertus Seba’s cabinet of curiosities, the book of surrealist games, and Alice in Wonderland. What’s more, it wasn’t written in any ordinary language but in an unintelligible alphabet that appeared to be a constructed language — a script so elaborate that it has resisted every attempt at decipherment. It took him nearly three years to complete the project, and three more to publish it, but when it was finally released, the book — a weird and wonderful masterpiece of art and philosophical provocation on the precipice of the information age — attracted a growing following that continued to gather momentum even as the original edition went out of print.

Now, for the first time in more than thirty years, Codex Seraphinianus (public library) is resurrected in a lavish new edition by Rizzoli — who have a penchant for excavating forgotten gems — featuring a new chapter by Serafini, now in his 60s, and a gorgeous signed print with each deluxe tome. Besides being a visual masterwork, it’s also a timeless meditation on what “reality” really is, one all the timelier in today’s age of such seemingly surrealist feats as bioengineering whole new lifeforms, hurling subatomic particles at each other at nearly the speed of light, and encoding an entire book onto a DNA molecule.

In an interview for Wired Italy, Serafini aptly captures the subtle similarity to children’s books in how the Codex bewitches our grown-up fancy with its bizarre beauty:

What I want my alphabet to convey to the reader is the sensation that children feel in front of books they cannot yet understand. I used it to describe analytically an imaginary world and give a coherent framework. The images originate from the clash between this fantasy vocabulary and the real world. … The Codex became so popular because it makes you feel more comfortable with your fantasies. Another world is not possible, but a fantasy one maybe is.

Playfully addressing the book’s towering price point, Serafini makes a more serious point about how it bespeaks the seductive selectiveness of our attention:

The [new] edition is very rich and also pricey, I know, but it’s just like psychoanalysis: Money matters and the fee is part of the process of healing. At the end of the day, the Codex is similar to the Rorschach inkblot test. You see what you want to see. You might think it’s speaking to you, but it’s just your imagination.

Originally featured in October — see more here.

11. MY BROTHER’S BOOK

For those of us who loved legendary children’s book author Maurice Sendak — famed creator of wild things, little-known illustrator of velveteen rabbits, infinitely warm heart, infinitely witty mind — his death in 2012 was one of the year’s greatest heartaches. Now, half a century after his iconic Where the Wild Things Are, comes My Brother’s Book (public library), one of the year’s best children’s books — a bittersweet posthumous farewell to the world, illustrated in vibrant, dreamsome watercolors and written in verse inspired by some of Sendak’s lifelong influences: Shakespeare, Blake, Keats, and the music of Mozart. In fact, a foreword by Shakespeare scholar Stephen Greenblatt reveals the book is based on the Bard’s “The Winter’s Tale.”

It tells the story of two brothers, Jack and Guy, torn asunder when a falling star crashes onto Earth. Though on the surface about the beloved author’s own brother Jack, who died 18 years ago, the story is also about the love of Sendak’s life and his partner of fifty years, psychoanalyst Eugene Glynn, whose prolonged illness and eventual loss in 2007 devastated Sendak — the character of Guy reads like a poetic fusion of Sendak and Glynn. And while the story might be a universal “love letter to those who have gone before,” as NPR’s Renee Montagne suggests in Morning Edition, it is in equal measure a private love letter to Glynn. (Sendak passed away the day before President Obama announced his support for same-sex marriage, but Sendak fans were quick to honor both historic moments with a bittersweet homage.)

Indeed, the theme of all-consuming love manifests viscerally in Sendak’s books. Playwright Tony Kushner, a longtime close friend of Sendak’s and one of his most heartfelt mourners, tells NPR:

There’s a lot of consuming and devouring and eating in Maurice’s books. And I think that when people play with kids, there’s a lot of fake ferocity and threats of, you know, devouring — because love is so enormous, the only thing you can think of doing is swallowing the person that you love entirely.

My Brother’s Book ends on a soul-stirring note, tender and poignant in its posthumous light:

And Jack slept safe
Enfolded in his brother’s arms
And Guy whispered ‘Good night
And you will dream of me.’

Originally featured in February.

12. EIGHTY DAYS

“Anything one man can imagine, other men can make real,” science fiction godfather Jules Verne famously proclaimed. He was right about the general sentiment but oh how very wrong about its gendered language: Sixteen years after Verne’s classic novel Around the World in Eighty Days, his vision for speed-circumnavigation would be made real — but by a woman. On the morning of November 14, 1889, Nellie Bly, an audacious newspaper reporter, set out to outpace Verne’s fictional itinerary by circumnavigating the globe in just seventy-five days, an attempt that would end up setting the real-world record for the fastest trip around the world. In Eighty Days: Nellie Bly and Elizabeth Bisland’s History-Making Race Around the World (public library), also among the year’s best history books, Matthew Goodman traces the groundbreaking adventure, beginning with a backdrop of Bly’s remarkable journalistic fortitude and contribution to defying our stubbornly enduring biases about women writers:

No female reporter before her had ever seemed quite so audacious, so willing to risk personal safety in pursuit of a story. In her first exposé for The World, Bly had gone undercover … feigning insanity so that she might report firsthand on the mistreatment of the female patients of the Blackwell’s Island Insane Asylum. … Bly trained with the boxing champion John L. Sullivan; she performed, with cheerfulness but not much success, as a chorus girl at the Academy of Music (forgetting the cue to exit, she momentarily found herself all alone onstage). She visited with a remarkable deaf, dumb, and blind nine-year-old girl in Boston by the name of Helen Keller. Once, to expose the workings of New York’s white slave trade, she even bought a baby. Her articles were by turns lighthearted and scolding and indignant, some meant to edify and some merely to entertain, but all were shot through with Bly’s unmistakable passion for a good story and her uncanny ability to capture the public’s imagination, the sheer force of her personality demanding that attention be paid to the plight of the unfortunate, and, not incidentally, to herself.

For all her extraordinary talent and work ethic, Bly was decidedly unremarkable in appearance — a fact that shouldn’t matter, but one that her critics and commentators would remark upon repeatedly, something we’ve made sadly little progress on in discussing women’s professional, intellectual, and creative merit more than a century later. Goodman paints a portrait of Bly:

She was a young woman in a plaid coat and cap, neither tall nor short, dark nor fair, not quite pretty enough to turn a head: the sort of woman who could, if necessary, lose herself in a crowd.

[…]

Her voice rang with the lilt of the hill towns of western Pennsylvania; there was an unusual rising inflection at the ends of her sentences, the vestige of an Elizabethan dialect that had still been spoken in the hills when she was a girl. She had piercing gray eyes, though sometimes they were called green, or blue-green, or hazel. Her nose was broad at its base and delicately upturned at the end — the papers liked to refer to it as a “retroussé” nose — and it was the only feature about which she was at all self-conscious. She had brown hair that she wore in bangs across her forehead. Most of those who knew her considered her pretty, although this was a subject that in the coming months would be hotly debated in the press.

But, as if the ambitious adventure weren’t scintillating enough, the story takes an unexpected turn: That fateful November morning, as Bly was making her way to the journey’s outset at the Hoboken docks, a man named John Brisben Walker passed her on a ferry in the opposite direction, traveling from Jersey City to Lower Manhattan. He was the publisher of a high-brow magazine titled The Cosmopolitan, the same publication that decades later, under the new ownership of William Randolph Hearst, would take a dive for the commercially low-brow. On his ferry ride, Walker skimmed that morning’s edition of The World and paused over the front-page feature announcing Bly’s planned adventure around the world. A seasoned media manipulator of the public’s voracious appetite for drama, he instantly birthed an idea that would seize upon a unique publicity opportunity — The Cosmopolitan would send another circumnavigator to race against Bly. To keep things equal, it would have to be a woman. To keep them interesting, she’d travel in the opposite direction.

And so it went:

Elizabeth Bisland was twenty-eight years old, and after nearly a decade of freelance writing she had recently obtained a job as literary editor of The Cosmopolitan, for which she wrote a monthly review of recently published books entitled “In the Library.” Born into a Louisiana plantation family ruined by the Civil War and its aftermath, at the age of twenty she had moved to New Orleans and then, a few years later, to New York, where she contributed to a variety of magazines and was regularly referred to as the most beautiful woman in metropolitan journalism. Bisland was tall, with an elegant, almost imperious bearing that accentuated her height; she had large dark eyes and luminous pale skin and spoke in a low, gentle voice. She reveled in gracious hospitality and smart conversation, both of which were regularly on display in the literary salon that she hosted in the little apartment she shared with her sister on Fourth Avenue, where members of New York’s creative set, writers and painters and actors, gathered to discuss the artistic issues of the day. Bisland’s particular combination of beauty, charm, and erudition seems to have been nothing short of bewitching.

But Bisland was no mere literary bombshell. Wary of beauty’s fleeting and superficial nature — she once lamented, “After the period of sex-attraction has passed, women have no power in America” — she blended Edison’s circadian relentlessness with Tchaikovsky’s work ethic:

She took pride in the fact that she had arrived in New York with only fifty dollars in her pocket, and that the thousands of dollars now in her bank account had come by virtue of her own pen. Capable of working for eighteen hours at a stretch, she wrote book reviews, essays, feature articles, and poetry in the classical vein. She was a believer, more than anything else, in the joys of literature, which she had first experienced as a girl in ancient volumes of Shakespeare and Cervantes that she found in the library of her family’s plantation house. (She taught herself French while she churned butter, so that she might read Rousseau’s Confessions in the original — a book, as it turned out, that she hated.) She cared nothing for fame, and indeed found the prospect of it distasteful.

And yet, despite their competitive circumstances and seemingly divergent dispositions, something greater bound the two women together, some ineffable force of culture that quietly united them in a bold defiance of their era’s normative biases:

On the surface the two women … were about as different as could be: one woman a Northerner, the other from the South; one a scrappy, hard-driving crusader, the other priding herself on her gentility; one seeking out the most sensational of news stories, the other preferring novels and poetry and disdaining much newspaper writing as “a wild, crooked, shrieking hodge-podge,” a “caricature of life.” Elizabeth Bisland hosted tea parties; Nellie Bly was known to frequent O’Rourke’s saloon on the Bowery. But each of them was acutely conscious of the unequal position of women in America. Each had grown up without much money and had come to New York to make a place for herself in big-city journalism, achieving a hard-won success in what was still, unquestionably, a man’s world.

Originally featured in May — read the full article, including Bly’s entertaining illustrated packing list, here.

13. DON’T GO BACK TO SCHOOL

“The present education system is the trampling of the herd,” legendary architect Frank Lloyd Wright lamented in 1956. Half a century later, I started Brain Pickings in large part out of frustration and disappointment with my trampling experience of our culturally fetishized “Ivy League education.” I found myself intellectually and creatively unstimulated by the industrialized model of the large lecture hall, the PowerPoint presentations, the standardized tests assessing my rote memorization of facts rather than my ability to transmute that factual knowledge into a pattern-recognition mechanism that connects different disciplines to cultivate wisdom about how the world works and a moral lens on how it should work. So Brain Pickings became the record of my alternative learning, of that cross-disciplinary curiosity that took me from art to psychology to history to science, by way of the myriad pieces of knowledge I discovered — and connected — on my own. I didn’t live up to the entrepreneurial ideal of the college drop-out and begrudgingly graduated “with honors,” but refused to go to my own graduation and decided never to go back to school. Years later, I’ve learned more in the course of writing and researching the thousands of articles to date than in all the years of my formal education combined.

So, in 2012, when I found out that writer Kio Stark was crowdfunding a book that would serve as a manifesto for learning outside formal education, I eagerly chipped in. Now, Don’t Go Back to School: A Handbook for Learning Anything is out, and it is everything I could’ve wished for when I was in college: an essential piece of cultural literacy, and an at once tantalizing and practically grounded assurance that success doesn’t lie at the end of a single highway but is sprinkled along a thousand alternative paths. Stark describes it as “a radical project, the opposite of reform … not about fixing school [but] about transforming learning — and making traditional school one among many options rather than the only option.” Through a series of interviews with independent learners who have reached success and happiness in fields as diverse as journalism, illustration, and molecular biology, Stark — who herself dropped out of a graduate program at Yale, despite being offered a prestigious fellowship — cracks open the secret to defining your own success and finding your purpose outside the factory model of formal education. She notes the patterns that emerge:

People who forgo school build their own infrastructures. They create and borrow and reinvent the best that formal schooling has to offer, and they leave the worst behind. That buys them the freedom to learn on their own terms.

[…]

From their stories, you’ll see that when you step away from the prepackaged structure of traditional education, you’ll discover that there are many more ways to learn outside school than within.

Reflecting on her own exit from academia, Stark articulates a much more broadly applicable insight:

A gracefully executed quit is a beautiful thing, opening up more doors than it closes.

But despite discovering, to her dismay, that “liberal arts graduate school is professional school for professors” (a profession she had no interest in pursuing), Stark did learn something immensely valuable from her third year of independent study, during which she read about 200 books of her own choosing:

I learned how to teach myself. I had to make my own reading lists for the exams, which meant I learned how to take a subject I was interested in and make myself a map for learning it.

The interviews revealed four key common threads: learning is collaborative rather than done alone; the importance of academic credentials in many professions is declining; the most fulfilling learning tends to take place outside of school; and those happiest about learning are those who learn out of intrinsic motivation rather than in pursuit of extrinsic rewards. The first of these insights, of course, appears on the surface to contradict the very notion of “independent learning,” but Stark offers an eloquent semantic caveat:

Independent learning suggests ideas such as “self-taught,” or “autodidact.” These imply that independence means working solo. But that’s just not how it happens. People don’t learn in isolation. When I talk about independent learners, I don’t mean people learning alone. I’m talking about learning that happens independent of schools.

[…]

Anyone who really wants to learn without school has to find other people to learn with and from. That’s the open secret of learning outside of school. It’s a social act. Learning is something we do together.

Independent learners are interdependent learners.

Much of the argument for formal education rests on statistics indicating that people with college and graduate degrees earn more. But those statistics, Stark notes, suffer an important and rarely heeded bias:

The problem is that this statistic is based on long-term data, gathered from a period of moderate loan debt, easy employability, and annual increases in the value of a college degree. These conditions have been the case for college grads for decades. Given the dramatically changed circumstances grads today face, we already know that the trends for debt, employability, and the value of a degree have all degraded, and we cannot assume the trend toward greater lifetime earnings will hold true for the current generation. This is a critical omission from media coverage. The fact is we do not know. There’s absolutely no guarantee it will hold true.

Some heartening evidence suggests the blind reliance on degrees might be beginning to change. Stark cites Zappos CEO Tony Hsieh:

I haven’t looked at a résumé in years. I hire people based on their skills and whether or not they are going to fit our culture.

Another common argument for formal education extols the alleged advantages of its structure, proposing that homework assignments, reading schedules, and regular standardized testing motivate you to learn with greater rigor. But, as Daniel Pink has written about the psychology of motivation, in school, as in work, intrinsic drives far outweigh extrinsic, carrot-and-stick paradigms of reward and punishment, rendering this argument unsound. Stark writes:

Learning outside school is necessarily driven by an internal engine. … [I]ndependent learners stick with the reading, thinking, making, and experimenting by which they learn because they do it for love, to scratch an itch, to satisfy curiosity, following the compass of passion and wonder about the world.

So how can you best fuel that internal engine of learning outside the depot of formal education? Stark offers an essential insight, which places self-discovery at the heart of acquiring external knowledge:

Learning your own way means finding the methods that work best for you and creating conditions that support sustained motivation. Perseverance, pleasure, and the ability to retain what you learn are among the wonderful byproducts of getting to learn using methods that suit you best and in contexts that keep you going. Figuring out your personal approach to each of these takes trial and error.

[…]

For independent learners, it’s essential to find the process and methods that match your instinctual tendencies as a learner. Everyone I talked to went through a period of experimenting and sorting out what works for them, and they’ve become highly aware of their own preferences. They’re clear that learning by methods that don’t suit them shuts down their drive and diminishes their enjoyment of learning. Independent learners also find that their preferred methods are different for different areas. So one of the keys to success and enjoyment as an independent learner is to discover how you learn.

[…]

School isn’t very good at dealing with the multiplicity of individual learning preferences, and it’s not very good at helping you figure out what works for you.

Echoing Neil deGrasse Tyson, who has argued that “every child is a scientist” since curiosity is coded into our DNA, and Sir Ken Robinson, who has lamented that the industrial model of education schools us out of our inborn curiosity, Stark observes:

Any young child you observe displays these traits. But passion and curiosity can be easily lost. School itself can be a primary cause; arbitrary motivators such as grades leave little room for variation in students’ abilities and interests, and fail to reward curiosity itself. There are also significant social factors working against children’s natural curiosity and capacity for learning, such as family support or the lack of it, or a degree of poverty that puts families in survival mode with little room to nurture curiosity.

Stark returns to the question of motivators that do work, once again calling to mind Pink’s advocacy of autonomy, mastery, and purpose as the trifecta of success. She writes:

[T]hree broadly defined elements of the learning experience support internal motivation and the persistence it enables. Internal motivation relies on learners having autonomy in their learning, a progressing sense of competence in their skills and knowledge, and the ability to learn in a concrete or “real world” context rather than in the abstract. These are mostly absent from classroom learning. Autonomy is rare, useful context is absent, and school’s means for affirming competence often feel so arbitrary as to be almost without use — and are sometimes actively demotivating. . . . [A]utonomy means that you follow your own path. You learn what you want to learn, when and how you want to learn it, for your own reasons. Your impetus to learn comes from within because you control the conditions of your learning rather than working within a structure that’s pre-made and inflexible.

The second thing you need to stick with learning independently is to set your own goals toward an increasing sense of competence. You need to create a feedback loop that confirms your work is worth it and keeps you moving forward. In school this is provided by advancing through the steps of the linear path within an individual class or a set curriculum, as well as from feedback from grades and praise.

But Stark found that outside of school, those most successful at learning sought their sense of competence through alternative sources. Many, as James Mangan advised in his 1936 blueprint for acquiring knowledge, solidified their learning by teaching it to other people, increasing their own sense of mastery and deepening their understanding. Others centered their learning around specific projects, which made their progress more modular and thus more attainable. Another cohort cited failure as an essential part of the road to mastery. Stark continues:

The third thing [that] can make or break your ability to sustain internal motivation … is to situate what you’re learning in a context that matters to you. In some cases, the context is a specific project you want to accomplish, which … also functions to support your sense of progress.

She sums up the failings of the establishment:

School is not designed to offer these three conditions; autonomy and context are sorely lacking in classrooms. School can provide a sense of increasing mastery, via grades and moving from introductory classes to harder ones. But a sense of true competence is harder to come by in a school environment. Fortunately, there are professors in higher education who are working to change the motivational structures that underlie their curricula.

The interviews, to be sure, offer a remarkably diverse array of callings, underpinned by a number of shared values and common characteristics. Computational biologist Florian Wagner, for instance, echoes Steve Jobs’s famous words on the secret of life in articulating a sentiment shared by many of the other interviewees:

There is something really special about when you first realize you can figure out really cool things completely on your own. That alone is a valuable lesson in life.

Investigative journalist Quinn Norton subscribes to Mangan’s prescription for learning by teaching:

I ended up teaching [my] knowledge to others at the school. That’s one of my most effective ways to learn, by teaching; you just have to stay a week ahead of your students. … Everything I learned, I immediately turned around and taught to others.

She also used the gift of ignorance to proactively drive her knowledge forward:

When I wanted to learn something new as a professional writer, I’d pitch a story on it. I was interested in neurology, and I figured, why don’t I start interviewing neurologists? The great thing about being a journalist is that you can pick up the phone and talk to anybody. It was just like what I found out about learning from experts on mailing lists. People like to talk about what they know.

Norton speaks to the usefulness of useless knowledge, not only in one’s own intellectual development but also as social currency:

I’m stuffed with trivial, useless knowledge, on a panoply of bizarre topics, so I can find something that they’re interested in that I know something about. Being able to do that is tremendously socially valuable. The exchange of knowledge is a very human way to learn. I try never to walk into a room where I want to get information without knowing what I’m bringing to the other person.

[…]

I think part of the problem with the usual mindset of the student is that it’s like being a sponge. It’s passive. It’s not about having something to bring to the interaction. People who are experts in things are experts because they like learning.

Software engineer, artist, and University of Texas molecular biologist Zack Booth Simpson speaks to the value of cultivating what William Gibson has called “a personal micro-culture” and learning from the people with whom you surround yourself:

In a way, the best education you can get is just talking with people who are really smart and interested in things, and you can get that for the cost of lunch.

Artist Molly Crabapple, who inked this beautiful illustration of Salvador Dalí’s creative credo and live-sketched Susan Cain’s talk on the power of introverts, recalls how self-initiated reading shaped her life:

I was … a constant reader. At home, I lived next to this thrift store that sold paperbacks for 10¢ apiece so I would go and buy massive stacks of paperback books on everything. Everything from trashy 1970s romance novels to Plato. When I went to Europe, I brought with me every single book that I didn’t think I would read voluntarily, because I figured if I was on a bus ride, I would read them. So I read Plato and Dante’s Inferno, and all types of literature. I got my education on the bus.

Originally featured in May — read more here.

* * *

For a subject-specific lens on the year's finest reading, revisit the rest of the best-of reading lists — which covered writing and creativity, photography, psychology and philosophy, art and design, history and biography, science and technology, children's literature, and pets and animals.

BP

