Brain Pickings

11 DECEMBER, 2014

A Burst of Delight and Recognition: E.E. Cummings, the Art of Noticing, and the Spirit of Rebellion

“Cummings despised fear, and his life was lived in defiance of all who ruled by it.”

“The art of seeing has to be learned,” Marguerite Duras memorably wrote. Half a century earlier, a young poet began teaching the world this art, and teaching us to question what is seen, then made another art of that questioning. In E. E. Cummings: A Life (public library | IndieBound), memoirist, biographer, and journalist Susan Cheever chronicles the celebrated poet’s “wildly ambitious attempt at creating a new way of seeing the world through language.”

Cheever considers the three ways in which modernists like Cummings and his coterie — which included such icons as Gertrude Stein, James Joyce, Pablo Picasso, Henri Matisse, and Marcel Duchamp — reshaped culture:

Modernism as Cummings and his mid-twentieth-century colleagues embraced it had three parts. The first was the exploration of using sounds instead of meanings to connect words to the reader’s feelings. The second was the idea of stripping away all unnecessary things to bring attention to form and structure: the formerly hidden skeleton of a work would now be exuberantly visible. The third facet of modernism was an embrace of adversity. In a world seduced by easy understanding, the modernists believed that difficulty enhanced the pleasures of reading. In a Cummings poem the reader must often pick his way toward comprehension, which comes, when it does, in a burst of delight and recognition.

One can’t help but feel the particular timeliness, today, of the third — how often are we offered “a burst of delight and recognition” in our culture of monotonously shrill linkbait as we struggle to glean any semblance of wisdom in the age of information? Cummings knew that equally essential was the capacity to notice the invitation to experience that burst — a capacity ever-shrinking, ever-urgently longed for in our age of compulsive flight from stillness — and he made an art of that noticing. Cheever writes:

[The modernists] were trying to slow down the seemingly inexorable rush of the world, to force people to notice their own lives. In the twenty-first century, that rush has now reached Force Five; we are all inundated with information and given no time to wonder what it means or where it came from. Access without understanding and facts without context have become our daily diet.

(Cummings’s name itself provides tragicomic evidence of our modern hubris in flaunting half-understood, partially correct “facts” — while many people believe, and some would adamantly insist, that the only acceptable spelling of the poet’s name is lowercase, he himself used both lowercase and capitalized versions in signing his work; in fact, he capitalized more frequently than not.)

Cummings cultivated this art of noticing one’s own life with emboldening tenacity. Cheever writes that despite being one of the most popular poets of the 1950s and 1960s, Cummings lived in a tiny, dilapidated Greenwich Village apartment and often struggled to make rent. And yet, “this bothered Cummings not at all”:

He was delighted by almost everything in life except for the institutions and formal rules that he believed sought to deaden feelings.

Indeed, the spirit of rebellion against institutions was central to Cummings’s character and permeated his art. Cheever met Cummings in 1958, toward the end of “his brilliant and controversial forty-year career as this country’s only true modernist poet,” when he did a reading at the “uptight girls’ school” where she was an unhappy teenager “with failing grades.” Cummings was a friend of her father’s — the famed novelist John Cheever — so the evening of the reading ended with the trio sharing a car ride together, during which Cummings delighted himself and his companions by making fun of young Susan’s teachers:

He said the place was more like a prison than a school. It was a hatchery whose goal was to produce uniformity. I was unhappy there? No wonder! I was a spirited and wise young woman. Only a mindless moron (Cummings loved alliteration) could excel in a place like that. What living soul could even survive a week in that assembly line for obedient girls, that pedagogical factory whose only purpose was to turn out so-called educated wives for upper-class blowhards with red faces and swollen bank balances?

When the small party stopped to grab a bite at a burger joint, the two men proudly shared a flask to spike their coffee, but Cheever recalls being “already drunk on a different kind of substance — inspiration” as she fathomed for the first time the idea that authority is to be questioned, that “being right was a petty goal,” and that “being free was the thing to aim for.” Noting that “history has given us very few heretics who have not been burned at the stake,” she anoints Cummings her generation’s “beloved heretic, a Henry David Thoreau for the twentieth century.” (Thoreau, of course, was the grand master of the art of noticing.) Cheever writes of Cummings’s ennobling heretical sensibility:

In his almost three thousand poems he sometimes furiously, sometimes lovingly debunked anything or anyone in power — even death, in his famous poem about Buffalo Bill, with its spangled alliterations and intimate last lines: “and what i want to know is / how do you like your blueeyed boy / Mister Death.”

Cummings despised fear, and his life was lived in defiance of all who ruled by it.

Illustration from the little-known fairy tales Cummings wrote for his only daughter, whom he almost lost.

Both the great irony and the great affirmation of Cummings’s spirit of rebellion against culture’s soul-deadening institutions lie in the fact that he grew up with parents who were “Harvard royalty,” was educated at the iconic institution himself, and even stayed an extra year after graduation to earn a master’s degree in Classics. But he also — and perhaps precisely because of that brush with privilege — exiled himself from the Cambridge community and only returned, reluctantly, shortly before his death thirty years later. Cheever writes of the formative act of rebellion that was his self-expulsion:

His self-imposed exile from Cambridge — a town he had come to hate for its intellectualism, Puritan uptightness, racism, and self-righteous xenophobia — had seemed necessary for him as a man and as a poet. Soon after his 1915 class lecture and after serving in World War I, Cummings had permanently fled to sexy, law-breaking Greenwich Village, where he could hang out with other modernist poets like Marianne Moore, talk with writers like Hart Crane, be admired by Dylan Thomas and Edna St. Vincent Millay, have an affair with another man’s wife, go to burlesque performances at the National Winter Garden, and ask William Carlos Williams for medical advice.

Even though he wrote in one early poem that “the Cambridge ladies who live in furnished souls / are unbeautiful and have comfortable minds,” the reason for his eventual return was that he was offered the Charles Eliot Norton Professorship of Poetry at Harvard — the same prestigious yearlong lectureship that produced Italo Calvino’s unforgettable final legacy and over the years featured such luminaries as Jorge Luis Borges, T.S. Eliot, Aaron Copland, Leonard Bernstein, John Cage, and Umberto Eco. But while Cummings took the gig, he brought to it his own rules and co-opted its conventions for his mission of rebellion.

When 58-year-old Cummings arrived at Harvard that fall, wearing a neck-to-hip corset prescribed by his doctor that he called “the Iron Maiden,” he left no doubts as to his irreverence. He titled what he was about to deliver “nonlectures” and lived up to the promise by delivering them with the same galvanizing, acrobatic, highly performative technique he had developed for his poetry readings. Cheever captures the mesmeric mischief of Cummings’s presence at Harvard by quoting one woman, then a Radcliffe student dragged to the lecture by her mother:

There was a hush when he walked out onto the stage. He was enchanting, captivating, and magnetic. He was very virile and sexual on the stage. I think he made some of the men uncomfortable.

Despite having anguished over whether or not to accept the lectureship, and having almost cancelled it on several occasions, Cummings, according to his wife Marion, never worked harder on anything. Perhaps he saw the lectures as a way to solidify what he stood for, to claim his position as a generation’s “beloved heretic,” and to claim it from within the walls of the institution that stood for the very authority he had made an art of defying and deriding. Cheever writes:

Everything he stood for — a puncturing of pretension, an openness to adventure, a deliciously uncensored attitude when it came to sex, a sly sense of humor fueled by a powerful defiance — is in his opening phrases. He stood at the lectern under the fifty-foot carved ceilings and won the hearts of the audience in a few words. “Let me cordially warn you, at the opening of these so called lectures, that I haven’t the remotest intention of posing as a lecturer.”

Exactly ten years later, he died in the same defiant spirit. Cheever recounts the bittersweet story her father so loved telling:

Marion had called him in to dinner as day faded and the glorious sky lit up with the fires of sunset. “I’ll be there in a moment,” Cummings said. “I’m just going to sharpen the axe.” A few minutes later he crumpled to the ground, felled by a cerebral hemorrhage. He was sixty-seven. That, my father let us all know, was the way to die — still manly and useful, still beloved, still strong. “‘how do you like your blueeyed boy / Mister Death,’” my father growled, his eyes wet with tears.

In the remainder of the altogether entrancing E. E. Cummings: A Life, Cheever goes on to explore the beliefs, irreverences, and experiences that coalesced into the character of this extraordinary man who rebelled through the art of noticing and who continues to bewitch us with his undying “burst of delight and recognition.”

Complement it with the little-known story of Cummings’s only children’s book, which he wrote for the daughter he almost lost, this enchanting album of seventeen songs based on his poems, and the poet’s magnificent reading of “anyone lived in a pretty how town.”

Donating = Loving

In 2014, I poured thousands of hours and tons of love into bringing you (ad-free) Brain Pickings. But it also took some hefty practical expenses to keep things going. If you found any joy and stimulation here over the year, please consider helping me fuel the former and offset the latter by becoming a Supporting Member with a recurring monthly donation of your choosing, between a cup of tea and a good dinner.

You can also become a one-time patron with a single donation in any amount.

Brain Pickings has a free weekly newsletter. It comes out on Sundays and offers the week’s best articles. Here’s what to expect. Like? Sign up.

10 DECEMBER, 2014

How Ada Lovelace, Lord Byron’s Daughter, Became the World’s First Computer Programmer

How a young woman with the uncommon talent of applying poetic imagination to science envisioned the Symbolic Medea that would become the modern computer, sparking the birth of the digital age.

Augusta Ada King, Countess of Lovelace, born Augusta Ada Byron on December 10, 1815, later came to be known simply as Ada Lovelace. Today, she is celebrated as the world’s first computer programmer — the first person to marry the mathematical capabilities of computational machines with the poetic possibilities of symbolic logic applied with imagination. This peculiar combination was the product of Ada’s equally peculiar — and in many ways trying — parenting.

Eleven months before her birth, her father, the great Romantic poet and scandalous playboy Lord Byron, had reluctantly married her mother, Annabella Milbanke, a reserved and mathematically gifted young woman from a wealthy family — reluctantly, because Byron saw in Annabella less a romantic prospect than a hedge against his own dangerous passions, which had carried him along a conveyor belt of indiscriminate affairs with both men and women.

Lord Byron in Albanian dress (Portrait by Thomas Phillips, 1835)

But shortly after Ada was conceived, Lady Byron began suspecting her husband’s incestuous relationship with his half-sister, Augusta. Five weeks after Ada’s birth, Annabella decided to seek a separation. Her attorneys sent Lord Byron a letter stating that “Lady B. positively affirms that she has not at any time spread reports injurious to Lord Byrons [sic] character” — with the subtle but clear implication that unless Lord Byron complied, she might. The poet now came to see his wife, whom he had once called “Princess of Parallelograms” in affectionate reverence for her mathematical talents, as a calculating antagonist, a “Mathematical Medea,” and later came to mock her in his famous epic poem Don Juan: “Her favourite science was the mathematical… She was a walking calculation.”

Augusta Ada Byron as a child

Ada was never to meet her father, who died in Greece at the age of thirty-six. Ada was eight. On his deathbed, he implored his valet: “Oh, my poor dear child! — my dear Ada! My God, could I have seen her! Give her my blessing.” The girl was raised by her mother, who was bent on eradicating any trace of her father’s influence by immersing her in science and math from the time she was four. At twelve, Ada became fascinated by mechanical engineering and wrote a book called Flyology, in which she illustrated with her own plates her plan for constructing a flying apparatus. And yet she felt that part of her — the poetic part — was being repressed. In a bout of teenage defiance, she wrote to her mother:

You will not concede me philosophical poetry. Invert the order! Will you give me poetical philosophy, poetical science?

Indeed, the very friction that had caused her parents to separate created the fusion that made Ada a pioneer of “poetical science.”

That fruitful friction is what Walter Isaacson explores as he profiles Ada in the opening chapter of The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution (public library | IndieBound), alongside such trailblazers as Vannevar Bush, Alan Turing, and Stewart Brand. Isaacson writes:

Ada had inherited her father’s romantic spirit, a trait that her mother tried to temper by having her tutored in mathematics. The combination produced in Ada a love for what she took to calling “poetical science,” which linked her rebellious imagination to her enchantment with numbers. For many, including her father, the rarefied sensibilities of the Romantic era clashed with the techno-excitement of the Industrial Revolution. But Ada was comfortable at the intersection of both eras.

Ada King, Countess of Lovelace (Portrait by Alfred Edward Chalon, 1840)

When she was only seventeen, Ada attended one of legendary English polymath Charles Babbage’s equally legendary salons. There, amid the dancing, readings, and intellectual games, Babbage performed a dramatic demonstration of his Difference Engine, a beast of a calculating machine he was building. Ada was instantly captivated by its poetical possibilities, far beyond what the machine’s own inventor had envisioned. Later, one of her friends would remark: “Miss Byron, young as she was, understood its working, and saw the great beauty of the invention.”

Isaacson outlines the significance of that moment, in both Ada’s life and the trajectory of our culture:

Ada’s love of both poetry and math primed her to see beauty in a computing machine. She was an exemplar of the era of Romantic science, which was characterized by a lyrical enthusiasm for invention and discovery.

[…]

It was a time not unlike our own. The advances of the Industrial Revolution, including the steam engine, mechanical loom, and telegraph, transformed the nineteenth century in much the same way that the advances of the Digital Revolution — the computer, microchip, and Internet — have transformed our own. At the heart of both eras were innovators who combined imagination and passion with wondrous technology, a mix that produced Ada’s poetical science and what the twentieth-century poet Richard Brautigan would call “machines of loving grace.”

Enchanted by the prospect of the “poetical science” she imagined possible, Ada set out to convince Charles Babbage to be her mentor. She pitched him in a letter:

I have a peculiar way of learning, and I think it must be a peculiar man to teach me successfully… Do not reckon me conceited, … but I believe I have the power of going just as far as I like in such pursuits, and where there is so decided a taste, I should almost say a passion, as I have for them, I question if there is not always some portion of natural genius even.

Here, Isaacson makes a peculiar remark: “Whether due to her opiates or her breeding or both,” he writes in quoting that letter, “she developed a somewhat outsize opinion of her own talents and began to describe herself as a genius.” The irony, of course, is that she was a genius — Isaacson himself acknowledges that by the very act of choosing to open his biography of innovation with her. But would a man of such ability and such unflinching confidence in that ability be called out for his “outsize opinion,” for being someone with an “exalted view of [his] talents,” as Isaacson later writes of Ada? If a woman of her indisputable brilliance can’t be proud of her own talent without being dubbed delusional, then, surely, there is little hope for the rest of us mere female mortals to make any claim to confidence without being accused of hubris.

To be sure, if Isaacson didn’t see the immense value of Ada’s cultural contribution, he would not have included her in the book — a book that opens and closes with her, no less. These remarks, then, are perhaps less a matter of lamentable personal opinion than a reflection of limiting cultural conventions and our ambivalence about the admissible level of confidence a woman can have in her own talents.

Isaacson, indeed — despite disputing the title of “the world’s first computer programmer” commonly attributed to Ada — makes clear why her contribution deserves celebration:

Ada’s ability to appreciate the beauty of mathematics is a gift that eludes many people, including some who think of themselves as intellectual. She realized that math was a lovely language, one that describes the harmonies of the universe and can be poetic at times. Despite her mother’s efforts, she remained her father’s daughter, with a poetic sensibility that allowed her to view an equation as a brushstroke that painted an aspect of nature’s physical splendor, just as she could visualize the “wine-dark sea” or a woman who “walks in beauty, like the night.” But math’s appeal went even deeper; it was spiritual. Math “constitutes the language through which alone we can adequately express the great facts of the natural world,” she said, and it allows us to portray the “changes of mutual relationship” that unfold in creation. It is “the instrument through which the weak mind of man can most effectually read his Creator’s works.”

This ability to apply imagination to science characterized the Industrial Revolution as well as the computer revolution, for which Ada was to become a patron saint. She was able, as she told Babbage, to understand the connection between poetry and analysis in ways that transcended her father’s talents. “I do not believe that my father was (or ever could have been) such a Poet as I shall be an Analyst; for with me the two go together indissolubly,” she wrote.

But Ada’s most important contribution came from her dual role as a vocal champion of Babbage’s ideas, at a time when society dismissed them as ludicrous, and as an amplifier of their potential beyond what Babbage himself had imagined. Isaacson writes:

Ada Lovelace fully appreciated the concept of a general-purpose machine. More important, she envisioned an attribute that might make it truly amazing: it could potentially process not only numbers but any symbolic notations, including musical and artistic ones. She saw the poetry in such an idea, and she set out to encourage others to see it as well.

Trial model of Babbage's Analytical Engine, completed after his death (Science Museum)

In her 1843 Notes, appended to her translation of a paper on Babbage’s Analytical Engine, she outlined four essential concepts that would shape the birth of modern computing a century later. First, she envisioned a general-purpose machine capable not only of performing preprogrammed tasks but also of being reprogrammed to execute a practically unlimited range of operations — in other words, as Isaacson points out, she envisioned the modern computer.

Her second concept would become a cornerstone of the digital age — the idea that such a machine could handle far more than mathematical calculations; that it could be a Symbolic Medea capable of processing musical and artistic notations. Isaacson writes:

This insight would become the core concept of the digital age: any piece of content, data, or information — music, text, pictures, numbers, symbols, sounds, video — could be expressed in digital form and manipulated by machines. Even Babbage failed to see this fully; he focused on numbers. But Ada realized that the digits on the cogs could represent things other than mathematical quantities. Thus did she make the conceptual leap from machines that were mere calculators to ones that we now call computers.

Her third innovation was a step-by-step outline of “the workings of what we now call a computer program or algorithm.” But it was her fourth one, Isaacson notes, that was and still remains most momentous — the question of whether machines can think independently, which we still struggle to answer in the age of Siri-inspired fantasies like the movie Her. Ada wrote in her Notes:

The Analytical Engine has no pretensions whatever to originate anything. It can do whatever we know how to order it to perform. It can follow analysis; but it has no power of anticipating any analytical relations or truths.

In the closing chapter, titled “Ada Forever,” Isaacson considers the enduring implications of this question:

Ada might also be justified in boasting that she was correct, at least thus far, in her more controversial contention: that no computer, no matter how powerful, would ever truly be a “thinking” machine. A century after she died, Alan Turing dubbed this “Lady Lovelace’s Objection” and tried to dismiss it by providing an operational definition of a thinking machine — that a person submitting questions could not distinguish the machine from a human — and predicting that a computer would pass this test within a few decades. But it’s now been more than sixty years, and the machines that attempt to fool people on the test are at best engaging in lame conversation tricks rather than actual thinking. Certainly none has cleared Ada’s higher bar of being able to “originate” any thoughts of its own.

In encapsulating Ada’s ultimate legacy, Isaacson once again touches on our ambivalence about the mythologies of genius — perhaps even more so of women’s genius — and finds wisdom in her own words:

As she herself wrote in those “Notes,” referring to the Analytical Engine but in words that also describe her fluctuating reputation, “In considering any new subject, there is frequently a tendency, first, to overrate what we find to be already interesting or remarkable; and, secondly, by a sort of natural reaction, to undervalue the true state of the case.”

The reality is that Ada’s contribution was both profound and inspirational. More than Babbage or any other person of her era, she was able to glimpse a future in which machines would become partners of the human imagination, together weaving tapestries as beautiful as those from Jacquard’s loom. Her appreciation for poetical science led her to celebrate a proposed calculating machine that was dismissed by the scientific establishment of her day, and she perceived how the processing power of such a device could be used on any form of information. Thus did Ada, Countess of Lovelace, help sow the seeds for a digital age that would blossom a hundred years later.

Ada died of progressively debilitating uterine cancer in 1852, when she was thirty-six — the same age as Lord Byron. She requested that she be buried in a country grave, alongside the father whom she never knew but whose poetical sensibility profoundly shaped her own genius of “poetical science.”

The Innovators goes on to trace Ada’s influence as it reverberates through the seminal work of a stable of technological pioneers over the century and a half since her death. Complement it with Ada’s spirited letter on science and religion.


10 DECEMBER, 2014

Elie Wiesel’s Timely Nobel Peace Prize Acceptance Speech on Human Rights and Our Shared Duty in Ending Injustice

“We must always take sides. Neutrality helps the oppressor, never the victim. Silence encourages the tormentor, never the tormented.”

In 1986, at the age of fifty-eight, Romanian-born Jewish-American writer and political activist Elie Wiesel was awarded the Nobel Peace Prize. The Nobel committee called him a “messenger to mankind.” Wiesel lived up to that moniker with exquisite eloquence on December 10 that year — exactly ninety years after Alfred Nobel died — as he took the stage at Norway’s Oslo City Hall and delivered a spectacular speech on justice, oppression, and our individual responsibility in our shared freedom. The address was eventually included in Elie Wiesel: Messenger for Peace (public library | IndieBound).

Three decades later, Wiesel’s words ring with discomfiting timeliness as we are jolted out of our generational hubris, out of the illusion of progress, and forced to confront the contemporary realities of racism, torture, and other injustices against human dignity. But alongside the reminder of how tragically we have failed Wiesel’s vision is the promise of possibility, reminding us what soaring heights of the human spirit we are capable of reaching if we choose to feed not our lowest impulses but our most exalted. Above all, Wiesel issues an assurance that these choices are not grandiose and reserved for those in power but daily and deeply personal, found in the quality of intention with which we each live our lives.

With the hard-earned wisdom of his own experience as a Holocaust survivor, memorably recounted in his iconic memoir Night, Wiesel extols our duty to speak up against injustice even when the world retreats into the hideout of silence:

I remember: it happened yesterday or eternities ago. A young Jewish boy discovered the kingdom of night. I remember his bewilderment, I remember his anguish. It all happened so fast. The ghetto. The deportation. The sealed cattle car. The fiery altar upon which the history of our people and the future of mankind were meant to be sacrificed.

I remember: he asked his father: “Can this be true?” This is the twentieth century, not the Middle Ages. Who would allow such crimes to be committed? How could the world remain silent?

And now the boy is turning to me: “Tell me,” he asks. “What have you done with my future? What have you done with your life?”

And I tell him that I have tried. That I have tried to keep memory alive, that I have tried to fight those who would forget. Because if we forget, we are guilty, we are accomplices.

And then I explained to him how naïve we were, that the world did know and remained silent. And that is why I swore never to be silent whenever and wherever human beings endure suffering and humiliation. We must always take sides. Neutrality helps the oppressor, never the victim. Silence encourages the tormentor, never the tormented. Sometimes we must interfere. When human lives are endangered, when human dignity is in jeopardy, national borders and sensitivities become irrelevant. Wherever men or women are persecuted because of their race, religion, or political views, that place must — at that moment — become the center of the universe.

Wiesel reminds us that even politically momentous dissent always begins with a personal act — with a single voice refusing to be silenced:

There is so much injustice and suffering crying out for our attention: victims of hunger, of racism, and political persecution, writers and poets, prisoners in so many lands governed by the Left and by the Right. Human rights are being violated on every continent. More people are oppressed than free.

[…]

There is much to be done, there is much that can be done. One person, … one person of integrity, can make a difference, a difference of life and death. As long as one dissident is in prison, our freedom will not be true. As long as one child is hungry, our lives will be filled with anguish and shame. What all these victims need above all is to know that they are not alone; that we are not forgetting them, that when their voices are stifled we shall lend them ours, that while their freedom depends on ours, the quality of our freedom depends on theirs.

This is what I say to the young Jewish boy wondering what I have done with his years. It is in his name that I speak to you and that I express to you my deepest gratitude. No one is as capable of gratitude as one who has emerged from the kingdom of night. We know that every moment is a moment of grace, every hour an offering; not to share them would mean to betray them. Our lives no longer belong to us alone; they belong to all those who need us desperately.

Complement with Viktor Frankl on the human search for meaning and Aung San Suu Kyi, who was awarded the Nobel Peace Prize herself five years later, on freedom from fear, then revisit William Faulkner’s piercing Nobel Prize acceptance speech on the role of the writer as a booster of the human heart, Albert Camus’s beautiful letter of gratitude to his childhood teacher upon receiving the coveted accolade, and the story of why Jean-Paul Sartre became the first person to decline the prestigious prize.
