Brain Pickings

How Ada Lovelace, Lord Byron’s Daughter, Became the World’s First Computer Programmer


How a young woman with the uncommon talent of applying poetic imagination to science envisioned the Symbolic Medea that would become the modern computer, sparking the birth of the digital age.

Augusta Ada King, Countess of Lovelace, born Augusta Ada Byron on December 10, 1815, later came to be known simply as Ada Lovelace. Today, she is celebrated as the world’s first computer programmer — the first person to marry the mathematical capabilities of computational machines with the poetic possibilities of symbolic logic applied with imagination. This peculiar combination was the product of Ada’s equally peculiar — and in many ways trying — parenting.

Eleven months before her birth, her father, the great Romantic poet and scandalous playboy Lord Byron, had reluctantly married her mother, Annabella Milbanke, a reserved and mathematically gifted young woman from a wealthy family — reluctantly, because Byron saw in Annabella less a romantic prospect than a hedge against his own dangerous passions, which had carried him along a conveyor belt of indiscriminate affairs with both men and women.

Lord Byron in Albanian dress (Portrait by Thomas Phillips, 1835)

But shortly after Ada was conceived, Lady Byron began suspecting her husband’s incestuous relationship with his half-sister, Augusta. Five weeks after Ada’s birth, Annabella decided to seek a separation. Her attorneys sent Lord Byron a letter stating that “Lady B. positively affirms that she has not at any time spread reports injurious to Lord Byrons [sic] character” — with the subtle but clear implication that unless Lord Byron complied, she might. The poet now came to see his wife, whom he had once called “Princess of Parallelograms” in affectionate reverence for her mathematical talents, as a calculating antagonist, a “Mathematical Medea,” and later came to mock her in his famous epic poem Don Juan: “Her favourite science was the mathematical… She was a walking calculation.”

Augusta Ada Byron as a child

Ada was never to meet her father, who died in Greece at the age of thirty-six. Ada was eight. On his deathbed, he implored his valet: “Oh, my poor dear child! — my dear Ada! My God, could I have seen her! Give her my blessing.” The girl was raised by her mother, who was bent on eradicating any trace of her father’s influence by immersing her in science and math from the time she was four. At twelve, Ada became fascinated by mechanical engineering and wrote a book called Flyology, in which she illustrated with her own plates her plan for constructing a flying apparatus. And yet she felt that part of her — the poetic part — was being repressed. In a bout of teenage defiance, she wrote to her mother:

You will not concede me philosophical poetry. Invert the order! Will you give me poetical philosophy, poetical science?

Indeed, the very friction that had caused her parents to separate created the fusion that made Ada a pioneer of “poetical science.”

That fruitful friction is what Walter Isaacson explores as he profiles Ada in the opening chapter of The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution (public library | IndieBound), alongside such trailblazers as Vannevar Bush, Alan Turing, and Stewart Brand. Isaacson writes:

Ada had inherited her father’s romantic spirit, a trait that her mother tried to temper by having her tutored in mathematics. The combination produced in Ada a love for what she took to calling “poetical science,” which linked her rebellious imagination to her enchantment with numbers. For many, including her father, the rarefied sensibilities of the Romantic era clashed with the techno-excitement of the Industrial Revolution. But Ada was comfortable at the intersection of both eras.

Ada King, Countess of Lovelace (Portrait by Alfred Edward Chalon, 1840)

When she was only seventeen, Ada attended one of legendary English polymath Charles Babbage’s equally legendary salons. There, amid the dancing, readings, and intellectual games, Babbage performed a dramatic demonstration of his Difference Engine, a beast of a calculating machine he was building. Ada was instantly captivated by its poetical possibilities, far beyond what the machine’s own inventor had envisioned. Later, one of her friends would remark: “Miss Byron, young as she was, understood its working, and saw the great beauty of the invention.”

Isaacson outlines the significance of that moment, in both Ada’s life and the trajectory of our culture:

Ada’s love of both poetry and math primed her to see beauty in a computing machine. She was an exemplar of the era of Romantic science, which was characterized by a lyrical enthusiasm for invention and discovery.

[…]

It was a time not unlike our own. The advances of the Industrial Revolution, including the steam engine, mechanical loom, and telegraph, transformed the nineteenth century in much the same way that the advances of the Digital Revolution — the computer, microchip, and Internet — have transformed our own. At the heart of both eras were innovators who combined imagination and passion with wondrous technology, a mix that produced Ada’s poetical science and what the twentieth-century poet Richard Brautigan would call “machines of loving grace.”

Enchanted by the prospect of the “poetical science” she imagined possible, Ada set out to convince Charles Babbage to be her mentor. She pitched him in a letter:

I have a peculiar way of learning, and I think it must be a peculiar man to teach me successfully… Do not reckon me conceited, … but I believe I have the power of going just as far as I like in such pursuits, and where there is so decided a taste, I should almost say a passion, as I have for them, I question if there is not always some portion of natural genius even.

Here, Isaacson makes a peculiar remark: “Whether due to her opiates or her breeding or both,” he writes in quoting that letter, “she developed a somewhat outsize opinion of her own talents and began to describe herself as a genius.” The irony, of course, is that she was a genius — Isaacson himself acknowledges that by the very act of choosing to open his biography of innovation with her. But would a man of such ability and such unflinching confidence in that ability be called out for his “outsize opinion,” for being someone with an “exalted view of [his] talents,” as Isaacson later writes of Ada? If a woman of her indisputable brilliance can’t be proud of her own talent without being dubbed delusional, then, surely, there is little hope for the rest of us mere female mortals to make any claim to confidence without being accused of hubris.

To be sure, if Isaacson didn’t see the immense value of Ada’s cultural contribution, he would not have included her in the book — a book that opens and closes with her, no less. These remarks, then, are perhaps less a matter of lamentable personal opinion than a reflection of limiting cultural conventions and our ambivalence about the admissible level of confidence a woman can have in her own talents.

Isaacson, indeed — despite disputing whether Ada deserves the title of “the world’s first computer programmer” commonly attributed to her — makes clear why her contribution deserves celebration:

Ada’s ability to appreciate the beauty of mathematics is a gift that eludes many people, including some who think of themselves as intellectual. She realized that math was a lovely language, one that describes the harmonies of the universe and can be poetic at times. Despite her mother’s efforts, she remained her father’s daughter, with a poetic sensibility that allowed her to view an equation as a brushstroke that painted an aspect of nature’s physical splendor, just as she could visualize the “wine-dark sea” or a woman who “walks in beauty, like the night.” But math’s appeal went even deeper; it was spiritual. Math “constitutes the language through which alone we can adequately express the great facts of the natural world,” she said, and it allows us to portray the “changes of mutual relationship” that unfold in creation. It is “the instrument through which the weak mind of man can most effectually read his Creator’s works.”

This ability to apply imagination to science characterized the Industrial Revolution as well as the computer revolution, for which Ada was to become a patron saint. She was able, as she told Babbage, to understand the connection between poetry and analysis in ways that transcended her father’s talents. “I do not believe that my father was (or ever could have been) such a Poet as I shall be an Analyst; for with me the two go together indissolubly,” she wrote.

But Ada’s most important contribution came from her role as both a vocal champion of Babbage’s ideas, at a time when society questioned them as ludicrous, and as an amplifier of their potential beyond what Babbage himself had imagined. Isaacson writes:

Ada Lovelace fully appreciated the concept of a general-purpose machine. More important, she envisioned an attribute that might make it truly amazing: it could potentially process not only numbers but any symbolic notations, including musical and artistic ones. She saw the poetry in such an idea, and she set out to encourage others to see it as well.

Trial model of Babbage’s Analytical Engine, completed after his death (Science Museum)

In her 1843 Notes, a supplement to her translation of a paper on Babbage’s Analytical Engine, she outlined four essential concepts that would shape the birth of modern computing a century later. First, she envisioned a general-purpose machine capable not only of performing preprogrammed tasks but also of being reprogrammed to execute a practically unlimited range of operations — in other words, as Isaacson points out, she envisioned the modern computer.

Her second concept would become a cornerstone of the digital age — the idea that such a machine could handle far more than mathematical calculations; that it could be a Symbolic Medea capable of processing musical and artistic notations. Isaacson writes:

This insight would become the core concept of the digital age: any piece of content, data, or information — music, text, pictures, numbers, symbols, sounds, video — could be expressed in digital form and manipulated by machines. Even Babbage failed to see this fully; he focused on numbers. But Ada realized that the digits on the cogs could represent things other than mathematical quantities. Thus did she make the conceptual leap from machines that were mere calculators to ones that we now call computers.

Her third innovation was a step-by-step outline of “the workings of what we now call a computer program or algorithm.” But it was her fourth one, Isaacson notes, that was and still remains most momentous — the question of whether machines can think independently, which we still struggle to answer in the age of Siri-inspired fantasies like the movie Her. Ada wrote in her Notes:

The Analytical Engine has no pretensions whatever to originate anything. It can do whatever we know how to order it to perform. It can follow analysis; but it has no power of anticipating any analytical relations or truths.

In the closing chapter, titled “Ada Forever,” Isaacson considers the enduring implications of this question:

Ada might also be justified in boasting that she was correct, at least thus far, in her more controversial contention: that no computer, no matter how powerful, would ever truly be a “thinking” machine. A century after she died, Alan Turing dubbed this “Lady Lovelace’s Objection” and tried to dismiss it by providing an operational definition of a thinking machine — that a person submitting questions could not distinguish the machine from a human — and predicting that a computer would pass this test within a few decades. But it’s now been more than sixty years, and the machines that attempt to fool people on the test are at best engaging in lame conversation tricks rather than actual thinking. Certainly none has cleared Ada’s higher bar of being able to “originate” any thoughts of its own.

In encapsulating Ada’s ultimate legacy, Isaacson once again touches on our ambivalence about the mythologies of genius — perhaps even more so of women’s genius — and finds wisdom in her own words:

As she herself wrote in those “Notes,” referring to the Analytical Engine but in words that also describe her fluctuating reputation, “In considering any new subject, there is frequently a tendency, first, to overrate what we find to be already interesting or remarkable; and, secondly, by a sort of natural reaction, to undervalue the true state of the case.”

The reality is that Ada’s contribution was both profound and inspirational. More than Babbage or any other person of her era, she was able to glimpse a future in which machines would become partners of the human imagination, together weaving tapestries as beautiful as those from Jacquard’s loom. Her appreciation for poetical science led her to celebrate a proposed calculating machine that was dismissed by the scientific establishment of her day, and she perceived how the processing power of such a device could be used on any form of information. Thus did Ada, Countess of Lovelace, help sow the seeds for a digital age that would blossom a hundred years later.

Ada died of progressively debilitating uterine cancer in 1852, when she was thirty-six — the same age as Lord Byron. She requested that she be buried in a country grave, alongside the father whom she never knew but whose poetical sensibility profoundly shaped her own genius of “poetical science.”

The Innovators goes on to trace Ada’s influence as it reverberates through the seminal work of a stable of technological pioneers over the century and a half since her death. Complement it with Ada’s spirited letter on science and religion.

Donating = Loving

In 2014, I poured thousands of hours and tons of love into bringing you (ad-free) Brain Pickings. But it also took some hefty practical expenses to keep things going. If you found any joy and stimulation here over the year, please consider helping me fuel the former and offset the latter by becoming a Supporting Member with a recurring monthly donation of your choosing, between a cup of tea and a good dinner.





You can also become a one-time patron with a single donation in any amount.





Brain Pickings has a free weekly newsletter. It comes out on Sundays and offers the week’s best articles. Here’s what to expect. Like? Sign up.

Elie Wiesel’s Timely Nobel Peace Prize Acceptance Speech on Human Rights and Our Shared Duty in Ending Injustice


“We must always take sides. Neutrality helps the oppressor, never the victim. Silence encourages the tormentor, never the tormented.”

In 1986, at the age of fifty-eight, Romanian-born Jewish-American writer and political activist Elie Wiesel was awarded the Nobel Peace Prize. The Nobel committee called him a “messenger to mankind.” Wiesel lived up to that moniker with exquisite eloquence on December 10 that year — exactly ninety years after Alfred Nobel died — as he took the stage at Norway’s Oslo City Hall and delivered a spectacular speech on justice, oppression, and our individual responsibility in our shared freedom. The address was eventually included in Elie Wiesel: Messenger for Peace (public library | IndieBound).

Three decades later, Wiesel’s words ring with discomfiting timeliness as we are jolted out of our generational hubris, out of the illusion of progress, forced to confront the contemporary realities of racism, torture, and other injustice against the human experience. But alongside the reminder of how tragically we have failed Wiesel’s vision is also the promise of possibility reminding us what soaring heights of the human spirit we are capable of reaching if we choose to feed not our lowest impulses but our most exalted. Above all, Wiesel issues an assurance that these choices are not grandiose and reserved for those in power but daily and deeply personal, found in the quality of intention with which we each live our lives.

With the hard-earned wisdom of his own experience as a Holocaust survivor, memorably recounted in his iconic memoir Night, Wiesel extols our duty to speak up against injustice even when the world retreats into the hideout of silence:

I remember: it happened yesterday or eternities ago. A young Jewish boy discovered the kingdom of night. I remember his bewilderment, I remember his anguish. It all happened so fast. The ghetto. The deportation. The sealed cattle car. The fiery altar upon which the history of our people and the future of mankind were meant to be sacrificed.

I remember: he asked his father: “Can this be true?” This is the twentieth century, not the Middle Ages. Who would allow such crimes to be committed? How could the world remain silent?

And now the boy is turning to me: “Tell me,” he asks. “What have you done with my future? What have you done with your life?”

And I tell him that I have tried. That I have tried to keep memory alive, that I have tried to fight those who would forget. Because if we forget, we are guilty, we are accomplices.

And then I explained to him how naïve we were, that the world did know and remained silent. And that is why I swore never to be silent whenever and wherever human beings endure suffering and humiliation. We must always take sides. Neutrality helps the oppressor, never the victim. Silence encourages the tormentor, never the tormented. Sometimes we must interfere. When human lives are endangered, when human dignity is in jeopardy, national borders and sensitivities become irrelevant. Wherever men or women are persecuted because of their race, religion, or political views, that place must — at that moment — become the center of the universe.

Wiesel reminds us that even politically momentous dissent always begins with a personal act — with a single voice refusing to be silenced:

There is so much injustice and suffering crying out for our attention: victims of hunger, of racism, and political persecution, writers and poets, prisoners in so many lands governed by the Left and by the Right. Human rights are being violated on every continent. More people are oppressed than free.

[…]

There is much to be done, there is much that can be done. One person, … one person of integrity, can make a difference, a difference of life and death. As long as one dissident is in prison, our freedom will not be true. As long as one child is hungry, our lives will be filled with anguish and shame. What all these victims need above all is to know that they are not alone; that we are not forgetting them, that when their voices are stifled we shall lend them ours, that while their freedom depends on ours, the quality of our freedom depends on theirs.

This is what I say to the young Jewish boy wondering what I have done with his years. It is in his name that I speak to you and that I express to you my deepest gratitude. No one is as capable of gratitude as one who has emerged from the kingdom of night. We know that every moment is a moment of grace, every hour an offering; not to share them would mean to betray them. Our lives no longer belong to us alone; they belong to all those who need us desperately.

Complement with Viktor Frankl on the human search for meaning and Aung San Suu Kyi, who was awarded the Nobel Peace Prize herself five years later, on freedom from fear, then revisit William Faulkner’s piercing Nobel Prize acceptance speech on the role of the writer as a booster of the human heart, Albert Camus’s beautiful letter of gratitude to his childhood teacher upon receiving the coveted accolade, and the story of why Jean Paul Sartre became the first person to decline the prestigious prize.


How We Become Who We Are: Meghan Daum on Nostalgia, Aging, and Why We Romanticize Our Imperfect Younger Selves


“Life is mostly an exercise in being something other than what we used to be while remaining fundamentally—and sometimes maddeningly—who we are.”

In her mind-bending meditation on what makes you and your young self the same person despite a lifetime of changes, philosopher Rebecca Goldstein pondered the philosophical conundrum of our “integrity of identity that persists over time, undergoing changes and yet still continuing to be.” Psychologists, meanwhile, have demonstrated that we’re woefully flawed at predicting the priorities of our future selves. Even so, Joan Didion was right to counsel in her classic essay on keeping a notebook that “we are well advised to keep on nodding terms with the people we used to be, whether we find them attractive company or not.” But the most confounding thing about our relationship with the evolution of our own selves is that we tend to romanticize our youth even if we don’t find the versions of ourselves that inhabited it “attractive company” at all.

This conundrum is one of the many human perplexities Meghan Daum, one of the finest essayists of our time, explores in The Unspeakable: And Other Subjects of Discussion (public library | IndieBound) — a magnificent collection of personal essays examining “the tension between primal reactions and public decorum” and aiming at “a larger discussion about the way human experiences too often come with preassigned emotional responses,” driven by a valiant effort to unbridle those messy, complex experiences from the simplistic templates with which we address them, both privately and publicly.

Meghan Daum (Photograph: Laura Kleinhenz)

In the introduction, Daum echoes Zadie Smith’s piercing critique of our platitudes-paved road to self-actualization and laments the hijacking of our darker, more disquieting emotions by the happiness industrial complex:

For all the lip service we pay to “getting real,” we remain a culture whose discourse is largely rooted in platitudes. We are told — and in turn tell others — that illness and suffering isn’t a ruthless injustice, but a journey of hope. Finding disappointment in places where we’re supposed to find joy isn’t a sign of having different priorities as much as having an insufficiently healthy outlook. We love redemption stories and silver linings. We believe in overcoming adversity, in putting the past behind us, in everyday miracles. We like the idea that everything happens for a reason. When confronted with the suggestion that life is random or that suffering is not always transcendent we’re apt to not only accuse the suggester of rudeness but also pity him for his negative worldview. To reject sentimentality, or even question it, isn’t just uncivilized, it’s practically un-American.

In one of the collection’s most pause-giving essays, titled “Not What It Used to Be,” Daum reflects on the conflicted, paradoxical nostalgia we tend to place on our youth — nostalgia woven of an openness of longing, as the infinite possibilities of life stretch ahead, but also of many misplaced longings for the wrong things, the dangerous things, the dangerously safe things. Daum writes:

Most of us have unconscious disbeliefs about our lives, facts that we accept at face value but that still cause us to gasp just a little when they pass through our minds at certain angles. Mine are these: that my mother is dead, that the Vatican actually had it in itself to select a pope like Pope Francis, and that I am now older than the characters on thirtysomething. That last one is especially upending. How is it that the people who were, for me, the very embodiment of adulthood, who, with their dinner parties and marital spats and career angst represented the place in life I’d like to get to but surely never will, are on average six to eight years my junior? How did I get to be middle-aged without actually growing up?

Illustration by Lisbeth Zwerger from a rare edition of ‘Alice in Wonderland’

In a sentiment that calls to mind Maya Angelou’s unforgettable words on growing up, Daum adds:

Luckily, even some of the most confounding questions have soothingly prosaic answers. On the subject of growing up, or feeling that you have succeeded in doing so, I’m pretty sure the consensus is that it’s an illusion. Probably no one ever really feels grown-up, except for certain high school math teachers or members of Congress. I suspect that most members of AARP go around feeling in many ways just as confused and fraudulent as most middle school students. You might even be able to make a case that not feeling grown-up is a sign that you actually are, much as worrying that you’re crazy supposedly means you’re not.

Daum’s astonishment is especially resonant for those of us who compounded our dissatisfying college experience with the culturally inflicted guilt of feeling that not finding satisfaction there was a profound personal failure:

I managed to have such a mediocre time at a place that is pretty much custom designed for delivering the best years of your life. I’d like to say that I wasn’t the same person back then that I later became and now am. But the truth is that I was the exact same person. I was more myself then than at any other time in my life. I was an extreme version of myself. Everything I’ve always felt I felt more intensely. Everything I’ve always wanted, I wanted more. Everything I currently dislike, I downright hated back then. People who think I’m judgmental, impatient, and obsessed with real estate now should have seen me in college. I was bored by many of my classmates and irked by the contrived mischief and floundering sexual intrigues of dormitory life. I couldn’t wait to get out and rent my own apartment, preferably one in a grand Edwardian building on the Upper West Side of Manhattan. In that sense, I guess my college experience was just as intense as my husband’s. I just view that intensity negatively rather than nostalgically, which perhaps is its own form of nostalgia.

To illuminate that curious misplacing of nostalgia, Daum invokes an imaginary encounter between her present self and her older self — the concept behind an emboldening old favorite of letters by luminaries to their younger selves — in which Older Self ambushes Younger Self “like a goon sent in to settle a debt”:

At first, Younger Self is frightened and irritated (Older Self speaks harshly to her) but a feeling of calm quickly sets in over the encounter. Younger Self sits there rapt, as though receiving the wisdom of Yoda or of some musician she idolizes, such as Joni Mitchell. But Older Self is no Yoda. Older Self is stern and sharp. Older Self has adopted the emphatic, no-nonsense speaking style of formidable women with whom she worked in countless New York City offices before deciding she never again wanted to work anywhere but her own home (a place where, over the years, she has lost a certain amount of people skills and has been known to begin conversations as though slamming a cleaver into a side of raw beef). Older Self begins her sentences with “Listen” and “Look.” She says, “Listen, what you’re into right now isn’t working for you.” She says, “Look, do yourself a favor and get out of this situation right now. All of it. The whole situation. Leave this college. Forget about this boy you’re sleeping with but not actually dating. Stop pretending you did the reading for your Chaucer seminar when you didn’t and never will.”

To which Younger Self will ask, “Okay, then what should I do?” And of course Older Self has no answer, because Older Self did not leave the college, did not drop the boy, did not stop pretending to have read Chaucer. And the cumulative effect of all those failures (or missed opportunities, blown chances, fuckups, whatever) is sitting right here, administering a tongue-lashing to her younger self (which is to say herself) about actions or inactions that were never going to be anything other than what they were. And at that point the younger and older selves merge into some kind of floating blob of unfortunate yet inevitable life choices, at which point I stop the little game and nudge my mind back into real time and try to think about other things, such as what I might have for dinner that night or what might happen when I die. Such is the pendulum of my post-forty thoughts.

And yet the most paradoxical, most endearingly human thing is that most of us invariably fail to see our Younger Self as part of that amalgamated blob and instead romanticize it as the counterpoint to those “unfortunate yet inevitable life choices,” as our highest potentiality at a point before crumbling into the reality of necessary concessions and mediocrities. For all its cluelessness, for all its complicity in the making of our present dissatisfactions, we continue to worship youth — especially our own.

Reflecting on the disorienting fact — because that fact is always disorienting to those of whom it becomes factual — that nothing she ever does will ever be preceded by the word young again, Daum writes:

Any traces of precocity I ever had are long forgotten. I am not and will never again be a young writer, a young homeowner, a young teacher. I was never a young wife. The only thing I could do now for which my youth would be a truly notable feature would be to die. If I died now, I’d die young. Everything else, I’m doing middle-aged.

I am nostalgic for my twenties (most of them, anyway; twenty and twenty-one were squandered at college; twenty-four was kind of a wash, too) but I can tell you for sure that they weren’t as great as I now crack them up to be. I was always broke, I was often lonely, and I had some really terrible clothes. But my life was shiny and unblemished. Everything was ahead of me. I walked around with an abiding feeling that, at any given time, anything could go in any direction. And it was often true.

In a passage that makes one wonder whether contemporary adults are thrust into an illusory sense of youth by the constant stimulation and endless temptations of the internet, Daum describes the ceaseless fear of missing out — FOMO, as the Information Age has shorthanded it — that characterized her youth:

I didn’t want to miss anything. I wanted to stretch out over the city like a giant octopus. I wanted enough appendages to be able to ring every door buzzer simultaneously. There was some switch turned on in my brain that managed to make 90 percent of conversations feel interesting or useful or, if nothing else, worth referencing later if only by way of describing how boring this person was who I got stuck talking to.

And then, echoing Joan Didion’s memorable lament that “memory fades, memory adjusts, memory conforms to what we think we remember,” she adds: “Or at least it’s easy to remember it that way.”

Illustration from ‘Lost in Translation’ by Ella Frances Sanders, an illustrated compendium of untranslatable words from around the world

But the sense she describes is a palpable, familiar one, perhaps best captured by the untranslatable Portuguese word saudade. Daum writes:

This was a time in my life when I was so filled with longing for so many things that were so far out of reach that at least once a day I thought my heart would implode from the sheer force of unrequited desire.

By desire I am not referring to apartments I wanted to occupy or furniture I wanted to buy or even people I was attracted to (well, I’m referring to those things a little) but, rather, a sensation I can only describe as the ache of not being there yet.

She revisits the imaginary encounter between her two selves and considers how gobsmacked Younger Self would be by the notion that a few decades later, she’d be reminiscing fondly about the cumulative timescape of the very things presently exasperating her:

I can imagine her looking at Older Self in horrified astonishment. “I’m going to be reminiscing about this?” she’d ask while the ATM spat out her card and flashed “insufficient funds” across the screen. “You’re telling me that when I’m forty-five I’ll be pining for the temp jobs and cheap shoes that now comprise my life? You’re telling me this is as good as it gets? You’re telling me, contrary to everything I tell myself, that it’s actually all downhill from here?”

To which I’d hope that Older Self would have the good sense to assure Younger Self that that is not what she is saying, that indeed things will only go up from here. Maybe not right away and certainly not without some deep valleys to offset the peaks (as well as a few sharp left turns, as long as we’re speaking in euphemisms) but with enough steadiness to suggest that whatever she is doing now more or less constitutes being on the right track.

What makes this imaginary exchange especially alluring as a thought experiment is precisely the fact that it’s fictional — fictional not because such a fold in the space-time continuum of personal identity is impossible in real life, but because it unfolds at a microscopic level every second of every minute of every day of our real lives. What makes the encounter fictional is the very idea of a static, all-knowing Older Self at any point in life — we are, indeed, “works in progress that mistakenly think they’re finished,” and the dividing line between our past selves and our present ones is a constantly shifting one, not so much a line as a scatterplot of impressions clustering here and there to form some aspect of our present identity, only to disperse again into the ether of our fluid personhood and reassemble in a different formation on which we hang our daily fragment of identity.

But this, perhaps, is what Daum is ultimately getting at. She follows that alluring fiction to its inevitable, necessary end:

“Listen,” Older Self might say. “The things that right now seem permanently out of reach, you’ll reach them eventually. You’ll have a career, a house, a partner in life. You will have much better shoes. You will reach a point where your funds will generally be sufficient — maybe not always plentiful, but sufficient.”

But here’s what Older Self will not have the heart to say: some of the music you are now listening to — the CDs you play while you stare out the window and think about the five million different ways your life might go — will be unbearable to listen to in twenty years. They will be unbearable not because they will sound dated and trite but because they will sound like the lining of your soul. They will take you straight back to the place you were in when you felt that anything could happen at any time, that your life was a huge room with a thousand doors, that your future was not only infinite but also elastic. They will be unbearable because they will remind you that at least half of the things you once planned for your future are now in the past and others got reabsorbed into your imagination before you could even think about acting on them. It will be as though you’d never thought of them in the first place, as if they were never meant to be anything more than passing thoughts you had while playing your stereo at night.

Illustration by Lisbeth Zwerger from a rare edition of 'Alice in Wonderland.'

Daum ends by reflecting on how we manage to romanticize such anguishing times by excising the anguish and framing into our memory only the sense of that octopoid reach into possibility:

Now that I am almost never the youngest person in any room I realize that what I miss most about those times is the very thing that drove me so mad back when I was living in them. What I miss is the feeling that nothing has started yet, that the future towers over the past, that the present is merely a planning phase for the gleaming architecture that will make up the skyline of the rest of my life. But what I forget is the loneliness of all that. If everything is ahead then nothing is behind. You have no ballast. You have no tailwinds either. You hardly ever know what to do, because you’ve hardly done anything. I guess this is why wisdom is supposed to be the consolation prize of aging. It’s supposed to give us better things to do than stand around and watch in disbelief as the past casts long shadows over the future.

With an eye toward the profound rift between who we think we’ll become and who we end up becoming, Daum concludes:

The problem, I now know, is that no one ever really feels wise, least of all those who actually have it in themselves to be so. The Older Self of our imagination never quite folds itself into the older self we actually become. Instead, it hovers in the perpetual distance like a highway mirage. It’s the destination that never gets any closer even as our life histories pile up behind us in the rearview mirror. It is the reason that I got to forty-something without ever feeling thirty-something. It is why I hope that if I make it to eighty-something I have the good sense not to pull out those old CDs. My heart, by then, surely would not be able to keep from imploding. My heart, back then, stayed in one piece only because, as bursting with anticipation as it was, it had not yet been strained by nostalgia. It had not yet figured out that life is mostly an exercise in being something other than what we used to be while remaining fundamentally — and sometimes maddeningly — who we are.
