Brain Pickings


09 JUNE, 2014

How We Grieve: Meghan O’Rourke on the Messiness of Mourning and Learning to Live with Loss


“The people we most love do become a physical part of us, ingrained in our synapses, in the pathways where memories are created.”

John Updike wrote in his memoir, “Each day, we wake slightly altered, and the person we were yesterday is dead. So why, one could say, be afraid of death, when death comes all the time?” And yet even if we were to somehow make peace with our own mortality, a primal and soul-shattering fear rips through us whenever we think about losing those we love most dearly — a fear that metastasizes into all-consuming grief when loss does come. In The Long Goodbye (public library), her magnificent memoir of grieving her mother’s death, Meghan O’Rourke crafts a masterwork of remembrance and reflection woven of extraordinary emotional intelligence. A poet, essayist, literary critic, and one of the youngest editors the New Yorker has ever had, she tells a story that is deeply personal in its details yet richly resonant in its larger humanity, making tangible the messy and often ineffable complexities that anyone who has ever lost a loved one knows all too intimately, all too anguishingly. What makes her writing — her mind, really — particularly enchanting is that she brings to this paralyzingly difficult subject a poet’s emotional precision, an essayist’s intellectual expansiveness, and a voracious reader’s gift for apt, exquisitely placed allusions to such luminaries of language and life as Whitman, Longfellow, Tennyson, Swift, and Dickinson (“the supreme poet of grief”).

O’Rourke writes:

When we are learning the world, we know things we cannot say how we know. When we are relearning the world in the aftermath of a loss, we feel things we had almost forgotten, old things, beneath the seat of reason.

[…]

Nothing prepared me for the loss of my mother. Even knowing that she would die did not prepare me. A mother, after all, is your entry into the world. She is the shell in which you divide and become a life. Waking up in a world without her is like waking up in a world without sky: unimaginable.

[…]

When we talk about love, we go back to the start, to pinpoint the moment of free fall. But this story is the story of an ending, of death, and it has no beginning. A mother is beyond any notion of a beginning. That’s what makes her a mother: you cannot start the story.

In the days following her mother’s death, as O’Rourke faces the loneliness she anticipated and the sense of being lost that engulfed her unawares, she contemplates the paradoxes of loss: Ours is a culture that treats grief — a process of profound emotional upheaval — with a grotesquely mismatched rational prescription. On the one hand, society seems to operate by a set of unspoken shoulds for how we ought to feel and behave in the face of sorrow; on the other, she observes, “we have so few rituals for observing and externalizing loss.” Without a coping strategy, she finds herself shutting down emotionally and going “dead inside” — a feeling psychologists call “numbing out” — and describes the disconnect between her intellectual awareness of sadness and its inaccessible emotional manifestation:

It was like when you stay in cold water too long. You know something is off but don’t start shivering for ten minutes.

But at least as harrowing as the aftermath of loss is the anticipatory bereavement in the months and weeks and days leading up to the inevitable — a particularly cruel reality of terminal cancer. O’Rourke writes:

So much of dealing with a disease is waiting. Waiting for appointments, for tests, for “procedures.” And waiting, more broadly, for it—for the thing itself, for the other shoe to drop.

The hallmark of this anticipatory loss seems to be a tapestry of inner contradictions. O’Rourke notes with exquisite self-awareness her resentment for the mundanity of it all — there is her mother, sipping soda in front of the TV on one of those final days — coupled with weighty, crushing compassion for the sacred humanity of death:

Time doesn’t obey our commands. You cannot make it holy just because it is disappearing.

Then there was the question of the body — the object of so much social and personal anxiety in real life, suddenly stripped of control in the surreal experience of impending death. Reflecting on the initially disorienting experience of helping her mother on and off the toilet and how quickly it became normalized, O’Rourke writes:

It was what she had done for us, back before we became private and civilized about our bodies. In some ways I liked it. A level of anxiety about the body had been stripped away, and we were left with the simple reality: Here it was.

I heard a lot about the idea of dying “with dignity” while my mother was sick. It was only near her very end that I gave much thought to what this idea meant. I didn’t actually feel it was undignified for my mother’s body to fail — that was the human condition. Having to help my mother on and off the toilet was difficult, but it was natural. The real indignity, it seemed, was dying where no one cared for you the way your family did, dying where it was hard for your whole family to be with you and where excessive measures might be taken to keep you alive past a moment that called for letting go. I didn’t want that for my mother. I wanted her to be able to go home. I didn’t want to pretend she wasn’t going to die.

Among the most painful realities of witnessing death — one particularly exasperating for type-A personalities — is how swiftly it severs the direct correlation between effort and outcome around which we build our lives. Though the notion might seem rational on the surface — especially in a culture that fetishizes work ethic and “grit” as the key to success — an underbelly of magical thinking lurks beneath, which comes to light as we behold the helplessness and injustice of premature death. Noting that “the mourner’s mind is superstitious, looking for signs and wonders,” O’Rourke captures this paradox:

One of the ideas I’ve clung to most of my life is that if I just try hard enough it will work out. If I work hard, I will be spared, and I will get what I desire, finding the cave opening over and over again, thieving life from the abyss. This sturdy belief system has a sidecar in which superstition rides. Until recently, I half believed that if a certain song came on the radio just as I thought of it, it meant that all would be well. What did I mean? I preferred not to answer that question. To look too closely was to prick the balloon of possibility.

But our very capacity for the irrational — for the magic of magical thinking — also turns out to be essential for our spiritual survival. Without the capacity to discern from life’s senseless sound a meaningful melody, we would be consumed by the noise. In fact, one of O’Rourke’s most poetic passages recounts her struggle to find a transcendent meaning on an average day, amid the average hospital noises:

I could hear the coughing man whose family talked about sports and sitcoms every time they visited, sitting politely around his bed as if you couldn’t see the death knobs that were his knees poking through the blanket, but as they left they would hug him and say, We love you, and We’ll be back soon, and in their voices and in mine and in the nurse who was so gentle with my mother, tucking cool white sheets over her with a twist of her wrist, I could hear love, love that sounded like a rope, and I began to see a flickering electric current everywhere I looked as I went up and down the halls, flagging nurses, little flecks of light dotting the air in sinewy lines, and I leaned on these lines like guy ropes when I was so tired I couldn’t walk anymore and a voice in my head said: Do you see this love? And do you still not believe?

I couldn’t deny the voice.

Now I think: That was exhaustion.

But at the time the love, the love, it was like ropes around me, cables that could carry us up into the higher floors away from our predicament and out onto the roof and across the empty spaces above the hospital to the sky where we could gaze down upon all the people driving, eating, having sex, watching TV, angry people, tired people, happy people, all doing, all being –

In the weeks following her mother’s death, melancholy — “the black sorrow, bilious, angry, a slick in my chest” — comes coupled with another intense emotion, a parallel longing for a different branch of that-which-no-longer-is:

I experienced an acute nostalgia. This longing for a lost time was so intense I thought it might split me in two, like a tree hit by lightning. I was — as the expression goes — flooded by memories. It was a submersion in the past that threatened to overwhelm any “rational” experience of the present, water coming up around my branches, rising higher. I did not care much about work I had to do. I was consumed by memories of seemingly trivial things.

But the embodied presence of the loss is far from trivial. O’Rourke, citing a psychiatrist whose words had stayed with her, captures it with harrowing precision:

The people we most love do become a physical part of us, ingrained in our synapses, in the pathways where memories are created.

In another breathtaking passage, O’Rourke conveys the largeness of grief as it emanates out of our pores and into the world that surrounds us:

In February, there was a two-day snowstorm in New York. For hours I lay on my couch, reading, watching the snow drift down through the large elm outside … the sky going gray, then eerie violet, the night breaking around us, snow like flakes of ash. A white mantle covered trees, cars, lintels, and windows. It was like one of grief’s moods: melancholic; estranged from the normal; in touch with the longing that reminds us that we are being-toward-death, as Heidegger puts it. Loss is our atmosphere; we, like the snow, are always falling toward the ground, and most of the time we forget it.

Because grief seeps into the external world as the inner experience bleeds into the outer, it’s understandable — it’s hopelessly human — that we’d also project the very object of our grief onto the external world. One of the most common experiences, O’Rourke notes, is for the grieving to try to bring back the dead — not literally, but by seeing, seeking, signs of them in the landscape of life, symbolism in the everyday. The mind, after all, is a pattern-recognition machine and when the mind’s eye is as heavily clouded with a particular object as it is when we grieve a loved one, we begin to manufacture patterns. Recounting a day when she found inside a library book handwriting that seemed to be her mother’s, O’Rourke writes:

The idea that the dead might not be utterly gone has an irresistible magnetism. I’d read something that described what I had been experiencing. Many people go through what psychologists call a period of “animism,” in which you see the dead person in objects and animals around you, and you construct your false reality, the reality where she is just hiding, or absent. This was the mourner’s secret position, it seemed to me: I have to say this person is dead, but I don’t have to believe it.

[…]

Acceptance isn’t necessarily something you can choose off a menu, like eggs instead of French toast. Instead, researchers now think that some people are inherently primed to accept their own death with “integrity” (their word, not mine), while others are primed for “despair.” Most of us, though, are somewhere in the middle, and one question researchers are now focusing on is: How might more of those in the middle learn to accept their deaths? The answer has real consequences for both the dying and the bereaved.

O’Rourke considers the psychology and physiology of grief:

When you lose someone you were close to, you have to reassess your picture of the world and your place in it. The more your identity is wrapped up with the deceased, the more difficult the mental work.

The first systematic survey of grief, I read, was conducted by Erich Lindemann. Having studied 101 people, many of them related to the victims of the Cocoanut Grove fire of 1942, he defined grief as “sensations of somatic distress occurring in waves lasting from twenty minutes to an hour at a time, a feeling of tightness in the throat, choking with shortness of breath, need for sighing, and an empty feeling in the abdomen, lack of muscular power, and an intensive subjective distress described as tension or mental pain.”

Tracing the history of studying grief, including Elisabeth Kübler-Ross’s famous and often criticized 1969 “stage theory” outlining a simple sequence of Denial, Anger, Bargaining, Depression, and Acceptance, O’Rourke notes that most people experience grief not as sequential stages but as ebbing and flowing states that recur at various points throughout the process. She writes:

Researchers now believe there are two kinds of grief: “normal grief” and “complicated grief” (also called “prolonged grief”). “Normal grief” is a term for what most bereaved people experience. It peaks within the first six months and then begins to dissipate. “Complicated grief” does not, and often requires medication or therapy. But even “normal grief” … is hardly gentle. Its symptoms include insomnia or other sleep disorders, difficulty breathing, auditory or visual hallucinations, appetite problems, and dryness of mouth.

One of the most persistent psychiatric ideas about grief, O’Rourke notes, is the notion that one ought to “let go” in order to “move on” — a proposition plentiful even in the casual advice of her friends in the weeks following her mother’s death. And yet it isn’t necessarily the right coping strategy for everyone, let alone the only one, as our culture seems to suggest. Unwilling to “let go,” O’Rourke finds solace in anthropological alternatives:

Studies have shown that some mourners hold on to a relationship with the deceased with no notable ill effects. In China, for instance, mourners regularly speak to dead ancestors, and one study demonstrated that the bereaved there “recovered more quickly from loss” than bereaved Americans do.

I wasn’t living in China, though, and in those weeks after my mother’s death, I felt that the world expected me to absorb the loss and move forward, like some kind of emotional warrior. One night I heard a character on 24—the president of the United States—announce that grief was a “luxury” she couldn’t “afford right now.” This model represents an old American ethic of muscling through pain by throwing yourself into work; embedded in it is a desire to avoid looking at death. We’ve adopted a sort of “Ask, don’t tell” policy. The question “How are you?” is an expression of concern, but as my dad had said, the mourner quickly figures out that it shouldn’t always be taken for an actual inquiry… A mourner’s experience of time isn’t like everyone else’s. Grief that lasts longer than a few weeks may look like self-indulgence to those around you. But if you’re in mourning, three months seems like nothing — [according to some] research, three months might well find you approaching the height of sorrow.

Another hegemonic norm in the Western culture of grief, O’Rourke notes, is its privatization — the unspoken rule that mourning is something we do in the privacy of our inner lives, alone, away from the public eye. Though for centuries private grief was externalized as public mourning, modernity has left us bereft of rituals to help us deal with our grief:

The disappearance of mourning rituals affects everyone, not just the mourner. One of the reasons many people are unsure about how to act around a loss is that they lack rules or meaningful conventions, and they fear making a mistake. Rituals used to help the community by giving everyone a sense of what to do or say. Now, we’re at sea.

[…]

Such rituals … aren’t just about the individual; they are about the community.

Craving “a formalization of grief, one that might externalize it,” O’Rourke plunges into the existing literature:

The British anthropologist Geoffrey Gorer, the author of Death, Grief, and Mourning, argues that, at least in Britain, the First World War played a huge role in changing the way people mourned. Communities were so overwhelmed by the sheer number of dead that the practice of ritualized mourning for the individual eroded. Other changes were less obvious but no less important. More people, including women, began working outside the home; in the absence of caretakers, death increasingly took place in the quarantining swaddle of the hospital. The rise of psychoanalysis shifted attention from the communal to the individual experience. In 1917, only two years after Émile Durkheim wrote about mourning as an essential social process, Freud’s “Mourning and Melancholia” defined it as something essentially private and individual, internalizing the work of mourning. Within a few generations, I read, the experience of grief had fundamentally changed. Death and mourning had been largely removed from the public realm. By the 1960s, Gorer could write that many people believed that “sensible, rational men and women can keep their mourning under complete control by strength of will and character, so that it need be given no public expression, and indulged, if at all, in private, as furtively as . . . masturbation.” Today, our only public mourning takes the form of watching the funerals of celebrities and statesmen. It’s common to mock such grief as false or voyeuristic (“crocodile tears,” one commentator called mourners’ distress at Princess Diana’s funeral), and yet it serves an important social function. It’s a more mediated version, Leader suggests, of a practice that goes all the way back to soldiers in The Iliad mourning with Achilles for the fallen Patroclus.

I found myself nodding in recognition at Gorer’s conclusions. “If mourning is denied outlet, the result will be suffering,” Gorer wrote. “At the moment our society is signally failing to give this support and assistance. . . . The cost of this failure in misery, loneliness, despair and maladaptive behavior is very high.” Maybe it’s not a coincidence that in Western countries with fewer mourning rituals, the bereaved report more physical ailments in the year following a death.

Illustration from 'The Iliad and the Odyssey: A Giant Golden Book' by Alice and Martin Provensen.

Finding solace in Marilynne Robinson’s beautiful meditation on our humanity, O’Rourke returns to her own journey:

The otherworldliness of loss was so intense that at times I had to believe it was a singular passage, a privilege of some kind, even if all it left me with was a clearer grasp of our human predicament. It was why I kept finding myself drawn to the remote desert: I wanted to be reminded of how the numinous impinges on ordinary life.

Reflecting on her struggle to accept her mother’s loss — her absence, “an absence that becomes a presence” — O’Rourke writes:

If children learn through exposure to new experiences, mourners unlearn through exposure to absence in new contexts. Grief requires acquainting yourself with the world again and again; each “first” causes a break that must be reset… And so you always feel suspense, a queer dread—you never know what occasion will break the loss freshly open.

She later adds:

After a loss, you have to learn to believe the dead one is dead. It doesn’t come naturally.

Among the most chilling effects of grief is how it reorients us toward ourselves as it surfaces our mortality paradox and the dawning awareness of our own impermanence. O’Rourke’s words ring with the profound discomfort of our shared existential bind:

The dread of death is so primal, it overtakes me on a molecular level. In the lowest moments, it produces nihilism. If I am going to die, why not get it over with? Why live in this agony of anticipation?

[…]

I was unable to push these questions aside: What are we to do with the knowledge that we die? What bargain do you make in your mind so as not to go crazy with fear of the predicament, a predicament none of us knowingly chose to enter? You can believe in God and heaven, if you have the capacity for faith. Or, if you don’t, you can do what a stoic like Seneca did, and push away the awfulness by noting that if death is indeed extinction, it won’t hurt, for we won’t experience it. “It would be dreadful could it remain with you; but of necessity either it does not arrive or else it departs,” he wrote.

If this logic fails to comfort, you can decide, as Plato and Jonathan Swift did, that since death is natural, and the gods must exist, it cannot be a bad thing. As Swift said, “It is impossible that anything so natural, so necessary, and so universal as death, should ever have been designed by Providence as an evil to mankind.” And Socrates: “I am quite ready to admit … that I ought to be grieved at death, if I were not persuaded in the first place that I am going to other gods who are wise and good.” But this is poor comfort to those of us who have no gods to turn to. If you love this world, how can you look forward to departing it? Rousseau wrote, “He who pretends to look on death without fear lies. All men are afraid of dying, this is the great law of sentient beings, without which the entire human species would soon be destroyed.”

And yet, O’Rourke arrives at the same conclusion that Alan Lightman did in his sublime meditation on our longing for permanence as she writes:

Without death our lives would lose their shape: “Death is the mother of beauty,” Wallace Stevens wrote. Or as a character in Don DeLillo’s White Noise says, “I think it’s a mistake to lose one’s sense of death, even one’s fear of death. Isn’t death the boundary we need?” It’s not clear that DeLillo means us to agree, but I think I do. I love the world more because it is transient.

[…]

One would think that living so proximately to the provisional would ruin life, and at times it did make it hard. But at other times I experienced the world with less fear and more clarity. It didn’t matter if I was in line for an extra two minutes. I could take in the sensations of color, sound, life. How strange that we should live on this planet and make cereal boxes, and shopping carts, and gum! That we should renovate stately old banks and replace them with Trader Joe’s! We were ants in a sugar bowl, and one day the bowl would empty.

A Perseid meteor over Joshua Tree National Park (Image: Joe Westerberg / NASA)

This awareness of our transience, our minuteness, and the paradoxical enlargement of our aliveness that it produces seems to be the sole solace from grief’s grip, though we all arrive at it differently. O’Rourke’s father approached it from another angle. Recounting a conversation with him one autumn night — one can’t help but notice the beautiful, if inadvertent, echo of Carl Sagan’s memorable words — O’Rourke writes:

“The Perseid meteor showers are here,” he told me. “And I’ve been eating dinner outside and then lying in the lounge chairs watching the stars like your mother and I used to” — at some point he stopped calling her Mom — “and that helps. It might sound strange, but I was sitting there, looking up at the sky, and I thought, ‘You are but a mote of dust. And your troubles and travails are just a mote of a mote of dust.’ And it helped me. I have allowed myself to think about things I had been scared to think about and feel. And it allowed me to be there — to be present. Whatever my life is, whatever my loss is, it’s small in the face of all that existence… The meteor shower changed something. I was looking the other way through a telescope before: I was just looking at what was not there. Now I look at what is there.”

O’Rourke goes on to reflect on this ground-shifting quality of loss:

It’s not a question of getting over it or healing. No; it’s a question of learning to live with this transformation. For the loss is transformative, in good ways and bad, a tangle of change that cannot be threaded into the usual narrative spools. It is too central for that. It’s not an emergence from the cocoon, but a tree growing around an obstruction.

In one of the most beautiful passages in the book, O’Rourke captures the spiritual sensemaking of death in an anecdote that calls to mind Alan Lightman’s account of a “transcendent experience” and Alan Watts’s consolation in the oneness of the universe. She writes:

Before we scattered the ashes, I had an eerie experience. I went for a short run. I hate running in the cold, but after so much time indoors in the dead of winter I was filled with exuberance. I ran lightly through the stripped, bare woods, past my favorite house, poised on a high hill, and turned back, flying up the road, turning left. In the last stretch I picked up the pace, the air crisp, and I felt myself float up off the ground. The world became greenish. The brightness of the snow and the trees intensified. I was almost giddy. Behind the bright flat horizon of the treescape, I understood, were worlds beyond our everyday perceptions. My mother was out there, inaccessible to me, but indelible. The blood moved along my veins and the snow and trees shimmered in greenish light. Suffused with joy, I stopped stock-still in the road, feeling like a player in a drama I didn’t understand and didn’t need to. Then I sprinted up the driveway and opened the door and as the heat rushed out the clarity dropped away.

I’d had an intuition like this once before, as a child in Vermont. I was walking from the house to open the gate to the driveway. It was fall. As I put my hand on the gate, the world went ablaze, as bright as the autumn leaves, and I lifted out of myself and understood that I was part of a magnificent book. What I knew as “life” was a thin version of something larger, the pages of which had all been written. What I would do, how I would live — it was already known. I stood there with a kind of peace humming in my blood.

A non-believer who had prayed for the first time in her life when her mother died, O’Rourke quotes Virginia Woolf’s luminous meditation on the spirit and writes:

This is the closest description I have ever come across to what I feel to be my experience. I suspect a pattern behind the wool, even the wool of grief; the pattern may not lead to heaven or the survival of my consciousness — frankly I don’t think it does — but that it is there somehow in our neurons and synapses is evident to me. We are not transparent to ourselves. Our longings are like thick curtains stirring in the wind. We give them names. What I do not know is this: Does that otherness — that sense of an impossibly real universe larger than our ability to understand it — mean that there is meaning around us?

[…]

I have learned a lot about how humans think about death. But it hasn’t necessarily taught me more about my dead, where she is, what she is. When I held her body in my hands and it was just black ash, I felt no connection to it, but I tell myself perhaps it is enough to still be matter, to go into the ground and be “remixed” into some new part of the living culture, a new organic matter. Perhaps there is some solace in this continued existence.

[…]

I think about my mother every day, but not as concertedly as I used to. She crosses my mind like a spring cardinal that flies past the edge of your eye: startling, luminous, lovely, gone.

The Long Goodbye is a remarkable read in its entirety — the kind that speaks with gentle crispness to the parts of us we protect most fiercely yet long to awaken most desperately. Complement it with Alan Lightman in finding solace in our impermanence and Tolstoy on finding meaning in a meaningless world.


09 JUNE, 2014

The Poetics of the Psyche: Adam Phillips on Why Psychoanalysis Is Like Literature and How Art Soothes the Soul


“Everybody is dealing with how much of their own aliveness they can bear and how much they need to anesthetize themselves.”

“A writer is someone who pays attention to the world — a writer is a professional observer,” Susan Sontag once said. The object of the writer’s observation isn’t just the outer world but also — and perhaps even more so — the inner. In that regard, the writer bears a striking similarity to another professional observer — the psychotherapist. That’s precisely what Adam Phillips — Britain’s most celebrated psychoanalytical writer and the author of such immeasurably stimulating reads as Promises, Promises: Essays on Psychoanalysis and Literature, On Kissing, Tickling, and Being Bored: Psychoanalytic Essays on the Unexamined Life, and the particularly wonderful On Kindness — explores in his wide-ranging conversation with Paul Holdengräber, several years in the making, part of The Paris Review’s legendary interview series.

Phillips, who read Carl Jung’s Memories, Dreams, Reflections at the age of seventeen and was profoundly influenced by it, reflects on his early educational experience:

This was conveyed very powerfully — that the way to learn how to live and to live properly was to read English literature — and it worked for me. I was taught close, attentive reading, and to ironize the ambitions of grand theory.

Like Kafka, who memorably considered what books do for the human soul — a question Carl Sagan also addressed beautifully, and one I too once contemplated in answering a 9-year-old girl’s inquiry — Phillips reflects on the essential reward of reading:

It’s not as though when I read I’m gathering information, or indeed can remember much of what I read. I know the books that grip me, as everybody does, but their effect is indiscernible. I don’t quite know what it is… There are powerful unconscious evocative effects in reading books that one loves. There’s something about these books that we want to go on thinking about, that matters to us. They’re not just fetishes that we use to fill gaps. They are like recurring dreams we can’t help thinking about.

Holdengräber cites an essay by the legendary British pediatrician Donald Winnicott, whose definitive biography Phillips penned in 1988:

HOLDENGRÄBER: It seems natural that an interest in literature and in Winnicott should go hand in hand. In Winnicott’s essay “On the Capacity to Be Alone,” he writes that the goal for the child is to be alone in the presence of the mother. For a long time this has seemed to me the single best definition of reading.

PHILLIPS: That idea was one of Winnicott’s most radical, because what he was saying was that solitude was prior to the wish to transgress. That there’s something deeply important about the early experience of being in the presence of somebody without being impinged upon by their demands, and without them needing you to make a demand on them. And that this creates a space internally into which one can be absorbed. In order to be absorbed one has to feel sufficiently safe, as though there is some shield, or somebody guarding you against dangers such that you can “forget yourself” and absorb yourself, in a book, say. Or, for the child, in a game. It must be one of the precursors of reading, I suppose. I think for Winnicott it would be the definition of a good relationship if, in the relationship, you would be free to be absorbed in something else.

Phillips, who wrote in the preface to Promises, Promises: Essays on Psychoanalysis and Literature that “psychoanalysis, at its best, should be a profession of popularizers of interesting ideas about the difficulties and exhilarations of living,” uses the springboard of the parallels between children’s psychology and reading to consider the broader allure of psychoanalysis:

Psychoanalysis starts from the position that there is no cure, but that we need different ways of living with ourselves and different descriptions of these so-called selves.

The great thing about the psychoanalytic treatment is that it doesn’t work in the usual sense of work. I don’t mean by this to avoid the fact that it addresses human suffering. I only mean that it takes for granted that an awful lot of human suffering is simply intractable, that there’s a sense in which character is intractable. People change, but there really are limits. One thing you discover in psychoanalytic treatment is the limits of what you can change about yourself or your life. We are children for a very long time.

[…]

The point is that it’s an experiment in what your life might be like if you speak freely to another person—speak and allow that person to show you the ways in which you stop yourself thinking and speaking freely. I don’t mean by that that it doesn’t change symptoms. I know by my own experience that it does. But I think the most interesting thing about it is its unpredictability. If you buy a fridge, there are certain things you will be guaranteed. If you buy a psychoanalysis, you won’t be. It’s a real risk, and that also is the point of it. Patients come because they are suffering from something. They want that suffering to be alleviated. Ideally, in the process of doing the analysis, they might find their suffering is alleviated or modified, but also they might discover there are more important things than to alleviate one’s suffering.

When Holdengräber points out the word appetite frequents Phillips’s vocabulary in discussing psychoanalysis, Phillips offers a somewhat counterintuitive framework for the two goals of his profession:

Analysis should do two things that are linked together. It should be about the recovery of appetite, and the need not to know yourself… Symptoms are forms of self-knowledge. When you think, I’m agoraphobic, I’m a shy person, whatever it may be, these are forms of self-knowledge. What psychoanalysis, at its best, does is cure you of your self-knowledge. And of your wish to know yourself in that coherent, narrative way. You can only recover your appetite, and appetites, if you can allow yourself to be unknown to yourself. Because the point of knowing oneself is to contain one’s anxieties about appetite. It’s only worth knowing about the things that make one’s life worth living, and whether there are in fact things that make it worth living.

Illustration from 'Freud,' a graphic biography.

Echoing philosopher Martha Nussbaum’s meditation on living with our human fragility, Phillips adds:

Everybody is dealing with how much of their own aliveness they can bear and how much they need to anesthetize themselves.

We all have self-cures for strong feeling. Then the self-cure becomes a problem, in the obvious sense that the problem of the alcoholic is not alcohol but sobriety. Drinking becomes a problem, but actually the problem is what’s being cured by the alcohol. By the time we’re adults, we’ve all become alcoholics. That’s to say, we’ve all evolved ways of deadening certain feelings and thoughts.

Citing Kafka’s famous letter, Phillips points to art — something Alain de Botton explored more deeply in Art as Therapy. Phillips tells Holdengräber:

One of the reasons we admire or like art, if we do, is that it reopens us in some sense — as Kafka wrote in a letter, art breaks the sea that’s frozen inside us. It reminds us of sensitivities that we might have lost at some cost.

And yet those sensitivities to our inner lives become increasingly muffled by the constant influx of external stimulation brought on by the century of the self. Echoing Malcolm Gladwell’s assertion that “the modern version of introspection is the sum total of all those highly individualized choices that we make about the material content of our lives,” Phillips considers the solace of human conversation:

It can be extremely difficult to know what you want, especially if you live in a consumer, capitalist culture which is phobic of frustration — where the moment you feel a glimmer of frustration, there’s something available to meet it. Now, shopping and eating and sex may not be what you’re wanting, but in order to find that out you have to have a conversation with somebody. You can’t sit in a room by yourself like Rodin’s Thinker…

In conversation things can be metabolized and digested through somebody else — I say something to you and you can give it back to me in different forms — whereas you’ll notice that your own mind is very often extremely repetitive. It is very difficult to surprise oneself in one’s own mind. The vocabulary of one’s self-criticism is so impoverished and clichéd. We are at our most stupid in our self-hatred.

Returning to the parallels between psychoanalysis and literature, Phillips gives greater granularity to the analogy:

Psychoanalytic sessions are not like novels, they’re not like epic poems, they’re not like lyric poems, they’re not like plays — though they’re rather like bits of dialogue from plays. But they do seem to me to be like essays, nineteenth-century essays. There is the same opportunity to digress, to change the subject, to be incoherent, to come to conclusions that are then overcome and surpassed, and so on.

An essay is a mixture of the conversational and the coherent and has, to me, the advantages of both. There doesn’t have to be a beginning, a middle, and an end, as there tends to be in a short story. Essays can wander, they can meander.

Reflecting on the legacy of 19th-century essayists like Emerson and Lamb, Phillips defines the inherent psychology of the genre in terms that counter E.B. White’s notion of the essay as a mecca of narcissism and adds:

The essay is very rarely a fanatical form, it seems to me, partly because you’d just run out of steam. It would just be propaganda of the most boring sort. In order to write a compelling essay, you have to be able to change tone. I think you also have to be reflexively self-revising. It’s not that these things are impossible in other genres, but they’re very possible in essays. As the word essay suggests, it’s about trying something out, it’s about an experiment. From the time I began writing — although this wasn’t conscious — I think that was the tradition I was writing in.

Like Edgar Allan Poe, who considered music the most sublime embodiment of the Poetic Principle, and Edna St. Vincent Millay, who extolled music above all arts, including her own, Phillips explores the symmetry between psychoanalysis and poetry through the lens of music and its capacity — even on a neurological level — to sidestep our conscious bulwarks and whisper directly to the soul:

I can remember the first time I heard Dylan’s voice, Neil Young, J.J. Cale, Joni Mitchell — that music made me imagine myself. It was so evocative. It taught you nothing, but you felt you’d learned everything you needed to know.

[…]

The emotional impact of music is so incommensurate with what people can say about it, and that seems to be very illustrative of something fundamental—that very powerful emotional effects often can’t be articulated. You know something’s happened to you but you don’t know what it is. You’ll find yourself going back to certain poems again and again. After all, they are only words on a page, but you go back because something that really matters to you is evoked in you by the words. And if somebody said to you, Well, what is it? or What do your favorite poems mean?, you may well be able to answer it, if you’ve been educated in a certain way, but I think you’ll feel the gap between what you are able to say and why you go on reading.

In the same way, a psychoanalysis bent on understanding people is going to be very limited. It’s not about redescribing somebody such that they become like a character in a novel. It’s really showing you how much your wish to know yourself is a consequence of an anxiety state — and how it might be to live as yourself not knowing much about what’s going on.

Inverting Maya Angelou’s lament about labeling others and echoing Joss Whedon’s excellent Wesleyan commencement address on embracing all our selves, Phillips issues the same admonition about our tendency to label — and thus narrow and proscribe, to use Angelou’s words — ourselves:

When people say, “I’m the kind of person who,” my heart always sinks. These are formulas, we’ve all got about ten formulas about who we are, what we like, the kind of people we like, all that stuff. The disparity between these phrases and how one experiences oneself minute by minute is ludicrous. It’s like the caption under a painting. You think, Well, yeah, I can see it’s called that. But you need to look at the picture.

But Phillips later observes that while we’re telling ourselves who we are, we’re also telling ourselves — and grieving — who we could’ve been, a kind of toxic speculative grief for the unrealized what-ifs of our lives, something he explores in greater detail in his most recent book, Missing Out: In Praise of the Unlived Life. He tells Holdengräber:

Missing all our supposed other lives is something modern people are keen to do. We are just addicted to alternatives, fascinated by what we can never do. As if we all had the wrong parents, or the wrong bodies, or the wrong luck…

The comfort would be something like, You don’t have to worry too much about trying to have the lives you think you’re missing. Don’t be tyrannized by the part of yourself that’s only interested in elsewhere.

Reflecting on his prolific career as a writer, Phillips considers the question of why one writes — a question memorably addressed by George Orwell, David Foster Wallace, Michael Lewis, Lynne Tillman, Italo Calvino, Susan Orlean, and Joy Williams — as well as the psychology of criticism:

You have to be really good at masochism to welcome criticism. But you know, you can’t write differently, even if you want to. You just have to be able to notice when you are boring yourself.

Echoing Joan Didion (“Had I been blessed with even limited access to my own mind there would have been no reason to write.”), Phillips adds:

Anybody who writes knows you don’t simply write what you believe. You write to find out what you believe, or what you can afford to believe… When I write, things occur to me. It’s a way of thinking. But you can perform your thinking instead of just thinking it.

Unlike famous writers who ritualize their routines, Phillips sides with Bukowski and tells Holdengräber:

There is no creative process. I mean, I sit down and write. That is really what happens. I sit down in the morning on Wednesday and I write. And sometimes it doesn’t work and almost always it does work, and that’s it.

He points to an even more toxic cultural mythology that couples similar magical thinking with a profound confusion of causal relationship — the “tortured genius” ideal of the artist, which implies that one must suffer in order to create meaningful work. Instead, he suggests an alternative approach — the kind Ray Bradbury embodied and advocated — anchoring artistic endeavor not to cruelty but to kindness:

If you live in a culture which is fascinated by the myth of the artist, and the idea that the vocational artistic life is one of the best lives available, then there’s always going to be a temptation for people who are suffering to believe that to become an artist would be the solution when, in fact, it may be more of the problem. There are a number of people whom you might think of as casualties of the myth of the artist. They really should have done something else. Of course some people get lucky and find that art works for them, but for so many people it doesn’t. I think that needs to be included in the picture. Often one hears or reads accounts in which people will say, Well, he may have treated his children, wives, friends terribly, but look at the novels, the poems, the paintings. I think it’s a terrible equation. Obviously one can’t choose to be, as it were, a good parent or a good artist, but if the art legitimates cruelty, I think the art is not worth having. People should be doing everything they can to be as kind as possible and to enjoy each other’s company. Any art, any anything, that helps us do that is worth having. But if it doesn’t, it isn’t.

The full interview is available here. For more of Phillips’s singular mind, dive into his books, including the especially excellent On Kindness and Promises, Promises.


09 JUNE, 2014

The Birth of the Information Age: How Paul Otlet’s Vision for Cataloging and Connecting Humanity Shaped Our World


“Everyone from his armchair will be able to contemplate creation, in whole or in certain parts.”

Decades before Alan Turing pioneered computer science and Vannevar Bush envisioned the memex, the conceptual forerunner of the web, a visionary Belgian idealist named Paul Otlet (August 23, 1868–December 10, 1944) set out to organize the world’s information. For nearly half a century, he worked unrelentingly to index and catalog every significant piece of human thought ever published or recorded, building a massive Universal Bibliography of 15 million books, magazines, newspapers, photographs, posters, museum pieces, and other assorted media. His monumental collection was predicated not on ownership but on access and sharing — while amassing it, he kept devising increasingly ambitious schemes for enabling universal access, fostering peaceful relations between nations, and democratizing human knowledge through a global information network he called the “Mundaneum” — a concept partway between Voltaire’s Republic of Letters, Marshall McLuhan’s “global village,” and the übermind of the future. Otlet’s work would go on to inspire generations of information science pioneers, including the founding fathers of the modern internet and the world wide web. (Even the visual bookshelf I use to manage the Brain Pickings book archive is named after him.)

In Cataloging the World: Paul Otlet and the Birth of the Information Age (public library), writer, educator, and design historian Alex Wright traces Otlet’s legacy not only in technology and information science, but also in politics, social reform, and peace activism, illustrating why not only Otlet’s ideas, but also his idealism matter as we contemplate the future of humanity.

The Mundaneum, with its enormous filing system designed by Otlet himself, allowed people to request information by mail-order. By 1912, Otlet and his team were fielding 1,500 such requests per year.

(Image: Mundaneum Archive, Belgium)

Wright writes:

Paul Otlet … seems to connect a series of major turning points in the history of the early twentieth-century information age, synthesizing and incorporating their ideas along with his own, and ultimately coming tantalizingly close to building a fully integrated global information network.

[…]

Otlet embraced the new internationalism and emerged as one of its most prominent apostles in Europe in the early twentieth century. In his work we can see many of these trends intersecting — the rise of industrial technologies, the problem of managing humanity’s growing intellectual output, and the birth of a new internationalism. To sustain it Otlet tried to assemble a great catalog of the world’s published information, create an encyclopedic atlas of human knowledge, build a network of federated museums and other cultural institutions, and establish a World City that would serve as the headquarters for a new world government. For Otlet these were not disconnected activities but part of a larger vision of worldwide harmony. In his later years he started to describe the Mundaneum in transcendental terms, envisioning his global knowledge network as something akin to a universal consciousness and as a gateway to collective enlightenment.

In 1903, Otlet developed a revolutionary index card system for organizing information.

(Image: Mundaneum Archive, Belgium)

Otlet's primarily female staff answered information requests by hand. Without the digital luxury of keyword searches, a single query could take painstaking hours, even days, of sifting through the elaborate index card catalog.

(Image: Mundaneum Archive, Belgium)

The Mundaneum, which officially opened its doors in 1920, a decade after Otlet first dreamt it up, wasn’t merely a prescient vision for the utilitarian information-retrieval function of the modern internet, but the ideological framework for a far nobler and more ambitious goal to unite the world around a new culture of networked peace and understanding, which would shepherd humanity toward reaching its spiritual potential — an idea that makes the Mundaneum’s fate in actuality all the more bitterly ironic.

At the peak of Otlet’s efforts to organize the world’s knowledge around a generosity of spirit, humanity’s greatest tragedy of ignorance and cruelty descended upon Europe. As the Nazis seized power, they launched a calculated campaign to thwart critical thought by banning and burning all books that didn’t agree with their ideology — the very atrocity that prompted Helen Keller’s scorching letter on book-burning — and even paved the muddy streets of Eastern Europe with such books so the tanks would pass more efficiently. When the Nazi inspectors responsible for the censorship effort eventually got to Otlet’s collection, they weren’t quite sure what to make of it. One report summed up their contemptuous bafflement:

The institute and its goals cannot be clearly defined. It is some sort of … ‘museum for the whole world,’ displayed through the most embarrassing and cheap and primitive methods… The library is cobbled together and contains, besides a lot of waste, some things we can use. The card catalog might prove rather useful.

But behind the “waste” and the “embarrassing” methods of organizing it lay far greater ideas that evaded, as is reliably the case, small minds. Wright outlines the remarkable prescience of Otlet’s vision:

What the Nazis saw as a “pile of rubbish,” Otlet saw as the foundation for a global network that, one day, would make knowledge freely available to people all over the world. In 1934, he described his vision for a system of networked computers — “electric telescopes,” he called them — that would allow people to search through millions of interlinked documents, images, and audio and video files. He imagined that individuals would have desktop workstations—each equipped with a viewing screen and multiple movable surfaces — connected to a central repository that would provide access to a wide range of resources on whatever topics might interest them. As the network spread, it would unite individuals and institutions of all stripes — from local bookstores and classrooms to universities and governments. The system would also feature so-called selection machines capable of pinpointing a particular passage or individual fact in a document stored on microfilm, retrieved via a mechanical indexing and retrieval tool. He dubbed the whole thing a réseau mondial: a “worldwide network” or, as the scholar Charles van den Heuvel puts it, an “analog World Wide Web.”

Twenty-five years before the first microchip, forty years before the first personal computer, and fifty years before the first Web browser, Paul Otlet had envisioned something very much like today’s Internet.

Otlet articulated this vision in his own writing, describing an infrastructure remarkably similar to the underlying paradigm of the modern web:

Everything in the universe, and everything of man, would be registered at a distance as it was produced. In this way a moving image of the world will be established, a true mirror of [its] memory. From a distance, everyone will be able to read text, enlarged and limited to the desired subject, projected on an individual screen. In this way, everyone from his armchair will be able to contemplate creation, in whole or in certain parts.

Otlet’s prescience, Wright notes, didn’t end there — he also envisioned speech recognition tools, wireless networks that would enable people to upload files to remote servers, social networks and virtual communities around individual pieces of media that would allow people to “participate, applaud, give ovations, sing in the chorus,” and even concepts we are yet to crack with our present technology, such as transmitting sensory experiences like smell and taste.

Otlet's sketch for the 'worldwide network' he envisioned

(Image: Mundaneum Archive, Belgium)

But Otlet’s most significant vision wasn’t about the technology of it — it was about politics and peace, the very things that most bedevil the modern web, from cyber terrorism to the ongoing struggle for net neutrality. Wright writes:

An ardent “internationalist,” Otlet believed in the inevitable progress of humanity toward a peaceful new future, in which the free flow of information over a distributed network would render traditional institutions — like state governments — anachronistic. Instead, he envisioned a dawning age of social progress, scientific achievement, and collective spiritual enlightenment. At the center of it all would stand the Mundaneum, a bulwark and beacon of truth for the whole world.

But when the Nazis swept Europe and crept closer to Belgium, it became clear to Otlet that not only the physical presence of the Mundaneum but also its political ideals stood at grave risk. He grew increasingly concerned. In swelling desperation to save his life’s work, he sent President Roosevelt a telegram offering the entire collection to the United States “as nucleus of a great World Institution for World Peace and Progress with a seat in America.” Otlet’s urgent plea made it all the way to the Belgian press, which printed the telegram, but he never heard back from Roosevelt. He sent a second telegram, even more urgent, once Belgium was invaded, but again received no response. Finally, in an act of despair, he decided to make “an appeal on behalf of humanity” and try persuading the Nazi inspectors that the Mundaneum was worth saving. Predictably, they were unmoved. A few days later, Nazi soldiers destroyed 63 tons’ worth of the books and other meticulously preserved and indexed materials that constituted the heart of Otlet’s collection.

Otlet was devastated, but continued to labor quietly over his dream of a global information network throughout the occupation. Four months after the liberation of Paris, he died. And yet the ghost of his work went on to greatly influence the modern information world. Wright contextualizes Otlet’s legacy:

While Otlet did not by any stretch of the imagination “invent” the Internet — working as he did in an age before digital computers, magnetic storage, or packet-switching networks — nonetheless his vision looks nothing short of prophetic. In Otlet’s day, microfilm may have qualified as the most advanced information storage technology, and the closest thing anyone had ever seen to a database was a drawer full of index cards. Yet despite these analog limitations, he envisioned a global network of interconnected institutions that would alter the flow of information around the world, and in the process lead to profound social, cultural, and political transformations.

By today’s standards, Otlet’s proto-Web was a clumsy affair, relying on a patchwork system of index cards, file cabinets, telegraph machines, and a small army of clerical workers. But in his writing he looked far ahead to a future in which networks circled the globe and data could travel freely. Moreover, he imagined a wide range of expression taking shape across the network: distributed encyclopedias, virtual classrooms, three-dimensional information spaces, social networks, and other forms of knowledge that anticipated the hyperlinked structure of today’s Web. He saw these developments as fundamentally connected to a larger utopian project that would bring the world closer to a state of permanent and lasting peace and toward a state of collective spiritual enlightenment.

And yet there’s a poignant duality in how the modern web came to both embody and defy Otlet’s ideals:

During its brief heyday, Otlet’s Mundaneum was also a window onto the world ahead: a vision of a networked information system spanning the globe. Today’s Internet represents both a manifestation of Otlet’s dream and also, arguably, the realization of his worst fears. For the system he imagined differed in crucial ways from the global computer network that would ultimately take shape during the Cold War. He must have sensed that his dream was over when he confronted Krüss and the Nazi delegation on that day in 1940. But before we can fully grasp the importance of Otlet’s vision, we need to look further back, to where it all began.

Comparing the Mundaneum with Sir Tim Berners-Lee’s original 1989 proposal for the World Wide Web, both premised on an essential property of universality, Wright notes the parallels between the two, as well as the key respects in which Otlet’s ideals surpass how the modern web turned out:

[Otlet] never framed his thinking in purely technological terms; he saw the need for a whole-system approach that encompassed not just a technical solution for sharing documents and a classification system to bind them together, but also the attendant political, organizational, and financial structures that would make such an effort sustainable in the long term. And while his highly centralized, controlled approach may have smacked of nineteenth-century cultural imperialism (or, to put it more generously, at least the trappings of positivism), it had the considerable advantages of any controlled system, or what today we might call a “walled garden”: namely, the ability to control what goes in and out, to curate the experience, and to exert a level of quality control on the contents that are exchanged within the system.

Paul Otlet in 1932, eight years before the Nazis destroyed his Mundaneum

(Image: Mundaneum Archive, Belgium)

But Otlet’s greatest ambition, and the one that endures most precisely because it remains unfulfilled, was the Mundaneum’s humanistic promise of strengthening the invisible bonds that link us together — an ethos rather antithetical to the individualistic, almost narcissistic paradigm of today’s social web. Wright explains:

The contemporary construct of “the user” that underlies so much software design figures nowhere in Otlet’s work. He saw the mission of the Mundaneum as benefiting humanity as a whole, rather than serving the whims of individuals. While he imagined personalized workstations (those Mondotheques), he never envisioned the network along the lines of a client-server “architecture” (a term that would not come into being for another two decades). Instead, each machine would act as a kind of “dumb” terminal, fetching and displaying material stored in a central location.

The counterculture programmers who paved the way for the Web believed they were participating in a process of personal liberation. Otlet saw it as a collective undertaking, one dedicated to a higher purpose than mere personal gratification. And while he might well have been flummoxed by the anything-goes ethos of present-day social networking sites like Facebook or Twitter, he also imagined a system that allowed groups of individuals to take part in collaborative experiences like lectures, opera performances, or scholarly meetings, where they might “applaud” or “give ovations.” It seems a short conceptual hop from here to Facebook’s ubiquitous “Like” button.

A reproduction of Otlet’s original Mondotheque desk

(Image: Mundaneum Archive, Belgium)

In this regard, Otlet’s idea of collective intelligence working toward a common good presaged modern concepts like crowdsourcing and “cognitive surplus” as well as initiatives like Singularity University. Wright considers the essence of his legacy:

Otlet’s work invites us to consider a simple question: whether the path to liberation requires maximum personal freedom of the kind that characterizes today’s anything-goes Internet, or whether humanity would find itself better served by pursuing liberation through the exertion of discipline.

Considering the darker side of the modern internet in information monopolies like Google and Facebook, Wright reflects on how antithetical this dominance of private enterprise is to Otlet’s vision of a democratic, publicly funded international network. “He likely would have seen the pandemonium of today’s Web as an enormous waste of humanity’s intellectual and spiritual potential,” Wright writes as he contemplates the messy machinery of money and motives propelling the modern web:

Would the Internet have turned out any differently had Paul Otlet’s vision come to fruition? Counterfactual history is a fool’s game, but it is perhaps worth considering a few possible lessons from the Mundaneum. First and foremost, Otlet acted not out of a desire to make money — something he never succeeded at doing — but out of sheer idealism. His was a quest for universal knowledge, world peace, and progress for humanity as a whole. The Mundaneum was to remain, as he said, “pure.” While many entrepreneurs vow to “change the world” in one way or another, the high-tech industry’s particular brand of utopianism almost always carries with it an underlying strain of free-market ideology: a preference for private enterprise over central planning and a distrust of large organizational structures. This faith in the power of “bottom-up” initiatives has long been a hallmark of Silicon Valley culture, and one that all but precludes the possibility of a large-scale knowledge network emanating from anywhere but the private sector.

But rather than a hapless historical lament, Wright argues, Otlet’s work can serve as an ideal — moral, social, political — to aspire to as we continue to shape this fairly young medium. It could lead us to devise more intelligent intellectual property regulations, build more sophisticated hyperlinks, and hone our ability to curate and contextualize information in more meaningful ways. He writes:

That is why Paul Otlet still matters. His vision was not just cloud castles and Utopian scheming and positivist cant but in some ways more relevant and realizable now than at any point in history. To be sure, some of his most cherished ideas seem anachronistic by today’s standards: his quest for “universal” truth, his faith in international organizations, and his conviction in the inexorable progress of humanity. But as more and more of us rely on the Internet to conduct our everyday lives, we are also beginning to discover the dark side of such extreme decentralization. The hopeful rhetoric of the early years of the Internet revolution has given way to the realization that we may be entering a state of permanent cultural amnesia, in which the sheer fluidity of the Web makes it difficult to keep our bearings. Along the way, many of us have also entrusted our most valued personal data — letters, photographs, films, and all kinds of other intellectual artifacts — to a handful of corporations who are ultimately beholden not to serving humanity but to meeting Wall Street quarterly earnings estimates. For all the utopian Silicon Valley rhetoric about changing the world, the technology industry seems to have little appetite for long-term thinking beyond its immediate parochial interests.

[…]

Otlet’s Mundaneum will never be. But it nonetheless offers us a kind of Platonic object, evoking the possibility of a technological future driven not by greed and vanity, but by a yearning for truth, a commitment to social change, and a belief in the possibility of spiritual liberation. Otlet’s vision for an international knowledge network—always far more expansive than a mere information retrieval tool—points toward a more purposeful vision of what the global network could yet become. And while history may judge Otlet a relic from another time, he also offers us an example of a man driven by a sense of noble purpose, who remained sure in his convictions and unbowed by failure, and whose deep insights about the structure of human knowledge allowed him to peer far into the future…

His work points to a deeply optimistic vision of the future: one in which the world’s knowledge coalesces into a unified whole, narrow national interests give way to the pursuit of humanity’s greater good, and we all work together toward building an enlightened society.

Cataloging the World: Paul Otlet and the Birth of the Information Age is a remarkable read in its entirety, not only in illuminating history but in extracting from it a beacon for the future. Complement it with Vannevar Bush’s 1945 “memex” concept and George Dyson’s history of bits. And lest we forget, it all started with a woman — Ada Lovelace, Lord Byron’s only legitimate daughter and the world’s first computer programmer.

Thanks, Liz
