“To be nobody-but-yourself — in a world which is doing its best, night and day, to make you everybody else — means to fight the hardest battle which any human being can fight.”
By Maria Popova
“No one can build you the bridge on which you, and only you, must cross the river of life,” wrote the thirty-year-old Nietzsche. “The true and durable path into and through experience,” Nobel-winning poet Seamus Heaney counseled the young more than a century later in his magnificent commencement address, “involves being true … to your own solitude, true to your own secret knowledge.”
Every generation believes that it must battle unprecedented pressures of conformity; that it must fight harder than any previous generation to protect that secret knowledge from which our integrity of selfhood springs. Some of this belief stems from the habitual conceit of a culture blinded by its own presentism bias, ignorant of the past’s contextual analogues. But much of it in the century and a half since Nietzsche, and especially in the years since Heaney, is an accurate reflection of the conditions we have created and continually reinforce in our present informational ecosystem — a Pavlovian system of constant feedback, in which the easiest and commonest opinions are most readily rewarded, and dissenting voices are most readily punished by the unthinking mob.
Few people in the two centuries since Emerson issued his exhortation to “trust thyself” have countered this culturally condoned blunting of individuality more courageously and consistently than E.E. Cummings (October 14, 1894–September 3, 1962) — an artist who never cowered from being his unconventional self because, in the words of his most incisive and competent biographer, he “despised fear, and his life was lived in defiance of all who ruled by it.”
A fortnight after the poet’s fifty-ninth birthday, a small Michigan newspaper published a short, enormous piece by Cummings under the title “A Poet’s Advice to Students,” radiating expansive wisdom on art, life, and the courage of being yourself. It went on to inspire Buckminster Fuller and was later included in E.E. Cummings: A Miscellany Revised (public library) — that wonderful out-of-print collection which the poet himself described as “a cluster of epigrams, forty-nine essays on various subjects, a poem dispraising dogmata, and several selections from unfinished plays,” and which gave us Cummings on what it really means to be an artist.
A poet is somebody who feels, and who expresses his feelings through words.
This may sound easy. It isn’t.
A lot of people think or believe or know they feel — but that’s thinking or believing or knowing; not feeling. And poetry is feeling — not knowing or believing or thinking.
Almost anybody can learn to think or believe or know, but not a single human being can be taught to feel. Why? Because whenever you think or you believe or you know, you’re a lot of other people: but the moment you feel, you’re nobody-but-yourself.
To be nobody-but-yourself — in a world which is doing its best, night and day, to make you everybody else — means to fight the hardest battle which any human being can fight; and never stop fighting.
Cummings should know — just four years earlier, he had fought that hardest battle himself: when he was awarded the prestigious Academy of American Poets annual fellowship — the MacArthur of poetry — Cummings had to withstand harsh criticism from traditionalists who besieged him with hate for the bravery of breaking with tradition and being nobody-but-himself in his art. With an eye to that unassailable creative integrity buoyed by a relentless work ethic, he adds:
As for expressing nobody-but-yourself in words, that means working just a little harder than anybody who isn’t a poet can possibly imagine. Why? Because nothing is quite as easy as using words like somebody else. We all of us do exactly this nearly all of the time — and whenever we do it, we’re not poets.
If, at the end of your first ten or fifteen years of fighting and working and feeling, you find you’ve written one line of one poem, you’ll be very lucky indeed.
And so my advice to all young people who wish to become poets is: do something easy, like learning how to blow up the world — unless you’re not only willing, but glad, to feel and work and fight till you die.
“If the connection between reality and human response were direct and immediate, rather than indirect and inferred, indecision and failure would be unknown.”
“Truth always rests with the minority … because the minority is generally formed by those who really have an opinion, while the strength of a majority is illusory, formed by the gangs who have no opinion,” Kierkegaard wrote in his journal in the middle of the nineteenth century as he tussled with the eternal question of why we conform. Around the same time, across the Atlantic, Emerson fumed in his own diary as he contemplated the supreme existential challenge of individual integrity in a mass society: “Masses are rude, lame, unmade, pernicious in their demands and influence… I wish not to concede anything to them, but to tame, drill, divide, and break them up, and draw individuals out of them.” A century later, Eleanor Roosevelt would sharpen this sentiment in her abiding meditation on happiness and conformity: “When you adopt the standards and the values of someone else, you surrender your own integrity [and] become, to the extent of your surrender, less of a human being.”
Why even the soundest-minded of us are so susceptible to such unconscious surrender and what it takes to uphold a clear view of reality is what the great writer, media theorist, and political critic Walter Lippmann (September 23, 1889–December 14, 1974) — whose moral courage of shedding light on the pitfalls of society and the human psyche Eleanor Roosevelt greatly admired, and whom Theodore Roosevelt considered the “most brilliant young man of his age in all the United States” — explores in his timelessly insightful 1922 book Public Opinion (free ebook | public library).
Lippmann — who coined the word stereotype in its contemporary sense — begins by considering just how porous to ambient information we are in the constitution of our inner landscape of opinion, and how absurd it is to regard ourselves as having a firm and final grasp of reality when the entire history of our species is the history of misapprehension and pseudoreality tightly held as truth. He writes:
Looking back we can see how indirectly we know the environment in which nevertheless we live. We can see that the news of it comes to us now fast, now slowly; but that whatever we believe to be a true picture, we treat as if it were the environment itself. It is harder to remember that about the beliefs upon which we are now acting, but in respect to other peoples and other ages we flatter ourselves that it is easy to see when they were in deadly earnest about ludicrous pictures of the world. We insist, because of our superior hindsight, that the world as they needed to know it, and the world as they did know it, were often two quite contradictory things. We can see, too, that while they governed and fought, traded and reformed in the world as they imagined it to be, they produced results, or failed to produce any, in the world as it was. They started for the Indies and found America. They diagnosed evil and hanged old women. They thought they could grow rich by always selling and never buying. A caliph, obeying what he conceived to be the Will of Allah, burned the library at Alexandria.
Nearly a century before the Nobel-winning psychologist Daniel Kahneman came to study how our minds mislead us and observed that “the confidence people have in their beliefs is not a measure of the quality of evidence but of the coherence of the story that the mind has managed to construct,” Lippmann illustrates this elemental human tendency with the example of the sixth-century Alexandrian monk Cosmas, who set out to disprove pre-Christian geographers’ assertion that our planet is spherical.
Although the belief that Earth is flat had been steadily falling out of favor over the preceding three centuries, Cosmas held tightly to his religious mythology, positing that Earth was structurally modeled on the house of worship God describes to Moses during the Jewish Exodus from Egypt. Determined to reconcile reality and religion, Cosmas devised a theoretical model of the universe he called “Christian Topography.” Drawn centuries before the development of perspective, his map depicts the world as a flat parallelogram, twice as wide from east to west as it is high from north to south, containing the Earth at the center, surrounded by an ocean, in turn contained by another Earth, where humans lived before the flood of the Genesis myth. Atop this other Earth — Noah’s point of embarkation — is a conical mountain, behind which the Sun and Moon revolve like a celestial chandelier spinning to turn day into night. Specific compartments of this universe-within-a-world are allotted to the mortals, the blessed, and the angels.
Cosmas enfolded the map into his treatise Christian Topography, and yet he seemed unwitting of the fact that the map was indeed an opinion rather than a representation of reality. He did what we have always done as human beings — mistake our labels and models of things for the things themselves. In the ancient monk, Lippmann finds a living allegory for the human pathology of seeing what we wish to believe:
For Cosmas there was nothing in the least absurd about his map. Only by remembering his absolute conviction that this was the map of the universe can we begin to understand how he would have dreaded Magellan or Peary or the aviator who risked a collision with the angels and the vault of heaven by flying seven miles up in the air. In the same way we can best understand the furies of war and politics by remembering that almost the whole of each party believes absolutely in its picture of the opposition, that it takes as fact, not what is, but what it supposes to be the fact.
Our opinions of the world and of other people, Lippmann argues, form much the way Cosmas constructed his map — governed less by a clear view of the relevant facts and the inner motives of others than by our theoretical models of what happened and why it happened, informed largely by our own beliefs and feelings. Decades before neuroscientists located the central mystery of consciousness in qualia — the subjective interiority of any human experience, so opaque to outside observers — Lippmann writes:
The only feeling that anyone can have about an event he does not experience is the feeling aroused by his mental image of that event. That is why until we know what others think they know, we cannot truly understand their acts.
The most interesting question, then, as well as the most pressing in matters both political and personal, is what makes us vulnerable to such self-inflicted blindnesses and delusions, and what can be done about it. Lippmann writes:
It is clear enough that under certain conditions men respond as powerfully to fictions as they do to realities, and that in many cases they help to create the very fictions to which they respond.
In [such] instances we must note particularly one common factor. It is the insertion between man and his environment of a pseudo-environment. To that pseudo-environment his behavior is a response. But because it is behavior, the consequences, if they are acts, operate not in the pseudo-environment where the behavior is stimulated, but in the real environment where action eventuates. If the behavior is not a practical act, but what we call roughly thought and emotion, it may be a long time before there is any noticeable break in the texture of the fictitious world. But when the stimulus of the pseudo-fact results in action on things or other people, contradiction soon develops. Then comes the sensation of butting one’s head against a stone wall, of learning by experience, and witnessing Herbert Spencer’s tragedy of the murder of a Beautiful Theory by a Gang of Brutal Facts, the discomfort in short of a maladjustment. For certainly, at the level of social life, what is called the adjustment of man to his environment takes place through the medium of fictions.
Fictions, Lippmann is careful to point out, need not be blatant lies — they can be, and most often are, the subtle deformations of reality in which a sapling of truth is grafted onto a robust trunk of Cosmian interpretation to produce a bramble of pseudo-reality. Three centuries after Galileo admonished against the folly of believing our preconceptions, as he defied the geocentric model of the universe nearly at the cost of his life, Lippmann writes:
By fictions I do not mean lies. I mean a representation of the environment which is in lesser or greater degree made by man himself… A work of fiction may have almost any degree of fidelity, and so long as the degree of fidelity can be taken into account, fiction is not misleading. In fact, human culture is very largely the selection, the rearrangement, the tracing of patterns upon, and the stylizing of, what William James called “the random irradiations and resettlements of our ideas.” The alternative to the use of fictions is direct exposure to the ebb and flow of sensation. That is not a real alternative, for however refreshing it is to see at times with a perfectly innocent eye, innocence itself is not wisdom, though a source and corrective of wisdom. For the real environment is altogether too big, too complex, and too fleeting for direct acquaintance. We are not equipped to deal with so much subtlety, so much variety, so many permutations and combinations. And although we have to act in that environment, we have to reconstruct it on a simpler model before we can manage with it. To traverse the world men must have maps of the world. Their persistent difficulty is to secure maps on which their own need, or someone else’s need, has not sketched in the coast of Bohemia.
And yet the maps two people make of the same reality may be so staggeringly divergent as to lead us to believe that the mappers inhabit different worlds. Once again anticipating the notion of qualia, Lippmann refines the sentiment by pointing out that “they live in the same world, but they think and feel in different ones.” It is through this draughtsmanship of thought and feeling that we draw the highly subjective maps by which we navigate the real world:
What each man does is based not on direct and certain knowledge, but on pictures made by himself or given to him. If his atlas tells him that the world is flat he will not sail near what he believes to be the edge of our planet for fear of falling off. If his maps include a fountain of eternal youth, a Ponce de Leon will go in quest of it. If someone digs up yellow dirt that looks like gold, he will for a time act exactly as if he had found gold. The way in which the world is imagined determines at any particular moment what men will do. It does not determine what they will achieve. It determines their effort, their feelings, their hopes, not their accomplishments and results.
The very fact that men theorize at all is proof that their pseudo-environments, their interior representations of the world, are a determining element in thought, feeling, and action. For if the connection between reality and human response were direct and immediate, rather than indirect and inferred, indecision and failure would be unknown.
Man is no Aristotelian god contemplating all existence at one glance. He is the creature of an evolution who can just about span a sufficient portion of reality to manage his survival, and snatch what on the scale of time are but a few moments of insight and happiness. Yet this same creature has invented ways of seeing what no naked eye could see, of hearing what no ear could hear, of weighing immense masses and infinitesimal ones, of counting and separating more items than he can individually remember. He is learning to see with his mind vast portions of the world that he could never see, touch, smell, hear, or remember. Gradually he makes for himself a trustworthy picture inside his head of the world beyond his reach.
But the pictures our mind’s eye constructs by inference and common sense — that most pernicious traitor of reality — are inherently untrustworthy. They are always partial and gravely warped by the illusion of completeness — especially when it comes to what we call public opinion. Lippmann anchors his argument to a definition:
Those features of the world outside which have to do with the behavior of other human beings, in so far as that behavior crosses ours, is dependent upon us, or is interesting to us, we call roughly public affairs. The pictures inside the heads of these human beings, the pictures of themselves, of others, of their needs, purposes, and relationship, are their public opinions. Those pictures which are acted upon by groups of people, or by individuals acting in the name of groups, are Public Opinion with capital letters.
Considering why our mental pictures so habitually misrepresent reality, Lippmann identifies our limited access to facts as the key factor — we simply don’t have the time and opportunity to take into account every relevant data point and contextual quotient in forming our opinions about a person or situation. Half a century after the English mathematician and philosopher William Kingdon Clifford extolled the discipline of doubt and plainly observed that “it is wrong always, everywhere, and for anyone, to believe anything upon insufficient evidence,” and nearly a century before our so-called social media, Lippmann laments “the distortion arising because events have to be compressed into very short messages,” compounded by the flattening of dimension and the erasure of nuance induced by compressing a complex world into a limited vocabulary. More than half a century before James Baldwin admonished that “people who shut their eyes to reality simply invite their own destruction,” Lippmann observes that these distortions and deceptions converge to foment a “fear of facing those facts which would seem to threaten the established routine of men’s lives.” In other words, we instinctively partake in willful blindness, attending only to those facts which corroborate our existing model of reality — the model by which our lives operate with the lowest degree of friction.
Lippmann returns to the central causality of our misapprehension — our limited access to facts, barred from us by various layers of circumstantial opacity and deliberate privacy — and offers a vital calibration of the confidence we have in our world-picture:
Whether the reasons for privacy are good or bad, the barriers exist. Privacy is insisted upon at all kinds of places in the area of what is called public affairs. It is often very illuminating, therefore, to ask yourself how you got at the facts on which you base your opinion. Who actually saw, heard, felt, counted, named the thing, about which you have an opinion? Was it the man who told you, or the man who told him, or someone still further removed? And how much was he permitted to see?
You can ask yourself these questions, but you can rarely answer them. They will remind you, however, of the distance which often separates your public opinion from the event with which it deals. And the reminder is itself a protection.
In a single succinct prescription for effective critical thinking, Lippmann distills the antidote to our susceptibility to outside manipulation and our propensity for self-deception:
In truly effective thinking the prime necessity is to liquidate judgments, regain an innocent eye, disentangle feelings, be curious and open-hearted.
“Good nature is, of all moral qualities, the one that the world needs most, and good nature is the result of ease and security, not of a life of arduous struggle.”
“Everybody should be quiet near a little stream and listen,” the great children’s book author Ruth Krauss — a philosopher, really — wrote in her last and loveliest collaboration with the young Maurice Sendak in 1960. At the time of her first collaboration with Sendak twelve years earlier, just after the word “workaholic” was coined, the German philosopher Josef Pieper was composing Leisure, the Basis of Culture — his timeless and increasingly timely manifesto for reclaiming our human dignity in a culture of busyness. “Leisure,” Pieper wrote, “is not the same as the absence of activity… or even as an inner quiet. It is rather like the stillness in the conversation of lovers, which is fed by their oneness.”
A generation earlier, with a seer’s capacity to peer past the horizon of the present condition and anticipate a sweeping cultural current before it has flooded in, and with a sage’s ability to provide the psychic buoy for surviving the current’s perilous rapids, Bertrand Russell (May 18, 1872–February 2, 1970) addressed the looming cult of workaholism in a prescient 1932 essay titled In Praise of Idleness (public library).
I want to say, in all seriousness, that a great deal of harm is being done in the modern world by belief in the virtuousness of work, and that the road to happiness and prosperity lies in an organized diminution of work.
With his characteristic wisdom punctuated by wry wit, he examines what work actually means:
Work is of two kinds: first, altering the position of matter at or near the earth’s surface relatively to other such matter; second, telling other people to do so. The first kind is unpleasant and ill paid; the second is pleasant and highly paid. The second kind is capable of indefinite extension: there are not only those who give orders, but those who give advice as to what orders should be given. Usually two opposite kinds of advice are given simultaneously by two organized bodies of men; this is called politics. The skill required for this kind of work is not knowledge of the subjects as to which advice is given, but knowledge of the art of persuasive speaking and writing, i.e., of advertising.
Russell points to landowners as a historical example of a class whose idleness was only made possible by the toil of others. For the vast majority of our species’ history, up until the Industrial Revolution, the average person spent nearly every waking hour working hard to earn the basic necessities of survival. Any marginal surplus, he notes, was swiftly appropriated by those in power — the warriors, the monarchs, the priests. Since the Industrial Revolution, other power systems — from big business to dictatorships — have simply supplanted the warriors, monarchs, and priests. Russell considers how the exploitive legacy of pre-industrial society has corrupted the modern social fabric and warped our value system:
A system which lasted so long and ended so recently has naturally left a profound impress upon men’s thoughts and opinions. Much that we take for granted about the desirability of work is derived from this system, and, being pre-industrial, is not adapted to the modern world. Modern technique has made it possible for leisure, within limits, to be not the prerogative of small privileged classes, but a right evenly distributed throughout the community. The morality of work is the morality of slaves, and the modern world has no need of slavery.
Writing nearly a century after Kierkegaard extolled the existential boon of idleness, Russell considers how this manipulated mentality has hypnotized us into worshiping work as virtue and scorning leisure as laziness, as weakness, as folly, rather than recognizing it as the raw material of social justice and the locus of our power:
The conception of duty, speaking historically, has been a means used by the holders of power to induce others to live for the interests of their masters rather than for their own. Of course the holders of power conceal this fact from themselves by managing to believe that their interests are identical with the larger interests of humanity. Sometimes this is true; Athenian slave owners, for instance, employed part of their leisure in making a permanent contribution to civilization which would have been impossible under a just economic system. Leisure is essential to civilization, and in former times leisure for the few was only rendered possible by the labors of the many. But their labors were valuable, not because work is good, but because leisure is good. And with modern technique it would be possible to distribute leisure justly without injury to civilization.
Russell notes that WWI — which was dubbed “the war to end all wars” by a world willfully blind to the fact that violence begets more violence, unwitting that this world war would pave the way for the next — furthered our civilizational conflation of duty with work and work with virtue, lulling us into the modern trance of busyness. More than half a century before Annie Dillard observed that “how we spend our days is, of course, how we spend our lives,” Russell traces the ledger of our existential spending back to war’s false promise of freedom:
The war showed conclusively that, by the scientific organization of production, it is possible to keep modern populations in fair comfort on a small part of the working capacity of the modern world. If, at the end of the war, the scientific organization, which had been created in order to liberate men for fighting and munition work, had been preserved, and the hours of work had been cut down to four, all would have been well. Instead of that the old chaos was restored, those whose work was demanded were made to work long hours, and the rest were left to starve as unemployed. Why? Because work is a duty, and a man should not receive wages in proportion to what he has produced, but in proportion to his virtue as exemplified by his industry.
Pointing out that this equivalence originates in the same morality — or, rather, immorality — that produced the slave state, he exposes the core cultural falsehood it has effected, which stands as a monumental obstruction to equality and social justice in contemporary society:
The idea that the poor should have leisure has always been shocking to the rich.
Born in an era when urban workingmen had just acquired the right to vote in Great Britain, Russell draws on his own childhood for a stark illustration of this belief and its far-reaching tentacles of socioeconomic oppression:
I remember hearing an old Duchess say: “What do the poor want with holidays? They ought to work.” People nowadays are less frank, but the sentiment persists, and is the source of much of our economic confusion.
That sentiment, Russell reminds us again and again, is ahistorical. Advances in science, technology, and the very mechanics of society have made it no longer necessary for the average person to endure fifteen-hour workdays in order to obtain basic sustenance, as adults — and often children — had to in the early nineteenth century. But while the allocation of our time in relation to need has changed immensely, our attitudes about how that time is spent hardly have. He writes:
Every human being, of necessity, consumes, in the course of his life, a certain amount of the produce of human labor.
The wise use of leisure, it must be conceded, is a product of civilization and education. A man who has worked long hours all his life will be bored if he becomes suddenly idle. But without a considerable amount of leisure a man is cut off from many of the best things. There is no longer any reason why the bulk of the population should suffer this deprivation; only a foolish asceticism, usually vicarious, makes us continue to insist on work in excessive quantities now that the need no longer exists.
But while reinstating the dignity of leisure — or what Russell calls idleness — is a necessary condition for recalibrating our life-satisfaction to more adequately reflect the contemporary realities of work and need, it is not a sufficient one. Exacerbating our already warped relationship with work is the muddling of needs and wants at the heart of capitalist materialism — something Russell would address nearly two decades later in his Nobel Prize acceptance speech, listing acquisitiveness as the first of the four desires driving human behavior. He considers the radical shift that would take place if we were to stop regarding the virtue of work as an end in itself and begin seeing it as a means to a state of being in which work is no longer needed, reinstating leisure and comfort — that is, a contented sense of enoughness — as the proper existential end:
What will happen when the point has been reached where everybody could be comfortable without working long hours?
In the West, we have various ways of dealing with this problem. We have no attempt at economic justice, so that a large proportion of the total produce goes to a small minority of the population, many of whom do no work at all. Owing to the absence of any central control over production, we produce hosts of things that are not wanted. We keep a large percentage of the working population idle, because we can dispense with their labor by making the others overwork. When all these methods prove inadequate, we have a war; we cause a number of people to manufacture high explosives, and a number of others to explode them, as if we were children who had just discovered fireworks. By a combination of all these devices we manage, though with difficulty, to keep alive the notion that a great deal of severe manual work must be the lot of the average man.
Our society, Russell argues, is driven by “continually fresh schemes, by which present leisure is to be sacrificed to future productivity.” He challenges the inanity of this proposition:
The fact is that moving matter about, while a certain amount of it is necessary to our existence, is emphatically not one of the ends of human life. If it were, we should have to consider every navvy superior to Shakespeare. We have been misled in this matter by two causes. One is the necessity of keeping the poor contented, which has led the rich, for thousands of years, to preach the dignity of labor, while taking care themselves to remain undignified in this respect. The other is the new pleasure in mechanism, which makes us delight in the astonishingly clever changes that we can produce on the earth’s surface. Neither of these motives makes any great appeal to the actual worker. If you ask him what he thinks the best part of his life, he is not likely to say: “I enjoy manual work because it makes me feel that I am fulfilling man’s noblest task, and because I like to think how much man can transform his planet. It is true that my body demands periods of rest, which I have to fill in as best I may, but I am never so happy as when the morning comes and I can return to the toil from which my contentment springs.” I have never heard workingmen say this sort of thing. They consider work, as it should be considered, a necessary means to a livelihood, and it is from their leisure hours that they derive whatever happiness they may enjoy.
Decades before Diane Ackerman made her exquisite case for the evolutionary and existential value of play, Russell considers how the cult of productivity has demolished one of life’s pillars of satisfaction. Noting that modern people — true of the moderns of 1932, even truer of today’s — enjoy a little leisure but wouldn’t know what to do with themselves if they had to work only four hours a day, he observes:
In so far as this is true in the modern world, it is a condemnation of our civilization; it would not have been true at any earlier period. There was formerly a capacity for lightheartedness and play which has been to some extent inhibited by the cult of efficiency. The modern man thinks that everything ought to be done for the sake of something else, and never for its own sake.
The seedbed of this soul-shriveling belief is the notion — a driving force of consumerism — that the only worthwhile activities are those that bring material profit. A formidable logician, Russell exposes the self-unraveling nature of this argument:
Broadly speaking, it is held that getting money is good and spending money is bad. Seeing that they are two sides of one transaction, this is absurd; one might as well maintain that keys are good, but keyholes are bad. Whatever merit there may be in the production of goods must be entirely derivative from the advantage to be obtained by consuming them. The individual, in our society, works for profit; but the social purpose of his work lies in the consumption of what he produces. It is this divorce between the individual and the social purpose of production that makes it so difficult for men to think clearly in a world in which profit-making is the incentive to industry. We think too much of production, and too little of consumption. One result is that we attach too little importance to enjoyment and simple happiness, and that we do not judge production by the pleasure that it gives to the consumer.
Another result, Russell argues, is a kind of split between positive idleness, which ought to be the nourishing end of work, and negative idleness, which ends up being the effect of work under the spell of consumerism and its consequent socioeconomic inequality. He writes:
The pleasures of urban populations have become mainly passive: seeing cinemas, watching football matches, listening to the radio, and so on. This results from the fact that their active energies are fully taken up with work; if they had more leisure, they would again enjoy pleasures in which they took an active part.
With an eye to our civilization’s triumphs and failures of self-actualization, Russell points out that, historically, there has been a small leisure class enjoying a great many privileges without a basis in social justice, profiting on the backs of a large working class toiling for survival. While this rendered the oppressive leisure class morally condemnable, it resulted in the vast majority of art and science — “the whole of what we call civilization.” He writes:
Without the leisure class, mankind would never have emerged from barbarism.
The method of a hereditary leisure class without duties was, however, extraordinarily wasteful. None of the members of the class had been taught to be industrious, and the class as a whole was not exceptionally intelligent. The class might produce one Darwin, but against him had to be set tens of thousands of country gentlemen who never thought of anything more intelligent than fox-hunting and punishing poachers.
Russell’s most compelling point is the most counterintuitive — the idea that reclaiming leisure is not a reinforcement of elitism but the antidote to elitism itself and a form of resistance to oppression, for it would require dismantling the power structures of modern society and undoing the spell they have cast on us to keep the poor poor and the rich rich. To correctly calibrate modern life around a sense of enough — that is, around meeting the need for comfort rather than satisfying the endless want for consumerist acquisitiveness — would be to lay the groundwork for social justice. In such a society, Russell argues, no one would have to work more than four hours out of twenty-four — a proposition even more countercultural today than it was in his era. He paints the landscape of possibility:
In a world where no one is compelled to work more than four hours a day, every person possessed of scientific curiosity will be able to indulge it, and every painter will be able to paint without starving, however excellent his pictures may be. Young writers will not be obliged to draw attention to themselves by sensational potboilers, with a view to acquiring the economic independence needed for monumental works, for which, when the time at last comes, they will have lost the taste and the capacity.
Above all, there will be happiness and joy of life, instead of frayed nerves, weariness, and dyspepsia. The work exacted will be enough to make leisure delightful, but not enough to produce exhaustion. Since men will not be tired in their spare time, they will not demand only such amusements as are passive and vapid. At least 1 per cent will probably devote the time not spent in professional work to pursuits of some public importance, and, since they will not depend upon these pursuits for their livelihood, their originality will be unhampered, and there will be no need to conform to the standards set by elderly pundits. But it is not only in these exceptional cases that the advantages of leisure will appear. Ordinary men and women, having the opportunity of a happy life, will become more kindly and less persecuting and less inclined to view others with suspicion. The taste for war will die out, partly for this reason, and partly because it will involve long and severe work for all. Good nature is, of all moral qualities, the one that the world needs most, and good nature is the result of ease and security, not of a life of arduous struggle. Modern methods of production have given us the possibility of ease and security for all; we have chosen, instead, to have overwork for some and starvation for the others. Hitherto we have continued to be as energetic as we were before there were machines; in this we have been foolish, but there is no reason to go on being foolish for ever.
The anatomy of feeling, the science of psychedelics, Ursula K. Le Guin’s final poetry collection, arresting essays by Zadie Smith, Rebecca Solnit, Anne Lamott, and Audre Lorde, a physicist’s lyrical meditation on science and spirituality, and more.
By Maria Popova
I treat my annual best-of reading lists as Old Year’s resolutions in reverse — unlike traditional resolutions, which frame aspirational priorities for the new year, they present a record of the reading that merited priority over the year past. In consequence, they are invariably subjective and incomplete — a shelf’s worth of books that I, one person, read and enjoyed in the time given, with the sensibility I have. Since this year I finished writing one book and putting together another, my reading time for new releases has been especially limited, which means these annual selections are especially subjective — no doubt I missed a great many worthy and wonderful books. But of those I did read, here — in excerpts from the pieces I originally wrote about them earlier in the year — are the ones I loved with all my heart and mind:
SO FAR SO GOOD
In November of 2014, the wise and wonderful Ursula K. Le Guin (October 21, 1929–January 22, 2018) — one of the great losses of 2018 — accepted the National Book Award with a stunning speech that quickly became our era’s supreme manifesto for protecting the art of the written word from the assault of the market. In consonance with her conviction, Le Guin sent the manuscript of her final poetry collection to an independent nonprofit poetry publisher, Copper Canyon Press, which turned directly to her readers to bring it to life. And oh how alive So Far So Good (public library) is — a sort of existential atlas, traversing bordering territories of meditations, incantations, and divinations on subjects like time, impermanence, and the splendors of uncertainty. Undergirding the verses is Le Guin’s largehearted generosity of spirit — toward the reader, toward nature and reality, toward the intertwined natures of life and art.
In the vast abyss before time, self
is not, and soul commingles
with mist, and rock, and light. In time,
soul brings the misty self to be.
Then slow time hardens self to stone
while ever lightening the soul,
till soul can loose its hold of self
and both are free and can return
to vastness and dissolve in light,
the long light after time.
In one of the most arresting essays, titled “On Optimism and Despair,” Smith takes on an eternal question that has bared its sharpest edges in our cultural moment — the question John Steinbeck tussled with when he wrote to his best friend at the peak of WWII: “All the goodness and the heroisms will rise up again, then be cut down again and rise up. It isn’t that the evil thing wins — it never will — but that it doesn’t die.”
Caught in the maelstrom of the moment, we forget this cyclical nature of history — history being, as I wrote in Figuring, not what happened, but what survives the shipwrecks of judgment and chance. We forget that the present always looks different from the inside than it does from the outside — something James Baldwin knew when, in considering why Shakespeare endures, he observed: “It is said that his time was easier than ours, but I doubt it — no time can be easy if one is living through it.” We forget that our particular moment, with all its tribulations and triumphs, is not neatly islanded in the river of time but swept afloat by massive cultural currents that have raged long before it and will rage long after.
Two days after the 2016 American presidential election, Smith — a black Englishwoman living in the freshly sundered United States — was invited to give a speech upon receiving a literary award in Germany. Traveling from a country on the brink of one catastrophic political regime to a country that has survived another, Smith took the opportunity to unmoor the despair of the present from the shallow waters of the cultural moment and cast it into the oceanic context of humanity’s pasts, aswirl with examples and counterexamples of progress, with ideals attained and shattered, with abiding assurance that we shape tomorrow by how we navigate our parallel potentialities for moral ruin and moral redemption today.
Nearly half a century after the German humanistic philosopher Erich Fromm asserted that “optimism is an alienated form of faith, pessimism an alienated form of despair” and a turn of the cycle after Rebecca Solnit contemplated our grounds for hope in dark times, Smith addresses a question frequently posed before her — why her earlier novels are aglow with optimism, while her later writing is “tinged with despair” — a question implying that the arc of her body of work inclines toward an admission of the failure of its central animating forces: diversity, multiculturalism, the polyphony of perspectives. With an eye to “what the ancient Greeks did to each other, and the Romans, and the seventeenth-century British, and the nineteenth-century Americans,” Smith offers a corrective that stretches the ahistorical arc of that assumption:
My best friend during my youth — now my husband — is himself from Northern Ireland, an area where people who look absolutely identical to each other, eat the same food, pray to the same God, read the same holy book, wear the same clothes and celebrate the same holidays have yet spent four hundred years at war over a relatively minor doctrinal difference they later allowed to morph into an all-encompassing argument over land, government and national identity. Racial homogeneity is no guarantor of peace, any more than racial heterogeneity is fated to fail.
Speaking from the German stage, Smith recounts visiting the country during her first European book tour in her early twenties, traveling with her father, who had been there in 1945 as a young soldier in the reconstruction:
We made a funny pair on that tour, I’m sure: a young black girl and her elderly white father, clutching our guidebooks and seeking those spots in Berlin that my father had visited almost fifty years earlier. It is from him that I have inherited both my optimism and my despair, for he had been among the liberators at Belsen and therefore seen the worst this world has to offer, but had, from there, gone forward, with a sufficiently open heart and mind, striding into one failed marriage and then another, marrying both times across various lines of class, color and temperament, and yet still found in life reasons to be cheerful, reasons even for joy.
He was a member of the white working class, a man often afflicted by despair who still managed to retain a core optimism. Perhaps in a different time under different cultural influences living in a different society he would have become one of the rabid old angry white men of whom the present left is so afeared. As it was, born in 1925 and dying in 2006, he saw his children benefit from the civilized postwar protections of free education and free health care, and felt he had many reasons to be grateful.
This is the world I knew. Things have changed, but history is not erased by change, and the examples of the past still hold out new possibilities for all of us, opportunities to remake, for a new generation, the conditions from which we ourselves have benefited… Progress is never permanent, will always be threatened, must be redoubled, restated and reimagined if it is to survive.
It is, of course, an abiding question, as old as consciousness — we are material creatures that live in a material universe, yet we are capable of experiences that transcend what we can atomize into physical facts: love, joy, the full-being gladness of a Beethoven symphony on a midsummer’s night.
The Nobel-winning physicist Niels Bohr articulated the basic paradox of living with and within such a duality: “The fact that religions through the ages have spoken in images, parables, and paradoxes means simply that there are no other ways of grasping the reality to which they refer. But that does not mean that it is not a genuine reality. And splitting this reality into an objective and a subjective side won’t get us very far.”
Nearly a century after Bohr, the physicist and writer Alan Lightman takes us further, beyond these limiting dichotomies, in Searching for Stars on an Island in Maine (public library) — a lyrical and illuminating inquiry into our dual impulse for belief in the unprovable and for trust in truth affirmed by physical evidence. Through the lens of his personal experience as a working scientist and a human being with uncommon receptivity to the poetic dimensions of life, Lightman traces our longing for absolutes in a relative world from Galileo to Van Gogh, from Descartes to Dickinson, emerging with that rare miracle of insight at the meeting point of the lucid and the luminous.
Lightman, who has previously written beautifully about his transcendent experience facing a young osprey, relays a parallel experience he had one summer night on an island off the coast of Maine, where he and his wife have been going for a quarter century. On this small, remote speck of land, cut off from the mainland with no ferries or bridges, each of the six families has had to learn to cross the ocean by small boat — a task particularly challenging at night. Lightman recounts the unbidden revelation of one such nocturnal crossing:
No one was out on the water but me. It was a moonless night, and quiet. The only sound I could hear was the soft churning of the engine of my boat. Far from the distracting lights of the mainland, the sky vibrated with stars. Taking a chance, I turned off my running lights, and it got even darker. Then I turned off my engine. I lay down in the boat and looked up. A very dark night sky seen from the ocean is a mystical experience. After a few minutes, my world had dissolved into that star-littered sky. The boat disappeared. My body disappeared. And I found myself falling into infinity. A feeling came over me I’d not experienced before… I felt an overwhelming connection to the stars, as if I were part of them. And the vast expanse of time — extending from the far distant past long before I was born and then into the far distant future long after I will die — seemed compressed to a dot. I felt connected not only to the stars but to all of nature, and to the entire cosmos. I felt a merging with something far larger than myself, a grand and eternal unity, a hint of something absolute. After a time, I sat up and started the engine again. I had no idea how long I’d been lying there looking up.
Lightman — the first professor at MIT to receive a dual faculty appointment in science and the humanities — syncopates this numinous experience with the reality of his lifelong devotion to science:
I have worked as a physicist for many years, and I have always held a purely scientific view of the world. By that, I mean that the universe is made of material and nothing more, that the universe is governed exclusively by a small number of fundamental forces and laws, and that all composite things in the world, including humans and stars, eventually disintegrate and return to their component parts. Even at the age of twelve or thirteen, I was impressed by the logic and materiality of the world. I built my own laboratory and stocked it with test tubes and petri dishes, Bunsen burners, resistors and capacitors, coils of electrical wire. Among other projects, I began making pendulums by tying a fishing weight to the end of a string. I’d read in Popular Science or some similar magazine that the time for a pendulum to make a complete swing was proportional to the square root of the length of the string. With the help of a stopwatch and ruler, I verified this wonderful law. Logic and pattern. Cause and effect. As far as I could tell, everything was subject to numerical analysis and quantitative test. I saw no reason to believe in God, or in any other unprovable hypotheses.
Yet after my experience in that boat many years later… I understood the powerful allure of the Absolutes — ethereal things that are all-encompassing, unchangeable, eternal, sacred. At the same time, and perhaps paradoxically, I remained a scientist. I remained committed to the material world.
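The “wonderful law” the young Lightman verified with stopwatch and ruler is the small-angle period of a simple pendulum, T = 2π√(L/g), in which the period grows with the square root of the string’s length. A minimal Python sketch (not from the book; function name and test lengths are illustrative) confirms the proportionality he observed:

```python
import math

def pendulum_period(length_m, g=9.81):
    """Small-angle period of a simple pendulum: T = 2*pi*sqrt(L/g)."""
    return 2 * math.pi * math.sqrt(length_m / g)

# If T is proportional to sqrt(L), quadrupling the length doubles the period.
short = pendulum_period(0.5)   # 0.5-meter string
long = pendulum_period(2.0)    # 2.0-meter string (4x longer)
print(round(long / short, 6))  # → 2.0
```

Note that the mass of the fishing weight never enters the formula — part of what made the law feel, to a twelve-year-old, like evidence of a world governed by logic and pattern.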
Against our human finitude, temporality, and imperfection, these “Absolutes” offer infinity, eternity, perfection. Lightman defines them as concepts and beliefs that “refer to an enduring and fixed reference point that can anchor and guide us through our temporary lives” — notions like constancy, immortality, permanence, the soul, “God”; notions unprovable by the scientific method. Conversely, however, notions that belong to this realm of Absolutes fall apart when they make claims in the realm of science — claims disproven by the facts of the material world. With an eye to how the discoveries of modern science — from heliocentricity to evolution to the chemical composition of the universe — have challenged many of these Absolutes, Lightman writes:
Nothing in the physical world seems to be constant or permanent. Stars burn out. Atoms disintegrate. Species evolve. Motion is relative. Even other universes might exist, many without life. Unity has given way to multiplicity. I say that the Absolutes have been challenged rather than disproved, because the notions of the Absolutes cannot be disproved any more than they can be proved. The Absolutes are ideals, entities, beliefs in things that lie beyond the physical world. Some may be true and some false, but the truth or falsity cannot be proven.
From all the physical and sociological evidence, the world appears to run not on absolutes but on relatives, context, change, impermanence, and multiplicity. Nothing is fixed. All is in flux.
On the one hand, such an onslaught of discovery presents a cause for celebration… Is it not a testament to our minds that we little human beings with our limited sensory apparatus and brief lifespans, stuck on our one planet in space, have been able to uncover so much of the workings of nature? On the other hand, we have found no physical evidence for the Absolutes. And just the opposite. All of the new findings suggest that we live in a world of multiplicities, relativities, change, and impermanence. In the physical realm, nothing persists. Nothing lasts. Nothing is indivisible. Even the subatomic particles found in the twentieth century are now thought to be made of even smaller “strings” of energy, in a continuing regression of subatomic Russian dolls. Nothing is a whole. Nothing is indestructible. Nothing is still. If the physical world were a novel, with the business of examining evil and good, it would not have the clear lines of Dickens but the shadowy ambiguities of Dostoevsky.
“To be a good human being,” philosopher Martha Nussbaum observed, “is to have a kind of openness to the world, an ability to trust uncertain things beyond your own control” — to have, that is, a willingness to regard with an openhearted curiosity what is other than ourselves and therefore strange, discomfiting, difficult to fathom and relate to, difficult at first to love, for we cannot love what we do not understand. Out of such regard arises the awareness at the heart of Lucille Clifton’s lovely poem “cutting greens” — a recognition of “the bond of live things everywhere,” among which we are only a small part of a vast and miraculous world, and from which we can learn a great deal about being better versions of ourselves.
That is what naturalist and author Sy Montgomery, one of the most poetic science writers of our time, explores in How to Be a Good Creature: A Memoir in Thirteen Animals (public library), illustrated by artist Rebecca Green — an autobiographical adventure into the wilderness of our common humanity, where the world of science and the legacy of Aesop converge into an existential expedition to uncover the elemental truth that “knowing someone who belongs to another species can enlarge your soul in surprising ways.”
Looking back on her unusual and passionate life of swimming with electric eels, digging for mistletoe seeds in emu droppings, and communing with giant octopuses, Montgomery reflects on what she learned about leadership from an emu, about ferocity and forgiveness from an ermine, about living with a sense of wholeness despite imperfection from a one-eyed dog named Thurber (after the great New Yorker cartoonist and essayist James Thurber, who was blinded in one eye by an arrow as a child), and about what it takes for the heart to be “stretched wide with awe.”
At the New England Aquarium, Montgomery gets to know one of Earth’s most alien creatures — the subject of her exquisite book The Soul of an Octopus. She writes:
Reading an octopus’s intentions is not like reading, for instance, a dog’s. I could read [my dog] Sally’s feelings in a glance, even if the only part of her I could see was her tail, or one ear. But Sally was family, and in more than one sense. Dogs, like all placental mammals, share 90 percent of our genetic material. Dogs evolved with humans. Octavia and I were separated by half a billion years of evolution. We were as different as land from sea. Was it even possible for a human to understand the emotions of a creature as different from us as an octopus?
As Octavia slowly allows this improbable and almost miraculous cross-species creaturely connection, Montgomery reflects on the insight attributed to the ancient Greek philosopher Thales of Miletus — “The universe is alive, and has fire in it, and is full of gods” — and writes:
Being friends with an octopus — whatever that friendship meant to her — has shown me that our world, and the worlds around and within it, is aflame with shades of brilliance we cannot fathom — and is far more vibrant, far more holy, than we could ever imagine.
“There is no time for despair, no place for self-pity, no need for silence, no room for fear,” Toni Morrison exhorted in considering the artist’s task in troubled times. In our interior experience as individuals, as in the public forum of our shared experience as a culture, our courage lives in the same room as our fear — it is in troubled times, in despairing times, that we find out who we are and what we are capable of.
That is what the great poet, essayist, feminist, and civil rights champion Audre Lorde (February 18, 1934–November 17, 1992) explores with exquisite self-possession and might of character in a series of diary entries included in A Burst of Light: and Other Essays (public library).
Seventeen days before she turned fifty, and six years after she underwent a mastectomy for breast cancer, Lorde was told she had liver cancer. She declined surgery and even a biopsy, choosing instead to go on living her life and her purpose, exploring alternative treatments as she proceeded with her planned teaching trip to Europe. In a diary entry penned on her fiftieth birthday, Lorde reckons with the sudden call to confront the ultimate fear:
I want to write down everything I know about being afraid, but I’d probably never have enough time to write anything else. Afraid is a country where they issue us passports at birth and hope we never seek citizenship in any other country. The face of afraid keeps changing constantly, and I can count on that change. I need to travel light and fast, and there’s a lot of baggage I’m going to have to leave behind me. Jettison cargo.
“Not every man knows what he shall sing at the end,” the poet Mark Strand, born within weeks of Lorde, wrote in his stunning ode to mortality. Exactly a month after her diagnosis, with the medical establishment providing more confusion than clarity as she confronts her mortality, Lorde resolves in her journal:
Dear goddess! Face-up again against the renewal of vows. Do not let me die a coward, mother. Nor forget how to sing. Nor forget song is a part of mourning as light is a part of sun.
By the spring, she had lost nearly fifty pounds. But she was brimming with a crystalline determination to do the work of visibility and kinship across difference. She taught in Germany, immersed herself in the international communities of the African Diaspora, and traveled to the world’s first Feminist Book Fair in London. “I may be too thin, but I can still dance!” she exults in her diary on the first day of June. She dances with her fear in an entry penned six days later:
I am listening to what fear teaches. I will never be gone. I am a scar, a report from the frontlines, a talisman, a resurrection. A rough place on the chin of complacency.
“Finding the words is another step in learning to see,” bryologist Robin Wall Kimmerer wrote in reflecting on what her Native American tradition and her training as a scientist taught her about how naming confers dignity upon life. If to name is to see and reveal — to remove the veil of blindness, willful or manipulated, and expose things as they really are — then it is in turn another step in remaking the world, another form of resistance to the damaging dominant narratives that go unquestioned. Walt Whitman knew this when he contemplated our greatest civic might: “I can conceive of no better service… than boldly exposing the weakness, liabilities and infinite corruptions of democracy.”
A century and a half after Whitman, Rebecca Solnit — one of our own era’s boldest public defenders of democracy, and one of the most poetic — explores this crucial causal link between the stories we tell and the world we build in Call Them by Their True Names (public library) — a collection of her essays at the nexus of politics, philosophy, and the selective record of personal and political choices we call history. Composed in response to more than a decade’s worth of cultural crises and triumphs, the pieces in the book furnish an extraordinarily lucid yet hopeful lens on the present and a boldly uncynical telescopic perspective on the future.
Solnit writes in the preface:
One of the folktale archetypes, according to the Aarne-Thompson classification of these stories, tells of how “a mysterious or threatening helper is defeated when the hero or heroine discovers his name.” In the deep past, people knew names had power. Some still do. Calling things by their true names cuts through the lies that excuse, buffer, muddle, disguise, avoid, or encourage inaction, indifference, obliviousness. It’s not all there is to changing the world, but it’s a key step.
When the subject is grim, I think of the act of naming as diagnosis. Though not all diagnosed diseases are curable, once you know what you’re facing, you’re far better equipped to know what you can do about it. Research, support, and effective treatment, as well as possibly redefining the disease and what it means, can proceed from this first step. Once you name a disorder, you may be able to connect to the community afflicted with it, or build one. And sometimes what’s diagnosed can be cured.
That, indeed, is what the philosopher and Trappist monk Thomas Merton celebrated in his beautiful fan letter to Rachel Carson after she catalyzed the modern environmental movement by speaking inconvenient truth to power about pesticides, marketed at the time as harmless helpers to humanity — an act Merton considered “contributing a most valuable and essential piece of evidence for the diagnosis of the ills of our civilization.” Such naming of wrongs, betrayals, and corruptions unweaves the very fabric of the status quo. It is, Solnit argues, “the first step in the process of liberation” and often leads to shifts in the power system itself. In the age of “alternative facts,” when language is used as a weapon of oppression and manipulation, her words reverberate with the irrepressible, unsilenceable urgency of truth:
To name something truly is to lay bare what may be brutal or corrupt — or important or possible — and key to the work of changing the world is changing the story.
Chance and choice converge to make us who we are, and although we may mistake chance for choice, our choices are the cobblestones, hard and uneven, that pave our destiny. They are ultimately all we can answer for and point to in the architecture of our character. Joan Didion captured this with searing lucidity in defining character as “the willingness to accept responsibility for one’s own life” and locating in that willingness the root of self-respect.
A century before Didion, Friedrich Nietzsche (October 15, 1844–August 25, 1900) composed the score for harmonizing our choices and our contentment with the life they garner us. Nietzsche, who greatly admired Emerson’s ethos of nonconformity and self-reliant individualism, wrote fervently, almost frenetically, about how to find yourself and what it means to be a free spirit. He saw the process of becoming oneself as governed by the willingness to own one’s choices and their consequences — a difficult willingness, yet one that promises the antidote to existential hopelessness, complacency, and anguish.
The legacy of that deceptively simple yet profound proposition is what philosopher John J. Kaag explores in Hiking with Nietzsche: On Becoming Who You Are (public library) — part masterwork of poetic scholarship, part contemplative memoir concerned with the most fundamental question of human life: What gives our existence meaning?
The answer, Kaag suggests in drawing on Nietzsche’s most timeless ideas, challenges our ordinary understanding of selfhood and its cascading implications for happiness, fulfillment, and the building blocks of existential contentment. He writes:
The self is not a hermetically sealed, unitary actor (Nietzsche knew this well), but its flourishing depends on two things: first, that it can choose its own way to the greatest extent possible, and then, when it fails, that it can embrace the fate that befalls it.
At the center of Nietzsche’s philosophy is the idea of eternal return — the ultimate embrace of responsibility that comes from accepting the consequences, good or bad, of one’s willful action. Embedded in it is an urgent exhortation to calibrate our actions in such a way as to make their consequences bearable, livable with, in a hypothetical perpetuity. Nietzsche illustrates the concept with a simple, stirring thought experiment in The Gay Science:
What if some day or night a demon were to steal into your loneliest loneliness and say to you: “This life as you now live and have lived it you will have to live once again and innumerable times again; and there will be nothing new in it, but every pain and every joy and every thought and sigh and everything unspeakably small or great in your life must return to you, all in the same succession and sequence — even this spider and this moonlight between the trees, and even this moment and I myself…”
“Attention is an intentional, unapologetic discriminator. It asks what is relevant right now, and gears us up to notice only that,” cognitive scientist Alexandra Horowitz wrote in her inquiry into how our conditioned way of looking narrows the lens of our perception. Attention, after all, is the handmaiden of consciousness, and consciousness the central fact and the central mystery of our creaturely experience. From the days of Plato’s cave to the birth of neuroscience, we have endeavored to fathom its nature. But it is a mystery that only seems to deepen with each increment of approach. “Our normal waking consciousness,” William James wrote in his landmark 1902 treatise on spirituality, “is but one special type of consciousness, whilst all about it, parted from it by the filmiest of screens, there lie potential forms of consciousness entirely different… No account of the universe in its totality can be final which leaves these other forms of consciousness quite disregarded.”
Half a century after James, two new molecules punctured the filmy screen to unlatch a portal to a wholly novel universe of consciousness, shaking up our most elemental assumptions about the nature of the mind, our orientation toward mortality, and the foundations of our social, political, and cultural constructs. One of these molecules — lysergic acid diethylamide, or LSD — was a triumph of twentieth-century science, somewhat accidentally synthesized by the Swiss chemist Albert Hofmann in the year physicist Lise Meitner discovered nuclear fission. The other — the compound psilocybin, known among the Aztecs as “flesh of the gods” — was the rediscovery of a substance produced by a humble brown mushroom, which indigenous cultures across eras and civilizations had been incorporating into their spiritual rituals since ancient times, and which the Roman Catholic Church had violently suppressed and buried during the Spanish conquest of the Americas.
Together, these two molecules commenced the psychedelic revolution of the 1950s and 1960s, frothing the stream of consciousness — a term James coined — into a turbulent existential rapids. Their proselytes included artists, scientists, political leaders, and ordinary people of all stripes. Their most ardent champions were the psychiatrists and physicians who lauded them as miracle drugs for salving psychic maladies as wide-ranging as anxiety, addiction, and clinical depression. Their cultural consequence was likened to that of the era’s other cataclysmic disruptor: the atomic bomb.
And then — thanks in large part to Timothy Leary’s reckless handling of his Harvard psilocybin studies, which eventually landed him in prison, where Carl Sagan sent him cosmic poetry — a landslide of moral panic and political backlash outlawed psychedelics, shut down clinical studies of their medical and psychiatric uses, and drove them underground. For decades, academic research into their potential for human flourishing languished and nearly perished. But a small subset of scientists, psychiatrists, and amateur explorers refused to relinquish their curiosity about that potential.
The 1990s brought a quiet groundswell of second-wave interest in psychedelics — a resurgence that culminated with a 2006 paper reporting on studies at Johns Hopkins, which had found that psilocybin had occasioned “mystical-type experiences having substantial and sustained personal meaning and significance” for terminally ill cancer patients — experiences from which they “return with a new perspective and profound acceptance.” In other words, the humble mushroom compound had helped people face the ultimate frontier of existence — their own mortality — with unparalleled equanimity. The basis of the experience, researchers found, was a sense of the dissolution of the personal ego, followed by a sense of becoming one with the universe — a notion strikingly similar to Bertrand Russell’s insistence that a fulfilling life and a rewarding old age are a matter of “[making] your interests gradually wider and more impersonal, until bit by bit the walls of the ego recede, and your life becomes increasingly merged in the universal life.”
More clinical experiments followed at UCLA, NYU, and other leading universities, demonstrating that this psilocybin-induced dissolution of the ego, extremely difficult if not impossible to achieve in our ordinary consciousness, has profound benefits in rewiring the faulty mental mechanisms responsible for disorders like alcoholism, anxiety, and depression.
One good way to understand a complex system is to disturb it and then see what happens. By smashing atoms, a particle accelerator forces them to yield their secrets. By administering psychedelics in carefully calibrated doses, neuroscientists can profoundly disturb the normal waking consciousness of volunteers, dissolving the structures of the self and occasioning what can be described as a mystical experience. While this is happening, imaging tools can observe the changes in the brain’s activity and patterns of connection. Already this work is yielding surprising insights into the “neural correlates” of the sense of self and spiritual experience.
Pollan examines the psilocybin studies of cancer patients, which reignited scientific interest in psychedelics, and the profound results of subsequent studies exploring the use of psychedelics in treating mental illness, including addiction, depression, and obsessive-compulsive disorder. He approaches his subject as a science writer and a skeptic endowed with equal parts rigorous critical thinking and open-minded curiosity. In a sentiment evocative of physicist Alan Lightman’s elegant braiding of the numinous and the scientific, he echoes Carl Sagan’s views on the mystery of reality and examines his own lens:
My default perspective is that of the philosophical materialist, who believes that matter is the fundamental substance of the world and the physical laws it obeys should be able to explain everything that happens. I start from the assumption that nature is all that there is and gravitate toward scientific explanations of phenomena. That said, I’m also sensitive to the limitations of the scientific-materialist perspective and believe that nature (including the human mind) still holds deep mysteries toward which science can sometimes seem arrogant and unjustifiably dismissive.
Was it possible that a single psychedelic experience — something that turned on nothing more than the ingestion of a pill or square of blotter paper — could put a big dent in such a worldview? Shift how one thought about mortality? Actually change one’s mind in enduring ways?
The idea took hold of me. It was a little like being shown a door in a familiar room — the room of your own mind — that you had somehow never noticed before and being told by people you trusted (scientists!) that a whole other way of thinking — of being! — lay waiting on the other side. All you had to do was turn the knob and enter. Who wouldn’t be curious? I might not have been looking to change my life, but the idea of learning something new about it, and of shining a fresh light on this old world, began to occupy my thoughts. Maybe there was something missing from my life, something I just hadn’t named.
We go through life seeing reality not as it really is, in its unfathomable depths of complexity and contradiction, but as we hope or fear or expect it to be. Too often, we mistake certainty for truth and the strength of our beliefs for the strength of the evidence. When we collide with the unexpected, with the antipode to our hopes, we are plunged into bewildered despair. We rise from the pit only by love. Perhaps Keats had it slightly wrong — perhaps truth is love and love is truth.
In general, it doesn’t feel like the light is making a lot of progress. It feels like death by annoyance. At the same time, the truth is that we are beloved, even in our current condition, by someone; we have loved and been loved. We have also known the abyss of love lost to death or rejection, and that it somehow leads to new life. We have been redeemed and saved by love, even as a few times we have been nearly destroyed, and worse, seen our children nearly destroyed. We are who we love, we are one, and we are autonomous.
She turns to the greatest paradox of the human heart — our parallel capacities for the perpendiculars of immense love and immense despair:
Love has bridged the high-rises of despair we were about to fall between. Love has been a penlight in the blackest, bleakest nights. Love has been a wild animal, a poultice, a dinghy, a coat. Love is why we have hope.
So why have some of us felt like jumping off tall buildings ever since we can remember, even those of us who do not struggle with clinical depression? Why have we repeatedly imagined turning the wheels of our cars into oncoming trucks?
We just do.
To me, this is very natural. It is hard here.
And yet, in the wreckage of this hardship, we find our most redemptive potentialities:
There is the absolute hopelessness we face that everyone we love will die, even our newborn granddaughter, even as we trust and know that love will give rise to growth, miracles, and resurrection. Love and goodness and the world’s beauty and humanity are the reasons we have hope. Yet no matter how much we recycle, believe in our Priuses, and abide by our local laws, we see that our beauty is being destroyed, crushed by greed and cruel stupidity. And we also see love and tender hearts carry the day. Fear, against all odds, leads to community, to bravery and right action, and these give us hope.
In a sentiment that calls to mind what psychologists call “the vampire problem” — the limiting loop by which we fail to imagine transformation because the very faculty doing the imagining can only be informed by the already transformed self — Lamott adds:
We can change. People say we can’t, but we do when the stakes or the pain is high enough. And when we do, life can change. It offers more of itself when we agree to give up our busyness.
“Life and Reality are not things you can have for yourself unless you accord them to all others,” Alan Watts wrote in the early 1950s, nearly a quarter century before Thomas Nagel’s landmark essay “What Is It Like to Be a Bat?” unlatched the study of other consciousnesses and seeded the disorienting awareness that other beings — “beings who walk other spheres,” to borrow Whitman’s wonderful term — experience this world we share in ways thoroughly alien to our own.
Today, we know that we need not step across the boundary of species to encounter such alien-seeming ways of inhabiting the world. There are innumerable ways of being human — we each experience life and reality in radically different ways merely by our way of seeing, but these differences are accentuated to an extreme when mental illness alters the elemental interiority of a consciousness. In these extreme cases, it can become impossible for even the most empathic imagination to grasp — not only cerebrally but with an embodied understanding — the slippery reality of an anguished consciousness so different from one’s own. Conversely, it can become impossible for those who share that anguish to articulate it, effecting an overwhelming sense of alienation and the false conviction that one is alone in one’s suffering. To convey that reality to those unbedeviled by such mental anguish, and to wrap language around its ineffable interiority for others who suffer silently from the same, is therefore a creative feat and existential service of the highest caliber.
That is what author, Happy Ending Music & Reading Series host, and my dear friend Amanda Stern accomplishes in Little Panic: Dispatches from an Anxious Life (public library) — part-memoir and part-portrait of a cruelly egalitarian affliction that cuts across all borders of age, gender, race, and class, clutching one’s entire reality and sense of self in a stranglehold that squeezes life out. What emerges is a sort of literary laboratory of consciousness, anatomizing an all-consuming yet elusive feeling-pattern to explore what it takes to break the tyranny of worry and what it means to feel at home in oneself.
Part of the splendor of the book is the way Stern unspools the thread of being to the very beginning, all the way to the small child predating conscious memory. In consonance with Maurice Sendak, who so passionately believed that a centerpiece of healthy adulthood is “having your child self intact and alive and something to be proud of,” the child-Amanda emerges from the pages alive and real to articulate in that simple, profound way only children have what the yet-undiagnosed acute anxiety disorder actually feels like from the inside:
Whenever I am afraid, worry sounds itself as sixty, seventy, radio channels playing at the same time inside my head. Refrains loop around and around my brain like fast jabber and I cannot get any of it to stop. I know there is something wrong with me, but no one knows how to fix me. Not anyone outside my body, and definitely not me. Eddie [Stern’s older brother] says a body is blood and bones and skin, and when everything falls off you’re a skeleton, but I am air pressure and tingly dots; energy and everything. I am air and nothing.
My breath flips on its side, horizontal and too wide to go through my lungs.
The grave paradox of mental illness and mental health is that, despite what we now know about how profoundly our emotions affect our physical wellbeing, these terms sever the head from the body — the physical body and the emotional body. A century after William James proclaimed that “a purely disembodied human emotion is a nonentity,” Stern offers a powerful corrective for our ongoing cultural Cartesianism. Her vivid prose, pulsating with a life in language, invites the reader into the interiority of a deeply embodied mind that experiences and comprehends the world somatically. “I was born with a basketball net slung over my top ribs, where the world dunks its balls of dread,” she writes as she channels her young self’s budding awareness that something is terribly, fundamentally wrong with her:
I am a growing constellation of errors. I don’t know what’s wrong with me, only that something is, and it must be too shameful to divulge, or so rare that even the doctors are stumped.
At the end of the book, Stern considers the centrality of anxiety in her own blink of existence and telescopes to a larger truth about this widespread yet largely invisible affliction that seems a fundamental feature of being human:
When did it start? It started before I was born. It started before my mother was born. It started when friction created the world. When does anything start? It doesn’t, it just grows, sometimes to unmanageable heights, and then, when you’re at the very edge, it becomes clear: something must be done.
Left untreated, anxiety disorders, like fingernails, grow with a person. The longer they go untended, the more mangled and painful they become. Often, they spiral, straight out of control, splitting and splintering into other disorders, like depression, social anxiety, agoraphobia. A merry-go-round of features we rise and fall upon. Separation anxiety handicaps its captives, preventing them from leaving bad relationships, moving far from home, going on trips, to parties, applying for jobs, having children, getting married, seeing friends, or falling asleep. Some people are so crippled by their anxiety they have panic attacks in anticipation of having a panic attack.
I’ve had panic attacks in nearly every part of New York City, even on Staten Island. I’ve had them in taxis, on subways, public bathrooms, banks, street corners, in Washington Square Park, on multiple piers, the Manhattan Bridge, Chinatown, the East Village, the Upper East Side, Central Park, Lincoln Center, the dressing room at Urban Outfitters, Mamoun’s Falafel, the Bobst library, the Mid-Manhattan Library, the main library branch, the Brooklyn Library, the Fort Greene Farmer’s Market, laundromats, book kiosks, in the entrance of FAO Schwarz, at the post office, the steps of the Met, on stoops, at the Brooklyn Flea, in bars, at friends’ houses, on stage, in the shower, in queen-sized beds, double beds, twin beds, in my crib.
I’ve grown so expert at hiding them, most people would never even know that I’m suffering. How, after all, do you explain that a restaurant’s decision to dim their lights swelled your throat shut, and that’s why you must leave immediately, not just the restaurant, but the neighborhood? If you cannot point to something, then it is invisible. Like a cult leader, anxiety traps you and convinces you that you’re the only one it sees.
For better or worse, we can only teach others what we understand… Each person begins, after all, as a story other people tell. And when we fall outside the confines of our common standards, we will assume our deficits define us.
My fear and my conviction were the same: that I was the flaw in the universe; the wrongly circled letter in our multiple-choice world. This terrible truth binds us all: fear there’s a single, unattainable, correct way to be human.
“A purely disembodied human emotion is a nonentity,” William James wrote in his pioneering 1884 theory of how our bodies affect our feelings. In the century-some since, breakthroughs in neurology, psychobiology, and neuroscience have contributed leaps of layered (though still incomplete) understanding of the relationship between the physical body and our emotional experience. That tessellated relationship is what neuroscientist Antonio Damasio examines in The Strange Order of Things: Life, Feeling, and the Making of Cultures (public library) — a title inspired by the disorienting fact that several billion years ago, single-cell organisms began exhibiting behaviors strikingly analogous to certain human social behaviors and 100 million years ago insects developed interactions, instruments, and cooperative strategies that we might call cultural. That such sociocultural behaviors long predate the development of the human brain casts new light on the ancient mind-body problem and offers a radical revision of how we understand mind, feeling, consciousness, and the construction of cultures.
Two decades after his landmark exploration of how the relationship between the body and the mind shapes our conscious experience, Damasio draws a visionary link between biology and social science in a fascinating investigation of homeostasis — the delicate balance that underpins our physical existence, ensures our survival, and defines our flourishing. At the heart of his inquiry is his lifelong interest in the nature of human affect — why we feel what we feel, how we use emotions to construct selfhood, what makes our intentions and our feelings so frequently contradictory, how the body and the mind conspire in the inception of emotional reality. What emerges is not an arsenal of certitudes and answers but a celebration of curiosity and a reminder that intelligent, informed speculation is how we expand the territory of knowledge by moving the boundary of the knowable further into the unknown.
Feelings, Damasio argues, are the unheralded germinators of human culture:
Human beings have distinguished themselves from all other beings by creating a spectacular collection of objects, practices, and ideas, collectively known as cultures. The collection includes the arts, philosophical inquiry, moral systems and religious beliefs, justice, governance, economic institutions, and technology and science.
Language, sociality, knowledge, and reason are the inventors and executors of these complicated processes. But feelings get to motivate them and stay on to check the results… Cultural activity began and remains deeply embedded in feeling. The favorable and unfavorable interplay of feeling and reason must be acknowledged if we are to understand the conflicts and contradictions of the human condition.
The modern-day universe of codes and ciphers began in a cottage on the prairie, with a pair of young lovers smiling at each other across a table and a rich man urging them to be spectacular.
The two young lovers were Elizebeth Smith and William Friedman, and the rich man, the eccentric textile tycoon George Fabyan.
The youngest of nine children raised in a modest Quaker home, Elizebeth was born in an era when fewer than four percent of American women graduated from college. Four years after earning her degree in Greek and English literature, she still felt like “a quivering, keenly alive, restless, mental question mark.” The following year, 1916, she began her improbable career at Riverbank Laboratories — Fabyan’s Wonderland-like estate, where the tycoon had hired Elizebeth to work on the cipher at the heart of a literary conspiracy theory claiming that Francis Bacon was the true author of Shakespeare’s works. At Riverbank, she met William, a young geneticist living in a windmill — one of the many fanciful fixtures of Riverbank — and studying seeds in order to infuse crops with optimal properties, a kind of proto-genetic engineering. Over long walks, animated by parallel intellectual voraciousness and shared skepticism of the Bacon cipher conspiracy, the two fell in love.
William and Elizebeth were married at Riverbank, where they had begun collaborating on cryptographic work. The papers on the subject they wrote together — though always published under William’s name alone — soon spread their reputation beyond Riverbank. Cryptography was new then, new and thrilling and full of unmined possibility for government intelligence, and so the U.S. Navy eventually recruited the Friedmans. Fagone writes:
The savaging of Nazis, the birth of a science: It begins on the day when a twenty-three-year-old American woman decides to trust her doubt and dig with her own mind.
The room is dark but her pencil is sharp. An envelope of puzzles arrives from Washington, sent by men who have the largest of responsibilities and the tiniest of clues. With William she examines the puzzles. He is game, he looks at her with eyes like little bonfires, he is in love with her. She is not in love yet but she would not be ashamed to fall in love with such a bright and kind person. She stares at the odd blocks of text and starts to flip and stack and rearrange them on a scratch pad, a kindling of letters, a friction of alphabets hot to the touch, and then a flame catches and then catches again, until she understands that she can ignite whenever she wants, that a power is there for the taking, for her and for anyone, and nothing will ever be the same. The ribs of a pattern shine through. Something rises at the nib of her pencil and her heart whomps away. The skeletons of words leap out and make her jump.
At twenty-two, physicist Freeman Dyson (b. December 15, 1923) ascended to a position Newton had held a quarter millennium earlier at Trinity College, where Dyson lived in a room just below Ludwig Wittgenstein’s. More than seventy years later, Dyson remains one of the preeminent scientific minds of our time and a rare witness of a great many cultural milestones, triumphs, and tragedies that have shaped modern life as we know it — landmark discoveries like cosmic microwave background radiation and the double helix structure of DNA, which have profoundly changed our understanding of the universe; the invention of the atomic bomb and the scarring brutality of a World War; the rise of the Internet. He has seen the stars of countless political regimes, scientific theories, and ideologies rise and fall. In Maker of Patterns: An Autobiography Through Letters (public library), Dyson unleashes his warm wisdom and unboastful wit on subjects as varied as politics, the enchantment of science, the vacuity of celebrity, the value of the immigrant perspective, his vibrant friendship with Richard Feynman, and the complexities of being human. He recounts “a flash of illumination” on the Greyhound bus that revealed to him the nature of creativity and composes a singularly delightful account of meeting the great, troubled logician Kurt Gödel at a farewell party for T.S. Eliot at the Princeton home of Robert Oppenheimer. What emerges is not only the fascinating memoir of an uncommon genius, composed of Dyson’s letters to his loved ones, but an invaluable time-capsule of collective memory.
The rewards and redemptions of that elemental yet endangered response are what British naturalist and environmental writer Michael McCarthy, a modern-day Carson, explores in The Moth Snowstorm: Nature and Joy (public library) — part memoir and part manifesto, a work of philosophy rooted in environmental science and buoyed by a soaring poetic imagination.
The natural world can offer us more than the means to survive, on the one hand, or mortal risks to be avoided, on the other: it can offer us joy.
There can be occasions when we suddenly and involuntarily find ourselves loving the natural world with a startling intensity, in a burst of emotion which we may not fully understand, and the only word that seems to me to be appropriate for this feeling is joy.
Referring to it as joy may not facilitate its immediate comprehension either, not least because joy is not a concept, nor indeed a word, that we are entirely comfortable with, in the present age. The idea seems out of step with a time whose characteristic notes are mordant and mocking, and whose preferred emotion is irony. Joy hints at an unrestrained enthusiasm which may be thought uncool… It reeks of the Romantic movement. Yet it is there. Being unfashionable has no effect on its existence… What it denotes is a happiness with an overtone of something more, which we might term an elevated or, indeed, a spiritual quality.
A century and a half after Thoreau extolled nature as a form of prayer and an antidote to the smallening of spirit amid the ego-maelstrom we call society — “In the street and in society I am almost invariably cheap and dissipated, my life is unspeakably mean,” he lamented in his journal — McCarthy considers the role of the transcendent feelings nature can stir in us in a secular world:
They are surely very old, these feelings. They are lodged deep in our tissues and emerge to surprise us. For we forget our origins; in our towns and cities, staring into our screens, we need constantly reminding that we have been operators of computers for a single generation and workers in neon-lit offices for three or four, but we were farmers for five hundred generations, and before that hunter-gatherers for perhaps fifty thousand or more, living with the natural world as part of it as we evolved, and the legacy cannot be done away with.
Having devoted eight years of my life to it, and having a heart swelling with gratitude to the legion of writers and artists who contributed original letters and illustrations for this monumental labor of love, I must proudly include A Velocity of Being: Letters to a Young Reader (public library) — a collection of original letters to the children of today and tomorrow about why we read and what books do for the human spirit, composed by 121 of the most interesting and inspiring humans in our world: Jane Goodall, Yo-Yo Ma, Jacqueline Woodson, Ursula K. Le Guin, Mary Oliver, Neil Gaiman, Amanda Palmer, Rebecca Solnit, Elizabeth Gilbert, Shonda Rhimes, Alain de Botton, James Gleick, Anne Lamott, Diane Ackerman, Judy Blume, Eve Ensler, David Byrne, Sylvia Earle, Richard Branson, Daniel Handler, Marina Abramović, Regina Spektor, Elizabeth Alexander, Adam Gopnik, Debbie Millman, Dani Shapiro, Tim Ferriss, Ann Patchett, a 98-year-old Holocaust survivor, Italy’s first woman in space, and many more immensely accomplished and largehearted artists, writers, scientists, philosophers, entrepreneurs, musicians, and adventurers whose character has been shaped by a life of reading.
Accompanying each letter is an original illustration by a prominent artist in response to the text — including beloved children’s book illustrators like Sophie Blackall, Oliver Jeffers, Isabelle Arsenault, Jon Klassen, Shaun Tan, Olivier Tallec, Christian Robinson, Marianne Dubuc, Lisa Brown, Carson Ellis, Mo Willems, Peter Brown, and Maira Kalman.
Because this project was born of a deep concern for the future of books and a love of literature as a pillar of democratic society, we are donating 100% of proceeds from the book to the New York Public Library system in gratitude for its noble work in stewarding literature and democratizing access to the written record of human experience. The gesture is inspired in large part by James Baldwin’s moving recollection of how he used the library to read his way from Harlem to the literary pantheon and Ursula K. Le Guin’s insistence that “a great library is freedom.” (Le Guin is one of four contributors we lost between the outset of the project and its completion, for all of whom their letter is their last published work.)