

Beware the Rise of the Pseudo-Intellectual: Tom Wolfe’s Boston University Commencement Address

“We live in an age in which ideas, important ideas, are worn like articles of fashion.”

Few things bypass our culture’s codified shell of cynicism more elegantly and powerfully than the commencement address — that singular mode of intravenous wisdom-delivery wherein an elder steps onto a stage and plugs straight into what Oscar Wilde called the “temperament of receptivity,” so elusive in all hearts and doubly so in the young. History’s greatest commencement addresses — masterworks like Joseph Brodsky’s “Speech at the Stadium” and David Foster Wallace’s “This Is Water” — deliver not vacant platitudes but hard-earned, life-tested insight into the beliefs, behaviors, and habits of mind that embolden us to live good, rewarding, noble lives.

That is what celebrated writer Tom Wolfe (b. March 2, 1931) delivered when he took the podium at Boston University in 2000 with a magnificent address included in Way More than Luck: Commencement Speeches on Living with Bravery, Empathy, and Other Existential Skills (public library).

Tom Wolfe by Henry Leutwyler

Wolfe begins by putting in perspective the value — the gift — of an education:

As someone who grew up in the Great Depression of the 1930s, I know that a commencement is a family triumph. Forget money. Aside from love, the cardinal virtues, and time, there is no greater gift parents can give a child than an education.

And yet much of the true value of education, Wolfe argues, is being eclipsed by what he calls “pernicious enlightenment” — our idea-fetishism, continually fueled by the challenge of finding wisdom in the age of information, which leads us to mistake surface impressions for substantive understanding. Wolfe writes:

We live in an age in which ideas, important ideas, are worn like articles of fashion — and for precisely the same reason articles of fashion are worn, which is to make the wearer look better and to feel à la mode.

He examines the role of the middle class in the dissemination and uptake of ideas:

The truth is that there is a common bond among all cultures, among all peoples in this world … at least among those who have reached the level of the wheel, the shoe, and the toothbrush. And that common bond is that much-maligned class known as the bourgeoisie — the middle class… They are all over the world, in every continent, every nation, every society, every culture, everywhere you find the wheel, the shoe, and the toothbrush, and wherever they are, all of them believe in the same things. And what are those things? Peace, order, education, hard work, initiative, enterprise, creativity, cooperation, looking out for one another, looking out for the future of children, patriotism, fair play, and honesty. How much more do you want from the human beast? How much more can you possibly expect?

I say that the middle class around the world … is the highest form of evolution. The bourgeoisie! — the human beast doesn’t get any better! The worldwide bourgeoisie makes what passes today for aristocrats — people consumed by juvenility who hang loose upon society — look like shiftless children.

Perhaps with an eye to Virginia Woolf’s legendary rant against the malady of middlebrow, Wolfe notes:

We writers spent the entire twentieth century tearing down the bourgeoisie! … We in the arts have been complicit in the denigration of the best people on earth. Why? Because so many of the most influential ideas of our time are the product of a new creature of the twentieth century, a creature that did not exist until 1898 — and that creature is known as “the intellectual.”

The true enemy of the assimilation of substantive ideas, Wolfe argues, isn’t the middlebrow person but the pseudo-intellectual or, even, the “intellectual” — for anyone who describes himself as an “intellectual” (to say nothing of a “public intellectual”) already implies the “pseudo” by the very act of such self-description. (You know the type — perhaps he has an exaggerated “European accent” of unidentifiable Germanic origin, perhaps he quotes Voltaire excessively, perhaps he slips one too many French words into ordinary speech where a perfectly good English option exists.) Wolfe makes an important distinction:

We must be careful to make a distinction between the intellectual and the person of intellectual achievement. The two are very, very different animals. There are people of intellectual achievement who increase the sum of human knowledge, the powers of human insight, and analysis. And then there are the intellectuals. An intellectual is a person knowledgeable in one field who speaks out only in others. Starting in the early twentieth century, for the first time an ordinary storyteller, a novelist, a short story writer, a poet, a playwright, in certain cases a composer, an artist, or even an opera singer could achieve a tremendous eminence by becoming morally indignant about some public issue. It required no intellectual effort whatsoever. Suddenly he was elevated to a plane from which he could look down upon ordinary people. Conversely — this fascinates me — conversely, if you are merely a brilliant scholar, merely someone who has added immeasurably to the sum of human knowledge and the powers of human insight, that does not qualify you for the eminence of being an intellectual.

Art by Maira Kalman from ‘And the Pursuit of Happiness.’

Having often thought about the role of cynicism in our culture — how we use its self-righteous hubris to mask our insecurity and vulnerability — I find myself nodding vigorously with Wolfe’s observation about the use of “moral indignation” in public discourse:

One of the things that I find really makes it worth watching all the Academy Awards, all the Emmys, all those awards ceremonies, is to see how today’s actors and television performers have discovered the formula. If you become indignant, this elevates you to the plane of “intellectual.” No mental activity is required. It is a rule, to which there has never been an exception, that when an actor or a television performer rises up to the microphone at one of these awards ceremonies and expresses moral indignation over something, he illustrates Marshall McLuhan’s dictum that “moral indignation is a standard strategy for endowing the idiot with dignity.”

Wolfe leaves graduates with a clarion call for cultivating the critical discernment necessary for making up one’s own mind in the face of such wearable intellectualism:

You’re not going to find many traditional judges who can lead you any longer, since they now wander helplessly, bemused by the willful ignorance of that bizarre twentieth-century organism, the intellectual. You’re going to have to make the crucial judgments yourselves. But you are among the very handful of those who can do it.

Way More than Luck, which also includes advice from Bradley Whitford, Debbie Millman, Nora Ephron, David Foster Wallace, and Jonathan Safran Foer, is an elevating read in its entirety. Complement it with this evolving archive of the greatest commencement addresses of all time, then revisit Carl Sagan’s Baloney Detection Kit for critical thinking.


This Idea Must Die: Some of the World’s Greatest Thinkers Each Select a Major Misconception Holding Us Back

From the self to left brain vs. right brain to romantic love, a catalog of broken theories that hold us back from the conquest of Truth.

“To kill an error is as good a service as, and sometimes even better than, the establishing of a new truth or fact,” asserted Charles Darwin in one of the eleven rules for critical thinking known as Prospero’s Precepts. If science and human knowledge progress in leaps and bounds of ignorance, then the recognition of error and the transcendence of falsehood are the springboard for the leaps of progress. That’s the premise behind This Idea Must Die: Scientific Theories That Are Blocking Progress (public library) — a compendium of answers Edge founder John Brockman collected by posing his annual question, “What scientific idea is ready for retirement?”, to 175 of the world’s greatest scientists, philosophers, and writers. Among them are Nobel laureates, MacArthur geniuses, and celebrated minds like theoretical physicist and mathematician Freeman Dyson, biological anthropologist Helen Fisher, cognitive scientist and linguist Steven Pinker, media theorist Douglas Rushkoff, philosopher Rebecca Newberger Goldstein, psychologist Howard Gardner, social scientist and technology scholar Sherry Turkle, actor and author Alan Alda, futurist and Wired founding editor Kevin Kelly, and novelist, essayist, and screenwriter Ian McEwan.

Brockman paints the backdrop for the inquiry:

Science advances by discovering new things and developing new ideas. Few truly new ideas are developed without abandoning old ones first. As theoretical physicist Max Planck (1858–1947) noted, “A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.” In other words, science advances by a series of funerals.

Many of the answers are redundant — but this is a glorious feature rather than a bug of Brockman’s series, for its chief reward is precisely this cumulative effect of discerning the zeitgeist of ideas with which some of our era’s greatest minds are tussling in synchronicity. They point to such retirement-ready ideas as IQ, the self, race, the left brain vs. right brain divide, human nature and essentialism, free will, and even science itself. What emerges is the very thing Carl Sagan deemed vital to truth in his Baloney Detection Kit — a “substantive debate on the evidence by knowledgeable proponents of all points of view.”

Illustration by Lizi Boyd from ‘Flashlight.’

One of the most profound undercurrents across the answers has to do with our relationship with knowledge, certainty, and science itself. And one of the most profound contributions in that regard comes from MacArthur fellow Rebecca Newberger Goldstein, a philosopher who thinks deeply and dimensionally about some of the most complex questions of existence. Assailing the idea that science makes philosophy obsolete — that science is the transformation of “philosophy’s vagaries into empirically testable theories” and philosophy merely the “cold-storage room in which questions are shelved until the sciences get around to handling them” — Goldstein writes:

The obsolescence of philosophy is often taken to be a consequence of science. After all, science has a history of repeatedly inheriting — and definitively answering — questions over which philosophers have futilely hemmed and hawed for unconscionable amounts of time.

The gravest problem with this theory, Goldstein notes, is its internal incoherence:

You can’t argue for science making philosophy obsolete without indulging in philosophical arguments… When pressed for an answer to the so-called demarcation problem, scientists almost automatically reach for the notion of “falsifiability” first proposed by Karl Popper. His profession? Philosophy. But whatever criterion you offer, its defense is going to implicate you in philosophy.

This is something that Dorion Sagan, Carl Sagan’s son, has previously addressed, but Goldstein brings to it unparalleled elegance of thought and eloquence of expression:

A triumphalist scientism needs philosophy to support itself. And the lesson here should be generalized. Philosophy is joined to science in reason’s project. Its mandate is to render our views and our attitudes maximally coherent.

In doing so, she argues, philosophy provides “the reasoning that science requires in order to claim its image as descriptive.” As a proponent of the vital difference between information and wisdom — the former being the material of science, the latter the product of philosophy, and knowledge the change agent that transmutes one into the other — I find the provocative genius of Goldstein’s conclusion enormously invigorating:

What idea should science retire? The idea of “science” itself. Let’s retire it in favor of the more inclusive “knowledge.”

Neuroscientist Sam Harris, author of the indispensable Waking Up: A Guide to Spirituality Without Religion, echoes this by choosing our narrow definition of “science” as the idea to be put to rest:

Search your mind, or pay attention to the conversations you have with other people, and you’ll discover that there are no real boundaries between science and philosophy — or between those disciplines and any other that attempts to make valid claims about the world on the basis of evidence and logic. When such claims and their methods of verification admit of experiment and/or mathematical description, we tend to say our concerns are “scientific”; when they relate to matters more abstract, or to the consistency of our thinking itself, we often say we’re being “philosophical”; when we merely want to know how people behaved in the past, we dub our interests “historical” or “journalistic”; and when a person’s commitment to evidence and logic grows dangerously thin or simply snaps under the burden of fear, wishful thinking, tribalism, or ecstasy, we recognize that he’s being “religious.”

The boundaries between true intellectual disciplines are currently enforced by little more than university budgets and architecture… The real distinction we should care about — the observation of which is the sine qua non of the scientific attitude — is between demanding good reasons for what one believes and being satisfied with bad ones.

In a sentiment that calls to mind both Richard Feynman’s spectacular ode to a flower and Carl Sagan’s enduring wisdom on our search for meaning, Harris applies this model of knowledge to one of the great mysteries of science and philosophy — consciousness:

Even if one thinks the human mind is entirely the product of physics, the reality of consciousness becomes no less wondrous, and the difference between happiness and suffering no less important. Nor does such a view suggest that we’ll ever find the emergence of mind from matter fully intelligible; consciousness may always seem like a miracle. In philosophical circles, this is known as “the hard problem of consciousness” — some of us agree that this problem exists, some of us don’t. Should consciousness prove conceptually irreducible, remaining the mysterious ground for all we can conceivably experience or value, the rest of the scientific worldview would remain perfectly intact.

The remedy for all this confusion is simple: We must abandon the idea that science is distinct from the rest of human rationality. When you are adhering to the highest standards of logic and evidence, you are thinking scientifically. And when you’re not, you’re not.

Illustration from ‘Once Upon an Alphabet’ by Oliver Jeffers.

Psychologist Susan Blackmore, who studies this very problem — famously termed “the hard problem of consciousness” by philosopher David Chalmers in 1996 — benches the idea that there are neural correlates to consciousness. Tempting as it may be to interpret neural activity as the wellspring of that special something we call “consciousness” or “subjective experience,” as opposed to the “unconscious” rest of the brain, Blackmore admonishes that such dualism is past its cultural expiration date:

Dualist thinking comes naturally to us. We feel as though our conscious experiences were of a different order from the physical world. But this is the same intuition that leads to the hard problem seeming hard. It’s the same intuition that produces the philosopher’s zombie — a creature identical to me in every way except that it has no consciousness. It’s the same intuition that leads people to write, apparently unproblematically, about brain processes being either conscious or unconscious… Intuitively plausible as it is, this is a magic difference. Consciousness is not some weird and wonderful product of some brain processes but not others. Rather, it’s an illusion constructed by a clever brain and body in a complex social world. We can speak, think, refer to ourselves as agents, and so build up the false idea of a persisting self that has consciousness and free will.

Much of the allure of identifying such neural correlates of consciousness, Blackmore argues, lies in cultural mythologies rooted in fantasy rather than fact:

While people are awake they must always be conscious of something or other. And that leads along the slippery path to the idea that if we knew what to look for, we could peer inside someone’s brain and find out which processes were the conscious ones and which the unconscious ones. But this is all nonsense. All we’ll ever find are the neural correlates of thoughts, perceptions, memories, and the verbal and attentional processes that lead us to think we’re conscious.

When we finally have a better theory of consciousness to replace these popular delusions, we’ll see that there’s no hard problem, no magic difference, and no NCCs.

Illustration by Rob Hunter from ‘A Graphic Cosmogony.’

In a related grievance, social psychologist Bruce Hood — author of the uncomfortable yet strangely comforting The Self Illusion — does away with the notion of the self. Half a century after Alan Watts enlisted Eastern philosophy in this mission, Hood presents a necessary integration of science and philosophy:

It seems almost redundant to call for the retirement of the free willing self, as the idea is neither scientific nor is this the first time the concept has been dismissed for lack of empirical support. The self did not have to be discovered; it’s the default assumption most of us experience, so it wasn’t really revealed by methods of scientific inquiry.

[…]

Yet the self, like a conceptual zombie, refuses to die. It crops up again and again in recent theories of decision making, as an entity with free will which can be depleted. It reappears as an interpreter in cognitive neuroscience, as able to integrate parallel streams of information arising from separable neural substrates. Even if these appearances of the self are understood to be convenient ways of discussing the emergent output of multiple parallel processes, students of the mind continue to implicitly endorse the idea that there’s a decision maker, an experiencer, a point of origin.

We know the self is constructed because it can be so easily deconstructed — through damage, disease, and drugs. It must be an emergent property of a parallel system processing input, output, and internal representations. It’s an illusion because it feels so real, but that experience is not what it seems. The same is true for free will. Although we can experience the mental anguish of making a decision… the choices and decisions we make are based on situations that impose on us. We don’t have the free will to choose the experiences that have shaped our decisions.

[…]

By abandoning the free willing self, we’re forced to reexamine the factors that are truly behind our thoughts and behavior and the way they interact, balance, override, and cancel out. Only then will we begin to make progress in understanding how we really operate.

Illustration by Ben Newman from ‘A Graphic Cosmogony.’

Among the most provocative answers, in fact, is one examining the factors that underlie one of the most complex and seemingly human of our experiences: love. Biological anthropologist Helen Fisher, who studies the brain on love, points to romantic love and addiction as two concepts in need of serious reformulation and reframing — one best accomplished by understanding the intersection of the two. Fisher argues that we ought to broaden the definition of addiction and do away with science’s staunch notion that all addiction is harmful. Love, she argues with a wealth of neurobiological evidence in hand, is in fact a state that closely resembles that of addiction in terms of what happens in the brain during it — and yet love, anguishing as it may be at times, is universally recognized as the height of positive experience. In that respect, it presents a case of “positive addiction.” Fisher writes:

Love-besotted men and women show all the basic symptoms of addiction. Foremost, the lover is stiletto-focused on his/her drug of choice, the love object. The lover thinks obsessively about him or her (intrusive thinking), and often compulsively calls, writes, or stays in touch. Paramount in this experience is intense motivation to win one’s sweetheart, not unlike the substance abuser fixated on the drug. Impassioned lovers distort reality, change their priorities and daily habits to accommodate the beloved, experience personality changes (affect disturbance), and sometimes do inappropriate or risky things to impress this special other. Many are willing to sacrifice, even die for, “him” or “her.” The lover craves emotional and physical union with the beloved (dependence). And like addicts who suffer when they can’t get their drug, the lover suffers when apart from the beloved (separation anxiety). Adversity and social barriers even heighten this longing (frustration attraction).

In fact, besotted lovers express all four of the basic traits of addiction: craving, tolerance, withdrawal, and relapse. They feel a “rush” of exhilaration when they’re with their beloved (intoxication). As their tolerance builds, they seek to interact with the beloved more and more (intensification). If the love object breaks off the relationship, the lover experiences signs of drug withdrawal, including protest, crying spells, lethargy, anxiety, insomnia or hypersomnia, loss of appetite or binge eating, irritability, and loneliness. Lovers, like addicts, also often go to extremes, sometimes doing degrading or physically dangerous things to win back the beloved. And lovers relapse the way drug addicts do. Long after the relationship is over, events, people, places, songs, or other external cues associated with their abandoning sweetheart can trigger memories and renewed craving.

Fisher points to fMRI studies that have shown intense romantic love to trigger the brain’s reward system and the dopamine pathways responsible for “energy, focus, motivation, ecstasy, despair, and craving,” as well as the brain regions most closely associated with addiction and substance abuse. In shedding light on the neurochemical machinery of romantic love, Fisher argues, science reveals it to be a “profoundly powerful, natural, often positive addiction.”

Illustration by Christine Rösch from ‘The Mathematics of Love.’

Astrophysicist Marcelo Gleiser, who has written beautifully about the necessary duality of knowledge and mystery, wants to do away with “the venerable notion of Unification.” He points out that smaller acts of unification and simplification are core to the scientific process — from the laws of thermodynamics to Newton’s law of universal gravity — but simplification as sweeping as reducing the world to a single Theory of Everything is misplaced:

The trouble starts when we take this idea too far and search for the Über-unification, the Theory of Everything, the arch-reductionist notion that all forces of nature are merely manifestations of a single force. This is the idea that needs to go.

Noting that at some point along the way, “math became equated with beauty and beauty with truth,” Gleiser writes:

The impulse to unify it all runs deep in the souls of mathematicians and theoretical physicists, from the Langlands program to superstring theory. But here’s the rub: Pure mathematics isn’t physics. The power of mathematics comes precisely from its detachment from physical reality. A mathematician can create any universe she wants and play all sorts of games with it. A physicist can’t; his job is to describe nature as we perceive it. Nevertheless, the unification game has been an integral part of physics since Galileo and has produced what it should: approximate unifications.

And yet this unification game, as integral as it may be to science, is also antithetical to it in the long run:

The scientific impulse to unify is crypto-religious… There’s something deeply appealing in equating all of nature to a single creative principle: To decipher the “mind of God” is to be special, is to answer to a higher calling. Pure mathematicians who believe in the reality of mathematical truths are monks of a secret order, open only to the initiated. In the case of high energy physics, all unification theories rely on sophisticated mathematics related to pure geometric structures: The belief is that nature’s ultimate code exists in the ethereal world of mathematical truths and that we can decipher it.

Echoing Richard Feynman’s spectacular commencement address admonishing against “cargo cult science,” Gleiser adds:

Recent experimental data has been devastating to such belief — no trace of supersymmetric particles, of extra dimensions, or of dark matter of any sort, all long-awaited signatures of unification physics. Maybe something will come up; to find, we must search. The trouble with unification in high energy physics is that you can always push it beyond the experimental range. “The Large Hadron Collider got to 7 TeV and found nothing? No problem! Who said nature should opt for the simplest versions of unification? Maybe it’s all happening at much higher energies, well beyond our reach.”

There’s nothing wrong with this kind of position. You can believe it until you die, and die happy. Or you can conclude that what we do best is construct approximate models of how nature works and that the symmetries we find are only descriptions of what really goes on. Perfection is too hard a burden to impose on nature.

People often see this kind of argument as defeatist, as coming from someone who got frustrated and gave up. (As in “He lost his faith.”) Big mistake. To search for simplicity is essential to what scientists do. It’s what I do. There are essential organizing principles in nature, and the laws we find are excellent ways to describe them. But the laws are many, not one. We’re successful pattern-seeking rational mammals. That alone is cause for celebration. However, let’s not confuse our descriptions and models with reality. We may hold perfection in our mind’s eye as a sort of ethereal muse. Meanwhile nature is out there doing its thing. That we manage to catch a glimpse of its inner workings is nothing short of wonderful. And that should be good enough.

Ceramic tile by Debbie Millman courtesy of the artist

Science writer Amanda Gefter takes issue with one particular manifestation of our propensity for oversimplification — the notion of the universe. She writes:

Physics has a time-honored tradition of laughing in the face of our most basic intuitions. Einstein’s relativity forced us to retire our notions of absolute space and time, while quantum mechanics forced us to retire our notions of pretty much everything else. Still, one stubborn idea has stood steadfast through it all: the universe.

[…]

In recent years, however, the concept of a single shared spacetime has sent physics spiraling into paradox. The first sign that something was amiss came from Stephen Hawking’s landmark work in the 1970s showing that black holes radiate and evaporate, disappearing from the universe and purportedly taking some quantum information with them. Quantum mechanics, however, is predicated upon the principle that information can never be lost.

Gefter points to recent breakthroughs in physics that produced one particularly puzzling such paradox, known as the “firewall paradox,” solved by the idea that spacetime is divided not by horizons but by the reference frames of the observers, “as if each observer had his or her own universe.”

But the solution isn’t a multiverse theory:

Yes, there are multiple observers, and yes, any observer’s universe is as good as any other’s. But if you want to stay on the right side of the laws of physics, you can talk only about one at a time. Which means, really, that only one exists at a time. It’s cosmic solipsism.

Here, psychology, philosophy, and cosmology converge, for what such theories suggest is what we already know about the human psyche — as I’ve put it elsewhere, the stories that we tell ourselves, whether they be false or true, are always real. Gefter concludes:

Adjusting our intuitions and adapting to the strange truths uncovered by physics is never easy. But we may just have to come around to the notion that there’s my universe and there’s your universe — but there’s no such thing as the universe.

Biological anthropologist Nina Jablonski points to the notion of race as urgently retirement-ready. Pointing out that it has always been a “vague and slippery concept,” she traces its origins to Hume and Kant — the first to divide humanity into geographic groupings called “races” — and the pseudoscientific seeds of racism this division planted:

Skin color, as the most noticeable racial characteristic, was associated with a nebulous assemblage of opinions and hearsay about the inherent natures of the various races. Skin color stood for morality, character, and the capacity for civilization; it became a meme.

Even though the atrocious “race science” that emerged in the 19th and early 20th century didn’t hold up — whenever scientists looked for actual sharp boundaries between groups, none came up — and race came to be something people identify themselves with as a shared category of experiences and social bonds, Jablonski argues that the toxic aftershocks of pseudoscience still poison culture:

Even after it has been shown that many diseases (adult-onset diabetes, alcoholism, high blood pressure, to name a few) show apparent racial patterns because people share similar environmental conditions, groupings by race are maintained. The use of racial self-categorization in epidemiological studies is defended and even encouraged. Medical studies of health disparities between “races” become meaningless when sufficient variables — such as differences in class, ethnic social practices, and attitudes — are taken into account.

Half a century after the ever-prescient Margaret Mead made the same point, Jablonski urges:

Race has a hold on history but no longer has a place in science. The sheer instability and potential for misinterpretation render race useless as a scientific concept. Inventing new vocabularies to deal with human diversity and inequity won’t be easy, but it must be done.

Psychologist Jonathan Gottschall, who has previously explored why storytelling is so central to the human experience, argues against the notion that there can be no science of art. With an eye to our civilization’s long struggle to define art, he writes:

We don’t even have a good definition, in truth, for what art is. In short, there’s nothing so central to human life that’s so incompletely understood.

Granted, Gottschall is only partly right, for there are some excellent definitions of art — take, for instance, Jeanette Winterson’s or Leo Tolstoy’s — but the fact that they don’t come from scientists only speaks to his larger point. He argues that rather than being unfit to shed light on the role of art in human life, science simply hasn’t applied itself to the problem adequately:

Scientific work in the humanities has mainly been scattered, preliminary, and desultory. It doesn’t constitute a research program.

If we want better answers to fundamental questions about art, science must jump into the game with both feet. Going it alone, humanities scholars can tell intriguing stories about the origins and significance of art, but they don’t have the tools to patiently winnow the field of competing ideas. That’s what the scientific method is for — separating the more accurate stories from the less accurate stories. But a strong science of art will require both the thick, granular expertise of humanities scholars and the clever hypothesis-testing of scientists. I’m not calling for a scientific takeover of the arts, I’m calling for a partnership.

[…]

The Delphic admonition “Know thyself” still rings out as the great prime directive of intellectual inquiry, and there will always be a gaping hole in human self-knowledge until we develop a science of art.

In a further testament to the zeitgeist-illuminating nature of the project, actor, author, and science-lover Alan Alda makes a passionate case for the same concept:

The trouble with truth is that not only is the notion of eternal, universal truth highly questionable, but simple, local truths are subject to refinement as well. Up is up and down is down, of course. Except under special circumstances. Is the North Pole up and the South Pole down? Is someone standing at one of the poles right-side up or upside-down? Kind of depends on your perspective.

When I studied how to think in school, I was taught that the first rule of logic was that a thing cannot both be and not be at the same time and in the same respect. That last note, “in the same respect,” says a lot. As soon as you change the frame of reference, you’ve changed the truthiness of a once immutable fact.

[…]

This is not to say that nothing is true or that everything is possible — just that it might not be so helpful for things to be known as true for all time, without a disclaimer… I wonder — and this is just a modest proposal — whether scientific truth should be identified in a way acknowledging that it’s something we know and understand for now, and in a certain way.

[…]

Facts, it seems to me, are workable units, useful in a given frame or context. They should be as exact and irrefutable as possible, tested by experiment to the fullest extent. When the frame changes, they don’t need to be discarded as untrue but respected as still useful within their domain. Most people who work with facts accept this, but I don’t think the public fully gets it.

That’s why I hope for more wariness about implying we know something to be true or false for all time and for everywhere in the cosmos.

Illustration from ‘Once Upon an Alphabet’ by Oliver Jeffers.

And indeed this elasticity of truth across time is at the heart of what I find to be the most beautiful and culturally essential contribution to the collection. As someone who believes that the stewardship of enduring ideas is at least as important as the genesis of new ones — not only because past ideas are the combinatorial building blocks of future ones but also because in order to move forward we always need a backdrop against which to paint the contrast of progress and improvement — I was most bewitched by writer Ian McEwan’s admonition against the arrogance of retiring any idea as an impediment to progress:

Beware of arrogance! Retire nothing! A great and rich scientific tradition should hang onto everything it has. Truth is not the only measure. There are ways of being wrong that help others to be right. Some are wrong, but brilliantly so. Some are wrong but contribute to method. Some are wrong but help found a discipline. Aristotle ranged over the whole of human knowledge and was wrong about much. But his invention of zoology alone was priceless. Would you cast him aside? You never know when you might need an old idea. It could rise again one day to enhance a perspective the present cannot imagine. It would not be available to us if it were fully retired.

To appreciate McEwan’s point, one need only look at something like Bertrand Russell’s timely thoughts on boredom, penned in 1930 and yet astoundingly resonant with our present anxieties about the societal side effects of current technology. McEwan captures this beautifully:

Every last serious and systematic speculation about the world deserves to be preserved. We need to remember how we got to where we are, and we’d like the future not to retire us. Science should look to literature and maintain a vibrant living history as a monument to ingenuity and persistence. We won’t retire Shakespeare. Nor should we Bacon.

Complement This Idea Must Die, the entirety of which weaves a mind-stretching mesh of complementary and contradictory perspectives on our relationship with knowledge, with some stimulating answers to previous editions of Brockman’s annual question, exploring the only thing worth worrying about (2013), the single most elegant theory of how the world works (2012), and the best way to make ourselves smarter (2011).


Self-Refinement Through the Wisdom of the Ages: New Year’s Resolutions from Some of Humanity’s Greatest Minds

Enduring ideas for personal refinement from Seneca, Thoreau, Virginia Woolf, Carl Sagan, Alan Watts, Emerson, Bruce Lee, Maya Angelou, and more.

At the outset of each new year, humanity sets out to better itself as we resolve to eradicate our unhealthy habits and cultivate healthy ones. But while the most typical New Year’s resolutions tend to be about bodily health, the most meaningful ones aim at a deeper kind of health through the refinement of our mental, spiritual, and emotional habits — which often dictate our physical ones. In a testament to young Susan Sontag’s belief that rereading is an act of rebirth, I have revisited the timelessly rewarding ideas of great thinkers from the past two millennia to cull fifteen such higher-order resolutions for personal refinement.

1. THOREAU: WALK AND BE MORE PRESENT

No one has made a more compelling case for the bodily and spiritual value of walking — that basic, infinitely rewarding, yet presently endangered human activity — than Henry David Thoreau. In his 1861 treatise Walking (free ebook | public library), penned seven years after Walden, Thoreau reminds us of how that primal act of mobility connects us with our essential wildness, that spring of spiritual vitality methodically dried up by our sedentary civilization. He makes a special point of differentiating the art of sauntering from the mere act of walking:

I have met with but one or two persons in the course of my life who understood the art of Walking, that is, of taking walks — who had a genius, so to speak, for sauntering, which word is beautifully derived “from idle people who roved about the country, in the Middle Ages, and asked charity, under pretense of going à la Sainte Terre,” to the Holy Land, till the children exclaimed, “There goes a Sainte-Terrer,” a Saunterer, a Holy-Lander. They who never go to the Holy Land in their walks, as they pretend, are indeed mere idlers and vagabonds; but they who do go there are saunterers in the good sense, such as I mean. Some, however, would derive the word from sans terre, without land or a home, which, therefore, in the good sense, will mean, having no particular home, but equally at home everywhere. For this is the secret of successful sauntering. He who sits still in a house all the time may be the greatest vagrant of all; but the saunterer, in the good sense, is no more vagrant than the meandering river, which is all the while sedulously seeking the shortest course to the sea.

Proclaiming that “every walk is a sort of crusade,” Thoreau laments — note, a century and a half before our present sedentary society — our growing civilizational tameness, which has possessed us to cease undertaking “persevering, never-ending enterprises” so that even “our expeditions are but tours.” With a dramatic flair, he lays out the spiritual conditions required of the true walker:

If you are ready to leave father and mother, and brother and sister, and wife and child and friends, and never see them again — if you have paid your debts, and made your will, and settled all your affairs, and are a free man — then you are ready for a walk.

[…]

No wealth can buy the requisite leisure, freedom, and independence which are the capital in this profession… It requires a direct dispensation from Heaven to become a walker.

But the passage that I keep coming back to as I face the modern strain for presence in the age of productivity, 150 years later, is this:

I am alarmed when it happens that I have walked a mile into the woods bodily, without getting there in spirit. In my afternoon walk I would fain forget all my morning occupations and my obligations to Society. But it sometimes happens that I cannot easily shake off the village. The thought of some work will run in my head and I am not where my body is — I am out of my senses. In my walks I would fain return to my senses. What business have I in the woods, if I am thinking of something out of the woods?

Read more here.

2. VIRGINIA WOOLF: KEEP A DIARY

Many celebrated writers have extolled the creative benefits of keeping a diary, but none more convincingly than Virginia Woolf, who was not only a masterful letter-writer and little-known children’s book author, but also a dedicated diarist. Although she kept some sporadic early journals, Woolf didn’t begin serious journaling until 1915, when she was 33. Once she did, she continued doggedly until her last entry in 1941, four days before her death, leaving behind 26 volumes written in her own hand. More than a mere tool of self-exploration, however, Woolf approached the diary as a kind of R&D lab for her craft. As her husband observes in the introduction to her collected journals, A Writer’s Diary (public library), Woolf’s journaling was “a method of practicing or trying out the art of writing.”

In an entry from April 20th, 1919, Woolf makes a case for the vast benefits of keeping a diary as a tool for refining one’s writing style — something Joan Didion echoed nearly half a century later in her timeless essay on keeping a notebook — and considers the optimal approach to journaling:

The habit of writing thus for my own eye only is good practice. It loosens the ligaments… What sort of diary should I like mine to be? Something loose knit and yet not slovenly, so elastic that it will embrace anything, solemn, slight or beautiful that comes into my mind. I should like it to resemble some deep old desk, or capacious hold-all, in which one flings a mass of odds and ends without looking them through. I should like to come back, after a year or two, and find that the collection had sorted itself and refined itself and coalesced, as such deposits so mysteriously do, into a mould, transparent enough to reflect the light of our life, and yet steady, tranquil compounds with the aloofness of a work of art. The main requisite, I think on re-reading my old volumes, is not to play the part of censor, but to write as the mood comes or of anything whatever; since I was curious to find how I went for things put in haphazard, and found the significance to lie where I never saw it at the time.

Woolf also regards the diary as a potent autobiographical tool — one essential in the face of how woefully our present selves shortchange our future happiness. In an entry from January 20th, 1919, written days before her 37th birthday, Woolf considers the utility of the diaries to her future self, noting with equal parts sharp self-awareness and near-comic self-consciousness her own young-person’s perception of 50 as an “elderly” age:

I have just re-read my year’s diary and am much struck by the rapid haphazard gallop at which it swings along, sometimes indeed jerking almost intolerably over the cobbles. Still if it were not written rather faster than the fastest type-writing, if I stopped and took thought, it would never be written at all; and the advantage of the method is that it sweeps up accidentally several stray matters which I should exclude if I hesitated, but which are the diamonds of the dustheap. If Virginia Woolf at the age of 50, when she sits down to build her memoirs out of these books, is unable to make a phrase as it should be made, I can only condole with her and remind her of the existence of the fireplace, where she has my leave to burn these pages to so many black films with red eyes in them. But how I envy her the task I am preparing for her! There is none I should like better. Already my 37th birthday next Saturday is robbed of some of its terrors by the thought. Partly for the benefit of this elderly lady (no subterfuges will then be possible: 50 is elderly, though I anticipate her protest and agree that it is not old) partly to give the year a solid foundation I intend to spend the evenings of this week of captivity in making out an account of my friendships and their present condition, with some account of my friends’ characters; and to add an estimate of their work and a forecast of their future works. The lady of 50 will be able to say how near to the truth I come.

Read more here, then see other writers make the same case.

3. SENECA: MAKE YOUR LIFE WIDE RATHER THAN LONG

Around the time Thoreau was bemoaning his mind’s tendency to roam out of the woods while his body sauntered through them, in another part of the world Kierkegaard was making a similar lament about our greatest source of unhappiness — the refusal to recognize that “busy is a decision” and that presence is infinitely more rewarding than productivity. I frequently worry that being productive is the surest way to lull ourselves into a trance of passivity, and that busyness is the greatest distraction from living, as we coast through our lives day after day, showing up for our obligations but being absent from our selves, mistaking the doing for the being.

Despite a steadily swelling human life expectancy, these concerns seem more urgent than ever — and yet they are hardly unique to our age. In fact, they go as far back as the record of human experience and endeavor. It is unsurprising, then, that the best treatment of the subject is also among the oldest: Roman philosopher Seneca’s spectacular 2,000-year-old treatise On the Shortness of Life (public library) — a poignant reminder of what we so deeply intuit yet so easily forget and so chronically fail to put into practice.

Seneca writes:

It is not that we have a short time to live, but that we waste a lot of it. Life is long enough, and a sufficiently generous amount has been given to us for the highest achievements if it were all well invested. But when it is wasted in heedless luxury and spent on no good activity, we are forced at last by death’s final constraint to realize that it has passed away before we knew it was passing. So it is: we are not given a short life but we make it short, and we are not ill-supplied but wasteful of it… Life is long if you know how to use it.

To those who so squander their time, he offers an unambiguous admonition:

You are living as if destined to live for ever; your own frailty never occurs to you; you don’t notice how much time has already passed, but squander it as though you had a full and overflowing supply — though all the while that very day which you are devoting to somebody or something may be your last. You act like mortals in all that you fear, and like immortals in all that you desire… How late it is to begin really to live just when life must end! How stupid to forget our mortality, and put off sensible plans to our fiftieth and sixtieth years, aiming to begin life from a point at which few have arrived!

The cure he prescribes is rather simple, yet far from easy to enact:

Putting things off is the biggest waste of life: it snatches away each day as it comes, and denies us the present by promising the future. The greatest obstacle to living is expectancy, which hangs upon tomorrow and loses today. You are arranging what lies in Fortune’s control, and abandoning what lies in yours. What are you looking at? To what goal are you straining? The whole future lies in uncertainty: live immediately.

Read more about how to fill the length of your life with vibrant width here.

4. ANNA DEAVERE SMITH: DEFINE YOURSELF

A great many creators have spoken to the power of discipline, or what psychologists now call “grit,” in setting apart those who succeed from those who fail at their endeavor of choice — including Tchaikovsky (“A self-respecting artist must not fold his hands on the pretext that he is not in the mood.”), Chuck Close (“Inspiration is for amateurs — the rest of us just show up and get to work.”), Anthony Trollope (“My belief of book writing is much the same as my belief as to shoemaking. The man who will work the hardest at it, and will work with the most honest purpose, will work the best.”), and E.B. White (“A writer who waits for ideal conditions under which to work will die without putting a word on paper.”). How to master the elusive art of discipline is what beloved artist, actor, playwright, and educator Anna Deavere Smith outlines in one of the missives in her immeasurably insightful and useful compendium Letters to a Young Artist: Straight-up Advice on Making a Life in the Arts for Actors, Performers, Writers, and Artists of Every Kind (public library).

Smith writes:

Discipline — both mental and physical — is crucial.

She recounts an encounter with Melvin van Peebles, a black filmmaker who made a smash-hit independent film in the seventies that earned him a lot of money and cultural status. Van Peebles’s son, Mario, had made a film about his father’s film, a screening of which Smith hosted. She writes:

He must be in his mid-sixties, and he is in perfect physical shape. He was standing by the bar, and I asked him not about the film but about his physique.

“You look like you work out,” I said.

“Every day,” he said.

People who actually work out every single day have no problem talking about it. He and I agreed that we have to get up and go immediately to the gym, the pool, wherever our workout is, without doing anything before.

“If I get up and think, ‘Let me have a cup of coffee first,’ it ain’t happ’nin’,” he said.

Not even a cup of coffee. I’m the same way. If I go to the computer or take a newspaper before heading to the gym, there’s a chance I won’t get there.

As someone who has been working out every single morning for the past fifteen years, I wholeheartedly, wholebodily agree. I do a great deal of my reading at the gym, too, including this particular book itself — there’s something powerful about the alignment of two disciplines, of body and mind, in the same routine. The two rhythms reinforce one another.

More than that, however, Smith argues that discipline is also the single most important anchor of identity for creative people — the essential material out of which they craft the building blocks of how they define themselves:

The life of an artist is not a state of “being.” It even sounds pretentious, sometimes, to call oneself blanketly “an artist.” It’s not up to you or me to give ourselves that title. A doctor becomes a doctor because he or she is formally given an MD. A scholar in the university is formally given a PhD, a counselor an LLD, a hairstylist a license, and so forth.

We are on the fringe, and we don’t get such licenses. There are prizes and rewards, popularity and good or bad press. But you have to be your own judge. That, in and of itself, takes discipline, and clarity, and objectivity. Given the fact that we are not “credentialed” by any institution that even pretends to be objective, it is harder to make our guild. True, some schools and universities give a degree for a course of study. But that’s a business transaction and ultimately not enough to make you an “artist.”

Because an artist is never hit with the magic wand of legitimacy from the outside and “you have to hit your own head with your own handmade wand,” creative people are singularly vulnerable every time they put their art — whatever its nature — into the world. Without the shield of, say, a Ph.D. to point to and say, “But look, I’m real,” it’s all too easy to hang our merit and worth and realness on the opinions of others — opinions often mired in their own insecurities and vulnerabilities, which at the most malignant extreme manifest as people’s tendency to make themselves feel big by making others feel small, make themselves feel real by making others feel unreal. Smith captures the paradox of this condition elegantly:

We who work in the arts are at the risk of being in a popularity contest rather than a profession. If that fact causes you despair, you should probably pick another profession. Your desire to communicate must be bigger than your relationship to these chaotic and unfair realities. Ideally, we must be even more “professional” than lawyers, doctors, accountants, hairdressers. We have to create our own standards of discipline.

All of the successful artists I know are very disciplined and very organized. Even if they don’t look organized, they have their own order.

[…]

What we become — what we are — ultimately consists of what we have been doing.

Read more on how to cultivate that discipline here.

5. ALAN WATTS: BREAK FREE FROM YOUR EGO

During the 1950s and 1960s, British philosopher and writer Alan Watts began popularizing Eastern philosophy in the West, offering a wholly different perspective on inner wholeness in the age of anxiety and what it really means to live a life of purpose. We owe much of today’s mainstream adoption of practices like yoga and meditation to Watts’s influence. His 1966 masterwork The Book: On the Taboo Against Knowing Who You Are (public library) builds upon his indispensable earlier work as Watts argues with equal parts conviction and compassion that “the prevalent sensation of oneself as a separate ego enclosed in a bag of skin is a hallucination which accords neither with Western science nor with the experimental philosophy-religions of the East.” He explores the cause and cure of that illusion in a way that flows from profound unease as we confront our cultural conditioning into a deep sense of lightness as we surrender to the comforting mystery and interconnectedness of the universe.

Envisioned as a packet of essential advice a parent might hand down to his child on the brink of adulthood as initiation into the central mystery of life, this existential manual is rooted in what Watts calls “a cross-fertilization of Western science with an Eastern intuition.”

Watts considers the singular anxiety of the age, perhaps even more resonant today, half a century and a manic increase of pace later:

There is a growing apprehension that existence is a rat-race in a trap: living organisms, including people, are merely tubes which put things in at one end and let them out at the other, which both keeps them doing it and in the long run wears them out.

At the heart of the human condition, Watts argues, is a core illusion that fuels our deep-seated sense of loneliness the more we subscribe to the myth of the sole ego, one reflected in the most basic language we use to make sense of the world:

We suffer from a hallucination, from a false and distorted sensation of our own existence as living organisms. Most of us have the sensation that “I myself” is a separate center of feeling and action, living inside and bounded by the physical body — a center which “confronts” an “external” world of people and things, making contact through the senses with a universe both alien and strange. Everyday figures of speech reflect this illusion. “I came into this world.” “You must face reality.” “The conquest of nature.”

This feeling of being lonely and very temporary visitors in the universe is in flat contradiction to everything known about man (and all other living organisms) in the sciences. We do not “come into” this world; we come out of it, as leaves from a tree. As the ocean “waves,” the universe “peoples.” Every individual is an expression of the whole realm of nature, a unique action of the total universe. This fact is rarely, if ever, experienced by most individuals. Even those who know it to be true in theory do not sense or feel it, but continue to be aware of themselves as isolated “egos” inside bags of skin.

Read more here, then revisit Watts’s related antidote to the age of anxiety.

6. CAROL DWECK: CULTIVATE A GROWTH MINDSET

“If you imagine less, less will be what you undoubtedly deserve,” Debbie Millman counseled in one of the best commencement speeches ever given, urging: “Do what you love, and don’t stop until you get what you love. Work as hard as you can, imagine immensities…” Far from Pollyannaish platitude, this advice actually reflects modern psychology’s insight into how belief systems about our own abilities and potential fuel our behavior and predict our success. Much of that understanding stems from the work of Stanford psychologist Carol Dweck, synthesized in her remarkably insightful Mindset: The New Psychology of Success (public library) — an inquiry into the power of our beliefs, both conscious and unconscious, and how changing even the simplest of them can have profound impact on nearly every aspect of our lives.

One of the most basic beliefs we carry about ourselves, Dweck found in her research, has to do with how we view and inhabit what we consider to be our personality. A “fixed mindset” assumes that our character, intelligence, and creative ability are static givens which we can’t change in any meaningful way, and success is the affirmation of that inherent intelligence, an assessment of how those givens measure up against an equally fixed standard; striving for success and avoiding failure at all costs become a way of maintaining the sense of being smart or skilled. A “growth mindset,” on the other hand, thrives on challenge and sees failure not as evidence of unintelligence but as a heartening springboard for growth and for stretching our existing abilities. Out of these two mindsets, which we manifest from a very early age, springs a great deal of our behavior, our relationship with success and failure in both professional and personal contexts, and ultimately our capacity for happiness.

Dweck writes:

Believing that your qualities are carved in stone — the fixed mindset — creates an urgency to prove yourself over and over. If you have only a certain amount of intelligence, a certain personality, and a certain moral character — well, then you’d better prove that you have a healthy dose of them. It simply wouldn’t do to look or feel deficient in these most basic characteristics.

[…]

There’s another mindset in which these traits are not simply a hand you’re dealt and have to live with, always trying to convince yourself and others that you have a royal flush when you’re secretly worried it’s a pair of tens. In this mindset, the hand you’re dealt is just the starting point for development. This growth mindset is based on the belief that your basic qualities are things you can cultivate through your efforts. Although people may differ in every which way — in their initial talents and aptitudes, interests, or temperaments — everyone can change and grow through application and experience.

Do people with this mindset believe that anyone can be anything, that anyone with proper motivation or education can become Einstein or Beethoven? No, but they believe that a person’s true potential is unknown (and unknowable); that it’s impossible to foresee what can be accomplished with years of passion, toil, and training.

At the heart of what makes the “growth mindset” so winsome, Dweck found, is that it creates a passion for learning rather than a hunger for approval. Its hallmark is the conviction that human qualities like intelligence and creativity, and even relational capacities like love and friendship, can be cultivated through effort and deliberate practice. Not only are people with this mindset not discouraged by failure, but they don’t actually see themselves as failing in those situations — they see themselves as learning.

Read more on how to cultivate this fruitful mindset here.

7. BENJAMIN FRANKLIN: TURN HATERS INTO FANS

In one chapter of the altogether illuminating You Are Now Less Dumb: How to Conquer Mob Mentality, How to Buy Happiness, and All the Other Ways to Outsmart Yourself (public library) — a “book about self-delusion, but also a celebration of it,” a fascinating and pleasantly uncomfortable-making look at why “self-delusion is as much a part of the human condition as fingers and toes” — David McRaney examines one particularly fascinating manifestation of our self-delusion: The Benjamin Franklin Effect.

This particular form of self-delusion has to do with our tendency to do nice things for people we like and bad things to those we dislike. But what the psychology behind the effect reveals is an inverse relationship — a reverse-engineering of attitudes that takes place as we grow to like people for whom we do nice things and dislike those to whom we are unkind. This curious effect is named after a specific incident early in the Founding Father’s political career.

Franklin, born one of seventeen children to poor parents — and thus low on his parents’ and society’s list of priorities relative to his siblings — entered this world with very low odds of becoming an educated scientist, gentleman, scholar, entrepreneur, and, perhaps most of all, a man of significant political power. To compensate for his unfavorable givens, he quickly learned formidable people skills and became “a master of the game of personal politics.” When he ran for his second term as clerk, a peer whose name Franklin never mentions in his autobiography delivered a long election speech censuring the future Founding Father and tarnishing his reputation.

Although Franklin won, he was furious with his opponent and, observing that this was “a gentleman of fortune and education” who might one day come to hold great power in government, grew rather concerned about future frictions with him. The troll had to be tamed, and tamed shrewdly. McRaney writes:

Franklin set out to turn his hater into a fan, but he wanted to do it without “paying any servile respect to him.” Franklin’s reputation as a book collector and library founder gave him a standing as a man of discerning literary tastes, so Franklin sent a letter to the hater asking if he could borrow a specific selection from his library, one that was a “very scarce and curious book.” The rival, flattered, sent it right away. Franklin sent it back a week later with a thank-you note. Mission accomplished. The next time the legislature met, the man approached Franklin and spoke to him in person for the first time. Franklin said the man “ever after manifested a readiness to serve me on all occasions, so that we became great friends, and our friendship continued to his death.”

Read more about what modern psychology has revealed about this curious phenomenon here, then complement it with Kierkegaard on why haters hate.

8. HANNAH ARENDT: THINK RATHER THAN KNOW

In 1973, Hannah Arendt became the first woman to speak at the prestigious Gifford Lectures — an annual series established in 1888 aiming “to promote and diffuse the study of natural theology in the widest sense of the term,” bridging science, philosophy, and spirituality, an ancient quest of enduring urgency to this day. Over the years, the Gifford Lectures have drawn such celebrated minds as William James, Werner Heisenberg, Niels Bohr, Iris Murdoch, and Carl Sagan, whose 1985 lecture was later published as the spectacular posthumous volume Varieties of Scientific Experience. Arendt’s own lecture was later expanded and published as The Life of the Mind (public library), an immeasurably stimulating exploration of thinking — a process so seemingly obvious that we take it for granted, yet one riddled with complexities and paradoxes that often keep us from seeing the true nature of reality. With extraordinary intellectual elegance, Arendt draws “a distinguishing line between truth and meaning, between knowing and thinking,” and makes a powerful case for the importance of that line in the human experience.

Arendt asks:

What are we “doing” when we do nothing but think? Where are we when we, normally always surrounded by our fellow-men, are together with no one but ourselves?

Arendt considers the crucial necessity of never ceasing to pursue questions, those often unanswerable questions, of meaning over so-called truth — something doubly needed in our era of ready-made “opinions” based on neatly packaged “facts”:

By posing the unanswerable questions of meaning, men establish themselves as question-asking beings. Behind all the cognitive questions for which men find answers, there lurk the unanswerable ones that seem entirely idle and have always been denounced as such. It is more than likely that men, if they were ever to lose the appetite for meaning we call thinking and cease to ask unanswerable questions, would lose not only the ability to produce those thought-things that we call works of art but also the capacity to ask all the answerable questions upon which every civilization is founded… While our thirst for knowledge may be unquenchable because of the immensity of the unknown, the activity itself leaves behind a growing treasure of knowledge that is retained and kept in store by every civilization as part and parcel of its world. The loss of this accumulation and of the technical expertise required to conserve and increase it inevitably spells the end of this particular world.

Read more here.

9. ANNE LAMOTT: LET GO OF PERFECTIONISM

In her indispensable Bird by Bird: Some Instructions on Writing and Life (public library) — one of the finest books on writing ever written, a treasure trove of insight both practical and profound — Anne Lamott explores how perfectionism paralyzes us creatively.

She recounts this wonderful anecdote, after which the book is titled:

Thirty years ago my older brother, who was ten years old at the time, was trying to get a report on birds written that he’d had three months to write, which was due the next day. We were out at our family cabin in Bolinas, and he was at the kitchen table close to tears, surrounded by binder paper and pencils and unopened books on birds, immobilized by the hugeness of the task ahead. Then my father sat down beside him, put his arm around my brother’s shoulder, and said, “Bird by bird, buddy. Just take it bird by bird.”

In this bird-by-bird approach to writing, there is no room for perfectionism. (Neil Gaiman famously advised, “Perfection is like chasing the horizon. Keep moving,” and David Foster Wallace admonished, “If your fidelity to perfectionism is too high, you never do anything.”) Lamott cautions:

Perfectionism is the voice of the oppressor, the enemy of the people. It will keep you cramped and insane your whole life, and it is the main obstacle between you and a shitty first draft.

[…]

Perfectionism is a mean, frozen form of idealism, while messes are the artist’s true friend. What people somehow (inadvertently, I’m sure) forgot to mention when we were children was that we need to make messes in order to find out who we are and why we are here — and, by extension, what we’re supposed to be writing.

Read more here and couple with Lamott on how to stop keeping yourself small by people-pleasing.

10. CARL SAGAN: MASTER CRITICAL THINKING

Carl Sagan endures as our era’s greatest patron saint of reason and common sense — a true master of the vital balance between skepticism and openness. In The Demon-Haunted World: Science as a Candle in the Dark (public library) — the same indispensable volume that gave us Sagan’s timeless meditation on science and spirituality, published mere months before his death in 1996 — Sagan shares his secret to upholding the rites of reason, even in the face of society’s most shameless untruths and outrageous propaganda.

In a chapter titled “The Fine Art of Baloney Detection,” he reflects on the many types of deception to which we’re susceptible — from psychics to religious zealotry to paid product endorsements by scientists, which he held in especially low regard, noting that they “betray contempt for the intelligence of their customers” and “introduce an insidious corruption of popular attitudes about scientific objectivity.” But rather than preaching from the ivory tower of self-righteousness, Sagan approaches the subject from the most vulnerable of places — having just lost both of his parents, he reflects on the all too human allure of promises of supernatural reunions in the afterlife, reminding us that falling for such fictions doesn’t make us stupid or bad people, but simply means that we need to equip ourselves with the right tools against them.

Through their training, scientists are equipped with what Sagan calls a “baloney detection kit” — a set of cognitive tools and techniques that fortify the mind against penetration by falsehoods:

The kit is brought out as a matter of course whenever new ideas are offered for consideration. If the new idea survives examination by the tools in our kit, we grant it warm, although tentative, acceptance. If you’re so inclined, if you don’t want to buy baloney even when it’s reassuring to do so, there are precautions that can be taken; there’s a tried-and-true, consumer-tested method.

But the kit, Sagan argues, isn’t merely a tool of science — rather, it contains invaluable tools of healthy skepticism that apply just as aptly, and just as necessarily, to everyday life. By adopting the kit, we can all shield ourselves against clueless guile and deliberate manipulation. Sagan shares nine of these tools:

  1. Wherever possible there must be independent confirmation of the “facts.”
  2. Encourage substantive debate on the evidence by knowledgeable proponents of all points of view.
  3. Arguments from authority carry little weight — “authorities” have made mistakes in the past. They will do so again in the future. Perhaps a better way to say it is that in science there are no authorities; at most, there are experts.
  4. Spin more than one hypothesis. If there’s something to be explained, think of all the different ways in which it could be explained. Then think of tests by which you might systematically disprove each of the alternatives. What survives, the hypothesis that resists disproof in this Darwinian selection among “multiple working hypotheses,” has a much better chance of being the right answer than if you had simply run with the first idea that caught your fancy.
  5. Try not to get overly attached to a hypothesis just because it’s yours. It’s only a way station in the pursuit of knowledge. Ask yourself why you like the idea. Compare it fairly with the alternatives. See if you can find reasons for rejecting it. If you don’t, others will.
  6. Quantify. If whatever it is you’re explaining has some measure, some numerical quantity attached to it, you’ll be much better able to discriminate among competing hypotheses. What is vague and qualitative is open to many explanations. Of course there are truths to be sought in the many qualitative issues we are obliged to confront, but finding them is more challenging.
  7. If there’s a chain of argument, every link in the chain must work (including the premise) — not just most of them.
  8. Occam’s Razor. This convenient rule-of-thumb urges us when faced with two hypotheses that explain the data equally well to choose the simpler.
  9. Always ask whether the hypothesis can be, at least in principle, falsified. Propositions that are untestable, unfalsifiable are not worth much. Consider the grand idea that our Universe and everything in it is just an elementary particle — an electron, say — in a much bigger Cosmos. But if we can never acquire information from outside our Universe, is not the idea incapable of disproof? You must be able to check assertions out. Inveterate skeptics must be given the chance to follow your reasoning, to duplicate your experiments and see if they get the same result.

Read more, including Sagan’s list of the twenty most common pitfalls of common sense and how to counter them, here.

11. REBECCA SOLNIT: GET LOST TO FIND YOURSELF

“On how one orients himself to the moment,” Henry Miller wrote in reflecting on the art of living, “depends the failure or fruitfulness of it.” Indeed, this act of orienting ourselves — to the moment, to the world, to our own selves — is perhaps the most elusive art of all, and our attempts to master it often leave us fumbling, frustrated, discombobulated. And yet therein lies our greatest capacity for growth and self-transcendence.

Rebecca Solnit, whose mind and writing are among the most consistently enchanting of our time, explores this tender tango with the unknown in her altogether sublime collection A Field Guide to Getting Lost (public library).

Solnit writes in the opening essay:

Leave the door open for the unknown, the door into the dark. That’s where the most important things come from, where you yourself came from, and where you will go. Three years ago I was giving a workshop in the Rockies. A student came in bearing a quote from what she said was the pre-Socratic philosopher Meno. It read, “How will you go about finding that thing the nature of which is totally unknown to you?” I copied it down, and it has stayed with me since. The student made big transparent photographs of swimmers underwater and hung them from the ceiling with the light shining through them, so that to walk among them was to have the shadows of swimmers travel across your body in a space that itself came to seem aquatic and mysterious. The question she carried struck me as the basic tactical question in life. The things we want are transformative, and we don’t know or only think we know what is on the other side of that transformation. Love, wisdom, grace, inspiration — how do you go about finding these things that are in some ways about extending the boundaries of the self into unknown territory, about becoming someone else?

Solnit turns to Edgar Allan Poe, who argued that “in matters of philosophical discovery … it is the unforeseen upon which we must calculate most largely,” and considers the deliberate juxtaposition of the rational, methodical act of calculation with the ineffable, intangible nature of the unforeseen:

How do you calculate upon the unforeseen? It seems to be an art of recognizing the role of the unforeseen, of keeping your balance amid surprises, of collaborating with chance, of recognizing that there are some essential mysteries in the world and thereby a limit to calculation, to plan, to control. To calculate on the unforeseen is perhaps exactly the paradoxical operation that life most requires of us.

The poet John Keats captured this paradoxical operation memorably in his notion of “negative capability,” which Solnit draws on before turning to another literary luminary, Walter Benjamin, who considered the difference between not finding your way and losing yourself — something he called “the art of straying.” Solnit writes:

To lose yourself: a voluptuous surrender, lost in your arms, lost to the world, utterly immersed in what is present so that its surroundings fade away. In Benjamin’s terms, to be lost is to be fully present, and to be fully present is to be capable of being in uncertainty and mystery. And one does not get lost but loses oneself, with the implication that it is a conscious choice, a chosen surrender, a psychic state achievable through geography. That thing the nature of which is totally unknown to you is usually what you need to find, and finding it is a matter of getting lost.

Read more here.

12. BRUCE LEE: BE LIKE WATER

With his singular blend of physical prowess and metaphysical wisdom, coupled with his tragic untimely death, legendary Chinese-American martial artist, philosopher, and filmmaker Bruce Lee (1940-1973) is one of those rare cultural icons whose ethos and appeal remain timeless, attracting generation after generation of devotees. Lee drew on the core principles of Wing Chun, the ancient Chinese conceptual martial art, which he learned from his only formal martial arts teacher, Yip Man, between the ages of thirteen and eighteen. After leaving Hong Kong in 1959, he adapted Wing Chun into his own version, Jun Fan Gung Fu — literal translation: Bruce Lee’s Kung Fu — and popularized it in America.

In 1971, at the peak of his career, Lee starred in four episodes of the short-lived TV series Longstreet. In one of them, he delivered his most oft-cited metaphor for the philosophy of Gung Fu: “Be like water,” at the heart of which is the Chinese concept of wu wei — “trying not to try.” In Bruce Lee: Artist of Life (public library) — a compendium of his never-before-published private letters, notes, and poems — Lee traces the thinking that originated his famous metaphor, which came after a period of frustration with his inability to master “the art of detachment” that Yip Man was trying to impart to him. Lee recounts:

After spending many hours meditating and practicing, I gave up and went sailing alone in a junk. On the sea I thought of all my past training and got mad at myself and punched the water! Right then — at that moment — a thought suddenly struck me; was not this water the very essence of gung fu? Hadn’t this water just now illustrated to me the principle of gung fu? I struck it but it did not suffer hurt. Again I struck it with all of my might — yet it was not wounded! I then tried to grasp a handful of it but this proved impossible. This water, the softest substance in the world, which could be contained in the smallest jar, only seemed weak. In reality, it could penetrate the hardest substance in the world. That was it! I wanted to be like the nature of water.

Suddenly a bird flew by and cast its reflection on the water. Right then, as I was absorbing myself with the lesson of the water, another mystic sense of hidden meaning revealed itself to me; should not the thoughts and emotions I had when in front of an opponent pass like the reflection of the birds flying over the water? This was exactly what Professor Yip meant by being detached — not being without emotion or feeling, but being one in whom feeling was not sticky or blocked. Therefore in order to control myself I must first accept myself by going with and not against my nature.

[…]

Water is so fine that it is impossible to grasp a handful of it; strike it, yet it does not suffer hurt; stab it, and it is not wounded; sever it, yet it is not divided. It has no shape of its own but molds itself to the receptacle that contains it. When heated to the state of steam it is invisible but has enough power to split the earth itself. When frozen it crystallizes into a mighty rock. First it is turbulent like Niagara Falls, and then calm like a still pond, fearful like a torrent, and refreshing like a spring on a hot summer’s day.

Lee illustrates the power of this water-like disposition with a passage from the Tao Te Ching, Lao Tzu’s famous teachings:

The rivers and seas are lords of a hundred valleys. This is because their strength is in lowliness; they are kings of them all. So it is that the perfect master wishing to lead them, he follows. Thus, though he is above them, men do not feel him to be an injury. And since he will not strive, none strive with him.

Read more here.

13. MAYA ANGELOU: CHOOSE COURAGE OVER CYNICISM

In 1982, nearly a decade after their spectacular conversation about freedom, beloved poet, memoirist, dramatist, actor, producer, filmmaker, and civil rights activist Maya Angelou and celebrated interviewer Bill Moyers traveled together to the beautiful Texas countryside to discuss the ugliest aspects of human nature at a conference titled Facing Evil. It was a subject with which Angelou, the survivor of childhood rape and courageous withstander of lifelong racism, was intimately acquainted. In a recent remembrance of his friend, Moyers shares excerpts from the 1988 documentary about the event, affirming once more that Angelou was nothing if not a champion of the human spirit and its highest potentiality for good.

In one of the most poignant passages, Angelou reflects on how refusing to speak for five years after being raped as a child (“I won’t say severely raped; all rape is severe,” Angelou notes in one of her characteristically piercing asides) shaped her journey:

To show you … how out of evil there can come good, in those five years I read every book in the black school library. I read all the books I could get from the white school library. I memorized James Weldon Johnson, Paul Laurence Dunbar, Countee Cullen and Langston Hughes. I memorized Shakespeare, whole plays, fifty sonnets. I memorized Edgar Allan Poe, all the poetry — never having heard it, I memorized it. I had Longfellow, I had Guy de Maupassant, I had Balzac, Rudyard Kipling — I mean, it was catholic kind of reading, and catholic kind of storing.

[…]

Out of this evil, which was a dire kind of evil, because rape on the body of a young person more often than not introduces cynicism, and there is nothing quite so tragic as a young cynic, because it means the person has gone from knowing nothing to believing nothing. In my case I was saved in that muteness… And I was able to draw from human thought, human disappointments and triumphs, enough to triumph myself.

Angelou’s most soul-expanding point is that courage — something she not only embodied but also championed beautifully in her children’s book illustrated by Basquiat — is our indelible individual capacity and our shared existential responsibility:

We need the courage to create ourselves daily, to be bodacious enough to create ourselves daily — as Christians, as Jews, as Muslims, as thinking, caring, laughing, loving human beings. I think that the courage to confront evil and turn it by dint of will into something applicable to the development of our evolution, individually and collectively, is exciting, honorable.

Read more here.

14. EMERSON: CULTIVATE TRUE FRIENDSHIP

It’s been argued that friendship is a greater gift than romantic love (though it’s not uncommon for one to turn abruptly into the other), but whatever the case, friendship is certainly one of the most rewarding fruits of life — from the sweetness of childhood friendships to the trickiness of workplace ones. This delicate dance has been examined by thinkers from Aristotle to Francis Bacon to Thoreau, but none more thoughtfully than by Ralph Waldo Emerson. In an essay on the subject, found in his altogether soul-expanding Essays and Lectures (public library; free download), Emerson considers the intricate dynamics of friendship, beginning with our often underutilized innate capacities:

We have a great deal more kindness than is ever spoken. Barring all the selfishness that chills like east winds the world, the whole human family is bathed with an element of love like a fine ether… The emotions of benevolence … from the highest degree of passionate love, to the lowest degree of good will, they make the sweetness of life.

[…]

What is so delicious as a just and firm encounter of two, in a thought, in a feeling? How beautiful, on their approach to this beating heart, the steps and forms of the gifted and the true! The moment we indulge our affections, the earth is metamorphosed; there is no winter, and no night; all tragedies, all ennuis vanish; all duties even; nothing fills the proceeding eternity but the forms all radiant of beloved persons. Let the soul be assured that somewhere in the universe it should rejoin its friend, and it would be content and cheerful alone for a thousand years.

Emerson goes on to consider the two essential conditions of friendship:

There are two elements that go to the composition of friendship, each so sovereign, that I can detect no superiority in either, no reason why either should be first named. One is Truth. A friend is a person with whom I may be sincere. Before him, I may think aloud. I am arrived at last in the presence of a man so real and equal that I may drop even those undermost garments of dissimulation, courtesy, and second thought, which men never put off, and may deal with him with the simplicity and wholeness, with which one chemical atom meets another. Sincerity is the luxury allowed, like diadems and authority, only to the highest rank, that being permitted to speak truth, as having none above it to court or conform unto. Every man alone is sincere. At the entrance of a second person, hypocrisy begins… We cover up our thought from him under a hundred folds.

[…]

The other element of friendship is tenderness. We are holden to men by every sort of tie, by blood, by pride, by fear, by hope, by lucre, by lust, by hate, by admiration, by every circumstance and badge and trifle, but we can scarce believe that so much character can subsist in another as to draw us by love. Can another be so blessed, and we so pure, that we can offer him tenderness? When a man becomes dear to me, I have touched the goal of fortune.

Read more here.

15. ELEANOR ROOSEVELT: LIVE BY YOUR OWN STANDARDS

Eleanor Roosevelt endures as one of the most beloved and influential luminaries in modern history — a relentless champion of working women and underprivileged youth, the longest-serving American First Lady, and the author of some beautiful, if controversial, love letters.

When she was 76, Roosevelt penned You Learn by Living: Eleven Keys for a More Fulfilling Life (public library) — a wonderfully insightful compendium of her philosophy on the meaningful life.

Roosevelt considers the seedbed of happiness:

Happiness is not a goal, it is a by-product. Paradoxically, the one sure way not to be happy is deliberately to map out a way of life in which one would please oneself completely and exclusively. After a short time, a very short time, there would be little that one really enjoyed. For what keeps our interest in life and makes us look forward to tomorrow is giving pleasure to other people.

[…]

Someone once asked me what I regarded as the three most important requirements for happiness. My answer was: ‘A feeling that you have been honest with yourself and those around you; a feeling that you have done the best you could both in your personal life and in your work; and the ability to love others.’

Indeed, personal integrity — without which it is impossible to be honest with oneself — is a centerpiece of our capacity for happiness. In a chapter titled “The Right to Be an Individual,” Roosevelt considers the moral responsibility of living what you believe — of fully inhabiting your inner life — as the foundation of integrity and, more than that, of what it means to be human:

It’s your life — but only if you make it so. The standards by which you live must be your own standards, your own values, your own convictions in regard to what is right and wrong, what is true and false, what is important and what is trivial. When you adopt the standards and the values of someone else or a community or a pressure group, you surrender your own integrity. You become, to the extent of your surrender, less of a human being.

Read more here.

Complement with some actual New Year’s resolutions by Friedrich Nietzsche, Italo Calvino, Jonathan Swift, Susan Sontag, Marilyn Monroe, Woody Guthrie, and Ursula Nordstrom.


Margaret Mead on Myth vs. Deception and What to Tell Kids about Santa Claus

How to instill an appreciation of the difference between “fact” and “poetic truth,” in kids and grownups alike.

Few things rattle the fine line between the benign magic of mythology and the deliberate delusion of a lie more than the question of how, what, and whether to tell kids about Santa Claus. Half a century ago, Margaret Mead (December 16, 1901–November 15, 1978) — the world’s most influential cultural anthropologist and one of history’s greatest academic celebrities — addressed this delicate subject with great elegance, extending beyond the jolly Christmas character and into larger questions of distinguishing between myth and deception.

From the wonderful out-of-print volume Margaret Mead: Some Personal Views (public library) — the same compendium of Mead’s answers to audience questions from her long career as a public speaker and lecturer, which also gave us her remarkably timely thoughts on racism and law enforcement and equality in parenting — comes an answer to a question she was asked in December of 1964: “Were your children brought up to believe in Santa Claus? If so, what did you tell them when they discovered he didn’t exist?”

Mead’s response, which calls to mind Carl Sagan’s Baloney Detection Kit, is a masterwork of celebrating rational, critical thinking without sacrificing magic to reason:

Belief in Santa Claus becomes a problem mainly when parents simultaneously feel they are telling their children a lie and insist on the literal belief in a jolly little man in a red suit who keeps tabs on them all year, reads their letters and comes down the chimney after landing his sleigh on the roof. Parents who enjoy Santa Claus — who feel that it is more fun to talk about what Santa Claus will bring than what Daddy will buy you for Christmas and who speak of Santa Claus in a voice that tells no lie but instead conveys to children something about Christmas itself — can give children a sense of continuity as they discover the sense in which Santa is and is not “real.”

With her great gift for nuance, Mead adds:

Disillusionment about the existence of a mythical and wholly implausible Santa Claus has come to be a synonym for many kinds of disillusionment with what parents have told children about birth and death and sex and the glory of their ancestors. Instead, learning about Santa Claus can help give children a sense of the difference between a “fact” — something you can take a picture of or make a tape recording of, something all those present can agree exists — and poetic truth, in which man’s feelings about the universe or his fellow men is expressed in a symbol.

Recalling her own experience both as a child and as a parent, Mead offers an inclusive alternative to the narrow Santa Claus myth, inviting parents to use the commercial Western holiday as an opportunity to introduce kids to different folkloric traditions and value systems:

One thing my parents did — and I did for my own child — was to tell stories about the different kinds of Santa Claus figures known in different countries. The story I especially loved was the Russian legend of the little grandmother, the babushka, at whose home the Wise Men stopped on their journey. They invited her to come with them, but she had no gift fit for the Christ child and she stayed behind to prepare it. Later she set out after the Wise Men but she never caught up with them, and so even today she wanders around the world, and each Christmas she stops to leave gifts for sleeping children.

But Mead’s most important, most poetic point affirms the idea that children’s stories shouldn’t protect kids from the dark:

Children who have been told the truth about birth and death will know, when they hear about Kris Kringle and Santa Claus and Saint Nicholas and the little babushka, that this is a truth of a different kind.

Margaret Mead: Some Personal Views is, sadly, long out of print — but it’s an immeasurable trove of Mead’s wisdom well worth the used-book hunt. Complement it with Mead’s beautiful love letters to her soulmate and the story of how she discovered the meaning of life in a dream.


