Brain Pickings

26 FEBRUARY, 2015

E.B. White on How to Write for Children and the Writer’s Responsibility to All Readers

“Anyone who writes down to children is simply wasting his time. You have to write up, not down.”

The loving and attentive reader of children’s books knows that the best of them are not one-dimensional oversimplifications of life but stories that tackle with elegant simplicity such complexities as uncertainty, loneliness, loss, and the cycle of life. And anyone who sits with this awareness for a moment becomes suddenly skeptical of the very notion of a “children’s” book. Maurice Sendak certainly knew that when he scoffed in his final interview: “I don’t write for children. I write — and somebody says, ‘That’s for children!’” Seven decades earlier, J.R.R. Tolkien had articulated the same sentiment, with more politeness and academic rigor, in his terrific essay on why there is no such thing as writing “for children.” But one of the finest, most charming and most convincing renunciations of the myths about writing for children comes from E.B. White, nearly two decades after he sneezed Charlotte’s Web.

In a 1969 interview, included in the altogether unputdownable The Paris Review Interviews, vol. IV (public library) — which also features wonderfully wide-ranging conversations with Haruki Murakami, Maya Angelou, Ezra Pound, Marilynne Robinson, William Styron and more — White turns his formidable amalgam of wit and wisdom to our culture’s limiting misconceptions about storytelling “for children.”

When the interviewer asks whether there is “any shifting of gears” in writing children’s books, as opposed to the grownup nonfiction for which he is best known, White responds with the rare combination of conviction and nuance:

Anybody who shifts gears when he writes for children is likely to wind up stripping his gears. But I don’t want to evade your question. There is a difference between writing for children and for adults. I am lucky, though, as I seldom seem to have my audience in mind when I am at work. It is as though they didn’t exist.

Echoing Ursula Nordstrom — the visionary editor and patron saint of childhood who brought to life not only Charlotte’s Web but also such classics as Goodnight Moon, Where the Wild Things Are, and The Giving Tree, and who famously insisted that children never want a blunt creative edge — White adds:

Anyone who writes down to children is simply wasting his time. You have to write up, not down. Children are demanding. They are the most attentive, curious, eager, observant, sensitive, quick, and generally congenial readers on earth. They accept, almost without question, anything you present them with, as long as it is presented honestly, fearlessly, and clearly. I handed them, against the advice of experts, a mouse-boy, and they accepted it without a quiver. In Charlotte’s Web, I gave them a literate spider, and they took that.

E. B. White's drawings of the vectors of the web-spinning process.

Long before psychologists knew that language is central to how human imagination evolves, White was especially adamant about the blunting of language:

Some writers for children deliberately avoid using words they think a child doesn’t know. This emasculates the prose and, I suspect, bores the reader. Children are game for anything. I throw them hard words, and they backhand them over the net. They love words that give them a hard time, provided they are in a context that absorbs their attention. I’m lucky again: my own vocabulary is small, compared to most writers, and I tend to use the short words. So it’s no problem for me to write for children. We have a lot in common.

Like C.S. Lewis, who contemplated what writing for children reveals about the key to authenticity in all writing, White extrapolates the writer’s responsibility to all audiences:

A writer should concern himself with whatever absorbs his fancy, stirs his heart, and unlimbers his typewriter… A writer has the duty to be good, not lousy; true, not false; lively, not dull; accurate, not full of error. He should tend to lift people up, not lower them down. Writers do not merely reflect and interpret life, they inform and shape life.

[…]

A writer must reflect and interpret his society, his world; he must also provide inspiration and guidance and challenge. Much writing today strikes me as deprecating, destructive, and angry. There are good reasons for anger, and I have nothing against anger. But I think some writers have lost their sense of proportion, their sense of humor, and their sense of appreciation. I am often mad, but I would hate to be nothing but mad: and I think I would lose what little value I may have as a writer if I were to refuse, as a matter of principle, to accept the warming rays of the sun, and to report them, whenever, and if ever, they happen to strike me.

The Paris Review Interviews, vol. IV is a treasure trove in its totality. Complement this particular gem with E.B. White on the future of reading, what makes a great city, his satirical take on the difference between love and passion, and his beautiful letter to a man who had lost faith in life.

25 FEBRUARY, 2015

What Comes After Religion: The Search for Meaning in Secular Life

“We need reminders to be good, places to reawaken awe, something to reawaken our kinder, less selfish impulses…”

In their series of animated essays, The School of Life have contemplated what great books do for the soul, how to merge money and meaning, and what philosophy is for. Now comes a wonderful animation that builds on School of Life founder Alain de Botton’s book Religion for Atheists: A Non-believer’s Guide to the Uses of Religion (public library) to explore what comes after religion and how we can begin to address the deeper existential yearnings which led us to create religion in the first place — a meditation that calls to mind Sam Harris’s recent guide to spirituality without religion and the broader question of how we fill our lives with meaning.

Transcribed highlights below.

Fewer and fewer people believe nowadays. It’s possible that in a generation, there simply won’t be religion across Europe and large sections of North America, Australia, and Asia. That’s not necessarily a problem — but it’s worth thinking about why people made up religion in the first place.

[…]

We may no longer believe, but the needs and longings that made us make up these stories go on: We’re lonely, and violent; we long for beauty, wisdom, and purpose; we want to live for something more than just ourselves.

Society tells us to direct our hopes in two areas: romantic love and professional success. And it distracts us with news, movies, and consumption. It’s not enough, as we know — especially at three in the morning. We need reminders to be good, places to reawaken awe, something to reawaken our kinder, less selfish impulses — universal things, which need tending, like delicate flowers, and rituals that bring us together.

The choice isn’t between religion and the secular world, as it is now — the challenge is to learn from religions so we can fill the secular world with replacements for the things we long ago made up religion to provide. The challenge begins here.

For more on this slippery but vital question, see Alan Lightman on finding transcendent moments in the secular world and Mary Oliver on a life well lived.

24 FEBRUARY, 2015

Joan Didion on Hollywood’s Diversity Problem: A Masterpiece from 1968 That Could Have Been Written Today

“The public life of liberal Hollywood comprises a kind of dictatorship of good intentions, a social contract in which actual and irreconcilable disagreement is as taboo as failure or bad teeth.”

Over and over, Joan Didion has emerged as an enchantress of nuance — a writer of deep and dimensional wisdom on such undying human issues as self-respect, grief, and the passage of time. Didion has a particular penchant for unraveling issues of social friction and discomfort to reveal that they are merely symptoms, rather than causes, of deeper societal pathologies.

Take Hollywood’s diversity problem, of which the world becomes palpably aware every year as the Academy Awards roll around. (Awards are, after all, a vehicle for rewarding the exemplars of a system’s values, and the paucity of people of color among nominees and winners speaks volumes about how much that system values diversity — something that renders hardly surprising a recent Los Angeles Times survey, which found that the Oscar-dispensing academy is primarily male and 94% white.) Nearly half a century before today’s crescendoing public outcries against Hollywood’s masculine whiteness, Didion addressed the issue with unparalleled intellectual elegance.

In an essay titled “Good Citizens,” written between 1968 and 1970 and found in the altogether indispensable The White Album (public library) — which also gave us Didion on driving as a transcendent experience — she turns her perceptive and prescient gaze to Hollywood’s diversity problem and the vacant pretensions that both beget and obscure it.

Joan Didion by Julian Wasser

Shortly after the Civil Rights Act of 1968 and the assassination of Martin Luther King, Didion writes:

Politics are not widely considered a legitimate source of amusement in Hollywood, where the borrowed rhetoric by which political ideas are reduced to choices between the good (equality is good) and the bad (genocide is bad) tends to make even the most casual political small talk resemble a rally.

And because this is Didion — a writer of extraordinary subtlety and piercing precision, who often tells half the story in the telling itself — she paints a backdrop of chronic superficiality as she builds up to the overt point:

“Those who cannot remember the past are condemned to repeat it,” someone said to me at dinner not long ago, and before we had finished our fraises des bois he had advised me as well that “no man is an island.” As a matter of fact I hear that no man is an island once or twice a week, quite often from people who think they are quoting Ernest Hemingway. “What a sacrifice on the altar of nationalism,” I heard an actor say about the death in a plane crash of the president of the Philippines. It is a way of talking that tends to preclude further discussion, which may well be its intention: the public life of liberal Hollywood comprises a kind of dictatorship of good intentions, a social contract in which actual and irreconcilable disagreement is as taboo as failure or bad teeth, a climate devoid of irony.

She recounts one event particularly painful to watch — a staged debate between the writer William Styron and the actor Ossie Davis. Davis had asserted that Styron’s Pulitzer-winning novel The Confessions of Nat Turner, in which the black protagonist falls in love with a white woman, had encouraged racism. With her characteristic clarity bordering on wryness, Didion notes: “It was Mr. Styron’s contention that it had not.” She sums up the evening with one famous spectator’s response:

James Baldwin sat between them, his eyes closed and his head thrown back in understandable but rather theatrical agony.

Didion reflects on what that particular event revealed — and still reveals — about Hollywood’s general vacancy of any real commitment to social justice:

[The evening’s] curious vanity and irrelevance stay with me, if only because those qualities characterize so many of Hollywood’s best intentions. Social problems present themselves to many of these people in terms of a scenario, in which, once certain key scenes are licked (the confrontation on the courthouse steps, the revelation that the opposition leader has an anti-Semitic past, the presentation of the bill of particulars to the President, a Henry Fonda cameo), the plot will proceed inexorably to an upbeat fade. Marlon Brando does not, in a well-plotted motion picture, picket San Quentin in vain: what we are talking about here is faith in a dramatic convention. Things “happen” in motion pictures. There is always a resolution, always a strong cause-effect dramatic line, and to perceive the world in those terms is to assume an ending for every social scenario… If the poor people march on Washington and camp out, there to receive bundles of clothes gathered on the Fox lot by Barbra Streisand, then some good must come of it (the script here has a great many dramatic staples, not the least of them a sentimental notion of Washington as an open forum, cf. Mr. Deeds Goes to Washington), and doubts have no place in the story.

The unwelcome presence of doubt is, of course, the fatal diagnosis of all systems that warp reality and flatten its complexities by leaving no room for nuance — systems like the Hollywood of that era, which has hardly changed in ours, and the mainstream media of today, which tend to depict the world by the same “dramatic convention” of clickbait and sensationalism, adding nothing to the actual conquest of meaning.

Reflecting on Hollywood’s “vanity of perceiving social life as a problem to be solved by the good will of individuals,” Didion recounts the particular pretensions she observed at a national convention of the United States Junior Chamber of Commerce, commonly known as Jaycees — a leadership and civic engagement training organization for young people — held at LA’s Miramar Hotel:

In any imaginative sense Santa Monica seemed an eccentric place for the United States Junior Chamber of Commerce to be holding a national congress, but there they were, a thousand delegates and wives, gathered in the Miramar Hotel for a relentless succession of keynote banquets and award luncheons and prayer breakfasts and outstanding-young-men forums… I was watching the pretty young wife of one delegate pick sullenly at her lunch. “Let someone else eat this slop,” she said suddenly, her voice cutting through not only the high generalities of the occasion but The New Generation’s George M. Cohan medley as well.

It bears noting that this took place a decade and a half before the Jaycees began accepting women as members, so any female presence at the convention was invariably allotted to such pretty young wives — including this one, whom Didion proceeded to see “sobbing into a pink napkin.”

Illustration from 'How to Be a Nonconformist,' 1968.

But what makes Didion’s essay doubly poignant is the way in which she gets at — obliquely yet unambiguously — how Hollywood’s chronic biases permeate the rest of society. Having gone to the Jaycees national convention “in search of the abstraction lately called ‘Middle America,’” she found instead a mirroring of Hollywood’s superficial and performative approaches to social justice issues. Didion writes:

At first I thought I had walked out of the rain and into a time warp: the Sixties seemed not to have happened.

Nearly half a century later, one can’t help but feel pulled into the very same time warp — Didion could have written this today:

They knew that this was a brave new world and [the Jaycees] said so. It was time to “put brotherhood into action,” to “open our neighborhoods to those of all colors.” It was time to “turn attention to the cities,” to think about youth centers and clinics and the example set by a black policeman-preacher in Philadelphia who was organizing a decency rally patterned after Miami’s. It was time to “decry apathy.”

The word “apathy” cropped up again and again, an odd word to use in relation to the past few years, and it was a while before I realized what it meant. It was not simply a word remembered from the Fifties, when most of these men had frozen their vocabularies: it was a word meant to indicate that not enough of “our kind” were speaking out. It was a cry in the wilderness, and this resolute determination to meet 1950 head-on was a kind of refuge. Here were some people who had been led to believe that the future was always a rational extension of the past, that there would ever be world enough and time enough for “turning attention,” for “problems” and “solutions.” Of course they would not admit their inchoate fears that the world was not that way any more. Of course they would not join the “fashionable doubters.” Of course they would ignore the “pessimistic pundits.” Late one afternoon I sat in the Miramar lobby, watching the rain fall and the steam rise off the heated pool outside and listening to a couple of Jaycees discussing student unrest and whether the “solution” might not lie in on-campus Jaycee groups. I thought about this astonishing notion for a long time. It occurred to me finally that I was listening to a true underground, to the voice of all those who have felt themselves not merely shocked but personally betrayed by recent history. It was supposed to have been their time. It was not.

Reading Didion’s astute observations many decades later, at a time when so many of us feel just as “personally betrayed by recent history,” reaffirms The White Album as absolutely essential reading for modern life — not merely because its title is all the more uncomfortably perfect in light of this particular essay, but because every single essay in it directs the same unflinching perceptiveness at some timeless and timely aspect of our world.

Complement it with Didion’s reading list of all-time favorite books and her answers to the Proust Questionnaire.

23 FEBRUARY, 2015

This Idea Must Die: Some of the World’s Greatest Thinkers Each Select a Major Misconception Holding Us Back

From the self to left brain vs. right brain to romantic love, a catalog of broken theories that hold us back from the conquest of Truth.

“To kill an error is as good a service as, and sometimes even better than, the establishing of a new truth or fact,” asserted Charles Darwin in one of the eleven rules for critical thinking known as Prospero’s Precepts. If science and human knowledge progress in leaps and bounds of ignorance, then the recognition of error and the transcendence of falsehood are the springboard for the leaps of progress. That’s the premise behind This Idea Must Die: Scientific Theories That Are Blocking Progress (public library) — a compendium of answers Edge founder John Brockman collected by posing his annual question, “What scientific idea is ready for retirement?” to 175 of the world’s greatest scientists, philosophers, and writers. Among them are Nobel laureates, MacArthur geniuses, and celebrated minds like theoretical physicist and mathematician Freeman Dyson, biological anthropologist Helen Fisher, cognitive scientist and linguist Steven Pinker, media theorist Douglas Rushkoff, philosopher Rebecca Newberger Goldstein, psychologist Howard Gardner, social scientist and technology scholar Sherry Turkle, actor and author Alan Alda, futurist and Wired founding editor Kevin Kelly, and novelist, essayist, and screenwriter Ian McEwan.

Brockman paints the backdrop for the inquiry:

Science advances by discovering new things and developing new ideas. Few truly new ideas are developed without abandoning old ones first. As theoretical physicist Max Planck (1858–1947) noted, “A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.” In other words, science advances by a series of funerals.

Many of the answers are redundant — but this is a glorious feature rather than a bug of Brockman’s series, for its chief reward is precisely this cumulative effect of discerning the zeitgeist of ideas with which some of our era’s greatest minds are tussling in synchronicity. They point to such retirement-ready ideas as IQ, the self, race, the left brain vs. right brain divide, human nature and essentialism, free will, and even science itself. What emerges is the very thing Carl Sagan deemed vital to truth in his Baloney Detection Kit — a “substantive debate on the evidence by knowledgeable proponents of all points of view.”

Illustration by Lizi Boyd from 'Flashlight.'

One of the most profound undercurrents across the answers has to do with our relationship with knowledge, certainty, and science itself. And one of the most profound contributions in that regard comes from MacArthur fellow Rebecca Newberger Goldstein, a philosopher who thinks deeply and dimensionally about some of the most complex questions of existence. Assailing the idea that science makes philosophy obsolete — that science is the transformation of “philosophy’s vagaries into empirically testable theories” and philosophy merely the “cold-storage room in which questions are shelved until the sciences get around to handling them” — Goldstein writes:

The obsolescence of philosophy is often taken to be a consequence of science. After all, science has a history of repeatedly inheriting — and definitively answering — questions over which philosophers have futilely hemmed and hawed for unconscionable amounts of time.

The gravest problem with this theory, Goldstein notes, is its internal incoherence:

You can’t argue for science making philosophy obsolete without indulging in philosophical arguments… When pressed for an answer to the so-called demarcation problem, scientists almost automatically reach for the notion of “falsifiability” first proposed by Karl Popper. His profession? Philosophy. But whatever criterion you offer, its defense is going to implicate you in philosophy.

This is something that Dorion Sagan, Carl Sagan’s son, has previously addressed, but Goldstein brings to it unparalleled elegance of thought and eloquence of expression:

A triumphalist scientism needs philosophy to support itself. And the lesson here should be generalized. Philosophy is joined to science in reason’s project. Its mandate is to render our views and our attitudes maximally coherent.

In doing so, she argues, philosophy provides “the reasoning that science requires in order to claim its image as descriptive.” As a proponent of the vital difference between information and wisdom — the former being the material of science, the latter the product of philosophy, and knowledge the change agent that transmutes one into the other — I find the provocative genius of Goldstein’s conclusion enormously invigorating:

What idea should science retire? The idea of “science” itself. Let’s retire it in favor of the more inclusive “knowledge.”

Neuroscientist Sam Harris, author of the indispensable Waking Up: A Guide to Spirituality Without Religion, echoes this by choosing our narrow definition of “science” as the idea to be put to rest:

Search your mind, or pay attention to the conversations you have with other people, and you’ll discover that there are no real boundaries between science and philosophy — or between those disciplines and any other that attempts to make valid claims about the world on the basis of evidence and logic. When such claims and their methods of verification admit of experiment and/or mathematical description, we tend to say our concerns are “scientific”; when they relate to matters more abstract, or to the consistency of our thinking itself, we often say we’re being “philosophical”; when we merely want to know how people behaved in the past, we dub our interests “historical” or “journalistic”; and when a person’s commitment to evidence and logic grows dangerously thin or simply snaps under the burden of fear, wishful thinking, tribalism, or ecstasy, we recognize that he’s being “religious.”

The boundaries between true intellectual disciplines are currently enforced by little more than university budgets and architecture… The real distinction we should care about — the observation of which is the sine qua non of the scientific attitude — is between demanding good reasons for what one believes and being satisfied with bad ones.

In a sentiment that calls to mind both Richard Feynman’s spectacular ode to a flower and Carl Sagan’s enduring wisdom on our search for meaning, Harris applies this model of knowledge to one of the great mysteries of science and philosophy — consciousness:

Even if one thinks the human mind is entirely the product of physics, the reality of consciousness becomes no less wondrous, and the difference between happiness and suffering no less important. Nor does such a view suggest that we’ll ever find the emergence of mind from matter fully intelligible; consciousness may always seem like a miracle. In philosophical circles, this is known as “the hard problem of consciousness” — some of us agree that this problem exists, some of us don’t. Should consciousness prove conceptually irreducible, remaining the mysterious ground for all we can conceivably experience or value, the rest of the scientific worldview would remain perfectly intact.

The remedy for all this confusion is simple: We must abandon the idea that science is distinct from the rest of human rationality. When you are adhering to the highest standards of logic and evidence, you are thinking scientifically. And when you’re not, you’re not.

Illustration from 'Once Upon an Alphabet' by Oliver Jeffers.

Psychologist Susan Blackmore, who studies this very problem — famously termed “the hard problem of consciousness” by philosopher David Chalmers in 1995 — benches the idea that there are neural correlates of consciousness. Tempting as it may be to interpret neural activity as the wellspring of that special something we call “consciousness” or “subjective experience,” as opposed to the “unconscious” rest of the brain, Blackmore admonishes that such dualism is past its cultural expiration date:

Dualist thinking comes naturally to us. We feel as though our conscious experiences were of a different order from the physical world. But this is the same intuition that leads to the hard problem seeming hard. It’s the same intuition that produces the philosopher’s zombie — a creature identical to me in every way except that it has no consciousness. It’s the same intuition that leads people to write, apparently unproblematically, about brain processes being either conscious or unconscious… Intuitively plausible as it is, this is a magic difference. Consciousness is not some weird and wonderful product of some brain processes but not others. Rather, it’s an illusion constructed by a clever brain and body in a complex social world. We can speak, think, refer to ourselves as agents, and so build up the false idea of a persisting self that has consciousness and free will.

Much of the allure of identifying such neural correlates of consciousness, Blackmore argues, lies in cultural mythologies rooted in fantasy rather than fact:

While people are awake they must always be conscious of something or other. And that leads along the slippery path to the idea that if we knew what to look for, we could peer inside someone’s brain and find out which processes were the conscious ones and which the unconscious ones. But this is all nonsense. All we’ll ever find are the neural correlates of thoughts, perceptions, memories, and the verbal and attentional processes that lead us to think we’re conscious.

When we finally have a better theory of consciousness to replace these popular delusions, we’ll see that there’s no hard problem, no magic difference, and no NCCs.

Illustration by Rob Hunter from 'A Graphic Cosmogony.'

In a related grievance, social psychologist Bruce Hood — author of the uncomfortable yet strangely comforting The Self Illusion — does away with the notion of the self. Half a century after Alan Watts enlisted Eastern philosophy in this mission, Hood presents a necessary integration of science and philosophy:

It seems almost redundant to call for the retirement of the free willing self, as the idea is neither scientific nor is this the first time the concept has been dismissed for lack of empirical support. The self did not have to be discovered; it’s the default assumption most of us experience, so it wasn’t really revealed by methods of scientific inquiry.

[…]

Yet the self, like a conceptual zombie, refuses to die. It crops up again and again in recent theories of decision making, as an entity with free will which can be depleted. It reappears as an interpreter in cognitive neuroscience, as able to integrate parallel streams of information arising from separable neural substrates. Even if these appearances of the self are understood to be convenient ways of discussing the emergent output of multiple parallel processes, students of the mind continue to implicitly endorse the idea that there’s a decision maker, an experiencer, a point of origin.

We know the self is constructed because it can be so easily deconstructed — through damage, disease, and drugs. It must be an emergent property of a parallel system processing input, output, and internal representations. It’s an illusion because it feels so real, but that experience is not what it seems. The same is true for free will. Although we can experience the mental anguish of making a decision… the choices and decisions we make are based on situations that impose on us. We don’t have the free will to choose the experiences that have shaped our decisions.

[…]

By abandoning the free willing self, we’re forced to reexamine the factors that are truly behind our thoughts and behavior and the way they interact, balance, override, and cancel out. Only then will we begin to make progress in understanding how we really operate.

Illustration by Ben Newman from 'A Graphic Cosmogony.'

Among the most provocative answers, in fact, is one examining the factors that underlie one of the most complex and seemingly human of our experiences: love. Biological anthropologist Helen Fisher, who studies the brain on love, points to romantic love and addiction as two concepts in need of serious reformulation and reframing — one best accomplished by understanding the intersection of the two. Fisher argues that we ought to broaden the definition of addiction and do away with science’s staunch notion that all addiction is harmful. Love, she argues with a wealth of neurobiological evidence in hand, is in fact a state that closely resembles that of addiction in terms of what happens in the brain during it — and yet love, anguishing as it may be at times, is universally recognized as the height of positive experience. In that respect, it presents a case of “positive addiction.” Fisher writes:

Love-besotted men and women show all the basic symptoms of addiction. Foremost, the lover is stiletto-focused on his/her drug of choice, the love object. The lover thinks obsessively about him or her (intrusive thinking), and often compulsively calls, writes, or stays in touch. Paramount in this experience is intense motivation to win one’s sweetheart, not unlike the substance abuser fixated on the drug. Impassioned lovers distort reality, change their priorities and daily habits to accommodate the beloved, experience personality changes (affect disturbance), and sometimes do inappropriate or risky things to impress this special other. Many are willing to sacrifice, even die for, “him” or “her.” The lover craves emotional and physical union with the beloved (dependence). And like addicts who suffer when they can’t get their drug, the lover suffers when apart from the beloved (separation anxiety). Adversity and social barriers even heighten this longing (frustration attraction).

In fact, besotted lovers express all four of the basic traits of addiction: craving, tolerance, withdrawal, and relapse. They feel a “rush” of exhilaration when they’re with their beloved (intoxication). As their tolerance builds, they seek to interact with the beloved more and more (intensification). If the love object breaks off the relationship, the lover experiences signs of drug withdrawal, including protest, crying spells, lethargy, anxiety, insomnia or hypersomnia, loss of appetite or binge eating, irritability, and loneliness. Lovers, like addicts, also often go to extremes, sometimes doing degrading or physically dangerous things to win back the beloved. And lovers relapse the way drug addicts do. Long after the relationship is over, events, people, places, songs, or other external cues associated with their abandoning sweetheart can trigger memories and renewed craving.

Fisher points to fMRI studies that have shown intense romantic love to trigger the brain’s reward system and the dopamine pathways responsible for “energy, focus, motivation, ecstasy, despair, and craving,” as well as the brain regions most closely associated with addiction and substance abuse. In shedding light on the neurochemical machinery of romantic love, Fisher argues, science reveals it to be a “profoundly powerful, natural, often positive addiction.”

Illustration by Christine Rösch from 'The Mathematics of Love.'

Astrophysicist Marcelo Gleiser, who has written beautifully about the necessary duality of knowledge and mystery, wants to do away with “the venerable notion of Unification.” He points out that smaller acts of unification and simplification are core to the scientific process — from the laws of thermodynamics to Newton’s law of universal gravitation — but simplification as sweeping as reducing the world to a single Theory of Everything is misplaced:

The trouble starts when we take this idea too far and search for the Über-unification, the Theory of Everything, the arch-reductionist notion that all forces of nature are merely manifestations of a single force. This is the idea that needs to go.

Noting that at some point along the way, “math became equated with beauty and beauty with truth,” Gleiser writes:

The impulse to unify it all runs deep in the souls of mathematicians and theoretical physicists, from the Langlands program to superstring theory. But here’s the rub: Pure mathematics isn’t physics. The power of mathematics comes precisely from its detachment from physical reality. A mathematician can create any universe she wants and play all sorts of games with it. A physicist can’t; his job is to describe nature as we perceive it. Nevertheless, the unification game has been an integral part of physics since Galileo and has produced what it should: approximate unifications.

And yet this unification game, as integral as it may be to science, is also antithetical to it in the long run:

The scientific impulse to unify is crypto-religious… There’s something deeply appealing in equating all of nature to a single creative principle: To decipher the “mind of God” is to be special, is to answer to a higher calling. Pure mathematicians who believe in the reality of mathematical truths are monks of a secret order, open only to the initiated. In the case of high energy physics, all unification theories rely on sophisticated mathematics related to pure geometric structures: The belief is that nature’s ultimate code exists in the ethereal world of mathematical truths and that we can decipher it.

Echoing Richard Feynman’s spectacular commencement address admonishing against “cargo cult science,” Gleiser adds:

Recent experimental data has been devastating to such belief — no trace of supersymmetric particles, of extra dimensions, or of dark matter of any sort, all long-awaited signatures of unification physics. Maybe something will come up; to find, we must search. The trouble with unification in high energy physics is that you can always push it beyond the experimental range. “The Large Hadron Collider got to 7 TeV and found nothing? No problem! Who said nature should opt for the simplest versions of unification? Maybe it’s all happening at much higher energies, well beyond our reach.”

There’s nothing wrong with this kind of position. You can believe it until you die, and die happy. Or you can conclude that what we do best is construct approximate models of how nature works and that the symmetries we find are only descriptions of what really goes on. Perfection is too hard a burden to impose on nature.

People often see this kind of argument as defeatist, as coming from someone who got frustrated and gave up. (As in “He lost his faith.”) Big mistake. To search for simplicity is essential to what scientists do. It’s what I do. There are essential organizing principles in nature, and the laws we find are excellent ways to describe them. But the laws are many, not one. We’re successful pattern-seeking rational mammals. That alone is cause for celebration. However, let’s not confuse our descriptions and models with reality. We may hold perfection in our mind’s eye as a sort of ethereal muse. Meanwhile nature is out there doing its thing. That we manage to catch a glimpse of its inner workings is nothing short of wonderful. And that should be good enough.

Ceramic tile by Debbie Millman courtesy of the artist

Science writer Amanda Gefter takes issue with one particular manifestation of our propensity for oversimplification — the notion of the universe. She writes:

Physics has a time-honored tradition of laughing in the face of our most basic intuitions. Einstein’s relativity forced us to retire our notions of absolute space and time, while quantum mechanics forced us to retire our notions of pretty much everything else. Still, one stubborn idea has stood steadfast through it all: the universe.

[…]

In recent years, however, the concept of a single shared spacetime has sent physics spiraling into paradox. The first sign that something was amiss came from Stephen Hawking’s landmark work in the 1970s showing that black holes radiate and evaporate, disappearing from the universe and purportedly taking some quantum information with them. Quantum mechanics, however, is predicated upon the principle that information can never be lost.

Gefter points to recent breakthroughs in physics that produced one particularly puzzling such paradox, known as the “firewall paradox,” solved by the idea that spacetime is divided not by horizons but by the reference frames of the observers, “as if each observer had his or her own universe.”

But the solution isn’t a multiverse theory:

Yes, there are multiple observers, and yes, any observer’s universe is as good as any other’s. But if you want to stay on the right side of the laws of physics, you can talk only about one at a time. Which means, really, that only one exists at a time. It’s cosmic solipsism.

Here, psychology, philosophy, and cosmology converge, for what such theories suggest is what we already know about the human psyche — as I’ve put it elsewhere, the stories that we tell ourselves, whether they be false or true, are always real. Gefter concludes:

Adjusting our intuitions and adapting to the strange truths uncovered by physics is never easy. But we may just have to come around to the notion that there’s my universe and there’s your universe — but there’s no such thing as the universe.

Biological anthropologist Nina Jablonski points to the notion of race as urgently retirement-ready. Pointing out that it has always been a “vague and slippery concept,” she traces its origins to Hume and Kant — the first to divide humanity into geographic groupings called “races” — and the pseudoscientific seeds of racism this division planted:

Skin color, as the most noticeable racial characteristic, was associated with a nebulous assemblage of opinions and hearsay about the inherent natures of the various races. Skin color stood for morality, character, and the capacity for civilization; it became a meme.

Even though the atrocious “race science” that emerged in the 19th and early 20th century didn’t hold up — whenever scientists looked for actual sharp boundaries between groups, none came up — and race came to be something people identify themselves with as a shared category of experiences and social bonds, Jablonski argues that the toxic aftershocks of pseudoscience still poison culture:

Even after it has been shown that many diseases (adult-onset diabetes, alcoholism, high blood pressure, to name a few) show apparent racial patterns because people share similar environmental conditions, groupings by race are maintained. The use of racial self-categorization in epidemiological studies is defended and even encouraged. Medical studies of health disparities between “races” become meaningless when sufficient variables — such as differences in class, ethnic social practices, and attitudes — are taken into account.

Half a century after the ever-prescient Margaret Mead made the same point, Jablonski urges:

Race has a hold on history but no longer has a place in science. The sheer instability and potential for misinterpretation render race useless as a scientific concept. Inventing new vocabularies to deal with human diversity and inequity won’t be easy, but it must be done.

Psychologist Jonathan Gottschall, who has previously explored why storytelling is so central to the human experience, argues against the notion that there can be no science of art. With an eye to our civilization’s long struggle to define art, he writes:

We don’t even have a good definition, in truth, for what art is. In short, there’s nothing so central to human life that’s so incompletely understood.

Granted, Gottschall is only partly right, for there are some excellent definitions of art — take, for instance, Jeanette Winterson’s or Leo Tolstoy’s — but the fact that they don’t come from scientists only speaks to his larger point. He argues that rather than being unfit to shed light on the role of art in human life, science simply hasn’t applied itself to the problem adequately:

Scientific work in the humanities has mainly been scattered, preliminary, and desultory. It doesn’t constitute a research program.

If we want better answers to fundamental questions about art, science must jump into the game with both feet. Going it alone, humanities scholars can tell intriguing stories about the origins and significance of art, but they don’t have the tools to patiently winnow the field of competing ideas. That’s what the scientific method is for — separating the more accurate stories from the less accurate stories. But a strong science of art will require both the thick, granular expertise of humanities scholars and the clever hypothesis-testing of scientists. I’m not calling for a scientific takeover of the arts, I’m calling for a partnership.

[…]

The Delphic admonition “Know thyself” still rings out as the great prime directive of intellectual inquiry, and there will always be a gaping hole in human self-knowledge until we develop a science of art.

In a further testament to the zeitgeist-illuminating nature of the project, actor, author, and science-lover Alan Alda makes a passionate case for retiring the notion that things are true or false for all time:

The trouble with truth is that not only is the notion of eternal, universal truth highly questionable, but simple, local truths are subject to refinement as well. Up is up and down is down, of course. Except under special circumstances. Is the North Pole up and the South Pole down? Is someone standing at one of the poles right-side up or upside-down? Kind of depends on your perspective.

When I studied how to think in school, I was taught that the first rule of logic was that a thing cannot both be and not be at the same time and in the same respect. That last note, “in the same respect,” says a lot. As soon as you change the frame of reference, you’ve changed the truthiness of a once immutable fact.

[…]

This is not to say that nothing is true or that everything is possible — just that it might not be so helpful for things to be known as true for all time, without a disclaimer… I wonder — and this is just a modest proposal — whether scientific truth should be identified in a way acknowledging that it’s something we know and understand for now, and in a certain way.

[…]

Facts, it seems to me, are workable units, useful in a given frame or context. They should be as exact and irrefutable as possible, tested by experiment to the fullest extent. When the frame changes, they don’t need to be discarded as untrue but respected as still useful within their domain. Most people who work with facts accept this, but I don’t think the public fully gets it.

That’s why I hope for more wariness about implying we know something to be true or false for all time and for everywhere in the cosmos.

Illustration from 'Once Upon an Alphabet' by Oliver Jeffers.

And indeed this elasticity of truth across time is at the heart of what I find to be the most beautiful and culturally essential contribution to the collection. As someone who believes that the stewardship of enduring ideas is at least as important as the genesis of new ones — not only because past ideas are the combinatorial building blocks of future ones but also because in order to move forward we always need a backdrop against which to paint the contrast of progress and improvement — I was most bewitched by writer Ian McEwan’s admonition against the arrogance of retiring any idea as an impediment to progress:

Beware of arrogance! Retire nothing! A great and rich scientific tradition should hang onto everything it has. Truth is not the only measure. There are ways of being wrong that help others to be right. Some are wrong, but brilliantly so. Some are wrong but contribute to method. Some are wrong but help found a discipline. Aristotle ranged over the whole of human knowledge and was wrong about much. But his invention of zoology alone was priceless. Would you cast him aside? You never know when you might need an old idea. It could rise again one day to enhance a perspective the present cannot imagine. It would not be available to us if it were fully retired.

To appreciate McEwan’s point, one need only look at something like Bertrand Russell’s timely thoughts on boredom, penned in 1930 and yet astoundingly resonant with our present anxieties about the societal side effects of current technology. McEwan captures this beautifully:

Every last serious and systematic speculation about the world deserves to be preserved. We need to remember how we got to where we are, and we’d like the future not to retire us. Science should look to literature and maintain a vibrant living history as a monument to ingenuity and persistence. We won’t retire Shakespeare. Nor should we Bacon.

Complement This Idea Must Die, the entirety of which weaves a mind-stretching mesh of complementary and contradictory perspectives on our relationship with knowledge, with some stimulating answers to previous editions of Brockman’s annual question, exploring the only thing worth worrying about (2013), the single most elegant theory of how the world works (2012), and the best way to make ourselves smarter (2011).
