The Marginalian

Big Thinkers on the Only Things Worth Worrying About

In his famous and wonderfully heartening letter of fatherly advice, F. Scott Fitzgerald gave his young daughter Scottie a list of things to worry and not worry about in life. Among the unworriables, he named popular opinion, the past, the future, triumph, and failure “unless it comes through your own fault.” Among the worry-worthy, courage, cleanliness, and efficiency. What Fitzgerald touched on, of course, is the quintessential anxiety of the human condition, which drives us to worry about things big and small, mundane and monumental, often confusing the two classes. It was this “worryability” that young Italo Calvino resolved to shake from his life. A wonderful 1934 book classified all of our worries in five general categories that endure with astounding prescience and precision, but we still struggle to identify the things truly worth worrying about — and, implicitly, working to resolve — versus those that only strain our psychoemotional capacity with the deathly grip of anxiety.

‘My Wheel of Worry’ by Andrew Kuo, depicting his inner worries, arguments, counterarguments, and obsessions in the form of charts and graphs.

In What Should We Be Worried About? (public library), intellectual jockey and Edge founder John Brockman tackles this issue with his annual question — which has previously explored such conundrums as the single most elegant theory of how the world works (2012) and the best way to make ourselves smarter (2011) — and asks some of our era’s greatest thinkers in science, psychology, technology, philosophy, and more to each contribute one valid “worry” about our shared future. Rather than alarmist anxiety-slinging, however, the ethos of the project is quite the opposite — to put in perspective the things we worry about but shouldn’t, whether by our own volition or thanks to ample media manipulation, and contrast them with issues of actual concern, at which we ought to aim our collective attention and efforts in order to ensure humanity’s progress and survival.

Behavioral neuroscientist Kate Jeffery offers one of the most interesting answers, reminiscent of Alan Watts’s assertion that “without birth and death … the world would be static, rhythm-less, undancing, mummified,” exploring our mortality paradox and pointing to the loss of death as a thing to worry about:

Every generation our species distills the best of itself, packages it up and passes it on, shedding the dross and creating a fresher, newer, shinier generation. We have been doing this now for four billion years, and in doing so have transmogrified from unicellular microorganisms that do little more than cling to rocks and photosynthesize, to creatures of boundless energy and imagination who write poetry, make music, love each other and work hard to decipher the secrets of themselves and their universe.

And then they die.

Death is what makes this cyclical renewal and steady advance in organisms possible. Discovered by living things millions of years ago, aging and death permit a species to grow and flourish. Because natural selection ensures that the child-who-survives-to-reproduce is better than the parent (albeit infinitesimally so, for that is how evolution works), it is better for many species that the parent step out of the way and allow its (superior) child to succeed in its place. Put more simply, death stops a parent from competing with its children and grandchildren for the same limited resources. So important is death that we have, wired into our genes, a self-destruct senescence program that shuts down operations once we have successfully reproduced, so that we eventually die, leaving our children—the fresher, newer, shinier versions of ourselves—to carry on with the best of what we have given them: the best genes, the best art, and the best ideas. Four billion years of death has served us well.

Now, all this may be coming to an end, for one of the things we humans, with our evolved intelligence, are working hard at is trying to eradicate death. This is an understandable enterprise, for nobody wants to die—genes for wanting to die rarely last long in a species. For millennia, human thinkers have dreamed of conquering old age and death: the fight against it permeates our art and culture, and much of our science. We personify death as a specter and loathe it, fear it and associate it with all that is bad in the world. If we could conquer it, how much better life would become.

Celebrated filmmaker Terry Gilliam leans toward the philosophical with an answer somewhere between John Cage and Yoda:

I’ve given up asking questions. I merely float on a tsunami of acceptance of anything life throws at me… and marvel stupidly.

Music pioneer Brian Eno, a man of strong opinions on art and unconventional approaches to creativity, is concerned that we see politics, a force that impacts our daily lives on nearly every level, as something other people do:

Most of the smart people I know want nothing to do with politics. We avoid it like the plague — like Edge avoids it, in fact. Is this because we feel that politics isn’t where anything significant happens? Or because we’re too taken up with what we’re doing, be it Quantum Physics or Statistical Genomics or Generative Music? Or because we’re too polite to get into arguments with people? Or because we just think that things will work out fine if we let them be — that The Invisible Hand or The Technosphere will mysteriously sort them out?

Whatever the reasons for our quiescence, politics is still being done — just not by us. It’s politics that gave us Iraq and Afghanistan and a few hundred thousand casualties. It’s politics that’s bleeding the poorer nations for the debts of their former dictators. It’s politics that allows special interests to run the country. It’s politics that helped the banks wreck the economy. It’s politics that prohibits gay marriage and stem cell research but nurtures Gaza and Guantanamo.

But we don’t do politics. We expect other people to do it for us, and grumble when they get it wrong. We feel that our responsibility stops at the ballot box, if we even get that far. After that we’re as laissez-faire as we can get away with.

What worries me is that while we’re laissez-ing, someone else is faire-ing.

Barbara Strauch, science editor of The New York Times, echoes Richard Feynman’s lament about the general public’s scientific ignorance — not the good kind, but the kind that leads to the resurgence of preventable diseases — and about the dismal state of science education. She sees oases of hope in that desert of ignorance but finds the disconnect worrisome:

Something quite serious has been lost. . . . This decline in general-interest science coverage comes at a time of divergent directions in the general public. At one level, there seems to be increasing ignorance. After all, it’s not just science news coverage that has suffered, but also the teaching of science in schools. And we just went through a political season that saw how all this can play out, with major political figures spouting off one silly statement after another, particularly about women’s health. . . .

But something else is going on, as well. Even as we have in some pockets what seems like increasing ignorance of science, we have, at the same time, a growing interest of many. It’s easy to see, from where I sit, how high that interest is. Articles about anything scientific, from the current findings in human evolution to the latest rover landing on Mars, not to mention new genetic approaches to cancer — and yes, even the Higgs boson — zoom to the top of our newspaper’s most emailed list.

We know our readers love science and cannot get enough of it. And it’s not just our readers. As the rover Curiosity approached Mars, people of all ages in all parts of the country had “Curiosity parties” to watch news of the landing. Mars parties! Social media, too, has shown us how much interest there is across the board, with YouTube videos and tweets on science often becoming instant megahits.

So what we have is a high interest and a lot of misinformation floating around. And we have fewer and fewer places that provide real information to a general audience that is understandable, at least by those of us who do not yet have our doctorates in astrophysics. The disconnect is what we should all be worried about.

Nicholas Carr, author of the techno-dystopian The Shallows: What the Internet Is Doing to Our Brains, considers the effects that digital communication might be having on our intricate internal clocks and the strange ways in which our brains warp time:

I’m concerned about time — the way we’re warping it and it’s warping us. Human beings, like other animals, seem to have remarkably accurate internal clocks. Take away our wristwatches and our cell phones, and we can still make pretty good estimates about time intervals. But that faculty can also be easily distorted. Our perception of time is subjective; it changes with our circumstances and our experiences. When things are happening quickly all around us, delays that would otherwise seem brief begin to feel interminable. Seconds stretch out. Minutes go on forever. . . .

Given what we know about the variability of our time sense, it seems clear that information and communication technologies would have a particularly strong effect on personal time perception. After all, they often determine the pace of the events we experience, the speed with which we’re presented with new information and stimuli, and even the rhythm of our social interactions. That’s been true for a long time, but the influence must be particularly strong now that we carry powerful and extraordinarily fast computers around with us all day long. Our gadgets train us to expect near-instantaneous responses to our actions, and we quickly get frustrated and annoyed at even brief delays.

I know that my own perception of time has been changed by technology. . . .

As we experience faster flows of information online, we become, in other words, less patient people. But it’s not just a network effect. The phenomenon is amplified by the constant buzz of Facebook, Twitter, texting, and social networking in general. Society’s “activity rhythm” has never been so harried. Impatience is a contagion spread from gadget to gadget.

One of the gravest yet most lucid and important admonitions comes from classicist-turned-technologist Tim O’Reilly, who echoes Susan Sontag’s concerns about anti-intellectualism and cautions that the plague of ignorance might spread far enough to drive our civilization into extinction:

For so many in the techno-elite, even those who don’t entirely subscribe to the unlimited optimism of the Singularity, the notion of perpetual progress and economic growth is somehow taken for granted. As a former classicist turned technologist, I’ve always lived with the shadow of the fall of Rome, the failure of its intellectual culture, and the stasis that gripped the Western world for the better part of a thousand years. What I fear most is that we will lack the will and the foresight to face the world’s problems squarely, but will instead retreat from them into superstition and ignorance.

[…]

History teaches us that conservative, backward-looking movements often arise under conditions of economic stress. As the world faces problems ranging from climate change to the demographic cliff of aging populations, it’s wise to imagine widely divergent futures.

Yes, we may find technological solutions that propel us into a new golden age of robots, collective intelligence, and an economy built around “the creative class.” But it’s at least as probable that as we fail to find those solutions quickly enough, the world falls into apathy, disbelief in science and progress, and after a melancholy decline, a new dark age.

Civilizations do fail. We have never yet seen one that hasn’t. The difference is that the torch of progress has in the past always passed to another region of the world. But we’ve now, for the first time, got a single global civilization. If it fails, we all fail together.

Biological anthropologist Helen Fisher, who studies the brain on love and whose Why We Love remains indispensable, worries that we misunderstand men. She cites her research for some findings that counter common misconceptions and illustrate how gender stereotypes limit us:

Men fall in love faster too — perhaps because they are more visual. Men experience love at first sight more regularly; and men fall in love just as often. Indeed, men are just as physiologically passionate. When my colleagues and I have scanned men’s brains (using fMRI), we have found that they show just as much activity as women in neural regions linked with feelings of intense romantic love. Interestingly, in the 2011 sample, I also found that when men fall in love, they are faster to introduce their new partner to friends and parents, more eager to kiss in public, and want to “live together” sooner. Then, when they are settled in, men have more intimate conversations with their wives than women do with their husbands—because women have many of their intimate conversations with their girlfriends. Last, men are just as likely to believe you can stay married to the same person forever (76% of both sexes). And other data show that after a break up, men are 2.5 times more likely to kill themselves.

[…]

In the Iliad, Homer called love “magic to make the sanest man go mad.” This brain system lives in both sexes. And I believe we’ll make better partnerships if we embrace the facts: men love — just as powerfully as women.

David Rowan, editor of Wired UK and scholar of the secrets of entrepreneurship, worries about the growing disconnect between the data-rich and the data-poor:

Each day, according to IBM, we collectively generate 2.5 quintillion bytes — a tsunami of structured and unstructured data that’s growing, in IDC’s reckoning, at 60 per cent a year. Walmart drags a million hourly retail transactions into a database that long ago passed 2.5 petabytes; Facebook processes 2.5 billion pieces of content and 500 terabytes of data each day; and Google, whose YouTube division alone gains 72 hours of new video every minute, accumulates 24 petabytes of data in a single day. . . . Certainly there are vast public benefits in the smart processing of these zetta- and yottabytes of previously unconstrained zeroes and ones. . . .

Yet as our lives are swept unstoppably into the data-driven world, such benefits are being denied to a fast-emerging data underclass. Any citizen lacking a basic understanding of, and at least minimal access to, the new algorithmic tools will increasingly be disadvantaged in vast areas of economic, political and social participation. The data disenfranchised will find it harder to establish personal creditworthiness or political influence; they will be discriminated against by stock markets and by social networks. We need to start seeing data literacy as a requisite, fundamental skill in a 21st-century democracy, and to campaign — and perhaps even to legislate — to protect the interests of those being left behind.
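The compounding in Rowan’s figures is easy to miss. Here is a minimal back-of-the-envelope sketch in Python, assuming IBM’s 2.5-quintillion-bytes-per-day figure as a baseline and IDC’s 60 percent annual growth rate held constant (a simplification for illustration, not a forecast from the book):

```python
# Back-of-the-envelope projection of daily data volume, assuming
# the 60%-per-year growth rate cited above stays constant.
DAILY_BYTES_BASELINE = 2.5e18  # 2.5 quintillion bytes/day (IBM figure)
GROWTH_RATE = 0.60             # 60% per year (IDC figure)

for years in range(0, 11, 2):
    daily = DAILY_BYTES_BASELINE * (1 + GROWTH_RATE) ** years
    print(f"After {years:2d} years: {daily / 1e18:8.1f} quintillion bytes/day")
```

At a steady 60 percent a year, the daily volume doubles roughly every year and a half, which is how quintillions become the “zetta- and yottabytes” Rowan mentions.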

Some, like social and cognitive scientist Dan Sperber, go meta, admonishing that our worries about worrying are ushering in a new age of anxiety, the consequences of which are debilitating:

Worrying is an investment of cognitive resources laced with emotions from the anxiety spectrum and aimed at solving some specific problem. It has its costs and benefits, and so does not worrying. Worrying for a few minutes about what to serve for dinner in order to please one’s guests may be a sound investment of resources. Worrying about what will happen to your soul after death is a total waste. Human ancestors and other animals with foresight may have only worried about genuine and pressing problems such as not finding food or being eaten. Ever since they have become much more imaginative and have fed their imagination with rich cultural inputs, that is, since at least 40,000 years (possibly much more), humans have also worried about improving their lot individually and collectively — sensible worries — and about the evil eye, the displeasure of dead ancestors, the purity of their blood — misplaced worries.

A new kind of misplaced worries is likely to become more and more common. The ever-accelerating current scientific and technological revolution results in a flow of problems and opportunities that presents unprecedented cognitive and decisional challenges. Our capacity to anticipate these problems and opportunities is swamped by their number, novelty, speed of arrival, and complexity.

[…]

What I am particularly worried about is that humans will be less and less able to appreciate what they should really be worrying about and that their worries will do more harm than good. Maybe, just as on a boat in rapids, one should try not to slow down anything but just to optimize a trajectory one does not really control, not because safety is guaranteed and optimism is justified — the worst could happen — but because there is no better option than hope.

Mathematician and economist Eric R. Weinstein considers our conventional wisdom on what it takes to cultivate genius, including the myth of the 10,000 hours rule, and argues instead that the pursuit of excellence is a social malady that gets us nowhere meaningful:

We cannot excel our way out of modern problems. Within the same century, we have unlocked the twin nuclei of both cell and atom and created the conditions for synthetic biological and even digital life with computer programs that can spawn with both descent and variation on which selection can now act. We are in genuinely novel territory which we have little reason to think we can control; only the excellent would compare these recent achievements to harmless variations on the invention of the compass or steam engine. So surviving our newfound god-like powers will require modes that lie well outside expertise, excellence, and mastery.

Going back to Sewall Wright’s theory of adaptive landscapes of fitness, we see four modes of human achievement paired with what might be considered their more familiar accompanying archetypes:

A) Climbing—Expertise: Moving up the path of steepest ascent towards excellence for admission into a community that holds and defends a local maximum of fitness.

B) Crossing—Genius: Crossing the ‘Adaptive Valley’ to an unknown and unoccupied even higher maximum level of fitness.

C) Moving—Heroism: Moving ‘mountains of fitness’ for one’s group.

D) Shaking—Rebellion: Leveling peaks and filling valleys for the purpose of changing the landscape to be more even.

The essence of genius as a modality is that it seems to reverse the logic of excellence.
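Weinstein borrows the “adaptive landscape” from evolutionary theory, where it has a precise computational analogue: a greedy hill-climber converges to the nearest local maximum and, because it never accepts a downhill step, can never cross a valley to a higher peak. A toy sketch in Python (the two-peak fitness function is invented for illustration, not drawn from Wright’s or Weinstein’s work):

```python
import math

def fitness(x):
    # Invented two-peak landscape: a modest local peak near x = 2
    # and a higher peak near x = 8, separated by a fitness valley.
    return math.exp(-(x - 2) ** 2) + 2 * math.exp(-((x - 8) ** 2) / 4)

def hill_climb(x, step=0.1, iters=500):
    # Greedy ascent ("climbing" in Weinstein's taxonomy): always move
    # to the best nearby point, never accept a loss of fitness.
    for _ in range(iters):
        x = max((x - step, x, x + step), key=fitness)
    return x

print(round(hill_climb(0.0), 1))  # -> 2.0, stuck on the local "excellence" peak
print(round(hill_climb(6.0), 1))  # -> 8.0, only because it started past the valley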

He adds the famous anecdote of Feynman’s Challenger testimony:

In the wake of the Challenger disaster, Richard Feynman was mistakenly asked to become part of the Rogers commission investigating the accident. In a moment of candor Chairman Rogers turned to Neil Armstrong in a men’s room and said “Feynman is becoming a real pain.” Such is ever the verdict pronounced by steady hands over great spirits. But the scariest part of this anecdote is not the story itself but the fact that we are, in the modern era, now so dependent on old Feynman stories having no living heroes with whom to replace him: the ultimate tragic triumph of runaway excellence.

This view, however, is remarkably narrow and defeatist. As Voltaire memorably remarked, “Appreciation is a wonderful thing: It makes what is excellent in others belong to us as well.” Without appreciation for the Feynmans of the past we duly don our presentism blinders and refuse to acknowledge the fact that genius is a timeless quality that belongs to all ages, not a cultural commodity of the present. Many of our present concerns have been addressed with enormous prescience in the past, often providing more thoughtful and richer answers than we are able to today, whether it comes to the value of space exploration or the economics of media or the essence of creativity or even the grand question of how to live. Having “living heroes” is an admirable aspiration, but they should never replace — only enhance and complement — the legacy and learnings of those who came before.

Indeed, this presentism bias is precisely what Noga Arikha, historian of ideas and author of Passions and Tempers: A History of the Humours, points to as her greatest worry in one of the most compelling answers. It’s something I’ve voiced as well in a recent interview with the Guardian. Arikha writes:

I worry about the prospect of collective amnesia.

While access to information has never been so universal as it is now — thanks to the Internet — the total sum of knowledge of anything beyond the present seems to be dwindling among those people who came of age with the Internet. Anything beyond 1945, if then, is a messy, remote landscape; the centuries melt into each other in an insignificant magma. Famous names are flickers on a screen, their dates irrelevant, their epochs dusty. Everything is equalized.

She points to a necessary antidote to this shallowing of our cultural hindsight:

There is a way out: by integrating the teaching of history within the curricula of all subjects—using whatever digital or other means we have to redirect attention to slow reading and old sources. Otherwise we will be condemned to living without perspective, robbed of the wisdom and experience with which to build for the future, confined by the arrogance of our presentism to repeating history without noticing it.

Berkeley developmental psychologist Alison Gopnik, author of The Philosophical Baby: What Children’s Minds Tell Us About Truth, Love, and the Meaning of Life, worries that much of modern parenting is concerned with the wrong things — particularly the push for overachievement — when evidence strongly indicates that the art of presence is the most important gift a parent can bestow upon a child:

Thinking about children, as I do for a living, and worrying go hand in hand. There is nothing in human life so important and urgent as raising the next generation, and yet it also feels as if we have very little control over the outcome. . . .

[But] “parenting” worries focus on relatively small variations in what parents and children do — co-sleeping or crying it out, playing with one kind of toy rather than another, more homework or less. There is very little evidence that any of this makes much difference to the way that children turn out in the long run. There is even less evidence that there is any magic formula for making one well-loved and financially supported child any smarter or happier or more successful as an adult than another.

Instead, she argues, it is neglect that parents should be most worried about — a moral intuition as old as the world, yet one lamentably diluted by modern parents’ misguided concerns:

More recently research into epigenetics has helped demonstrate just how the mechanisms of care and neglect work. Research in sociology and economics has shown empirically just how significant the consequences of early experience actually can be. The small variations in middle-class “parenting” make very little difference. But providing high-quality early childhood care to children who would otherwise not receive it makes an enormous and continuing difference up through adulthood. In fact, the evidence suggests that this isn’t just a matter of teaching children particular skills or kinds of knowledge—a sort of broader institutional version of “parenting.” Instead, children who have a stable, nurturing, varied early environment thrive in a wide range of ways, from better health to less crime to more successful marriages. That’s just what we’d expect from the evolutionary story. I worry more and more about what will happen to the generations of children who don’t have the uniquely human gift of a long, protected, stable childhood.

Journalist Rolf Dobelli, author of The Art of Thinking Clearly, offers an almost Alan Wattsian concern about the paradox of material progress:

As mammals, we are status seekers. Non-status seeking animals don’t attract suitable mating partners and eventually exit the gene pool. Thus goods that convey high status remain extremely important, yet out of reach for most of us. Nothing technology brings about will change that. Yes, one day we might re-engineer our cognition to reduce or eliminate status competition. But until that point, most people will have to live with the frustrations of technology’s broken promise. That is, goods and services will be available to everybody at virtually no cost. But at the same time, status-conveying goods will inch even further out of reach. That’s a paradox of material progress.

Columbia biologist Stuart Firestein, author of the fantastic Ignorance: How It Drives Science and champion of “thoroughly conscious ignorance,” worries about our unreasonable expectations of science:

Much of science is failure, but it is a productive failure. This is a crucial distinction in how we think about failure. More importantly, not all wrong science is bad science. As with the exaggerated expectations of scientific progress, expectations about the validity of scientific results have simply become overblown. Scientific “facts” are all provisional, all needing revision or sometimes even outright upending. But this is not bad; indeed it is critical to continued progress. Granted it’s difficult, because you can’t just believe everything you read. But let’s grow up and recognize that undeniable fact of life. . . .

So what’s the worry? That we will become irrationally impatient with science, with its wrong turns and occasional blind alleys, with its temporary results that need constant revision. And we will lose our trust and belief in science as the single best way to understand the physical universe. . . . From a historical perspective the path to discovery may seem clear, but the reality is that there are twists and turns and reversals and failures and cul de sacs all along the path to any discovery. Facts are not immutable and discoveries are provisional. This is the messy process of science. We should worry that our unrealistic expectations will destroy this amazing mess.

Neuroscientist Sam Harris, who has previously explored the psychology of lying, is concerned about bad incentives that bring out the worst in us, as individuals and as a society:

We need systems that are wiser than we are. We need institutions and cultural norms that make us better than we tend to be. It seems to me that the greatest challenge we now face is to build them.

Writer Douglas Rushkoff, author of Present Shock: When Everything Happens Now, offers a poignant and beautifully phrased, if exceedingly anthropocentric, concern:

We should worry less about our species losing its biosphere than losing its soul.

Our collective perceptions and cognition are our greatest evolutionary achievement. This is the activity that gives biology its meaning. Our human neural network is in the process of deteriorating and our perceptions are becoming skewed — both involuntarily and by our own hand — and all that most of us in the greater scientific community can do is hope that somehow technology picks up the slack, providing more accurate sensors, faster networks, and a new virtual home for complexity.

We should worry such networks won’t be able to function without us; we should also worry that they will.

Harvard’s Lisa Randall, one of the world’s leading theoretical physicists and the author of, most recently, Knocking on Heaven’s Door: How Physics and Scientific Thinking Illuminate the Universe and the Modern World, worries about the decline in major long-term investments in research of the kind that made the Large Hadron Collider possible, a decline that would diminish our capacity for exploring the most intensely fascinating aspects of the unknown:

I’m worried I won’t know the answer to questions I care deeply about. Theoretical research (what I do) can of course be done more cheaply. A pencil and paper and even a computer are pretty cheap. But without experiments, or the hope of experiments, theoretical science can’t truly advance either.

One of the most poignant answers comes from psychologist Susan Blackmore, author of Consciousness: An Introduction, who admonishes that we’re disconnecting our heads from our hands by outsourcing so much of our manual humanity to machines, in the process amputating the present for the sake of some potential future. She writes:

What should worry us is that we seem to be worrying more about the possible disasters that might befall us than who we are becoming right now.

From ‘Things I have learned in my life so far’ by Stefan Sagmeister.

What Should We Be Worried About? is an awakening read in its entirety. For more of Brockman’s editorial-curatorial mastery, revisit the Edge Question compendiums from 2013 and 2012, and see Nobel-winning behavioral economist Daniel Kahneman on the marvels and flaws of our intuition.


Published February 11, 2014

https://www.themarginalian.org/2014/02/11/brockman-what-should-we-be-worried-about/
