In schools and colleges, in these audio-visual days, doubt has been raised as to the future of reading — whether the printed word is on its last legs. One college president has remarked that in fifty years “only five per cent of the people will be reading.” For this, of course, one must be prepared. But how prepare? To us it would seem that even if only one person out of a hundred and fifty million should continue as a reader, he would be the one worth saving, the nucleus around which to found a university. We think this not impossible person, this Last Reader, might very well stand in the same relation to the community as the queen bee to the colony of bees, and that the others would quite properly dedicate themselves wholly to his welfare, serving special food and building special accommodations. From his nuptial, or intellectual, flight would come the new race of men, linked perfectly with the long past by the unbroken chain of the intellect, to carry on the community. But it is more likely that our modern hive of bees, substituting a coaxial cable for spinal fluid, will try to perpetuate the race through audio-visual devices, which ask no discipline of the mind and which are already giving the room the languor of an opium parlor.
How reminiscent this is of Virginia Woolf’s 1926 meditation on the moving image, in which she pondered the audiovisual mesmerism of cinema: “The eye licks it all up instantaneously, and the brain, agreeably titillated, settles down to watch things happening without bestirring itself to think.” White sees reading as the antidote to this mental resignation, likening the mutuality of its ecstasy to that of sex — something he knows a thing or two about:
Reading is the work of the alert mind, is demanding, and under ideal conditions produces finally a sort of ecstasy. As in the sexual experience, there are never more than two persons present in the act of reading — the writer, who is the impregnator, and the reader, who is the respondent. This gives the experience of reading a sublimity and power unequalled by any other form of communication.
He concludes by urging for the conservation of reading, both in education and in culture at large:
It would be just as well, we think, if educators clung to this great phenomenon and did not get sidetracked, for although books and reading may at times have played too large a part in the educational process, that is not what is happening today. Indeed, there is very little true reading, and not nearly as much writing as one would suppose from the towering piles of pulpwood in the dooryards of our paper mills. Readers and writers are scarce, as are publishers and reporters. The reports we get nowadays are those of men who have not gone to the scene of the accident, which is always farther inside one’s own head than it is convenient to penetrate without galoshes.
“If we wish to appreciate the role that genius has played in the modern world, we must [remember] that genius is ultimately the product of the hopes and longings of ordinary people.”
By Maria Popova
“Genius is nothing more nor less than doing well what anyone can do badly,” celebrated British novelist Amelia E. Barr wrote in her 9 rules for success in 1901. Indeed, the notion of what genius is and isn’t endures as one of our culture’s greatest fixations. We apply the label of “genius” to everyone from our greatest luminaries to exceptional children’s book editors to our dogs, and we even nickname prestigious cultural awards after it. But what, precisely, is genius? Why was the concept of it born in the first place, where did it begin, how did it evolve, and what does it mean today? That’s precisely what historian Darrin M. McMahon explores in Divine Fury: A History of Genius (public library) — a fascinating, first-of-its-kind chronicle of the evolution of genius as a cultural concept, its permutations across millennia of creative history, and its more recent role as a social equalizer and a double-edged sword of democratization.
Even today, more than 2,000 years after its first recorded use by the Roman author Plautus, [the word “genius”] continues to resonate with power and allure. The power to create. The power to divine the secrets of the universe. The power to destroy. With its hints of madness and eccentricity, sexual prowess and protean possibility, genius remains a mysterious force, bestowing on those who would assume it superhuman abilities and godlike powers. Genius, conferring privileged access to the hidden workings of the world. Genius, binding us still to the last vestiges of the divine.
Such lofty claims may seem excessive in an age when football coaches and rock stars are frequently described as “geniuses.” The luster of the word — once reserved for a pantheon of eminence, the truly highest of the high — has no doubt faded over time, the result of inflated claims and general overuse. The title of a BBC television documentary on the life of the Nobel Prize-winning physicist Richard Feynman sums up the situation: No Ordinary Genius. There was a time when such a title would have been redundant. That time is no more.
McMahon argues that, in an age where we’re urged to explore the “genius” in all of us, we’ve grown increasingly obsessed with the word and the idea of genius, robbing it of substance in the process. Particularly in the last century, we’ve applied the label of “genius” frivolously and indiscriminately to everyone from rock stars to startup founders to, even, Adolf Hitler, whom TIME magazine crowned “man of the year” in 1938 for his evil genius. And yet the impulse to know — to be — genius is among our greatest, most profound human yearnings for union with divinity, something the legendary literary critic Harold Bloom has explored in his own meditation on genius. For the perfect embodiment of this desire, McMahon points to Albert Einstein, whom he considers “the quintessential modern genius”:
“I want to know how God created the world,” Einstein once observed. “I want to know his thoughts.” It was, to be sure, a manner of speaking, like the physicist’s celebrated line about the universe and dice. Still, the aspiration is telling. For genius, from its earliest origins, was a religious notion, and as such was bound up not only with the superhuman and transcendent, but also with the capacity for violence, destruction, and evil that all religions must confront.
McMahon sets out to unravel this lineage of unexpected associations by tracing the history of genius, both as a concept and as a figure, from antiquity to today, exploring a vibrant spectrum of individuals who both embodied and shaped the label — poets, philosophers, artists, scientists, inventors, composers, military strategists, entrepreneurs, and even a horse. As much a history of ideas as a psychological history of our grasping after the divine, the journey he takes us on is above all one of introspection through the lens of history. Reminding us that, as Toni Morrison memorably wrote, “definitions belong to the definers, not the defined,” McMahon argues for the social construction of genius:
If we wish to appreciate the role that genius has played in the modern world, we must recall the evil with the good, bearing in mind as we do so the uncomfortable thought that genius is ultimately the product of the hopes and longings of ordinary people. We are the ones who marvel and wonder, longing for the salvation genius might bring. We are the ones who pay homage and obeisance. In a very real sense, the creator of genius is us.
Which is not to deny that geniuses almost always possess something special, something real, however elusive that something may be. But it is to recognize the commonsense fact that genius is in part a social creation — what historians like to call a “construction” — and, as such, of service to those who build. That fact reminds us further that for all their originality (and originality is itself a defining feature of genius in its modern form), extraordinary human beings not only define their images but embody them, stepping into molds prepared by the social imaginary and the exemplars who came before. Even outliers as remarkable, as deviant, as Einstein and Hitler are no exceptions to this rule: however inimitable — however unique — their genius was partly prepared for them, worked out over the course of generations.
This construction of genius as “a figure of extraordinary privilege and power,” McMahon argues, began in ancient Greece, where the era’s luminaries — poets, philosophers, politicians — first pondered the question of what makes a great man. (For, as McMahon explores in a later chapter, the original concept of genius was an exercise in cultural hegemony excluding women and various “others.”) The Romans picked up the inquiry where the Greeks had left off, seeking to understand what lent Julius Caesar his military might and why Homer could enchant as he did. This quest continued through Christianity, which attempted to answer it with the image of the God-man Christ, the ultimate genius. During the Renaissance, da Vinci and Michelangelo bent this fascination with Godlike genius through the lens of art, attempting both to capture it and to further illuminate its elusive nature.
And so we get to the modern genius. McMahon writes:
The modern genius was born in the eighteenth century—conceived, in keeping with long-standing prejudices, almost exclusively as a man. There were precedents for this birth, stretching all the way back to antiquity. But that the birth itself occurred in the bright place of deliverance we call “the Enlightenment” is clear. Scholars have long recognized the genius’s emergence in this period as the highest human type, a new paragon of human excellence who was the focus of extensive contemporary comment and observation.
What remains a mystery, however, is why the genius emerged in the first place, and why it did under those specific circumstances of time and place. Tracing scholars’ attempts to answer these questions, McMahon points to several factors, ranging from the rise of capitalism to the evolution of aesthetics to new models of authorship and selfhood. But his own explanation has to do with something else entirely: the religious change described as the “withdrawal from God” — a collective pulling back from spiritual companions, which opened up a space for humans to embrace self-reliance as we came to entrust ourselves with the fate of our civilization and our individual lives. That, in turn, catalyzed the birth of the modern genius — at once a stand-in for God and a testament to the human spirit at its highest potential. McMahon frames the shift:
Geniuses mediated between human beings and the divine. Chosen to reveal wonders, geniuses were conceived as wonders themselves, illustrating perfectly the proposition that the gradual disenchantment of the world was accompanied from the outset by its continual re-enchantment. Geniuses pulled back the curtain of existence to reveal a universe that was richer, deeper, more extraordinary and terrible than previously imagined. The baffling beauty of space-time was no different in this respect from the sublime majesty of Byron’s poetry, Beethoven’s symphonies, or Poincaré’s theorems, as radiant as an Edison light bulb or the explosion of the atomic bomb. Genius was a flash of light, but its brilliance served to illuminate the dark mystery that surrounded and set it apart.
Geniuses, then, were believed to possess rare and special powers: the power to create, redeem, and destroy; the power to penetrate the fabric of the universe; the power to see into the future, or to see into our souls.
By the early twentieth century, geniuses had risen to greater cultural authority and a new, scientifically driven movement to understand the nature of genius was afoot. The IQ test was invented. Dominant political ideologies sought to justify the worship of their leaders — from Stalin to Hitler — through a quasi-religious cult of genius. A new generation of geniuses, from Einstein to Twain, entered the realm of pop-culture celebrity. Then, as is our tendency as a culture of extremes, we took it too far. McMahon worries:
Genius is seemingly everywhere today, hailed in our newspapers and glossy magazines, extolled in our television profiles and Internet chatter. Replete with publicists, hashtags, and “buzz,” genius is now consumed by a celebrity culture that draws few distinctions between a genius for fashion, a genius for business, and a genius for anything else. If the “problem of genius” of yesteryear was how to know and how to find it, “our genius problem” today is that it is impossible to avoid. Genius remains a relationship, but our relationship to it has changed. All might have their fifteen minutes of genius. All might be geniuses now. … [But] a world in which all might aspire to genius is a world in which the genius as a sacred exception can no longer exist. Einstein, the “genius of geniuses,” was the last of the titans. The age of the genius is gone. Should citizens of democracies mourn this passing or rejoice? Probably a bit of both. The genius is dead: long live the genius of humanity.
We’ve also grown increasingly obsessed with dissecting “genius” — from literally dissecting Einstein’s brain to being transfixed by the daily rituals of geniuses, as though emulating those would somehow sprinkle some of their pixie dust on our own ordinary lives. Rather than a cultural tragedy to lament, however, McMahon reminds us that this is merely a manifestation of our intense yearning for transcendence, for touching the extraordinary:
Einstein’s brain had become a “mythical object,” and Einstein’s genius a myth, which served to mediate the secrets of the universe and to comfort us in our darkness and insecurity. The genius of Einstein resembles even now what the ancients once called a “middle term” of the universe, shuttling between ordinary human beings and the heavens. The divinum quiddam of his brain provides a glimpse of another dimension; it is a portal to a mysterious realm.
The celebrated political theorist Hannah Arendt addressed this in a 1958 essay admonishing against the “vulgarization and commercialization of the notion of genius,” driven by “the great reverence the modern age so willingly paid to genius, so frequently bordering on idolatry.” But rather than a cheapening of the notion of genius, McMahon argues, this shift bespeaks a certain democratization that broadens traditional definitions to make genius a more inclusive concept, especially after the end of WWII:
This trickling down (or welling up) of genius along the vertical axis leading from high culture to low was accompanied, as well, by a horizontal expansion, a pushing outward of gender boundaries and geographical frontiers.
Touching on, though not naming, Howard Gardner’s seminal 1983 theory of multiple intelligences — the necessary antidote to the limitations of IQ — McMahon continues:
This gradual expansion of genius — in effect, its democratization and globalization — gathered momentum in the aftermath of 1945. The development marked, in some sense, a return to an older understanding of genius as a faculty possessed by all. That understanding, it is true, had never been entirely abandoned. Although men and women had spoken for centuries of genius as a general disposition or trait, Europeans, and especially Americans, continued long after the eighteenth century to acknowledge that different people might have a genius for different things.
And yet there is a downside to this democratization. Much like “curation,” which used to stand for something and now means nothing since we’ve applied it to everything, the ubiquity of “genius” renders its true manifestations all the more invisible:
If genius is everywhere, the genius is nowhere, or at least harder than ever to see. The same forces that have democratized and expanded genius’s kingdom have sent the genius into exile or to an early grave. That curious fact will become apparent if one tries to name a genius in the postwar world. Einstein comes immediately to mind, of course. But he is the exception who proves the rule. And though there are others — including artists, such as Pablo Picasso and Jackson Pollock, or scientists, such as J. Robert Oppenheimer and Richard Feynman — they tend either to be holdovers from an earlier age or fail to command common and overwhelming assent. The truth is that we live at a time when there is genius in all of us, but very few geniuses to be found.
Though McMahon allows for the possibility that genius can often only be recognized in retrospect, with the hindsight of generations — Shakespeare, after all, was only widely celebrated after the fact — he remains unconvinced that this dilution of “genius” is doing our culture justice:
Even if they now walk among us, we no longer regard geniuses as we once did; nor do we look to them for the same things that we did in the past. The religion of genius is a moribund faith: the genius is all but disenchanted.
Ultimately, however, McMahon turns to Emerson — the ultimate champion of self-reliance, who shaped the modern cultural ideal — for reassurance that everything is as it should be, even if it requires our constant mindfulness in recalibrating the genius of humanity:
As Ralph Waldo Emerson acknowledged of “the excess of influence” of great men, their “attractions warp us from our place.” But he also knew that it was natural to believe in them. “We feed on genius,” he said, we need it as sustenance to survive.
In an age as suspicious of “greatness” as our own, it is worth recalling that truth, and recalling that, although those who prostrate themselves before idols make themselves small, those who fail to take the measure of true stature are similarly diminished. Great men and great women still have their uses. As Emerson put it over a century and a half ago in a passage that serves as an epigraph to this book, the genius of humanity continues to be the right point of view of history. “Once you saw phoenixes: they are gone; the world is not therefore disenchanted.” May it never be.
Reflections on how to keep the center solid as you continue to evolve.
By Maria Popova
UPDATE: The fine folks of Holstee have turned these seven learnings into a gorgeous letterpress poster inspired by mid-century children’s book illustration.
On October 23, 2006, I sent a short email to a few friends at work — one of the four jobs I held while paying my way through college — with the subject line “brain pickings,” announcing my intention to start a weekly digest featuring five stimulating things to learn about each week, from a breakthrough in neuroscience to a timeless piece of poetry. “It should take no more than 4 minutes (hopefully much less) to read,” I promised. This was the inception of Brain Pickings. At the time, I neither planned nor anticipated that this tiny experiment would one day be included in the Library of Congress digital archive of “materials of historical importance” and the few friends would become millions of monthly readers all over the world, ranging from the Dutch high school student who wrote to me this morning to my 77-year-old grandmother in Bulgaria to the person in Wisconsin who mailed me strudel last week. (Thank you!) Above all, I had no idea that in the seven years to follow, this labor of love would become my greatest joy and most profound source of personal growth, my life and my living, my sense of purpose, my center. (For the curious, more on the origin story here.)
Looking back today on the thousands of hours I’ve spent researching and writing Brain Pickings and the countless collective hours of readership it has germinated — a smile-inducing failure on the four-minute promise — I choke up with gratitude for the privilege of this journey, for its endless rewards of heart, mind and spirit, and for all the choices along the way that made it possible. I’m often asked to offer advice to young people who are just beginning their own voyages of self-discovery, or those reorienting their calling at any stage of life, and though I feel utterly unqualified to give “advice” in that omniscient, universally wise sense the word implies, here are seven things I’ve learned in seven years of making those choices, of integrating “work” and life in such inextricable fusion, and in chronicling this journey of heart, mind and spirit — a journey that took, for whatever blessed and humbling reason, so many others along for the ride. I share these here not because they apply to every life and offer some sort of blueprint to existence, but in the hope that they might benefit your own journey in some small way, bring you closer to your own center, or even simply invite you to reflect on your own sense of purpose.
Allow yourself the uncomfortable luxury of changing your mind. Cultivate that capacity for “negative capability.” We live in a culture where one of the greatest social disgraces is not having an opinion, so we often form our “opinions” based on superficial impressions or the borrowed ideas of others, without investing the time and thought that cultivating true conviction necessitates. We then go around asserting these donned opinions and clinging to them as anchors to our own reality. It’s enormously disorienting to simply say, “I don’t know.” But it’s infinitely more rewarding to understand than to be right — even if that means changing your mind about a topic, an ideology, or, above all, yourself.
Do nothing for prestige or status or money or approval alone. As Paul Graham observed, “prestige is like a powerful magnet that warps even your beliefs about what you enjoy. It causes you to work not on what you like, but what you’d like to like.” Those extrinsic motivators are fine and can feel life-affirming in the moment, but they ultimately don’t make it thrilling to get up in the morning and gratifying to go to sleep at night — and, in fact, they can often distract and detract from the things that do offer those deeper rewards.
Be generous. Be generous with your time and your resources and with giving credit and, especially, with your words. It’s so much easier to be a critic than a celebrator. Always remember there is a human being on the other end of every exchange and behind every cultural artifact being critiqued. To understand and be understood, those are among life’s greatest gifts, and every interaction is an opportunity to exchange them.
Build pockets of stillness into your life. Meditate. Go for walks. Ride your bike going nowhere in particular. There is a creative purpose to daydreaming, even to boredom. The best ideas come to us when we stop actively trying to coax the muse into manifesting and let the fragments of experience float around our unconscious mind in order to click into new combinations. Without this essential stage of unconscious processing, the entire flow of the creative process is broken.
When people tell you who they are, Maya Angelou famously advised, believe them. Just as importantly, however, when people try to tell you who you are, don’t believe them. You are the only custodian of your own integrity, and the assumptions made by those who misunderstand who you are and what you stand for reveal a great deal about them and absolutely nothing about you.
Presence is far more intricate and rewarding an art than productivity. Ours is a culture that measures our worth as human beings by our efficiency, our earnings, our ability to perform this or that. The cult of productivity has its place, but worshipping at its altar daily robs us of the very capacity for joy and wonder that makes life worth living — for, as Annie Dillard memorably put it, “how we spend our days is, of course, how we spend our lives.”
“Expect anything worthwhile to take a long time.” This is borrowed from the wise and wonderful Debbie Millman, for it’s hard to better capture something so fundamental yet so impatiently overlooked in our culture of immediacy. The myth of the overnight success is just that — a myth — as well as a reminder that our present definition of success needs serious retuning. As I’ve reflected elsewhere, the flower doesn’t go from bud to blossom in one spritely burst and yet, as a culture, we’re uninterested in the tedium of the blossoming. But that’s where all the real magic unfolds in the making of one’s character and destiny.
Then, just for good measure, here are seven of my favorite pieces from the past seven years. (Yes, it is exactly like picking your favorite child — so take it with a grain of salt.)