
The Nature of the Fun: David Foster Wallace on Why Writers Write

“Fiction becomes a weird way to countenance yourself and to tell the truth instead of being a way to escape yourself or present yourself in a way you figure you will be maximally likable.”

On the heels of the highly anticipated new David Foster Wallace biography comes Both Flesh and Not: Essays (public library) — a collection spanning twenty years of Wallace’s nonfiction writing on subjects as wide-ranging as math, Borges, democracy, the U.S. Open, and the entire spectrum of human experience in between. Among the anthology’s finest is an essay titled “The Nature of the Fun” — a meditation on why writers write, encrusted in Wallace’s signature blend of self-conscious despondency, even more self-conscious optimism, and overwhelming self-awareness. It was originally published in 1998 in Fiction Writer and also included in the wonderful 1998 anthology Why I Write: Thoughts on the Craft of Fiction.

After offering an extended and rather gory metaphor for the writer’s creative output and a Zen parable about unpredictability, he gets to the meat of things:

In the beginning, when you first start out trying to write fiction, the whole endeavor’s about fun. You don’t expect anybody else to read it. You’re writing almost wholly to get yourself off. To enable your own fantasies and deviant logics and to escape or transform parts of yourself you don’t like. And it works – and it’s terrific fun. Then, if you have good luck and people seem to like what you do, and you actually start to get paid for it, and get to see your stuff professionally typeset and bound and blurbed and reviewed and even (once) being read on the a.m. subway by a pretty girl you don’t even know, it seems to make it even more fun. For a while. Then things start to get complicated and confusing, not to mention scary. Now you feel like you’re writing for other people, or at least you hope so. You’re no longer writing just to get yourself off, which — since any kind of masturbation is lonely and hollow — is probably good. But what replaces the onanistic motive? You’ve found you very much enjoy having your writing liked by people, and you find you’re extremely keen to have people like the new stuff you’re doing. The motive of pure personal fun starts to get supplanted by the motive of being liked, of having pretty people you don’t know like you and admire you and think you’re a good writer. Onanism gives way to attempted seduction, as a motive. Now, attempted seduction is hard work, and its fun is offset by a terrible fear of rejection. Whatever “ego” means, your ego has now gotten into the game. Or maybe “vanity” is a better word. Because you notice that a good deal of your writing has now become basically showing off, trying to get people to think you’re good. This is understandable. You have a great deal of yourself on the line, writing — your vanity is at stake. You discover a tricky thing about fiction writing; a certain amount of vanity is necessary to be able to do it at all, but any vanity above that certain amount is lethal.

Here, Wallace echoes Vonnegut, who famously advised, “Write to please just one person. If you open a window and make love to the world, so to speak, your story will get pneumonia.” Indeed, this lusting after prestige and approval is a familiar detriment to creative purpose in any endeavor. Wallace goes on:

At some point you find that 90% of the stuff you’re writing is motivated and informed by an overwhelming need to be liked. This results in shitty fiction. And the shitty work must get fed to the wastebasket, less because of any sort of artistic integrity than simply because shitty work will cause you to be disliked. At this point in the evolution of writerly fun, the very thing that’s always motivated you to write is now also what’s motivating you to feed your writing to the wastebasket. This is a paradox and a kind of double-bind, and it can keep you stuck inside yourself for months or even years, during which period you wail and gnash and rue your bad luck and wonder bitterly where all the fun of the thing could have gone.

He adds to literary history’s most famous insights on the relationship between truth and fiction:

The smart thing to say, I think, is that the way out of this bind is to work your way somehow back to your original motivation — fun. And, if you can find your way back to fun, you will find that the hideously unfortunate double-bind of the late vain period turns out really to have been good luck for you. Because the fun you work back to has been transfigured by the extreme unpleasantness of vanity and fear, an unpleasantness you’re now so anxious to avoid that the fun you rediscover is a way fuller and more large-hearted kind of fun. It has something to do with Work as Play. Or with the discovery that disciplined fun is more fun than impulsive or hedonistic fun. Or with figuring out that not all paradoxes have to be paralyzing. Under fun’s new administration, writing fiction becomes a way to go deep inside yourself and illuminate precisely the stuff you don’t want to see or let anyone else see, and this stuff usually turns out (paradoxically) to be precisely the stuff all writers and readers everywhere share and respond to, feel. Fiction becomes a weird way to countenance yourself and to tell the truth instead of being a way to escape yourself or present yourself in a way you figure you will be maximally likable. This process is complicated and confusing and scary, and also hard work, but it turns out to be the best fun there is.

He concludes on a Bradbury-like note:

The fact that you can now sustain the fun of writing only by confronting the very same unfun parts of yourself you’d first used writing to avoid or disguise is another paradox, but this one isn’t any kind of bind at all. What it is is a gift, a kind of miracle, and compared to it the rewards of strangers’ affection is as dust, lint.

Both Flesh and Not is excellent in its entirety and just as quietly, unflinchingly soul-stirring as “The Nature of the Fun.”


The Half-Life of Facts: Dissecting the Predictable Patterns of How Knowledge Grows

“No one learns something new and then holds it entirely independent of what they already know. We incorporate it into the little edifice of personal knowledge that we have been creating in our minds our entire lives.”

Concerns about the usefulness of knowledge and the challenges of information overload predate contemporary anxieties by decades, centuries, if not millennia. In The Half-Life of Facts: Why Everything We Know Has an Expiration Date (public library) — which gave us this fantastic illustration of how the Gutenberg press embodied combinatorial creativity — Samuel Arbesman explores why, in a world in constant flux with information proliferating at overwhelming rates, understanding the underlying patterns of how facts change equips us to better handle the uncertainty around us. (He defines a fact as “a bit of knowledge that we know, either as individuals or as a society, as something about the state of the world.”)

Arbesman writes in the introduction:

Knowledge is like radioactivity. If you look at a single atom of uranium, whether it’s going to decay — breaking down and unleashing its energy — is highly unpredictable. It might decay in the next second, or you might have to sit and stare at it for thousands, or perhaps even millions, of years before it breaks apart.

But when you take a chunk of uranium, itself made up of trillions upon trillions of atoms, suddenly the unpredictable becomes predictable. We know how uranium atoms work in the aggregate. As a group of atoms, uranium is highly regular. When we combine particles together, a rule of probability known as the law of large numbers takes over, and even the behavior of a tiny piece of uranium becomes understandable. If we are patient enough, half of a chunk of uranium will break down in 704 million years, like clockwork. This number — 704 million years — is a measurable amount of time, and it is known as the half-life of uranium.

It turns out that facts, when viewed as a large body of knowledge, are just as predictable. Facts, in the aggregate, have half-lives: We can measure the amount of time for half of a subject’s knowledge to be overturned. There is science that explores the rates at which new facts are created, new technologies developed, and even how facts spread. How knowledge changes can be understood scientifically.

This is a powerful idea. We don’t have to be at sea in a world of changing knowledge. Instead, we can understand how facts grow and change in the aggregate, just like radioactive materials. This book is a guide to the startling notion that our knowledge — even what each of us has in our head — changes in understandable and systematic ways.
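For the quantitatively curious, the arithmetic behind a half-life is simple exponential decay: the surviving fraction after a time t is (1/2) raised to t divided by the half-life. Below is a minimal sketch in Python (mine, not Arbesman’s) using the 704-million-year figure quoted above; the function name and the sample time points are purely illustrative, and applying the same formula to a body of facts assumes a field’s half-life has actually been measured.

```python
# Exponential decay: after time t, the surviving fraction is (1/2) ** (t / half_life).
# The 704-million-year half-life is the uranium figure Arbesman cites; using the
# same formula for "facts" presumes a field's half-life has been measured.

def surviving_fraction(t, half_life):
    """Fraction of the original quantity (atoms, or still-valid facts) left after time t."""
    return 0.5 ** (t / half_life)

uranium_half_life = 704e6  # years
for years in (352e6, 704e6, 1408e6):
    print(f"after {years / 1e6:.0f} million years: "
          f"{surviving_fraction(years, uranium_half_life):.0%} remains")
```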

Indeed, Arbesman’s conception depicts facts as the threads of which our networked knowledge and combinatorial creativity are woven:


Facts are how we organize and interpret our surroundings. No one learns something new and then holds it entirely independent of what they already know. We incorporate it into the little edifice of personal knowledge that we have been creating in our minds our entire lives. In fact, we even have a phrase for the state of affairs that occurs when we fail to do this: cognitive dissonance.

Facts, says Arbesman, live on a continuum from the very rapidly changing (like the stock market and the weather) to those whose pace of change is so slow it’s imperceptible to us (like the number of continents on Earth and the number of fingers on the human hand), in the mid-range of which live mesofacts — the facts that change at the meso, or middle, timescale. These include facts that change over a single lifetime. For instance, my grandmother, who celebrates her 76th birthday today, learned in grade school that there were a little over two billion people living on Earth and a hundred elements in the periodic table, but we’ve recently passed seven billion and there are now 118 known elements. But, rather than fretting about this impossibly rapid informational treadmill, Arbesman finds comfort in patterns:

Facts change in regular and mathematically understandable ways. And only by knowing the pattern of our knowledge’s evolution can we be better prepared for its change.

He offers a curious example of the exponential nature of knowledge through the history of scientific research:

If you look back in history you can get the impression that scientific discoveries used to be easy. Galileo rolled objects down slopes; Robert Hooke played with a spring to learn about elasticity; Isaac Newton poked around his own eye with a darning needle to understand color perception. It took creativity and knowledge (and perhaps lack of squeamishness or regard for one’s own well-being) to ask the right questions, but the experiments themselves could be very simple. Today, if you want to make a discovery in physics, it helps to be part of a ten-thousand-member team that runs a multibillion-dollar atom smasher. It takes even more money, more effort, and more people to find out new things.

Indeed, until very recently, no one was particularly interested in the increasing difficulty of discovery, but Arbesman and his team decided to measure just how much harder discovery is getting. He looked at the history of discovery in three specific areas — mammal species, asteroids, and chemical elements — and determined that size was a good proxy for ease of discovery: Smaller creatures and asteroids are harder to discover; in chemistry, he used inverse size, since larger elements are harder to create and detect. He plotted the results, and what emerged was a clear pattern of exponential decay in the ease of discovery:

What this means is that the ease of discovery doesn’t drop by the same amount every year — it declines by the same fraction each year, a sort of reverse compound interest. For example, the size of asteroids discovered annually gets 2.5 percent smaller each year. In the first few years, the ease of discovery drops off quickly; after early researchers pick the low-hanging fruit, it continues to ‘decay’ for a long time, becoming slightly harder without ever quite becoming impossible.
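To make “declining by the same fraction each year” concrete, here is a small illustrative sketch (again mine, not the book’s) that compounds the 2.5 percent annual shrinkage quoted above; the normalized starting value of 1.0 and the hundred-year horizon are assumptions chosen only to show the shape of the curve, which keeps falling yet never reaches zero.

```python
# "Reverse compound interest": the ease of discovery shrinks by the same fraction
# every year rather than by the same absolute amount, so it approaches zero
# without ever reaching it.
# The 2.5% annual decline is the asteroid figure quoted above; the starting value
# and the time points are illustrative assumptions.

ANNUAL_DECLINE = 0.025

def ease_of_discovery(year, start=1.0, decline=ANNUAL_DECLINE):
    """Normalized ease of discovery after the given number of years of decay."""
    return start * (1 - decline) ** year

for year in (0, 25, 50, 75, 100):
    print(f"year {year:3d}: ease = {ease_of_discovery(year):.2f}")
```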

And yet:

However it happens, scientific discovery marches forward. We are in an exceptional time, when the number of scientists is growing rapidly and the majority of all scientists who have ever lived are alive today. We have massive collaborative projects, from the Manhattan Project to particle accelerators, that have been and are unearthing secrets of our cosmos. Yet, while this era of big science has allowed for the shockingly fast accumulation of knowledge, this growth of science is not unexpected.

Arbesman highlights the practical application beyond the cerebral understanding of how knowledge becomes obsolete:

Scholars in the field of information science in the 1970s were concerned with understanding the half-life of knowledge for a specific reason: protecting libraries from being overwhelmed.

In our modern digital information age, this sounds strange. But in the 1970s librarians everywhere were coping with the very real implications of the exponential growth of knowledge: Their libraries were being inundated. They needed ways to figure out which volumes they could safely discard. If they knew the half-life of a book or article’s time to obsolescence, it would go a long way to providing a means of avoiding overloading a library’s capacity. Knowing the half-lives of a library’s volumes would give a librarian a handle on how long books should be kept before they are just taking up space on the shelves, without being useful.

So a burst of research was conducted into this area. Information scientists examined citation data, and even usage data in libraries, in order to answer such questions as, If a book isn’t taken out for decades, is it that important anymore? And should we keep it on our shelves?

These questions, of course, strike very close to home, given that much of what makes my own heart sing is the excavation of near-forgotten gems that are at once timeless and timely, but that rot away in the dusty corners of humanity’s intellectual library in a culture conditioned to fetishize the newest. In fact, contrary to what Arbesman suggests, those fears of the 1970s are not at all “strange” in the “digital information age” — if anything, they are, or should be, all the more acute given the self-perpetuating nature of our knowledge biases: the internet is wired to give more weight to information that a greater number of people have already seen, sending the near-forgotten into an increasingly rapid spiral to the bottom, however “timeless and timely” that information may inherently be.

Still, The Half-Life of Facts offers a fascinating and necessary look at the pace of human knowledge and what its underlying patterns might reveal about the secrets of intellectual progress, both for us as individuals and collectively, as a culture and a civilization.


The Beatles Perform Shakespeare in Color, 1964

The Fab Four take on “A Midsummer Night’s Dream.”

The Beatles: iconic craze-starters, tireless tourers, comic book heroes, vintage children’s book protagonists, animation pioneers, and one timelessly photogenic bunch. In 1964, the Fab Four added another art to their repertoire — live theater — when they performed Shakespeare’s “A Midsummer Night’s Dream,” in color, to the sound of shouting hecklers (scripted, part of the play) and someone yelling “Go back to Liverpool!” (unscripted, decidedly un-Shakespearean). Enjoy.

