Brain Pickings

Posts Tagged ‘psychology’

16 SEPTEMBER, 2013

Aesthetic Consumerism and the Violence of Photography: What Susan Sontag Teaches Us about Visual Culture and the Social Web

“Needing to have reality confirmed and experience enhanced by photographs is an aesthetic consumerism to which everyone is now addicted.”

Ever since its invention in 1839, the photographic image and its steady evolution have shaped our experience of reality — from chronicling our changing world and recording its diversity to helping us understand the science of emotion to anchoring us to consumer culture. But despite the meteoric rise of photography from a niche curiosity to a mass medium over the past century and a half, there’s something ineffably yet indisputably different about visual culture in the digital age — something at once singular and deeply rooted in the essence of the photographic image itself.

Though On Photography (public library) — the seminal collection of essays by Susan Sontag — was originally published in 1977, Sontag’s astute insight resonates with extraordinary timeliness today, shedding light on the psychology and social dynamics of visual culture online.

In the opening essay, “In Plato’s Cave,” Sontag contextualizes the question of how and why photographs came to grip us so powerfully:

Humankind lingers unregenerately in Plato’s cave, still reveling, its age-old habit, in mere images of the truth. But being educated by photographs is not like being educated by older, more artisanal images. For one thing, there are a great many more images around, claiming our attention. The inventory started in 1839 and since then just about everything has been photographed, or so it seems. This very insatiability of the photographing eye changes the terms of confinement in the cave, our world. In teaching us a new visual code, photographs alter and enlarge our notions of what is worth looking at and what we have a right to observe. They are a grammar and, even more importantly, an ethics of seeing. Finally, the most grandiose result of the photographic enterprise is to give us the sense that we can hold the whole world in our heads — as an anthology of images.

The lens, one of 100 ideas that changed photography.

More than anything, however, Sontag argues that the photographic image is a control mechanism we exert upon the world — upon our experience of it and upon others’ perception of our experience:

Photographs really are experience captured, and the camera is the ideal arm of consciousness in its acquisitive mood. To photograph is to appropriate the thing photographed. It means putting oneself into a certain relation to the world that feels like knowledge — and, therefore, like power.

What makes this insight particularly prescient is that Sontag arrived at it more than three decades before the age of the social media photostream — the ultimate attempt to control, frame, and package our lives — our idealized lives — for presentation to others, and even to ourselves. The aggression Sontag sees in this purposeful manipulation of reality through the idealized photographic image applies even more poignantly to the aggressive self-framing we practice as we portray ourselves pictorially on Facebook, Instagram, and the like:

Images which idealize (like most fashion and animal photography) are no less aggressive than work which makes a virtue of plainness (like class pictures, still lifes of the bleaker sort, and mug shots). There is an aggression implicit in every use of the camera.

Online, thirty-some years after Sontag’s observation, this aggression precipitates a kind of social media violence of self-assertion — a forcible framing of our identity for presentation, for idealization, for currency in an economy of envy.

Even in the 1970s, Sontag was able to see where visual culture was headed, noting that photography had already become “almost as widely practiced an amusement as sex and dancing” and had taken on the qualities of a mass art form, meaning most who practice it don’t practice it as an art. Rather, Sontag presaged, the photograph became a utility in our cultural power dynamics:

It is mainly a social rite, a defense against anxiety, and a tool of power.

She goes even further in asserting photography’s inherent violence:

Like a car, a camera is sold as a predatory weapon — one that’s as automated as possible, ready to spring. Popular taste expects an easy, an invisible technology. Manufacturers reassure their customers that taking pictures demands no skill or expert knowledge, that the machine is all-knowing, and responds to the slightest pressure of the will. It’s as simple as turning the ignition key or pulling the trigger. Like guns and cars, cameras are fantasy-machines whose use is addictive.

The camera obscura, one of 100 ideas that changed photography.

But in addition to dividing us along a power hierarchy, photographs also connect us into communities and nuclear units. Sontag writes:

Through photographs, each family constructs a portrait-chronicle of itself — a portable kit of images that bears witness to its connectedness.

One has to wonder, however, whether — and how much — the family circle has been replaced by the social circle as we construct our online communities around photostreams and shared timelines. Similarly, Sontag notes the heightened use of photography in tourism. There, images validate experience, which raises the question of whether we engage in a kind of “social media tourism” today as we vicariously devour other people’s lives. Sontag writes:

Photographs … help people to take possession of space in which they are insecure. Thus, photography develops in tandem with one of the most characteristic of modern activities: tourism. For the first time in history, large numbers of people regularly travel out of their habitual environments for short periods of time. It seems positively unnatural to travel for pleasure without taking a camera along. Photographs will offer indisputable evidence that the trip was made, that the program was carried out, that fun was had.

[…]

A way of certifying experience, taking photographs is also a way of refusing it — by limiting experience to a search for the photogenic, by converting experience into an image, a souvenir.

Out of those souvenirs we build a fantasy — one we project about our own lives, and one we deduce about those of others:

Photographs, which cannot themselves explain anything, are inexhaustible invitations to deduction, speculation, and fantasy.

But Sontag’s most piercing — and perhaps most heartbreaking — insight about leisure and photography touches on our cultural cult of productivity, which we worship at the expense of our ability to be truly present. For most of us, especially those who find tremendous fulfillment and absorption in our work, Sontag’s observation about the photograph as a self-soothing tool against the anxiety of “inefficiency” rings terrifyingly true:

The very activity of taking pictures is soothing, and assuages general feelings of disorientation that are likely to be exacerbated by travel. Most tourists feel compelled to put the camera between themselves and whatever is remarkable that they encounter. Unsure of other responses, they take a picture. This gives shape to experience: stop, take a photograph, and move on. The method especially appeals to people handicapped by a ruthless work ethic — Germans, Japanese, and Americans. Using a camera appeases the anxiety which the work-driven feel about not working when they are on vacation and supposed to be having fun. They have something to do that is like a friendly imitation of work: they can take pictures.

Man on Rooftop with Eleven Men in Formation on His Shoulders (Unidentified American artist, ca. 1930)

From 'Faking It: Manipulated Photography Before Photoshop.'

At the same time, photography is both an attempted antidote to our mortality paradox and a deepening awareness of it:

All photographs are memento mori. To take a photograph is to participate in another person’s (or thing’s) mortality, vulnerability, mutability. Precisely by slicing out this moment and freezing it, all photographs testify to time’s relentless melt.

This seems especially true, if subtly tragic, as we fill our social media timelines with images, as if to prove that our biological timelines — our very lives — are filled with notable moments, which also remind us that each is fleeting, inevitably rushing toward the end point of that timeline: mortality itself. And so the photographic image becomes an affirmation of our very existence, one whose power is invariably addictive:

Needing to have reality confirmed and experience enhanced by photographs is an aesthetic consumerism to which everyone is now addicted.

[…]

It would not be wrong to speak of people having a compulsion to photograph: to turn experience itself into a way of seeing. Ultimately, having an experience becomes identical with taking a photograph of it, and participating in a public event comes more and more to be equivalent to looking at it in photographed form. That most logical of nineteenth-century aesthetes, Mallarmé, said that everything in the world exists in order to end in a book. Today everything exists to end in a photograph.

On Photography remains a cultural classic of the most timeless kind, with every reading unfolding timelier and timelier insights as our visual vernacular continues to evolve. Complement it with 100 Ideas That Changed Photography, the curious legacy of image manipulation before Photoshop, and the history of photography, animated.

For more of Sontag’s brilliant brain, see her wisdom on writing, boredom, sex, censorship, and aphorisms, her radical vision for remixing education, her insight on why lists appeal to us, and her illustrated meditations on art and on love.


13 SEPTEMBER, 2013

“Tip-of-the-Tongue Syndrome,” Transactive Memory, and How the Internet Is Making Us Smarter

“A public library keeps no intentional secrets about its mechanisms; a search engine keeps many.”

“The dangerous time when mechanical voices, radios, telephones, take the place of human intimacies, and the concept of being in touch with millions brings a greater and greater poverty in intimacy and human vision,” Anaïs Nin wrote in her diary in 1946, decades before the internet as we know it even existed. Her fear has since been echoed again and again with every incremental advance in technology, often with simplistic arguments about the attrition of attention in the age of digital distraction. But in Smarter Than You Think: How Technology Is Changing Our Minds for the Better (public library), Clive Thompson — one of the finest technology writers I know, with regular bylines for Wired and The New York Times — makes a powerful and rigorously thought-out counterpoint. He argues that our technological tools — from search engines to status updates to sophisticated artificial intelligence that defeats the world’s best chess players — are now inextricably linked to our minds, working in tandem with them and profoundly changing the way we remember, learn, and “act upon that knowledge emotionally, intellectually, and politically,” and that this is a promising rather than perilous thing.

He writes in the introduction:

These tools can make even the amateurs among us radically smarter than we’d be on our own, assuming (and this is a big assumption) we understand how they work. At their best, today’s digital tools help us see more, retain more, communicate more. At their worst, they leave us prey to the manipulation of the toolmakers. But on balance, I’d argue, what is happening is deeply positive. This book is about the transformation.

Page from 'Charley Harper: An Illustrated Life'

But Thompson is nothing if not a dimensional thinker with extraordinary sensitivity to the complexities of cultural phenomena. Rather than revisiting painfully familiar and trite-by-overuse notions like distraction and information overload, he examines the deeper dynamics of how these new tools are affecting the way we make sense of the world and of ourselves. Several decades after Vannevar Bush’s now-legendary meditation on how technology will impact our thinking, Thompson reaches even further into the fringes of our cultural sensibility — past the cheap techno-dystopia, past the pollyannaish techno-utopia, and into that intricate and ever-evolving intersection of technology and psychology.

One of his most fascinating and important points has to do with our outsourcing of memory — or, more specifically, our increasingly deft, search-engine-powered skills of replacing the retention of knowledge in our own brains with the on-demand access to knowledge in the collective brain of the internet. Think, for instance, of those moments when you’re trying to recall the name of a movie but only remember certain fragmentary features — the name of the lead actor, the gist of the plot, a song from the soundtrack. Thompson calls this “tip-of-the-tongue syndrome” and points out that, today, you’ll likely be able to reverse-engineer the name of the movie you don’t remember by plugging into Google what you do remember about it. Thompson contextualizes the phenomenon, which isn’t new, then asks the obvious, important question about our culturally unprecedented solutions to it:

Tip-of-the-tongue syndrome is an experience so common that cultures worldwide have a phrase for it. Cheyenne Indians call it navonotootse’a, which means “I have lost it on my tongue”; in Korean it’s hyeu kkedu-te mam-dol-da, which has an even more gorgeous translation: “sparkling at the end of my tongue.” The phenomenon generally lasts only a minute or so; your brain eventually makes the connection. But … when faced with a tip-of-the-tongue moment, many of us have begun to rely instead on the Internet to locate information on the fly. If lifelogging … stores “episodic,” or personal, memories, Internet search engines do the same for a different sort of memory: “semantic” memory, or factual knowledge about the world. When you visit Paris and have a wonderful time drinking champagne at a café, your personal experience is an episodic memory. Your ability to remember that Paris is a city and that champagne is an alcoholic beverage — that’s semantic memory.

[…]

What’s the line between our own, in-brain knowledge and the sea of information around us? Does it make us smarter when we can dip in so instantly? Or dumber with every search?

Vannevar Bush's 'memex' (short for 'memory index'), a primitive vision for a personal hard drive for information storage and management.

That concern, of course, is far from unique to our age — from the invention of writing to Alvin Toffler’s Future Shock, new technology has always been a source of paralyzing resistance and apprehension:

Writing — the original technology for externalizing information — emerged around five thousand years ago, when Mesopotamian merchants began tallying their wares using etchings on clay tablets. It emerged first as an economic tool. As with photography and the telephone and the computer, newfangled technologies for communication nearly always emerge in the world of commerce. The notion of using them for everyday, personal expression seems wasteful, risible, or debased. Then slowly it becomes merely lavish, what “wealthy people” do; then teenagers take over and the technology becomes common to the point of banality.

Thompson reminds us of the anecdote, by now itself familiar “to the point of banality,” about Socrates and his admonition that the “technology” of writing would devastate the Greek tradition of debate and dialectic, and would render people incapable of committing anything to memory because “knowledge stored was not really knowledge at all.” He cites Socrates’s parable of the Egyptian god Theuth and how he invented writing, offering it as a gift to the king of Egypt, Thamus, who met the present with defiant indignation:

This discovery of yours will create forgetfulness in the learners’ souls, because they will not use their memories; they will trust to the external written characters and not remember of themselves. The specific which you have discovered is an aid not to memory, but to reminiscence, and you give your disciples not truth, but only the semblance of truth; they will be hearers of many things and will have learned nothing; they will appear to be omniscient and will generally know nothing; they will be tiresome company, having the show of wisdom without the reality.

That resistance endured as technology changed shape, across the Middle Ages and past Gutenberg’s revolution, but it wasn’t without counter-resistance: Those who recorded their knowledge in writing and, eventually, collected it in the form of books argued that it expanded the scope of their curiosity and the ideas they were able to ponder, whereas the mere act of rote memorization made no guarantees of deeper understanding.

Ultimately, however, Thompson points out that Socrates was both right and wrong: It’s true that, with some deliberately cultivated exceptions and neurological outliers, few thinkers today rely on pure memorization and can recite extensive passages of text from memory. But what Socrates failed to see was the extraordinary dot-connecting enabled by access to knowledge beyond what our own heads can hold — because, as Amanda Palmer poignantly put it, “we can only connect the dots that we collect,” and the outsourcing of memory has exponentially enlarged our dot-collections.

With this in mind, Thompson offers a blueprint to this newly developed system of knowledge management in which access is critical:

If you are going to read widely but often read books only once; if you are going to tackle the ever-expanding universe of ideas by skimming and glancing as well as reading deeply; then you are going to rely on the semantic-memory version of gisting. By which I mean, you’ll absorb the gist of what you read but rarely retain the specifics. Later, if you want to mull over a detail, you have to be able to refind a book, a passage, a quote, an article, a concept.

Giuseppe Arcimboldo, The Librarian, ca. 1566

This, he argues, is also how and why libraries were born — the death of the purely oral world and the proliferation of print after Gutenberg placed new demands on organizing and storing human knowledge. And yet storage and organization soon proved to be radically different things:

The Gutenberg book explosion certainly increased the number of books that libraries acquired, but librarians had no agreed-upon ways to organize them. It was left to the idiosyncrasies of each. A core job of the librarian was thus simply to find the book each patron requested, since nobody else knew where the heck the books were. This created a bottleneck in access to books, one that grew insufferable in the nineteenth century as citizens began swarming into public venues like the British Library. “Complaints about the delays in the delivery of books to readers increased,” as Matthew Battles writes in Library: An Unquiet History, “as did comments about the brusqueness of the staff.” Some patrons were so annoyed by the glacial pace of access that they simply stole books; one was even sentenced to twelve months in prison for the crime. You can understand their frustration. The slow speed was not just a physical nuisance, but a cognitive one.

The solution came in the late 19th century by way of Melville Dewey, whose decimal system imposed order by creating a taxonomy of book placement, eventually rendering librarians unnecessary — at least in their role as literal book-retrievers. They became, instead, curiosity sherpas who helped patrons decide what to read and carry out comprehensive research. In many ways, they came to resemble the editors and curators who help us navigate the internet today, framing for us what is worth attending to and why.

But Thompson argues that despite history’s predictable patterns of resistance followed by adoption and adaptation, there’s something immutably different about our own era:

The history of factual memory has been fairly predictable up until now. With each innovation, we’ve outsourced more information, then worked to make searching more efficient. Yet somehow, the Internet age feels different. Quickly pulling up [the answer to a specific esoteric question] on Google seems different from looking up a bit of trivia in an encyclopedia. It’s less like consulting a book than like asking someone a question, consulting a supersmart friend who lurks within our phones.

And therein lies the magic of the internet — that unprecedented access to humanity’s collective brain. Thompson cites the work of Harvard psychologist Daniel Wegner, who first began exploring this notion of collective rather than individual knowledge in the 1980s by observing how partners in long-term relationships often divide and conquer memory tasks in sharing the household’s administrative duties:

Wegner suspected this division of labor takes place because we have pretty good “metamemory.” We’re aware of our mental strengths and limits, and we’re good at intuiting the abilities of others. Hang around a workmate or a romantic partner long enough and you begin to realize that while you’re terrible at remembering your corporate meeting schedule, or current affairs in Europe, or how big a kilometer is relative to a mile, they’re great at it. So you begin to subconsciously delegate the task of remembering that stuff to them, treating them like a notepad or encyclopedia. In many respects, Wegner noted, people are superior to these devices, because what we lose in accuracy we make up in speed.

[…]

Wegner called this phenomenon “transactive” memory: two heads are better than one. We share the work of remembering, Wegner argued, because it makes us collectively smarter — expanding our ability to understand the world around us.

This ability to “google” one another’s memory stores, Thompson argues, is the defining feature of our evolving relationship with information — and it’s profoundly shaping our experience of knowledge:

Transactive memory helps explain how we’re evolving in a world of on-tap information.

He illustrates this by turning to the work of Betsy Sparrow, a graduate student of Wegner’s, who conducted a series of experiments demonstrating that when we know a digital tool will store information for us, we’re far less likely to commit it to memory. On the surface, this may appear like the evident and worrisome shrinkage of our mental capacity. But there’s a subtler yet enormously important layer that such techno-dystopian simplifications miss: This very outsourcing of memory requires that we learn what the machine knows — a kind of meta-knowledge that enables us to retrieve the information when we need it. And, reflecting on Sparrow’s findings, Thompson points out that this is neither new nor negative:

We’ve been using transactive memory for millennia with other humans. In everyday life, we are only rarely isolated, and for good reason. For many thinking tasks, we’re dumber and less cognitively nimble if we’re not around other people. Not only has transactive memory not hurt us, it’s allowed us to perform at higher levels, accomplishing acts of reasoning that are impossible for us alone. It wasn’t until recently that computer memory became fast enough to be consulted on the fly, but once it did — with search engines boasting that they return results in tenths of a second — our transactive habits adapted.

Outsourcing our memory to machines rather than to other humans, in fact, offers certain advantages by pulling us into a seemingly infinite rabbit hole of indiscriminate discovery:

In some ways, machines make for better transactive memory buddies than humans. They know more, but they’re not awkward about pushing it in our faces. When you search the Web, you get your answer — but you also get much more. Consider this: If I’m trying to remember what part of Pakistan has experienced many U.S. drone strikes and I ask a colleague who follows foreign affairs, he’ll tell me “Waziristan.” But when I queried this once on the Internet, I got the Wikipedia page on “Drone attacks in Pakistan.” A chart caught my eye showing the astonishing increase of drone attacks (from 1 a year to 122 a year); then I glanced down to read a précis of studies on how Waziristan residents feel about being bombed. (One report suggested they weren’t as opposed as I’d expected, because many hated the Taliban, too.) Obviously, I was procrastinating. But I was also learning more, reinforcing my schematic understanding of Pakistan.

But algorithms, as the filter bubble has taught us, come with their own biases — most of which remain intentionally obscured from view — and this requires a whole new kind of literacy:

The real challenge of using machines for transactive memory lies in the inscrutability of their mechanics. Transactive memory works best when you have a sense of how your partners’ minds work — where they’re strong, where they’re weak, where their biases lie. I can judge that for people close to me. But it’s harder with digital tools, particularly search engines. You can certainly learn how they work and develop a mental model of Google’s biases. … But search companies are for-profit firms. They guard their algorithms like crown jewels. This makes them different from previous forms of outboard memory. A public library keeps no intentional secrets about its mechanisms; a search engine keeps many. On top of this inscrutability, it’s hard to know what to trust in a world of self-publishing. To rely on networked digital knowledge, you need to look with skeptical eyes. It’s a skill that should be taught with the same urgency we devote to teaching math and writing.

Thompson’s most important point, however, has to do with how outsourcing our knowledge to digital tools actually hampers the very process of creative thought, which relies on our ability to connect existing ideas from our mental pool of resources into new combinations, or what the French polymath Henri Poincaré famously termed “sudden illuminations.” Without a mental catalog of materials to mull over and let incubate in our fringe consciousness, our capacity for such illuminations is greatly deflated. Thompson writes:

These eureka moments are familiar to all of us; they’re why we take a shower or go for a walk when we’re stuck on a problem. But this technique works only if we’ve actually got a lot of knowledge about the problem stored in our brains through long study and focus. … You can’t come to a moment of creative insight if you haven’t got any mental fuel. You can’t be googling the info; it’s got to be inside you.

But while this is a valid concern, Thompson doubts that we’re outsourcing too many bits of knowledge and thus curtailing our creativity. He argues, instead, that we’re mostly employing this newly evolved skill to help us sift the meaningful from the meaningless, but we remain just as capable of absorbing that which truly stimulates us:

Evidence suggests that when it comes to knowledge we’re interested in — anything that truly excites us and has meaning — we don’t turn off our memory. Certainly, we outsource when the details are dull, as we now do with phone numbers. These are inherently meaningless strings of information, which offer little purchase on the mind. … It makes sense that our transactive brains would hand this stuff off to machines. But when information engages us — when we really care about a subject — the evidence suggests we don’t turn off our memory at all.

He illustrates this deep-seated psychological tendency with a famous 1979 experiment:

Scientists gave a detailed description of a fictitious half inning of baseball to two groups: one composed of avid baseball fans, the other of people who didn’t know the game well. When asked later to recall what they’d read, the baseball fans had “significantly greater” recall than the nonfans. Because the former cared deeply about baseball, they fit the details into their schema of how the game works. The nonfans had no such mental model, so the details didn’t stick. A similar study found that map experts retained far more details from a contour map than nonexperts. The more you know about a field, the more easily you can absorb facts about it.

The question, then, becomes: How do we get people interested in things beyond their existing interests? (Curiously, this has been the Brain Pickings mission since the very beginning in 2005.) Thompson considers:

In an ideal world, we’d all fit the Renaissance model — we’d be curious about everything, filled with diverse knowledge and thus absorbing all current events and culture like sponges. But this battle is age-old, because it’s ultimately not just technological. It’s cultural and moral and spiritual; “getting young people to care about the hard stuff” is a struggle that goes back centuries and requires constant societal arguments and work. It’s not that our media and technological environment don’t matter, of course. But the vintage of this problem indicates that the solution isn’t merely in the media environment either.

In the epilogue, Thompson offers his ultimate take on that solution, at once romantic and beautifully grounded in critical thinking:

Understanding how to use new tools for thought requires not just a critical eye, but curiosity and experimentation. … A tool’s most transformative uses generally take us by surprise.

[…]

How should you respond when you get powerful new tools for finding answers?

Think of harder questions.

Smarter Than You Think is excellent and necessary in its entirety, covering everything from the promise of artificial intelligence to how technology is changing our ambient awareness.


12 SEPTEMBER, 2013

David Foster Wallace on Writing, Death, and Redemption

“You don’t have to think very hard to realize that our dread of both relationships and loneliness … has to do with angst about death, the recognition that I’m going to die, and die very much alone, and the rest of the world is going to go merrily on without me.”

On May 21, 2005, David Foster Wallace took the podium at Kenyon College and delivered the now-legendary This Is Water, one of history’s greatest commencement addresses — his timeless meditation on the meaning of life and the grueling work required in order to stay awake to the world rather than enslaved by one’s own self-consuming intellect. It included this admonition:

Think of the old cliché about “the mind being an excellent servant but a terrible master.” This, like many clichés, so lame and unexciting on the surface, actually expresses a great and terrible truth. It is not the least bit coincidental that adults who commit suicide with firearms almost always shoot themselves in: the head. They shoot the terrible master.

Three years later, on September 12, 2008, Wallace murdered his own terrible master — not by firearms, but by hanging himself. Several months prior, frustrated with the disorienting side effects of the antidepressant he had been taking to alleviate his 20-year struggle with depression, he had attempted to wean himself off the medication. His personal tragedy was soon inscribed into the modern-day literary canon, turning him into a kind of public patron-saint of the Tortured Genius archetype.

Conversations with David Foster Wallace (public library) — an essential, the essential, collection of 22 interviews and profiles of the beloved author — reveals with empathic granularity Wallace’s conflicted relationship with life and death, and its slow, subtly menacing evolution.

In an interview by Larry McCaffery, originally published in the Review of Contemporary Fiction in 1993, 31-year-old Wallace appears already remarkably aware of the mortality paradox, the exorcism of which he sees as the highest achievement of fiction:

You don’t have to think very hard to realize that our dread of both relationships and loneliness, both of which are like sub-dreads of our dread of being trapped inside a self (a psychic self, not just a physical self), has to do with angst about death, the recognition that I’m going to die, and die very much alone, and the rest of the world is going to go merrily on without me. I’m not sure I could give you a steeple-fingered theoretical justification, but I strongly suspect a big part of real art-fiction’s job is to aggravate this sense of entrapment and loneliness and death in people, to move people to countenance it, since any possible human redemption requires us first to face what’s dreadful, what we want to deny.

This dark whimsy is what lends literature its mesmerism, and in it Wallace sees both redemption and remedy for our existential dance with anxiety:

If you’re going to try not just to depict the way a culture’s bound and defined by mediated gratification and image, but somehow to redeem it, or at least fight a rearguard against it, then what you’re going to be doing is paradoxical. You’re at once allowing the reader to sort of escape self by achieving some sort of identification with another human psyche — the writer’s, or some character’s, etc. — and you’re also trying to antagonize the reader’s intuition that she is a self, that she is alone and going to die alone. You’re trying somehow both to deny and affirm that the writer is over here with his agenda while the reader’s over there with her agenda, distinct. This paradox is what makes good fiction sort of magical, I think. The paradox can’t be resolved, but it can somehow be mediated — “re-mediated,” since this is probably where poststructuralism rears its head for me — by the fact that language and linguistic intercourse is, in and of itself, redeeming, remedying.

Later in the conversation, Wallace considers, with his typical sharp self-consciousness and meta-awareness, the terrifying joy and vulnerability that great art necessitates and the courage creativity calls for:

The reader walks away from real art heavier than she came to it. Fuller. All the attention and engagement and work you need to get from the reader can’t be for your benefit; it’s got to be for hers. What’s poisonous about the cultural environment today is that it makes this so scary to try to carry out. Really good work probably comes out of a willingness to disclose yourself, open yourself up in spiritual and emotional ways that risk making you look banal or melodramatic or naive or unhip or sappy, and to ask the reader really to feel something. To be willing to sort of die in order to move the reader, somehow. Even now I’m scared about how sappy this’ll look in print, saying this. And the effort actually to do it, not just talk about it, requires a kind of courage I don’t seem to have yet. … Maybe it’s as simple as trying to make the writing more generous and less ego-driven.

How wistful to consider that, fifteen years later, Wallace took his young self’s advice all too seriously, too literally, too extremely — for isn’t suicide the ultimate, most tragic, and most permanent denial of the ego?

But perhaps most prescient of all — most heartbreaking, most humbling, most harrowing — is something Wallace said to one interviewer particularly preoccupied with the root of the author’s genius:

That was his whole thing. “Are you normal?” “Are you normal?” I think one of the true ways I’ve gotten smarter is that I’ve realized that there are ways other people are a lot smarter than me. My biggest asset as a writer is that I’m pretty much like everybody else. The parts of me that used to think I was different or smarter or whatever almost made me die.

Conversations with David Foster Wallace is revelational in its entirety. Complement it with his timeless wisdom on why writers write, the original audio of that mythic Kenyon commencement speech, and Wallace’s animated advice on ambition.


11 SEPTEMBER, 2013

How Antidepressants Affect Selfhood, Teenage Sexuality, and Our Quest for Personal Identity

“Though antidepressants are effective at managing negative emotions, they don’t in themselves provide the sense of meaning and direction that a person equally needs in order to find her way in life.”

“Great art was born of great terrors, great loneliness, great inhibitions, instabilities, and it always balances them,” Anaïs Nin famously wrote. But what if it doesn’t balance out? What if the emotional excess, believed to be essential to creativity, were of the negative and crippling kind? One need only look at such tragic heroes as Sylvia Plath, David Foster Wallace, Marilyn Monroe, and Kurt Cobain to grasp the gravity of the proposition. And yet we remain ever so culturally ambivalent about alleviating the anguish of mental illness with the same arsenal we use against physical pain: drugs.

In Coming of Age on Zoloft: How Antidepressants Cheered Us Up, Let Us Down, and Changed Who We Are (public library), Katherine Sharpe explores the heart of this ambivalence through an intersection of her own experience, conversations with medical and psychiatric experts, and in-depth interviews with forty young adults who grew up on psychopharmaceuticals. Having spent a fair portion of my own life on antidepressants, and having recently resumed treatment, I was instantly fascinated, both as an observer of culture and a living sample size of one.

Sharpe begins with an anecdote from her college days, in which she and her six roommates arrived at the accidental and highly self-conscious realization that each one of them was, or had been, on one form of psychoactive drug or another — an incident emblematic of the pervasive and profound cultural pattern at the heart of Sharpe’s book. She writes:

It is strange, as a young person, to realize that you have lived through something that can be considered a real historical change, but that’s exactly what we had done. When I was a child, in the early 1980s, taking psychiatric medication was decidedly a fringe phenomenon. Prozac came onto the market in 1987, the year I was eight. The first member of a family of drugs called SSRIs (for “selective serotonin reuptake inhibitors”), it quickly became the leading edge of a psychopharmaceutical revolution. Throughout the 1990s and 2000s, Americans grew ever more likely to reach for a pill to address a wide variety of mental and emotional problems. We also became more likely to think of those problems as a kind of disease, manifestations of an innate biochemical imbalance. Depression, social anxiety, obsessive-compulsive disorder, and the like went from being strange clinical terms or scrupulously hidden secrets to constituting acceptable topics of cocktail party conversation — talk that was often followed up by chatter about the new miracle drugs for despair.

Artwork by Bobby Baker from 'Drawing Mental Illness.'

But more than a mere statistically swelling phenomenon — less than two decades after the introduction of Prozac, SSRIs had outpaced blood pressure medication to become America’s favorite class of drugs, popped by about 10% of the nation — Sharpe points out a troubling corollary: In permeating everyday life so profoundly, antidepressants also embedded themselves in youth, with an ever-growing number of teenagers taking psychopharmaceuticals to abate depression, ADHD, and other mental health issues. And while relief from the debilitating and often deadly effects of adolescent depression is undoubtedly preferable to the alternative, it comes with a dark side: Antidepressants confuse our ability to tell our “true self” from the symptoms of the disease, and from the effects of the medication, at a time when the search for selfhood and the construction of personal identity are at their most critical and formative stages. And given that the teenage brain responds to life so differently from the adult’s, the implications are even more unsettling:

Rightly or wrongly, antidepressants command powerful emotions; they can lead people to examine their deepest assumptions about themselves and the world.

[…]

The notion that depression distorts the true self and that antidepressants merely restore what was there all along has often been invoked against the fear that by taking antidepressants, we might somehow be betraying our true natures. But that belief in particular is one that people who start medication young cannot fall back on. Worries about how antidepressants might affect the self are greatly magnified for people who begin using them in adolescence, before they’ve developed a stable, adult sense of self. Lacking a reliable conception of what it is to feel “like themselves,” young people have no way to gauge the effects of the drugs on their developing personalities. Searching for identity — asking “Who am I?” and combing the inner and outer worlds for an answer that seems to fit — is the main developmental task of the teenage years. And for some young adults, the idea of taking a medication that could frustrate that search can become a discouraging, painful preoccupation.

She relays her own experience:

When I first began to use Zoloft, my inability to pick apart my “real” thoughts and emotions from those imparted by the drug made me feel bereft. The trouble seemed to have everything to do with being young. I was conscious of needing to figure out my own interests and point myself in a direction in the world, and the fact of being on medication seemed frighteningly to compound the possibilities for error. How could I ever find my way in life if I didn’t even know which feelings were mine?

This inner torment makes perfect, if tragic, sense in the context of developmental psychology, the commonly accepted credo of which is that establishing an identity is adolescents’ primary developmental task. When that process is disrupted by the folded-in effects of medication, or by the internalized story that mental illness renders one somehow handicapped or fundamentally flawed, the consequences can be serious and far-reaching:

Though antidepressants are effective at managing negative emotions, they don’t in themselves provide the sense of meaning and direction that a person equally needs in order to find her way in life.

And even though modern psychology does away with the notion of the immutable self — something Nin herself so eloquently articulated more than half a century ago — Sharpe reminds us that despite what we may rationally believe about our scientific selves, we hang on to the romantic ideal of their metaphysical manifestation with emotional fervor:

For the last twenty years, the dominant academic theories of personhood have focused not on the idea of essence but on performance and changefulness, the sense that we don and doff identities at will as we move through our lives. Intellectually, we all know that the true self is more of a metaphor than a literal reality — we don’t really believe that there is some perfectly realized version of each of us hovering out there, just waiting to be discovered like a vein of gold.

But no matter how well we understand the academic critique of the essential self, or how much we feel disposed to dismiss “Who am I?” … most of us still want to feel, in some way, like ourselves. We may never achieve the highly concrete answer to the question of who we are that we first imagine possible as a young teenager — but a notional sense of self is something that we rely on from day to day. … A feeling of authenticity is, admittedly, an intangible thing to lose — but in a society that still prizes a notion of authentic selfhood, however problematic, it can be a significant one.

Artwork by James Thurber from 'Is Sex Necessary?'

Among the facets of selfhood most deeply affected by adolescents’ and young adults’ use of antidepressants, Sharpe notes, is that of sexuality. Every SSRI warning label cautions that the drug might — meaning, to decode the big-pharma euphemism here, most likely will — produce “sexual side effects” ranging from loss of interest in sex to performance difficulty to inability to reach orgasm. For teenagers, most of whom are only just beginning to experiment with and understand their sexuality — whether parents approve or not — the repercussions carry an additional layer of gravity beyond the frustration these “sexual side effects” present for adults:

Just as teens don’t have a sense of their baseline adult personality with which to judge whether and how antidepressants may be affecting them, teens also lack a baseline impression of their own sexuality. Adults who are familiar with their own sexual norms will have an easy time knowing when those norms have been upset. But for adolescents who are just growing into their sexuality, the picture can be more mysterious. … Because SSRIs influence not just performance but also a person’s thoughts and desires, these side effects are relevant for teens who aren’t having sex as well as for those who are.

Artwork from 'An ABZ of Love.'

Coming of Age on Zoloft is fantastic and pause-giving in its entirety, embodying the rare bravery of asking important, complex questions in a society that fetishizes simplistic, sensationalistic answers. In a culture where just about the most embarrassing thing is not to have an opinion, Sharpe invites us to form one that is truly our own, however inconclusive and full of what Keats called “negative capability,” rather than a borrowed one that is easier to don but devoid of true understanding. Sharpe herself puts it beautifully:

This book won’t settle those debates, but it does speak to them. Twenty-five years after the introduction of Prozac, we are still collectively attempting to figure out what an appropriate use of medication would look like, in our culture and in our individual lives. We are trying to figure out what our sadness and pain mean — if they mean anything at all — and when they attain the status of illness. We’re trying to figure out when to turn to pills, when to go another route, and how we might be able to tell. … Good answers to the big questions about medication are likely to proceed from careful attention to the actual experiences of the people who have faced them.

For more on how psychoactive drugs affect the romantic and sexual lives of adults, see biological anthropologist Helen Fisher’s excellent analysis of the neurochemistry of desire and SSRIs.
