Brain Pickings


03 DECEMBER, 2013

Young vs. Old, Male vs. Female, Intuition vs. Intellect: Susan Sontag on How the Stereotypes and Polarities of Culture Imprison Us


“The young-old polarization and the male-female polarization are perhaps the two leading stereotypes that imprison people.”

“Identity is something that you are constantly earning,” Joss Whedon observed in his fantastic Wesleyan commencement address on our inner contradictions, adding: “It is a process that you must be active in.” But ours is a culture that prefers to make our identities static and confine them to categories, often diametrically opposed to one another, with specific stereotypes attached to each. And yet what is a human being if not a locus of exceptions, a complex cluster of contradictions, coexisting and in perpetual flux, which becomes a caricature of itself as soon as it is flattened into a label? From Susan Sontag: The Complete Rolling Stone Interview (public library) — the same fascinating, wide-ranging 1978 conversation with Rolling Stone contributing editor Jonathan Cott that gave us Sontag on the false divide between “high” and pop culture, which was also among the best biographies, memoirs, and history books of 2013 — comes Sontag’s timeless and timely meditation on the polarities and stereotypes that imprison us as individuals and flatten us as a culture.

In the introduction, Cott captures Sontag’s unflinching resistance to stereotypes:

In all of her endeavors, Sontag attempted to challenge and upend stereotypical categories such as male/female and young/old that induced people to live constrained and risk-averse lives; and she continually examined and tested out her notion that supposed polarities such as thinking and feeling, form and content, ethics and aesthetics, and consciousness and sensuousness could in fact simply be looked at as aspects of each other — much like the pile on the velvet that, upon reversing one’s touch, provides two textures and two ways of feeling, two shades and two ways of perceiving.

Indeed, Sontag explores these toxic stereotypes and polarities throughout the conversation. She tells Cott:

A lot of our ideas about what we can do at different ages and what age means are so arbitrary — as arbitrary as sexual stereotypes. I think that the young-old polarization and the male-female polarization are perhaps the two leading stereotypes that imprison people. The values associated with youth and with masculinity are considered to be the human norms, and anything else is taken to be at least less worthwhile or inferior. Old people have a terrific sense of inferiority. They’re embarrassed to be old. What you can do when you’re young and what you can do when you’re old is as arbitrary and without much basis as what you can do if you’re a woman or what you can do if you’re a man.

Susan Sontag on love, illustrated by Wendy MacNaughton.

One of her most poignant points comes from an anecdote she shares in discussing “the misogyny of women,” about her son, David Rieff (who edited her fantastic recently published journals):

Regarding those sexual stereotypes: the other night I was in a situation with David when we went out to Vincennes University, where I was invited to attend a seminar, and then after the seminar, four people plus David and myself went out to have coffee. And it so happened that the four people from the seminar were all women. We sat down at the table, and one of the women said, in French, to David, “Oh, you poor guy, having to sit with five women!” And everybody laughed. And then I said to these women, who were all teachers at Vincennes, “Do you realize what you’re saying and what a low opinion you have of yourselves?” I mean, can you imagine a situation in which a woman would sit down with five men and a man would say, “Oh, you poor thing, you have to sit with five men and we don’t have another woman for you.” She’d be honored. … And don’t forget that these were professional women who probably would have called themselves feminists, and yet what they were expressing was quite involuntary.

She then returns to an equally imprisoning polarity related to age, drawing a parallel:

You can find something very similar between young people and old people, since if a young person — man or woman — in his or her twenties would sit down with a bunch of people in their sixties or seventies, one of those persons might have said, “What a pity you have to sit here with five old people, that must be boring for you!” The point about women is or should be obvious, but people haven’t said how awful and embarrassed and diminished and apologetic they feel about being old.

Another dangerous polarity she points to is that of intuition vs. the intellect — one historically intertwined with our culture’s male-female stereotypes. Sontag, celebrated as one of the greatest intellectuals in modern history, turns out to be averse to such categorization and echoes French philosopher Henri Bergson as she tells Cott:

Most everything I do seems to have as much to do with intuition as with reason. . . . The kind of thinking that makes a distinction between thought and feeling is just one of those forms of demagogy that causes lots of trouble for people by making them suspicious of things that they shouldn’t be suspicious or complacent of.

For people to understand themselves in this way seems to be very destructive, and also very culpabilizing. These stereotypes of thought versus feeling, heart versus head, male versus female were invented at a time when people were convinced that the world was going in a certain direction — that is, toward technocracy, rationalization, science, and so on — but they were all invented as a defense against Romantic values.

Sontag then returns to her treatment of male-female stereotypes, which Cott terms “the male-female polarity as a kind of prison” — something I find especially prescient as I head to TED Women this week with equal parts excitement for the TED part and skepticism towards the women-ghettoization part. She riffs on the work of Hannah Arendt, whom she considers “the first woman political philosopher,” and tells Cott:

If I’m going to play chess, I don’t think I should play differently because I’m a woman.

Obviously, that is a more rule-determined kind of game, but even if I’m a poet or a prose writer or a painter, it seems to me my choices have to do with all kinds of different traditions that I’ve become attached to, or of experiences I’ve had, some of which may have something to do with the fact that I’m a woman but need not at all be necessarily determinant. I think it’s very oppressive to be asked to conform to the stereotype, exactly as a black writer might be asked to express black consciousness or only write about black material or reflect a black cultural sensibility. I don’t want to be “ghettoized” any more than some black writers I know want to be ghettoized.

In a sentiment Margaret Atwood would come to echo two decades later in discussing literature’s “women problem,” Sontag examines how women-only awards and accolades compromise rather than advance the gender equality movement:

Let’s say a film of mine is invited to a woman’s film festival. Well, I don’t refuse to send the film — on the contrary, I’m always glad when my films are shown, but it’s only the accident of my being a woman that accounts for my film being included. But as regards my work as a filmmaker, I don’t think that has anything to do with my being a woman — it has to do with me, and one of the things about me is that I’m a woman.

And, remember, this is 1978 — only shortly after the height of the Second Wave of Feminism and the passage of the Equal Pay Act, and at a time when Ms. magazine was changing gender politics. So when Cott pushes back and suggests that a feminist response might accuse Sontag of acting “as if the revolution had already been won,” she responds, more than a quarter century before the Lean In narrative, by nailing the issue with exquisite, prescient precision:

I don’t believe that the revolution has been won, but I think it’s just as useful for women to participate in traditional structures and enterprises, and to demonstrate that they’re competent and that they can be airplane pilots and bank executives and generals, and a lot of things that I don’t want to be and that I don’t think are so great. But it’s very good that women stake out their claims in these occupations. The attempt to set up a separate culture is a way of not seeking power, and I think women have to seek power. As I’ve said in the past, I don’t think the emancipation of women is just a question of having equal rights. It’s a question of having equal power, and how are they going to have that unless they participate in the structures that already exist?


I think that women should be proud of and identify with women who do things at a very high level of excellence, and not criticize them for not expressing a feminine sensibility or a feminine sense of sensuality. My idea is to just desegregate everything. The kind of feminist I am is to be an antisegregationist. And I don’t think it’s because I believe the battle has been won. I think it’s fine if there are women’s collectives doing things, but I don’t believe that the goal is a creation or a vindication of feminine values. I think the goal is half the pie. I wouldn’t establish or disestablish a principle of feminine culture or feminine sensibility or feminine sensuality. I think it would be nice if men would be more feminine and women more masculine. To me, that would be a more attractive world.

Susan Sontag: The Complete Rolling Stone Interview is an excellent read in its entirety and one of the best history books of the year for good reason, brimming with layers upon layers of insight not only into one of the greatest minds in modern history but also into the broader cultural context and dynamics upon which this mind, and our shared legacy, fed.


02 DECEMBER, 2013

The 13 Best Psychology and Philosophy Books of 2013


How to think like Sherlock Holmes, make better mistakes, master the pace of productivity, find fulfilling work, stay sane, and more.

After the best biographies, memoirs, and history books of 2013, the season’s subjective selection of best-of reading lists continues with the most stimulating psychology and philosophy books published this year. (Catch up on the 2012 roundup here and 2011’s here.)


“How we spend our days,” Annie Dillard wrote in her timelessly beautiful meditation on presence over productivity, “is, of course, how we spend our lives.” And nowhere do we fail at the art of presence more miserably and more tragically than in urban life — in the city, high on the cult of productivity, where we float past each other, past the buildings and trees and the little boy in the purple pants, past life itself, cut off from the breathing of the world by iPhone earbuds and solipsism. And yet: “The art of seeing has to be learned,” Marguerite Duras reverberates — and it can be learned, as cognitive scientist Alexandra Horowitz invites us to believe in her breathlessly wonderful On Looking: Eleven Walks with Expert Eyes (public library) — a record of her quest to walk around a city block with eleven different “experts,” from an artist to a geologist to a dog, and emerge with fresh eyes mesmerized by the previously unseen fascinations of a familiar world. It is undoubtedly one of the most stimulating books of the year, if not the decade, and the most enchanting thing I’ve read in ages. In a way, it’s the opposite but equally delightful mirror image of Christoph Niemann’s Abstract City — a concrete, immersive examination of urbanity — blending the mindfulness of Sherlock Holmes with the expansive sensitivity of Thoreau.

Horowitz begins by pointing our attention to the incompleteness of our experience of what we conveniently call “reality”:

Right now, you are missing the vast majority of what is happening around you. You are missing the events unfolding in your body, in the distance, and right in front of you.

By marshaling your attention to these words, helpfully framed in a distinct border of white, you are ignoring an unthinkably large amount of information that continues to bombard all of your senses: the hum of the fluorescent lights, the ambient noise in a large room, the places your chair presses against your legs or back, your tongue touching the roof of your mouth, the tension you are holding in your shoulders or jaw, the map of the cool and warm places on your body, the constant hum of traffic or a distant lawn-mower, the blurred view of your own shoulders and torso in your peripheral vision, a chirp of a bug or whine of a kitchen appliance.

This adaptive ignorance, she argues, is there for a reason — we celebrate it as “concentration” and welcome its way of easing our cognitive overload by allowing us to conserve our precious mental resources only for the stimuli of immediate and vital importance, and to dismiss or entirely miss all else. (“Attention is an intentional, unapologetic discriminator,” Horowitz tells us. “It asks what is relevant right now, and gears us up to notice only that.”) But while this might make us more efficient in our goal-oriented day-to-day, it also makes us inhabit a largely unlived — and unremembered — life, day in and day out.

For Horowitz, the awakening to this incredible, invisible backdrop of life came thanks to Pumpernickel, her “curly haired, sage mixed breed” (who also inspired Horowitz’s first book, the excellent Inside of a Dog: What Dogs See, Smell, and Know), as she found herself taking countless walks around the block, becoming more and more aware of the dramatically different experiences she and her canine companion were having along the exact same route:

Minor clashes between my dog’s preferences as to where and how a walk should proceed and my own indicated that I was experiencing almost an entirely different block than my dog. I was paying so little attention to most of what was right before us that I had become a sleepwalker on the sidewalk. What I saw and attended to was exactly what I expected to see; what my dog showed me was that my attention invited along attention’s companion: inattention to everything else.

The book was her answer to the disconnect, an effort to “attend to that inattention.” It is not, she warns us, “about how to bring more focus to your reading of Tolstoy or how to listen more carefully to your spouse.” Rather, it is an invitation to the art of observation:

Together, we became investigators of the ordinary, considering the block — the street and everything on it—as a living being that could be observed.

In this way, the familiar becomes unfamiliar, and the old the new.

Her approach is based on two osmotic human tendencies: our shared capacity to truly see what is in front of us, despite our conditioned concentration that obscures it, and the power of individual bias in perception — or what we call “expertise,” acquired by passion or training or both — in bringing attention to elements that elude the rest of us. What follows is a whirlwind of endlessly captivating exercises in attentive bias as Horowitz, with her archetypal New Yorker’s “special fascination with the humming life-form that is an urban street,” and her diverse companions take to the city.

First, she takes a walk all by herself, trying to note everything observable, and we quickly realize that besides her deliciously ravenous intellectual curiosity, Horowitz is a rare magician with language. (“The walkers trod silently; the dogs said nothing. The only sound was the hum of air conditioners,” she beholds her own block; passing a pile of trash bags graced by a stray Q-tip, she ponders parenthetically, “how does a Q-tip escape?”; turning her final corner, she gazes at the entrance of a mansion and “its pair of stone lions waiting patiently for royalty that never arrives.” Stunning.)

But as soon as she joins her experts, Horowitz is faced with the grimacing awareness that despite her best, most Sherlockian efforts, she was “missing pretty much everything.” She arrives at a newfound, profound understanding of what William James meant when he wrote, “My experience is what I agree to attend to. Only those items which I notice shape my mind”:

I would find myself at once alarmed, delighted, and humbled at the limitations of my ordinary looking. My consolation is that this deficiency of mine is quite human. We see, but we do not see: we use our eyes, but our gaze is glancing, frivolously considering its object. We see the signs, but not their meanings. We are not blinded, but we have blinders.

Originally featured in August, with a closer look at the expert insights. For another peek at this gem, which is easily among my top three favorite books of the past decade, learn how to do the step-and-slide.


Given my soft spot for famous diaries, it should come as no surprise that I keep one myself. Perhaps the greatest gift of the practice has been the daily habit of reading what I had written on that day a year earlier; not only is it a remarkable tool of introspection and self-awareness, but it also illustrates that our memory “is never a precise duplicate of the original [but] a continuing act of creation” and how flawed our perception of time is — almost everything that occurred a year ago appears as having taken place either significantly further in the past (“a different lifetime,” I’d often marvel at this time-illusion) or significantly more recently (“this feels like just last month!”). Rather than a personal deficiency of those of us prone to this tendency, however, it turns out to be a defining feature of how the human mind works, the science of which is at first unsettling, then strangely comforting, and altogether intensely interesting.

That’s precisely what acclaimed BBC broadcaster and psychology writer Claudia Hammond explores in Time Warped: Unlocking the Mysteries of Time Perception (public library) — a fascinating foray into the idea that our experience of time is actively created by our own minds and how these sensations of what neuroscientists and psychologists call “mind time” are created. As disorienting as the concept might seem — after all, we’ve been nursed on the belief that time is one of those few utterly reliable and objective things in life — it is also strangely empowering to think that the very phenomenon depicted as the unforgiving dictator of life is something we might be able to shape and benefit from. Hammond writes:

We construct the experience of time in our minds, so it follows that we are able to change the elements we find troubling — whether it’s trying to stop the years racing past, or speeding up time when we’re stuck in a queue, trying to live more in the present, or working out how long ago we last saw our old friends. Time can be a friend, but it can also be an enemy. The trick is to harness it, whether at home, at work, or even in social policy, and to work in line with our conception of time. Time perception matters because it is the experience of time that roots us in our mental reality. Time is not only at the heart of the way we organize life, but the way we experience it.

Discus chronologicus, a depiction of time by German engraver Christoph Weigel, published in the early 1720s; from Cartographies of Time.

Among the most intriguing illustrations of “mind time” is the incredible elasticity of how we experience time. (“Where is it, this present?,” William James famously wondered. “It has melted in our grasp, fled ere we could touch it, gone in the instant of becoming.”) For instance, Hammond points out, we slow time down when gripped by mortal fear — the cliché about the slow-motion car crash is, in fact, a cognitive reality. This plays out even in situations that aren’t life-or-death per se but are still associated with strong feelings of fear. Hammond points to a study in which people with arachnophobia were asked to look at spiders — the very object of their intense fear — for 45 seconds and they overestimated the elapsed time. The same pattern was observed in novice skydivers, who estimated the duration of their peers’ falls as short, whereas their own, from the same altitude, were deemed longer.

Inversely, time seems to speed up as we get older — a phenomenon that competing theories have attempted to explain. One, known as the “proportionality theory,” uses pure mathematics, holding that a year feels faster when you’re 40 than when you’re 8 because it only constitutes one fortieth of your life rather than a whole eighth. Among its famous proponents are Vladimir Nabokov and William James. But Hammond remains unconvinced:

The problem with the proportionality theory is that it fails to account for the way we experience time at any one moment. We don’t judge one day in the context of our whole lives. If we did, then for a 40-year-old every single day should flash by because it is less than one fourteen-thousandth of the life they’ve had so far. It should be fleeting and inconsequential, yet if you have nothing to do or an enforced wait at an airport for example, a day at 40 can still feel long and boring and surely longer than a fun day at the seaside packed with adventure for a child. … It ignores attention and emotion, which … can have a considerable impact on time perception.
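The proportionality claim, along with Hammond’s day-at-40 counterexample, reduces to a few lines of arithmetic. The sketch below is my own illustration, not anything from the book — the function name and framing are assumptions:

```python
# A minimal sketch of the "proportionality theory" arithmetic: an interval
# of time is weighted by its share of the life lived so far.

def proportional_weight(interval_years: float, age_years: float) -> float:
    """Fraction of a person's lived time that the interval represents."""
    return interval_years / age_years

# A year at 8 vs. a year at 40:
year_at_8 = proportional_weight(1, 8)     # 1/8 of life so far
year_at_40 = proportional_weight(1, 40)   # 1/40 of life so far

# A single day at 40 -- the "one fourteen-thousandth" in Hammond's critique:
day_at_40 = proportional_weight(1 / 365.25, 40)

print(year_at_8 / year_at_40)   # 5.0 -- a year "weighs" 5x more at 8 than at 40
print(round(1 / day_at_40))     # 14610 -- a day at 40 is ~1/14,600th of a life
```

The numbers show exactly where the theory bites and where it breaks: it predicts the felt acceleration of years across a lifetime, but applied to a single day it yields an absurdly tiny weight, which is Hammond’s point — attention and emotion, not raw proportion, govern how long any given day feels.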

Another theory suggests that perhaps it is the tempo of life in general that has accelerated, making things from the past appear slower, including the passage of time itself.

But one definite change does take place with age: As we grow older, we tend to feel like the previous decade elapsed more rapidly, while the earlier decades of our lives seem to have lasted longer. Similarly, we tend to think of events that took place in the past 10 years as having happened more recently than they actually did. (Quick: What year did the devastating Japanese tsunami hit? When did we lose Maurice Sendak?) Conversely, we perceive events that took place more than a decade ago as having happened even longer ago. (When did Princess Diana die? What year was the Chernobyl disaster?) This, Hammond points out, is known as “forward telescoping”:

It is as though time has been compressed and — as if looking through a telescope — things seem closer than they really are. The opposite is called backward or reverse telescoping, also known as time expansion. This is when you guess that events happened longer ago than they really did. This is rare for distant events, but not uncommon for recent weeks.


The most straightforward explanation for it is called the clarity of memory hypothesis, proposed by the psychologist Norman Bradburn in 1987. This is the simple idea that because we know that memories fade over time, we use the clarity of a memory as a guide to its recency. So if a memory seems unclear we assume it happened longer ago.

Originally featured in July, with a deeper dive into the psychology of why time slows down when we’re afraid, speeds up as we age, and gets warped when we’re on vacation.


“If one wanted to crush and destroy a man entirely, to mete out to him the most terrible punishment,” wrote Dostoevsky, “all one would have to do would be to make him do work that was completely and utterly devoid of usefulness and meaning.” Indeed, the quest to avoid work and make a living of doing what you love is a constant conundrum of modern life. In How to Find Fulfilling Work (public library) — the latest installment in The School of Life’s wonderful series reclaiming the traditional self-help genre as intelligent, non-self-helpy, yet immensely helpful guides to modern living, which previously gave us Philippa Perry’s How to Stay Sane and Alain de Botton’s How to Think More About Sex — philosopher Roman Krznaric (remember him?) explores the roots of this contemporary quandary and guides us to its fruitful resolution:

The desire for fulfilling work — a job that provides a deep sense of purpose, and reflects our values, passions and personality — is a modern invention. … For centuries, most inhabitants of the Western world were too busy struggling to meet their subsistence needs to worry about whether they had an exciting career that used their talents and nurtured their wellbeing. But today, the spread of material prosperity has freed our minds to expect much more from the adventure of life.

We have entered a new age of fulfillment, in which the great dream is to trade up from money to meaning.

Krznaric goes on to outline two key afflictions of the modern workplace — “a plague of job dissatisfaction” and “uncertainty about how to choose the right career” — and frames the problem:

Never have so many people felt so unfulfilled in their career roles, and been so unsure what to do about it. Most surveys in the West reveal that at least half the workforce are unhappy in their jobs. One cross-European study showed that 60 per cent of workers would choose a different career if they could start again. In the United States, job satisfaction is at its lowest level — 45 per cent — since record-keeping began over two decades ago.

Of course, Krznaric points out, there’s plenty of cynicism and skepticism to go around, with people questioning whether it’s even possible to find a job in which we thrive and feel complete. He offers an antidote to the default thinking:

There are two broad ways of thinking about these questions. The first is the ‘grin and bear it’ approach. This is the view that we should get our expectations under control and recognize that work, for the vast majority of humanity — including ourselves — is mostly drudgery and always will be. Forget the heady dream of fulfillment and remember Mark Twain’s maxim: “Work is a necessary evil to be avoided.” … The history is captured in the word itself. The Latin labor means drudgery or toil, while the French travail derives from the tripalium, an ancient Roman instrument of torture made of three sticks. … The message of the ‘grin and bear it’ school of thought is that we need to accept the inevitable and put up with whatever job we can get, as long as it meets our financial needs and leaves us enough time to pursue our ‘real life’ outside office hours. The best way to protect ourselves from all the optimistic pundits peddling fulfillment is to develop a hardy philosophy of acceptance, even resignation, and not set our hearts on finding a meaningful career.

I am more hopeful than this, and subscribe to a different approach, which is that it is possible to find work that is life-enhancing, that broadens our horizons and makes us feel more human.


This is a book for those who are looking for a job that is big enough for their spirit, something more than a ‘day job’ whose main function is to pay the bills.

'Never have so many people felt so unfulfilled in their career roles, and been so unsure what to do about it.'

Krznaric considers the five keys to making a career meaningful — earning money, achieving status, making a difference, following our passions, and using our talents — but goes on to demonstrate that they aren’t all created equal. In particular, he echoes 1970s Zen pioneer Alan Watts and modern science in arguing that money alone is a poor motivator:

Schopenhauer may have been right that the desire for money is widespread, but he was wrong on the issue of equating money with happiness. Overwhelming evidence has emerged in the last two decades that the pursuit of wealth is an unlikely path to achieving personal wellbeing — the ancient Greek ideal of eudaimonia or ‘the good life.’ The lack of any clear positive relationship between rising income and rising happiness has become one of the most powerful findings in the modern social sciences. Once our income reaches an amount that covers our basic needs, further increases add little, if anything, to our levels of life satisfaction.

The second false prophet of fulfillment, as Y Combinator founder Paul Graham has poignantly cautioned and Debbie Millman has poetically articulated, is prestige. Krznaric admonishes:

We can easily find ourselves pursuing a career that society considers prestigious, but which we are not intrinsically devoted to ourselves — one that does not fulfill us on a day-to-day basis.

Krznaric pits respect, which he defines as “being appreciated for what we personally bring to a job, and being valued for our individual contribution,” as the positive counterpart to prestige and status, arguing that “in our quest for fulfilling work, we should seek a job that offers not just good status prospects, but good respect prospects.”

Rather than hoping to create a harmonious union between the pursuit of money and values, we might have better luck trying to combine values with talents. This idea comes courtesy of Aristotle, who is credited with saying, ‘Where the needs of the world and your talents cross, there lies your vocation.’

Originally featured in April — read the full article here.


“If you are not making mistakes, you’re not taking enough risks,” Debbie Millman counseled. “Make New Mistakes. Make glorious, amazing mistakes. Make mistakes nobody’s ever made before,” Neil Gaiman advised young creators. In Intuition Pumps And Other Tools for Thinking (public library), the inimitable Daniel Dennett, one of our greatest living philosophers, offers a set of thinking tools — “handy prosthetic imagination-extenders and focus holders” — that allow us to “think reliably and even gracefully about really hard questions” and enhance our cognitive toolkit. He calls these tools “intuition pumps” — thought experiments designed to stir “a heartfelt, table-thumping intuition” (which we know is a pillar of even the most “rational” of science) about the question at hand, a kind of persuasion tool the reverse-engineering of which enables us to think better about thinking itself. Intuition, of course, is a domain-specific ability that relies on honed critical thinking rather than a mystical quality bestowed by the gods — but that’s precisely Dennett’s point, and his task is to help us hone it.

Though most of his 77 “intuition pumps” address concrete questions, a dozen are “general-purpose” tools that apply deeply and widely, across just about any domain of thinking. The first of them is also arguably the most useful yet most uncomfortable: making mistakes.

Echoing Dorion Sagan’s case for why science and philosophy need each other, Dennett begins with an astute contribution to the best definitions of philosophy, wrapped in a necessary admonition about the value of history:

The history of philosophy is in large measure the history of very smart people making very tempting mistakes, and if you don’t know the history, you are doomed to making the same darn mistakes all over again. … There is no such thing as philosophy-free science, just science that has been conducted without any consideration of its underlying philosophical assumptions.

He speaks for the generative potential of mistakes and their usefulness as an empirical tool:

Sometimes you don’t just want to risk making mistakes; you actually want to make them — if only to give you something clear and detailed to fix.

Therein lies the power of mistakes as a vehicle for, as Rilke famously put it, “living the questions” and thus advancing knowledge in a way that certainty cannot — for, as Richard Feynman memorably noted, the scientist’s job is to remain unsure, and so seems the philosopher’s. Dennett writes:

We philosophers are mistake specialists. … While other disciplines specialize in getting the right answers to their defining questions, we philosophers specialize in all the ways there are of getting things so mixed up, so deeply wrong, that nobody is even sure what the right questions are, let alone the answers. Asking the wrong questions risks setting any inquiry off on the wrong foot. Whenever that happens, this is a job for philosophers! Philosophy — in every field of inquiry — is what you have to do until you figure out what questions you should have been asking in the first place.


Mistakes are not just opportunities for learning; they are, in an important sense, the only opportunity for learning or making something truly new. Before there can be learning, there must be learners. There are only two non-miraculous ways for learners to come into existence: they must either evolve or be designed and built by learners that evolved. Biological evolution proceeds by a grand, inexorable process of trial and error — and without the errors the trials wouldn’t accomplish anything.

Dennett offers a caveat that at once highlights the importance of acquiring knowledge and reminds us of the power of “chance-opportunism”:

Trials can be either blind or foresighted. You, who know a lot, but not the answer to the question at hand, can take leaps — foresighted leaps. You can look before you leap, and hence be somewhat guided from the outset by what you already know. You need not be guessing at random, but don’t look down your nose at random guesses; among its wonderful products is … you!

And since evolution is the highest epitome of how the process of trial and error drives progress, Dennett makes a case for understanding evolution as a key to understanding everything else we humans value:

Evolution … is the central, enabling process not only of life but also of knowledge and learning and understanding. If you attempt to make sense of the world of ideas and meanings, free will and morality, art and science and even philosophy itself without a sound and quite detailed knowledge of evolution, you have one hand tied behind your back. … For evolution, which knows nothing, the steps into novelty are blindly taken by mutations, which are random copying “errors” in DNA.

Dennett echoes Dostoyevsky (“Above all, don’t lie to yourself. The man who lies to himself and listens to his own lie comes to a point that he cannot distinguish the truth within him, or around him, and so loses all respect for himself and for others.”) and offers the key to making productive mistakes:

The chief trick to making good mistakes is not to hide them — especially not from yourself. Instead of turning away in denial when you make a mistake, you should become a connoisseur of your own mistakes, turning them over in your mind as if they were works of art, which in a way they are. … The trick is to take advantage of the particular details of the mess you’ve made, so that your next attempt will be informed by it and not just another blind stab in the dark.

We have all heard the forlorn refrain “Well, it seemed like a good idea at the time!” This phrase has come to stand for the rueful reflection of an idiot, a sign of stupidity, but in fact we should appreciate it as a pillar of wisdom. Any being, any agent, who can truly say, “Well, it seemed like a good idea at the time!” is standing on the threshold of brilliance.

Originally featured in May — read the full article here.


“The habit of mind which leads to a search for relationships between facts,” wrote James Webb Young in his famous 1939 5-step technique for creative problem-solving, “becomes of the highest importance in the production of ideas.” But just how does one acquire those vital cognitive customs? That’s precisely what science writer Maria Konnikova explores in Mastermind: How to Think Like Sherlock Holmes (UK; public library) — an effort to reverse-engineer Holmes’s methodology into actionable insights that help develop “habits of thought that will allow you to engage mindfully with yourself and your world as a matter of course.”

Bridging ample anecdotes from the adventures of Conan Doyle’s beloved detective with psychology studies both classic and cutting-edge, Konnikova builds a compelling case at the intersection of science and secular spiritualism, stressing the power of rigorous observation alongside a Buddhist-like, Cageian emphasis on mindfulness. She writes:

The idea of mindfulness itself is by no means a new one. As early as the end of the nineteenth century, William James, the father of modern psychology, wrote that, ‘The faculty of voluntarily bringing back a wandering attention, over and over again, is the very root of judgment, character, and will. … An education which should improve this faculty would be the education par excellence.’ That faculty, at its core, is the very essence of mindfulness. And the education that James proposes, an education in a mindful approach to life and to thought.


In recent years, studies have shown that meditation-like thought (an exercise in the very attentional control that forms the center of mindfulness), for as little as fifteen minutes a day, can shift frontal brain activity toward a pattern that has been associated with more positive and more approach-oriented emotional states, and that looking at scenes of nature, for even a short while, can help us become more insightful, more creative, and more productive. We also know, more definitively than we ever have, that our brains are not built for multitasking — something that precludes mindfulness altogether. When we are forced to do multiple things at once, not only do we perform worse on all of them but our memory decreases and our general wellbeing suffers a palpable hit.

But for Sherlock Holmes, mindful presence is just a first step. It’s a means to a far larger, far more practical and practically gratifying goal. Holmes provides precisely what William James had prescribed: an education in improving our faculty of mindful thought and in using it in order to accomplish more, think better, and decide more optimally. In its broadest application, it is a means for improving overall decision making and judgment ability, starting from the most basic building block of your own mind.

But mindfulness, and the related mental powers it bestows upon its master, is a skill acquired with grit and practice, rather than an in-born talent or an easy feat attained with a few half-hearted tries:

It is most difficult to apply Holmes’s logic in those moments that matter the most. And so, all we can do is practice, until our habits are such that even the most severe stressors will bring out the very thought patterns that we’ve worked so hard to master.

Echoing Carl Sagan, Konnikova examines the role of intuition — a grab-bag concept embraced by some of history’s greatest scientific minds, cultural icons, and philosophers — as both a helpful directional signpost of intellectual inquiry and a dangerous blind spot:

Our intuition is shaped by context, and that context is deeply informed by the world we live in. It can thus serve as a blinder — or blind spot — of sorts. … With mindfulness, however, we can strive to find a balance between fact-checking our intuitions and remaining open-minded. We can then make our best judgments, with the information we have and no more, but with, as well, the understanding that time may change the shape and color of that information.

“I consider that a man’s brain originally is like a little empty attic, and you have to stock it with such furniture as you choose,” Holmes famously remarked. Indeed, much like the inventor’s mind, the problem-solver’s mind is the product of that very choice: The details and observations we select to include in our “brain attic” shape and filter our perception of reality. Konnikova writes:

Observation with a capital O — the way Holmes uses the word when he gives his new companion a brief history of his life with a single glance — does entail more than, well, observation (the lowercase kind). It’s not just about the passive process of letting objects enter into your visual field. It is about knowing what and how to observe and directing your attention accordingly: what details do you focus on? What details do you omit? And how do you take in and capture those details that you do choose to zoom in on? In other words, how do you maximize your brain attic’s potential? You don’t just throw any old detail up there, if you remember Holmes’s early admonitions; you want to keep it as clean as possible. Everything we choose to notice has the potential to become a future furnishing of our attics — and what’s more, its addition will mean a change in the attic’s landscape that will affect, in turn, each future addition. So we have to choose wisely.

Choosing wisely means being selective. It means not only looking but looking properly, looking with real thought. It means looking with the full knowledge that what you note — and how you note it — will form the basis of any future deductions you might make. It’s about seeing the full picture, noting the details that matter, and understanding how to contextualize those details within a broader framework of thought.

Originally featured in January — read the full article for more, including Konnikova’s four rules for Sherlockian thinking.


Commencement season is upon us and, after Greil Marcus’s soul-stirring speech on the essence of art at the 2013 School of Visual Arts graduation ceremony, here comes an exceptional adaptation of one of the best commencement addresses ever delivered: In May of 2012, beloved author Neil Gaiman stood up in front of the graduating class at Philadelphia’s University of the Arts and dispensed some timeless advice on the creative life; now, his talk comes to life as a slim but potent book titled Make Good Art (public library).

Best of all, it’s designed by none other than the inimitable Chip Kidd, who has spent the past fifteen years shaping the voice of contemporary cover design with his prolific and consistently stellar output, ranging from bestsellers like cartoonist Chris Ware’s sublime Building Stories and neurologist Oliver Sacks’s The Mind’s Eye to lesser-known gems like The Paris Review’s Women Writers at Work and The Letter Q, that wonderful anthology of queer writers’ letters to their younger selves. (Fittingly, Kidd also designed the book adaptation of Ann Patchett’s 2006 commencement address.)

When things get tough, this is what you should do: Make good art. I’m serious. Husband runs off with a politician — make good art. Leg crushed and then eaten by a mutated boa constrictor — make good art. IRS on your trail — make good art. Cat exploded — make good art. Someone on the Internet thinks what you’re doing is stupid or evil or it’s all been done before — make good art. Probably things will work out somehow, eventually time will take the sting away, and that doesn’t even matter. Do what only you can do best: Make good art. Make it on the bad days, make it on the good days, too.

A wise woman once said, “If you are not making mistakes, you’re not taking enough risks.” Gaiman articulates the same sentiment with his own brand of exquisite eloquence:

I hope that in this year to come, you make mistakes.

Because if you are making mistakes, then you are making new things, trying new things, learning, living, pushing yourself, changing yourself, changing your world. You’re doing things you’ve never done before, and more importantly, you’re Doing Something.

So that’s my wish for you, and all of us, and my wish for myself. Make New Mistakes. Make glorious, amazing mistakes. Make mistakes nobody’s ever made before. Don’t freeze, don’t stop, don’t worry that it isn’t good enough, or it isn’t perfect, whatever it is: art, or love, or work or family or life.

Whatever it is you’re scared of doing, Do it.

Make your mistakes, next year and forever.

Originally featured in May — read the full article here, along with a video of Gaiman’s original commencement address.


In How Children Succeed: Grit, Curiosity, and the Hidden Power of Character (public library) — a necessary addition to these fantastic reads on education — Paul Tough, whose writing has appeared in The New Yorker, Slate, Esquire, and The New York Times, sets out to investigate the essential building blocks of character through the findings and practical insight of exceptional educators and bleeding-edge researchers. One of his core arguments is based on the work of pioneering psychologist and 2013 MacArthur “genius” grantee Angela Duckworth, who studied under positive psychology godfather Martin Seligman at my alma mater, the University of Pennsylvania, and has done more than anyone to advance our understanding of how self-control and grit — the relentless work ethic of sustaining your commitments toward a long-term goal — impact success.

Duckworth had come to Penn in 2002, at the age of thirty-two, later in life than a typical graduate student. The daughter of Chinese immigrants, she had been a classic multitasking overachiever in her teens and twenties. After completing her undergraduate degree at Harvard (and starting a summer school for low-income kids in Cambridge in her spare time), she had bounced from one station of the mid-nineties meritocracy to the next: intern in the White House speechwriting office, Marshall scholar at Oxford (where she studied neuroscience), management consultant for McKinsey and Company, charter-school adviser.

Duckworth spent a number of years toying with the idea of starting her own charter school, but eventually concluded that the model didn’t hold much promise for changing the circumstances of children from disadvantaged backgrounds, those whom the education system was failing most tragically. Instead, she decided to pursue a PhD program at Penn. In her application essay, she shared how profoundly the experience of working in schools had changed her view of school reform and wrote:

The problem, I think, is not only the schools but also the students themselves. Here’s why: learning is hard. True, learning is fun, exhilarating and gratifying — but it is also often daunting, exhausting and sometimes discouraging. . . . To help chronically low-performing but intelligent students, educators and parents must first recognize that character is at least as important as intellect.

Duckworth began her graduate work by studying self-discipline. But when she completed her first-year thesis, based on a group of 164 eighth-graders from a Philadelphia middle school, she arrived at a startling discovery that would shape the course of her career: the students’ self-discipline scores were far better predictors of their academic performance than their IQ scores. She became intensely interested in what strategies and tricks we might develop to maximize our self-control, and in whether those strategies could be taught. But self-control, it turned out, was only a good predictor when it came to immediate, concrete goals — like, say, resisting a cookie. Tough writes:

Duckworth finds it useful to divide the mechanics of achievement into two separate dimensions: motivation and volition. Each one, she says, is necessary to achieve long-term goals, but neither is sufficient alone. Most of us are familiar with the experience of possessing motivation but lacking volition: You can be extremely motivated to lose weight, for example, but unless you have the volition — the willpower, the self-control — to put down the cherry Danish and pick up the free weights, you’re not going to succeed. If a child is highly motivated, the self-control techniques and exercises Duckworth tried to teach [the students in her study] might be very helpful. But what if students just aren’t motivated to achieve the goals their teachers or parents want them to achieve? Then, Duckworth acknowledges, all the self-control tricks in the world aren’t going to help.

This is where grit comes in — the X-factor that helps us attain more long-term, abstract goals. To address this, Duckworth and her colleague Chris Peterson developed the Grit Scale — a deceptively simple test on which you evaluate how well each of twelve statements applies to you, from “I am a hard worker” to “New ideas and projects sometimes distract me from previous ones.” The results are profoundly predictive of success in domains of achievement as wide-ranging as the National Spelling Bee and the West Point military academy. Tough describes the surprising power of this seemingly mundane questionnaire:

For each statement, respondents score themselves on a five-point scale, ranging from 5, “very much like me,” to 1, “not like me at all.” The test takes about three minutes to complete, and it relies entirely on self-report — and yet when Duckworth and Peterson took it out into the field, they found it was remarkably predictive of success. Grit, Duckworth discovered, is only faintly related to IQ — there are smart gritty people and dumb gritty people — but at Penn, high grit scores allowed students who had entered college with relatively low college-board scores to nonetheless achieve high GPAs. At the National Spelling Bee, Duckworth found that children with high grit scores were more likely to survive to the later rounds. Most remarkable, Duckworth and Peterson gave their grit test to more than twelve hundred freshman cadets as they entered the military academy at West Point and embarked on the grueling summer training course known as Beast Barracks. The military has developed its own complex evaluation, called the whole candidate score, to judge incoming cadets and predict which of them will survive the demands of West Point; it includes academic grades, a gauge of physical fitness, and a leadership potential score. But the more accurate predictor of which cadets persisted in Beast Barracks and which ones dropped out turned out to be Duckworth’s simple little twelve-item grit questionnaire.
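The scoring mechanics Tough describes can be sketched in a few lines of code — a toy illustration, not the published instrument. (The choice of which items are reverse-scored here is hypothetical; in the actual scale, statements that cut against grit, like being distracted by new projects, are flipped before averaging.)

```python
# Toy sketch of Grit Scale-style scoring: twelve statements rated
# from 1 ("not like me at all") to 5 ("very much like me"), with
# anti-grit statements reverse-scored, averaged into one score.

REVERSED = {1, 3, 5, 7, 9, 11}  # hypothetical indices of reverse-scored items

def grit_score(responses):
    """Average twelve 1-5 self-report responses, flipping reversed items."""
    if len(responses) != 12 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("expected twelve responses on a 1-5 scale")
    total = sum((6 - r) if i in REVERSED else r
                for i, r in enumerate(responses))
    return total / 12

# A respondent who rates every grit-aligned statement 5 and every
# reversed statement 1 earns the maximum score of 5.0.
print(grit_score([5, 1] * 6))  # 5.0
```

Part of what Tough finds remarkable is visible even in this sketch: the whole instrument reduces to a three-minute self-report and a single mean, yet it out-predicted West Point’s elaborate “whole candidate score.”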

You can take the Grit Scale here (registration is free).


Every year, intellectual impresario and Edge editor John Brockman summons some of our era’s greatest thinkers and unleashes them on one provocative question, whether it’s the single most elegant theory of how the world works or the best way to enhance our cognitive toolkit. This year, he sets out on the most ambitious quest yet, a meta-exploration of thought itself: Thinking: The New Science of Decision-Making, Problem-Solving, and Prediction (public library) collects short essays and lecture adaptations from such celebrated and wide-ranging (though not in gender) minds as Daniel Dennett, Jonathan Haidt, Dan Gilbert, and Timothy Wilson, covering subjects as diverse as morality, essentialism, and the adolescent brain.

One of the most provocative contributions comes from Nobel-winning psychologist Daniel Kahneman — author of the indispensable Thinking, Fast and Slow, one of the best psychology books of 2012 — who examines “the marvels and the flaws of intuitive thinking.”

In the 1970s, Kahneman and his colleague Amos Tversky, self-crowned “prophets of irrationality,” began studying what they called “heuristics and biases” — mental shortcuts we take, which frequently result in cognitive errors. Those errors, however, reveal a great deal about how our minds work:

If you want to characterize how something is done, then one of the most powerful ways of characterizing how the mind does anything is by looking at the errors that the mind produces while it’s doing it because the errors tell you what it is doing. Correct performance tells you much less about the procedure than the errors do.

One of the most fascinating examples of heuristics and biases is what we call intuition — a complex cluster of cognitive processes, sometimes helpful but often misleading. Kahneman notes that thoughts come to mind in one of two ways: Either by “orderly computation,” which involves a series of stages of remembering rules and then applying them, or by perception, an evolutionary function that allows us to predict outcomes based on what we’re perceiving. (For instance, seeing a woman’s angry face helps us predict the general sentiment and disposition of what she’s about to say.) It is the latter mode that precipitates intuition. Kahneman explains the interplay:

There is no sharp line between intuition and perception. … Perception is predictive. . . . If you want to understand intuition, it is very useful to understand perception, because so many of the rules that apply to perception apply as well to intuitive thinking. Intuitive thinking is quite different from perception. Intuitive thinking has language. Intuitive thinking has a lot of word knowledge organized in different ways more than mere perception. But some very basic characteristics [of] perception are extended almost directly to intuitive thinking.

He then considers how the two types of mental operations established by modern cognitive science illuminate intuition:

Type 1 is automatic, effortless, often unconscious, and associatively coherent. . . . Type 2 is controlled, effortful, usually conscious, tends to be logically coherent, rule-governed. Perception and intuition are Type 1. … Type 2 is more controlled, slower, is more deliberate. . . . Type 2 is who we think we are. [And yet] if one made a film on this, Type 2 would be a secondary character who thinks that he is the hero because that’s who we think we are, but in fact, it’s Type 1 that does most of the work, and it’s most of the work that is completely hidden from us.

Type 1 also encompasses all of our practiced skills — for instance, driving, speaking, and understanding a language — which after a certain threshold of mastery enter autopilot mode. (Though this presents its own set of problems.) Underpinning that mode of thinking is our associative memory, which Kahneman unpacks:

You have to think of [your associative memory] as a huge repository of ideas, linked to each other in many ways, including causal links and other links, and activation spreading from ideas to other ideas until a small subset of that enormous network is illuminated, and the subset is what’s happening in the mind at the moment. You’re not conscious of it, you’re conscious of very little of it.

The Type 1 modality of thought gives rise to a System 1 of interpretation, which is at the heart of what we call “intuition” — but which is far less accurate and reliable than we like to believe:

System 1 infers and invents causes and intentions. [This] happens automatically. Infants have it. . . . We’re equipped … for the perception of causality.

It neglects ambiguity and suppresses doubt and … exaggerates coherence. Associative coherence [is] in large part where the marvels turn into flaws. We see a world that is vastly more coherent than the world actually is. That’s because of this coherence-creating mechanism that we have. We have a sense-making organ in our heads, and we tend to see things that are emotionally coherent, and that are associatively coherent.

Most treacherous of all is our tendency to use our very confidence — and overconfidence — as evidence itself:

What’s interesting is that many a time people have intuitions that they’re equally confident about except they’re wrong. That happens through the mechanism I call “the mechanism of substitution.” You have been asked a question, and instead you answer another question, but that answer comes by itself with complete confidence, and you’re not aware that you’re doing something that you’re not an expert on because you have one answer. Subjectively, whether it’s right or wrong, it feels exactly the same. Whether it’s based on a lot of information, or a little information, this is something that you may step back and have a look at. But the subjective sense of confidence can be the same for intuition that arrives from expertise, and for intuitions that arise from heuristics. . . .

In other words, intuition, like attention, is “an intentional, unapologetic discriminator [that] asks what is relevant right now, and gears us up to notice only that” — a humbling antidote to our culture’s propensity for self-righteousness, and above all a reminder to allow yourself the uncomfortable luxury of changing your mind.

Originally featured in October — read the full article here.


We seem to have a strange but all too human cultural fixation on the daily routines and daily rituals of famous creators, from Vonnegut to Burroughs to Darwin — as if a glimpse of their day-to-day would somehow magically infuse ours with equal potency, or replicating it would allow us to replicate their genius in turn. And though much of this is mere cultural voyeurism, there is something to be said for the value of a well-engineered daily routine to anchor the creative process. Manage Your Day-to-Day: Build Your Routine, Find Your Focus, and Sharpen Your Creative Mind (public library), edited by Behance’s 99U editor-in-chief Jocelyn Glei, delves into the secrets of this holy grail of creativity. Twenty of today’s most celebrated thinkers and doers explore such facets of the creative life as optimizing your idea-generation, defying the demons of perfectionism, managing procrastination, and breaking through your creative blocks, with insights from magnificent minds ranging from behavioral economist Dan Ariely to beloved graphic designer Stefan Sagmeister.

In the foreword to the book, Behance founder Scott Belsky, author of the indispensable Making Ideas Happen, points to “reactionary workflow” — our tendency to respond to requests and other stimuli rather than create meaningful work — as today’s biggest problem and issues a call to arms:

It’s time to stop blaming our surroundings and start taking responsibility. While no workplace is perfect, it turns out that our gravest challenges are a lot more primal and personal. Our individual practices ultimately determine what we do and how well we do it. Specifically, it’s our routine (or lack thereof), our capacity to work proactively rather than reactively, and our ability to systematically optimize our work habits over time that determine our ability to make ideas happen.


Only by taking charge of your day-to-day can you truly make an impact in what matters most to you. I urge you to build a better routine by stepping outside of it, find your focus by rising above the constant cacophony, and sharpen your creative prowess by analyzing what really matters most when it comes to making your ideas happen.

One of the book’s strongest insights comes from Gretchen Rubin — author of The Happiness Project: Or, Why I Spent a Year Trying to Sing in the Morning, Clean My Closets, Fight Right, Read Aristotle, and Generally Have More Fun, one of these 7 essential books on the art and science of happiness, titled after her fantastic blog of the same name — who points to frequency as the key to creative accomplishment:

We tend to overestimate what we can do in a short period, and underestimate what we can do over a long period, provided we work slowly and consistently. Anthony Trollope, the nineteenth-century writer who managed to be a prolific novelist while also revolutionizing the British postal system, observed, “A small daily task, if it be really daily, will beat the labours of a spasmodic Hercules.” Over the long run, the unglamorous habit of frequency fosters both productivity and creativity.

Frequency, she argues, helps facilitate what Arthur Koestler has famously termed “bisociation” — the crucial ability to link the seemingly unlinkable, which is the defining characteristic of the creative mind. Rubin writes:

You’re much more likely to spot surprising relationships and to see fresh connections among ideas, if your mind is constantly humming with issues related to your work. When I’m deep in a project, everything I experience seems to relate to it in a way that’s absolutely exhilarating. The entire world becomes more interesting. That’s critical, because I have a voracious need for material, and as I become hyperaware of potential fodder, ideas pour in. By contrast, working sporadically makes it hard to keep your focus. It’s easy to become blocked, confused, or distracted, or to forget what you were aiming to accomplish.


Creativity arises from a constant churn of ideas, and one of the easiest ways to encourage that fertile froth is to keep your mind engaged with your project. When you work regularly, inspiration strikes regularly.

Echoing Alexander Graham Bell, who memorably wrote that “it is the man who carefully advances step by step … who is bound to succeed in the greatest degree,” and Virginia Woolf, who extolled the creative benefits of keeping a diary, Rubin writes:

Step by step, you make your way forward. That’s why practices such as daily writing exercises or keeping a daily blog can be so helpful. You see yourself do the work, which shows you that you can do the work. Progress is reassuring and inspiring; panic and then despair set in when you find yourself getting nothing done day after day. One of the painful ironies of work life is that the anxiety of procrastination often makes people even less likely to buckle down in the future.

Riffing on wisdom from her latest book, Happier at Home: Kiss More, Jump More, Abandon a Project, Read Samuel Johnson, and My Other Experiments in the Practice of Everyday Life, Rubin offers:

I have a long list of “Secrets of Adulthood,” the lessons I’ve learned as I’ve grown up, such as: “It’s the task that’s never started that’s more tiresome,” “The days are long, but the years are short,” and “Always leave plenty of room in the suitcase.” One of my most helpful Secrets is, “What I do every day matters more than what I do once in a while.”

With a sentiment reminiscent of William James’s timeless words on habit, she concludes:

Day by day, we build our lives, and day by day, we can take steps toward making real the magnificent creations of our imaginations.

Entrepreneurship guru and culture-sage Seth Godin seconds Rubin and admonishes against confusing vacant ritualization with creative rituals that actually spur productivity:

Everybody who does creative work has figured out how to deal with their own demons to get their work done. There is no evidence that setting up your easel like Van Gogh makes you paint better. Tactics are idiosyncratic. But strategies are universal, and there are a lot of talented folks who are not succeeding the way they want to because their strategies are broken.

The strategy is simple, I think. The strategy is to have a practice, and what it means to have a practice is to regularly and reliably do the work in a habitual way.

There are many ways you can signify to yourself that you are doing your practice. For example, some people wear a white lab coat or a particular pair of glasses, or always work in a specific place — in doing these things, they are professionalizing their art.

He echoes Chuck Close (“Inspiration is for amateurs — the rest of us just show up and get to work.”), Tchaikovsky (“a self-respecting artist must not fold his hands on the pretext that he is not in the mood.”), E. B. White (“A writer who waits for ideal conditions under which to work will die without putting a word on paper.”), and Isabel Allende (“Show up, show up, show up, and after a while the muse shows up, too.”), observing:

The notion that I do my work here, now, like this, even when I do not feel like it, and especially when I do not feel like it, is very important. Because lots and lots of people are creative when they feel like it, but you are only going to become a professional if you do it when you don’t feel like it. And that emotional waiver is why this is your work and not your hobby.

Originally featured in May — read the full article here. Also of note: 99U’s sequel, Maximize Your Potential, which collects practical wisdom from 21 celebrated creative entrepreneurs.


“The principle of give and take; that is diplomacy — give one and take ten,” Mark Twain famously smirked. But for every such cynicism, there’s a heartening meditation on the art of asking and the beautiful osmosis of altruism. “The world is just,” Amelia Barr admonished in her rules for success, “it may, it does, patronize quacks; but it never puts them on a level with true men.” After all, it pays to be nice because, as Austin Kleon put it, “the world is a small town,” right?

Well, maybe — maybe not. However small a town the world may be, how givers and takers fare in matters of success proves to be more complicated. So argues organizational psychology wunderkind Adam Grant (remember him?), the youngest-tenured and highest-rated Wharton professor at my alma mater, in Give and Take: A Revolutionary Approach to Success (public library).

Grant’s extensive research has shed light on a crucial element of success, debunking some enduring tenets of cultural mythology:

According to conventional wisdom, highly successful people have three things in common: motivation, ability, and opportunity. If we want to succeed, we need a combination of hard work, talent, and luck. [But there is] a fourth ingredient, one that’s critical but often neglected: success depends heavily on how we approach our interactions with other people. Every time we interact with another person at work, we have a choice to make: do we try to claim as much value as we can, or contribute value without worrying about what we receive in return?

At the heart of his insight is a dichotomy of behavioral styles people adopt in pursuing success:

Takers have a distinctive signature: they like to get more than they give. They tilt reciprocity in their own favor, putting their own interests ahead of others’ needs. Takers believe that the world is a competitive, dog-eat-dog place. They feel that to succeed, they need to be better than others. To prove their competence, they self-promote and make sure they get plenty of credit for their efforts. Garden-variety takers aren’t cruel or cutthroat; they’re just cautious and self-protective. “If I don’t look out for myself first,” takers think, “no one will.”

Grant contrasts takers with givers:

In the workplace, givers are a relatively rare breed. They tilt reciprocity in the other direction, preferring to give more than they get. Whereas takers tend to be self-focused, evaluating what other people can offer them, givers are other-focused, paying more attention to what other people need from them. These preferences aren’t about money: givers and takers aren’t distinguished by how much they donate to charity or the compensation that they command from their employers. Rather, givers and takers differ in their attitudes and actions toward other people. If you’re a taker, you help others strategically, when the benefits to you outweigh the personal costs. If you’re a giver, you might use a different cost-benefit analysis: you help whenever the benefits to others exceed the personal costs. Alternatively, you might not think about the personal costs at all, helping others without expecting anything in return. If you’re a giver at work, you simply strive to be generous in sharing your time, energy, knowledge, skills, ideas, and connections with other people who can benefit from them.

Outside the workplace, Grant argues by citing Yale psychologist Margaret Clark’s research, most of us are givers in close relationships like marriages and friendships, contributing without preoccupation with keeping score. In the workplace, however, few of us are purely givers or takers — rather, what dominates is a third style:

We become matchers, striving to preserve an equal balance of giving and getting. Matchers operate on the principle of fairness: when they help others, they protect themselves by seeking reciprocity. If you’re a matcher, you believe in tit for tat, and your relationships are governed by even exchanges of favors.

True to psychologists’ repeated insistence that personality is fluid rather than fixed, Grant notes:

Giving, taking, and matching are three fundamental styles of social interaction, but the lines between them aren’t hard and fast. You might find that you shift from one reciprocity style to another as you travel across different work roles and relationships. It wouldn’t be surprising if you act like a taker when negotiating your salary, a giver when mentoring someone with less experience than you, and a matcher when sharing expertise with a colleague. But evidence shows that at work, the vast majority of people develop a primary reciprocity style, which captures how they approach most of the people most of the time. And this primary style can play as much of a role in our success as hard work, talent, and luck.

Originally featured in April — for a closer look at Grant’s findings on the science of success, read the full article here.


Despite ample evidence and countless testaments to the contrary, there persists a toxic cultural mythology that creative and intellectual excellence comes from a passive gift bestowed upon the fortunate few by the gods of genius, rather than being the product of the active application and consistent cultivation of skill. So what might the root of that stubborn fallacy be? Childhood and upbringing, it turns out, might have a lot to do with it.

In The Examined Life: How We Lose and Find Ourselves (public library), psychoanalyst and University College London professor Stephen Grosz builds on more than 50,000 hours of conversation from his quarter-century experience as a practicing psychoanalyst to explore the machinery of our inner life, with insights that are invariably profound and often provocative — for instance, a section titled “How praise can cause a loss of confidence,” in which Grosz writes:

Nowadays, we lavish praise on our children. Praise, self-confidence and academic performance, it is commonly believed, rise and fall together. But current research suggests otherwise — over the past decade, a number of studies on self-esteem have come to the conclusion that praising a child as ‘clever’ may not help her at school. In fact, it might cause her to under-perform. Often a child will react to praise by quitting — why make a new drawing if you have already made ‘the best’? Or a child may simply repeat the same work — why draw something new, or in a new way, if the old way always gets applause?

Grosz cites psychologists Carol Dweck and Claudia Mueller’s famous 1998 study, which divided 128 children ages 10 and 11 into two groups. All were asked to solve mathematical problems, but one group was praised for their intellect (“You did really well, you’re so clever.”) while the other was praised for their effort (“You did really well, you must have tried really hard.”). The kids were then given more complex problems, which those previously praised for their hard work approached with dramatically greater resilience and willingness to try different approaches whenever they reached a dead end. By contrast, those who had been praised for their cleverness were much more anxious about failure, stuck with tasks they had already mastered, and dwindled in tenacity in the face of new problems. Grosz summarizes the now-legendary findings:

Ultimately, the thrill created by being told ‘You’re so clever’ gave way to an increase in anxiety and a drop in self-esteem, motivation and performance. When asked by the researchers to write to children in another school, recounting their experience, some of the ‘clever’ children lied, inflating their scores. In short, all it took to knock these youngsters’ confidence, to make them so unhappy that they lied, was one sentence of praise.

He goes on to admonish against today’s culture of excessive parental praise, which he argues does more for lifting the self-esteem of the parents than for cultivating a healthy one in their children:

Admiring our children may temporarily lift our self-esteem by signaling to those around us what fantastic parents we are and what terrific kids we have — but it isn’t doing much for a child’s sense of self. In trying so hard to be different from our parents, we’re actually doing much the same thing — doling out empty praise the way an earlier generation doled out thoughtless criticism. If we do it to avoid thinking about our child and her world, and about what our child feels, then praise, just like criticism, is ultimately expressing our indifference.

To explore what the healthier substitute for praise might be, he recounts observing an eighty-year-old remedial reading teacher named Charlotte Stiglitz, the mother of the Nobel Prize-winning economist Joseph Stiglitz, who told Grosz of her teaching methodology:

‘I don’t praise a small child for doing what they ought to be able to do,’ she told me. ‘I praise them when they do something really difficult — like sharing a toy or showing patience. I also think it is important to say “thank you”. When I’m slow in getting a snack for a child, or slow to help them and they have been patient, I thank them. But I wouldn’t praise a child who is playing or reading.’

Rather than utilizing the familiar mechanisms of reward and punishment, Grosz observed, Charlotte’s method relied on keen attentiveness to “what a child did and how that child did it.” Presence, he argues, helps build the child’s confidence by way of indicating he is worthy of the observer’s thoughts and attention — its absence, on the other hand, divorces in the child the journey from the destination by instilling a sense that the activity itself is worthless unless it’s a means to obtaining praise. Grosz reminds us how this plays out for all of us, and why it matters throughout life:

Being present, whether with children, with friends, or even with oneself, is always hard work. But isn’t this attentiveness — the feeling that someone is trying to think about us — something we want more than praise?

Originally featured in May — read the full article here.


Whether it’s “selling” your ideas, your writing, or yourself to a potential mate, the art of the sell is crucial to your fulfillment in life, both personal and professional. So argues Dan Pink in To Sell Is Human: The Surprising Truth About Moving Others (public library; UK) — a provocative anatomy of the art-science of “selling” in the broadest possible sense of the word, substantiated by ample research spanning psychology, behavioral economics, and the social sciences.

Pink, wary of the disagreeable twinges accompanying the claim that everyone should self-identify as a salesperson, preemptively counters in the introduction:

I’m convinced we’ve gotten it wrong.

This is a book about sales. But it is unlike any book about sales you have read (or ignored) before. That’s because selling in all its dimensions — whether pushing Buicks on a lot or pitching ideas in a meeting — has changed more in the last ten years than it did over the previous hundred. Most of what we think we understand about selling is constructed atop a foundation of assumptions that have crumbled.


Selling, I’ve grown to understand, is more urgent, more important, and, in its own sweet way, more beautiful than we realize. The ability to move others to exchange what they have for what we have is crucial to our survival and our happiness. It has helped our species evolve, lifted our living standards, and enhanced our daily lives. The capacity to sell isn’t some unnatural adaptation to the merciless world of commerce. It is part of who we are.

One of Pink’s most fascinating arguments echoes artist Chuck Close, who famously noted that “our whole society is much too problem-solving oriented. It is far more interesting to [participate in] ‘problem creation.’” Pink cites the research of celebrated social scientists Jacob Getzels and Mihaly Csikszentmihalyi, who in the 1960s recruited three dozen fourth-year art students for an experiment. They brought the young artists into a studio with two large tables. The first table displayed 27 eclectic objects that the school used in its drawing classes. The students were instructed to select one or more objects, then arrange a still life on the second table and draw it. What happened next reveals an essential pattern about how creativity works:

The young artists approached their task in two distinct ways. Some examined relatively few objects, outlined their idea swiftly, and moved quickly to draw their still life. Others took their time. They handled more objects, turned them this way and that, rearranged them several times, and needed much longer to complete the drawing. As Csikszentmihalyi saw it, the first group was trying to solve a problem: How can I produce a good drawing? The second was trying to find a problem: What good drawing can I produce?

When Csikszentmihalyi then assembled a group of art experts to evaluate the resulting works, he found that the problem-finders’ drawings had been ranked much higher in creativity than the problem-solvers’. Ten years later, the researchers tracked down these art students, who at that point were working for a living, and found that about half had left the art world, while the other half had gone on to become professional artists. That latter group was composed almost entirely of problem-finders. Another decade later, the researchers checked in again and discovered that the problem-finders were “significantly more successful — by the standards of the artistic community — than their peers.” Getzels concluded:

It is in fact the discovery and creation of problems rather than any superior knowledge, technical skill, or craftsmanship that often sets the creative person apart from others in his field.

Pink summarizes:

The more compelling view of the nature of problems has enormous implications for the new world of selling. Today, both sales and non-sales selling depend more on the creative, heuristic, problem-finding skills of artists than on the reductive, algorithmic, problem-solving skills of technicians.

Another fascinating chapter reveals counterintuitive insights about the competitive advantages of introversion vs. extraversion. While new theories might extol the power of introverts over traditional exaltations of extraversion, the truth turns out to be quite different: Pink turns to the research of social psychologist Adam Grant, management professor at the Wharton School of Business at the University of Pennsylvania (my alma mater).

Grant measured where a sample of call center sales representatives fell on the introversion-extraversion spectrum, then correlated that with their actual sales figures. Unsurprisingly, Grant found that extraverts averaged $125 per hour in revenue, exceeding introverts’ $120. His most surprising finding, however, was that “ambiverts” — those who fell in the middle of the spectrum, “not too hot, not too cold” — performed best of all, with an hourly average of $155. The outliers who brought in an astounding $208 per hour scored a solid 4 on the 1-7 introversion-extraversion scale.

Pink synthesizes the findings into an everyday insight for the rest of us:

The best approach is for the people on the ends to emulate those in the center. As some have noted, introverts are ‘geared to inspect,’ while extraverts are ‘geared to respond.’ Selling of any sort — whether traditional sales or non-sales selling — requires a delicate balance of inspecting and responding. Ambiverts can find that balance. They know when to speak and when to shut up. Their wider repertoires allow them to achieve harmony with a broader range of people and a more varied set of circumstances. Ambiverts are the best movers because they’re the most skilled attuners.

Pink goes on to outline “the new ABCs of moving others” — attunement (“the ability to bring one’s actions and outlook into harmony with other people and with the context you’re [sic] in”), buoyancy (a trifecta of “interrogative self-talk” that moves from making statements to asking questions, contagious “positivity,” and an optimistic “explanatory style” of explaining negative events to yourself), and clarity (“the capacity to help others see their situations in fresh and more revealing ways and to identify problems they didn’t realize they had”).

Originally featured in February — read the full article here, where you can watch the charming companion video.


“I pray to Jesus to preserve my sanity,” Jack Kerouac professed in discussing his writing routine. But those of us who fall on the more secular end of the spectrum might need a slightly more potent sanity-preservation tool than prayer. That’s precisely what writer and psychotherapist Philippa Perry offers in How To Stay Sane (public library; UK), part of The School of Life’s wonderful series reclaiming the traditional self-help genre as intelligent, non-self-helpy, yet immensely helpful guides to modern living.

At the heart of Perry’s argument — in line with neurologist Oliver Sacks’s recent meditation on memory and how “narrative truth,” rather than “historical truth,” shapes our impression of the world — is the recognition that stories make us human and learning to reframe our interpretations of reality is key to our experience of life:

Our stories give shape to our inchoate, disparate, fleeting impressions of everyday life. They bring together the past and the future into the present to provide us with structures for working towards our goals. They give us a sense of identity and, most importantly, serve to integrate the feelings of our right brain with the language of our left.


We are primed to use stories. Part of our survival as a species depended upon listening to the stories of our tribal elders as they shared parables and passed down their experience and the wisdom of those who went before. As we get older it is our short-term memory that fades rather than our long-term memory. Perhaps we have evolved like this so that we are able to tell the younger generation about the stories and experiences that have formed us which may be important to subsequent generations if they are to thrive.

I worry, though, about what might happen to our minds if most of the stories we hear are about greed, war and atrocity.

Perry goes on to cite research indicating that people who watch television for more than four hours a day see themselves as far more likely to fall victim to a violent incident in the forthcoming week than their peers who watch less than two hours a day. Just as E. B. White advocated for the responsibility of the writer “to lift people up, not lower them down,” so too is it our responsibility as the writers of our own life-stories to avoid the well-documented negativity bias of modern media — because, as artist Austin Kleon wisely put it, “you are a mashup of what you let into your life.” Perry writes:

Be careful which stories you expose yourself to.


The meanings you find, and the stories you hear, will have an impact on how optimistic you are: it’s how we evolved. … If you do not know how to draw positive meaning from what happens in life, the neural pathways you need to appreciate good news will never fire up.


The trouble is, if we do not have a mind that is used to hearing good news, we do not have the neural pathways to process such news.

Yet despite the adaptive optimism bias of the human brain, Perry argues a positive outlook is a practice — and one that requires mastering the art of vulnerability and increasing our essential tolerance for uncertainty:

You may find that you have been telling yourself that practicing optimism is a risk, as though, somehow, a positive attitude will invite disaster and so if you practice optimism it may increase your feelings of vulnerability. The trick is to increase your tolerance for vulnerable feelings, rather than avoid them altogether.


Optimism does not mean continual happiness, glazed eyes and a fixed grin. When I talk about the desirability of optimism I do not mean that we should delude ourselves about reality. But practicing optimism does mean focusing more on the positive fall-out of an event than on the negative. … I am not advocating the kind of optimism that means you blow all your savings on a horse running at a hundred to one; I am talking about being optimistic enough to sow some seeds in the hope that some of them will germinate and grow into flowers.

Another key obstruction to our sanity is our chronic aversion to being wrong, entwined with our damaging fear of the unfamiliar. Perry cautions:

We all like to think we keep an open mind and can change our opinions in the light of new evidence, but most of us seem to be geared to making up our minds very quickly. Then we process further evidence not with an open mind but with a filter, only acknowledging the evidence that backs up our original impression. It is too easy for us to fall into the trap of believing that being right is more important than being open to what might be.

If we practice detachment from our thoughts we learn to observe them as though we are taking a bird’s eye view of our own thinking. When we do this, we might find that our thinking belongs to an older, and different, story from the one we are now living.

Perry concludes:

We need to look at the repetitions in the stories we tell ourselves [and] at the process of the stories rather than merely their surface content. Then we can begin to experiment with changing the filter through which we look at the world, start to edit the story and thus regain flexibility where we have been getting stuck.

Originally featured in February — read the full article here.


02 DECEMBER, 2013

How Should We Live: History’s Forgotten Wisdom on Love, Time, Family, Empathy, and Other Aspects of the Art of Living


“How to pursue the art of living has become the great quandary of our age… The future of the art of living can be found by gazing into the past.”

“He who cannot draw on three thousand years is living from hand to mouth,” Goethe famously proclaimed. Thomas Hobbes extolled “the principal and proper work of history being to instruct, and enable men by the knowledge of actions past to bear themselves prudently in the present and providently in the future.” It is this notion of “applied history” that cultural historian and philosopher Roman Krznaric — who gave us How to Find Fulfilling Work, one of the best psychology and philosophy books of 2013 — places at the center of How Should We Live?: Great Ideas from the Past for Everyday Life (public library). Part psychological manual, part cultural manifesto, part philosophical memoir of our civilization’s collective conscience, the book explores yesteryear’s great works of philosophy, social science, economics, anthropology, and cultural mythology. Krznaric trawls the timeless to surface the timely and excavate practical ideas about the art of living, about how we, today, can live better, richer, more fulfilling lives — ideas across love, work, family, time, money, death, creativity, and more.

He writes in the introduction:

How to pursue the art of living has become the great quandary of our age.


I believe that the future of the art of living can be found by gazing into the past. If we explore how people have lived in other epochs and cultures, we can draw out lessons for the challenges and opportunities of everyday life. What secrets for living with passion lie in medieval attitudes towards death, or in the pin factories of the Industrial Revolution? How might an encounter with Ming-dynasty China, or Central African indigenous culture, change our views about bringing up our kids and caring for our parents? It is astonishing that, until now, we have made so little effort to unveil this wisdom from the past, which is based on how people have actually lived rather than utopian dreamings of what might be possible.

I think of history as a wonderbox, similar to the curiosity cabinets of the Renaissance — what the Germans called a Wunderkammer. Collectors used these cabinets to display an array of fascinating and unusual objects, each with a story to tell, such as a miniature Turkish abacus or a Japanese ivory carving. Passed down from one generation to another, they were repositories of family lore and learning, tastes and travels, a treasured inheritance. History, too, hands down to us intriguing stories and ideas from a cornucopia of cultures. It is our shared inheritance of curious, often fragmented artefacts that we can pick up at will and contemplate in wonder. There is much to learn about life by opening the wonderbox of history.

Rather than approaching that wonderbox as an instructional manual, however, Krznaric looks at history as a choose-your-own-adventure compendium of do’s as well as don’ts. In addition to shining light on yesteryear’s forgotten wisdom on living, he also seeks to highlight the habitual attitudes we’ve unwittingly inherited from the past and to bring mindfulness to the bad and the harmful as well, so we can begin to decondition them — those toxic belief systems that sell us impossible romantic ideals, strain our relationship with time, force us into work that falsely prizes narrow specialization over wide intellectual expansion, measure our success in material terms, and otherwise warp the meaning of life:

We need to trace the historical origins of these legacies which have quietly crept into our lives and surreptitiously shaped our worldviews. We may choose to accept them, understanding ourselves all the better for it, or we may reject them and cut ourselves free from an unwanted inheritance, ready to invent anew. That is the sublime power we wield when we have history in our hand.

The Histomap by John Sparks, 1931.

In a chapter on love, Krznaric contends that our modern definition of love is too narrow, which both deprives us of the breadth of this grand human capacity and sets us up for disappointment:

We can navigate these difficulties of love — and enhance its joys — by grasping the significance of two great tragedies in the history of the emotions. The first is that we have lost knowledge of the different varieties of love that existed in the past, especially those familiar to the ancient Greeks, who knew love could be discovered not just with a sexual partner, but also in friendships, amongst strangers, and with themselves. The second tragedy is that over the last thousand years, these varieties have been incorporated into a mythical notion of romantic love, which compels us to believe that they can all be found in one person, a unique soulmate. We can escape the confines of this inheritance by looking for love outside the realm of romantic attachments, and cultivating its many forms.

To lift our culturally conditioned blinders, he turns to the six varieties of love found in ancient Greek philosophy: eros, or sexual passion; philia, a more virtuous love often translated as “friendship”; ludus, the playful affection between children or casual lovers; pragma, the mature love and deep understanding that develop in lasting relationships; agape, the selfless love extended to all human beings, offered unconditionally and with no expectation of reciprocity; and philautia, or self-love, which can be both negative, manifested as greed and narcissism (after the myth of Narcissus), and positive, as a nourishing broadening of our capacity for all love, starting from within.

Vintage illustration for Homer's 'The Iliad and the Odyssey' by Alice and Martin Provensen.

Observing history’s long quest to define love, Krznaric finds in the ancient Greek model solace for our modern hearts:

One of the universal questions of emotional life has always been, “What is love?” I believe that this is a misleading question, and one which has caught us in futile knots of confusion in an attempt to identify some definitive essence of “true love.” The lesson from ancient Greece is that we must instead ask ourselves, “How can I cultivate the different varieties of love in my life?” That is the ultimate question of love that we face today. But if we wish to nurture these varieties, we must first dispel the potent myth of romantic love which stands in the way.

Krznaric argues that the myth of romantic love, whose origin he traces across three stages over the course of civilization — the poetry and music of medieval Persia in the first millennium, the courtly love ideal of Europe in the Middle Ages, and the utilitarian union of passion and companionship of the Dutch Golden Age in the seventeenth century — is a toxic myth that only robs us of love’s dimension and leads us to believe all forms of love should be found in a single person. Instead, he suggests that returning to the ancient Greek taxonomy would enrich our lives and give greater freedom in cultivating our capacity for love:

The varieties of love invented by the ancient Greeks … are what we should be striving to cultivate, and with a range of people rather than just one person. I am not saying that you should get your pragma from a steady marriage but then satisfy your eros in a series of lustful affairs. That is bound to be a destructive strategy, for sexual jealousy is part of our natures and few people can tolerate open relationships. What I mean is that we ought to acknowledge that we may only be fulfilled in love if we can nurture it in a multitude of ways and tap into its many sources. So we should foster our philia through having profound friendships outside our main relationship, and make space for our lover to do the same without resenting the time they spend apart from us. We can seek the joys of ludus not just in sex but in other forms of play, from tango dancing and performing in amateur theatre to laughing with children around the family dining table. And we must recognize that being drawn too far into self-love, or limiting our love to only a small circle of people, will not be enough to meet our inner need to feel part of a larger whole. So we should all make a place for agape in our lives, and transform love into a gift for strangers. That is how we can reach a point where our lives feel abundant with love.

In a chapter on family, Krznaric explores the lost history of the househusband — presently an exotic species statistically outnumbered by housewives by a ratio of forty to one, and yet a species that used to be far more common in pre-industrial society. In a chapter on empathy, he invokes the famous anecdote of St. Francis swapping his clothes with a beggar in order to know what it was like to be a pauper, then writes:

Empathy matters not just because it makes you good, but because it is good for you. It has the power to heal broken relationships, erode our prejudices, expand our curiosity about strangers and make us rethink our ambitions. Ultimately empathy creates the human bonds that make life worth living.

He once again offers an important caveat to our current cultural understanding — or misunderstanding — of empathy:

It is important when thinking about empathy to distinguish it from the so-called Golden Rule: “Do unto others as you would have them do unto you.” Although a worthy notion, it is not empathy, since it involves considering how you — with your own views — would wish to be treated. Empathy is harder: it requires imagining others’ views and then acting accordingly. George Bernard Shaw understood the difference when he remarked, “Do not do unto others as you would have them do unto you — they may have different tastes.”

George Orwell

But it is another legendary writer to whom Krznaric points as the “one person who did more than most others to transform this experiential form of empathy into an extreme sport” — George Orwell. Raised in a privileged upper-middle-class British family, with the benefit of an elite education, Orwell found himself suddenly disillusioned with imperialism when he spent five years as a colonial officer in Burma in his early twenties. More than that, he couldn’t live with his own contribution to the system he came to so vehemently disdain — he wrote that the job made him “see the dirty work of Empire at close quarters” and left him oppressed “with an intolerable sense of guilt.” Those five years were Orwell’s training ground for empathy. Krznaric traces how the beloved writer put his own twist on the St. Francis approach:

If Burma was his apprenticeship as an empathist, Orwell’s formative training took place in London in the late 1920s and early 1930s. Determined to be a writer, he came up with a plan that would give him both a literary and a moral education: to conduct a radical experiment in experiencing poverty. He wanted to know what it was really like to be downtrodden, to exist on the margins of society, to be short of food, money and hope. Reading about it was not enough — his aim was to live it.


So, for several years, Orwell regularly dressed as a tramp in shabby clothes and shoes, and ventured out virtually penniless to frequent the “spikes” — hostels for the homeless — and doss-houses of the East End of London, wandering the streets with beggars and other destitutes. He would stay for anything from a few days to several weeks. At all times he did so without concessions or compromise, without carrying spare money for an emergency or wearing extra layers of clothes against the winter cold.


Orwell’s empathy grew out of an attempt to liberate himself from his elite background and the imperialism of which he had been a foot soldier. But he also wanted to touch injustice with his own hands rather than be just another clever intellectual who pitied the poor from a comfortable distance. And in this he undoubtedly succeeded.

He also succeeded in showing how empathy was about far more than ethics. His tramping excursions certainly challenged his prejudices and shifted his moral values, but they also gained him new friendships, nurtured his curiosity, expanded his ability to talk to people from different social backgrounds and provided him with a rich seam of literary materials which would last him for years. For a young man who had once worn a top hat at Eton, his experiments in living down and out were an intense, exhilarating and often challenging lesson in life itself, catapulting him out of the narrowness of his privileged past. Trying to survive on the streets of East London was the greatest travel experience he would ever have.

Though Krznaric notes that few of us can go to the extremes Orwell did in our pursuit of embodied empathy, he draws from the writer's experience a seemingly simple yet, in practice, profound everyday aspiration:

The idea of empathy has distinct moral overtones and is often associated with “being good.” But experiential empathy should really be regarded as an unusual and stimulating form of travel. George Orwell would tell us to forget spending our next vacation at an exotic resort or visiting standard tourist sites. It is far more interesting to expand our minds by taking journeys into other people’s lives — and allowing them to see ours. Rather than asking ourselves, “Where can I go next?,” the question on our lips should be, “Whose shoes can I stand in next?”

But my favorite is a chapter in which Krznaric considers our strange relationship with time — that peculiar dimension of life that we’ve attempted to measure and make palpable since the dawn of, well, time. Given how profoundly metaphors shape our thoughts and emotions, Krznaric makes an especially pause-giving point in exploring how our flawed metaphors for time — much like our flawed metaphors for memory — deform our experience of it:

Our concept of time is similarly structured by metaphors, and we need to become aware of the subtle ways they work on our minds. One of the most prevalent metaphors already mentioned, which emerged during the Industrial Revolution, is of time as a commodity: spending time, buying time, wasting time, saving time, “time is money,” “living on borrowed time.” Another dating from the same period is time as a possession: “my time is my own,” “give me a moment of your time.” These two metaphors, in tandem, constitute the psycholinguistic roots of our problems with time. If our time is like private property, it becomes possible not only for it to be freely granted to others, but also for it to be owned by them or appropriated at an unfair price against our wishes.

Discus chronologicus by German engraver Christoph Weigel, a timekeeping chart circa 1720s.

I’ve always been extremely wary of the notion of “work-life balance,” which frames work as something undesirable that takes away from the desirable parts of life and must thus be balanced against those, kept to a reasonable level, quarantined. This dichotomy leaves no room for the kind of purposeful work that is life. Curiously, the very concept of “work-life balance” tends to use time as the currency by which balance is measured, so I find Krznaric’s admonition against conceiving of time as a commodity especially resonant when it comes to how we think about work and play:

One manifestation of the “time as a commodity and possession” metaphor is when we talk about taking “time off” from work. This expression is essentially saying that we have given our employer ownership of our time. . . . Each year the firm will give us back a little of our time, usually no more than a few weeks. This vacation period is usually referred to as “time off”; it is their gift to us, a temporary pause in the regular pattern, in which being at work is, by implication, “time on.”

Instead, he proposes we think of leisure as our “time on,” which would imbue our non-work time with value and awaken us from the cultural trance that consistently blinds us to how much more precious presence is than productivity.

Krznaric traces the history of time as a commodity to the dawn of the Protestant ethic, which equates productivity and efficiency with virtue. Coupled with that is the cultural legacy of our fear of boredom, beneath which Krznaric finds a deeper anxiety:

On some level we fear boredom. A deeper explanation is that we are afraid that an extended pause would give us the time to realize that our lives are not as meaningful and fulfilled as we would like them to be. The time for contemplation has become an object of fear, a demon.

He argues that we can begin to undo this baleful legacy by borrowing the mindset of Gustave Flaubert, who famously observed that “anything becomes interesting if you look at it long enough.” Krznaric even proposes a somewhat radical pragmatic intervention: a “chronological diet” that banishes timekeeping devices from your wrist, your smartphone display, and your walls — a tactic, however disorienting at first, that promises to make you “less likely to interrupt a conversation or a thought with a glance at your watch which sends you scuttling off to the next task.”

How Should We Live?: Great Ideas from the Past for Everyday Life, an illuminating and awakening read in its entirety, goes on to explore how Michelangelo deformed how we think about creativity and precipitated the dangerous myth of the lone genius, why money is not enough, what the Toltec and Aztec civilizations teach us about our understanding of mortality, and much more. Complement it with George Myerson’s exploration of what history teaches us about attaining everyday happiness.
