“Sponsorship in the press is an invitation to corruption and abuse.”
By Maria Popova
In 1923, a prominent journalist bemoaned the death of the editor and the rise of the circulation manager as newspapers began grubbing for ever-more advertising revenue and tailoring their content around that goal, rather than around readers’ best interests. More than a half-century later, in the fall of 1975, Esquire magazine announced a forthcoming 23-page article by Pulitzer-Prize-winning journalist Harrison Salisbury, to be published in their February 1976 issue and sponsored by Xerox — an arrangement in which Salisbury would receive no payment from Esquire, but would be paid $40,000, plus another $15,000 in expenses, by the Xerox Corporation. The announcement spurred profound consternation in E. B. White, who articulated it with equal parts eloquence and rigor in his letters to the editor of Esquire and to Xerox’s Director of Communications, culled from the fantastic Letters of E. B. White (public library).
At the heart of the exchange is an infinitely important discussion, at once timeless and incredibly timely, of what it means to have a free press.
In the first letter, White writes:
This, it would seem to me, is not only a new idea in publishing, it charts a clear course for the erosion of the free press in America. Mr. Salisbury is a former associate editor of the New York Times and should know better. Esquire is a reputable sheet and should know better. But here we go — the Xerox-Salisbury-Esquire axis in full cry!
Apparently Mr. Salisbury had a momentary qualm about taking on the Xerox job. The Times reports him as saying, “At first I thought, gee whiz, should I do this?” But he quickly conquered his annoying doubts and remembered that big corporations had in the past been known to sponsor “cultural enterprises,” such as opera. The emergence of a magazine reporter as a cultural enterprise is as stunning a sight as the emergence of a butterfly from a cocoon. Mr. Salisbury must have felt great, escaping from his confinement.
Well, it doesn’t take a giant intellect to detect in all this the shadow of disaster. If magazines decide to farm out their writers to advertisers and accept the advertiser’s payment to the writer and to the magazine, then the periodicals of this country will be far down the drain and will become so fuzzy as to be indistinguishable from the controlled press in other parts of the world.
E. B. White
The points White raises reflect some of my own profound concerns about journalism, media, and the free press today. On the one hand, a large part of me — the part that has been publishing an ad-free curiosity catalog supported by reader donations for the past seven years — believes that whenever corporate interests and advertising revenue become necessary for the production of content, both the spirit of journalism and the reader’s best interests suffer, and we get atrocities like the SEO-optimized, sensationalist headlines of the BuzzWorthy industrial complex, vacant linkbait infographics, and endless click-click-click slideshows. On the other hand, I remain keenly aware that quality journalism — especially ambitious endeavors like investigative pieces and longform features — is resource-intensive and requires funding, and the idea that readers would be willing to fund this kind of work directly is at best utopian and at worst highly unrealistic in a fragmented media landscape of commodified “content.”
It’s the same ambivalence one might feel at seeing a Fortune 100 CEO on the TED stage, as was the case with Bill Ford and PepsiCo’s Indra Nooyi at last year’s TED Long Beach. On the one hand, TED’s entire media brand is based on “ideas worth spreading” for the public good, which requires a certain amount of bravery. There can be no bravery when one is accountable to a board of trustees or investors, because the “users,” “consumers,” or whatever dehumanized placeholder we choose for the audience of a product, service, or piece of information should be its sole appropriate stakeholders. On the other hand, in a capitalist society, large corporations may be the only ones with the fiscal power to effect tangible change beyond the mere talk of idealism.
Shortly after his letter, White received a response from W. B. Jones, Xerox’s Director of Communications, featuring the following rationalization:
It seemed to us that the sponsorship was not subject to question provided: 1. Both the magazine and the writer had earned reputations for absolute integrity; 2. Our sponsorship was open and identified to readers; 3. The writer was paid ‘up front,’ so that his fee did not depend in any way on our reaction to the piece; 4. The writer understood that this was a one-shot assignment and he’d get no other from Xerox, no matter what we thought of the piece; 5. The magazine retained full editorial control of the project.
White’s response to Jones gets to the heart of democracy and free press with astounding precision:
The press in our free country is reliable and useful not because of its good character but because of its great diversity. As long as there are many owners, each pursuing his own brand of truth, we the people have the opportunity to arrive at the truth and to dwell in the light. The multiplicity of ownership is crucial. It’s only when there are a few owners, or, as in a government-controlled press, one owner, that the truth becomes elusive and the light fails. For a citizen in our free society, it is an enormous privilege and a wonderful protection to have access to hundreds of periodicals, each peddling its own belief. There is safety in numbers: the papers expose each other’s follies and peccadillos, correct each other’s mistakes, and cancel out each other’s biases. The reader is free to range around in the whole editorial bouillabaisse and explore it for the one clam that matters—the truth.
White goes on to argue that when the ownership of media lies in the hands of a single entity, be that a government or a media mogul, editorial accountability shifts dangerously in a direction other than the reader’s. The multiplicity and sovereignty of media, he argues, is essential to ensuring we don’t live in a filter bubble of information.
Whenever money changes hands, something goes along with it — an intangible something that varies with the circumstances. It would be hard to resist the suspicion that Esquire feels indebted to Xerox, that Mr. Salisbury feels indebted to both, and that the ownership, or sovereignty, of Esquire has been nibbled all around the edges.
Sponsorship in the press is an invitation to corruption and abuse. The temptations are great, and there is an opportunist behind every bush. A funded article is a tempting morsel for any publication—particularly for one that is having a hard time making ends meet. A funded assignment is a tempting dish for a writer, who may pocket a much larger fee than he is accustomed to getting. And sponsorship is attractive to the sponsor himself, who, for one reason or another, feels an urge to penetrate the editorial columns after being so long pent up in the advertising pages. These temptations are real, and if the barriers were to be let down I believe corruption and abuse would soon follow. Not all corporations would approach subsidy in the immaculate way Xerox did or in the same spirit of benefaction. There are a thousand reasons for someone’s wishing to buy his way into print, many of them unpalatable, all of them to some degree self-serving. Buying and selling space in news columns could become a serious disease of the press. If it reached epidemic proportions, it could destroy the press. I don’t want IBM or the National Rifle Association providing me with a funded spectacular when I open my paper. I want to read what the editor and the publisher have managed to dig up on their own—and paid for out of the till.
White drives the point home with his signature style of the deeply personal conveying the broadly relevant:
My affection for the free press in a democracy goes back a long way. My love for it was my first and greatest love. If I felt a shock at the news of the Salisbury-Xerox-Esquire arrangement, it was because the sponsorship principle seemed to challenge and threaten everything I believe in: that the press must not only be free, it must be fiercely independent — to survive and to serve. Not all papers are fiercely independent, God knows, but there are always enough of them around to provide a core of integrity and an example that others feel obliged to steer by. The funded article is not in itself evil, but it is the beginning of evil, and it is an invitation to evil. I hope the invitation will not again be extended, and, if extended, I hope it will be declined.
Nearly another half-century later, “the funded article” describes, directly or indirectly, the vast majority of today’s information landscape. The basic ad-supported monetization model of media today, online and off, is a legacy model that only further commodifies content, erodes editorial integrity, and does the audience — who should be, to reiterate because this can’t be emphasized enough, the only appropriate stakeholder — a tragic disservice. Whoever figures out an intelligent alternative will save journalism from itself and rekindle the hope for a truly free press.
“Diversity fills the city with cartographic potential… New York belongs to everyone, and maps prove it.”
By Maria Popova
“Each of us is an atlas of sorts, already knowing how to navigate some portion of the world,” wrote Rebecca Solnit in her imaginative remapping of New York’s untold stories, “containing innumerable versions of place as experience and desire and fear, as route and landmark and memory.” But as fascinating as it is to imagine the world’s greatest metropolis remapped according to its unheralded dimensions, New York’s multitude of parallel realities is itself bountiful fodder for the artistic imagination and has inspired centuries of fanciful cartographic interpretations.
Exploring this lacuna between physical reality and the interpretive imagination is a very different kind of atlas — You Are Here: NYC: Mapping the Soul of the City (public library), envisioned and edited by Katharine Harmon. This localized follow-up to Harmon’s wonderful 2004 atlas of “personal geographies and other maps of the imagination” presents two hundred wildly diverse maps of the city, alongside original essays exploring the most iconic of them. There are historical treasures like the first geological maps of Manhattan, masterworks of art like Paula Scher’s obsessively detailed typographic maps, and conceptually daring pieces like artist Liz Scranton’s honeycomb shaped after the landforms of the NYC subway map. What emerges is a layered inquiry into the relationship between self and space, the plurality of perspectives aimed at the same place, and the myriad ways in which we orient ourselves to the landscape against which we live out our lives.
Harmon writes in the introduction:
What is it about the city that invites mapping? First, perhaps, is a need to find one’s place here. An endlessly morphing population of contemporary lives humming along, side by side and mutually oblivious, feeds a need to locate oneself. Another New Yorker writer, A.J. Liebling, wrote in 1938 of the city’s multiplicity of lives: “the worlds of weight lifters, yodelers, tugboat captains, and sideshow barkers, of the book ditchers, sparring partners, song pluggers, sporting girls and religious painters, of the dealers in rhesus monkeys and the bishops of churches.” Diversity fills the city with cartographic potential. Density, ethnicity, race, heritage, languages, income differentials, locals versus commuters versus tourists — all can be, and have been, mapped. New York belongs to everyone, and maps prove it.
With an eye to the two hundred dazzling cartographic curiosities included in the book, culled from an initial database of one thousand maps she had compiled, Harmon writes:
New York has no shortage of inventive thinkers who make excellent cartographers. Each act of creative cartography reflects both the state of mind of the mapper and the state of the city. And each contributes another page to a giant, ever-accumulating atlas of New York — an atlas as big as the city’s self-regard. Perhaps, in the end, what makes the city the most mapped metropolis in the world is that it offers complete cartographic liberty.
I contributed one of the essays for the book, “A Panorama of Power,” exploring the monumental Panorama of the City of New York currently housed at the Queens Museum. This is what I write:
“A poem,” E.B. White wrote in his 1949 masterpiece Here Is New York, “compresses much in a small space and adds music, thus heightening its meaning. The city is like poetry.” Nothing compresses the city in order to heighten its meaning more palpably than the Panorama of the City of New York — an astonishing feat of architecture, urban planning, and craftsmanship, strangely poetic in its deliberate contrast of scale and significance. To look at it is to see, perhaps for the first time, the city’s elegant enormity.
Constructed by a team of more than one hundred architectural model builders from Raymond Lester & Associates over the course of three years, this elaborate and elegant microcosm reduces every hundred feet of cityscape to one inch of Formica panels and urethane foam. This conceptual compression cost $672,662.69 to construct in 1964 — the equivalent of approximately five million dollars today. But what makes the Panorama most striking is its affront to our sense of scale — at 9,335 square feet, it is both a miniature and an expanse, containing every street, every park, and every single one of the 895,000 buildings constructed prior to 1992, when Raymond Lester & Associates updated the model.
The Panorama, which now resides at the Queens Museum, was created for the 1964 World’s Fair as a celebration of master-builder Robert Moses and his indelible imprint on the cityscape. A brilliant architect and a fierce political operator who publicly defied elected officials—including, in one famous incident, President Roosevelt himself—Moses envisioned and brought to life 658 playgrounds, 416 miles of parkways, 288 tennis courts, 678 baseball diamonds, and numerous major roads and bridges. He was a man animated by “an imagination that leaped unhesitatingly at problems insoluble to other people,” as Robert A. Caro wrote in The Power Broker — his Pulitzer-winning 1,200-page biography of Moses.
But Moses, like the city itself, was also a man of duality. Although he began his career as an earnest idealist and an irrepressibly optimistic reformer, the power machine hardened him into a man of “iron will and determination,” in Caro’s words. Intent on bending the world’s greatest city to his will, he imprinted Gotham with his fiery fusion of idealism and egotism. That his legacy should be celebrated by a miniature model of the city, Moses’s favorite toy, is only fitting.
Perhaps most emblematic of all is how the Panorama was pitched at the 1964 World’s Fair, where it became a favorite attraction — as an indoor helicopter tour of the city, promising to provide a “god’s-eye view” of the urban ecosystem. In a sense, visitors were invited to try on the view of Moses — a self-anointed god who had drawn the master-map not only of the city’s infrastructure but also of its very character and destiny; the craftsman of the grand stage onto which, in the immortal words of White, “enormous and violent and wonderful events … are taking place every minute.”
In another essay from the book, New Yorker cartoon editor Bob Mankoff considers Saul Steinberg’s most famous cover, both timeless and rendered timely by the recent shock of sobering political perspectives:
I saw the New Yorker cover when it came out in 1976, and it wasn’t long before the magazine, in response to popular demand, made it into a poster. And not long after that you could find it on the walls of apartments and college dorms. Soon it was pretty much everywhere, even if only as a local imitation — who knows, maybe even out there on the far right horizon of the drawing, in Russia, perhaps there’s a yellowing poster of “The View of the World from Novosibirsk.”
The vast popularity of “View of the World” stemmed from its appearing eminently “gettable,” especially when the image was topped by the New Yorker logo. With that affixed to the image, to put it in New Yorkeese, “what’s not to get?” It seemed to be an unambiguous visualization of that old quote, “If you’re leaving New York, you ain’t going nowhere.”
Yes, it was gettable, and more than that, easily adaptable and therefore adoptable, which is why so many other cities knocked off the cover, to proclaim, however dubiously, under their own local rubric, that they were the epicenter of existence. As a born-and-bred New Yawker, my own take was similar, with the very implausibility implicit in the derivative covers’ claims actually making my own native chauvinism seem reasonable in comparison. I mean Novosibirsk may be a nice little city, but gimme a break.
“How we spend our days,” Annie Dillard wrote in her timelessly beautiful meditation on presence over productivity, “is, of course, how we spend our lives.” And nowhere do we fail at the art of presence more miserably and more tragically than in urban life — in the city, high on the cult of productivity, where we float past each other, past the buildings and trees and the little boy in the purple pants, past life itself, cut off from the breathing of the world by iPhone earbuds and solipsism. And yet: “The art of seeing has to be learned,” Marguerite Duras reverberates — and it can be learned, as cognitive scientist Alexandra Horowitz invites us to believe in her breathlessly wonderful On Looking: Eleven Walks with Expert Eyes (public library) — a record of her quest to walk around a city block with eleven different “experts,” from an artist to a geologist to a dog, and emerge with fresh eyes mesmerized by the previously unseen fascinations of a familiar world. It is undoubtedly one of the most stimulating books of the year, if not the decade, and the most enchanting thing I’ve read in ages. In a way, it’s the opposite but equally delightful mirror image of Christoph Niemann’s Abstract City — a concrete, immersive examination of urbanity — blending the mindfulness of Sherlock Holmes with the expansive sensitivity of Thoreau.
Horowitz begins by pointing our attention to the incompleteness of our experience of what we conveniently call “reality”:
Right now, you are missing the vast majority of what is happening around you. You are missing the events unfolding in your body, in the distance, and right in front of you.
By marshaling your attention to these words, helpfully framed in a distinct border of white, you are ignoring an unthinkably large amount of information that continues to bombard all of your senses: the hum of the fluorescent lights, the ambient noise in a large room, the places your chair presses against your legs or back, your tongue touching the roof of your mouth, the tension you are holding in your shoulders or jaw, the map of the cool and warm places on your body, the constant hum of traffic or a distant lawn-mower, the blurred view of your own shoulders and torso in your peripheral vision, a chirp of a bug or whine of a kitchen appliance.
This adaptive ignorance, she argues, is there for a reason — we celebrate it as “concentration” and welcome its way of easing our cognitive overload by allowing us to conserve our precious mental resources only for the stimuli of immediate and vital importance, and to dismiss or entirely miss all else. (“Attention is an intentional, unapologetic discriminator,” Horowitz tells us. “It asks what is relevant right now, and gears us up to notice only that.”) But while this might make us more efficient in our goal-oriented day-to-day, it also makes us inhabit a largely unlived — and unremembered — life, day in and day out.
For Horowitz, the awakening to this incredible, invisible backdrop of life came thanks to Pumpernickel, her “curly haired, sage mixed breed” (who also inspired Horowitz’s first book, the excellent Inside of a Dog: What Dogs See, Smell, and Know), as she found herself taking countless walks around the block, becoming more and more aware of the dramatically different experiences she and her canine companion were having along the exact same route:
Minor clashes between my dog’s preferences as to where and how a walk should proceed and my own indicated that I was experiencing almost an entirely different block than my dog. I was paying so little attention to most of what was right before us that I had become a sleepwalker on the sidewalk. What I saw and attended to was exactly what I expected to see; what my dog showed me was that my attention invited along attention’s companion: inattention to everything else.
The book was her answer to the disconnect, an effort to “attend to that inattention.” It is not, she warns us, “about how to bring more focus to your reading of Tolstoy or how to listen more carefully to your spouse.” Rather, it is an invitation to the art of observation:
Together, we became investigators of the ordinary, considering the block — the street and everything on it—as a living being that could be observed.
In this way, the familiar becomes unfamiliar, and the old the new.
Her approach is based on two osmotic human tendencies: our shared capacity to truly see what is in front of us, despite our conditioned concentration that obscures it, and the power of individual bias in perception — or what we call “expertise,” acquired by passion or training or both — in bringing attention to elements that elude the rest of us. What follows is a whirlwind of endlessly captivating exercises in attentive bias as Horowitz, with her archetypal New Yorker’s “special fascination with the humming life-form that is an urban street,” and her diverse companions take to the city.
First, she takes a walk all by herself, trying to note everything observable, and we quickly realize that besides her deliciously ravenous intellectual curiosity, Horowitz is a rare magician with language. (“The walkers trod silently; the dogs said nothing. The only sound was the hum of air conditioners,” she beholds her own block; passing a pile of trash bags graced by a stray Q-tip, she ponders parenthetically, “how does a Q-tip escape?”; turning her final corner, she gazes at the entrance of a mansion and “its pair of stone lions waiting patiently for royalty that never arrives.” Stunning.)
I would find myself at once alarmed, delighted, and humbled at the limitations of my ordinary looking. My consolation is that this deficiency of mine is quite human. We see, but we do not see: we use our eyes, but our gaze is glancing, frivolously considering its object. We see the signs, but not their meanings. We are not blinded, but we have blinders.
Given my soft spot for famous diaries, it should come as no surprise that I keep one myself. Perhaps the greatest gift of the practice has been the daily habit of reading what I had written on that day a year earlier; not only is it a remarkable tool of introspection and self-awareness, but it also illustrates that our memory “is never a precise duplicate of the original [but] a continuing act of creation” and how flawed our perception of time is — almost everything that occurred a year ago appears as having taken place either significantly further in the past (“a different lifetime,” I’d often marvel at this time-illusion) or significantly more recently (“this feels like just last month!”). Rather than being a personal deficiency of those of us befallen by this tendency, however, it turns out to be a defining feature of how the human mind works, the science of which is at first unsettling, then strangely comforting, and altogether intensely interesting.
That’s precisely what acclaimed BBC broadcaster and psychology writer Claudia Hammond explores in Time Warped: Unlocking the Mysteries of Time Perception (public library) — a fascinating foray into the idea that our experience of time is actively created by our own minds and how these sensations of what neuroscientists and psychologists call “mind time” are created. As disorienting as the concept might seem — after all, we’ve been nursed on the belief that time is one of those few utterly reliable and objective things in life — it is also strangely empowering to think that the very phenomenon depicted as the unforgiving dictator of life is something we might be able to shape and benefit from. Hammond writes:
We construct the experience of time in our minds, so it follows that we are able to change the elements we find troubling — whether it’s trying to stop the years racing past, or speeding up time when we’re stuck in a queue, trying to live more in the present, or working out how long ago we last saw our old friends. Time can be a friend, but it can also be an enemy. The trick is to harness it, whether at home, at work, or even in social policy, and to work in line with our conception of time. Time perception matters because it is the experience of time that roots us in our mental reality. Time is not only at the heart of the way we organize life, but the way we experience it.
Among the most intriguing illustrations of “mind time” is the incredible elasticity of how we experience time. (“Where is it, this present?,” William James famously wondered. “It has melted in our grasp, fled ere we could touch it, gone in the instant of becoming.”) For instance, Hammond points out, we slow time down when gripped by mortal fear — the cliché about the slow-motion car crash is, in fact, a cognitive reality. This plays out even in situations that aren’t life-or-death per se but are still associated with strong feelings of fear. Hammond points to a study in which people with arachnophobia were asked to look at spiders — the very object of their intense fear — for 45 seconds and they overestimated the elapsed time. The same pattern was observed in novice skydivers, who estimated the duration of their peers’ falls as short, whereas their own, from the same altitude, were deemed longer.
Inversely, time seems to speed up as we get older — a phenomenon of which competing theories have attempted to make sense. One, known as the “proportionality theory,” uses pure mathematics, holding that a year feels faster when you’re 40 than when you’re 8 because it only constitutes one fortieth of your life rather than a whole eighth. Among its famous proponents are Vladimir Nabokov and William James. But Hammond remains unconvinced:
The problem with the proportionality theory is that it fails to account for the way we experience time at any one moment. We don’t judge one day in the context of our whole lives. If we did, then for a 40-year-old every single day should flash by because it is less than one fourteen-thousandth of the life they’ve had so far. It should be fleeting and inconsequential, yet if you have nothing to do or an enforced wait at an airport for example, a day at 40 can still feel long and boring and surely longer than a fun day at the seaside packed with adventure for a child. … It ignores attention and emotion, which … can have a considerable impact on time perception.
Another theory suggests that perhaps it is the tempo of life in general that has accelerated, making things from the past appear as slower, including the passage of time itself.
But one definite change does take place with age: As we grow older, we tend to feel like the previous decade elapsed more rapidly, while the earlier decades of our lives seem to have lasted longer. Similarly, we tend to think of events that took place in the past 10 years as having happened more recently than they actually did. (Quick: What year did the devastating Japanese tsunami hit? When did we lose Maurice Sendak?) Conversely, we perceive events that took place more than a decade ago as having happened even longer ago. (When did Princess Diana die? What year was the Chernobyl disaster?) This, Hammond points out, is known as “forward telescoping”:
It is as though time has been compressed and — as if looking through a telescope — things seem closer than they really are. The opposite is called backward or reverse telescoping, also known as time expansion. This is when you guess that events happened longer ago than they really did. This is rare for distant events, but not uncommon for recent weeks.
The most straightforward explanation for it is called the clarity of memory hypothesis, proposed by the psychologist Norman Bradburn in 1987. This is the simple idea that because we know that memories fade over time, we use the clarity of a memory as a guide to its recency. So if a memory seems unclear we assume it happened longer ago.
The desire for fulfilling work — a job that provides a deep sense of purpose, and reflects our values, passions and personality — is a modern invention. … For centuries, most inhabitants of the Western world were too busy struggling to meet their subsistence needs to worry about whether they had an exciting career that used their talents and nurtured their wellbeing. But today, the spread of material prosperity has freed our minds to expect much more from the adventure of life.
We have entered a new age of fulfillment, in which the great dream is to trade up from money to meaning.
Krznaric goes on to outline two key afflictions of the modern workplace — “a plague of job dissatisfaction” and “uncertainty about how to choose the right career” — and frames the problem:
Never have so many people felt so unfulfilled in their career roles, and been so unsure what to do about it. Most surveys in the West reveal that at least half the workforce are unhappy in their jobs. One cross-European study showed that 60 per cent of workers would choose a different career if they could start again. In the United States, job satisfaction is at its lowest level — 45 per cent — since record-keeping began over two decades ago.
Of course, Krznaric points out, there’s plenty of cynicism and skepticism to go around, with people questioning whether it’s even possible to find a job in which we thrive and feel complete. He offers an antidote to the default thinking:
There are two broad ways of thinking about these questions. The first is the ‘grin and bear it’ approach. This is the view that we should get our expectations under control and recognize that work, for the vast majority of humanity — including ourselves — is mostly drudgery and always will be. Forget the heady dream of fulfillment and remember Mark Twain’s maxim: “Work is a necessary evil to be avoided.” … The history is captured in the word itself. The Latin labor means drudgery or toil, while the French travail derives from the tripalium, an ancient Roman instrument of torture made of three sticks. … The message of the ‘grin and bear it’ school of thought is that we need to accept the inevitable and put up with whatever job we can get, as long as it meets our financial needs and leaves us enough time to pursue our ‘real life’ outside office hours. The best way to protect ourselves from all the optimistic pundits peddling fulfillment is to develop a hardy philosophy of acceptance, even resignation, and not set our hearts on finding a meaningful career.
I am more hopeful than this, and subscribe to a different approach, which is that it is possible to find work that is life-enhancing, that broadens our horizons and makes us feel more human.
This is a book for those who are looking for a job that is big enough for their spirit, something more than a ‘day job’ whose main function is to pay the bills.
Krznaric considers the five keys to making a career meaningful — earning money, achieving status, making a difference, following our passions, and using our talents — but goes on to demonstrate that they aren’t all created equal. In particular, he echoes 1970s Zen pioneer Alan Watts and modern science in arguing that money alone is a poor motivator:
Schopenhauer may have been right that the desire for money is widespread, but he was wrong on the issue of equating money with happiness. Overwhelming evidence has emerged in the last two decades that the pursuit of wealth is an unlikely path to achieving personal wellbeing — the ancient Greek ideal of eudaimonia or ‘the good life.’ The lack of any clear positive relationship between rising income and rising happiness has become one of the most powerful findings in the modern social sciences. Once our income reaches an amount that covers our basic needs, further increases add little, if anything, to our levels of life satisfaction.
We can easily find ourselves pursuing a career that society considers prestigious, but which we are not intrinsically devoted to ourselves — one that does not fulfill us on a day-to-day basis.
Krznaric positions respect, which he defines as “being appreciated for what we personally bring to a job, and being valued for our individual contribution,” as the positive counterpart to prestige and status, arguing that “in our quest for fulfilling work, we should seek a job that offers not just good status prospects, but good respect prospects.”
Rather than hoping to create a harmonious union between the pursuit of money and values, we might have better luck trying to combine values with talents. This idea comes courtesy of Aristotle, to whom the saying ‘Where the needs of the world and your talents cross, there lies your vocation’ is attributed.
Though most of his 77 “intuition pumps” address concrete questions, a dozen are “general-purpose” tools that apply deeply and widely, across just about any domain of thinking. The first of them is also arguably the most useful yet most uncomfortable: making mistakes.
The history of philosophy is in large measure the history of very smart people making very tempting mistakes, and if you don’t know the history, you are doomed to making the same darn mistakes all over again. … There is no such thing as philosophy-free science, just science that has been conducted without any consideration of its underlying philosophical assumptions.
He speaks for the generative potential of mistakes and their usefulness as an empirical tool:
Sometimes you don’t just want to risk making mistakes; you actually want to make them — if only to give you something clear and detailed to fix.
Therein lies the power of mistakes as a vehicle for, as Rilke famously put it, “living the questions” and thus advancing knowledge in a way that certainty cannot — for, as Richard Feynman memorably noted, the scientist’s job is to remain unsure, and so seems the philosopher’s. Dennett writes:
We philosophers are mistake specialists. … While other disciplines specialize in getting the right answers to their defining questions, we philosophers specialize in all the ways there are of getting things so mixed up, so deeply wrong, that nobody is even sure what the right questions are, let alone the answers. Asking the wrong questions risks setting any inquiry off on the wrong foot. Whenever that happens, this is a job for philosophers! Philosophy — in every field of inquiry — is what you have to do until you figure out what questions you should have been asking in the first place.
Mistakes are not just opportunities for learning; they are, in an important sense, the only opportunity for learning or making something truly new. Before there can be learning, there must be learners. There are only two non-miraculous ways for learners to come into existence: they must either evolve or be designed and built by learners that evolved. Biological evolution proceeds by a grand, inexorable process of trial and error — and without the errors the trials wouldn’t accomplish anything.
Trials can be either blind or foresighted. You, who know a lot, but not the answer to the question at hand, can take leaps — foresighted leaps. You can look before you leap, and hence be somewhat guided from the outset by what you already know. You need not be guessing at random, but don’t look down your nose at random guesses; among its wonderful products is … you!
And since evolution is the highest epitome of how the process of trial and error drives progress, Dennett makes a case for understanding evolution as a key to understanding everything else we humans value:
Evolution … is the central, enabling process not only of life but also of knowledge and learning and understanding. If you attempt to make sense of the world of ideas and meanings, free will and morality, art and science and even philosophy itself without a sound and quite detailed knowledge of evolution, you have one hand tied behind your back. … For evolution, which knows nothing, the steps into novelty are blindly taken by mutations, which are random copying “errors” in DNA.
Dennett echoes Dostoyevsky (“Above all, don’t lie to yourself. The man who lies to himself and listens to his own lie comes to a point that he cannot distinguish the truth within him, or around him, and so loses all respect for himself and for others.”) and offers the key to making productive mistakes:
The chief trick to making good mistakes is not to hide them — especially not from yourself. Instead of turning away in denial when you make a mistake, you should become a connoisseur of your own mistakes, turning them over in your mind as if they were works of art, which in a way they are. … The trick is to take advantage of the particular details of the mess you’ve made, so that your next attempt will be informed by it and not just another blind stab in the dark.
We have all heard the forlorn refrain “Well, it seemed like a good idea at the time!” This phrase has come to stand for the rueful reflection of an idiot, a sign of stupidity, but in fact we should appreciate it as a pillar of wisdom. Any being, any agent, who can truly say, “Well, it seemed like a good idea at the time!” is standing on the threshold of brilliance.
“The habit of mind which leads to a search for relationships between facts,” wrote James Webb Young in his famous 1939 5-step technique for creative problem-solving, “becomes of the highest importance in the production of ideas.” But just how does one acquire those vital cognitive customs? That’s precisely what science writer Maria Konnikova explores in Mastermind: How to Think Like Sherlock Holmes (UK; public library) — an effort to reverse-engineer Holmes’s methodology into actionable insights that help develop “habits of thought that will allow you to engage mindfully with yourself and your world as a matter of course.”
Bridging ample anecdotes from the adventures of Conan Doyle’s beloved detective with psychology studies both classic and cutting-edge, Konnikova builds a compelling case at the intersection of science and secular spiritualism, stressing the power of rigorous observation alongside a Buddhist-like, Cageian emphasis on mindfulness. She writes:
The idea of mindfulness itself is by no means a new one. As early as the end of the nineteenth century, William James, the father of modern psychology, wrote that, ‘The faculty of voluntarily bringing back a wandering attention, over and over again, is the very root of judgment, character, and will. … An education which should improve this faculty would be the education par excellence.’ That faculty, at its core, is the very essence of mindfulness. And the education that James proposes? An education in a mindful approach to life and to thought.
In recent years, studies have shown that meditation-like thought (an exercise in the very attentional control that forms the center of mindfulness), for as little as fifteen minutes a day, can shift frontal brain activity toward a pattern that has been associated with more positive and more approach-oriented emotional states, and that looking at scenes of nature, for even a short while, can help us become more insightful, more creative, and more productive. We also know, more definitively than we ever have, that our brains are not built for multitasking — something that precludes mindfulness altogether. When we are forced to do multiple things at once, not only do we perform worse on all of them but our memory decreases and our general wellbeing suffers a palpable hit.
But for Sherlock Holmes, mindful presence is just a first step. It’s a means to a far larger, far more practical and practically gratifying goal. Holmes provides precisely what William James had prescribed: an education in improving our faculty of mindful thought and in using it in order to accomplish more, think better, and decide more optimally. In its broadest application, it is a means for improving overall decision making and judgment ability, starting from the most basic building block of your own mind.
But mindfulness, and the related mental powers it bestows upon its master, is a skill acquired with grit and practice, rather than an in-born talent or an easy feat attained with a few half-hearted tries:
It is most difficult to apply Holmes’s logic in those moments that matter the most. And so, all we can do is practice, until our habits are such that even the most severe stressors will bring out the very thought patterns that we’ve worked so hard to master.
Our intuition is shaped by context, and that context is deeply informed by the world we live in. It can thus serve as a blinder — or blind spot — of sorts. … With mindfulness, however, we can strive to find a balance between fact-checking our intuitions and remaining open-minded. We can then make our best judgments, with the information we have and no more, but with, as well, the understanding that time may change the shape and color of that information.
“I consider that a man’s brain originally is like a little empty attic, and you have to stock it with such furniture as you choose,” Holmes famously remarked. Indeed, much like the inventor’s mind, the problem-solver’s mind is the product of that very choice: The details and observations we select to include in our “brain attic” shape and filter our perception of reality. Konnikova writes:
Observation with a capital O — the way Holmes uses the word when he gives his new companion a brief history of his life with a single glance — does entail more than, well, observation (the lowercase kind). It’s not just about the passive process of letting objects enter into your visual field. It is about knowing what and how to observe and directing your attention accordingly: what details do you focus on? What details do you omit? And how do you take in and capture those details that you do choose to zoom in on? In other words, how do you maximize your brain attic’s potential? You don’t just throw any old detail up there, if you remember Holmes’s early admonitions; you want to keep it as clean as possible. Everything we choose to notice has the potential to become a future furnishing of our attics — and what’s more, its addition will mean a change in the attic’s landscape that will affect, in turn, each future addition. So we have to choose wisely.
Choosing wisely means being selective. It means not only looking but looking properly, looking with real thought. It means looking with the full knowledge that what you note — and how you note it — will form the basis of any future deductions you might make. It’s about seeing the full picture, noting the details that matter, and understanding how to contextualize those details within a broader framework of thought.
Originally featured in January — read the full article for more, including Konnikova’s four rules for Sherlockian thinking.
When things get tough, this is what you should do: Make good art. I’m serious. Husband runs off with a politician — make good art. Leg crushed and then eaten by a mutated boa constrictor — make good art. IRS on your trail — make good art. Cat exploded — make good art. Someone on the Internet thinks what you’re doing is stupid or evil or it’s all been done before — make good art. Probably things will work out somehow, eventually time will take the sting away, and that doesn’t even matter. Do what only you can do best: Make good art. Make it on the bad days, make it on the good days, too.
A wise woman once said, “If you are not making mistakes, you’re not taking enough risks.” Gaiman articulates the same sentiment with his own brand of exquisite eloquence:
I hope that in this year to come, you make mistakes.
Because if you are making mistakes, then you are making new things, trying new things, learning, living, pushing yourself, changing yourself, changing your world. You’re doing things you’ve never done before, and more importantly, you’re Doing Something.
So that’s my wish for you, and all of us, and my wish for myself. Make New Mistakes. Make glorious, amazing mistakes. Make mistakes nobody’s ever made before. Don’t freeze, don’t stop, don’t worry that it isn’t good enough, or it isn’t perfect, whatever it is: art, or love, or work or family or life.
Duckworth had come to Penn in 2002, at the age of thirty-two, later in life than a typical graduate student. The daughter of Chinese immigrants, she had been a classic multitasking overachiever in her teens and twenties. After completing her undergraduate degree at Harvard (and starting a summer school for low-income kids in Cambridge in her spare time), she had bounced from one station of the mid-nineties meritocracy to the next: intern in the White House speechwriting office, Marshall scholar at Oxford (where she studied neuroscience), management consultant for McKinsey and Company, charter-school adviser.
Duckworth spent a number of years toying with the idea of starting her own charter school, but eventually concluded that the model didn’t hold much promise for changing the circumstances of children from disadvantaged backgrounds, those whom the education system was failing most tragically. Instead, she decided to pursue a PhD program at Penn. In her application essay, she shared how profoundly the experience of working in schools had changed her view of school reform and wrote:
The problem, I think, is not only the schools but also the students themselves. Here’s why: learning is hard. True, learning is fun, exhilarating and gratifying — but it is also often daunting, exhausting and sometimes discouraging. . . . To help chronically low-performing but intelligent students, educators and parents must first recognize that character is at least as important as intellect.
Duckworth began her graduate work by studying self-discipline. But when she completed her first-year thesis, based on a group of 164 eighth-graders from a Philadelphia middle school, she arrived at a startling discovery that would shape the course of her career: She found that the students’ self-discipline scores were far better predictors of their academic performance than their IQ scores. So she became intensely interested in what strategies and tricks we might develop to maximize our self-control, and whether those strategies can be taught. But self-control, it turned out, was only a good predictor when it came to immediate, concrete goals — like, say, resisting a cookie. Tough writes:
Duckworth finds it useful to divide the mechanics of achievement into two separate dimensions: motivation and volition. Each one, she says, is necessary to achieve long-term goals, but neither is sufficient alone. Most of us are familiar with the experience of possessing motivation but lacking volition: You can be extremely motivated to lose weight, for example, but unless you have the volition — the willpower, the self-control — to put down the cherry Danish and pick up the free weights, you’re not going to succeed. If a child is highly motivated, the self-control techniques and exercises Duckworth tried to teach [the students in her study] might be very helpful. But what if students just aren’t motivated to achieve the goals their teachers or parents want them to achieve? Then, Duckworth acknowledges, all the self-control tricks in the world aren’t going to help.
This is where grit comes in — the X-factor that helps us attain more long-term, abstract goals. To address this, Duckworth and her colleague Chris Peterson developed the Grit Scale — a deceptively simple test, on which you evaluate how much twelve statements apply to you, from “I am a hard worker” to “New ideas and projects sometimes distract me from previous ones.” The results are profoundly predictive of success in domains as wide-ranging as the National Spelling Bee and the West Point military academy. Tough describes the surprising power of this seemingly mundane questionnaire:
For each statement, respondents score themselves on a five-point scale, ranging from 5, “very much like me,” to 1, “not like me at all.” The test takes about three minutes to complete, and it relies entirely on self-report — and yet when Duckworth and Peterson took it out into the field, they found it was remarkably predictive of success. Grit, Duckworth discovered, is only faintly related to IQ — there are smart gritty people and dumb gritty people — but at Penn, high grit scores allowed students who had entered college with relatively low college-board scores to nonetheless achieve high GPAs. At the National Spelling Bee, Duckworth found that children with high grit scores were more likely to survive to the later rounds. Most remarkable, Duckworth and Peterson gave their grit test to more than twelve hundred freshman cadets as they entered the military academy at West Point and embarked on the grueling summer training course known as Beast Barracks. The military has developed its own complex evaluation, called the whole candidate score, to judge incoming cadets and predict which of them will survive the demands of West Point; it includes academic grades, a gauge of physical fitness, and a leadership potential score. But the more accurate predictor of which cadets persisted in Beast Barracks and which ones dropped out turned out to be Duckworth’s simple little twelve-item grit questionnaire.
You can take the Grit Scale here (registration is free).
8. THINKING: THE NEW SCIENCE OF DECISION-MAKING, PROBLEM-SOLVING AND PREDICTION
In the 1970s, Kahneman and his colleague Amos Tversky, self-crowned “prophets of irrationality,” began studying what they called “heuristics and biases” — mental shortcuts we take, which frequently result in cognitive errors. Those errors, however, reveal a great deal about how our minds work:
If you want to characterize how something is done, then one of the most powerful ways of characterizing how the mind does anything is by looking at the errors that the mind produces while it’s doing it because the errors tell you what it is doing. Correct performance tells you much less about the procedure than the errors do.
One of the most fascinating examples of heuristics and biases is what we call intuition — a complex cluster of cognitive processes, sometimes helpful but often misleading. Kahneman notes that thoughts come to mind in one of two ways: Either by “orderly computation,” which involves a series of stages of remembering rules and then applying them, or by perception, an evolutionary function that allows us to predict outcomes based on what we’re perceiving. (For instance, seeing a woman’s angry face helps us predict the general sentiment and disposition of what she’s about to say.) It is the latter mode that precipitates intuition. Kahneman explains the interplay:
There is no sharp line between intuition and perception. … Perception is predictive. . . . If you want to understand intuition, it is very useful to understand perception, because so many of the rules that apply to perception apply as well to intuitive thinking. Intuitive thinking is quite different from perception. Intuitive thinking has language. Intuitive thinking has a lot of word knowledge organized in different ways more than mere perception. But some very basic characteristics [of] perception are extended almost directly to intuitive thinking.
He then considers how the two types of mental operations established by modern cognitive science illuminate intuition:
Type 1 is automatic, effortless, often unconscious, and associatively coherent. . . . Type 2 is controlled, effortful, usually conscious, tends to be logically coherent, rule-governed. Perception and intuition are Type 1. … Type 2 is more controlled, slower, is more deliberate. . . . Type 2 is who we think we are. [And yet] if one made a film on this, Type 2 would be a secondary character who thinks that he is the hero because that’s who we think we are, but in fact, it’s Type 1 that does most of the work, and it’s most of the work that is completely hidden from us.
Type 1 also encompasses all of our practiced skills — for instance, driving, speaking, and understanding a language — which after a certain threshold of mastery enter autopilot mode. (Though this presents its own set of problems.) Underpinning that mode of thinking is our associative memory, which Kahneman unpacks:
You have to think of [your associative memory] as a huge repository of ideas, linked to each other in many ways, including causal links and other links, and activation spreading from ideas to other ideas until a small subset of that enormous network is illuminated, and the subset is what’s happening in the mind at the moment. You’re not conscious of it, you’re conscious of very little of it.
The Type 1 modality of thought gives rise to a System 1 of interpretation, which is at the heart of what we call “intuition” — but which is far less accurate and reliable than we like to believe:
System 1 infers and invents causes and intentions. [This] happens automatically. Infants have it. . . . We’re equipped … for the perception of causality.
It neglects ambiguity and suppresses doubt and … exaggerates coherence. Associative coherence [is] in large part where the marvels turn into flaws. We see a world that is vastly more coherent than the world actually is. That’s because of this coherence-creating mechanism that we have. We have a sense-making organ in our heads, and we tend to see things that are emotionally coherent, and that are associatively coherent.
Most treacherous of all is our tendency to use our very confidence — and overconfidence — as evidence itself:
What’s interesting is that many a time people have intuitions that they’re equally confident about except they’re wrong. That happens through the mechanism I call “the mechanism of substitution.” You have been asked a question, and instead you answer another question, but that answer comes by itself with complete confidence, and you’re not aware that you’re doing something that you’re not an expert on because you have one answer. Subjectively, whether it’s right or wrong, it feels exactly the same. Whether it’s based on a lot of information, or a little information, this is something that you may step back and have a look at. But the subjective sense of confidence can be the same for intuition that arrives from expertise, and for intuitions that arise from heuristics. . . .
In the foreword to the book, Behance founder Scott Belsky, author of the indispensable Making Ideas Happen, points to “reactionary workflow” — our tendency to respond to requests and other stimuli rather than create meaningful work — as today’s biggest problem and propounds a call to arms:
It’s time to stop blaming our surroundings and start taking responsibility. While no workplace is perfect, it turns out that our gravest challenges are a lot more primal and personal. Our individual practices ultimately determine what we do and how well we do it. Specifically, it’s our routine (or lack thereof), our capacity to work proactively rather than reactively, and our ability to systematically optimize our work habits over time that determine our ability to make ideas happen.
Only by taking charge of your day-to-day can you truly make an impact in what matters most to you. I urge you to build a better routine by stepping outside of it, find your focus by rising above the constant cacophony, and sharpen your creative prowess by analyzing what really matters most when it comes to making your ideas happen.
We tend to overestimate what we can do in a short period, and underestimate what we can do over a long period, provided we work slowly and consistently. Anthony Trollope, the nineteenth-century writer who managed to be a prolific novelist while also revolutionizing the British postal system, observed, “A small daily task, if it be really daily, will beat the labours of a spasmodic Hercules.” Over the long run, the unglamorous habit of frequency fosters both productivity and creativity.
You’re much more likely to spot surprising relationships and to see fresh connections among ideas, if your mind is constantly humming with issues related to your work. When I’m deep in a project, everything I experience seems to relate to it in a way that’s absolutely exhilarating. The entire world becomes more interesting. That’s critical, because I have a voracious need for material, and as I become hyperaware of potential fodder, ideas pour in. By contrast, working sporadically makes it hard to keep your focus. It’s easy to become blocked, confused, or distracted, or to forget what you were aiming to accomplish.
Creativity arises from a constant churn of ideas, and one of the easiest ways to encourage that fertile froth is to keep your mind engaged with your project. When you work regularly, inspiration strikes regularly.
Step by step, you make your way forward. That’s why practices such as daily writing exercises or keeping a daily blog can be so helpful. You see yourself do the work, which shows you that you can do the work. Progress is reassuring and inspiring; panic and then despair set in when you find yourself getting nothing done day after day. One of the painful ironies of work life is that the anxiety of procrastination often makes people even less likely to buckle down in the future.
I have a long list of “Secrets of Adulthood,” the lessons I’ve learned as I’ve grown up, such as: “It’s the task that’s never started that’s more tiresome,” “The days are long, but the years are short,” and “Always leave plenty of room in the suitcase.” One of my most helpful Secrets is, “What I do every day matters more than what I do once in a while.”
Day by day, we build our lives, and day by day, we can take steps toward making real the magnificent creations of our imaginations.
Entrepreneurship guru and culture-sage Seth Godin seconds Rubin and admonishes against confusing vacant ritualization with creative rituals that actually spur productivity:
Everybody who does creative work has figured out how to deal with their own demons to get their work done. There is no evidence that setting up your easel like Van Gogh makes you paint better. Tactics are idiosyncratic. But strategies are universal, and there are a lot of talented folks who are not succeeding the way they want to because their strategies are broken.
The strategy is simple, I think. The strategy is to have a practice, and what it means to have a practice is to regularly and reliably do the work in a habitual way.
There are many ways you can signify to yourself that you are doing your practice. For example, some people wear a white lab coat or a particular pair of glasses, or always work in a specific place — in doing these things, they are professionalizing their art.
The notion that I do my work here, now, like this, even when I do not feel like it, and especially when I do not feel like it, is very important. Because lots and lots of people are creative when they feel like it, but you are only going to become a professional if you do it when you don’t feel like it. And that emotional waiver is why this is your work and not your hobby.
Grant’s extensive research has shed light on a crucial element of success, debunking some enduring tenets of cultural mythology:
According to conventional wisdom, highly successful people have three things in common: motivation, ability, and opportunity. If we want to succeed, we need a combination of hard work, talent, and luck. [But there is] a fourth ingredient, one that’s critical but often neglected: success depends heavily on how we approach our interactions with other people. Every time we interact with another person at work, we have a choice to make: do we try to claim as much value as we can, or contribute value without worrying about what we receive in return?
At the heart of his insight is a dichotomy of behavioral styles people adopt in pursuing success:
Takers have a distinctive signature: they like to get more than they give. They tilt reciprocity in their own favor, putting their own interests ahead of others’ needs. Takers believe that the world is a competitive, dog-eat-dog place. They feel that to succeed, they need to be better than others. To prove their competence, they self-promote and make sure they get plenty of credit for their efforts. Garden-variety takers aren’t cruel or cutthroat; they’re just cautious and self-protective. “If I don’t look out for myself first,” takers think, “no one will.”
Grant contrasts takers with givers:
In the workplace, givers are a relatively rare breed. They tilt reciprocity in the other direction, preferring to give more than they get. Whereas takers tend to be self-focused, evaluating what other people can offer them, givers are other-focused, paying more attention to what other people need from them. These preferences aren’t about money: givers and takers aren’t distinguished by how much they donate to charity or the compensation that they command from their employers. Rather, givers and takers differ in their attitudes and actions toward other people. If you’re a taker, you help others strategically, when the benefits to you outweigh the personal costs. If you’re a giver, you might use a different cost-benefit analysis: you help whenever the benefits to others exceed the personal costs. Alternatively, you might not think about the personal costs at all, helping others without expecting anything in return. If you’re a giver at work, you simply strive to be generous in sharing your time, energy, knowledge, skills, ideas, and connections with other people who can benefit from them.
Outside the workplace, Grant argues by citing Yale psychologist Margaret Clark’s research, most of us are givers in close relationships like marriages and friendships, contributing without preoccupation with keeping score. In the workplace, however, few of us are purely givers or takers — rather, what dominates is a third style:
We become matchers, striving to preserve an equal balance of giving and getting. Matchers operate on the principle of fairness: when they help others, they protect themselves by seeking reciprocity. If you’re a matcher, you believe in tit for tat, and your relationships are governed by even exchanges of favors.
Giving, taking, and matching are three fundamental styles of social interaction, but the lines between them aren’t hard and fast. You might find that you shift from one reciprocity style to another as you travel across different work roles and relationships. It wouldn’t be surprising if you act like a taker when negotiating your salary, a giver when mentoring someone with less experience than you, and a matcher when sharing expertise with a colleague. But evidence shows that at work, the vast majority of people develop a primary reciprocity style, which captures how they approach most of the people most of the time. And this primary style can play as much of a role in our success as hard work, talent, and luck.
Despite ample evidence and countless testaments to the contrary, there persists a toxic cultural mythology that creative and intellectual excellence comes from a passive gift bestowed upon the fortunate few by the gods of genius, rather than being the product of the active application and consistent cultivation of skill. So what might the root of that stubborn fallacy be? Childhood and upbringing, it turns out, might have a lot to do with it.
In The Examined Life: How We Lose and Find Ourselves (public library), psychoanalyst and University College London professor Stephen Grosz builds on more than 50,000 hours of conversation from his quarter-century experience as a practicing psychoanalyst to explore the machinery of our inner life, with insights that are invariably profound and often provocative — for instance, a section titled “How praise can cause a loss of confidence,” in which Grosz writes:
Nowadays, we lavish praise on our children. Praise, self-confidence and academic performance, it is commonly believed, rise and fall together. But current research suggests otherwise — over the past decade, a number of studies on self-esteem have come to the conclusion that praising a child as ‘clever’ may not help her at school. In fact, it might cause her to under-perform. Often a child will react to praise by quitting — why make a new drawing if you have already made ‘the best’? Or a child may simply repeat the same work — why draw something new, or in a new way, if the old way always gets applause?
Grosz cites psychologists Carol Dweck and Claudia Mueller’s famous 1998 study, which divided 128 children ages 10 and 11 into two groups. All were asked to solve mathematical problems, but one group was praised for intellect (“You did really well, you’re so clever.”) and the other for effort (“You did really well, you must have tried really hard.”) The kids were then given more complex problems, which those previously praised for their hard work approached with dramatically greater resilience and willingness to try different approaches whenever they reached a dead end. By contrast, those who had been praised for their cleverness were much more anxious about failure, stuck with tasks they had already mastered, and dwindled in tenacity in the face of new problems. Grosz summarizes the now-legendary findings:
Ultimately, the thrill created by being told ‘You’re so clever’ gave way to an increase in anxiety and a drop in self-esteem, motivation and performance. When asked by the researchers to write to children in another school, recounting their experience, some of the ‘clever’ children lied, inflating their scores. In short, all it took to knock these youngsters’ confidence, to make them so unhappy that they lied, was one sentence of praise.
He goes on to admonish against today’s culture of excessive parental praise, which he argues does more for lifting the self-esteem of the parents than for cultivating a healthy one in their children:
Admiring our children may temporarily lift our self-esteem by signaling to those around us what fantastic parents we are and what terrific kids we have — but it isn’t doing much for a child’s sense of self. In trying so hard to be different from our parents, we’re actually doing much the same thing — doling out empty praise the way an earlier generation doled out thoughtless criticism. If we do it to avoid thinking about our child and her world, and about what our child feels, then praise, just like criticism, is ultimately expressing our indifference.
To explore what the healthier substitute for praise might be, he recounts observing an eighty-year-old remedial reading teacher named Charlotte Stiglitz, the mother of the Nobel Prize-winning economist Joseph Stiglitz, who told Grosz of her teaching methodology:
‘I don’t praise a small child for doing what they ought to be able to do,’ she told me. ‘I praise them when they do something really difficult — like sharing a toy or showing patience. I also think it is important to say “thank you”. When I’m slow in getting a snack for a child, or slow to help them and they have been patient, I thank them. But I wouldn’t praise a child who is playing or reading.’
Rather than utilizing the familiar mechanisms of reward and punishment, Grosz observed, Charlotte’s method relied on keen attentiveness to “what a child did and how that child did it.” Presence, he argues, helps build the child’s confidence by indicating he is worthy of the observer’s thoughts and attention — its absence, on the other hand, divorces, in the child’s mind, the journey from the destination by instilling a sense that the activity itself is worthless unless it’s a means to obtaining praise. Grosz reminds us how this plays out for all of us, and why it matters throughout life:
Being present, whether with children, with friends, or even with oneself, is always hard work. But isn’t this attentiveness — the feeling that someone is trying to think about us — something we want more than praise?
Pink, wary of the disagreeable twinges accompanying the claim that everyone should self-identify as a salesperson, preemptively counters in the introduction:
I’m convinced we’ve gotten it wrong.
This is a book about sales. But it is unlike any book about sales you have read (or ignored) before. That’s because selling in all its dimensions — whether pushing Buicks on a lot or pitching ideas in a meeting — has changed more in the last ten years than it did over the previous hundred. Most of what we think we understand about selling is constructed atop a foundation of assumptions that have crumbled.
Selling, I’ve grown to understand, is more urgent, more important, and, in its own sweet way, more beautiful than we realize. The ability to move others to exchange what they have for what we have is crucial to our survival and our happiness. It has helped our species evolve, lifted our living standards, and enhanced our daily lives. The capacity to sell isn’t some unnatural adaptation to the merciless world of commerce. It is part of who we are.
One of Pink’s most fascinating arguments echoes artist Chuck Close, who famously noted that “our whole society is much too problem-solving oriented. It is far more interesting to [participate in] ‘problem creation.’” Pink cites the research of celebrated social scientists Jacob Getzels and Mihaly Csikszentmihalyi, who in the 1960s recruited three dozen fourth-year art students for an experiment. They brought the young artists into a studio with two large tables. The first table displayed 27 eclectic objects that the school used in its drawing classes. The students were instructed to select one or more objects, then arrange a still life on the second table and draw it. What happened next reveals an essential pattern about how creativity works:
The young artists approached their task in two distinct ways. Some examined relatively few objects, outlined their idea swiftly, and moved quickly to draw their still life. Others took their time. They handled more objects, turned them this way and that, rearranged them several times, and needed much longer to complete the drawing. As Csikszentmihalyi saw it, the first group was trying to solve a problem: How can I produce a good drawing? The second was trying to find a problem: What good drawing can I produce?
When Csikszentmihalyi then assembled a group of art experts to evaluate the resulting works, he found that the problem-finders’ drawings had been ranked much higher in creativity than the problem-solvers’. Ten years later, the researchers tracked down these art students, who at that point were working for a living, and found that about half had left the art world, while the other half had gone on to become professional artists. That latter group was composed almost entirely of problem-finders. Another decade later, the researchers checked in again and discovered that the problem-finders were “significantly more successful — by the standards of the artistic community — than their peers.” Getzels concluded:
It is in fact the discovery and creation of problems rather than any superior knowledge, technical skill, or craftsmanship that often sets the creative person apart from others in his field.
The more compelling view of the nature of problems has enormous implications for the new world of selling. Today, both sales and non-sales selling depend more on the creative, heuristic, problem-finding skills of artists than on the reductive, algorithmic, problem-solving skills of technicians.
Another fascinating chapter reveals counterintuitive insights about the competitive advantages of introversion vs. extraversion. While new theories might extol the power of introverts over traditional exaltations of extraversion, the truth turns out to be quite different. Pink turns to the research of social psychologist Adam Grant, management professor at the Wharton School of Business at the University of Pennsylvania (my alma mater).
Grant measured where a sample of call center sales representatives fell on the introversion-extraversion spectrum, then correlated that with their actual sales figures. Unsurprisingly, Grant found that extraverts averaged $125 per hour in revenue, exceeding introverts’ $120. His most surprising finding, however, was that “ambiverts” — those who fell in the middle of the spectrum, “not too hot, not too cold” — performed best of all, with an hourly average of $155. The outliers who brought in an astounding $208 per hour scored a solid 4 on the 1-7 introversion-extraversion scale.
Pink synthesizes the findings into an everyday insight for the rest of us:
The best approach is for the people on the ends to emulate those in the center. As some have noted, introverts are ‘geared to inspect,’ while extraverts are ‘geared to respond.’ Selling of any sort — whether traditional sales or non-sales selling — requires a delicate balance of inspecting and responding. Ambiverts can find that balance. They know when to speak and when to shut up. Their wider repertoires allow them to achieve harmony with a broader range of people and a more varied set of circumstances. Ambiverts are the best movers because they’re the most skilled attuners.
Pink goes on to outline “the new ABCs of moving others” — attunement (“the ability to bring one’s actions and outlook into harmony with other people and with the context you’re [sic] in”), buoyancy (a trifecta of “interrogative self-talk” that moves from making statements to asking questions, contagious “positivity,” and an optimistic “explanatory style” of explaining negative events to yourself), and clarity (“the capacity to help others see their situations in fresh and more revealing ways and to identify problems they didn’t realize they had”).
“I pray to Jesus to preserve my sanity,” Jack Kerouac professed in discussing his writing routine. But those of us who fall on the more secular end of the spectrum might need a slightly more potent sanity-preservation tool than prayer. That’s precisely what writer and psychotherapist Philippa Perry offers in How To Stay Sane (public library; UK), part of The School of Life’s wonderful series reclaiming the traditional self-help genre as intelligent, non-self-helpy, yet immensely helpful guides to modern living.
Our stories give shape to our inchoate, disparate, fleeting impressions of everyday life. They bring together the past and the future into the present to provide us with structures for working towards our goals. They give us a sense of identity and, most importantly, serve to integrate the feelings of our right brain with the language of our left.
We are primed to use stories. Part of our survival as a species depended upon listening to the stories of our tribal elders as they shared parables and passed down their experience and the wisdom of those who went before. As we get older it is our short-term memory that fades rather than our long-term memory. Perhaps we have evolved like this so that we are able to tell the younger generation about the stories and experiences that have formed us which may be important to subsequent generations if they are to thrive.
I worry, though, about what might happen to our minds if most of the stories we hear are about greed, war and atrocity.
Perry goes on to cite research indicating that people who watch television for more than four hours a day see themselves as far more likely to fall victim to a violent incident in the forthcoming week than their peers who watch less than two hours a day. Just as E. B. White advocated for the responsibility of the writer “to lift people up, not lower them down,” so too is it our responsibility as the writers of our own life-stories to avoid the well-documented negativity bias of modern media — because, as artist Austin Kleon wisely put it, “you are a mashup of what you let into your life.” Perry writes:
Be careful which stories you expose yourself to.
The meanings you find, and the stories you hear, will have an impact on how optimistic you are: it’s how we evolved. … If you do not know how to draw positive meaning from what happens in life, the neural pathways you need to appreciate good news will never fire up.
The trouble is, if we do not have a mind that is used to hearing good news, we do not have the neural pathways to process such news.
You may find that you have been telling yourself that practicing optimism is a risk, as though, somehow, a positive attitude will invite disaster and so if you practice optimism it may increase your feelings of vulnerability. The trick is to increase your tolerance for vulnerable feelings, rather than avoid them altogether.
Optimism does not mean continual happiness, glazed eyes and a fixed grin. When I talk about the desirability of optimism I do not mean that we should delude ourselves about reality. But practicing optimism does mean focusing more on the positive fall-out of an event than on the negative. … I am not advocating the kind of optimism that means you blow all your savings on a horse running at a hundred to one; I am talking about being optimistic enough to sow some seeds in the hope that some of them will germinate and grow into flowers.
We all like to think we keep an open mind and can change our opinions in the light of new evidence, but most of us seem to be geared to making up our minds very quickly. Then we process further evidence not with an open mind but with a filter, only acknowledging the evidence that backs up our original impression. It is too easy for us to fall into the trap of believing that being right is more important than being open to what might be.
If we practice detachment from our thoughts we learn to observe them as though we are taking a bird’s eye view of our own thinking. When we do this, we might find that our thinking belongs to an older, and different, story to the one we are now living.
We need to look at the repetitions in the stories we tell ourselves [and] at the process of the stories rather than merely their surface content. Then we can begin to experiment with changing the filter through which we look at the world, start to edit the story and thus regain flexibility where we have been getting stuck.
Love is the great intangible. In our nightmares, we can create beasts out of pure emotion. Hate stalks the streets with dripping fangs, fear flies down narrow alleyways on leather wings, and jealousy spins sticky webs across the sky. In daydreams, we can maneuver with poise, foiling an opponent, scoring high on fields of glory while crowds cheer, cutting fast to the heart of an adventure. But what dream state is love? Frantic and serene, vigilant and calm, wrung-out and fortified, explosive and sedate — love commands a vast army of moods. Hoping for victory, limping from the latest skirmish, lovers enter the arena once again. Sitting still, we are as daring as gladiators.
Love is the white light of emotion. It includes many feelings which, out of laziness and confusion, we crowd into one simple word. Art is the prism that sets them free, then follows the gyrations of one or a few. When art separates this thick tangle of feelings, love bares its bones. But it cannot be measured or mapped. Everyone admits that love is wonderful and necessary, yet no one can agree on what it is.
Even the very etymology of love shies away from explaining how, when, and why we imbued love with such immense significance:
What a small word we use for an idea so immense and powerful it has altered the flow of history, calmed monsters, kindled works of art, cheered the forlorn, turned tough guys to mush, consoled the enslaved, driven strong women mad, glorified the humble, fueled national scandals, bankrupted robber barons, and made mincemeat of kings. How can love’s spaciousness be conveyed in the narrow confines of one syllable? If we search for the source of the word, we find a history vague and confusing, stretching back to the Sanskrit lubhyati (“he desires”). I’m sure the etymology rambles back much farther than that, to a one-syllable word heavy as a heartbeat. Love is an ancient delirium, a desire older than civilization, with taproots stretching deep into dark and mysterious days.
We think of it as a sort of traffic accident of the heart. It is an emotion that scares us more than cruelty, more than violence, more than hatred. We allow ourselves to be foiled by the vagueness of the word. After all, love requires the utmost vulnerability. We equip someone with freshly sharpened knives; strip naked; then invite him to stand close. What could be scarier?
Common as childbirth, love seems rare nonetheless, always catches one by surprise, and cannot be taught. Each child rediscovers it, each couple redefines it, each parent reinvents it. People search for love as if it were a city lost beneath the desert dunes, where pleasure is the law, the streets are lined with brocade cushions, and the sun never sets.
Ackerman offers an important disclaimer on how we think about the history of love, which is in effect a universal reflection on all of history and something we too often forget — the idea that everything builds on what came before:
It’s tempting to think of love as a progression, from ignorance toward the refined light of reason, but that would be a mistake. The history of love is not a ladder we climb rung by rung leaving previous rungs below. Human history is not a journey across a landscape, in the course of which we leave one town behind as we approach another. Nomads constantly on the move, we carry everything with us, all we possess. We carry the seeds and nails and remembered hardships of everywhere we have lived, the beliefs and hurts and bones of every ancestor. Our baggage is heavy. We can’t bear to part with anything that ever made us human. The way we love in the twentieth century is as much an accumulation of past sentiments as a response to modern life.
Much like the study of psychology, which has a long history of treating pathology by bringing our emotions from the negative to the neutral and only a nascent interest in the kind of “positive psychology” that elevates us above the neutral, Ackerman points out that the science of love has been largely confined to examining the negative — and yet, that misses the most rewarding marvels of all:
After all, there are countless studies on war, hate, crime, prejudice, and so on. Social scientists prefer to study negative behaviors and emotions. Perhaps they don’t feel as comfortable studying love per se. I add that “per se” because they are studying love — often they’re studying what happens when love is deficient, thwarted, warped, or absent. … We have the great fortune to live on a planet abounding with humans, plants, and animals; and I often marvel at the strange tasks evolution sets them. Of all the errands life seems to be running, of all the mysteries that enchant us, love is my favorite.