“Fear binds people together. And fear disperses them. Courage inspires communities: the courage of an example — for courage is as contagious as fear.”
By Maria Popova
The recent anniversary of Rosa Parks’s arrest, which sparked the historic Montgomery Bus Boycott, reminded me of “On Courage and Resistance” — the timeless Oscar Romero Award keynote address Susan Sontag delivered on March 30, 2003, originally published in the 2007 posthumous anthology At the Same Time: Essays and Speeches (public library). In honoring the Israeli soldiers who defied orders and refused to serve in the occupied territories, Sontag examines the osmosis between individual acts and collective fate, the interplay between morality and courage, and the role of fear in violence:
Fear binds people together. And fear disperses them. Courage inspires communities: the courage of an example — for courage is as contagious as fear. But courage, certain kinds of courage, can also isolate the brave.
The perennial destiny of principles: while everyone professes to have them, they are likely to be sacrificed when they become inconveniencing. Generally a moral principle is something that puts one at variance with accepted practice. And that variance has consequences, sometimes unpleasant consequences, as the community takes its revenge on those who challenge its contradictions — who want a society actually to uphold the principles it professes to defend.
The standard that a society should actually embody its own professed principles is a utopian one, in the sense that moral principles contradict the way things really are — and always will be. How things really are — and always will be — is neither all evil nor all good but deficient, inconsistent, inferior. Principles invite us to do something about the morass of contradictions in which we function morally. Principles invite us to clean up our act, to become intolerant of moral laxity and compromise and cowardice and the turning away from what is upsetting: that secret gnawing of the heart that tells us that what we are doing is not right, and so counsels us that we’d be better off just not thinking about it.
The cry of the antiprincipled: ‘I’m doing the best I can.’ The best given the circumstances, of course.
At the center of our moral life and our moral imagination are the great models of resistance: the great stories of those who have said no. No, I will not serve.
Courage has no moral value in itself, for courage is not, in itself, a moral virtue. Vicious scoundrels, murderers, terrorists may be brave. To describe courage as a virtue, we need an adjective: we speak of ‘moral courage’ — because there is such a thing as amoral courage, too.
She zooms in on the Israel-Palestine conflict and its reverberations around the world:
A wounded and fearful country, Israel, is going through the greatest crisis of its turbulent history, brought about by the policy of steadily increasing and reinforcing settlements on the territories won after its victory in the Arab-Israeli war of 1967. The decision of successive Israeli governments to retain control over the West Bank and Gaza, thereby denying their Palestinian neighbors a state of their own, is a catastrophe — moral, human, and political — for both peoples. The Palestinians need a sovereign state. Israel needs a sovereign Palestinian state. Those of us abroad who wish for Israel to survive cannot, should not, wish it to survive no matter what, no matter how. We owe a particular debt of gratitude to courageous Israeli Jewish witnesses, journalists, architects, poets, novelists, professors — among others — who have described and documented and protested and militated against the sufferings of the Palestinians living under the increasingly cruel terms of Israeli military subjugation and settler annexation.
The Israeli soldiers who are resisting service in the Occupied Territories are not refusing a particular order. They are refusing to enter the space where illegitimate orders are bound to be given… What the refuseniks have done — there are now more than one thousand of them, more than 250 of whom have gone to prison — does not contribute to tell us how the Israelis and Palestinians can make peace beyond the irrevocable demand that the settlements be disbanded. The actions of this heroic minority cannot contribute to the much-needed reform and democratization of the Palestinian Authority. Their stand will not lessen the grip of religious bigotry and racism in Israeli society or reduce the dissemination of virulent anti-Semitic propaganda in the aggrieved Arab world. It will not stop the suicide bombers.
It simply declares: enough. Or: there is a limit. Yesh gvul.
It provides a model of resistance. Of disobedience. For which there will always be penalties.
Sontag then issues a critique all the more apt today, nearly a decade of wars later:
Our ‘United We Stand’ or ‘Winner Takes All’ ethos: the United States is a country that has made patriotism equivalent to consensus.
On the flawed logic of going to — and staying at — war:
The force of arms has its own logic. If you commit an aggression and others resist, it is easy to convince the home front that the fighting must continue. Once the troops are there, they must be supported. It becomes irrelevant to question why the troops are there in the first place.
Sontag zooms back out into the bigger picture:
Let’s not underestimate the force of what we are opposing.
The world is, for almost everyone, that over which we have virtually no control. Common sense and the sense of self-protectiveness tell us to accommodate to what we cannot change.
It’s not hard to see how some of us might be persuaded of the justice, the necessity of a war. Especially of a war that is formulated as small, limited military actions that will actually contribute to peace or improve security; of an aggression that announces itself as a campaign of disarmament — admittedly, disarmament of the enemy; and, regrettably, requiring the application of overpowering force. An invasion that calls itself, officially, a liberation.
Every violence in war has been justified as a retaliation. We are threatened. We are defending ourselves. The others, they want to kill us. We must stop them.
Never mind the disparity of forces, of wealth, of firepower — or simply of population. How many Americans know that the population of Iraq is 24 million, half of whom are children? (The population of the United States, as you will remember, is 290 million.) Not to support those who are coming under fire from the enemy seems like treason.
She illustrates the case for personal responsibility — something Joan Didion pointed to as the pillar of character — with an example of how seemingly ineffectual individual acts of resistance can spark massively influential chain reactions of effects:
Thoreau’s going to prison in 1846 for refusing to pay the poll tax in protest against the American war on Mexico hardly stopped the war. But the resonance of that most unpunishing and briefest spell of imprisonment (famously, a single night in jail) has not ceased to inspire principled resistance to injustice through the second half of the twentieth century and into our new era. The movement in the late 1980s to shut down the Nevada Test Site, a key location for the nuclear arms race, failed in its goal; the operations of the test site were unaffected by the protests. But it led directly to the formation of a movement of protesters in faraway Alma Ata, who eventually succeeded in shutting down the main Soviet test site in Kazakhstan, citing the Nevada antinuclear activists as their inspiration and expressing solidarity with the Native Americans on whose land the Nevada Test Site had been located.
The likelihood that your acts of resistance cannot stop the injustice does not exempt you from acting in what you sincerely and reflectively hold to be the best interests of your community.
Thus: it is not in the best interests of Israel to be an oppressor.
Thus: it is not in the best interests of the United States to be a hyperpower, capable of imposing its will on any country in the world, as it chooses.
Sontag concludes with a necessary reminder that, just like the light and heat of the distant sun are responsible for the flame in your fireplace, our local, individual actions are inextricably connected to and fractionally instrumental in our global, collective fate:
Beyond these struggles, which are worthy of our passionate adherence, it is important to remember that in programs of political resistance the relation of cause and effect is convoluted and often indirect. All struggle, all resistance is — must be — concrete. And all struggle has a global resonance.
If not here, then there. If not now, then soon. Elsewhere as well as here.
For nearly two decades, visionary Indian indie publisher Tara Books has been giving voice to marginalized art and literature through a commune of artists, writers, and designers collaborating on beautifully crafted books celebrating Indian folk art traditions. Their latest gem, Drawing from the City (public library) by artist Tejubehan, is both more exquisite and born out of a more moving personal story than just about any book I’ve come across. Its gorgeous black-and-white pen-and-ink drawings, brimming with expressive lines and dots somewhere between Yayoi Kusama and Edward Gorey, tell the partly autobiographical, partly escapist tale of this self-taught artist who came of age as a woman trapped between unimaginable poverty and a wildly imaginative inner world in a patriarchal society.
Tejubehan takes us on a journey from her small village into the big city, where her poor parents move to find work. Three years pass. Teju is now a young woman and she marries a man who sings for a living. With his encouragement, she becomes an artist.
It is like magic. I sit in one place with paper and pen, and it is my hand that starts to move. Lines, dots, more lines, and more dots, and you have a picture. I can bring to life things that I have seen and know, but also things that I imagine. I can even bring the two together.
I have been moving all my life, looking for ways to survive, but this is a new direction. My heart is full.
I see a girl going somewhere on a bicycle, and I draw a whole group of girls, all of them on the way somewhere.
We reach the city! Everything is on the move here, not just the train. People rush past, pushing their way through the streets. Only the tall buildings seem rooted to the spot, along with a few trees that stand guard on the other side.
I don’t mind the rush though. The sun is setting, and I marvel at the lampposts that can turn night into day. Nights in our village are really dark.
At its heart, however, the story is really a feminist story — a vision for women’s liberation in a culture with oppressive gender norms and limiting social expectations. In envisioning the woman of the city — biking, driving, flying — Tejubehan is really envisioning what it might be like to live in a world where to be female means to be free to move and free to just be.
I like cars. I wonder what it’s like to move at such a high speed and to be in control of where you’re going. There are always two women in my cars. One drives and the other looks out of the window.
I want to be both of those women.
But even in the plane, my women are not content to sit still. So I float them down, wondering where they should go next. Should they fly forever like birds? Or should I draw some lines taking them down to the sea?
I rest my pen here for a moment. I have time to decide.
Like many of Tara’s other books, Drawing from the City has been silkscreen-printed and bound by hand on handmade paper. The cover is colored with traditional Indian dyes, emanating an enchanting earthy smell that reminds you what it’s like to hold an analog labor of love in your analog human hands.
Originally featured, with more images, in October.
Since childhood, Kusama has had a rare condition that makes her see colorful spots on everything she looks at. Her vision, both literally and creatively, is thus naturally surreal, almost hallucinogenic. Her vibrant artwork, sewn together in a magnificent fabric-bound hardcover tome, becomes an exquisite embodiment of Carroll’s story and his fascination with the extraordinary way in which children see and explore the ordinary world.
Originally featured, with more photos and a trailer, in April.
STEAL LIKE AN ARTIST
Much has been said about how creativity works, what its secrets are, and where good ideas come from, but most of that wisdom can be lost on young minds just dipping their toes in the vast and tumultuous ocean of self-initiated creation. Some time ago, artist and writer Austin Kleon — one of my favorite thinkers, a keen observer of and participant in the creative economy of the digital age — was invited to give a talk to students, the backbone for which was a list of 10 things he wished he’d heard as a young creator.
The book opens with a timeless T.S. Eliot endorsement of remix culture:
Immature poets imitate; mature poets steal; bad poets deface what they take, and good poets make it into something better, or at least something different.
Kleon goes on to delineate the qualities you’ll need to cultivate for the creative life — things like kindness, curiosity, “productive procrastination,” “a willingness to look stupid” — demonstrating that “creativity” isn’t some abstract phenomenon bestowed upon the fortunate few but, rather, a deliberate mindset and pragmatic ethos we can architect for ourselves. As he puts it, “you are a mashup of what you let into your life.”
He writes in the introduction:
It’s one of my theories that when people give advice, they’re really just talking to themselves in the past.
This book is me talking to a previous version of myself.
These are things I’ve learned over almost a decade of trying to figure out how to make art, but a funny thing happened when I started sharing them with others — I realized that they aren’t just for artists. They’re for everyone.
These ideas are for everyone who’s trying to inject some creativity into their life and their work. (That should describe all of us.)
Draw the art you want to see, start the business you want to run, play the music you want to hear, write the books you want to read, build the products you want to use — do the work you want to see done.
What does it mean to ‘change art’? Art, in any definition, is so much a business of transformation that change is always and everywhere part of its nature, whether you think of it in physical terms (stone into statue) or in intellectual or spiritual ones (giving form to invisible things). No sooner has an idea changed art than art reformulates that idea, allowing it to recognize itself. Around the early fifth century BC, for example, Greek sculptors changed the way they represented naked figures, probably under the influence of certain intellectual attitudes to the human body. At the same time, their nude statues endowed fifth-century Greek ideas about what it means to be human with an extraordinarily long and fertile posterity. As so often where art is concerned, the transformation works both ways, more on the analogy of a chemical reaction than the introduction of a new material in engineering or a new process in politics.
Whereas stories are diachronic — they take time in the telling and involve the unfolding of events through time — visual images work synchronically, being interpreted almost instantaneously by the viewer. Visual artists have therefore developed a wide range of strategies for the task of storytelling.
In 2007, artist and illustrator Jane Mount began painting “portraits of people through the spines of their books” — those aspirational bookshelves we all hold in our heads (and, ideally, on our walls), full of all the books that helped us discover and rediscover who we are, what we stand for, and what we’d like to become. A kind of book spine poetry of identity. In 2010, she paired with Paris Review writer Thessaly La Force and the two asked more than a hundred of today’s most exciting creators — writers, artists, designers, critics, filmmakers, chefs, architects — what those favorite, timeless books were for them. Thus, My Ideal Bookshelf (public library) was born — a magnificent collection of Mount’s illustrated “portraits” of these modern-day icons, alongside short essays by each contributor explaining why the books included are meaningful to him or her. Besides the sheer voyeuristic pleasure of peeking inside the personal libraries of great minds, the project is at once a celebration of bibliophilia and a testament to the fact that the most interesting people are woven of incredibly eclectic influences.
La Force offers a necessary disclaimer in the introduction:
So much depends on where you, the reader, are — physically and metaphorically — when you decide to pick up a book and give it a chance. Which explains why there’s no such thing as one ideal bookshelf; there is no ur-bookshelf. It would be a mistake to try to read this book with that goal in mind. In the end, the one element that links all the ideal bookshelves in these pages is the never-ending search. We’re all still hunting, still hoping to discover one more book that we’ll love and treasure for the rest of our lives.
I love the architecture of public libraries, the very large windows. Inside it’s polished, it’s quiet; during the day, the sun is usually streaming through one room or another. And all the people are sitting there together, but they’re all going to completely different places through the books they’re reading.
Jennifer Egan contemplates the substance of life:
My goal as a writer is to do as much as possible at one time. Life itself is so cacophonous and complex. It’s not that I want to create a cacophony, but I want to do justice to the complexity around us. I don’t want to oversimplify it. I want to take one thing and build from that, and then keep building, until I begin to approximate the complexity of the world and our perceptions of it.
MoMA’s Paola Antonelli considers a book’s content in contrast to its thing-ness:
Hello World is a new book by Alice Rawsthorn, the one and only, the best design critic in the entire world. She keeps the banner of design flying high. Irma Boom designed it, and Irma is very simply the best book designer alive. I personally love reading books electronically. I proudly have a big wall of books in my apartment, but I’m continually getting rid of books that get on my nerves because I don’t think they’re good enough to deserve to take up space in my life. You can walk into a bookstore and find that 95 percent of the books on display might as well have been directly electronic. Mind you, they might be great texts, fabulous additions to human knowledge, but they did not need to have their own paper body. I want physical books to have a concept. Irma designs objects. Her books are breathtaking as things.
The thing about this bookshelf is that each of these books is a vast experience unto itself, while also being both self-contained and superbly useless. Reading any one of them doesn’t get you anywhere particularly meaningful; you haven’t arrived or graduated; you’ve just gone and done something that passes the time. It’s like taking a long walk with a friend who’s got a lot to say. There’s no cumulative purpose to it — it’s just an excellent way to waste your life.
Illustrator extraordinaire Christoph Niemann speaks to the indispensable value of influence and considers David Foster Wallace as a creative echelon:
I think the most successful illustrations are those that build on some other reference. You can’t completely reinvent something.
For me, David Foster Wallace is almost painful to read. It’s like he’s mumbling. You think he’s just writing down every single idea that comes into his head, but then when you reach the end, you realize that every sentence has been perfectly composed. I wish I could find something in his work that I could put to use in my own.
The one and only Patti Smith touches on that quality of literature that makes it the original “Internet” of hyperlinked discovery:
I longed to read everything I possibly could, and the things I read in turn produced new yearnings.
I really think you can’t progress as a writer unless you read, and the ideal time to read is when you can read generously. It didn’t even occur to me that I could have a book of my own in the library someday. That’s how you should read.
Design and typography maestro Stefan Sagmeister draws a parallel between his design process and his book selection:
As a designer, I often use a process described by the Maltese philosopher Edward de Bono. He suggests starting to think about an idea for a particular project by taking a random object as a point of departure. So, let’s say I have to design a pen. Instead of looking at other pens, and thinking about how pens are used and who my target audience is, and so on and so forth, I’ll consider, say — I’m in a hotel room right now — bedspreads.
I started writing for children because someone asked me to. I thought it was a different skill set, even though it’s really not. I asked the editor to send me a bunch of children’s books that the publishing house had published. And they were all terrible. Every single one of them. Which inspired me.
But perhaps most poignant of all, or at least most resonant with my own relationship with books, is writer Pico Iyer:
What more could one ask of a companion? To be forever new and yet forever steady. To be strange and familiar all at once, with enough change to quicken my mind, enough steadiness to give sanctuary to my heart. The books on my shelf never asked to come together, and they would not trust or want to listen to one another; but each is a piece of a stained-glass whole without which I couldn’t make sense to myself, or to the world outside.
Each line of the “trick verse” builds upon the previous one, flowing into a kind of rhythmic redundancy embodied in the physical structure of the book as each repeating line is printed only once, but appears on two pages by peeking through exquisitely die-cut holes that play on the stark black-and-white illustrations. Thus, if read page by page the way one would read a traditional book, the poem sounds spellbindingly surreal — but if read through the die-cuts, a beautiful and crisp story comes together.
French comic artist and illustrator Blexbolex may be best-known for his contemplative meditations on people and time, aimed at children yet agelessly delightful and thought-provoking, but he is also a masterful explorer of complex grown-up themes. No Man’s Land (public library), from London indie publisher No Brow, is a poignant satire of the mind’s well-documented gift for fooling itself and seducing us into our own hand-spun illusory realities. Printed in three spot-colors, screenprint-like, on beautiful matte paper — Blexbolex’s signature style — it tells the story of a hero spiraling into an implausible dreamland in hopeless escapism from the processes of mortality.
And still, that insinuating, ever-growing silence.
Hell. I survived hell; you don’t even have the beginning of the slightest idea.
For the past two years, graphic designer Vahram Muratyan, a self-described “lover of Paris wandering through New York,” has been chronicling the peculiarities and contradictions of the two cities through “a friendly visual match” of minimalist illustrated parallel portraits. This year, Muratyan joined the ranks of the finest blogs-turned-books with Paris versus New York: A Tally of Two Cities (public library) — an absolutely charming collection of these vibrant visual dichotomies and likenesses. From beverages to beards, hands to houses, Muratyan captures the intricacies of cultural difference in a way that blends the minimalist and playful visual whimsy of Noma Bar’s Guess Who? with the side-by-side parallelism of Mark Laita’s Created Equal to deliver something entirely new and entirely delightful.
Originally featured, with more images, in January.
THE BIG NEW YORKER BOOK OF DOGS
Dogs have enjoyed a long track record as fiction heroes, photography models, and subjects of scientific curiosity. But they’ve also had an admirable history of inhabiting the spectrum between trope and muse for some of literary history’s greatest talent. The Big New Yorker Book of Dogs (public library) collects such canine-themed gems — fiction, poetry, feature articles, humor, cartoons, cover art, manuscript drafts — from a slew of titans culled from the magazine’s archive, including Brain Pickings regulars E. B. White, Maira Kalman, John Updike, Jonathan Lethem, and Roald Dahl. Divided into four sections — Good Dogs, Bad Dogs, Top Dogs, and Underdogs — and spanning such subjects as evolution, domesticity, love, family, obedience, bereavement, language, and more, the lavishly illustrated 400-page tome is an absolute treat from cover to cover.
Malcolm Gladwell writes in the foreword:
A few words about you. You bought this book: several hundred pages on dogs. You are, in other words, as unhealthily involved in the emotional life of dogs as the rest of us. Have you wondered why you bought it? One possible answer is that you see the subject of man’s affection for dogs as a way of examining all sorts of broader issues. Is it the case of a simple thing revealing a great many complex truths? We do a lot of this at The New Yorker. To be honest: I do a lot of this at The New Yorker — always going on and on about how A is just a metaphor for B, and blah, blah, blah. But let’s be clear. You didn’t really buy this book because of some grand metaphor. Dogs are not about something else. Dogs are about dogs.
From E. B. White comes a playful, heart-warming poem circa 1930:
DOG AROUND THE BLOCK
Dog around the block, sniff,
Hydrant sniffing, corner, grating,
Sniffing, always, starting forward,
Backward, dragging, sniffing backward,
Leash at taut, leash at dangle,
Leash in people’s feet entangle—
Sniffing dog, apprised of smellings,
Loving old acquaintances, sniff,
Sniffing hydrant for reminders,
Leg against the wall, raise,
Leaving grating, corner greeting,
Chance for meeting, sniff, meeting,
Meeting, telling, news of smelling,
Nose to tail, tail to nose,
Rigid, careful, pose,
Liking, partly liking, hating,
Then another hydrant, grating,
Leash at taut, leash at dangle,
Tangle, sniff, untangle,
Dog around the block, sniff.
In a piece bearing the deceptively unassuming title “Dog Story,” Adam Gopnik deploys his formidable dual storytelling torpedo of disarming personal anecdote and uncompromising scientific rigor to explore post-Darwinian views on dog domestication:
[C]ountering [Darwin’s] view comes a new view of dog history, more in keeping with our own ostentatiously less man-centered world view. Dogs, we are now told, by a sequence of scientific speculators … domesticated themselves. They chose us. A marginally calmer canid came close to the circle of human warmth — and, more important, human refuse — and was tolerated by the humans inside: let him eat the garbage. Then this scavenging wolf mated with another calm wolf, and soon a family of calmer wolves proliferated just outside the firelight. It wasn’t cub-snatching on the part of humans, but breaking and entering on the part of wolves, that gave us dogs. ‘Hey, you be ferocious and eat them when you can catch them,’ the protodogs said, in evolutionary effect, to their wolf siblings. ‘We’ll just do what they like and have them feed us. Dignity? It’s a small price to pay for free food. Check with you in ten thousand years and we’ll see who’s had more kids.’ (Estimated planetary dog population: one billion. Estimated planetary wild wolf population: three hundred thousand.)
A few pages later, Gopnik’s gentle arrow to the heart of our relationship with dogs:
Dogs have little imagination about us and our inner lives but limitless intuition about them; we have false intuitions about their inner lives but limitless imagination about them. Our relationship meets in the middle.
In another essay on Thurber, the magazine’s quintessential dog-lover, whose artwork graces the book cover, Gopnik does away with Gladwell’s disclaimer and offers an insightful A-is-a-metaphor-for-B analysis of Thurber’s meta-symbolism:
So why dogs? The answer is simple: for Thurber, the dog chimed with, represented, the American man in his natural state—a state that, as Thurber saw it, was largely scared out of him by the American woman. When Thurber was writing about dogs, he was writing about men. The virtues that seemed inherent in dogs—peacefulness, courage, and stoical indifference to circumstance—were ones that he felt had been lost by their owners. The American man had the permanent jumps, and the American dog did not. The dog was man set free from family obligations, Monastic Man. Dogs ‘would in all probability have averted the Depression, for they can go through lots tougher things than we and still think it’s boom time. They demand very little of their heyday; a kind word is more to them than fame, a soup bone than gold; they are perfectly contented with a warm fire and a good book to chew (preferably an autographed first edition lent by a friend); wine and song they can completely forgo; and they can almost completely forgo women.’ For Thurber, the dog is not man’s best friend so much as man’s sole dodgy ally in his struggle with man’s strangest necessity, woman.
Indeed, it is also Gopnik who, in the same essay, captures in just a few short sentences the entire ethos of the book — and the very heart of man’s relationship with dog:
Integrity, even grouchy growling integrity, in a world that doesn’t value it; nobility in a time that doesn’t want it—what Thurber’s dogs do is absurd or even pernicious (they bite people, or drag junk furniture for miles) but demonstrates the necessary triumph of the superfluous. Which is what dogs are all about; it is the canine way. Nothing is less necessary than a pet dog, or more needed. Thurber’s theme is that a dog’s life is spent, as a man’s life should be, doing pointless things that have the solemnity of inner purpose.
From cosmology to cosmic love, or what your biological clock has to do with diagramming evolution.
It’s that time of year again, the time for those highly subjective, grossly non-exhaustive, yet inevitable and invariably fun best-of reading lists. To kick off the season, here are, in no particular order, my ten favorite science books of 2012. (Catch up on last year’s reading list here.)
“Six hours’ sleep for a man, seven for a woman, and eight for a fool,” Napoleon famously prescribed. (He would have scoffed at Einstein, then, who was known to require ten hours of sleep for optimal performance.) This perceived superiority of those who can get by on less sleep isn’t just something Napoleon shared with dictators like Hitler and Stalin, it’s an enduring attitude woven into our social norms and expectations, from proverbs about early birds to the basic scheduling structure of education and the workplace. But in Internal Time: Chronotypes, Social Jet Lag, and Why You’re So Tired (public library), a fine addition to these 7 essential books on time, German chronobiologist Till Roenneberg demonstrates through a wealth of research that our sleep patterns have little to do with laziness and other such scorned character flaws, and everything to do with biology.
In fact, each of us possesses a different chronotype — an internal timing type best defined by your midpoint of sleep, or midsleep, which you can calculate by dividing your average sleep duration by two and adding the resulting number to your average bedtime on free days, meaning days when your sleep and waking times are not dictated by the demands of your work or school schedule. For instance, if you go to bed at 11 P.M. and wake up at 7 A.M., you sleep eight hours; add half of that — four hours — to 11 P.M. and you get 3 A.M. as your midsleep.
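Roenneberg’s midsleep arithmetic is simple enough to sketch in a few lines of code. This is only an illustration of the calculation he describes — the function name and interface are my own, not his:

```python
from datetime import datetime, timedelta

def midsleep(bedtime: str, waketime: str) -> str:
    """Midpoint of sleep: bedtime plus half the sleep duration.

    Times are 24-hour "HH:MM" strings; the sleep episode is
    assumed to cross midnight when waketime precedes bedtime.
    """
    fmt = "%H:%M"
    start = datetime.strptime(bedtime, fmt)
    end = datetime.strptime(waketime, fmt)
    if end <= start:  # sleep spans midnight
        end += timedelta(days=1)
    mid = start + (end - start) / 2
    return mid.strftime("%H:%M")

# Roenneberg's example: bed at 11 P.M., up at 7 A.M.
print(midsleep("23:00", "07:00"))  # 03:00, i.e. 3 A.M.
```

To characterize your chronotype, Roenneberg specifies that the bedtime and waking time should be averages from free days, when no alarm clock or work schedule intervenes.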
Roenneberg traces the evolutionary roots of different sleep cycles and argues that while earlier chronotypes might have had a social advantage in agrarian and industrial societies, today’s world of time-shift work and constant connectivity has invalidated such advantages but left behind the social stigma around later chronotypes.
This myth that early risers are good people and that late risers are lazy has its reasons and merits in rural societies but becomes questionable in a modern 24/7 society. The old moral is so prevalent, however, that it still dominates our beliefs, even in modern times. The postman doesn’t think for a second that the young man might have worked until the early morning hours because he is a night-shift worker or for other reasons. He labels healthy young people who sleep into the day as lazy — as long sleepers. This attitude is reflected in the frequent use of the word-pair early birds and long sleepers [in the media]. Yet this pair is nothing but apples and oranges, because the opposite of early is late and the opposite of long is short.
Roenneberg goes on to explore sleep duration, a measure of sleep types that complements midsleep, demonstrating just as wide a spectrum of short and long sleepers and debunking the notion that people who get up late sleep longer than others — this judgment, after all, is based on the assumption that everyone goes to bed at the same time, which we increasingly do not.
The disconnect between our internal, biological time and social time — defined by our work schedules and social engagements — leads to what Roenneberg calls social jet lag, a kind of chronic exhaustion resembling the symptoms of jet lag and comparable to having to work for a company a few time zones to the east of your home.
Unlike what happens in real jet lag, people who suffer from social jet lag never leave their home base and can therefore never adjust to a new light-dark environment … While real jet lag is acute and transient, social jet lag is chronic. The amount of social jet lag that an individual is exposed to can be quantified as the difference between midsleep on free days and midsleep on work days … Over 40 percent of the Central European population suffers from social jet lag of two hours or more, and the internal time of over 15 percent is three hours or more out of synch with external time. There is no reason to assume that this would be different in other industrialized nations.
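The quantity Roenneberg describes, the difference between free-day and workday midsleep, translates directly into code. A hypothetical sketch, reusing the midsleep arithmetic described above and treating all times as hours on a 24-hour clock (the function names are mine):

```python
def midsleep(bedtime_h, wake_h):
    """Midpoint of sleep on a 24h clock."""
    duration = (wake_h - bedtime_h) % 24
    return (bedtime_h + duration / 2) % 24

def social_jet_lag(free_bed, free_wake, work_bed, work_wake):
    """Difference between midsleep on free days and on work days, in hours."""
    diff = abs(midsleep(free_bed, free_wake) - midsleep(work_bed, work_wake))
    return min(diff, 24 - diff)  # shortest distance around the clock face

# A late chronotype: free days 2-10 A.M., work days midnight to 6:30 A.M.
print(social_jet_lag(2, 10, 0, 6.5))  # → 2.75 hours of social jet lag
```

By this measure, the two-hours-or-more social jet lag Roenneberg reports for over 40 percent of Central Europeans is easy to reach with quite ordinary weekday schedules.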
Chronotypes vary with age:
Young children are relatively early chronotypes (to the distress of many young parents), and then gradually become later. During puberty and adolescence humans become true night owls, and then around twenty years of age reach a turning point and become earlier again for the rest of their lives. On average, women reach this turning point at nineteen and a half while men start to become earlier again at twenty-one … [T]his clear turning point in the developmental changes of chronotype … [is] the first biological marker for the end of adolescence.
Roenneberg points out that in our culture, there is a great disconnect between teenagers’ biological abilities and our social expectations of them, encapsulated in what is known as the disco hypothesis — the notion that if only teens would go to bed earlier, meaning not party until late, they’d be better able to wake up clear-headed and ready for school at the expected time. The data, however, indicate otherwise — adolescents’ internal time is shifted so they don’t find sleep before the small hours of the night, a pattern also found in the life cycle of rodents.
Here, we brush up against a painfully obtrusive cultural obstacle: School starts early — as early as 7 A.M. in some European countries — and teens are expected to perform well on a schedule not designed with their internal time in mind. As a result, studies have shown that many students show the signs of narcolepsy — a severe sleeping disorder that makes one fall asleep at once when given the chance, immediately entering REM sleep. The implications are worrisome:
Teenagers need around eight to ten hours of sleep but get much less during their workweek. A recent study found that when the starting time of high school is delayed by an hour, the percentage of students who get at least eight hours of sleep per night jumps from 35.7 percent to 50 percent. Adolescent students’ attendance rate, their performance, their motivation, even their eating habits all improve significantly if school times are delayed.
Similar detrimental effects of social jet lag are found in shift work, which Roenneberg calls “one of the most blatant assaults on the body clock in modern society.” (And while we may be tempted to equate shift work with the service industry, any journalist, designer, developer, or artist who works well into the night on deadline can relate — hey, it’s well past midnight again as I’m writing this.) In fact, the World Health Organization recently classified “shift work that involves circadian disruption” as a potential cause of cancer, and the consequences of social jet lag and near-narcolepsy extend beyond the usual suspects of car accidents and medical errors:
We are only beginning to understand the potentially detrimental consequences of social jet lag. One of these has already been worked out with frightening certainty: the more severe the social jet lag that people suffer, the more likely it is that they are smokers. This is not a question of quantity (number of cigarettes per day) but simply whether they are smokers or not … Statistically, we experience the worst social jet lag as teenagers, when our body clocks are drastically delayed for biological reasons, but we still have to get up at the same traditional times for school. This coincides with the age when most individuals start smoking. Assuredly there are many different reasons people start smoking at that age, but social jet lag certainly contributes to the risk.
If young people’s psychological and emotional well-being isn’t incentive enough for policy makers — who, by the way, Roenneberg’s research indicates tend to be early chronotypes themselves — to consider later school times, one would think their health should be.
The correlation between social jet lag and smoking continues later in life as well, particularly when it comes to quitting:
[T]he less stress smokers have, the easier it is for them to quit. Social jet lag is stress, so the chances of successfully quitting smoking are higher when the mismatch of internal and external time is smaller. The numbers connecting smoking with social jet lag are striking: Among those who suffer less than an hour of social jet lag per day, we find 15 to 20 percent are smokers. This percentage systematically rises to over 60 percent when internal and external time are more than five hours out of sync.
Another factor contributing to our social jet lag is Daylight Saving Time. Even though DST’s proponents argue that it’s just one small hour, the data suggest that between October and March, DST throws off our body clocks by up to four weeks, depending on our latitude, not allowing our bodies to properly adjust to the time change, especially if we happen to be later chronotypes. The result is increased social jet lag and decreased sleep duration.
But what actually regulates our internal time? Though the temporal structures of sun time — tide, day, month, and year — play a significant role in the lives of all organisms, our biological clocks evolved in a “time-free” world and are somewhat independent of such external stimuli as light and dark. For instance, early botanical studies showed that a mimosa plant kept in a pitch-dark closet would still open and close its leaves the way it does in the day-night cycle, and subsequent studies of human subjects confined to dark bunkers showed similar preservation of their sleep and waking patterns, which followed, albeit imperfectly, the 24-hour cycle of day and night.
Our internal clocks, in fact, can be traced down to the genetic level, with individual “clock genes” and, most prominently, the suprachiasmatic nucleus, or SCN — a small region in the brain’s midline that acts as a kind of “master clock” for mammals, regulating neuronal and hormonal activity around our circadian rhythms. Roenneberg explains how our internal clocks work on the DNA level:
In the nucleus, the DNA sequence of a clock gene is transcribed to mRNA; the resulting message is exported from the nucleus, translated into a clock protein, and is then modified. This clock protein is itself part of the molecular machinery that controls the transcription of its ‘own’ gene. When enough clock proteins have been made, they are imported back into the nucleus, where they start to inhibit the transcription of their own mRNA. Once this inhibition is strong enough, no more mRNA molecules are transcribed, and the existing ones are gradually destroyed. As a consequence, no more proteins can be produced and the existing ones will also gradually be destroyed. When they are all gone, the transcriptional machinery is not suppressed anymore, and a new cycle can begin.
Despite this complexity, the important take-home message is that daily rhythms are generated by molecular mechanisms that could potentially work in a single cell, for example a single neuron of the SCN.
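The feedback loop Roenneberg describes can be caricatured in a few lines of simulation. This is a toy model, not the book’s: a delayed negative feedback in which the clock protein represses its own transcription, with the delay standing in for export, translation, and re-import, and with all parameters chosen arbitrarily for illustration:

```python
# Toy simulation of a transcription-translation negative-feedback loop.
# mRNA is produced, translated into protein, and the protein (after a
# delay) represses its own gene, so levels rise and fall cyclically.
DT, STEPS, DELAY = 0.1, 2000, 60          # delay = 60 steps = 6 time units

m, p = 0.0, 0.0                           # mRNA and protein levels
m_hist, p_hist = [m], [p]

for t in range(STEPS):
    p_past = p_hist[max(t - DELAY, 0)]    # protein level one delay ago
    production = 1.0 / (1.0 + p_past**4)  # repression by the clock protein
    m += DT * (production - 0.2 * m)      # transcription minus mRNA decay
    p += DT * (0.5 * m - 0.2 * p)         # translation minus protein decay
    m_hist.append(m)
    p_hist.append(p)

print(f"mRNA range: {min(m_hist):.2f} to {max(m_hist):.2f}")
```

The point of the sketch is the architecture rather than the numbers: a single self-repressing gene, plus delays and decay, is enough machinery to generate a rhythm in one cell.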
Internal Time goes on to illuminate many other aspects of how chronotypes and social jet lag impact our daily lives, from birth and suicide rates to when we borrow books from the library to why older men marry younger women, and even why innovators and entrepreneurs tend to have later chronotypes.
With this book, we wanted to bring back a sense of the unknown that has been lost in the age of information. … Remember that before you do a quick online search for the purpose of the horned owl’s horns, you should give yourself some time to wonder.
The motion graphics book trailer is an absolute masterpiece in itself:
Was there an era before our own, out of which our current universe was born? Do the laws of physics, the dimensions of space-time, the strengths and types and asymmetries of nature’s forces and particles, and the potential for life have to be as we observe them, or is there a branching multi-verse of earlier and later epochs filled with unimaginably exotic realms? We do not know.
Exploring how gravity works, Terry Matilsky notes:
[T]he story is not finished. We know that general relativity is not the final answer, because we have not been able to synthesize gravity with the other known laws of physics in a comprehensive “theory of everything.”
Zooming in on the microcosm of our own bodies and their curious behaviors, Jill Conte considers why we blush:
The ruddy or darkened hue of a blush occurs when muscles in the walls of blood vessels within the skin relax and allow more blood to flow. Interestingly, the skin of the blush region contains more blood vessels than do other parts of the body. These vessels are also larger and closer to the surface, which indicates a possible relationship among physiology, emotion, and social communication. While it is known that blood flow to the skin, which serves to feed cells and regulate surface body temperature, is controlled by the sympathetic nervous system, the exact mechanism by which this process is activated specifically to produce a blush remains unknown.
When legendary theoretical physicist Stephen Hawking was setting out to release A Brief History of Time, one of the most influential science books in modern history, his publishers admonished him that every equation included would halve the book’s sales. Undeterred, he dared include E = mc², even though cutting it out would have allegedly sold another 10 million copies. The anecdote captures the extent of our culture’s distaste for, if not fear of, equations. And yet, argues mathematician Ian Stewart in In Pursuit of the Unknown: 17 Equations That Changed the World, equations have held remarkable power in facilitating humanity’s progress and, as such, call for rudimentary understanding as a form of our most basic literacy.
The power of equations lies in the philosophically difficult correspondence between mathematics, a collective creation of human minds, and an external physical reality. Equations model deep patterns in the outside world. By learning to value equations, and to read the stories they tell, we can uncover vital features of the world around us… This is the story of the ascent of humanity, told in 17 equations.
From how the Pythagorean theorem, which linked geometry and algebra, laid the groundwork of the best current theories of space, time, and gravity to how the Navier-Stokes equation applies to modeling climate change, Stewart delivers a scientist’s gift in a storyteller’s package to reveal how these seemingly esoteric equations are really the foundation for nearly everything we know and use today.
Some of the most revolutionary of the breakthroughs Stewart outlines came from thinkers actively interested in both the sciences and the humanities. Take René Descartes, for instance, who is best remembered for his timeless soundbite, Cogito ergo sum — I think, therefore I am. But Descartes’ interests, Stewart points out, extended beyond philosophy and into science and mathematics. In 1639, he observed a curious numerical pattern in regular solids — what was true of a cube was also true of a dodecahedron or an icosahedron: for all of them, subtracting the number of edges from the number of faces and then adding the number of vertices equaled 2. (Try it: A cube has 6 faces, 12 edges, and 8 vertices, so 6 – 12 + 8 = 2.) But Descartes, perhaps enchanted by philosophy’s grander questions, saw the equation as a minor curiosity and never published it. Only centuries later did mathematicians recognize it as monumentally important. It eventually resulted in Euler’s formula, which helps explain everything from how enzymes act on cellular DNA to why the motion of the celestial bodies can be chaotic.
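Descartes’ pattern is easy to verify yourself; the face, edge, and vertex counts below are the standard ones for these solids:

```python
# (faces, edges, vertices) for three of the regular solids Descartes examined
solids = {
    "cube":         (6, 12, 8),
    "dodecahedron": (12, 30, 20),
    "icosahedron":  (20, 30, 12),
}

for name, (f, e, v) in solids.items():
    print(name, f - e + v)  # faces - edges + vertices → 2 for each
```

The invariant F − E + V = 2 is exactly the quantity Euler’s formula later made rigorous.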
So how did equations begin, anyway? Stewart explains:
An equation derives its power from a simple source. It tells us that two calculations, which appear different, have the same answer. The key symbol is the equals sign, =. The origins of most mathematical symbols are either lost in the mists of antiquity, or are so recent that there is no doubt where they came from. The equals sign is unusual because it dates back more than 450 years, yet we not only know who invented it, we even know why. The inventor was Robert Recorde, in 1557, in The Whetstone of Witte. He used two parallel lines (he used an obsolete word gemowe, meaning ‘twin’) to avoid tedious repetition of the words ‘is equal to’. He chose that symbol because ‘no two things can be more equal’. Recorde chose well. His symbol has remained in use for 450 years.
The original coinage appeared as follows:
To avoide the tediouse repetition of these woordes: is equalle to: I will sette as I doe often in woorke use, a paire of paralleles, or gemowe lines of one lengthe: =, bicause noe .2. thynges, can be moare equalle.
Far from being a mere math primer or trivia aid, In Pursuit of the Unknown is an essential piece of modern literacy, wrapped in an articulate argument for why this kind of knowledge should be precisely that.
Stewart concludes by turning his gaze towards the future, offering a kind of counter-vision to algo-utopians like Stephen Wolfram and making, instead, a case for the reliable humanity of the equation:
It is still entirely credible that we might soon find new laws of nature based on discrete, digital structures and systems. The future may consist of algorithms, not equations. But until that day dawns, if ever, our greatest insights into nature’s laws take the form of equations, and we should learn to understand them and appreciate them. Equations have a track record. They really have changed the world — and they will change it again.
“Science is always wrong,” George Bernard Shaw famously proclaimed in a toast to Albert Einstein. “It never solves a problem without creating 10 more.”
In the fifth century BC, long before science as we know it existed, Socrates famously observed, “I know one thing, that I know nothing.” Some 21 centuries later, by the time he published his Principia in 1687, Sir Isaac Newton likely knew all there was to know in science — a time when it was possible for a single human brain to hold all of mankind’s scientific knowledge. Fast-forward 40 generations to today, and the average high school student has more scientific knowledge than Newton did at the end of his life. But somewhere along that superhighway of progress, we seem to have developed a kind of fact-fetishism that shackles us to the allure of the known and makes us indifferent to the unknown knowable. Yet it’s the latter — the unanswered questions — that makes science, and life, interesting. That’s the eloquently argued case at the heart of Ignorance: How It Drives Science, in which Stuart Firestein sets out to debunk the popular idea that knowledge follows ignorance, demonstrating instead that it’s the other way around and, in the process, laying out a powerful manifesto for getting the public engaged with science — a public to whom, as Neil deGrasse Tyson recently reminded the Senate, the government is accountable in making the very decisions that shape the course of science.
The tools and currencies of our information economy, Firestein points out, are doing little in the way of fostering the kind of question-literacy essential to cultivating curiosity:
Are we too enthralled with the answers these days? Are we afraid of questions, especially those that linger too long? We seem to have come to a phase in civilization marked by a voracious appetite for knowledge, in which the growth of information is exponential and, perhaps more important, its availability easier and faster than ever.
The cult of expertise — whose currency is static answers — obscures the very capacity for cultivating a thirst for ignorance:
There are a lot of facts to be known in order to be a professional anything — lawyer, doctor, engineer, accountant, teacher. But with science there is one important difference. The facts serve mainly to access the ignorance… Scientists don’t concentrate on what they know, which is considerable but minuscule, but rather on what they don’t know…. Science traffics in ignorance, cultivates it, and is driven by it. Mucking about in the unknown is an adventure; doing it for a living is something most scientists consider a privilege.
Working scientists don’t get bogged down in the factual swamp because they don’t care all that much for facts. It’s not that they discount or ignore them, but rather that they don’t see them as an end in themselves. They don’t stop at the facts; they begin there, right beyond the facts, where the facts run out. Facts are selected, by a process that is a kind of controlled neglect, for the questions they create, for the ignorance they point to.
Firestein, who chairs the Department of Biological Sciences at Columbia University, stresses that beyond simply accumulating facts, scientists use them as raw material, not finished product. He cautions:
Mistaking the raw material for the product is a subtle error but one that can have surprisingly far-reaching consequences. Understanding this error and its ramifications, and setting it straight, is crucial to understanding science.
What emerges is an elegant definition of science:
Real science is a revision in progress, always. It proceeds in fits and starts of ignorance.
In highlighting this commonality science holds with other domains of creative and intellectual labor, Firestein turns to the poet John Keats, who described the ideal state of the literary psyche as Negative Capability — “that is when a man is capable of being in uncertainties, Mysteries, doubts without any irritable reaching after fact & reason.” Firestein translates this to science:
Being a scientist requires having faith in uncertainty, finding pleasure in mystery, and learning to cultivate doubt. There is no surer way to screw up an experiment than to be certain of its outcome.
He captures the heart of this argument in an eloquent metaphor:
Science, then, is not like the onion in the often used analogy of stripping away layer after layer to get at some core, central, fundamental truth. Rather it’s like the magic well: no matter how many buckets of water you remove, there’s always another one to be had. Or even better, it’s like the widening ripples on the surface of a pond, the ever larger circumference in touch with more and more of what’s outside the circle, the unknown. This growing forefront is where science occurs… It is a mistake to bob around in the circle of facts instead of riding the wave to the great expanse lying outside the circle.
However, more important than the limits of our knowledge, Firestein is careful to point out, are the limits to our ignorance. (Cue in Errol Morris’s fantastic 2010 five-part New York Times series, The Anosognosic’s Dilemma.) Science historian and Stanford professor Robert Proctor has even coined a term for the study of ignorance — agnotology — and, Firestein argues, it is a conduit to better understanding progress.
Science historian and philosopher Nicholas Rescher has offered a different term for a similar concept: Copernican cognitivism, suggesting that just as Copernicus showed us there was nothing privileged about our position in space by debunking the geocentric model of the universe, there is also nothing privileged about our cognitive landscape.
But the most memorable articulation of the limits of our own ignorance comes from the Victorian novella Flatland, where a three-dimensional sphere shows up in a two-dimensional land and inadvertently wreaks havoc on its geometric inhabitants’ most basic beliefs about the world as they struggle to imagine the very possibility of a third dimension.
An engagement with the interplay of ignorance and knowledge, the essential bargaining chips of science, is what elevated modern civilization from the intellectual flatness of the Middle Ages. Firestein points out that “the public’s direct experience of the empirical methods of science” helped humanity evolve from the magical and mystical thinking of Western medieval thought to the rational discourse of contemporary culture.
At the same time, Firestein laments, science today is often “as inaccessible to the public as if it were written in classical Latin.” Making it more accessible, he argues, necessitates introducing explanations of science that focus on the unknown as an entry point — a more inclusive gateway than the known.
In one of the most compelling passages of the book, he broadens this insistence on questions over answers to the scientific establishment itself:
Perhaps the most important application of ignorance is in the sphere of education, particularly of scientists… We must ask ourselves how we should educate scientists in the age of Google and whatever will supersede it… The business model of our Universities, in place now for nearly a thousand years, will need to be revised.
Instead of a system where the collection of facts is an end, where knowledge is equated with accumulation, where ignorance is rarely discussed, we will have to provide the Wiki-raised student with a taste of and for boundaries, the edge of the widening circle of ignorance, how the data, which are not unimportant, frames the unknown. We must teach students how to think in questions, how to manage ignorance. W. B. Yeats admonished that ‘education is not the filling of a pail, but the lighting of a fire.’
Firestein sums it up beautifully:
Science produces ignorance, and ignorance fuels science. We have a quality scale for ignorance. We judge the value of science by the ignorance it defines. Ignorance can be big or small, tractable or challenging. Ignorance can be thought about in detail. Success in science, either doing it or understanding it, depends on developing comfort with the ignorance, something akin to Keats’ negative capability.
The Ancient Greeks believed that one fell asleep when the brain filled with blood and awakened once it drained back out. Nineteenth-century philosophers contended that sleep happened when the brain was emptied of ambitions and stimulating thoughts. “If sleep doesn’t serve an absolutely vital function, it is the greatest mistake evolution ever made,” biologist Allan Rechtschaffen once remarked. Even today, sleep remains one of the most poorly understood human biological functions, despite some recent strides in understanding the “social jet lag” of our internal clocks and the relationship between dreaming and depression. In Dreamland: Adventures in the Strange Science of Sleep (public library), journalist David K. Randall — who stumbled upon the idea after crashing violently into a wall while sleepwalking — explores “the largest overlooked part of your life and how it affects you even if you don’t have a sleep problem.” From gender differences to why some people snore and others don’t to why we dream, he dives deep into this mysterious third of human existence to illuminate what happens when night falls and how it impacts every aspect of our days.
Most of us will spend a full third of our lives asleep, and yet we don’t have the faintest idea of what it does for our bodies and our brains. Research labs offer surprisingly few answers. Sleep is one of the dirty little secrets of science. My neurologist wasn’t kidding when he said there was a lot that we don’t know about sleep, starting with the most obvious question of all — why we, and every other animal, need to sleep in the first place.
But before we get too anthropocentrically arrogant in our assumptions, it turns out the quantitative requirement of sleep isn’t correlated with how high up the evolutionary chain an organism is:
Lions and gerbils sleep about thirteen hours a day. Tigers and squirrels nod off for about fifteen hours. At the other end of the spectrum, elephants typically sleep three and a half hours at a time, which seems lavish compared to the hour and a half of shut-eye that the average giraffe gets each night.
Humans need roughly one hour of sleep for every two hours they are awake, and the body innately knows when this ratio becomes out of whack. Each hour of missed sleep one night will result in deeper sleep the next, until the body’s sleep debt is wiped clean.
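Randall’s one-to-two ratio implies a simple running ledger of sleep debt. A hypothetical sketch (the bookkeeping function is my illustration, not Randall’s):

```python
def sleep_debt(hours_awake, hours_slept):
    """Hours of sleep owed under the rough 1:2 rule: one hour of sleep per two awake."""
    return max(hours_awake / 2 - hours_slept, 0)

# Awake 18 hours but slept only 6: three hours of debt to be repaid
# through deeper sleep on subsequent nights.
print(sleep_debt(18, 6))  # → 3.0
```

The body, as Randall describes it, keeps this ledger innately, deepening sleep until the balance is wiped clean.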
What, then, happens as we doze off, exactly? Like all science, our understanding of sleep seems to be a constant “revision in progress”:
Despite taking up so much of life, sleep is one of the youngest fields of science. Until the middle of the twentieth century, scientists thought that sleep was an unchanging condition during which time the brain was quiet. The discovery of rapid eye movements in the 1950s upended that. Researchers then realized that sleep is made up of five distinct stages that the body cycles through over roughly ninety-minute periods. The first is so light that if you wake up from it, you might not realize that you have been sleeping. The second is marked by the appearance of sleep-specific brain waves that last only a few seconds at a time. If you reach this point in the cycle, you will know you have been sleeping when you wake up. This stage marks the last drop before your brain takes a long ride away from consciousness. Stages three and four are considered deep sleep. In three, the brain sends out long, rhythmic bursts called delta waves. Stage four is known as slow-wave sleep for the speed of its accompanying brain waves. The deepest form of sleep, this is the farthest that your brain travels from conscious thought. If you are woken up while in stage four, you will be disoriented, unable to answer basic questions, and want nothing more than to go back to sleep, a condition that researchers call sleep drunkenness. The final stage is REM sleep, so named because of the rapid movements of your eyes dancing against your eyelids. In this stage of sleep, the brain is as active as it is when it is awake. This is when most dreams occur.
Randall’s most urgent point, however, echoes what we’ve already heard from German chronobiologist Till Roenneberg (see above) — in our blind lust for the “luxuries” of modern life, with all its 24-hour news cycles, artificial lighting on demand, and expectations of round-the-clock telecommunications availability, we’ve thrown ourselves into a kind of circadian schizophrenia:
We are living in an age when sleep is more comfortable than ever and yet more elusive. Even the worst dorm-room mattress in America is luxurious compared to sleeping arrangements that were common not long ago. During the Victorian era, for instance, laborers living in workhouses slept sitting on benches, with their arms dangling over a taut rope in front of them. They paid for this privilege, implying that it was better than the alternatives. Families up to the time of the Industrial Revolution engaged in the nightly ritual of checking for rats and mites burrowing in the one shared bedroom. Modernity brought about a drastic improvement in living standards, but with it came electric lights, television, and other kinds of entertainment that have thrown our sleep patterns into chaos.
Work has morphed into a twenty-four-hour fact of life, bringing its own set of standards and expectations when it comes to sleep … Sleep is ingrained in our cultural ethos as something that can be put off, dosed with coffee, or ignored. And yet maintaining a healthy sleep schedule is now thought of as one of the best forms of preventative medicine.
Reflecting on his findings, Randall marvels:
As I spent more time investigating the science of sleep, I began to understand that these strange hours of the night underpin nearly every moment of our lives.
Indeed, Dreamland goes on to explore how sleep — its mechanisms, its absence, its cultural norms — affects everyone from police officers and truck drivers to artists and entrepreneurs, permeating everything from our decision-making to our emotional intelligence.
Since the dawn of recorded history, humanity has been turning to the visual realm as a sensemaking tool for the world and our place in it, mapping and visualizing everything from the body to the brain to the universe to information itself. Trees of Life: A Visual History of Evolution (public library) catalogs 230 tree-like branching diagrams, culled from 450 years of mankind’s visual curiosity about the living world and our quest to understand the complex ecosystem we share with other organisms, from bacteria to birds, microbes to mammals.
Though the use of a tree as a metaphor for understanding the relationships between organisms is often attributed to Darwin, who articulated it in his On the Origin of Species by Means of Natural Selection in 1859, the concept, most recently appropriated in mapping systems and knowledge networks, is actually much older, predating the theory of evolution itself. The collection is thus at once a visual record of the evolution of science and of its opposite — the earliest examples, dating as far back as the sixteenth century, portray the mythic order in which God created Earth, and the diagrams’ development over the centuries is as much a progression of science as it is of culture, society, and paradigm.
Theodore W. Pietsch writes in the introduction:
The tree as an iconographic metaphor is perhaps the most universally widespread of all great cultural symbols. Trees appear and reappear throughout human history to illustrate nearly every aspect of life. The structural complexity of a tree — its roots, trunk, bifurcating branches, and leaves — has served as an ideal symbol throughout the ages to visualize and map hierarchies of knowledge and ideas.
Neil deGrasse Tyson might be one of today’s most prominent astrophysicists, but he’s also a kind of existential philosopher, bringing his insights from science into the broader realm of the human condition — a kind of modern-day Carl Sagan with a rare gift for blending science and storytelling to both rub neurons with his fellow scientists and engage a popular-interest audience. In Space Chronicles: Facing the Ultimate Frontier, Tyson explores the future of space travel in the wake of NASA’s decision to put human space flight essentially on hold, using his signature wit and scientific prowess to lay out an urgent manifesto for the economic, social, moral, and cultural importance of space exploration. This excerpt from the introduction captures Tyson’s underlying ethos and echoes other great thinkers’ ideas about intuition and rationality, blending the psychosocial with the political:
Some of the most creative leaps ever taken by the human mind are decidedly irrational, even primal. Emotive forces are what drive the greatest artistic and inventive expressions of our species. How else could the sentence ‘He’s either a madman or a genius’ be understood?
It’s okay to be entirely rational, provided everybody else is too. But apparently this state of existence has been achieved only in fiction [where] societal decisions get made with efficiency and dispatch, devoid of pomp, passion, and pretense.
To govern a society shared by people of emotion, people of reason, and everybody in between — as well as people who think their actions are shaped by logic but in fact are shaped by feelings and nonempirical philosophies — you need politics. At its best, politics navigates all the mind-states for the sake of the greater good, alert to the rocky shoals of community, identity, and the economy. At its worst, politics thrives on the incomplete disclosure or misrepresentation of data required by an electorate to make informed decisions, whether arrived at logically or emotionally.
Nowhere does Tyson’s gift shine more brilliantly than in this goosebump-inducing mashup by Max Schlickenmeyer, remixing images of nature at its most inspiring with the narration of Tyson’s answer to a TIME magazine reader, who asked, “What is the most astounding fact you can share with us about the Universe?”
When I look up at the night sky and I know that, yes, we are part of this Universe, we are in this Universe, but perhaps more important than both of those facts is that the Universe is in us. When I reflect on that fact, I look up — many people feel small, because they’re small, the Universe is big — but I feel big, because my atoms came from those stars. There’s a level of connectivity — that’s really what you want in life. You want to feel connected, you want to feel relevant. You want to feel like you’re a participant in the goings on and activities and events around you. That’s precisely what we are, just by being alive.
For the past 175 years, the National Library of Medicine in Bethesda has been building the world’s largest collection of biomedical images, artifacts, and ephemera. With more than 17 million items spanning ten centuries, it’s a treasure trove of rare, obscure, extravagant wonders, most of which remain unseen by the public and unknown even to historians, librarians, and curators. Until now.
Hidden Treasure is an exquisite large-format volume that culls some of the most fascinating, surprising, beautiful, gruesome, and idiosyncratic objects from the Library’s collection in 450 full-color illustrations. From rare “magic lantern slides” doctors used to entertain and cure inmates at the St. Elizabeth’s Hospital for the Insane to astonishing anatomical atlases to the mimeographed report of the Japanese medical team first to enter Hiroshima after the atomic blast, each of the curious ephemera is contextualized in a brief essay by a prominent scholar, journalist, artist, collector, or physician. What results is a remarkable journey not only into the evolution of mankind’s understanding of the physicality of being human, but also into the evolution of librarianship itself, amidst the age of the digital humanities.
Michael North, Jeffrey Reznick, and Michael Sappol remind us in the introduction:
It’s no secret that nowadays we look for libraries on the Internet — without moving from our desks or laptops or mobile phones… We’re in a new and miraculous age. But there are still great libraries, in cities and on campuses, made of brick, sandstone, marble, and glass, containing physical objects, and especially enshrining the book: the Library of Congress, the Bibliothèque nationale de France, the British Library, the New York Public Library, the Wellcome Library, the great university libraries at Oxford, Harvard, Yale, Johns Hopkins, and elsewhere. And among them is the National Library of Medicine in Bethesda, the world’s largest medical library, with its collection of over 17 million books, journals, manuscripts, prints, photographs, posters, motion pictures, sound recordings, and “ephemera” (pamphlets, matchbook covers, stereograph cards, etc.).
Thoughtfully curated, beautifully produced, and utterly transfixing, Hidden Treasure unravels our civilization’s relationship with that most human of humannesses. Because try as we might to order the heavens, map the mind, and chart time in our quest to know the abstract, we will have failed at being human if we neglect this most fascinating frontier of concrete existence, the mysterious and ever-alluring physical body.
“The universe is made of stories, not atoms,” poet Muriel Rukeyser famously remarked. “We’re made of star-stuff,” Carl Sagan countered. But some of the most fascinating and important stories are those that explain atoms and “star stuff.” Such is the case of The Quantum Universe: Everything That Can Happen Does Happen by rockstar-physicist Brian Cox and University of Manchester professor Jeff Forshaw — a remarkable and absorbing journey into the fundamental fabric of nature, exploring how quantum theory provides a framework for explaining everything from silicon chips to stars to human behavior.
Quantum theory is perhaps the prime example of the infinitely esoteric becoming the profoundly useful. Esoteric, because it describes a world in which a particle really can be in several places at once and moves from one place to another by exploring the entire Universe simultaneously. Useful, because understanding the behaviour of the smallest building blocks of the universe underpins our understanding of everything else. This claim borders on the hubristic, because the world is filled with diverse and complex phenomena. Notwithstanding this complexity, we have discovered that everything is constructed out of a handful of tiny particles that move around according to the rules of quantum theory. The rules are so simple that they can be summarized on the back of an envelope. And the fact that we do not need a whole library of books to explain the essential nature of things is one of the greatest mysteries of all.
The story weaves a century of scientific hindsight and theoretical developments, from Einstein to Feynman by way of Max Planck, who coined the term “quantum” in 1900 to describe the “black body radiation” of hot objects through light emitted in little packets of energy he called “quanta,” to arrive at a modern perspective on quantum theory and its primary role in predicting observable phenomena.
The picture of the universe we inhabit, as revealed by modern physics, [is] one of underlying simplicity; elegant phenomena dance away out of sight and the diversity of the macroscopic world emerges. This is perhaps the crowning achievement of modern science; the reduction of the tremendous complexity in the world, human beings included, to a description of the behaviour of just a handful of tiny subatomic particles and the four forces that act between them.
To demonstrate that quantum theory is intimately entwined with the fabric of our everyday, rather than a weird and esoteric fringe of science, Cox offers an example rooted in the familiar. (An example, in this particular case, based on the wrong assumption — I was holding an iPad — in a kind of ironic meta-wink from Heisenberg’s uncertainty principle.)
Consider the world around you. You are holding a book made of paper, the crushed pulp of a tree. Trees are machines able to take a supply of atoms and molecules, break them down and rearrange them into cooperating colonies composed of many trillions of individual parts. They do this using a molecule known as chlorophyll, composed of over a hundred carbon, hydrogen and oxygen atoms twisted into an intricate shape with a few magnesium and nitrogen atoms bolted on. This assembly of particles is able to capture the light that has travelled the 93 million miles from our star, a nuclear furnace the volume of a million earths, and transfer that energy into the heart of cells, where it is used to build molecules from carbon dioxide and water, giving out life-enriching oxygen as it does so. It’s these molecular chains that form the superstructure of trees and all living things, the paper in your book. You can read the book and understand the words because you have eyes that can convert the scattered light from the pages into electrical impulses that are interpreted by your brain, the most complex structure we know of in the Universe. We have discovered that all these things are nothing more than assemblies of atoms, and that the wide variety of atoms are constructed using only three particles: electrons, protons and neutrons. We have also discovered that the protons and neutrons are themselves made up of smaller entities called quarks, and that is where things stop, as far as we can tell today. Underpinning all of this is quantum theory.
But at the core of The Quantum Universe are a handful of grand truths that transcend the realm of science as an academic discipline and shine out into the vastest expanses of human existence: that in science, as in art, everything builds on what came before; that everything is connected to everything else; and, perhaps most importantly, that despite our greatest compulsions for control and certainty, much of the universe — to which the human heart and mind belong — remains reigned over by chance and uncertainty. Cox puts it this way:
A key feature of quantum theory [is that] it deals with probabilities rather than certainties, not because we lack absolute knowledge, but because some aspects of Nature are, at their very heart, governed by the laws of chance.
“If you wish to make an apple pie from scratch,” Carl Sagan famously observed in Cosmos, “you must first invent the universe.” The questions children ask are often so simple, so basic, that they turn unwittingly yet profoundly philosophical in requiring apple-pie-from-scratch type of answers. To explore this fertile intersection of simplicity and expansiveness, Gemma Elwin Harris asked thousands of primary school children between the ages of four and twelve to send in their most restless questions, then invited some of today’s most prominent scientists, philosophers, and writers to answer them. The result is Big Questions from Little People & Simple Answers from Great Minds (public library) — a compendium of fascinating explanations of deceptively simple everyday phenomena, featuring such modern-day icons as Mary Roach, Noam Chomsky, Philip Pullman, Richard Dawkins, and many more, with a good chunk of the proceeds being donated to Save the Children.
Most of the time, you feel in charge of your own mind. You want to play with some Lego? Your brain is there to make it happen. You fancy reading a book? You can put the letters together and watch characters emerge in your imagination.
But at night, strange stuff happens. While you’re in bed, your mind puts on the weirdest, most amazing and sometimes scariest shows.
In the olden days, people believed that our dreams were full of clues about the future. Nowadays, we tend to think that dreams are a way for the mind to rearrange and tidy itself up after the activities of the day.
Why are dreams sometimes scary? During the day, things may happen that frighten us, but we are so busy we don’t have time to think properly about them. At night, while we are sleeping safely, we can give those fears a run around. Or maybe something you did during the day was lovely but you were in a hurry and didn’t give it time. It may pop up in a dream. In dreams, you go back over things you missed, repair what got damaged, make up stories about what you’d love, and explore the fears you normally put to the back of your mind.
Dreams are both more exciting and more frightening than daily life. They’re a sign that our brains are marvellous machines — and that they have powers we don’t often give them credit for, when we’re just using them to do our homework or play a computer game. Dreams show us that we’re not quite the bosses of our own selves.
Evolutionary biologist Richard Dawkins breaks down the math of evolution and cousin marriages to demonstrate that we are all related:
Yes, we are all related. You are a (probably distant) cousin of the Queen, and of the president of the United States, and of me. You and I are cousins of each other. You can prove it to yourself.
Everybody has two parents. That means, since each parent had two parents of their own, that we all have four grandparents. Then, since each grandparent had to have two parents, everyone has eight great-grandparents, and sixteen great-great-grandparents and thirty-two great-great-great-grandparents and so on.
You can go back any number of generations and work out the number of ancestors you must have had that same number of generations ago. All you have to do is multiply two by itself that number of times.
Suppose we go back ten centuries, that is to Anglo-Saxon times in England, just before the Norman Conquest, and work out how many ancestors you must have had alive at that time.
If we allow four generations per century, that’s about forty generations ago.
Two multiplied by itself forty times comes to more than a thousand billion. Yet the total population of the world at that time was only around three hundred million. Even today the population is seven billion, yet we have just worked out that a thousand years ago your ancestors alone were more than 150 times as numerous.
The real population of the world at the time of Julius Caesar was only a few million, and all of us, all seven billion of us, are descended from them. We are indeed all related. Every marriage is between more or less distant cousins, who already share lots and lots of ancestors before they have children of their own.
By the same kind of argument, we are distant cousins not only of all human beings but of all animals and plants. You are a cousin of my dog and of the lettuce you had for lunch, and of the next bird that you see fly past the window. You and I share ancestors with all of them. But that is another story.
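Dawkins’s doubling argument is easy to check for yourself. A minimal sketch in Python, using the round figures from the passage (four generations per century, a present-day population of seven billion):

```python
# Each generation back, the number of ancestor "slots" doubles.
generations = 40                      # ~ten centuries at four generations per century
ancestors = 2 ** generations          # ancestor slots forty generations ago

print(f"{ancestors:,}")               # 1,099,511,627,776 -- over a thousand billion

# Compare with today's world population of about seven billion:
world_population_today = 7_000_000_000
print(round(ancestors / world_population_today))  # 157 -- "more than 150 times"
```

The apparent paradox — more ancestor slots than people who have ever lived — resolves because the same individuals fill many slots at once, which is exactly Dawkins’s point that every marriage is between more or less distant cousins.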
To understand why, you need to know more about how your brain works. One of its main tasks is to try to make good guesses about what’s going to happen next. While you’re busy getting on with your life, walking downstairs or eating your breakfast, parts of your brain are always trying to predict the future.
Remember when you first learned how to ride a bicycle? At first, it took a lot of concentration to keep the handlebars steady and push the pedals. But after a while, cycling became easy. Now you’re not aware of the movements you make to keep the bike going. From experience, your brain knows exactly what to expect so your body rides the bike automatically. Your brain is predicting all the movements you need to make.
You only have to think consciously about cycling if something changes — like if there’s a strong wind or you get a flat tyre. When something unexpected happens like this, your brain is forced to change its predictions about what will happen next. If it does its job well, you’ll adjust to the strong wind, leaning your body so you don’t fall.
Why is it so important for our brains to predict what will happen next? It helps us make fewer mistakes and can even save our lives.
Because your brain is always predicting your own actions, and how your body will feel as a result, you cannot tickle yourself. Other people can tickle you because they can surprise you. You can’t predict what their tickling actions will be.
And this knowledge leads to an interesting truth: if you build a machine that allows you to move a feather, but the feather moves only after a delay of a second, then you can tickle yourself. The results of your own actions will now surprise you.
Particle physicist and cosmologist Lawrence Krauss explains why we’re all made of stardust:
Everything in your body, and everything you can see around you, is made up of tiny objects called atoms. Atoms come in different types called elements. Hydrogen, oxygen and carbon are three of the most important elements in your body.
How did those elements get into our bodies? The only way they could have got there, to make up all the material on our Earth, is if some of those stars exploded a long time ago, spewing all the elements from their cores into space. Then, about four and a half billion years ago, in our part of our galaxy, the material in space began to collapse. This is how the Sun was formed, and the solar system around it, as well as the material that forms all life on Earth.
So, most of the atoms that now make up your body were created inside stars! The atoms in your left hand might have come from a different star from those in your right hand. You are really a child of the stars.
But my favorite answers are to the all-engulfing question, How do we fall in love? Author Jeanette Winterson offers this breathlessly poetic response:
You don’t fall in love like you fall in a hole. You fall like falling through space. It’s like you jump off your own private planet to visit someone else’s planet. And when you get there it all looks different: the flowers, the animals, the colours people wear. It is a big surprise falling in love because you thought you had everything just right on your own planet, and that was true, in a way, but then somebody signalled to you across space and the only way you could visit was to take a giant jump. Away you go, falling into someone else’s orbit and after a while you might decide to pull your two planets together and call it home. And you can bring your dog. Or your cat. Your goldfish, hamster, collection of stones, all your odd socks. (The ones you lost, including the holes, are on the new planet you found.)
And you can bring your friends to visit. And read your favourite stories to each other. And the falling was really the big jump that you had to make to be with someone you don’t want to be without. That’s it.
What happens when we fall in love is probably one of the most difficult things in the whole universe to explain. It’s something we do without thinking. In fact, if we think about it too much, we usually end up doing it all wrong and get in a terrible muddle. That’s because when you fall in love, the right side of your brain gets very busy. The right side is the bit that seems to be especially important for our emotions. Language, on the other hand, gets done almost completely in the left side of the brain. And this is one reason why we find it so difficult to talk about our feelings and emotions: the language areas on the left side can’t send messages to the emotional areas on the right side very well. So we get stuck for words, unable to describe our feelings.
But science does allow us to say a little bit about what happens when we fall in love. First of all, we know that love sets off really big changes in how we feel. We feel all light-headed and emotional. We can be happy and cry with happiness at the same time. Suddenly, some things don’t matter any more and the only thing we are interested in is being close to the person we have fallen in love with.
These days we have scanner machines that let us watch a person’s brain at work. Different parts of the brain light up on the screen, depending on what the brain is doing. When people are in love, the emotional bits of their brains are very active, lighting up. But other bits of the brain that are in charge of more sensible thinking are much less active than normal. So the bits that normally say ‘Don’t do that because it would be crazy!’ are switched off, and the bits that say ‘Oh, that would be lovely!’ are switched on.
Why does this happen? One reason is that love releases certain chemicals in our brains. One is called dopamine, and this gives us a feeling of excitement. Another is called oxytocin and seems to be responsible for the light-headedness and cosiness we feel when we are with the person we love. When these are released in large quantities, they go to parts of the brain that are especially responsive to them.
But all this doesn’t explain why you fall in love with a particular person. And that is a bit of a mystery, since there seems to be no good reason for our choices. In fact, it seems to be just as easy to fall in love with someone after you’ve married them as before, which seems the wrong way round. And here’s another odd thing. When we are in love, we can trick ourselves into thinking the other person is perfect. Of course, no one is really perfect. But the more perfect we find each other, the longer our love will last.
In Central Park: An Anthology (public library), Andrew Blauner collects twenty paeans to this one particular, and particularly beloved, part of the city by twenty of its most celebrated authors. Adrian Benepe promises in the introduction:
Reading this volume is a little like a walk in the park with some truly excellent companions…
It underscores the fact that Central Park is not simply a geographic destination, nor just the essential masterpiece of landscape architecture and great creative accomplishment of the nineteenth century. Once you add people and time, it becomes an ever-evolving work of art and performance art. It is central to our thinking, our style, and our magnificence.
And the slim but potent volume lives up to that promise.
In “Through the Children’s Gate,” Adam Gopnik brings a dimensional lens to one of New Yorkers’ most persistent and enduring laments: the city’s inescapable pace of change, with its embedded nostalgia for what once was and never will again be:
Still, croissants and crime are not lifestyle choices, to be taken according to taste; the reduction of fear, as anyone who has spent time in Harlem can attest, is a grace as large as any imaginable. To revise Chesterton slightly: People who refuse to be sentimental about the normal things don’t end up being sentimental about nothing; they end up being sentimental about anything, shedding tears over old muggings and the perfect, glittering shards of the little crack vials, sparkling like diamonds in the gutter. Où sont les neiges d’antan? (Where are the snows of yesteryear?): Who cares if the snows were all of cocaine? We saw them falling and our hearts were glad.
It is a strange thing to be the serpent in one’s own garden, the snake in one’s own grass. The suburbanization of New York is a fact, and a worrying one, and everyone has moments of real disappointment and distraction. The Soho where we came of age, with its organic intertwinings of art and food, commerce and cutting edge, is unrecognizable to us now — but then that Soho we knew was unrecognizable to its first émigrés, who by then had moved on to Tribeca. This is only to say that in the larger, inevitable human accounting of New York, there are gains and losses, a zero sum of urbanism: The great gain of civility and peace is offset by a loss of creative kinds of vitality and variety. (There are new horizons of Bohemia in Brooklyn and beyond, of course, but Brooklyn has its bards already, to sing its streets and smoke, as they will and do. My heart lies with the old island of small homes and big buildings, the sounds coming from one resonating against the sounding board of the other.)
But those losses are inevitably specific. There is always a new New York coming into being as the old one disappears. And that city — or cities; there are a lot of different ones on the same map — has its peculiar pleasures and absurdities as keen as any other’s. The one I awakened to, and into — partly by intellectual affinity, and much more by the ringing of an alarm clock every morning at seven — was the civilization of childhood in New York. The phrase is owed to Iona Opie, the great scholar of children’s games and rhymes, whom I got to interview once. “Childhood is a civilization with its own rules and rituals,” she told me, charmingly but flatly, long before I had children of my own. “Children never refer to each other as children. They call themselves, rightly, people, and tell you what it is that people like them — their people — believe and do.” The Children’s Gate exists; you really can go through it.
In “Framed in Silver,” Mark Helprin reflects on the park through the dusty photographs of his own childhood:
My father and I are in Central Park, on the path that leads from the playground at Ninety-third Street toward the Reservoir. I am about two. It is not long after the war, still the first half of the twentieth century. I know nothing of what has passed. You can see in his face that as someone who was born as the century turned, my father knows perhaps too much. I know nothing of what is to come. Having lived through the great wars and the small, he does. We are walking together, he in a double-breasted great coat, I in an absurd snowsuit. He has a Liberty of London scarf, and his hair is still as black as it was in the desert. I come up to his midthigh, a hood surrounds my face, and on top of it, and my head, is a pompom.
We have passed the playground that was the setting of my first dream, in which I flew from one outcropping of granite to another. Unknowing of the nature of dreams, when I awoke I believed that I had actually flown. I’m holding my father’s hand, or, rather, he is holding mine, which disappears quite easily in his. Confident of his absolute protection, I think that as long as I am tethered and close, nothing can ever hurt me. He knows better.
Although I dreamed that I could fly, I would not have dreamed that someday I would look back upon the invisible paths made by those whom I love and who are gone, that the picture in which I am walking in Central Park with my father would darken over time, like a clock about to mark the inevitable moment in which I will rejoin him. And then, perhaps as now I am aware of the invisible paths made by others, still others might feel, like the breeze you cannot see, the invisible paths made by me.
In “The Colossus of New York,” Colson Whitehead paints a mosaic portrait of the archetypes you’re promised to encounter in the park — the hipsters, the socialite ladies, the entitled parents, the photographers, the lovers. And, of course, the runners:
SO MANY PEOPLE running. Is something chasing them. Yes, something different is chasing each of them and gaining slowly. She feels fit and trim. People remove layers one by one the deeper they get into the park. The sweaters keep falling from their waists no matter how they tie them. The matching strides of the jogging pair give no indication that after she tells her secret he will stop and bend and put his palms to his knees. Like some of the trees here, some of today’s miseries are evergreen. Others merely deciduous. This is his tenth attempt to join the jogging culture. This latest outfit will do the trick. Pant and heave. How much farther. Reservoir of what. Small devices keep track of ingrown miles. Unfold these laps from their tight circuit to make marathons. It’s his best time yet, never to be repeated. If he had known, he would have saved it for after a hard day at the office or a marital argument. Instead all he has is sweat stains to commemorate. One convert says, I’m going to come here every day from now on. It’s so refreshing.
In “Some Music in the Park,” Francine Prose traces the history of the park as a stage for music and politics:
There was nothing neutral about Nina Simone’s performance. She sang “Strange Fruit,” which is about the bodies of lynching victims hanging from trees in the South. She sang “Four Women,” which is about the oppression— slavery, rape, prostitution— of African American women. She sang “Mississippi Goddam,” a song inspired by the murder of Medgar Evers and the church bombing in Birmingham, Alabama, that killed four little girls. Every time she said Goddam, she spit the word at the audience. I had never seen a performer, let alone a woman, let alone a black woman, be that angry on stage. She was telling us that, to paraphrase a saying popular in those days, we were not part of the solution; we were part of the problem.
In “The Sixth Borough,” Jonathan Safran Foer (he of Tree of Codes fame) weaves a whimsical alternative mythology, in which a sixth borough mysteriously floats away from the island of Manhattan, but a piece of it is transplanted — literally, lifted off with giant hooks and pulled by the people of New York into its new place — to become what we know as Central Park:
Children were allowed to lie down on the park as it was being moved. This was considered a concession, although no one knew why a concession was necessary, or why it was to children that this concession must be made. The biggest fireworks show in history lighted the skies of New York City that night, and the Philharmonic played its heart out. The children of New York lay on their backs, body to body, filling every inch of the park as if it had been designed for them and that moment. The fireworks sprinkled down, dissolving in the air just before they reached the ground, and the children were pulled, one inch and one second at a time, into Manhattan and adulthood. By the time the park found its current resting place, every single one of the children had fallen asleep, and the park was a mosaic of their dreams. Some hollered out, some smiled unconsciously, some were perfectly still.
Was there really a Sixth Borough?
There’s no irrefutable evidence. There’s nothing that could convince someone who doesn’t want to be convinced.
Foer does what he does best, grounding the escapist whimsy back into a brilliantly human reality:
[I]t’s hard for anyone, even the most cynical of cynics, to spend more than a few minutes in Central Park without feeling that he or she is experiencing some tense in addition to just the present. Maybe it’s our own nostalgia for what’s past, or our own hopes for what’s to come. Or maybe it’s the residue of the dreams from that night the park was moved, when all of the children of New York City exercised their subconsciouses at once. Maybe we miss what they had lost, and yearn for what they wanted.
In “Fogg in the Park,” Paul Auster juxtaposes the unspoken behavioral governance of the city with the parallel universe of the park:
To walk among the crowd means never going faster than anyone else, never lagging behind your neighbor, never doing anything to disrupt the flow of human traffic. If you play by the rules of this game, people will tend to ignore you. There is a particular glaze that comes over the eyes of New Yorkers when they walk through the streets, a natural and perhaps necessary form of indifference to others. It doesn’t matter how you look, for example. Outrageous costumes, bizarre hairdos, T-shirts with obscene slogans printed across them — no one pays attention to such things. On the other hand, the way you act inside your clothes is of the utmost importance. Odd gestures of any kind are automatically taken as a threat. Talking out loud to yourself, scratching your body, looking someone directly in the eye: these deviations can trigger off hostile and sometimes violent reactions from those around you. You must not stagger or swoon, you must not clutch the walls, you must not sing, for all forms of spontaneous or involuntary behavior are sure to elicit stares, caustic remarks, and even an occasional shove or kick in the shins. I was not so far gone that I received any treatment of that sort, but I saw it happen to others, and I knew that a day might eventually come when I wouldn’t be able to control myself anymore. By contrast, life in Central Park allowed for a much broader range of variables. No one thought twice if you stretched out on the grass and went to sleep in the middle of the day. No one blinked if you sat under a tree and did nothing, if you played your clarinet, if you howled at the top of your lungs. Except for the office workers who lurked around the fringes of the park at lunch hour, the majority of people who came in there acted as if they were on holiday. The same things that would have alarmed them in the streets were dismissed as casual amusements. People smiled at each other and held hands, bent their bodies into unusual shapes, kissed.
It was live and let live, and as long as you did not actively interfere with what others were doing, you were free to do what you liked.
What emerges is a meditation on what it means to be oneself:
In the park, I did not have to carry around this burden of self-consciousness. It gave me a threshold, a boundary, a way to distinguish between the inside and the outside. If the streets forced me to see myself as others saw me, the park gave me a chance to return to my inner life, to hold on to myself purely in terms of what was happening inside me. It is possible to survive without a roof over your head, I discovered, but you cannot live without establishing an equilibrium between the inner and outer.
Perhaps that was all I had set out to prove in the first place: that once you throw your life to the winds, you will discover things you had never known before, things that cannot be learned under any other circumstances.
Like its subject, Central Park: An Anthology is woven of the kind of magic that summons wildly different multiverses and commands them to fold unto each other with fluidity and grace as a single enchanted world unfolds.