Brain Pickings

Search results for “time travel”

The Annihilation of Space and Time: Rebecca Solnit on How Muybridge Froze the Flow of Existence, Shaped Visual Culture, and Changed Our Consciousness

“Before, every face, every place, every event, had been unique, seen only once and then lost forever among the changes of age, light, time. The past existed only in memory and interpretation, and the world beyond one’s own experience was mostly stories.”

The great Russian filmmaker Andrei Tarkovsky described the art of cinema as “sculpting in time,” asserting that people go to the movies because they long to experience “time lost or spent or not yet had.” A century earlier, the English photographer Eadweard Muybridge (April 9, 1830–May 8, 1904) exposed the bedrock of time and devised the first chisel for its sculpting in his pioneering photographic studies of motion, which forever changed the modern world — not only by ushering in a technological revolution whose effects permeate and even dictate our daily lives today, but also, given how bound up in space and time our thinking ego is, by transforming our very consciousness. For the very first time, Muybridge’s motion studies captured what T.S. Eliot would later call “the still point of the turning world.”

With her unparalleled intellectual elegance and poetic prose, Rebecca Solnit tells the story of that transformation in River of Shadows: Eadweard Muybridge and the Technological Wild West (public library).

Eadweard Muybridge: The Horse in Motion

Solnit frames the impact of the trailblazing experiments Muybridge conducted in the spring of 1872, when he first photographed a galloping horse:

[Muybridge] had captured aspects of motion whose speed had made them as invisible as the moons of Jupiter before the telescope, and he had found a way to set them back in motion. It was as though he had grasped time itself, made it stand still, and then made it run again, over and over. Time was at his command as it had never been at anyone’s before. A new world had opened up for science, for art, for entertainment, for consciousness, and an old world had retreated farther.

Technology and consciousness, of course, have always shaped one another, perhaps nowhere more so than in our experience of time — from the moment Galileo’s pendulum discoveries laid the groundwork for modern timekeeping to the brutality with which social media timelines beleaguer us with a crushing sense of perpetual urgency. But the 1870s were a particularly fecund era of technological transformation by Solnit’s perfect definition of technology as “a practice, a technique, or a device for altering the world or the experience of the world.” She writes:

The experience of time was itself changing dramatically during Muybridge’s seventy-four years, hardly ever more dramatically than in the 1870s. In that decade the newly invented telephone and phonograph were added to photography, telegraphy, and the railroad as instruments for “annihilating time and space.”

[…]

The modern world, the world we live in, began then, and Muybridge helped launch it.

[…]

His trajectory ripped through all the central stories of his time — the relationship to the natural world and the industrialization of the human world, the Indian wars, the new technologies and their impact on perception and consciousness. He is the man who split the second, as dramatic and far-reaching an action as the splitting of the atom.

Eadweard Muybridge: Sequenced image of a rotating sulky wheel with self-portrait

Shining a sidewise gleam at just how radically the givens we take for granted have changed since Muybridge’s time, Solnit writes of that era in which a man could shoot his wife’s lover and be acquitted on grounds of justifiable homicide:

In the eight years of his motion-study experiments in California, he also became a father, a murderer, and a widower, invented a clock, patented two photographic innovations, achieved international renown as an artist and a scientist, and completed four other major photographic projects.

With the invention of cinema still more than a decade away, Muybridge’s shutters and film development techniques fused cutting-edge engineering and chemistry to produce more and better high-speed photographs than anyone had before. In a sense, Virginia Woolf’s famous complaint about the visual language of cinema (“the eye licks it all up instantaneously, and the brain, agreeably titillated, settles down to watch things happening without bestirring itself to think,” she scoffed in 1926) was an indictment of this new visual language of time and, indirectly, of Muybridge’s legacy. Had he not rendered time visible and tangible, Bertrand Russell might not have proclaimed that “even though time be real, to realize the unimportance of time is the gate of wisdom”; had his pioneering photography not altered our relationship to the moment, Italo Calvino would not have had to issue his prescient lamentation that “the life that you live in order to photograph it is already, at the outset, a commemoration of itself.”

Eadweard Muybridge: A man standing on his hands from a lying down position

In a testament to the notion that all creative work builds on what came before, Muybridge made significant improvements on the zoetrope — a rotating device, invented in 1834, which creates the illusion of motion by presenting a series of spinning images through a slot. But alongside the practical improvement upon existing technologies, he also built upon larger cultural leaps — most significantly, the rise of the railroads, which compressed space and time as nothing before them had.

In 1872, the railroad magnate Leland Stanford — who would later co-found Stanford University with his wife, Jane — commissioned Muybridge to study the gaits of galloping and trotting horses in order to determine whether all four feet lifted off the ground at once at any point. Since horses gallop at a speed that outpaces the perception of the human eye, this was impossible to discern without freezing motion into a still image. So began Muybridge’s transformation of time.

Horse in Motion: One of Muybridge’s motion studies commissioned by Stanford

With her penchant for cultural history laced with subtle, perfectly placed political commentary, Solnit traces the common root of Hollywood and Silicon Valley to Muybridge:

Perhaps because California has no past — no past, at least, that it is willing to remember — it has always been peculiarly adept at trailblazing the future. We live in the future launched there.

If one wanted to find an absolute beginning point, a creation story, for California’s two greatest transformations of the world, these experiments with horse and camera would be it. Out of these first lost snapshots eventually came a world-changing industry, and out of the many places where movies are made, one particular place: Hollywood. The man who owned the horse and sponsored the project believed in the union of science and business and founded the university that much later generated another industry identified, like Hollywood, by its central place: Silicon Valley.

It would be impossible to grasp the profound influence Muybridge and his legacy had on culture without understanding how dramatically different the world he was born into was from the one he left. Solnit paints the technological backdrop of his childhood:

Pigeons were the fastest communications technology; horses were the fastest transportation technology; the barges moved at the speed of the river or the pace of the horses that pulled them along the canals. Nature itself was the limit of speed: humans could only harness water, wind, birds, beasts. Born into this almost medievally slow world, the impatient, ambitious, inventive Muybridge would leave it and link himself instead to the fastest and newest technologies of the day.

The first passenger railroad opened on September 15, 1830 — mere months after Muybridge’s birth. As with any technological bubble, the spread of this novelty brought with it an arsenal of stock vocabulary. The notion of “annihilating time and space” became one of the era’s most used, then invariably overused, catchphrases. (In a way, clichés themselves — phrases to which we turn for cognitive convenience, out of a certain impatience with language — are another manifestation of our defiant relationship to time.) Applied first to the railways, the phrase soon spread to the various technological advancements that radiated, directly or indirectly, from them. Solnit writes:

“Annihilating time and space” is what most new technologies aspire to do: technology regards the very terms of our bodily existence as burdensome. Annihilating time and space most directly means accelerating communications and transportation. The domestication of the horse and the invention of the wheel sped up the rate and volume of transit; the invention of writing made it possible for stories to reach farther across time and space than their tellers and stay more stable than memory; and new communications, reproduction, and transportation technologies only continue the process. What distinguishes a technological world is that the terms of nature are obscured; one need not live quite in the present or the local.

[…]

The devices for such annihilation poured forth faster and faster, as though inventiveness and impatience had sped and multiplied too.

Eadweard Muybridge: Running full speed (Animal Locomotion, Plate 62)

But perhaps the most significant impact of the railroads, Solnit argues, was that they began standardizing human experience as goods, people, and their values traveled faster and farther than ever before. In contracting the world, the railways began to homogenize it. And just as society was adjusting to this new mode of relating to itself, another transformative invention bookended the decade: On January 7, 1839, the French artist Louis Daguerre debuted what he called daguerreotypy — a pioneering imaging method that catalyzed the dawn of photography.

With an eye to the era’s European and American empiricism, animated by a “restlessness that regarded the unknown as a challenge rather than a danger,” Solnit writes:

Photography may have been its most paradoxical invention: a technological breakthrough for holding onto the past, a technology always rushing forward, always looking backward.

[…]

Photography was a profound transformation of the world it entered. Before, every face, every place, every event, had been unique, seen only once and then lost forever among the changes of age, light, time. The past existed only in memory and interpretation, and the world beyond one’s own experience was mostly stories… [Now,] every photograph was a moment snatched from the river of time.

The final invention in the decade’s trifecta of technological transformation was the telegraph. Together, these three developments — photography, the railroads, and the telegraph — marked the beginning of our modern flight from presence, which would become the seedbed of our unhappiness over the century that followed. By chance, Muybridge came into the world at the pinnacle of this transformation; by choice, he became instrumental in guiding its course and, in effect, shaping modernity.

Eadweard Muybridge: Cockatoo flying (Animal Locomotion, Plate 758)

Solnit writes:

Before the new technologies and ideas, time was a river in which human beings were immersed, moving steadily on the current, never faster than the speeds of nature — of currents, of wind, of muscles. Trains liberated them from the flow of the river, or isolated them from it. Photography appears on this scene as though someone had found a way to freeze the water of passing time; appearances that were once as fluid as water running through one’s fingers became solid objects… Appearances were permanent, information was instantaneous, travel exceeded the fastest speed of bird, beast, and man. It was no longer a natural world in the sense it always had been, and human beings were no longer contained within nature.

Time itself had been of a different texture, a different pace, in the world Muybridge was born into. It had not yet become a scarce commodity to be measured out in ever smaller increments as clocks acquired second hands, as watches became more affordable mass-market commodities, as exacting schedules began to intrude into more and more activities. Only prayer had been precisely scheduled in the old society, and church bells had been the primary source of time measurement.

Simone Weil once defined prayer as “absolutely unmixed attention,” and perhaps the commodification of time that started in the 1830s was the beginning of the end of our capacity for such attention; perhaps Muybridge was the horseman of our attentional apocalypse.

Eadweard Muybridge: Woman removing mantle

Solnit considers the magnitude of his ultimate impact on our experience of time:

In the spring of 1872 a man photographed a horse. With the motion studies that resulted it was as though he were returning bodies themselves to those who craved them — not bodies as they might daily be experienced, bodies as sensations of gravity, fatigue, strength, pleasure, but bodies become weightless images, bodies dissected and reconstructed by light and machine and fantasy.

[…]

What they had lost was solid; what they gained was made out of air. That exotic new world of images speeding by would become the true home of those who spent their Saturdays watching images beamed across the darkness of the movie theater, then their evenings watching images beamed through the atmosphere and brought home into a box like a camera obscura or a crystal ball, then their waking hours surfing the Internet wired like the old telegraph system. Muybridge was a doorway, a pivot between that old world and ours, and to follow him is to follow the choices that got us here.

In the remainder of her rich and revelatory River of Shadows, Solnit goes on to follow the Rube Goldberg trajectory of these choices, linking Muybridge and his legacy to aspects of our daily lives ranging from the internet to how we inhabit our bodies. Complement it with Susan Sontag on the aesthetic consumerism of photography, then revisit Solnit on how to nurture our hope in times of despair, the rewards of walking, what reading does for the human spirit, and how modern noncommunication is changing our experience of time, solitude, and communion.

A Madman Dreams of Tuning Machines: The Story of Joseph Weber, the Tragic Hero of Science Who Followed Einstein’s Vision and Pioneered the Sound of Spacetime

…and a remarkable letter from Freeman Dyson on the difficult, necessary art of changing one’s mind.

In his groundbreaking 1915 paper on general relativity, Albert Einstein envisioned gravitational waves — ripples in the fabric of spacetime caused by cataclysmic cosmic events of astronomical energy. Although fundamental to our understanding of the universe, gravitational waves were a purely theoretical construct for him. He lived in an era when any human-made tool for detecting something this far away was simply unimaginable, even by the greatest living genius, and many of the cosmic objects capable of producing such tremendous tumult — black holes, for instance — were yet to be discovered.

One September morning in 2015, almost exactly a century after Einstein published his famous paper, scientists turned his mathematical dream into a tangible reality — or, rather, an audible one.

The Laser Interferometer Gravitational-Wave Observatory — an enormous international collaboration known as LIGO, consisting of two massive listening instruments 3,000 kilometers apart, decades in the making — recorded the sound of a gravitational wave produced by two mammoth black holes that had collided more than a billion years ago, more than a billion light-years away.

One of the most significant discoveries in the history of science, this landmark event introduces a whole new modality of curiosity in our quest to know the cosmos, its thrill only amplified by the fact that we had never actually seen black holes before hearing them. Nearly everything we know about the universe today, we know through four centuries of optical observation of light and particles. Now begins a new era of sonic exploration. Turning an inquisitive ear to the cosmos might, and likely will, revolutionize our understanding of it as radically as Galileo did when he first pointed his telescope at the skies.

In Black Hole Blues and Other Songs from Outer Space (public library) — one of the finest and most beautifully written books I’ve ever read, which I recently reviewed for The New York Times — astrophysicist and novelist Janna Levin tells the story of LIGO and its larger significance as a feat of science and the human spirit. Levin, a writer who bends language with effortless might and uses it not only as an instrument of thought but also as a Petri dish for emotional nuance, probes deep into the messy human psychology that animated these brilliant and flawed scientists as they persevered in this ambitious quest against enormous personal, political, and practical odds.

Levin — who has written beautifully about free will and the relationship between genius and madness — paints the backdrop for this improbable triumph:

Somewhere in the universe two black holes collide — as heavy as stars, as small as cities, literally black (the complete absence of light) holes (empty hollows). Tethered by gravity, in their final seconds together the black holes course through thousands of revolutions about their eventual point of contact, churning up space and time until they crash and merge into one bigger black hole, an event more powerful than any since the origin of the universe, outputting more than a trillion times the power of a billion Suns. The black holes collide in complete darkness. None of the energy exploding from the collision comes out as light. No telescope will ever see the event.

What nobody could see, LIGO could hear — a sensitive, sophisticated ear pressed to the fabric of spacetime, tuned to what Levin so poetically eulogizes as “the total darkness, the empty space, the vacuity, the great expanse of nothingness, of emptiness, of pure space and time.” She writes of this astonishing instrument:

An idea sparked in the 1960s, a thought experiment, an amusing haiku, is now a thing of metal and glass.

But what makes the book most enchanting is Levin’s compassionate insight into the complex, porous, often tragic humanity undergirding the metal and glass — nowhere more tragic than in the story of Joseph Weber, the controversial pioneer who became the first to bring Einstein’s equations into the lab. Long before LIGO was even so much as a thought experiment, Weber envisioned and built a very different instrument for listening to the cosmos.

Weber was born Yonah Geber to a family of Lithuanian Jewish immigrants in early-twentieth-century New Jersey. His mother’s heavy accent caused his teacher to mishear the boy’s name as “Joseph,” so he became Joe. After he was hit by a bus at the age of five, young Joe required speech rehabilitation therapy, which replaced his Yiddish accent with a generic American one that led his family to call him “Yankee.” As a teenager, he dropped out of Cooper Union out of concern for his parents’ finances and joined the Navy instead, where he served on an aircraft carrier that was sunk during WWII. When the war ended, he became a microwave engineer and was hired as a professor at the University of Maryland at the then-enviable salary — especially for a 29-year-old — of $6,500 a year.

Eager to do microwave research, he turned to the great physicist George Gamow, who had theorized cosmic microwave background radiation — a thermal remnant of the Big Bang, which would provide unprecedented insight into the origin of the universe and which Weber wanted to dedicate his Ph.D. career to detecting. But Gamow inexplicably snubbed him. Two other scientists eventually discovered cosmic microwave background radiation by accident and received the Nobel Prize for the discovery. Weber then turned to atomic physics and devised the maser — the predecessor of the laser — but, once again, other scientists beat him to the public credit and received a Nobel for that discovery, too.

Levin writes:

Joe’s scientific life is defined by these significant near misses… He was Shackleton many times, almost the first: almost the first to see the big bang, almost the first to patent the laser, almost the first to detect gravitational waves. Famous for nearly getting there.

And that is how Weber got to gravitational waves — a field he saw as so small and esoteric that he stood a chance of finally being the first. Levin writes:

In 1969 Joe Weber announced that he had achieved an experimental feat widely believed to be impossible: He had detected evidence for gravitational waves. Imagine his pride, the pride to be the first, the gratification of discovery, the raw shameless pleasure of accomplishment. Practically single-handedly, through sheer determination, he conceives of the possibility. He fills multiple notebooks, hundreds of pages deep, with calculations and designs and ideas, and then he makes the experimental apparatus real. He builds an ingenious machine, a resonant bar, a Weber bar, which vibrates in sympathy with a gravitational wave. A solid aluminum cylinder about 2 meters long, 1 meter in diameter, and in the range of 3,000 pounds, as guitar strings go, isn’t easy to pluck. But it has one natural frequency at which a strong gravitational wave would ring the bar like a tuning fork.

Joseph Weber with his cylinder
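
For readers curious about the physics, the “tuning fork” image in Levin’s passage can be made roughly quantitative. What follows is a back-of-the-envelope sketch of mine, not Levin’s: a free metal bar rings longitudinally at a fundamental frequency set only by its length and the speed of sound in the metal, and for aluminum (Young’s modulus E ≈ 69 GPa, density ρ ≈ 2,700 kg/m³) and the roughly 2-meter length she quotes:

$$
v = \sqrt{\frac{E}{\rho}} \approx \sqrt{\frac{69 \times 10^{9}\ \mathrm{Pa}}{2700\ \mathrm{kg/m^{3}}}} \approx 5{,}100\ \mathrm{m/s},
\qquad
f_{1} = \frac{v}{2L} \approx \frac{5{,}100\ \mathrm{m/s}}{2 \times 2\ \mathrm{m}} \approx 1.3\ \mathrm{kHz}
$$

That places the bar’s natural ring in the kilohertz band, the same range as the roughly 1,660 hertz commonly cited for Weber’s actual bars, and the band in which a strong gravitational wave from a stellar cataclysm was expected to deposit its energy.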

Following his announcement, Weber became an overnight celebrity. His face graced magazine covers. NASA put one of his instruments on the Moon. He received ample praise from his peers. Even the formidable J. Robert Oppenheimer, a man of slim capacity for compliments, encouraged him with a remark Weber never forgot: “The work you’re doing,” Oppenheimer told him, “is just about the most exciting work going on anywhere around here.”

Under the spell of this collective excitement, scientists around the world began building replicas of Weber’s cylinder. But one after another, they were unable to replicate his results — the electrifying eagerness to hear gravitational waves was met with the dead silence of the cosmos.

Weber plummeted from grace as quickly as he had ascended. (Einstein himself famously scoffed at the fickle nature of fame.) Levin writes:

Joe Weber’s claims in 1969 to have detected gravitational waves, the claims that catapulted his fame, that made him possibly the most famous living scientist of his generation, were swiftly and vehemently refuted. The subsequent decades offered near total withdrawal of support, both from scientific funding agencies and his peers. He was almost fired from the University of Maryland.

Among Weber’s most enthusiastic initial supporters was the great theoretical physicist Freeman Dyson. Perhaps out of his staunch belief that no question is unanswerable, Dyson had emboldened Weber to attempt what no one had attempted before — to hear the sound of spacetime. But when the evidence against Weber’s data began to mount, Dyson was anguished by a sense of personal responsibility for having encouraged him, so he wrote Weber an extraordinary letter urging him to practice the immensely difficult art of changing one’s mind. Levin quotes the letter, penned on June 5, 1975:

Dear Joe,

I have been watching with fear and anguish the ruin of our hopes. I feel a considerable personal responsibility for having advised you in the past to “stick your neck out.” Now I still consider you a great man unkindly treated by fate, and I am anxious to save whatever can be saved. So I offer my advice again for what it is worth.

A great man is not afraid to admit publicly that he has made a mistake and has changed his mind. I know you are a man of integrity. You are strong enough to admit that you are wrong. If you do this, your enemies will rejoice but your friends will rejoice even more. You will save yourself as a scientist, and you will find that those whose respect is worth having will respect you for it.

I write now briefly because long explanations will not make the message clearer. Whatever you decide, I will not turn my back on you.

With all good wishes,
Yours ever
Freeman

But Weber decided not to heed his friend’s warm caution. His visionary genius coexisted with one of the most unfortunate and most inescapable of human tendencies — our bone-deep resistance to the shame of admitting error. He paid a high price: His disrepute soon veered into cruelty — he was ridiculed and even baited by false data intended to trick him into reaffirming his claims, only to be publicly humiliated all over again. In one of the archival interviews Levin excavates, he laments:

I simply cannot understand the vehemence and the professional jealousy, and why each guy has to feel that he has to cut off a pound of my flesh… Boltzmann committed suicide with this sort of treatment.

Here, I think of Levin’s penchant for celebrating tragic heroes whose posthumous redemption only adds to their tragedy. Her magnificent novel A Madman Dreams of Turing Machines is based on the real lives of computing pioneer Alan Turing and mathematician Kurt Gödel, both of whom committed suicide — Turing after particularly cruel mistreatment. Levin’s writing emanates a deep sympathy for those who have fallen victim to some combination of their own fallible humanity and the ferocious inhumanity of unforgiving, bloodthirsty others. No wonder Weber’s story sings to her. A madman dreams of tuning machines.

Without diminishing the role of personal pathology and individual neurochemistry, given what psychologists know about suicide prevention, social support likely played a vital role in Weber’s ability to withstand the barrage of viciousness — Dyson’s sympathetic succor, but most of all the love of his wife, the astronomer Virginia Trimble, perhaps the most unambivalently likable character in the book. Levin writes:

She called him Weber and he called her Trimble. They married in March 1972 after a cumulative three weekends together. She laughs. “Weber never had any trouble making up his mind.” Twenty-three years her senior, he always insisted she do what she wanted and needed to do. Perhaps trained in part by his first wife, Anita, a physicist who took a protracted break to raise their four boys, the widower had no reservations about Virginia’s work, her independence, or her IQ. (Stratospheric. In an issue of Life magazine with a now-vintage cover, in an article titled “Behind a Lovely Face, a 180 I.Q.” about the then eighteen-year-old astrophysics major, she is quoted as classifying the men she dates into three types: “Guys who are smarter than I am, and I’ve found one or two. Guys who think they are — they’re legion. And those who don’t care.”)

Behind a Lovely Face, a 180 I.Q.: Virginia Trimble in LIFE magazine, 1962

Trimble was the second woman ever allowed at the famed Palomar Observatory, a year after pioneering astronomer Vera Rubin broke the optical-glass ceiling by becoming the first to observe there. Levin, whose subtle kind-natured humor never fails to delight, captures Trimble’s irreverent brilliance:

In her third year, having demonstrated her tenacity — particularly manifest in the fact that she still hadn’t married, she suspects — she was awarded a fellowship from the National Science Foundation. When she arrived at Caltech, she was delighted. “I thought, ‘Look at all of these lovely men.’” In her seventies, with her coral dress, matching shoes and lip color, Moon earrings, and gold animal-head ring, she beams. Still a lovely face. And still an IQ of 180.

This fierce spirit never left Trimble. Now in her seventies, she tells Levin:

When I fell and broke my hip last September, I spent four days on the floor of my apartment singing songs and reciting poetry until I was found.

It isn’t hard to see why Weber — why anyone — would fall in love with Trimble. But although their love sustained him and he didn’t take his own life, he suffered an end equally heartbreaking.

By the late 1980s, Weber had submerged himself even deeper into the quicksand of his convictions, stubbornly trying to prove that his instrument could hear the cosmos. For the next twenty years, he continued to operate his own lab funded out of pocket — a drab concrete box in the Maryland woods, where he was both head scientist and janitor. Meanwhile, LIGO — a sophisticated instrument that would eventually cost more than $1 billion total, operated by a massive international team of scientists — was gathering momentum, thanks largely to the scientific interest in gravitational astronomy that Weber’s early research had sparked.

He was never invited to join LIGO. Trimble surmises that even if he had been, he would’ve declined.

One freezing winter morning in 2000, just as LIGO’s initial detectors were being built, 81-year-old Weber went to clean his lab, slipped on the ice in front of the building, hit his head, and fell unconscious. He was found two days later and taken to the E.R., but he never recovered. He died at the hospital several months later from the lymphoma he’d been battling. The widowed Trimble extracts from her husband’s tragedy an unsentimental parable of science — a testament to the mismatch between the time-scale of human achievement, with all the personal glory it brings, and that of scientific progress:

Science is a self-correcting process, but not necessarily in one’s own lifetime.

When the LIGO team published the official paper announcing the groundbreaking discovery, Weber was acknowledged as the pioneer of gravitational wave research. But as with Alan Turing, who was granted a posthumous pardon by the Queen more than half a century after he perished by inhumane injustice, Weber’s redemption is culturally bittersweet at best. I’m reminded of a beautiful passage from Levin’s novel about Turing and Gödel, strangely perfect in the context of Weber’s legacy:

Their genius is a testament to our own worth, an antidote to insignificance; and their bounteous flaws are luckless but seemingly natural complements, as though greatness can be doled out only with an equal measure of weakness… Their broken lives are mere anecdotes in the margins of their discoveries. But then their discoveries are evidence of our purpose, and their lives are parables on free will.

Free will, indeed, is what Weber exercised above all — he lived by it and died by it. In one of the interviews Levin unearths, he reflects from the depths of his disrepute:

If you do science the principal reason to do it is because you enjoy it and if you don’t enjoy it you shouldn’t do it, and I enjoy it. And I must say I’m enjoying it… That’s the best you can do.

At the end of the magnificent and exceptionally poetic Black Hole Blues, the merits of which I’ve extolled more fully here, Levin offers a wonderfully lyrical account of LIGO’s triumph as she peers into the furthest reaches of the spacetime odyssey that began with Einstein, gained momentum with Weber, and is only just beginning to map the course of human curiosity across the universe:

Two very big stars lived in orbit around each other several billion years ago. Maybe there were planets around them, although the two-star system might have been too unstable or too simple in composition to accommodate planets. Eventually one star died, and then the other, and two black holes formed. They orbited in darkness, probably for billions of years before that final 200 milliseconds when the black holes collided and merged, launching their loudest gravitational wave train into the universe.

The sound traveled to us from 1.4 billion light-years away. One billion four hundred million light-years.

[…]

We heard black holes collide. We’ll point to where the sound might have come from, to the best of our abilities, a swatch of space from an earlier epoch. Somewhere in the southern sky, pulling away from us with the expansion of the universe, the big black hole will roll along its own galaxy, dark and quiet until something wanders past, an interstellar dust cloud or an errant star. After a few billion years the host galaxy might collide with a neighbor, tossing the black hole around, maybe toward a supermassive black hole in a growing galactic center. Our star will die. The Milky Way will blend with Andromeda. The record of this discovery along with the wreckage of our solar system will eventually fall into black holes, as will everything else in the cosmos, the expanding space eventually silent, and all the black holes will evaporate into oblivion near the end of time.

Nietzsche on Dreams as an Evolutionary Time Machine for the Human Brain

“Dreams carry us back to the earlier stages of human culture and afford us a means of understanding it more clearly.”

We spend a third of our lives in a parallel nocturnal universe, and the half-imagined, half-remembered experiences we have there are in constant dynamic interaction with our waking selves. Our nightly dreams are both fragmentary reflections of our conscious lives, rearranged into barely recognizable mosaics by our unconscious, and potent agents of psychic transmutation — a powerful dream can cast an unshakable mood over the wakeful hours, or even days, that follow it. Science is only just beginning to shed light on the role of dreams in memory consolidation and mood, but their nature and purpose remain largely a mystery. “We feel dreamed by someone else, a sleeping counterpart,” the poet Mark Strand wrote in his beautiful ode to dreams.

Friedrich Nietzsche (October 15, 1844–August 25, 1900) saw this sleeping counterpart as our link to primitive humanity — an atavistic remnant of the pre-rational human mind. Nearly two decades before Freud’s seminal treatise on dreams, Nietzsche explored the mystique of the nocturnal unconscious in a portion of Human, All Too Human: A Book for Free Spirits (public library | free ebook) — his altogether terrific 1878 inquiry into how we become who we are.

In a section on dreams and civilization, he writes:

In the dream … we have the source of all metaphysic. Without the dream, men would never have been incited to an analysis of the world. Even the distinction between soul and body is wholly due to the primitive conception of the dream, as also the hypothesis of the embodied soul, whence the development of all superstition, and also, probably, the belief in god. “The dead still live: for they appear to the living in dreams.” So reasoned mankind at one time, and through many thousands of years.

Therein lies Nietzsche’s most intriguing point: Sleep, he suggests, is a kind of evolutionary time machine — a portal to the primitive past of our sensemaking instincts. He paints the sleeping brain as a blunt Occam’s Razor — in seeking out the simplest explanations for our daily confusions, it ends up succumbing to the simplistic. This, Nietzsche argues, is how superstitions and religious mythologies may have originated:

The function of the brain which is most encroached upon in slumber is the memory; not that it is wholly suspended, but it is reduced to a state of imperfection as, in primitive ages of mankind, was probably the case with everyone, whether waking or sleeping. Uncontrolled and entangled as it is, it perpetually confuses things as a result of the most trifling similarities, yet in the same mental confusion and lack of control the nations invented their mythologies, while nowadays travelers habitually observe how prone the savage is to forgetfulness, how his mind, after the least exertion of memory, begins to wander and lose itself until finally he utters falsehood and nonsense from sheer exhaustion. Yet, in dreams, we all resemble this savage. Inadequacy of distinction and error of comparison are the basis of the preposterous things we do and say in dreams, so that when we clearly recall a dream we are startled that so much idiocy lurks within us. The absolute distinctness of all dream-images, due to implicit faith in their substantial reality, recalls the conditions in which earlier mankind were placed, for whom hallucinations had extraordinary vividness, entire communities and even entire nations laboring simultaneously under them. Therefore: in sleep and in dream we make the pilgrimage of early mankind over again.

Illustration by Tom Seidmann-Freud, Freud’s cross-dressing niece, from a philosophical 1922 children’s book about dreams

Just as the dreaming self contains vestiges of every self we’ve inhabited since childhood, to be resurrected in sleep, so, Nietzsche argues, the dreaming brain contains vestiges of the primitive stages of the human brain, when our cognitive capacity for problem-solving was far more limited and unmoored from critical thinking. Nearly a century before modern scientists came to study what actually happens to the brain and body while we sleep, he writes:

Everyone knows from experience how a dreamer will transform one piercing sound, for example, that of a bell, into another of quite a different nature, say, the report of cannon. In his dream he becomes aware first of the effects, which he explains by a subsequent hypothesis and becomes persuaded of the purely conjectural nature of the sound. But how comes it that the mind of the dreamer goes so far astray when the same mind, awake, is habitually cautious, careful, and so conservative in its dealings with hypotheses? Why does the first plausible hypothesis of the cause of a sensation gain credit in the dreaming state? (For in a dream we look upon that dream as reality, that is, we accept our hypotheses as fully established). I have no doubt that as men argue in their dreams to-day, mankind argued, even in their waking moments, for thousands of years: the first causa, that occurred to the mind with reference to anything that stood in need of explanation, was accepted as the true explanation and served as such… In the dream this atavistic relic of humanity manifests its existence within us, for it is the foundation upon which the higher rational faculty developed itself and still develops itself in every individual. Dreams carry us back to the earlier stages of human culture and afford us a means of understanding it more clearly.

Illustration by Judith Clay from Thea’s Tree

Nietzsche considers the cognitive machinery of this dreamsome deduction:

If we close our eyes the brain immediately conjures up a medley of impressions of light and color, apparently a sort of imitation and echo of the impressions forced in upon the brain during its waking moments. And now the mind, in co-operation with the imagination, transforms this formless play of light and color into definite figures, moving groups, landscapes. What really takes place is a sort of reasoning from effect back to cause.

[…]

The imagination is continually interposing its images inasmuch as it participates in the production of the impressions made through the senses day by day: and the dream-fancy does exactly the same thing — that is, the presumed cause is determined from the effect and after the effect: all this, too, with extraordinary rapidity, so that in this matter, as in a matter of jugglery or sleight-of-hand, a confusion of the mind is produced and an after effect is made to appear a simultaneous action, an inverted succession of events, even. — From these considerations we can see how late strict, logical thought, the true notion of cause and effect must have been in developing, since our intellectual and rational faculties to this very day revert to these primitive processes of deduction, while practically half our lifetime is spent in the super-inducing conditions.

In a sentiment that calls to mind Polish Nobel laureate Wislawa Szymborska’s wonderful notion that the poet is “the spiritual heir of primitive humanity,” Nietzsche adds:

Even the poet, the artist, ascribes to his sentimental and emotional states causes which are not the true ones. To that extent he is a reminder of early mankind and can aid us in its comprehension.

Human, All Too Human is a spectacular read in its totality. Complement this particular portion with the science of how REM sleep regulates our negative moods and the psychology of dreams and why we have nightmares, then revisit Nietzsche on the power of music, how to find yourself, why a fulfilling life requires embracing rather than running from difficulty, and his ten rules for writers.

Rebecca Solnit on Hope in Dark Times, Resisting the Defeatism of Easy Despair, and What Victory Really Means for Movements of Social Change

“This is an extraordinary time full of vital, transformative movements that could not be foreseen. It’s also a nightmarish time. Full engagement requires the ability to perceive both.”

“There is no love of life without despair of life,” wrote Albert Camus — a man who in the midst of World War II, perhaps the darkest period in human history, saw grounds for luminous hope and issued a remarkable clarion call for humanity to rise to its highest potential on those grounds. It was his way of honoring the same duality that artist Maira Kalman would capture nearly a century later in her marvelous meditation on the pursuit of happiness, where she observed: “We hope. We despair. We hope. We despair. That is what governs us. We have a bipolar system.”

In my own reflections on hope, cynicism, and the stories we tell ourselves, I’ve considered the necessity of these two poles working in concert. Indeed, the stories we tell ourselves about these poles matter. The stories we tell ourselves about our public past shape how we interpret and respond to and show up for the present. The stories we tell ourselves about our private pasts shape how we come to see our personhood and who we ultimately become. The thin line between agency and victimhood is drawn in how we tell those stories.

The language in which we tell ourselves these stories matters tremendously, too, and no writer has weighed the complexities of sustaining hope in our times of readily available despair more thoughtfully and beautifully, nor with greater nuance, than Rebecca Solnit does in Hope in the Dark: Untold Histories, Wild Possibilities (public library).

Rebecca Solnit (Photograph: Sallie Dean Shatz)

Expanding upon her previous writings on hope, Solnit writes in the foreword to the 2016 edition of this foundational text of modern civic engagement:

Hope is a gift you don’t have to surrender, a power you don’t have to throw away. And though hope can be an act of defiance, defiance isn’t enough reason to hope. But there are good reasons.

Solnit — one of the most singular, civically significant, and poetically potent voices of our time, emanating echoes of Virginia Woolf’s luminous prose and Adrienne Rich’s unflinching political conviction — originally wrote these essays in 2003, six weeks after the start of the Iraq war, in an effort to speak “directly to the inner life of the politics of the moment, to the emotions and preconceptions that underlie our political positions and engagements.” Although the specific conditions of the day may have shifted, their undergirding causes and far-reaching consequences have only gained in relevance and urgency in the dozen years since. This slim book of tremendous potency is therefore, today more than ever, an indispensable ally to every thinking, feeling, civically conscious human being.

Solnit looks back on this seemingly distant past as she peers forward into the near future:

The moment passed long ago, but despair, defeatism, cynicism, and the amnesia and assumptions from which they often arise have not dispersed, even as the most wildly, unimaginably magnificent things came to pass. There is a lot of evidence for the defense… Progressive, populist, and grassroots constituencies have had many victories. Popular power has continued to be a profound force for change. And the changes we’ve undergone, both wonderful and terrible, are astonishing.

[…]

This is an extraordinary time full of vital, transformative movements that could not be foreseen. It’s also a nightmarish time. Full engagement requires the ability to perceive both.

Illustration by Charlotte Pardi from Cry, Heart, But Never Break by Glenn Ringtved

With an eye to such disheartening developments as climate change, growing income inequality, and the rise of Silicon Valley as a dehumanizing global superpower of automation, Solnit invites us to be equally present for the counterpoint:

Hope doesn’t mean denying these realities. It means facing them and addressing them by remembering what else the twenty-first century has brought, including the movements, heroes, and shifts in consciousness that address these things now.

Enumerating Edward Snowden, marriage equality, and Black Lives Matter among those, she adds:

This has been a truly remarkable decade for movement-building, social change, and deep, profound shifts in ideas, perspective, and frameworks for broad parts of the population (and, of course, backlashes against all those things).

With great care, Solnit — whose mind remains the sharpest instrument of nuance I’ve encountered — maps the uneven terrain of our grounds for hope:

It’s important to say what hope is not: it is not the belief that everything was, is, or will be fine. The evidence is all around us of tremendous suffering and tremendous destruction. The hope I’m interested in is about broad perspectives with specific possibilities, ones that invite or demand that we act. It’s also not a sunny everything-is-getting-better narrative, though it may be a counter to the everything-is-getting-worse narrative. You could call it an account of complexities and uncertainties, with openings.

Solnit’s conception of hope reminds me of the great existential psychiatrist Irvin D. Yalom’s conception of meaning: “The search for meaning, much like the search for pleasure,” he wrote, “must be conducted obliquely.” That is, it must take place in the thrilling and terrifying terra incognita that lies between where we are and where we wish to go, ultimately shaping where we do go. Solnit herself has written memorably about how we find ourselves by getting lost, and finding hope seems to necessitate a similar surrender to uncertainty. She captures this idea beautifully:

Hope locates itself in the premises that we don’t know what will happen and that in the spaciousness of uncertainty is room to act. When you recognize uncertainty, you recognize that you may be able to influence the outcomes — you alone or you in concert with a few dozen or several million others. Hope is an embrace of the unknown and the unknowable, an alternative to the certainty of both optimists and pessimists. Optimists think it will all be fine without our involvement; pessimists take the opposite position; both excuse themselves from acting. It’s the belief that what we do matters even though how and when it may matter, who and what it may impact, are not things we can know beforehand. We may not, in fact, know them afterward either, but they matter all the same, and history is full of people whose influence was most powerful after they were gone.

Illustration from The Harvey Milk Story, a picture-book biography of the slain LGBT rights pioneer

Amid a 24-hour news cycle that nurses us on the illusion of immediacy, this recognition of incremental progress and the long gestational period of consequences — something at the heart of every major scientific revolution that has changed our world — is perhaps our most essential yet most endangered wellspring of hope. Solnit reminds us, for instance, that women’s struggle for the right to vote took seven decades:

For a time people liked to announce that feminism had failed, as though the project of overturning millennia of social arrangements should achieve its final victories in a few decades, or as though it had stopped. Feminism is just starting, and its manifestations matter in rural Himalayan villages, not just first-world cities.

She considers one particularly prominent example of this cumulative cataclysm — the Arab Spring, “an extraordinary example of how unpredictable change is and how potent popular power can be,” whose full meaning we have yet to grasp and whose conclusions we have yet to draw. Although our cultural lore traces the spark of the Arab Spring to the moment Mohamed Bouazizi set himself on fire in an act of protest, Solnit traces the unnoticed accretion of tinder across space and time:

You can tell the genesis story of the Arab Spring other ways. The quiet organizing going on in the shadows beforehand matters. So does the comic book about Martin Luther King and civil disobedience that was translated into Arabic and widely distributed in Egypt shortly before the Arab Spring. You can tell of King’s civil disobedience tactics being inspired by Gandhi’s tactics, and Gandhi’s inspired by Tolstoy and the radical acts of noncooperation and sabotage of British women suffragists. So the threads of ideas weave around the world and through the decades and centuries.

In a brilliant counterpoint to Malcolm Gladwell’s notoriously short-sighted view of social change, Solnit sprouts a mycological metaphor for this imperceptible, incremental buildup of influence and momentum:

After a rain mushrooms appear on the surface of the earth as if from nowhere. Many do so from a sometimes vast underground fungus that remains invisible and largely unknown. What we call mushrooms mycologists call the fruiting body of the larger, less visible fungus. Uprisings and revolutions are often considered to be spontaneous, but less visible long-term organizing and groundwork — or underground work — often laid the foundation. Changes in ideas and values also result from work done by writers, scholars, public intellectuals, social activists, and participants in social media. It seems insignificant or peripheral until very different outcomes emerge from transformed assumptions about who and what matters, who should be heard and believed, who has rights.

Ideas at first considered outrageous or ridiculous or extreme gradually become what people think they’ve always believed. How the transformation happened is rarely remembered, in part because it’s compromising: it recalls the mainstream when the mainstream was, say, rabidly homophobic or racist in a way it no longer is; and it recalls that power comes from the shadows and the margins, that our hope is in the dark around the edges, not the limelight of center stage. Our hope and often our power.

[…]

Change is rarely straightforward… Sometimes it’s as complex as chaos theory and as slow as evolution. Even things that seem to happen suddenly arise from deep roots in the past or from long-dormant seeds.

One of Beatrix Potter’s little-known scientific studies and illustrations of mushrooms

And yet Solnit’s most salient point deals with what comes after the revolutionary change — with the notion of victory not as a destination but as a starting point for recommitment and continual nourishment of our fledgling ideals:

A victory doesn’t mean that everything is now going to be nice forever and we can therefore all go lounge around until the end of time. Some activists are afraid that if we acknowledge victory, people will give up the struggle. I’ve long been more afraid that people will give up and go home or never get started in the first place if they think no victory is possible or fail to recognize the victories already achieved. Marriage equality is not the end of homophobia, but it’s something to celebrate. A victory is a milestone on the road, evidence that sometimes we win, and encouragement to keep going, not to stop.

Solnit examines this notion more closely in one of the original essays from the book, titled “Changing the Imagination of Change” — a meditation of even more acute timeliness today, more than a decade later, in which she writes:

Americans are good at responding to crisis and then going home to let another crisis brew both because we imagine that the finality of death can be achieved in life — it’s called happily ever after in personal life, saved in politics — and because we tend to think political engagement is something for emergencies rather than, as people in many other countries (and Americans at other times) have imagined it, as a part and even a pleasure of everyday life. The problem seldom goes home.

[…]

Going home seems to be a way to abandon victories when they’re still delicate, still in need of protection and encouragement. Human babies are helpless at birth, and so perhaps are victories before they’ve been consolidated into the culture’s sense of how things should be. I wonder sometimes what would happen if victory was imagined not just as the elimination of evil but the establishment of good — if, after American slavery had been abolished, Reconstruction’s promises of economic justice had been enforced by the abolitionists, or, similarly, if the end of apartheid had been seen as meaning instituting economic justice as well (or, as some South Africans put it, ending economic apartheid).

It’s always too soon to go home. Most of the great victories continue to unfold, unfinished in the sense that they are not yet fully realized, but also in the sense that they continue to spread influence. A phenomenon like the civil rights movement creates a vocabulary and a toolbox for social change used around the globe, so that its effects far outstrip its goals and specific achievements — and failures.

Invoking James Baldwin’s famous proclamation that “not everything that is faced can be changed, but nothing can be changed until it is faced,” Solnit writes:

It’s important to emphasize that hope is only a beginning; it’s not a substitute for action, only a basis for it.

What often obscures our view of hope, she argues, is a kind of collective amnesia that lets us forget just how far we’ve come as we grow despondent over how far we have yet to go. She writes:

Amnesia leads to despair in many ways. The status quo would like you to believe it is immutable, inevitable, and invulnerable, and lack of memory of a dynamically changing world reinforces this view. In other words, when you don’t know how much things have changed, you don’t see that they are changing or that they can change.

Illustration by Isabelle Arsenault from Mr. Gauguin’s Heart by Marie-Danielle Croteau, the story of how Paul Gauguin used the grief of his childhood as a catalyst for a lifetime of art

This lack of a long view is perpetuated by the media, whose raw material — the very notion of “news” — divorces us from the continuity of life and keeps us fixated on the current moment in artificial isolation. Meanwhile, Solnit argues in a poignant parallel, such amnesia poisons and paralyzes our collective conscience by the same mechanism that depression poisons and paralyzes the private psyche — we come to believe that the acute pain of the present is all that will ever be and cease to believe that things will look up. She writes:

There’s a public equivalent to private depression, a sense that the nation or the society rather than the individual is stuck. Things don’t always change for the better, but they change, and we can play a role in that change if we act. Which is where hope comes in, and memory, the collective memory we call history.

A dedicated rower, Solnit ends with the perfect metaphor:

You row forward looking back, and telling this history is part of helping people navigate toward the future. We need a litany, a rosary, a sutra, a mantra, a war chant for our victories. The past is set in daylight, and it can become a torch we can carry into the night that is the future.

Hope in the Dark is a robust anchor of intelligent idealism amid our tumultuous era of disorienting defeatism — a vitalizing exploration of how we can withstand the marketable temptations of false hope and easy despair. Complement it with Camus on how to ennoble our minds in dark times and Viktor Frankl on why idealism is the best realism, then revisit Solnit on the rewards of walking, what reading does for the human spirit, and how modern noncommunication is changing our experience of time, solitude, and communion.
