Brain Pickings

Search results for “time travel”

James Gleick on Our Anxiety About Time, the Origin of the Term “Type A,” and the Curious Psychology of Elevator Impatience

“We have reached the epoch of the nanosecond… That is our condition, a culmination of millennia of evolution in human societies, technologies, and habits of mind.”


“Hurrying and delaying are alike ways of trying to resist the present,” Alan Watts observed of the difficult pleasures of presence in the middle of the twentieth century, as the mechanized acceleration of modern life was beginning to take our already aggravated relationship with time to new frontiers of frustration. I thought of him one November morning shortly after I moved to New York when, already overwhelmed by the city’s pace, I swiped my brand new subway card at the turnstile and confidently marched through, only to jam my hips into the immobile metal rod. Puzzled, I looked over to the tiny primitive screen above the turnstile, which chided me coldly in cyan electronic letters: “SWIPE FASTER.” Just these two words, stern and commanding — no “PLEASE,” not even “TRY TO.” In the world’s fastest-paced city, even the mindless machines are temporally judgmental and make sure you remain on par.

Our leap into temporal expediency has had several pivotal launching pads since Galileo laid the groundwork for modern timekeeping and set into motion our forward-lurching momentum. In the nineteenth century, the railroads and then motion pictures catalyzed “the annihilation of space and time.” By 1912, a satirical children’s book mocked a man who failed to rise from bed fast and early enough as “a stupid guy.” When Bertrand Russell wrote in 1918 that “both in thought and in feeling, even though time be real, to realise the unimportance of time is the gate of wisdom,” it was both a valediction to an era freer of illusory urgencies and a foreboding of our own epoch, in which the tyranny of time has rendered us incapable of distinguishing between urgency and importance.

Illustration by Peter Newell from The Rocket Book, 1912

Science was already hijacking time from the domain of metaphysics and fomenting the popular imagination with its rush of discoveries, so when Einstein and Bergson sat down for their famous debate in 1922, the moment was ripe to forever change our experience of time. (It may be a coincidence, but it is nonetheless an emblematic one, that 1955 was both the year Einstein died and the year scientists concretized the second itself by ceasing to tinker with its length, until then defined as 1/86,400 of the mutable duration of a real day.)

The impact of these and related developments on society and the human psyche is what the inimitable James Gleick explores in Faster: The Acceleration of Just About Everything (public library) — a book written nearly two decades ago that has not only stood the test of time but has grown all the more perceptive and prescient in the years since.

Half a century after German philosopher Josef Pieper argued that leisure is the basis of culture and the root of human dignity, Gleick writes:

We are in a rush. We are making haste. A compression of time characterizes the life of the century.

[…]

We have a word for free time: leisure. Leisure is time off the books, off the job, off the clock. If we save time, we commonly believe we are saving it for our leisure. We know that leisure is really a state of mind, but no dictionary can define it without reference to passing time. It is unrestricted time, unemployed time, unoccupied time. Or is it? Unoccupied time is vanishing. The leisure industries (an oxymoron maybe, but no contradiction) fill time, as groundwater fills a sinkhole. The very variety of experience attacks our leisure as it attempts to satiate us. We work for our amusement.

[…]

Sociologists in several countries have found that increasing wealth and increasing education bring a sense of tension about time. We believe that we possess too little of it: that is a myth we now live by.

Illustration by Vahram Muratyan from About Time, a minimalist illustrated meditation on our fraught relationship with time

To fully appreciate Gleick’s insightful prescience, it behooves us to remember that he is writing long before the social web as we know it, before the conspicuous consumption of “content” became the currency of the BuzzMalnourishment industrial complex, before the timelines of Twitter and Facebook came to dominate our record and experience of time. (Prescience, of course, is a form of time travel — perhaps our only nonfictional way to voyage into the future.) Gleick writes:

We live in the buzz. We wish to live intensely, and we wonder about the consequences — whether, perhaps, we face the biological dilemma of the waterflea, whose heart beats faster as the temperature rises. This creature lives almost four months at 46 degrees Fahrenheit but less than one month at 82 degrees.

[…]

Yet we have made our choices and are still making them. We humans have chosen speed and we thrive on it — more than we generally admit. Our ability to work fast and play fast gives us power. It thrills us… No wonder we call sudden exhilaration a rush.

Gleick considers what our units of time reveal about our units of thought:

We have reached the epoch of the nanosecond. This is the heyday of speed. “Speed is the form of ecstasy the technical revolution has bestowed on man,” laments the Czech novelist Milan Kundera, suggesting by ecstasy a state of simultaneous freedom and imprisonment… That is our condition, a culmination of millennia of evolution in human societies, technologies, and habits of mind.

[…]

Particle physicists may freeze a second, open it up, and explore its dappled contents like surgeons pawing through an abdomen, but in real life, when events occur within thousandths of a second, our minds cannot distinguish past from future. What can we grasp in a nanosecond — a billionth of a second? … Within the millisecond, the bat presses against the ball; a bullet finds time to enter a skull and exit again; a rock plunges into a still pond, where the unexpected geometry of the splash pattern pops into existence. During a nanosecond, balls, bullets, and droplets are motionless.

Illustration from Just a Second by Steve Jenkins, a children’s book about what takes place on Earth in a single second

If the nanosecond seems too negligible to matter, it is only because we are fundamentally blinded by the biological limits of our perception. (We are, for instance, only just beginning to understand the monumental importance of the microbiome, imperceptible to the naked eye yet crucial to nearly every aspect of our bodily existence.) In 1849, when trailblazing astronomer Maria Mitchell became the first woman hired by the U.S. federal government for a non-domestic specialized skill, she labored as a “computer of Venus” — a sort of one-woman GPS, performing mathematically rigorous celestial calculations to help sailors navigate the globe. The nanosecond was still decades away from being measured and named, so her calculations, however adroit, were crude by modern standards. Today, as Gleick points out, an error of one nanosecond translates into a misplacement by one foot in modern GPS systems. This means that just a dozen nanoseconds can steer you the wrong way altogether.
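
To see where Gleick’s figure comes from, here is a minimal back-of-the-envelope sketch (in Python, purely for illustration): GPS receivers infer position from signal travel times, so a timing error translates into a ranging error of roughly the speed of light multiplied by that error. The function name and the printed numbers are mine, not Gleick’s.

C = 299_792_458          # speed of light in vacuum, meters per second
FEET_PER_METER = 3.28084

def ranging_error_feet(timing_error_ns):
    # Distance error (in feet) implied by a timing error given in nanoseconds.
    return timing_error_ns * 1e-9 * C * FEET_PER_METER

print(ranging_error_feet(1))   # about 0.98 feet: Gleick's "one foot per nanosecond"
print(ranging_error_feet(12))  # about 11.8 feet for a dozen nanoseconds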

But perhaps the most striking illustration of just how frantically we’ve fragmented time and how insistently we’ve imbued the fragments with restlessness comes from an unlikely source — a mid-century social science study published in 1959 under the title “Association of Specific Overt Behavior Pattern with Blood and Cardiovascular Findings,” the validity of which has since failed to hold up against scientific scrutiny but the linguistic legacy of which has only grown in the half-century since: In addition to originating the notion of “hurry sickness,” this study also coined the term “Type A,” which has since planted itself firmly and anxiously in our collective conscience.

Gleick writes:

This magnificently bland coinage, put forward by a pair of California cardiologists in 1959, struck a collective nerve and entered the language. It is a token of our confusion: are we victims or perpetrators of the crime of haste? Are we living at high speed with athleticism and vigor, or are we stricken by hurry sickness?

The cardiologists, Meyer Friedman and Ray Rosenman, listed a set of personality traits which, they claimed, tend to go hand in hand with one another and also with heart disease. They described these traits rather unappealingly, as characteristics about and around the theme of impatience. Excessive competitiveness. Aggressiveness. “A harrying sense of time urgency.” The Type A idea emerged in technical papers and then formed the basis of a popular book and made its way into dictionaries.

The archetypal Type A was a person the researchers called “Paul,” whom they described unambiguously:

A very disproportionate amount of his emotional energy is consumed in struggling against the normal constraints of time. “How can I move faster, and do more and more things in less and less time?” is the question that never ceases to torment him. Paul hurries his thinking, his speech and his movements. He also strives to hurry the thinking, speech, and movements of those about him; they must communicate rapidly and relevantly if they wish to avoid creating impatience in him. Planes must arrive and depart precisely on time for Paul, cars ahead of him on the highway must maintain a speed he approves of, and there must never be a queue of persons standing between him and a bank clerk, a restaurant table, or the interior of a theater. In fact, he is infuriated whenever people talk slowly or circuitously, when planes are late, cars dawdle on the highway, and queues form.

Art by Lisbeth Zwerger for a special edition of Alice’s Adventures in Wonderland

The study ultimately didn’t live up to its hypothesis that a Type A personality predisposes to heart disease — the researchers failed to account for various confounds, including the fact that patients in Group A drank, smoked, and ate more than those in Group B. But what it didn’t prove in science it proved in society — the need for a term that confers validity upon an experience so prevalent and so intimately familiar to so many. (In her beautiful essay on language and creativity, the poet Jane Hirshfield has written about how, through the language of poetic image, “something previously unformulated (in the most literal sense) comes into the realm of the expressed” until we begin to feel that without its existence “the world’s store of truth would be diminished.”) Gleick writes:

If the Type A phenomenon made for poor medical research, it stands nonetheless as a triumph of social criticism. Some of us yield more willingly to impatience than others, but on the whole Type A is who we are—not just the coronary-prone among us, but all of us, as a society and as an age. No wonder the concept has proven too rich a cultural totem to be dismissed. We understand it. We know it when we see it. Type A people walk fast and eat fast. They finish your sentences for you. They feel guilty about relaxing. They try to do two or more things at once…

Perhaps the most perfect place to study the psychological machinery of the Type A person is the elevator — the social life of small urban spaces, on steroids; a supreme testing ground for our terror of idleness, once celebrated as a virtue and now reviled as a sin; the ultimate petri dish for the contagion of hurry sickness (for, lest we forget, the elevator is a prime environment for groupthink). Gleick explains:

Among the many aggravators of Type A-ness in modern life, elevators stand out. By its very nature, elevatoring — short-range vertical transportation, as the industry calls it — is a pressure-driven business. Although there are still places on earth where people live full lives without ever seeing an elevator, the Otis Elevator Company estimates that its cars raise and lower the equivalent of the planet’s whole population every nine days. This is a clientele that dislikes waiting.

Illustration by André François from Little Boy Brown by Isobel Harris, 1949

Gleick cites a curious and revealing passage from a 1979 report by Otis researchers studying elevator behavior:

Waiting, some stand still, others pace, and another may make small gestures of impatience such as foot tapping, jiggling change in a pocket, scanning the walls and ceiling with apparent concentration… At intervals, nearly everyone regards the elevator location display above the doors by tipping their head slightly back and raising their eyes… Men, but hardly ever women, may rock gently back and forth…

The long silences, the almost library hush, that we can observe where people wait for elevators are not only what they seem… The longer the silence the more likely one or more of us will become slightly embarrassed… the more embarrassing and tense are the little interior dramas that we play out each within our own theater of projection…

The actual period of waiting that elapses before a particular group may feel that waiting has become a nearly unendurable torment will probably vary significantly with the composition of the group, the time of day, and the type of building in which they are traveling… The wait is hardly ever long, however much the subjective experience may stretch it out.

What makes the elevator so upsetting to the Type A person is that it forces upon us perpetually moving moderns the anxiety of stillness, a punishing counterpoint to the self-elected exhilaration of speed. An interesting, if discomfiting, thing to consider: At the time of Gleick’s writing, elevator riders tended to fill that anxious space of time with bodily fidgeting, occasional small talk, and no doubt large quantities of quiet inner rage; today, the average elevator is filled with people hunched over their devices, heads bent, looking like a congregation of mourners — an alarmingly apt image, for we are now irreversibly bereaved of that bygone era of innocent fidget-filled idleness, unburdened by the tyrannical impulse for productivity. We no longer allow ourselves boredom, that crucible of creativity, even in the elevator.

Building engineers have long tried to address the collective malady of elevator impatience — a problem only exacerbated as buildings grow taller and taller, requiring a greater number of elevators to prevent infuriating elevator traffic jams. For a while, a fanciful solution gained traction: A pressurized “sky lobby” — a transit point in a skyscraper, wherein an air lock repressurizes elevator passengers before they plunge into a rapid descent. But as abstract and at times illusory as time may seem, it grounds us mercilessly into the creaturely reality of our biology, which put an end to the sky lobby idea. Gleick writes:

One small problem resists solution. Evolution neglected to armor the human eardrum against the sudden change in air pressure that comes with a fall of hundreds of feet at high speed. Natural selection rarely had the opportunity to work with survivors of this experience, to fine-tune their eustachian tubes in preparation for vertical transport. So at mid-century, when Frank Lloyd Wright designed a mile-high tower with 528 stories, helicopter landing pads, and quintuple-deck elevators running on atomic power, airline pilots instantly wrote to alert him to the impracticality. The age of high-altitude passenger aviation was just beginning, and the pilots knew that elevators descending thousands of feet within a minute or two would subject their passengers to severe inner-ear pain. Sure enough, decades later, the Sears Tower in Chicago had to slow its observation-deck elevators because at least one passenger had complained of a broken eardrum — an extreme manifestation of hurry sickness.

What remained was the low-tech solution of manipulating the psychology of human impatience, most palpably triggered by what engineers call “door dwell” — the amount of time it takes the elevator doors to automatically close after making a stop on a given floor, programmed to last anywhere between two and four seconds. There is, of course, a way to override the automatic door dwell and win back, as it were, some of those precious blinks: the “DOOR CLOSE” button — a Type A favorite and typically the most worn out one in elevators, for people press it compulsively and repeatedly despite the negligible time-saving benefits and the knowledge that pushing it three times in antsy succession is no more effective than pushing it once. Gleick considers the curious compulsion of poking this seductive yet temporally impotent button:

Although elevators leave the factory with all their functions ready to work, the manufacturers realize that building managers often choose to disable DOOR CLOSE. Buildings fear trapped limbs and lawsuits. Thus they turn their resident populations into subjects in a Pavlovian experiment in negative feedback. The subjects hunger for something even purer than food: speed.

[…]

How many times will you continue to press a button that does nothing? Do you press elevator call buttons that are already lighted — despite your suspicion that, once the button has been pressed, no amount of further attention will hasten the car’s arrival? Your suspicion is accurate. The computers could instruct elevators to give preference to floors with many calls. But elevator engineers know better than to provide any greater incentive than already exists for repeated pressing of the button. They remember Pavlov. They know what happens to those dogs.

Gleick’s Faster is immeasurably insightful in its entirety and often strikingly prophetic. Complement it with German psychologist Marc Wittmann on the psychology of time, physicist Paul Davies on why we experience it as linear, and T.S. Eliot’s timeless ode to the nature of time, then revisit Gleick on the source of Richard Feynman’s genius and the story behind Newton’s “standing on the shoulders of giants” metaphor.


The Annihilation of Space and Time: Rebecca Solnit on How Muybridge Froze the Flow of Existence, Shaped Visual Culture, and Changed Our Consciousness

“Before, every face, every place, every event, had been unique, seen only once and then lost forever among the changes of age, light, time. The past existed only in memory and interpretation, and the world beyond one’s own experience was mostly stories.”


The great Russian filmmaker Andrei Tarkovsky described the art of cinema as “sculpting in time,” asserting that people go to the movies because they long to experience “time lost or spent or not yet had.” A century earlier, the English photographer Eadweard Muybridge (April 9, 1830–May 8, 1904) exposed the bedrock of time and devised the first chisel for its sculpting in his pioneering photographic studies of motion, which forever changed the modern world — not only by ushering in a technological revolution the effects of which permeate and even dictate our daily lives today, but also, given how bound up in space and time our thinking ego is, by transforming our very consciousness. For the very first time, Muybridge’s motion studies captured what T.S. Eliot would later call “the still point of the turning world.”

With her unparalleled intellectual elegance and poetic prose, Rebecca Solnit tells the story of that transformation in River of Shadows: Eadweard Muybridge and the Technological Wild West (public library).

Eadweard Muybridge: The Horse in Motion

Solnit frames the impact of the trailblazing experiments Muybridge conducted in the spring of 1872, when he first photographed a galloping horse:

[Muybridge] had captured aspects of motion whose speed had made them as invisible as the moons of Jupiter before the telescope, and he had found a way to set them back in motion. It was as though he had grasped time itself, made it stand still, and then made it run again, over and over. Time was at his command as it had never been at anyone’s before. A new world had opened up for science, for art, for entertainment, for consciousness, and an old world had retreated farther.

Technology and consciousness, of course, have always shaped one another, perhaps nowhere more so than in our experience of time — from the moment Galileo’s pendulum sparked modern timekeeping to the brutality with which social media timelines beleaguer us with a crushing sense of perpetual urgency. But the 1870s were a particularly fecund era of technological transformation, by Solnit’s perfect definition of technology as “a practice, a technique, or a device for altering the world or the experience of the world.” She writes:

The experience of time was itself changing dramatically during Muybridge’s seventy-four years, hardly ever more dramatically than in the 1870s. In that decade the newly invented telephone and phonograph were added to photography, telegraphy, and the railroad as instruments for “annihilating time and space.”

[…]

The modern world, the world we live in, began then, and Muybridge helped launch it.

[…]

His trajectory ripped through all the central stories of his time — the relationship to the natural world and the industrialization of the human world, the Indian wars, the new technologies and their impact on perception and consciousness. He is the man who split the second, as dramatic and far-reaching an action as the splitting of the atom.

Eadweard Muybridge: Sequenced image of a rotating sulky wheel with self-portrait

Shining a sidewise gleam at just how radically the givens we take for granted have changed since Muybridge’s time, Solnit writes of that era in which a man could shoot his wife’s lover and be acquitted for justifiable homicide:

In the eight years of his motion-study experiments in California, he also became a father, a murderer, and a widower, invented a clock, patented two photographic innovations, achieved international renown as an artist and a scientist, and completed four other major photographic projects.

With the invention of cinema still more than a decade away, Muybridge’s shutters and film development techniques fused cutting-edge engineering and chemistry to produce more and better high-speed photographs than anyone had before. In a sense, Virginia Woolf’s famous complaint about the visual language of cinema — “the eye licks it all up instantaneously, and the brain, agreeably titillated, settles down to watch things happening without bestirring itself to think,” she scoffed in 1926 — was an indictment of this new visual language of time and, indirectly, of Muybridge’s legacy. Had he not rendered time visible and tangible, Bertrand Russell might not have proclaimed that “even though time be real, to realise the unimportance of time is the gate of wisdom”; had his pioneering photography not altered our relationship to the moment, Italo Calvino would not have had to issue his prescient lamentation that “the life that you live in order to photograph it is already, at the outset, a commemoration of itself.”

Eadweard Muybridge: A man standing on his hands from a lying down position

In a testament to the notion that all creative work builds on what came before, Muybridge made significant improvements on the zoetrope — a rotating device, invented in 1834, which creates the illusion of motion by presenting a series of spinning images through a slot. But alongside the practical improvement upon existing technologies, he also built upon larger cultural leaps — most significantly, the rise of the railroads, which compressed space and time unlike anything ever had.

In 1872, the railroad magnate Leland Stanford — who would later co-found Stanford University with his wife, Jane — commissioned Muybridge to study the gaits of galloping and trotting horses in order to determine whether all four feet lifted off the ground at once at any point. Since horses gallop at a speed that outpaces the perception of the human eye, this was impossible to discern without freezing motion into a still image. So began Muybridge’s transformation of time.

Horse in Motion: One of Muybridge’s motion studies commissioned by Stanford

With her penchant for cultural history laced with subtle, perfectly placed political commentary, Solnit traces the common root of Hollywood and Silicon Valley to Muybridge:

Perhaps because California has no past — no past, at least, that it is willing to remember — it has always been peculiarly adept at trailblazing the future. We live in the future launched there.

If one wanted to find an absolute beginning point, a creation story, for California’s two greatest transformations of the world, these experiments with horse and camera would be it. Out of these first lost snapshots eventually came a world-changing industry, and out of the many places where movies are made, one particular place: Hollywood. The man who owned the horse and sponsored the project believed in the union of science and business and founded the university that much later generated another industry identified, like Hollywood, by its central place: Silicon Valley.

It would be impossible to grasp the profound influence Muybridge and his legacy had on culture without understanding how dramatically different the world he was born into was from the one he left. Solnit paints the technological backdrop of his childhood:

Pigeons were the fastest communications technology; horses were the fastest transportation technology; the barges moved at the speed of the river or the pace of the horses that pulled them along the canals. Nature itself was the limit of speed: humans could only harness water, wind, birds, beasts. Born into this almost medievally slow world, the impatient, ambitious, inventive Muybridge would leave it and link himself instead to the fastest and newest technologies of the day.

The first passenger railroad opened on September 15, 1830 — mere months after Muybridge’s birth. As with any technological bubble, the spread of this novelty brought with it an arsenal of stock vocabulary. The notion of “annihilating time and space” became one of the era’s most used, then invariably overused, catchphrases. (In a way, clichés themselves — phrases to which we turn for cognitive convenience, out of a certain impatience with language — are another manifestation of our defiant relationship to time.) Applied first to the railways, the phrase soon spread to the various technological advancements that radiated, directly or indirectly, from them. Solnit writes:

“Annihilating time and space” is what most new technologies aspire to do: technology regards the very terms of our bodily existence as burdensome. Annihilating time and space most directly means accelerating communications and transportation. The domestication of the horse and the invention of the wheel sped up the rate and volume of transit; the invention of writing made it possible for stories to reach farther across time and space than their tellers and stay more stable than memory; and new communications, reproduction, and transportation technologies only continue the process. What distinguishes a technological world is that the terms of nature are obscured; one need not live quite in the present or the local.

[…]

The devices for such annihilation poured forth faster and faster, as though inventiveness and impatience had sped and multiplied too.

Eadweard Muybridge: Running full speed (Animal Locomotion, Plate 62)

But perhaps the most significant impact of the railroads, Solnit argues, was that they began standardizing human experience as goods, people, and their values traveled faster and farther than ever before. In contracting the world, the railways began to homogenize it. And just as society was adjusting to this new mode of relating to itself, another transformative invention bookended the decade: On January 7, 1839, the French artist Louis Daguerre debuted what he called daguerreotypy — a pioneering imaging method that catalyzed the dawn of photography.

With an eye to the era’s European and American empiricism, animated by a “restlessness that regarded the unknown as a challenge rather than a danger,” Solnit writes:

Photography may have been its most paradoxical invention: a technological breakthrough for holding onto the past, a technology always rushing forward, always looking backward.

[…]

Photography was a profound transformation of the world it entered. Before, every face, every place, every event, had been unique, seen only once and then lost forever among the changes of age, light, time. The past existed only in memory and interpretation, and the world beyond one’s own experience was mostly stories… [Now,] every photograph was a moment snatched from the river of time.

The final invention in the decade’s trifecta of technological transformation was the telegraph. Together, these three developments — photography, the railroads, and the telegraph — marked the beginning of our modern flight from presence, which would become the seedbed of our unhappiness over the century that followed. By chance, Muybridge came into the world at the pinnacle of this transformation; by choice, he became instrumental in guiding its course and, in effect, shaping modernity.

Eadweard Muybridge: Cockatoo flying (Animal Locomotion, Plate 758)

Solnit writes:

Before the new technologies and ideas, time was a river in which human beings were immersed, moving steadily on the current, never faster than the speeds of nature — of currents, of wind, of muscles. Trains liberated them from the flow of the river, or isolated them from it. Photography appears on this scene as though someone had found a way to freeze the water of passing time; appearances that were once as fluid as water running through one’s fingers became solid objects… Appearances were permanent, information was instantaneous, travel exceeded the fastest speed of bird, beast, and man. It was no longer a natural world in the sense it always had been, and human beings were no longer contained within nature.

Time itself had been of a different texture, a different pace, in the world Muybridge was born into. It had not yet become a scarce commodity to be measured out in ever smaller increments as clocks acquired second hands, as watches became more affordable mass-market commodities, as exacting schedules began to intrude into more and more activities. Only prayer had been precisely scheduled in the old society, and church bells had been the primary source of time measurement.

Simone Weil once defined prayer as “absolutely unmixed attention,” and perhaps the commodification of time that started in the 1830s was the beginning of the end of our capacity for such attention; perhaps Muybridge was the horseman of our attentional apocalypse.

Eadweard Muybridge: Woman removing mantle

Solnit considers the magnitude of his ultimate impact on our experience of time:

In the spring of 1872 a man photographed a horse. With the motion studies that resulted it was as though he were returning bodies themselves to those who craved them — not bodies as they might daily be experienced, bodies as sensations of gravity, fatigue, strength, pleasure, but bodies become weightless images, bodies dissected and reconstructed by light and machine and fantasy.

[…]

What they had lost was solid; what they gained was made out of air. That exotic new world of images speeding by would become the true home of those who spent their Saturdays watching images beamed across the darkness of the movie theater, then their evenings watching images beamed through the atmosphere and brought home into a box like a camera obscura or a crystal ball, then their waking hours surfing the Internet wired like the old telegraph system. Muybridge was a doorway, a pivot between that old world and ours, and to follow him is to follow the choices that got us here.

In the remainder of her rich and revelatory River of Shadows, Solnit goes on to follow the Rube Goldberg trajectory of these choices, linking Muybridge and his legacy to aspects of our daily lives ranging from the internet to how we inhabit our bodies. Complement it with Susan Sontag on the aesthetic consumerism of photography, then revisit Solnit on how to nurture our hope in times of despair, the rewards of walking, what reading does for the human spirit, and how modern noncommunication is changing our experience of time, solitude, and communion.


A Madman Dreams of Tuning Machines: The Story of Joseph Weber, the Tragic Hero of Science Who Followed Einstein’s Vision and Pioneered the Sound of Spacetime

…and a remarkable letter from Freeman Dyson on the difficult, necessary art of changing one’s mind.


In the wake of his groundbreaking 1915 theory of general relativity, Albert Einstein envisioned gravitational waves — ripples in the fabric of spacetime caused by astronomic events of astronomical energy. Although fundamental to our understanding of the universe, gravitational waves were a purely theoretical construct for him. He lived in an era when any human-made tool for detecting something this far away was simply unimaginable, even by the greatest living genius, and many of the cosmic objects capable of producing such tremendous tumult — black holes, for instance — were yet to be discovered.

One September morning in 2015, almost exactly a century after Einstein published his famous paper, scientists turned his mathematical dream into a tangible reality — or, rather, an audible one.


The Laser Interferometer Gravitational-Wave Observatory — an enormous international collaboration known as LIGO, consisting of two massive listening instruments 3,000 kilometers apart, decades in the making — recorded the sound of a gravitational wave produced by two mammoth black holes that had collided more than a billion years ago, more than a billion light-years away.

One of the most significant discoveries in the history of science, this landmark event introduces a whole new modality of curiosity in our quest to know the cosmos, its thrill only amplified by the fact that we had never actually seen black holes before hearing them. Nearly everything we know about the universe today, we know through five centuries of optical observation of light and particles. Now begins a new era of sonic exploration. Turning an inquisitive ear to the cosmos might, and likely will, revolutionize our understanding of it as radically as Galileo did when he first pointed his telescope at the skies.

In Black Hole Blues and Other Songs from Outer Space (public library) — one of the finest and most beautifully written books I’ve ever read, which I recently reviewed for The New York Times — astrophysicist and novelist Janna Levin tells the story of LIGO and its larger significance as a feat of science and the human spirit. Levin, a writer who bends language with effortless might and uses it not only as an instrument of thought but also as a Petri dish for emotional nuance, probes deep into the messy human psychology that animated these brilliant and flawed scientists as they persevered in this ambitious quest against enormous personal, political, and practical odds.

Levin — who has written beautifully about free will and the relationship between genius and madness — paints the backdrop for this improbable triumph:

Somewhere in the universe two black holes collide — as heavy as stars, as small as cities, literally black (the complete absence of light) holes (empty hollows). Tethered by gravity, in their final seconds together the black holes course through thousands of revolutions about their eventual point of contact, churning up space and time until they crash and merge into one bigger black hole, an event more powerful than any since the origin of the universe, outputting more than a trillion times the power of a billion Suns. The black holes collide in complete darkness. None of the energy exploding from the collision comes out as light. No telescope will ever see the event.

What nobody could see, LIGO could hear — a sensitive, sophisticated ear pressed to the fabric of spacetime, tuned to what Levin so poetically eulogizes as “the total darkness, the empty space, the vacuity, the great expanse of nothingness, of emptiness, of pure space and time.” She writes of this astonishing instrument:

An idea sparked in the 1960s, a thought experiment, an amusing haiku, is now a thing of metal and glass.

But what makes the book most enchanting is Levin’s compassionate insight into the complex, porous, often tragic humanity undergirding the metal and glass — nowhere more tragic than in the story of Joseph Weber, the controversial pioneer who became the first to bring Einstein’s equations into the lab. Long before LIGO was even so much as a thought experiment, Weber envisioned and built a very different instrument for listening to the cosmos.

Weber was born Yonah Geber to a family of Lithuanian Jewish immigrants in early-twentieth-century New Jersey. His mother’s heavy accent caused his teacher to mishear the boy’s name as “Joseph,” so he became Joe. After he was hit by a bus at the age of five, young Joe required speech rehabilitation therapy, which replaced his Yiddish accent with a generic American one that led his family to call him “Yankee.” As a teenager, he dropped out of Cooper Union out of concern for his parents’ finances and joined the Navy instead, where he served on an aircraft carrier that was sunk during WWII. When the war ended, he became a microwave engineer and was hired as a professor at the University of Maryland at the then-enviable salary — especially for a 29-year-old — of $6,500 a year.

Eager to do microwave research, he turned to the great physicist George Gamow, who had theorized cosmic microwave background radiation — a thermal remnant of the Big Bang, which would provide unprecedented insight into the origin of the universe and which Weber wanted to dedicate his Ph.D. career to detecting. But Gamow inexplicably snubbed him. Two other scientists eventually discovered cosmic microwave background radiation by accident and received the Nobel Prize for the discovery. Weber then turned to atomic physics and devised the maser — the predecessor of the laser — but, once again, other scientists beat him to the public credit and received a Nobel for that discovery, too.

Levin writes:

Joe’s scientific life is defined by these significant near misses… He was Shackleton many times, almost the first: almost the first to see the big bang, almost the first to patent the laser, almost the first to detect gravitational waves. Famous for nearly getting there.

And that is how Weber got to gravitational waves — a field he saw as so small and esoteric that he stood a chance of finally being the first. Levin writes:

In 1969 Joe Weber announced that he had achieved an experimental feat widely believed to be impossible: He had detected evidence for gravitational waves. Imagine his pride, the pride to be the first, the gratification of discovery, the raw shameless pleasure of accomplishment. Practically single-handedly, through sheer determination, he conceives of the possibility. He fills multiple notebooks, hundreds of pages deep, with calculations and designs and ideas, and then he makes the experimental apparatus real. He builds an ingenious machine, a resonant bar, a Weber bar, which vibrates in sympathy with a gravitational wave. A solid aluminum cylinder about 2 meters long, 1 meter in diameter, and in the range of 3,000 pounds, as guitar strings go, isn’t easy to pluck. But it has one natural frequency at which a strong gravitational wave would ring the bar like a tuning fork.

Joseph Weber with his cylinder

Following his announcement, Weber became an overnight celebrity. His face graced magazine covers. NASA put one of his instruments on the Moon. He received ample laud from peers. Even the formidable J. Robert Oppenheimer, a man of slim capacity for compliments, encouraged him with a remark Weber never forgot: “The work you’re doing,” Oppenheimer told him, “is just about the most exciting work going on anywhere around here.”

Under the spell of this collective excitement, scientists around the world began building replicas of Weber’s cylinder. But one after another, they were unable to replicate his results — the electrifying eagerness to hear gravitational waves was met with the dead silence of the cosmos.

Weber plummeted from grace as quickly as he had ascended. (Einstein himself famously scoffed at the fickle nature of fame.) Levin writes:

Joe Weber’s claims in 1969 to have detected gravitational waves, the claims that catapulted his fame, that made him possibly the most famous living scientist of his generation, were swiftly and vehemently refuted. The subsequent decades offered near total withdrawal of support, both from scientific funding agencies and his peers. He was almost fired from the University of Maryland.

Among Weber’s most enthusiastic initial supporters was the great theoretical physicist Freeman Dyson. Perhaps out of his staunch belief that no question is unanswerable, Dyson had emboldened Weber to attempt what no one had attempted before — to hear the sound of spacetime. But when the evidence against Weber’s data began to mount, Dyson was anguished by a sense of personal responsibility for having encouraged him, so he wrote Weber an extraordinary letter urging him to practice the immensely difficult art of changing one’s mind. Levin quotes the letter, penned on June 5, 1975:

Dear Joe,

I have been watching with fear and anguish the ruin of our hopes. I feel a considerable personal responsibility for having advised you in the past to “stick your neck out.” Now I still consider you a great man unkindly treated by fate, and I am anxious to save whatever can be saved. So I offer my advice again for what it is worth.

A great man is not afraid to admit publicly that he has made a mistake and has changed his mind. I know you are a man of integrity. You are strong enough to admit that you are wrong. If you do this, your enemies will rejoice but your friends will rejoice even more. You will save yourself as a scientist, and you will find that those whose respect is worth having will respect you for it.

I write now briefly because long explanations will not make the message clearer. Whatever you decide, I will not turn my back on you.

With all good wishes,
Yours ever
Freeman

But Weber decided not to heed his friend’s warm caution. His visionary genius coexisted with one of the most unfortunate and most inescapable of human tendencies — our bone-deep resistance to the shame of admitting error. He paid a high price: His disrepute soon veered into cruelty — he was ridiculed and even baited by false data intended to trick him into reaffirming his claims, only to be publicly humiliated all over again. In one of the archival interviews Levin excavates, he laments:

I simply cannot understand the vehemence and the professional jealousy, and why each guy has to feel that he has to cut off a pound of my flesh… Boltzmann committed suicide with this sort of treatment.

Here, I think of Levin’s penchant for celebrating tragic heroes whose posthumous redemption only adds to their tragedy. Her magnificent novel A Madman Dreams of Turing Machines is based on the real lives of computing pioneer Alan Turing, who died by suicide after particularly cruel mistreatment, and mathematician Kurt Gödel, who starved himself to death in the grip of paranoia. Levin’s writing emanates a deep sympathy for those who have fallen victim to some combination of their own fallible humanity and the ferocious inhumanity of unforgiving, bloodthirsty others. No wonder Weber’s story sings to her. A mad man dreams of tuning machines.

Without diminishing the role of personal pathology and individual neurochemistry, given what psychologists know about suicide prevention, social support likely played a vital role in Weber’s ability to withstand the barrage of viciousness — Dyson’s sympathetic succor, but most of all the love of his wife, the astronomer Virginia Trimble, perhaps the most unambivalently likable character in the book. Levin writes:

She called him Weber and he called her Trimble. They married in March 1972 after a cumulative three weekends together. She laughs. “Weber never had any trouble making up his mind.” Twenty-three years her senior, he always insisted she do what she wanted and needed to do. Perhaps trained in part by his first wife, Anita, a physicist who took a protracted break to raise their four boys, the widower had no reservations about Virginia’s work, her independence, or her IQ. (Stratospheric. In an issue of Life magazine with a now-vintage cover, in an article titled “Behind a Lovely Face, a 180 I.Q.” about the then eighteen-year-old astrophysics major, she is quoted as classifying the men she dates into three types: “Guys who are smarter than I am, and I’ve found one or two. Guys who think they are— they’re legion. And those who don’t care.”)

Behind a Lovely Face, a 180 I.Q.: Virginia Trimble in LIFE magazine, 1962

Trimble was the second woman ever allowed at the famed Palomar Observatory, a year after pioneering astronomer Vera Rubin broke the optical-glass ceiling by becoming the first to observe there. Levin, whose subtle kind-natured humor never fails to delight, captures Trimble’s irreverent brilliance:

In her third year, having demonstrated her tenacity — particularly manifest in the fact that she still hadn’t married, she suspects — she was awarded a fellowship from the National Science Foundation. When she arrived at Caltech, she was delighted. “I thought, ‘Look at all of these lovely men.’” In her seventies, with her coral dress, matching shoes and lip color, Moon earrings, and gold animal-head ring, she beams. Still a lovely face. And still an IQ of 180.

This fierce spirit never left Trimble. Now in her seventies, she tells Levin:

When I fell and broke my hip last September, I spent four days on the floor of my apartment singing songs and reciting poetry until I was found.

It isn’t hard to see why Weber — why anyone — would fall in love with Trimble. But although their love sustained him and he didn’t take his own life, he suffered an end equally heartbreaking.

By the late 1980s, Weber had submerged himself even deeper into the quicksand of his convictions, stubbornly trying to prove that his instrument could hear the cosmos. For the next twenty years, he continued to operate his own lab funded out of pocket — a drab concrete box in the Maryland woods, where he was both head scientist and janitor. Meanwhile, LIGO — a sophisticated instrument that would eventually cost more than $1 billion total, operated by a massive international team of scientists — was gathering momentum nearby, thanks largely to the scientific interest in gravitational astronomy that Weber’s early research had sparked.

He was never invited to join LIGO. Trimble surmises that even if he had been, he would’ve declined.

One freezing winter morning in 2000, just as LIGO’s initial detectors were being built, 81-year-old Weber went to clean his lab, slipped on the ice in front of the building, hit his head, and fell unconscious. He was found two days later and taken to the E.R., but he never recovered. He died at the hospital several months later from the lymphoma he’d been battling. The widowed Trimble extracts from her husband’s tragedy an unsentimental parable of science — a testament to the mismatch between the time-scale of human achievement, with all the personal glory it brings, and that of scientific progress:

Science is a self-correcting process, but not necessarily in one’s own lifetime.

When the LIGO team published the official paper announcing the groundbreaking discovery, Weber was acknowledged as the pioneer of gravitational wave research. But like Alan Turing’s posthumous pardon, granted by the Queen more than half a century after he perished by inhumane injustice, Weber’s redemption is culturally bittersweet at best. I’m reminded of a beautiful passage from Levin’s novel about Turing and Gödel, strangely perfect in the context of Weber’s legacy:

Their genius is a testament to our own worth, an antidote to insignificance; and their bounteous flaws are luckless but seemingly natural complements, as though greatness can be doled out only with an equal measure of weakness… Their broken lives are mere anecdotes in the margins of their discoveries. But then their discoveries are evidence of our purpose, and their lives are parables on free will.

Free will, indeed, is what Weber exercised above all — he lived by it and died by it. In one of the interviews Levin unearths, he reflects from the depths of his disrepute:

If you do science the principal reason to do it is because you enjoy it and if you don’t enjoy it you shouldn’t do it, and I enjoy it. And I must say I’m enjoying it… That’s the best you can do.

At the end of the magnificent and exceptionally poetic Black Hole Blues, the merits of which I’ve extolled more fully here, Levin offers a wonderfully lyrical account of LIGO’s triumph as she peers into the furthest reaches of the spacetime odyssey that began with Einstein, gained momentum with Weber, and is only just beginning to map the course of human curiosity across the universe:

Two very big stars lived in orbit around each other several billion years ago. Maybe there were planets around them, although the two-star system might have been too unstable or too simple in composition to accommodate planets. Eventually one star died, and then the other, and two black holes formed. They orbited in darkness, probably for billions of years before that final 200 milliseconds when the black holes collided and merged, launching their loudest gravitational wave train into the universe.

The sound traveled to us from 1.4 billion light-years away. One billion four hundred million light-years.

[…]

We heard black holes collide. We’ll point to where the sound might have come from, to the best of our abilities, a swatch of space from an earlier epoch. Somewhere in the southern sky, pulling away from us with the expansion of the universe, the big black hole will roll along its own galaxy, dark and quiet until something wanders past, an interstellar dust cloud or an errant star. After a few billion years the host galaxy might collide with a neighbor, tossing the black hole around, maybe toward a supermassive black hole in a growing galactic center. Our star will die. The Milky Way will blend with Andromeda. The record of this discovery along with the wreckage of our solar system will eventually fall into black holes, as will everything else in the cosmos, the expanding space eventually silent, and all the black holes will evaporate into oblivion near the end of time.



Nietzsche on Dreams as an Evolutionary Time Machine for the Human Brain

“Dreams carry us back to the earlier stages of human culture and afford us a means of understanding it more clearly.”


We spend a third of our lives in a parallel nocturnal universe and the half-imagined, half-remembered experiences we have there are in constant dynamic interaction with our waking selves. Our nightly dreams are both fragmentary reflections of our conscious lives, rearranged into barely recognizable mosaics by our unconscious, and potent agents of psychic transmutation — a powerful dream can cast an unshakable mood over the wakeful hours, or even days, that follow it. Science is only just beginning to shed light on the role of dreams in memory consolidation and mood, but their nature and purpose remain largely a mystery. “We feel dreamed by someone else, a sleeping counterpart,” the poet Mark Strand wrote in his beautiful ode to dreams.

Friedrich Nietzsche (October 15, 1844–August 25, 1900) saw this sleeping counterpart as our link to primitive humanity — an atavistic remnant of the pre-rational human mind. Nearly two decades before Freud’s seminal treatise on dreams, Nietzsche explored the mystique of the nocturnal unconscious in a portion of Human, All Too Human: A Book for Free Spirits (public library | free ebook) — his altogether terrific 1879 inquiry into how we become who we are.

In a section on dreams and civilization, he writes:

In the dream … we have the source of all metaphysic. Without the dream, men would never have been incited to an analysis of the world. Even the distinction between soul and body is wholly due to the primitive conception of the dream, as also the hypothesis of the embodied soul, whence the development of all superstition, and also, probably, the belief in god. “The dead still live: for they appear to the living in dreams.” So reasoned mankind at one time, and through many thousands of years.

Therein lies Nietzsche’s most intriguing point: Sleep, he suggests, is a kind of evolutionary time machine — a portal to the primitive past of our sensemaking instincts. He paints the sleeping brain as a blunt Occam’s Razor — in seeking out the simplest explanations for our daily confusions, it ends up succumbing to the simplistic. This, Nietzsche argues, is how superstitions and religious mythologies may have originated:

The function of the brain which is most encroached upon in slumber is the memory; not that it is wholly suspended, but it is reduced to a state of imperfection as, in primitive ages of mankind, was probably the case with everyone, whether waking or sleeping. Uncontrolled and entangled as it is, it perpetually confuses things as a result of the most trifling similarities, yet in the same mental confusion and lack of control the nations invented their mythologies, while nowadays travelers habitually observe how prone the savage is to forgetfulness, how his mind, after the least exertion of memory, begins to wander and lose itself until finally he utters falsehood and nonsense from sheer exhaustion. Yet, in dreams, we all resemble this savage. Inadequacy of distinction and error of comparison are the basis of the preposterous things we do and say in dreams, so that when we clearly recall a dream we are startled that so much idiocy lurks within us. The absolute distinctness of all dream-images, due to implicit faith in their substantial reality, recalls the conditions in which earlier mankind were placed, for whom hallucinations had extraordinary vividness, entire communities and even entire nations laboring simultaneously under them. Therefore: in sleep and in dream we make the pilgrimage of early mankind over again.

Illustration by Tom Seidmann-Freud, Freud’s cross-dressing niece, from a philosophical 1922 children’s book about dreams

Just as the dreaming self contains vestiges of every self we’ve inhabited since childhood, to be resurrected in sleep, so, Nietzsche argues, the dreaming brain contains vestiges of the primitive stages of the human brain, when our cognitive capacity for problem-solving was far more limited and unmoored from critical thinking. Nearly a century before modern scientists came to study what actually happens to the brain and body while we sleep, he writes:

Everyone knows from experience how a dreamer will transform one piercing sound, for example, that of a bell, into another of quite a different nature, say, the report of cannon. In his dream he becomes aware first of the effects, which he explains by a subsequent hypothesis and becomes persuaded of the purely conjectural nature of the sound. But how comes it that the mind of the dreamer goes so far astray when the same mind, awake, is habitually cautious, careful, and so conservative in its dealings with hypotheses? Why does the first plausible hypothesis of the cause of a sensation gain credit in the dreaming state? (For in a dream we look upon that dream as reality, that is, we accept our hypotheses as fully established). I have no doubt that as men argue in their dreams to-day, mankind argued, even in their waking moments, for thousands of years: the first causa, that occurred to the mind with reference to anything that stood in need of explanation, was accepted as the true explanation and served as such… In the dream this atavistic relic of humanity manifests its existence within us, for it is the foundation upon which the higher rational faculty developed itself and still develops itself in every individual. Dreams carry us back to the earlier stages of human culture and afford us a means of understanding it more clearly.

Illustration by Judith Clay from Thea’s Tree

Nietzsche considers the cognitive machinery of this dreamsome deduction:

If we close our eyes the brain immediately conjures up a medley of impressions of light and color, apparently a sort of imitation and echo of the impressions forced in upon the brain during its waking moments. And now the mind, in co-operation with the imagination, transforms this formless play of light and color into definite figures, moving groups, landscapes. What really takes place is a sort of reasoning from effect back to cause.

[…]

The imagination is continually interposing its images inasmuch as it participates in the production of the impressions made through the senses day by day: and the dream-fancy does exactly the same thing — that is, the presumed cause is determined from the effect and after the effect: all this, too, with extraordinary rapidity, so that in this matter, as in a matter of jugglery or sleight-of-hand, a confusion of the mind is produced and an after effect is made to appear a simultaneous action, an inverted succession of events, even. — From these considerations we can see how late strict, logical thought, the true notion of cause and effect must have been in developing, since our intellectual and rational faculties to this very day revert to these primitive processes of deduction, while practically half our lifetime is spent in the super-inducing conditions.

In a sentiment that calls to mind Polish Nobel laureate Wislawa Szymborska’s wonderful notion that the poet is “the spiritual heir of primitive humanity,” Nietzsche adds:

Even the poet, the artist, ascribes to his sentimental and emotional states causes which are not the true ones. To that extent he is a reminder of early mankind and can aid us in its comprehension.

Human, All Too Human is a spectacular read in its totality. Complement this particular portion with the science of how REM sleep regulates our negative moods and the psychology of dreams and why we have nightmares, then revisit Nietzsche on the power of music, how to find yourself, why a fulfilling life requires embracing rather than running from difficulty, and his ten rules for writers.


