Brain Pickings

Posts Tagged ‘Galileo’

20 OCTOBER, 2014

The Hummingbird Effect: How Galileo Invented Time and Gave Rise to the Modern Tyranny of the Clock

How the invisible hand of the clock powered the Industrial Revolution and sparked the Information Age.

While we may appreciate them in the abstract, few of us pause to grasp the miracles of modern life, from artificial light to air conditioning, or, as Steven Johnson puts it in the excellent How We Got to Now: Six Innovations That Made the Modern World (public library), “how amazing it is that we drink water from a tap and never once worry about dying forty-eight hours later from cholera.” Understanding how these everyday marvels first came to be, then came to be taken for granted, not only allows us to see our familiar world with new eyes — something we are wired not to do — but also lets us appreciate the remarkable creative lineage behind even the most mundane of technologies underpinning modern life. Johnson writes in the introduction:

Our lives are surrounded and supported by a whole class of objects that are enchanted with the ideas and creativity of thousands of people who came before us: inventors and hobbyists and reformers who steadily hacked away at the problem of making artificial light or clean drinking water so that we can enjoy those luxuries today without a second thought, without even thinking of them as luxuries in the first place… We are indebted to those people every bit as much as, if not more than, we are to the kings and conquerors and magnates of traditional history.

Johnson points out that, much like the evolution of bees gave flowers their colors and the evolution of pollen altered the design of the hummingbird’s wings, the most remarkable thing about innovations is the way they precipitate unanticipated changes that reverberate far and wide beyond the field or discipline or problem at the epicenter of the particular innovation. Pointing to the Gutenberg press — itself already an example of the combinatorial nature of creative breakthroughs — Johnson writes:

Johannes Gutenberg’s printing press created a surge in demand for spectacles, as the new practice of reading made Europeans across the continent suddenly realize that they were farsighted; the market demand for spectacles encouraged a growing number of people to produce and experiment with lenses, which led to the invention of the microscope, which shortly thereafter enabled us to perceive that our bodies were made up of microscopic cells. You wouldn’t think that printing technology would have anything to do with the expansion of our vision down to the cellular scale, just as you wouldn’t have thought that the evolution of pollen would alter the design of a hummingbird’s wing. But that is the way change happens.

Johnson terms these complex chains of influences the “hummingbird effect,” named after the “butterfly effect” concept from chaos theory — Edward Lorenz’s famous metaphor for the idea that a change as imperceptible as the flap of a butterfly’s wings can result in an effect as grand as a hurricane far away several weeks later — but different in a fundamental way:

The extraordinary (and unsettling) property of the butterfly effect is that it involves a virtually unknowable chain of causality; you can’t map the link between the air molecules bouncing around the butterfly and the storm system brewing in the Atlantic. They may be connected, because everything is connected on some level, but it is beyond our capacity to parse those connections or, even harder, to predict them in advance. But something very different is at work with the flower and the hummingbird: while they are very different organisms, with very different needs and aptitudes, not to mention basic biological systems, the flower clearly influences the hummingbird’s physiognomy in direct, intelligible ways.

Under the “hummingbird effect,” an innovation in one field can trigger unexpected breakthroughs in wholly different domains, but the traces of those original influences often remain obscured. Illuminating them allows us to grasp the many dimensions of change, its complex and often unintended consequences, the multiple scales of experience that have always defined human history and, perhaps above all, to lend much-needed dimension to the flat myth of genius. Playing off the sentiment at the heart of Richard Feynman’s famous ode to a flower, Johnson writes:

History happens on the level of atoms, the level of planetary climate change, and all the levels in between. If we are trying to get the story right, we need an interpretative approach that can do justice to all those different levels.

[...]

There is something undeniably appealing about the story of a great inventor or scientist — Galileo and his telescope, for instance — working his or her way toward a transformative idea. But there is another, deeper story that can be told as well: how the ability to make lenses also depended on the unique quantum mechanical properties of silicon dioxide and on the fall of Constantinople. Telling the story from that long-zoom perspective doesn’t subtract from the traditional account focused on Galileo’s genius. It only adds.

Nundinal calendar, Rome. The ancient Etruscans developed an eight-day market week, known as the nundinal cycle, around the eighth or seventh century BC.

In fact, of the six such widely reverberating innovations that Johnson highlights, the one sparked by Galileo is the most fascinating because it captures so many dimensions of our eternal and eternally bedeviled relationship with time — our astoundingly elastic perception of it, the way it dictates our internal rhythms and our creative routines, its role in free will, and much more. Johnson tells an absorbing origin story the way only he can:

Legend has it that in 1583, a nineteen-year-old student at the University of Pisa attended prayers at the cathedral and, while daydreaming in the pews, noticed one of the altar lamps swaying back and forth. While his companions dutifully recited the Nicene Creed around him, the student became almost hypnotized by the lamp’s regular motion. No matter how large the arc, the lamp appeared to take the same amount of time to swing back and forth. As the arc decreased in length, the speed of the lamp decreased as well. To confirm his observations, the student measured the lamp’s swing against the only reliable clock he could find: his own pulse.

The swinging altar lamp inside the Duomo of Pisa

That teenager, of course, was Galileo. Johnson explains the significance of that mythic moment:

That Galileo was daydreaming about time and rhythm shouldn’t surprise us: his father was a music theorist and played the lute. In the middle of the sixteenth century, playing music would have been one of the most temporally precise activities in everyday culture. (The musical term “tempo” comes from the Italian word for time.) But machines that could keep a reliable beat didn’t exist in Galileo’s age; the metronome wouldn’t be invented for another few centuries. So watching the altar lamp sway back and forth with such regularity planted the seed of an idea in Galileo’s young mind. As is so often the case, however, it would take decades before the seed would blossom into something useful.

'Portrait of Galileo Galilei' by Justus Sustermans, 1636

Indeed, Galileo’s Mass experience stands as a spectacular testament to the usefulness of useless knowledge. Over the next two decades, he busied himself with becoming a professor of mathematics, tinkering with telescopes, and, as Johnson aptly puts it, “more or less inventing modern science” (and withstanding the pushback). And yet he kept the image of that swinging altar lamp on the back-burner of his mind. Eventually, as he grew increasingly enchanted with motion and dynamics, he decided to build a pendulum that would simulate what he had observed that distant day at the cathedral. His discovery confirmed his intuition — what determined the time it took the pendulum to swing wasn’t the size of the arc or the weight of the object, but merely the length of the string. Johnson cites Galileo’s excited letter to his peer Giovanni Battista Baliani:

The marvelous property of the pendulum is that it makes all its vibrations, large or small, in equal times.

Galileo's sketches for the pendulum clock
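For the technically curious, Galileo’s “marvelous property” corresponds to the modern small-angle formula for a simple pendulum, T = 2π√(L/g), a result formalized long after him and only approximately true for wide arcs. Here is a minimal sketch, drawing on standard physics rather than on anything in Johnson’s book:

```python
import math

def pendulum_period(length_m: float, g: float = 9.81) -> float:
    """Small-angle period of a simple pendulum: T = 2*pi*sqrt(L/g).

    The period depends only on the string's length (and local gravity),
    not on the bob's weight or, for small arcs, on the size of the swing --
    the "equal times" Galileo noticed in the swaying lamp.
    """
    return 2 * math.pi * math.sqrt(length_m / g)

print(round(pendulum_period(1.0), 3))   # a one-metre pendulum: ~2.006 s per full swing
print(round(pendulum_period(0.25), 3))  # quarter the length, half the period: ~1.003 s
```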

In our present age of productivity, when our entire lives depend on accurate timekeeping — from our daily routines to our conference calls to financial markets and flights — it’s hard to imagine just how groundbreaking and downright miraculous the concept of measuring time accurately was in sixteenth-century Italy. And yet that’s precisely what it was — Italian towns then, Johnson points out, had clunky mechanical clocks that reflected a loose estimation of time, often losing twenty minutes a day, and had to be constantly corrected by sundial readings. Johnson writes:

The state of the art in timekeeping technology was challenged by just staying accurate on the scale of days. The idea of a timepiece that might be accurate to the second was preposterous.

Preposterous, and seemingly unnecessary. Just like Frederic Tudor’s ice trade, it was an innovation that had no natural market. You couldn’t keep accurate time in the middle of the sixteenth century, but no one really noticed, because there was no need for split-second accuracy. There were no buses to catch, or TV shows to watch, or conference calls to join. If you knew roughly what hour of the day it was, you could get by just fine.

Discus chronologicus, early 1720s, from Cartographies of Time

This is where the wings of the hummingbird begin to flutter: The real tipping point in accuracy, Johnson points out in a twist, “would emerge not from the calendar but from the map” — which makes sense given our long history of using cartography to measure time. He explains:

This was the first great age of global navigation, after all. Inspired by Columbus, ships were sailing to the Far East and the newly discovered Americas, with vast fortunes awaiting those who navigated the oceans successfully. (And almost certain death awaiting those who got lost.) But sailors lacked any way to determine longitude at sea. Latitude you could gauge just by looking up at the sky. But before modern navigation technology, the only way to figure out a ship’s longitude involved two clocks. One clock was set to the exact time of your origin point (assuming you knew the longitude of that location). The other clock recorded the current time at your location at sea. The difference between the two times told you your longitudinal position: every four minutes of difference translated to one degree of longitude, or sixty-eight miles at the equator.

In clear weather, you could easily reset the ship clock through accurate readings of the sun’s position. The problem was the home-port clock. With timekeeping technology losing or gaining up to twenty minutes a day, it was practically useless on day two of the journey.
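The arithmetic in that passage is worth spelling out: the Earth turns 360 degrees in 24 hours, so every four minutes of difference between the home-port clock and local solar time corresponds to one degree of longitude. Below is a minimal sketch of the calculation, using the passage’s figure of sixty-eight miles per degree at the equator; the function name and example readings are invented for illustration:

```python
from datetime import datetime

MINUTES_PER_DEGREE = 4.0             # the Earth turns 360 degrees in 24 hours
MILES_PER_DEGREE_AT_EQUATOR = 68.0   # the figure quoted above

def longitude_degrees(home_clock: datetime, local_noon: datetime) -> float:
    """Degrees of longitude between ship and home port, from the two clocks.

    The sign says which side of the home meridian you are on; the magnitude
    is what the navigator actually needed.
    """
    minutes = (local_noon - home_clock).total_seconds() / 60.0
    return minutes / MINUTES_PER_DEGREE

# Hypothetical reading: the home-port clock shows 15:20 at the moment of local noon.
home_clock = datetime(1700, 6, 1, 15, 20)
local_noon = datetime(1700, 6, 1, 12, 0)
degrees = longitude_degrees(home_clock, local_noon)
print(abs(degrees))                                # 50.0 degrees from the home meridian
print(abs(degrees) * MILES_PER_DEGREE_AT_EQUATOR)  # 3400.0 miles at the equator
```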

This was an era when European royalty offered handsome bounties for specific innovations — the then-version of venture capital — incentivizing such scientific breakthroughs as Maria Mitchell’s comet discoveries and Johannes Hevelius’s star catalog. As the need to solve the navigation problem grew in urgency, the rewards offered for a solution grew in magnitude — and this was what resurfaced Galileo’s teenage vision for “equal time” all those years later. Johnson describes Galileo’s journey as a superb example of the “slow churn” of creativity, the value of cross-pollinating disciplines, and the importance of playing “the long game”:

[Galileo's] astronomical observations had suggested that the regular eclipses of Jupiter’s moons might be useful for navigators keeping time at sea, but the method he devised was too complicated (and not as accurate as he had hoped). And so he returned, one last time, to the pendulum.

Fifty-eight years in the making, his slow hunch about the pendulum’s “magical property” had finally begun to take shape. The idea lay at the intersection point of multiple disciplines and interests: Galileo’s memory of the altar lamp, his studies of motion and the moons of Jupiter, the rise of a global shipping industry, and its new demand for clocks that would be accurate to the second. Physics, astronomy, maritime navigation, and the daydreams of a college student: all these different strains converged in Galileo’s mind. Aided by his son, he began drawing up plans for the first pendulum clock.

There is something so poetic about Galileo inventing split-second time for the public on a private scale of decades.

Over the century that followed, the pendulum clock, a hundred times more accurate than any preceding technology, became a staple of European life and forever changed our relationship with time. But the hummingbird’s wings continued to flap — accurate timekeeping became the imperceptible heartbeat beneath all technology of the Industrial Revolution, from scheduling the division of labor in factories to keeping steam-powered locomotives running on time. It was the invisible hand of the clock that first moved the market — a move toward unanticipated innovations in other fields. Without clocks, Johnson argues, the Industrial Revolution may have never taken off — or “at the very least, have taken much longer to reach escape velocity.” He explains:

Accurate clocks, thanks to their unrivaled ability to determine longitude at sea, greatly reduced the risks of global shipping networks, which gave the first industrialists a constant supply of raw materials and access to overseas markets. In the late 1600s and early 1700s, the most reliable watches in the world were manufactured in England, which created a pool of expertise with fine-tool manufacture that would prove to be incredibly handy when the demands of industrial innovation arrived, just as the glassmaking expertise producing spectacles opened the door for telescopes and microscopes. The watchmakers were the advance guard of what would become industrial engineering.

But the most radical innovation of clock time was the emergence of the new working day. Up until that point, people divided their days not into modular abstract units — after all, what is an hour? — but into a fluid series of activities:

Instead of fifteen minutes, time was described as how long it would take to milk the cow or nail soles to a new pair of shoes. Instead of being paid by the hour, craftsmen were conventionally paid by the piece produced — what was commonly called “taken-work” — and their daily schedules were almost comically unregulated.

Rather, they were self-regulated by shifting factors like the worker’s health or mood, the weather, and the available daylight during that particular season. The emergence of factories demanded a reliable, predictable industrial workforce, which in turn called for fundamentally reframing the human perception of time. In one particularly pause-giving parenthetical aside, Johnson writes:

The lovely double entendre of “punching the clock” would have been meaningless to anyone born before 1700.

Workers punching the time clock at the Rouge Plant of the Ford Motor Company

And yet, as with most innovations, the industrialization of time came with a dark side — one Bertrand Russell so eloquently lamented in the 1920s when he asked: “What will be the good of the conquest of leisure and health, if no one remembers how to use them?” Johnson writes:

The natural rhythms of tasks and leisure had to be forcibly replaced with an abstract grid. When you spend your whole life inside that grid, it seems like second nature, but when you are experiencing it for the first time, as the laborers of industrial England did in the second half of the eighteenth century, it arrives as a shock to the system. Timepieces were not just tools to help you coordinate the day’s events, but something more ominous: the “deadly statistical clock,” in Dickens’s Hard Times, “which measured every second with a beat like a rap upon a coffin lid.”

[...]

To be a Romantic at the turn of the nineteenth century was in part to break from the growing tyranny of clock time: to sleep late, ramble aimlessly through the city, refuse to live by the “statistical clocks” that governed economic life… The time discipline of the pendulum clock took the informal flow of experience and nailed it to a mathematical grid. If time is a river, the pendulum clock turned it into a canal of evenly spaced locks, engineered for the rhythms of industry.

Johnson goes on to trace the hummingbird flutterings to the emergence of pocket watches, the democratization of time through the implementation of Standard Time, and the invention of the first quartz clock in 1928, which boasted the unprecedented accuracy of losing or gaining only one thousandth of a second per day. He observes the most notable feature of these leaps and bounds:

One of the strangest properties of the measurement of time is that it doesn’t belong neatly to a single scientific discipline. In fact, each leap forward in our ability to measure time has involved a handoff from one discipline to another. The shift from sundials to pendulum clocks relied on a shift from astronomy to dynamics, the physics of motion. The next revolution in time would depend on electromechanics. With each revolution, though, the general pattern remained the same: scientists discover some natural phenomenon that displays the propensity for keeping “equal time” that Galileo had observed in the altar lamps, and before long a wave of inventors and engineers begin using that new tempo to synchronize their devices.

But the most groundbreaking effect of the quartz clock — the most unpredictable manifestation of the hummingbird effect in the story of time — was that it gave rise to modern computing and the Information Age. Johnson writes:

Computer chips are masters of time discipline… Instead of thousands of operations per minute, the microprocessor is executing billions of calculations per second, while shuffling information in and out of other microchips on the circuit board. Those operations are all coordinated by a master clock, now almost without exception made of quartz… A modern computer is the assemblage of many different technologies and modes of knowledge: the symbolic logic of programming languages, the electrical engineering of the circuit board, the visual language of interface design. But without the microsecond accuracy of a quartz clock, modern computers would be useless.

Theodor Nelson's pioneering 1974 book 'Computer Lib | Dream Machines,' an exploration of the creative potential of computer networks, from '100 Ideas that Changed the Web'

But as is often the case given the “thoroughly conscious ignorance” by which science progresses, new frontiers of knowledge only exposed what was yet to be reached. With the invention of the quartz clock also came the realization that the length of the day wasn’t as reliable as previously thought and that the earth’s rotation wasn’t the most accurate tool for reaching Galileo’s measurement ideal of “equal time.” As Johnson puts it, “quartz let us ‘see’ that the seemingly equal times of a solar day weren’t nearly as equal as we had assumed” — the fact that a block of vibrating sand did a better job of keeping time than the sun and the earth, celebrated for centuries as the ultimate timekeepers, became the ultimate “deathblow to the pre-Copernican universe.”

What accurate timekeeping needed, ever since Galileo’s contemplation of the pendulum, was something that oscillated in the most consistent rhythm possible — and that’s what the model of the atom developed by Niels Bohr and Werner Heisenberg at the beginning of the twentieth century finally provided. With its rhythmically spinning electrons, the smallest chemical unit became the greatest and most consistent oscillator ever known. When the first atomic clocks were built in the 1950s, they introduced a groundbreaking standard of accuracy, measuring time down to the nanosecond, a thousandfold better than the quartz clock’s microseconds.

Half a century later, this unprecedented precision is something we’ve come to take for granted — and yet it continues to underpin our lives with a layer of imperceptible magic. In one example, Johnson brings us full-circle to the relationship between timekeeping and map navigation where Galileo began:

Every time you glance down at your smartphone to check your location, you are unwittingly consulting a network of twenty-four atomic clocks housed in satellites in low-earth orbit above you. Those satellites are sending out the most elemental of signals, again and again, in perpetuity: the time is 11:48:25.084738 . . . the time is 11:48:25.084739. . . . When your phone tries to figure out its location, it pulls down at least three of these time stamps from satellites, each reporting a slightly different time thanks to the duration it takes the signal to travel from satellite to the GPS receiver in your hand. A satellite reporting a later time is closer than one reporting an earlier time. Since the satellites have perfectly predictable locations, the phone can calculate its exact position by triangulating among the three different time stamps. Like the naval navigators of the eighteenth century, GPS determines your location by comparing clocks. This is in fact one of the recurring stories of the history of the clock: each new advance in timekeeping enables a corresponding advance in our mastery of geography — from ships, to railroads, to air traffic, to GPS. It’s an idea that Einstein would have appreciated: measuring time turns out to be key to measuring space.
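For readers who want to see the geometry, here is a deliberately simplified, two-dimensional sketch of the comparison Johnson describes: each travel time becomes a distance (distance = speed of light × travel time), and the receiver sits where the distance circles intersect. Real GPS works in three dimensions and must also solve for the receiver’s own clock error, which is why at least four satellites are used in practice; the satellite positions and numbers below are invented for illustration:

```python
import math

C_KM_PER_S = 299_792.458  # speed of light in km/s

def locate(sats, travel_times):
    """Recover a 2-D receiver position from satellite positions and signal travel times.

    Each travel time becomes a pseudo-range d = c * t; subtracting the circle
    equations pairwise leaves two linear equations in the receiver's (x, y).
    """
    d = [C_KM_PER_S * t for t in travel_times]
    (x1, y1), (x2, y2), (x3, y3) = sats
    # Linear system A * [x, y] = b, from circle 2 minus circle 1 and circle 3 minus circle 1.
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = x2**2 - x1**2 + y2**2 - y1**2 - d[1]**2 + d[0]**2
    b2 = x3**2 - x1**2 + y3**2 - y1**2 - d[2]**2 + d[0]**2
    det = a11 * a22 - a12 * a21
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

# Invented satellite positions roughly 20,000 km up, and a receiver near the origin.
sats = [(0.0, 20200.0), (15000.0, 18000.0), (-12000.0, 21000.0)]
receiver = (1000.0, 0.0)
travel_times = [math.hypot(s[0] - receiver[0], s[1] - receiver[1]) / C_KM_PER_S for s in sats]
print(locate(sats, travel_times))  # ~(1000.0, 0.0): the position, recovered from timing alone
```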

Therein lies the remarkable power and reach of the hummingbird effect, which Johnson condenses into an elegant concluding reflection:

Embedded in your ability to tell the time is the understanding of how electrons circulate within cesium atoms; the knowledge of how to send microwave signals from satellites and how to measure the exact speed with which they travel; the ability to position satellites in reliable orbits above the earth, and of course the actual rocket science needed to get them off the ground; the ability to trigger steady vibrations in a block of silicon dioxide — not to mention all the advances in computation and microelectronics and network science necessary to process and represent that information on your phone. You don’t need to know any of these things to tell the time now, but that’s the way progress works: the more we build up these vast repositories of scientific and technological understanding, the more we conceal them. Your mind is silently assisted by all that knowledge each time you check your phone to see what time it is, but the knowledge itself is hidden from view. That is a great convenience, of course, but it can obscure just how far we’ve come since Galileo’s altar-lamp daydreams in the Duomo of Pisa.

But perhaps the strangest thing about time is how each leap of innovation further polarized the scales on which it played out. As in the case of Galileo, who took six decades to master the minute, the same breakthroughs that gave atomic time its trailblazing accuracy also gave us an understanding of radioactivity and radiometric dating, which proved essential in debunking the biblical myth and showing that the earth’s age is measured in billions, not thousands, of years.

5,068-year-old bristlecone pine from Rachel Sussman's 'The Oldest Living Things in the World'

Pointing to the Long Now Foundation’s quest to bury a clock designed to keep time for 10,000 years beneath some of the oldest living pines in the world — an effort to extract us from the toxic grip of short-termism and, in the words of Long Now founding board member Kevin Kelly, nudge us to think about “generational-scale questions and projects” — Johnson ends with a wonderfully poetic reflection:

This is the strange paradox of time in the atomic age: we live in ever shorter increments, guided by clocks that tick invisibly with immaculate precision; we have short attention spans and have surrendered our natural rhythms to the abstract grid of clock time. And yet simultaneously, we have the capacity to imagine and record histories that are thousands or millions of years old, to trace chains of cause and effect that span dozens of generations. We can wonder what time it is and glance down at our phone and get an answer that is accurate to the split-second, but we can also appreciate that the answer was, in a sense, five hundred years in the making: from Galileo’s altar lamp to Niels Bohr’s cesium, from the chronometer to Sputnik. Compared to an ordinary human being from Galileo’s age, our time horizons have expanded in both directions: from the microsecond to the millennium.

In the remainder of How We Got to Now, a remarkable and perspective-shifting masterwork in its entirety, Johnson goes on to examine with equal dimension and rigor the workings of the hummingbird effect through the invention and evolution of such concepts as sound, light, glass, sanitation, and cooling.

For more on the mysteries of time, see these seven revelatory perspectives from a variety of fields, then revisit the curious psychology of why time slows down when you’re afraid, speeds up as you age, and gets warped while you’re on vacation.


02 JUNE, 2014

William Shakespeare, Astronomer: How Galileo Influenced the Bard

A stanzaic vision for Jupiter’s moons and Saturn’s rings.

William Shakespeare — to the extent that he existed at all — lived during a remarkable period in human history. Born the same year as Galileo, a founding father of the Scientific Revolution, and a generation after Montaigne, the Bard witnessed an unprecedented intersection of science and philosophy as humanity sought to make sense of its existence. One of the era’s most compelling sensemaking mechanisms was the burgeoning field of astronomy, which brought to the ancient quest to order the heavens a new spirit of scientific ambition.

In The Science of Shakespeare: A New Look at the Playwright’s Universe (public library), science journalist Dan Falk explores the curious connection between the legendary playwright and the spirit of the Scientific Revolution, arguing that the Bard was significantly influenced by science, especially by observational astronomy.

'A Comet Lands in Brooklyn,’ an installation designed by StudioKCA and David Delgado of NASA’s Jet Propulsion Laboratory for the 2014 World Science Festival

Of particular interest is what Falk calls “one of the most intriguing plays (and one of the most overlooked works) in the entire canon” — the romantic tragedy Cymbeline. Pointing to a strange and highly symbolic scene in the play’s final act, where the hero sees in a dream the ghosts of his four dead family members circling around him as he sleeps, Falk writes:

Shakespeare’s plays cover a lot of ground, and employ many theatrical tricks — but as for gods descending from the heavens, this episode is unique; there is nothing else like it in the entire canon. Martin Butler calls the Jupiter scene the play’s “spectacular high point,” as it surely is. But the scene is also bizarre, unexpected, and extravagant — so much so that some have wondered if it represents Shakespeare’s own work.

[…]

If anything in Shakespeare’s late plays points to Galileo, this is it: Jupiter, so often invoked by characters in so many of the plays, never actually makes a personal appearance — until this point in Cymbeline. And of course Jupiter is not alone in the scene: Just below him, we see four ghosts moving in a circle. . . . Could the four ghosts represent the four moons of Jupiter, newly discovered by Galileo?

The timeline, Falk points out, is right — Cymbeline is believed to have been written in the summer or fall of 1610, mere months after the publication of Galileo’s short but seminal treatise on his initial telescopic observations, Sidereus Nuncius (Starry Messenger). Falk finds more evidence in an earlier scene, where Jachimo meets the married Imogen, having been introduced by her husband, Posthumus, who has dared Jachimo in a wager to try seducing Imogen — a feat Posthumus deems unattainable. Mesmerized by her beauty, Jachimo decides to win the wager by convincing Imogen that Posthumus had been unfaithful to her on his travels, implying that her best recourse of revenge would be to be unfaithful in turn — of course, by sleeping with Jachimo himself. Lo and behold, his ploy backfires — Imogen is infuriated. To salvage the situation, Jachimo makes a U-turn, claiming to have made everything up as a way of testing her and extolling Posthumus’s virtues. And yet, even though Imogen forgives him, Jachimo is struck by the sketchiness of his own story. Falk cites the following passage spoken by Jachimo:

Thanks, fairest lady.
What, are men mad? Hath Nature given them eyes
To see this vaulted arch and the rich crop
Of sea and land, which can distinguish ’twixt
The fiery orbs above and the twinned stones
Upon th’unnumbered beach, and can we not
Partition make with spectacles so precious
’Twixt fair and foul?

First atlas of the moon, 1647, from 'Ordering the Heavens.'

Falk writes:

The passage seems to allude, at least in part, to the sights one might see in the heavens; at the very least, it has something to do with distinguishing different kinds of objects (including, it would seem, stars) from one another. But the context is crucial: The first line is spoken to Imogen; the remaining lines are clearly an aside, spoken only to the audience. He seems to be saying, My story is unbelievable; why would Posthumus stoop so low, when his own wife is so beautiful? After all, he reasons, the eye gives one the power to tell the stars apart, and even to distinguish one stone on the beach from another; can’t Posthumus see the difference between his wife and a common whore? [Penn State University astronomer Peter] Usher passes over the sexual aspect of these lines, however, and focuses on the astronomical: The “vaulted arch” is surely the sky; the “fiery orbs above” must be the stars. Could the precious “spectacles” be a reference to a telescope-like device?

In the remainder of The Science of Shakespeare, a wonderfully dimensional read in its entirety, Falk goes on to explore a number of other allusions to astronomy throughout the play, from Imogen’s oblique wink at the English mathematician and astronomer Thomas Digges to Shakespeare’s potential reference to the structure of Saturn’s rings. At the heart of his argument is an ambitious effort to offer empirical assurance for what we all intuit — that art and science need each other, inform and inspire one another, and are branches from the same tree of human longing in a universe that is more like a mirror of meaning than a window of understanding, beaming back at us whatever imagination we imbue it with.

How right pioneering astronomer Maria Mitchell was when, two and a half centuries later, she marveled at the shared sensibility of science and poetry:

We especially need imagination in science. It is not all mathematics, nor all logic, but it is somewhat beauty and poetry.


17 JANUARY, 2014

From Galileo to Google: How Big Data Illuminates Human Culture

“Through our scopes, we see ourselves. Every new lens is also a new mirror.”

Given my longtime fascination with the so-termed digital humanities and with data visualization, and my occasional dabbles in the intersection of the two, I’ve followed the work of data scholars Erez Aiden and Jean-Baptiste Michel with intense interest since its public beginnings. Now, they have collected and contextualized their findings in the compelling Uncharted: Big Data as a Lens on Human Culture (public library) — a stimulating record of their seven-year quest to quantify cultural change through the dual lens of history and digital data. Analyzing the contents of millions of books digitized by Google, they use Google’s Ngram viewer tool to explore how the usage frequency of specific words changes over time and what that might reveal about corresponding shifts in our cultural values and beliefs about economics, politics, health, science, the arts, and more.
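To make the method concrete, here is a toy sketch of the kind of question the Ngram viewer answers: how often a word appears in a given year, relative to everything published that year. The year-keyed corpus below is an invented stand-in, not Google’s data:

```python
from collections import Counter

def usage_frequency(corpus_by_year, word):
    """Relative frequency of `word` per year: occurrences / total tokens that year."""
    frequency = {}
    for year, text in sorted(corpus_by_year.items()):
        tokens = text.lower().split()
        frequency[year] = Counter(tokens)[word.lower()] / len(tokens) if tokens else 0.0
    return frequency

# A tiny, invented year-keyed "corpus" standing in for millions of digitized books.
corpus = {
    1850: "the telegraph carried news across the wires",
    1950: "the television carried news into the living room",
    2000: "the internet carried news data and data about data",
}
print(usage_frequency(corpus, "data"))  # the word's relative frequency rises toward 2000
```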

Aiden and Michel, who met at Harvard’s Program for Evolutionary Dynamics and dubbed their field of research “culturomics,” contextualize the premise:

At its core, this big data revolution is about how humans create and preserve a historical record of their activities. Its consequences will transform how we look at ourselves. It will enable the creation of new scopes that make it possible for our society to more effectively probe its own nature. Big data is going to change the humanities, transform the social sciences, and renegotiate the relationship between the world of commerce and the ivory tower.

And big data is indeed big — humongous, even. Each of us, on average, has an annual data footprint of nearly one terabyte, and together we amount to a staggering five zettabytes per year. Since each byte consists of eight bits — short for “binary digits,” with each bit representing a binary yes-no question answered either by a 1 (“yes”) or a 0 (“no”) — humanity’s aggregate annual data footprint is equivalent to a gobsmacking forty sextillion (40,000,000,000,000,000,000,000) bits. Aiden and Michel humanize these numbers, so challenging for the human brain to grasp, with a pause-giving analog analogy:

If you wrote out the information contained in one megabyte by hand, the resulting line of 1s and 0s would be more than five times as tall as Mount Everest. If you wrote out one gigabyte by hand, it would circumnavigate the globe at the equator. If you wrote out one terabyte by hand, it would extend to Saturn and back twenty-five times. If you wrote out one petabyte by hand, you could make a round trip to the Voyager 1 probe, the most distant man-made object in the universe. If you wrote out one exabyte by hand, you would reach the star Alpha Centauri. If you wrote out all five zettabytes that humans produce each year by hand, you would reach the galactic core of the Milky Way. If instead of sending e-mails and streaming movies, you used your five zettabytes as an ancient shepherd might have—to count sheep—you could easily count a flock that filled the entire universe, leaving no empty space at all.
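The unit arithmetic behind those figures is easy to check. Assuming decimal SI prefixes (a zettabyte as 10^21 bytes) and a world population of roughly seven billion for the book’s era, the numbers line up:

```python
# Checking the figures quoted above, assuming decimal SI prefixes.
ZETTABYTE = 10**21                 # bytes
TERABYTE = 10**12                  # bytes
BITS_PER_BYTE = 8
WORLD_POPULATION = 7_000_000_000   # rough figure for the early 2010s

annual_bytes = 5 * ZETTABYTE
annual_bits = annual_bytes * BITS_PER_BYTE
print(f"{annual_bits:.0e} bits per year")           # 4e+22 -- forty sextillion bits

per_person_terabytes = annual_bytes / WORLD_POPULATION / TERABYTE
print(f"{per_person_terabytes:.2f} TB per person")  # ~0.71 -- "nearly one terabyte" each
```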

But what makes our age unlike any preceding era is precisely that this information exists not as handwritten documents but as digital data, which opens up wholly new frontiers of making sense of the meaning embedded in these seemingly meaningless strings of 1’s and 0’s. Aiden and Michel put it beautifully:

Like an optic lens, which makes it possible to reliably transform and manipulate light, digital media make it possible to reliably transform and manipulate information. Given enough digital records and enough computing power, a new vantage point on human culture becomes possible, one that has the potential to make awe-inspiring contributions to how we understand the world and our place in it.

Aiden and Michel have focused their efforts on one particular, and particularly important, aspect of the big-data universe: books. More specifically, the more than 30 million books digitized by Google, or roughly a quarter of humanity’s existing books. They call this digital library “one of the most fascinating datasets in the history of history,” and it certainly is, not least because of its scale: the collection exceeds that of any university library, from Oxford’s 11 million volumes to Harvard’s 17 million, as well as the National Library of Russia with its 15 million and the National Library of China with its 26 million. At the outset of Aiden and Michel’s project, the only analog library still greater than the Google Books collection was the Library of Congress, which contains 33 million — but Google may well have surpassed that number by now.

Still, big data presents a number of problems. For one, it’s messy — something that doesn’t sit well with scientists’ preference for “carefully constructed questions using elegant experiments that produce consistently accurate results,” Aiden and Michel point out. By contrast, a big dataset tends to be “a miscellany of facts and measurements, collected for no scientific purpose, using an ad hoc procedure … riddled with errors, and marred by numerous, frustrating gaps.”

To further complicate things, big data doesn’t comply with the basic premise of the scientific method — rather than testing causal relationships born of pre-existing hypotheses, it presents a seemingly bottomless pit of correlations awaiting discovery, often through a combination of doggedness and serendipity, an approach diametrically opposed to hypothesis-driven research. But that, arguably, is exactly what makes big data so alluring — as Stuart Firestein has argued in his fantastic case for why ignorance rather than certitude drives science, modern science could use what the scientific establishment so readily dismisses as “curiosity-driven research” — exploratory, hypothesis-free investigations of processes, relationships, and phenomena.

Michel and Aiden address these biases of science:

As we continue to stockpile unexplained and underexplained patterns, some have argued that correlation is threatening to unseat causation as the bedrock of scientific storytelling. Or even that the emergence of big data will lead to the end of theory. But that view is a little hard to swallow. Among the greatest triumphs of modern science are theories, like Einstein’s general relativity or Darwin’s evolution by natural selection, that explain the cause of a complex phenomenon in terms of a small set of first principles. If we stop striving for such theories, we risk losing sight of what science has always been about. What does it mean when we can make millions of discoveries, but can’t explain a single one? It doesn’t mean that we should give up on explaining things. It just means that we have our work cut out for us.

Such curiosity-driven inquiries speak to the heart of science — the eternal question of what science actually is — which Michel and Aiden capture elegantly:

What makes a problem fascinating? No one really agrees. It seemed to us that a fascinating question was something that a young child might ask, that no one knew how to answer, and for which a few person-years of scientific exploration — the kind of effort we could muster ourselves — might result in meaningful progress. Children are a great source of ideas for scientists, because the questions they ask, though superficially simple and easy to understand, are so often profound.

Indeed, indeed.

The promise of big data, it seems, is at once to return us to the roots of our childlike curiosity and to advance science to new frontiers of understanding the world. Much like the invention of the telescope transformed modern science and empowered thinkers like Galileo to spark a new understanding of the world, the rise of big data, Aiden and Michel argue, offers to “create a kind of scope that, instead of observing physical objects, would observe historical change” — and, in the process, to catapult us into unprecedented heights of knowledge:

The great promise of a new scope is that it can take us to uncharted worlds. But the great danger of a new scope is that, in our enthusiasm, we too quickly pass from what our eyes see to what our mind’s eye hopes to see. Even the most powerful data yields to the sovereignty of its interpreter. … Through our scopes, we see ourselves. Every new lens is also a new mirror.

They illustrate this with an example by way of Galileo himself, who began a series of observations of Mars in the fall of 1610 and soon noticed something remarkably curious: Mars seemed to be getting smaller and smaller as the months progressed, shrinking down to a third of its September size by December. This, of course, indicated that the planet was drifting farther and farther from Earth, which went on to become that essential piece of evidence demonstrating that the Ptolemaic idea of the geocentric universe was wrong: Earth wasn’t at the center of the cosmos, and the planets were moving according to their own orbits.

But Galileo, with his primitive telescope, couldn’t see any detail of the red planet’s surface — that didn’t happen until centuries later, when an astronomer by the name of Giovanni Schiaparelli aimed his far more powerful telescope at Mars. Suddenly, before his eyes were mammoth ridges that covered the planet’s surface like painted lines. These findings made their way to a man named Percival Lowell and impressed him so much that in 1894 he built an entire observatory in Flagstaff, Arizona, equipped with a yet more powerful telescope, so that he could observe those mysterious lines. Lowell and his team went on to painstakingly record and map Mars’s mesh of nearly 700 criss-crossing “canals,” all the while wondering how they might have been created.

One of Lowell's drawings of the Martian canals.

Turning to the previous century’s theory that Mars’s scarce water reserves were contained in the planet’s frozen poles, Lowell assumed that the lines were a meticulous network of canals made by the inhabitants of a perishing planet in an effort to rehydrate it back to life. Based solely on his telescopic observations and the hypotheses of yore, Lowell concluded that Mars was populated by intelligent life — a “discovery” that at once excited and riled the scientific community, and even permeated popular culture. Even Henry Norris Russell, the unofficial “dean of American astronomers,” called Lowell’s ideas “perhaps the best of the existing theories, and certainly the most stimulating to the imagination.” And so they were — by 1898, H.G. Wells had penned The War of the Worlds.

While Lowell’s ideas dwindled in the decades that followed, they still held their appeal. It wasn’t until NASA’s landmark Mariner mission beamed back close-up photos of Mars — the significance of which Carl Sagan, Ray Bradbury, and Arthur C. Clarke famously debated — that the anticlimactic reality set in: There were no fanciful irrigation canals, and no little green men who built them.

The moral, as Aiden and Michel point out, is that “Martians didn’t come from Mars: They came from the mind of [Lowell].”

What big data offers, then, is hope for unbridling some of our cultural ideas and ideologies from the realm of myth and anchoring them instead to the spirit of science — which brings us to the crux of the issue:

Digital historical records are making it possible to quantify our human collective as never before.

[…]

Human history is much more than words can tell. History is also found in the maps we drew and the sculptures we crafted. It’s in the houses we built, the fields we kept, and the clothes we wore. It’s in the food we ate, the music we played, and the gods we believed in. It’s in the caves we painted and the fossils of the creatures that came before us. Inevitably, most of this material will be lost: Our creativity far outstrips our record keeping. But today, more of it can be preserved than ever before.

What makes Aiden and Michel’s efforts particularly noteworthy, however, is that they are as much a work of scrupulous scholarship as of passionate advocacy. They are doing for big data in the humanities what Neil deGrasse Tyson has been doing for space exploration, instigating both cultural interest and government support. They remind us that in today’s era of big science, where the Human Genome Project’s price tag was $3 billion and the Large Hadron Collider’s quest for the Higgs boson cost $9 billion, there is an enormous disconnect between the cultural value of the humanities and the actual price we put on better understanding human history — by contrast to such big science enterprises, the entire annual budget of the National Endowment for the Humanities is a mere $150 million. Michel and Aiden remind us just what’s at stake:

The problem of digitizing the historical record represents an unprecedented opportunity for big-science-style work in the humanities. If we can justify multibillion-dollar projects in the sciences, we should also consider the potential impact of a multibillion-dollar project aimed at recording, preserving, and sharing the most important and fragile tranches of our history to make them widely available for ourselves and our children. By working together, teams of scientists, humanists, and engineers can create shared resources of extraordinary power. These efforts could easily seed the Googles and Facebooks of tomorrow. After all, both these companies started as efforts to digitize aspects of our society. Big humanities is waiting to happen.

And yet the idea is nothing new. Count on the great Isaac Asimov to have presaged it, much like he did online education, the fate of space exploration, and even Carl Sagan’s rise to stardom. In his legendary Foundation trilogy, Asimov conceives his hero, Hari Seldon, as a masterful mathematician who can predict the future through complex mathematical equations rooted in aggregate measurements about the state of society at any given point in time. Like Seldon, who can’t anticipate what any individual person will do but can foreshadow larger cultural outcomes, big data, Aiden and Michel argue, is the real-life equivalent of Asimov’s idea, which he termed “psychohistory” — an invaluable tool for big-picture insight into our collective future.

Perhaps more than anything, however, big data holds the promise of righting the balance of quality over quantity in our culture of information overabundance, helping us to extract meaning from (digital) matter. In a society that tweets more words every hour than all of the surviving ancient Greek texts combined, we certainly could use that.

Uncharted is an excellent and timely read in its entirety, both as a curious window into the secret life of language and as an important piece of advocacy for the value of the digital humanities in the age of data. Sample the project with Aiden and Michel’s entertaining and illuminating TED talk.
