Brain Pickings

Posts Tagged ‘innovation’

10 DECEMBER, 2014

How Ada Lovelace, Lord Byron’s Daughter, Became the World’s First Computer Programmer


How a young woman with the uncommon talent of applying poetic imagination to science envisioned the Symbolic Medea that would become the modern computer, sparking the birth of the digital age.

Augusta Ada King, Countess of Lovelace, born Augusta Ada Byron on December 10, 1815, later came to be known simply as Ada Lovelace. Today, she is celebrated as the world’s first computer programmer — the first person to marry the mathematical capabilities of computational machines with the poetic possibilities of symbolic logic applied with imagination. This peculiar combination was the product of Ada’s equally peculiar — and in many ways trying — parenting.

Eleven months before her birth, her father, the great Romantic poet and scandalous playboy Lord Byron, had reluctantly married her mother, Annabella Milbanke, a reserved and mathematically gifted young woman from a wealthy family — reluctantly, because Byron saw in Annabella less a romantic prospect than a hedge against his own dangerous passions, which had carried him along a conveyer belt of indiscriminate affairs with both men and women.

Lord Byron in Albanian dress (Portrait by Thomas Phillips, 1835)

But shortly after Ada was conceived, Lady Byron began suspecting her husband’s incestuous relationship with his half-sister, Augusta. Five weeks after Ada’s birth, Annabella decided to seek a separation. Her attorneys sent Lord Byron a letter stating that “Lady B. positively affirms that she has not at any time spread reports injurious to Lord Byrons [sic] character” — with the subtle but clear implication that unless Lord Byron complied, she might. The poet now came to see his wife, whom he had once called “Princess of Parallelograms” in affectionate reverence for her mathematical talents, as a calculating antagonist, a “Mathematical Medea,” and later came to mock her in his famous epic poem Don Juan: “Her favourite science was the mathematical… She was a walking calculation.”

Augusta Ada Byron as a child

Ada was never to meet her father, who died in Greece at the age of thirty-six. Ada was eight. On his deathbed, he implored his valet: “Oh, my poor dear child! — my dear Ada! My God, could I have seen her! Give her my blessing.” The girl was raised by her mother, who was bent on eradicating any trace of her father’s influence by immersing her in science and math from the time she was four. At twelve, Ada became fascinated by mechanical engineering and wrote a book called Flyology, in which she illustrated with her own plates her plan for constructing a flying apparatus. And yet she felt that part of her — the poetic part — was being repressed. In a bout of teenage defiance, she wrote to her mother:

You will not concede me philosophical poetry. Invert the order! Will you give me poetical philosophy, poetical science?

Indeed, the very friction that had caused her parents to separate created the fusion that made Ada a pioneer of “poetical science.”

That fruitful friction is what Walter Isaacson explores as he profiles Ada in the opening chapter of The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution (public library | IndieBound), alongside such trailblazers as Vannevar Bush, Alan Turing, and Stewart Brand. Isaacson writes:

Ada had inherited her father’s romantic spirit, a trait that her mother tried to temper by having her tutored in mathematics. The combination produced in Ada a love for what she took to calling “poetical science,” which linked her rebellious imagination to her enchantment with numbers. For many, including her father, the rarefied sensibilities of the Romantic era clashed with the techno-excitement of the Industrial Revolution. But Ada was comfortable at the intersection of both eras.

Ada King, Countess of Lovelace (Portrait by Alfred Edward Chalon, 1840)

When she was only seventeen, Ada attended one of legendary English polymath Charles Babbage’s equally legendary salons. There, amid the dancing, readings, and intellectual games, Babbage performed a dramatic demonstration of his Difference Engine, a beast of a calculating machine he was building. Ada was instantly captivated by its poetical possibilities, far beyond what the machine’s own inventor had envisioned. Later, one of her friends would remark: “Miss Byron, young as she was, understood its working, and saw the great beauty of the invention.”

Isaacson outlines the significance of that moment, in both Ada’s life and the trajectory of our culture:

Ada’s love of both poetry and math primed her to see beauty in a computing machine. She was an exemplar of the era of Romantic science, which was characterized by a lyrical enthusiasm for invention and discovery.

[…]

It was a time not unlike our own. The advances of the Industrial Revolution, including the steam engine, mechanical loom, and telegraph, transformed the nineteenth century in much the same way that the advances of the Digital Revolution — the computer, microchip, and Internet — have transformed our own. At the heart of both eras were innovators who combined imagination and passion with wondrous technology, a mix that produced Ada’s poetical science and what the twentieth-century poet Richard Brautigan would call “machines of loving grace.”

Enchanted by the prospect of the “poetical science” she imagined possible, Ada set out to convince Charles Babbage to be her mentor. She pitched him in a letter:

I have a peculiar way of learning, and I think it must be a peculiar man to teach me successfully… Do not reckon me conceited, … but I believe I have the power of going just as far as I like in such pursuits, and where there is so decided a taste, I should almost say a passion, as I have for them, I question if there is not always some portion of natural genius even.

Here, Isaacson makes a peculiar remark: “Whether due to her opiates or her breeding or both,” he writes in quoting that letter, “she developed a somewhat outsize opinion of her own talents and began to describe herself as a genius.” The irony, of course, is that she was a genius — Isaacson himself acknowledges that by the very act of choosing to open his biography of innovation with her. But would a man of such ability and such unflinching confidence in that ability be called out for his “outsize opinion,” for being someone with an “exalted view of [his] talents,” as Isaacson later writes of Ada? If a woman of her indisputable brilliance can’t be proud of her own talent without being dubbed delusional, then, surely, there is little hope for the rest of us mere female mortals to make any claim to confidence without being accused of hubris.

To be sure, if Isaacson didn’t see the immense value of Ada’s cultural contribution, he would not have included her in the book — a book that opens and closes with her, no less. These remarks, then, are perhaps less a matter of lamentable personal opinion than a reflection of limiting cultural conventions and our ambivalence about the admissible level of confidence a woman can have in her own talents.

Isaacson, indeed — despite disputing whether Ada deserves the title of “the world’s first computer programmer” so commonly attributed to her — makes the appropriateness of celebrating her contribution clear:

Ada’s ability to appreciate the beauty of mathematics is a gift that eludes many people, including some who think of themselves as intellectual. She realized that math was a lovely language, one that describes the harmonies of the universe and can be poetic at times. Despite her mother’s efforts, she remained her father’s daughter, with a poetic sensibility that allowed her to view an equation as a brushstroke that painted an aspect of nature’s physical splendor, just as she could visualize the “wine-dark sea” or a woman who “walks in beauty, like the night.” But math’s appeal went even deeper; it was spiritual. Math “constitutes the language through which alone we can adequately express the great facts of the natural world,” she said, and it allows us to portray the “changes of mutual relationship” that unfold in creation. It is “the instrument through which the weak mind of man can most effectually read his Creator’s works.”

This ability to apply imagination to science characterized the Industrial Revolution as well as the computer revolution, for which Ada was to become a patron saint. She was able, as she told Babbage, to understand the connection between poetry and analysis in ways that transcended her father’s talents. “I do not believe that my father was (or ever could have been) such a Poet as I shall be an Analyst; for with me the two go together indissolubly,” she wrote.

But Ada’s most important contribution came from her role as both a vocal champion of Babbage’s ideas, at a time when society questioned them as ludicrous, and as an amplifier of their potential beyond what Babbage himself had imagined. Isaacson writes:

Ada Lovelace fully appreciated the concept of a general-purpose machine. More important, she envisioned an attribute that might make it truly amazing: it could potentially process not only numbers but any symbolic notations, including musical and artistic ones. She saw the poetry in such an idea, and she set out to encourage others to see it as well.

Trial model of Babbage's Analytical Engine, completed after his death (Science Museum)

In her 1843 Notes — an extensive supplement to her translation of a paper on Babbage’s Analytical Engine — she outlined four essential concepts that would shape the birth of modern computing a century later. First, she envisioned a general-purpose machine capable not only of performing preprogrammed tasks but also of being reprogrammed to execute a practically unlimited range of operations — in other words, as Isaacson points out, she envisioned the modern computer.

Her second concept would become a cornerstone of the digital age — the idea that such a machine could handle far more than mathematical calculations; that it could be a Symbolic Medea capable of processing musical and artistic notations. Isaacson writes:

This insight would become the core concept of the digital age: any piece of content, data, or information — music, text, pictures, numbers, symbols, sounds, video — could be expressed in digital form and manipulated by machines. Even Babbage failed to see this fully; he focused on numbers. But Ada realized that the digits on the cogs could represent things other than mathematical quantities. Thus did she make the conceptual leap from machines that were mere calculators to ones that we now call computers.
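To make that leap concrete in present-day terms — a minimal sketch of the principle, not anything found in Ada’s Notes or Isaacson’s book, and the encodings used (ASCII, MIDI, RGB) are of course modern conventions — here is how a word, a musical pitch, and a color can each be reduced to nothing but numbers a machine can manipulate:

```python
# A minimal modern sketch of Ada's insight: the same digits can stand for text,
# music, or images. (Illustrative only -- ASCII and MIDI conventions are ours, not hers.)

text = "Ada"
letters_as_numbers = [ord(ch) for ch in text]        # letters as integer codes, e.g. [65, 100, 97]

middle_c = 60                                        # a musical pitch as a number (MIDI convention)
frequency_hz = 440 * 2 ** ((middle_c - 69) / 12)     # ...convertible back to a physical tone (~261.63 Hz)

forest_green = (34, 139, 34)                         # a color as three numbers (RGB)

print(letters_as_numbers, round(frequency_hz, 2), forest_green)
```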

Her third innovation was a step-by-step outline of “the workings of what we now call a computer program or algorithm.” But it was her fourth one, Isaacson notes, that was and still remains most momentous — the question of whether machines can think independently, which we still struggle to answer in the age of Siri-inspired fantasies like the movie Her. Ada wrote in her Notes:

The Analytical Engine has no pretensions whatever to originate anything. It can do whatever we know how to order it to perform. It can follow analysis; but it has no power of anticipating any analytical relations or truths.

In the closing chapter, titled “Ada Forever,” Isaacson considers the enduring implications of this question:

Ada might also be justified in boasting that she was correct, at least thus far, in her more controversial contention: that no computer, no matter how powerful, would ever truly be a “thinking” machine. A century after she died, Alan Turing dubbed this “Lady Lovelace’s Objection” and tried to dismiss it by providing an operational definition of a thinking machine — that a person submitting questions could not distinguish the machine from a human — and predicting that a computer would pass this test within a few decades. But it’s now been more than sixty years, and the machines that attempt to fool people on the test are at best engaging in lame conversation tricks rather than actual thinking. Certainly none has cleared Ada’s higher bar of being able to “originate” any thoughts of its own.

In encapsulating Ada’s ultimate legacy, Isaacson once again touches on our ambivalence about the mythologies of genius — perhaps even more so of women’s genius — and finds wisdom in her own words:

As she herself wrote in those “Notes,” referring to the Analytical Engine but in words that also describe her fluctuating reputation, “In considering any new subject, there is frequently a tendency, first, to overrate what we find to be already interesting or remarkable; and, secondly, by a sort of natural reaction, to undervalue the true state of the case.”

The reality is that Ada’s contribution was both profound and inspirational. More than Babbage or any other person of her era, she was able to glimpse a future in which machines would become partners of the human imagination, together weaving tapestries as beautiful as those from Jacquard’s loom. Her appreciation for poetical science led her to celebrate a proposed calculating machine that was dismissed by the scientific establishment of her day, and she perceived how the processing power of such a device could be used on any form of information. Thus did Ada, Countess of Lovelace, help sow the seeds for a digital age that would blossom a hundred years later.

Ada died of progressively debilitating uterine cancer in 1852, when she was thirty-six — the same age as Lord Byron. She requested that she be buried in a country grave, alongside the father whom she never knew but whose poetical sensibility profoundly shaped her own genius of “poetical science.”

The Innovators goes on to trace Ada’s influence as it reverberates through the seminal work of a stable of technological pioneers over the century and a half since her death. Complement it with Ada’s spirited letter on science and religion.


20 OCTOBER, 2014

The Hummingbird Effect: How Galileo Invented Time and Gave Rise to the Modern Tyranny of the Clock


How the invisible hand of the clock powered the Industrial Revolution and sparked the Information Age.

While we appreciate them in the abstract, few of us pause to grasp the miracles of modern life, from artificial light to air conditioning — or, as Steven Johnson puts it in the excellent How We Got to Now: Six Innovations That Made the Modern World (public library), to marvel at “how amazing it is that we drink water from a tap and never once worry about dying forty-eight hours later from cholera.” Understanding how these everyday marvels first came to be, then came to be taken for granted, not only allows us to see our familiar world with new eyes — something we are wired not to do — but also lets us appreciate the remarkable creative lineage behind even the most mundane of technologies underpinning modern life. Johnson writes in the introduction:

Our lives are surrounded and supported by a whole class of objects that are enchanted with the ideas and creativity of thousands of people who came before us: inventors and hobbyists and reformers who steadily hacked away at the problem of making artificial light or clean drinking water so that we can enjoy those luxuries today without a second thought, without even thinking of them as luxuries in the first place… We are indebted to those people every bit as much as, if not more than, we are to the kings and conquerors and magnates of traditional history.

Johnson points out that, much like the evolution of bees gave flowers their colors and the evolution of pollen altered the design of the hummingbird’s wings, the most remarkable thing about innovations is the way they precipitate unanticipated changes that reverberate far and wide beyond the field or discipline or problem at the epicenter of the particular innovation. Pointing to the Gutenberg press — itself already an example of the combinatorial nature of creative breakthroughs — Johnson writes:

Johannes Gutenberg’s printing press created a surge in demand for spectacles, as the new practice of reading made Europeans across the continent suddenly realize that they were farsighted; the market demand for spectacles encouraged a growing number of people to produce and experiment with lenses, which led to the invention of the microscope, which shortly thereafter enabled us to perceive that our bodies were made up of microscopic cells. You wouldn’t think that printing technology would have anything to do with the expansion of our vision down to the cellular scale, just as you wouldn’t have thought that the evolution of pollen would alter the design of a hummingbird’s wing. But that is the way change happens.

Johnson terms these complex chains of influences the “hummingbird effect,” named after the famous “butterfly effect” concept from chaos theory — Edward Lorenz’s famous metaphor for the idea that a change as imperceptible as the flap of a butterfly’s wings can result in an effect as grand as a hurricane far away several weeks later — but different in a fundamental way:

The extraordinary (and unsettling) property of the butterfly effect is that it involves a virtually unknowable chain of causality; you can’t map the link between the air molecules bouncing around the butterfly and the storm system brewing in the Atlantic. They may be connected, because everything is connected on some level, but it is beyond our capacity to parse those connections or, even harder, to predict them in advance. But something very different is at work with the flower and the hummingbird: while they are very different organisms, with very different needs and aptitudes, not to mention basic biological systems, the flower clearly influences the hummingbird’s physiognomy in direct, intelligible ways.

Under the “hummingbird effect,” an innovation in one field can trigger unexpected breakthroughs in wholly different domains, but the traces of those original influences often remain obscured. Illuminating them allows us to grasp the many dimensions of change, its complex and often unintended consequences, the multiple scales of experience that have always defined human history and, perhaps above all, to lend much-needed dimension to the flat myth of genius. Playing off the sentiment at the heart of Richard Feynman’s famous ode to a flower, Johnson writes:

History happens on the level of atoms, the level of planetary climate change, and all the levels in between. If we are trying to get the story right, we need an interpretative approach that can do justice to all those different levels.

[…]

There is something undeniably appealing about the story of a great inventor or scientist — Galileo and his telescope, for instance — working his or her way toward a transformative idea. But there is another, deeper story that can be told as well: how the ability to make lenses also depended on the unique quantum mechanical properties of silicon dioxide and on the fall of Constantinople. Telling the story from that long-zoom perspective doesn’t subtract from the traditional account focused on Galileo’s genius. It only adds.

Nundinal calendar, Rome. The ancient Etruscans developed an eight-day market week, known as the nundinal cycle, around the eighth or seventh century BC.

In fact, of the six such widely reverberating innovations that Johnson highlights, the one sparked by Galileo is the most fascinating because it captures so many dimensions of our eternal and eternally bedeviled relationship with time — our astoundingly elastic perception of it, the way it dictates our internal rhythms and our creative routines, its role in free will, and much more. Johnson tells an absorbing origin story the way only he can:

Legend has it that in 1583, a nineteen-year-old student at the University of Pisa attended prayers at the cathedral and, while daydreaming in the pews, noticed one of the altar lamps swaying back and forth. While his companions dutifully recited the Nicene Creed around him, the student became almost hypnotized by the lamp’s regular motion. No matter how large the arc, the lamp appeared to take the same amount of time to swing back and forth. As the arc decreased in length, the speed of the lamp decreased as well. To confirm his observations, the student measured the lamp’s swing against the only reliable clock he could find: his own pulse.

The swinging altar lamp inside the Duomo of Pisa

That teenager, of course, was Galileo. Johnson explains the significance of that mythic moment:

That Galileo was daydreaming about time and rhythm shouldn’t surprise us: his father was a music theorist and played the lute. In the middle of the sixteenth century, playing music would have been one of the most temporally precise activities in everyday culture. (The musical term “tempo” comes from the Italian word for time.) But machines that could keep a reliable beat didn’t exist in Galileo’s age; the metronome wouldn’t be invented for another few centuries. So watching the altar lamp sway back and forth with such regularity planted the seed of an idea in Galileo’s young mind. As is so often the case, however, it would take decades before the seed would blossom into something useful.

'Portrait of Galileo Galilei' by Justus Sustermans, 1636

Indeed, Galileo’s Mass experience stands as a spectacular testament to the usefulness of useless knowledge. Over the next two decades, he busied himself with becoming a professor of mathematics, tinkering with telescopes, and, as Johnson aptly puts it, “more or less inventing modern science” (and withstanding the pushback). And yet he kept the image of that swinging altar lamp on the back-burner of his mind. Eventually, as he grew increasingly enchanted with motion and dynamics, he decided to build a pendulum that would simulate what he had observed that distant day at the cathedral. His discovery confirmed his intuition — what determined the time it took the pendulum to swing wasn’t the size of the arc or the weight of the object, but merely the length of the string. Johnson cites Galileo’s excited letter to his peer Giovanni Battista Baliani:

The marvelous property of the pendulum is that it makes all its vibrations, large or small, in equal times.
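The regularity Galileo noticed was later distilled into the standard small-angle formula for a simple pendulum, T = 2π√(L/g) — not something quoted in Johnson’s book, but a handy way to see why only the length of the string matters. A minimal sketch:

```python
import math

def pendulum_period(length_m: float, g: float = 9.81) -> float:
    """Small-angle period of a simple pendulum: T = 2 * pi * sqrt(L / g).
    Independent of the bob's weight and (for small arcs) of the swing's amplitude --
    the 'equal times' Galileo observed, though strictly only approximate for wide arcs."""
    return 2 * math.pi * math.sqrt(length_m / g)

for length in (0.25, 0.994, 4.0):
    print(f"L = {length:5.3f} m  ->  T = {pendulum_period(length):.3f} s")
# A string just under a metre long gives a two-second period -- one second per swing,
# the 'seconds pendulum' that later clockmakers built their escapements around.
```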

Galileo's sketches for the pendulum clock

In our present age of productivity, when our entire lives depend on accurate timekeeping — from our daily routines to our conference calls to financial markets and flights — it’s hard to imagine just how groundbreaking and downright miraculous the concept of measuring time accurately was in 16th-century Italy. And yet that’s precisely what it was — Italian towns then, Johnson points out, had clunky mechanical clocks that reflected a loose estimation of time, often losing twenty minutes a day, and had to be constantly corrected by sundial readings. Johnson writes:

The state of the art in timekeeping technology was challenged by just staying accurate on the scale of days. The idea of a timepiece that might be accurate to the second was preposterous.

Preposterous, and seemingly unnecessary. Just like Frederic Tudor’s ice trade, it was an innovation that had no natural market. You couldn’t keep accurate time in the middle of the sixteenth century, but no one really noticed, because there was no need for split-second accuracy. There were no buses to catch, or TV shows to watch, or conference calls to join. If you knew roughly what hour of the day it was, you could get by just fine.

Discus chronologicus, early 1720s, from Cartographies of Time.

This is where the wings of the hummingbird begin to flutter: The real tipping point in accuracy, Johnson points out in a twist, “would emerge not from the calendar but from the map” — which makes sense given our long history of using cartography to measure time. He explains:

This was the first great age of global navigation, after all. Inspired by Columbus, ships were sailing to the Far East and the newly discovered Americas, with vast fortunes awaiting those who navigated the oceans successfully. (And almost certain death awaiting those who got lost.) But sailors lacked any way to determine longitude at sea. Latitude you could gauge just by looking up at the sky. But before modern navigation technology, the only way to figure out a ship’s longitude involved two clocks. One clock was set to the exact time of your origin point (assuming you knew the longitude of that location). The other clock recorded the current time at your location at sea. The difference between the two times told you your longitudinal position: every four minutes of difference translated to one degree of longitude, or sixty-eight miles at the equator.

In clear weather, you could easily reset the ship clock through accurate readings of the sun’s position. The problem was the home-port clock. With timekeeping technology losing or gaining up to twenty minutes a day, it was practically useless on day two of the journey.
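The arithmetic of that two-clock method is worth sketching out — the four-minutes-per-degree and sixty-eight-miles figures come straight from Johnson’s passage above, while the function and example numbers are merely illustrative:

```python
def longitude_from_clocks(home_port_time_h: float, local_sun_time_h: float) -> float:
    """Degrees of longitude west of the home port, from the gap between the
    home-port clock and local sun time: every 4 minutes of difference = 1 degree."""
    difference_minutes = (home_port_time_h - local_sun_time_h) * 60
    return difference_minutes / 4.0

# Example: the home-port clock reads 2:00 p.m. at the moment the sun says local noon.
degrees_west = longitude_from_clocks(home_port_time_h=14.0, local_sun_time_h=12.0)
print(degrees_west, "degrees west,", degrees_west * 68, "miles at the equator")
# -> 30.0 degrees west, 2040.0 miles. A home-port clock drifting twenty minutes a day
#    is a five-degree, roughly 340-mile error after a single day -- hence the uselessness
#    Johnson describes.
```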

This was an era when European royalty offered handsome bounties for specific innovations — the then-version of venture capital — incentivizing such scientific breakthroughs as Maria Mitchell’s comet discoveries and Johannes Hevelius’s star catalog. As the need to solve the navigation problem grew in urgency, the rewards offered for a solution grew in magnitude — and this was what resurfaced Galileo’s teenage vision for “equal time” all those years later. Johnson describes Galileo’s journey as a superb example of the “slow churn” of creativity, the value of cross-pollinating disciplines, and the importance of playing “the long game”:

[Galileo’s] astronomical observations had suggested that the regular eclipses of Jupiter’s moons might be useful for navigators keeping time at sea, but the method he devised was too complicated (and not as accurate as he had hoped). And so he returned, one last time, to the pendulum.

Fifty-eight years in the making, his slow hunch about the pendulum’s “magical property” had finally begun to take shape. The idea lay at the intersection point of multiple disciplines and interests: Galileo’s memory of the altar lamp, his studies of motion and the moons of Jupiter, the rise of a global shipping industry, and its new demand for clocks that would be accurate to the second. Physics, astronomy, maritime navigation, and the daydreams of a college student: all these different strains converged in Galileo’s mind. Aided by his son, he began drawing up plans for the first pendulum clock.

There is something so poetic about Galileo inventing split-second time for the public on a private scale of decades.

Over the century that followed, the pendulum clock, a hundred times more accurate than any preceding technology, became a staple of European life and forever changed our relationship with time. But the hummingbird’s wings continued to flap — accurate timekeeping became the imperceptible heartbeat beneath all technology of the Industrial Revolution, from scheduling the division of labor in factories to keeping steam-powered locomotives running on time. It was the invisible hand of the clock that first moved the market — a move toward unanticipated innovations in other fields. Without clocks, Johnson argues, the Industrial Revolution may have never taken off — or “at the very least, have taken much longer to reach escape velocity.” He explains:

Accurate clocks, thanks to their unrivaled ability to determine longitude at sea, greatly reduced the risks of global shipping networks, which gave the first industrialists a constant supply of raw materials and access to overseas markets. In the late 1600s and early 1700s, the most reliable watches in the world were manufactured in England, which created a pool of expertise with fine-tool manufacture that would prove to be incredibly handy when the demands of industrial innovation arrived, just as the glassmaking expertise producing spectacles opened the door for telescopes and microscopes. The watchmakers were the advance guard of what would become industrial engineering.

But the most radical innovation of clock time was the emergence of the new working day. Up until that point, people divided their days not into modular abstract units — after all, what is an hour? — but into a fluid series of activities:

Instead of fifteen minutes, time was described as how long it would take to milk the cow or nail soles to a new pair of shoes. Instead of being paid by the hour, craftsmen were conventionally paid by the piece produced — what was commonly called “taken-work” — and their daily schedules were almost comically unregulated.

Rather, they were self-regulated by shifting factors like the worker’s health or mood, the weather, and the available daylight during that particular season. The emergence of factories demanded a reliable, predictable industrial workforce, which in turn called for fundamentally reframing the human perception of time. In one particularly pause-giving parenthetical aside, Johnson writes:

The lovely double entendre of “punching the clock” would have been meaningless to anyone born before 1700.

Workers punching the time clock at the Rouge Plant of the Ford Motor Company

And yet, as with most innovations, the industrialization of time came with a dark side — one Bertrand Russell so eloquently lamented in the 1920s when he asked: “What will be the good of the conquest of leisure and health, if no one remembers how to use them?” Johnson writes:

The natural rhythms of tasks and leisure had to be forcibly replaced with an abstract grid. When you spend your whole life inside that grid, it seems like second nature, but when you are experiencing it for the first time, as the laborers of industrial England did in the second half of the eighteenth century, it arrives as a shock to the system. Timepieces were not just tools to help you coordinate the day’s events, but something more ominous: the “deadly statistical clock,” in Dickens’s Hard Times, “which measured every second with a beat like a rap upon a coffin lid.”

[…]

To be a Romantic at the turn of the nineteenth century was in part to break from the growing tyranny of clock time: to sleep late, ramble aimlessly through the city, refuse to live by the “statistical clocks” that governed economic life… The time discipline of the pendulum clock took the informal flow of experience and nailed it to a mathematical grid. If time is a river, the pendulum clock turned it into a canal of evenly spaced locks, engineered for the rhythms of industry.

Johnson goes on to trace the hummingbird flutterings to the emergence of pocket watches, the democratization of time through the implementation of Standard Time, and the invention of the first quartz clock in 1928, which boasted the unprecedented accuracy of losing or gaining only one thousandth of a second per day. He observes the most notable feature of these leaps and bounds:

One of the strangest properties of the measurement of time is that it doesn’t belong neatly to a single scientific discipline. In fact, each leap forward in our ability to measure time has involved a handoff from one discipline to another. The shift from sundials to pendulum clocks relied on a shift from astronomy to dynamics, the physics of motion. The next revolution in time would depend on electromechanics. With each revolution, though, the general pattern remained the same: scientists discover some natural phenomenon that displays the propensity for keeping “equal time” that Galileo had observed in the altar lamps, and before long a wave of inventors and engineers begin using that new tempo to synchronize their devices.

But the most groundbreaking effect of the quartz clock — the most unpredictable manifestation of the hummingbird effect in the story of time — was that it gave rise to modern computing and the Information Age. Johnson writes:

Computer chips are masters of time discipline… Instead of thousands of operations per minute, the microprocessor is executing billions of calculations per second, while shuffling information in and out of other microchips on the circuit board. Those operations are all coordinated by a master clock, now almost without exception made of quartz… A modern computer is the assemblage of many different technologies and modes of knowledge: the symbolic logic of programming languages, the electrical engineering of the circuit board, the visual language of interface design. But without the microsecond accuracy of a quartz clock, modern computers would be useless.

Theodor Nelson's pioneering 1974 book 'Computer Lib | Dream Machines,' an exploration of the creative potential of computer networks, from '100 Ideas that Changed the Web'

But as is often the case given the “thoroughly conscious ignorance” by which science progresses, new frontiers of knowledge only exposed what was yet to be reached. With the invention of the quartz clock also came the realization that the length of the day wasn’t as reliable as previously thought, and that the earth’s rotation wasn’t the most accurate tool for reaching Galileo’s measurement ideal of “equal time.” As Johnson puts it, “quartz let us ‘see’ that the seemingly equal times of a solar day weren’t nearly as equal as we had assumed” — the fact that a block of vibrating sand did a better job of keeping time than the sun and the earth, celebrated for centuries as the ultimate timekeepers, became the ultimate “deathblow to the pre-Copernican universe.”

What accurate timekeeping needed, ever since Galileo’s contemplation of the pendulum, was something that oscillated in the most consistent rhythm possible — and that’s what Niels Bohr and Werner Heisenberg’s elucidation of the atom’s structure in the beginning of the twentieth century finally provided. With its rhythmically spinning electrons, the smallest chemical unit became the greatest and most consistent oscillator ever known. When the first atomic clocks were built in the 1950s, they introduced a groundbreaking standard of accuracy, measuring time down to the nanosecond, a thousandfold better than the quartz clock’s microseconds.

Half a century later, this unprecedented precision is something we’ve come to take for granted — and yet it continues to underpin our lives with a layer of imperceptible magic. In one example, Johnson brings us full-circle to the relationship between timekeeping and map navigation where Galileo began:

Every time you glance down at your smartphone to check your location, you are unwittingly consulting a network of twenty-four atomic clocks housed in satellites in low-earth orbit above you. Those satellites are sending out the most elemental of signals, again and again, in perpetuity: the time is 11:48:25.084738 . . . the time is 11:48:25.084739. . . . When your phone tries to figure out its location, it pulls down at least three of these time stamps from satellites, each reporting a slightly different time thanks to the duration it takes the signal to travel from satellite to the GPS receiver in your hand. A satellite reporting a later time is closer than one reporting an earlier time. Since the satellites have perfectly predictable locations, the phone can calculate its exact position by triangulating among the three different time stamps. Like the naval navigators of the eighteenth century, GPS determines your location by comparing clocks. This is in fact one of the recurring stories of the history of the clock: each new advance in timekeeping enables a corresponding advance in our mastery of geography — from ships, to railroads, to air traffic, to GPS. It’s an idea that Einstein would have appreciated: measuring time turns out to be key to measuring space.
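The time-to-distance step at the heart of that calculation is simple to sketch. A real receiver also has to solve for its own clock error (hence, in practice, a fourth satellite), and the numbers below are invented for illustration rather than drawn from actual satellite data:

```python
# Converting a satellite's time stamp into a distance: the core move described above.
# Illustrative only -- real GPS solves position and receiver clock bias together.

SPEED_OF_LIGHT_M_S = 299_792_458

def range_to_satellite(time_sent_s: float, time_received_s: float) -> float:
    """Distance in metres implied by a signal's travel time."""
    return (time_received_s - time_sent_s) * SPEED_OF_LIGHT_M_S

# Three satellites, each stamping the moment it sent its signal; the receiver notes arrival.
arrival = 0.080000000
for name, sent in [("sat A", 0.012345), ("sat B", 0.010000), ("sat C", 0.008500)]:
    print(name, f"{range_to_satellite(sent, arrival) / 1000:.0f} km away")

# The satellite reporting the *later* send time is closer -- exactly the rule in the quote.
# With three or four such ranges and known satellite positions, the receiver trilaterates
# its own location.
```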

Therein lies the remarkable power and reach of the hummingbird effect, which Johnson condenses into an elegant concluding reflection:

Embedded in your ability to tell the time is the understanding of how electrons circulate within cesium atoms; the knowledge of how to send microwave signals from satellites and how to measure the exact speed with which they travel; the ability to position satellites in reliable orbits above the earth, and of course the actual rocket science needed to get them off the ground; the ability to trigger steady vibrations in a block of silicon dioxide — not to mention all the advances in computation and microelectronics and network science necessary to process and represent that information on your phone. You don’t need to know any of these things to tell the time now, but that’s the way progress works: the more we build up these vast repositories of scientific and technological understanding, the more we conceal them. Your mind is silently assisted by all that knowledge each time you check your phone to see what time it is, but the knowledge itself is hidden from view. That is a great convenience, of course, but it can obscure just how far we’ve come since Galileo’s altar-lamp daydreams in the Duomo of Pisa.

But perhaps the strangest thing about time is how each leap of innovation further polarized the scales on which it played out. As in the case of Galileo, who took six decades to master the minute, the same breakthroughs that gave atomic time its trailblazing accuracy also gave us radiation and radiometric dating, which was essential in debunking the biblical myth and proving that earth’s age was in the billions, not thousands, of years.

5,068-year-old bristlecone pine from Rachel Sussman's 'The Oldest Living Things in the World'

Pointing to the Long Now Foundation’s quest to bury a clock that ticks once every 10,000 years beneath some of the oldest living pines in the world — an effort to extract us from the toxic grip of short-termism and, in the words of Long Now founder Kevin Kelly, nudge us to think about “generational-scale questions and projects” — Johnson ends with a wonderfully poetic reflection:

This is the strange paradox of time in the atomic age: we live in ever shorter increments, guided by clocks that tick invisibly with immaculate precision; we have short attention spans and have surrendered our natural rhythms to the abstract grid of clock time. And yet simultaneously, we have the capacity to imagine and record histories that are thousands or millions of years old, to trace chains of cause and effect that span dozens of generations. We can wonder what time it is and glance down at our phone and get an answer that is accurate to the split-second, but we can also appreciate that the answer was, in a sense, five hundred years in the making: from Galileo’s altar lamp to Niels Bohr’s cesium, from the chronometer to Sputnik. Compared to an ordinary human being from Galileo’s age, our time horizons have expanded in both directions: from the microsecond to the millennium.

In the remainder of How We Got to Now, a remarkable and perspective-shifting masterwork in its entirety, Johnson goes on to examine with equal dimension and rigor the workings of the hummingbird effect through the invention and evolution of such concepts as sound, light, glass, sanitation, and cooling.

For more on the mysteries of time, see these seven revelatory perspectives from a variety of fields, then revisit the curious psychology of why time slows down when you’re afraid, speeds up as you age, and gets warped while you’re on vacation.


01 OCTOBER, 2014

The Unsung Heroes of Innovation: A 1964 Manifesto for the Role of the Critic-Curator in How Ideas Spread


“It would be a mistake to distinguish too sharply between those who contribute a new way of doing and those who contribute a new way of thinking.”

“Art doesn’t explain itself,” music critic Greil Marcus observed in considering what the history of rock ‘n’ roll reveals about innovation. The role of context — that mesh of explanations enveloping an idea in a sheath of meaning to reveal why it matters and why we should pay attention — is essential in the cultural uptake of any new concept. I have long believed the role of the cultural critic or curator — the celebrator of ideas — to be one of helping people discern what matters in the world and understand why it matters, of elevating the meaningful from the fleeting and, in the process, elevating the human spirit toward progress. The great social science writer John W. Gardner explores this notion with unparalleled elegance and economy of words in a section of Self-Renewal: The Individual and the Innovative Society (public library) — his excellent, forgotten field guide to keeping your company and your soul vibrantly alive, written in 1964 but enormously relevant to modern entrepreneurship, politics, and personal growth.

The most meaningful, impactful, and enduring innovations, Gardner argues, often come quietly, even surreptitiously:

The new thing rarely comes on with a flourish of trumpets. The historic innovation looks exciting in the history books, but if one could question those who lived at the time, the typical response would be neither “I opposed it” nor “I welcomed it,” but “I didn’t know it was happening.”

The unsung heroes of innovation, Gardner suggests, are those who shed light on the new and noteworthy, who extend an invitation to people to pay attention and care — a function all the more vital today, half a century later, when there is so much more vying for our attention and it is so much more straining to distinguish between the noteworthy and the merely noisy.

Illustration from 'Flashlight' by Lizi Boyd

Those who bring attention to valuable ideas, then, are themselves vital agents of change, without whom the inventors and their creations would slide under the cultural radar and into obscurity. Editor Ursula Nordstrom did this for a young and insecure Maurice Sendak. Publisher John Martin did it for Charles Bukowski. Ralph Waldo Emerson did it for young Walt Whitman.

To give a great idea wings, Gardner suggests, is at least as valuable as to hatch it:

The capacity of public somnolence to retard change illuminates the role of the critic… Critics who call attention to an area that requires renewal are very much a part of the innovative process…

One of the most serious obstacles to clear thinking about renewal is the excessively narrow conception of the innovator that is commonly held. It focuses on technology and on the men who invent specific new devices: Alexander Graham Bell and the telephone; Marconi and wireless; Edison and the phonograph; the Wright Brothers and the airplane.

Gardner returns to the underappreciated, vital role of the critic-celebrator in amplifying the ideas that improve society and precipitate progress:

We tend to think of innovators as those who contribute to a new way of doing things. But many far-reaching changes have been touched off by those who contributed to a new way of thinking about things…

It would be a mistake to distinguish too sharply between those who contribute a new way of doing and those who contribute a new way of thinking.

Today, Gardner himself is an underappreciated celebrator of the human spirit and his Self-Renewal endures as a timeless manifesto for what true progress requires. Sample it further with Gardner on what children can teach us about risk, failure, and personal growth, then see some related thoughts on how to cultivate wisdom in the age of information.
