Brain Pickings

10 DECEMBER, 2014

How Ada Lovelace, Lord Byron’s Daughter, Became the World’s First Computer Programmer

How a young woman with the uncommon talent of applying poetic imagination to science envisioned the Symbolic Medea that would become the modern computer, sparking the birth of the digital age.

Augusta Ada King, Countess of Lovelace, born Augusta Ada Byron on December 10, 1815, later came to be known simply as Ada Lovelace. Today, she is celebrated as the world’s first computer programmer — the first person to marry the mathematical capabilities of computational machines with the poetic possibilities of symbolic logic applied with imagination. This peculiar combination was the product of Ada’s equally peculiar — and in many ways trying — parenting.

Eleven months before her birth, her father, the great Romantic poet and scandalous playboy Lord Byron, had reluctantly married her mother, Annabella Milbanke, a reserved and mathematically gifted young woman from a wealthy family — reluctantly, because Byron saw in Annabella less a romantic prospect than a hedge against his own dangerous passions, which had carried him along a conveyor belt of indiscriminate affairs with both men and women.

Lord Byron in Albanian dress (Portrait by Thomas Phillips, 1835)

But shortly after Ada was conceived, Lady Byron began suspecting her husband’s incestuous relationship with his half-sister, Augusta. Five weeks after Ada’s birth, Annabella decided to seek a separation. Her attorneys sent Lord Byron a letter stating that “Lady B. positively affirms that she has not at any time spread reports injurious to Lord Byrons [sic] character” — with the subtle but clear implication that unless Lord Byron complied, she might. The poet now came to see his wife, whom he had once called “Princess of Parallelograms” in affectionate reverence for her mathematical talents, as a calculating antagonist, a “Mathematical Medea,” and later came to mock her in his famous epic poem Don Juan: “Her favourite science was the mathematical… She was a walking calculation.”

Augusta Ada Byron as a child

Ada was never to meet her father, who died in Greece at the age of thirty-six, when Ada was eight. On his deathbed, he implored his valet: “Oh, my poor dear child! — my dear Ada! My God, could I have seen her! Give her my blessing.” The girl was raised by her mother, who was bent on eradicating any trace of her father’s influence by immersing her in science and math from the time she was four. At twelve, Ada became fascinated by mechanical engineering and wrote a book called Flyology, in which she illustrated with her own plates her plan for constructing a flying apparatus. And yet she felt that part of her — the poetic part — was being repressed. In a bout of teenage defiance, she wrote to her mother:

You will not concede me philosophical poetry. Invert the order! Will you give me poetical philosophy, poetical science?

Indeed, the very friction that had caused her parents to separate created the fusion that made Ada a pioneer of “poetical science.”

That fruitful friction is what Walter Isaacson explores as he profiles Ada in the opening chapter of The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution (public library | IndieBound), alongside such trailblazers as Vannevar Bush, Alan Turing, and Stewart Brand. Isaacson writes:

Ada had inherited her father’s romantic spirit, a trait that her mother tried to temper by having her tutored in mathematics. The combination produced in Ada a love for what she took to calling “poetical science,” which linked her rebellious imagination to her enchantment with numbers. For many, including her father, the rarefied sensibilities of the Romantic era clashed with the techno-excitement of the Industrial Revolution. But Ada was comfortable at the intersection of both eras.

Ada King, Countess of Lovelace (Portrait by Alfred Edward Chalon, 1840)

When she was only seventeen, Ada attended one of legendary English polymath Charles Babbage’s equally legendary salons. There, amid the dancing, readings, and intellectual games, Babbage performed a dramatic demonstration of his Difference Engine, a beast of a calculating machine he was building. Ada was instantly captivated by its poetical possibilities, far beyond what the machine’s own inventor had envisioned. Later, one of her friends would remark: “Miss Byron, young as she was, understood its working, and saw the great beauty of the invention.”

Isaacson outlines the significance of that moment, in both Ada’s life and the trajectory of our culture:

Ada’s love of both poetry and math primed her to see beauty in a computing machine. She was an exemplar of the era of Romantic science, which was characterized by a lyrical enthusiasm for invention and discovery.

[…]

It was a time not unlike our own. The advances of the Industrial Revolution, including the steam engine, mechanical loom, and telegraph, transformed the nineteenth century in much the same way that the advances of the Digital Revolution — the computer, microchip, and Internet — have transformed our own. At the heart of both eras were innovators who combined imagination and passion with wondrous technology, a mix that produced Ada’s poetical science and what the twentieth-century poet Richard Brautigan would call “machines of loving grace.”

Enchanted by the prospect of the “poetical science” she imagined possible, Ada set out to convince Charles Babbage to be her mentor. She pitched him in a letter:

I have a peculiar way of learning, and I think it must be a peculiar man to teach me successfully… Do not reckon me conceited, … but I believe I have the power of going just as far as I like in such pursuits, and where there is so decided a taste, I should almost say a passion, as I have for them, I question if there is not always some portion of natural genius even.

Here, Isaacson makes a peculiar remark: “Whether due to her opiates or her breeding or both,” he writes in quoting that letter, “she developed a somewhat outsize opinion of her own talents and began to describe herself as a genius.” The irony, of course, is that she was a genius — Isaacson himself acknowledges that by the very act of choosing to open his biography of innovation with her. But would a man of such ability and such unflinching confidence in that ability be called out for his “outsize opinion,” for being someone with an “exalted view of [his] talents,” as Isaacson later writes of Ada? If a woman of her indisputable brilliance can’t be proud of her own talent without being dubbed delusional, then, surely, there is little hope for the rest of us mere female mortals to make any claim to confidence without being accused of hubris.

To be sure, if Isaacson didn’t see the immense value of Ada’s cultural contribution, he would not have included her in the book — a book that opens and closes with her, no less. These remarks, then, are perhaps less a matter of lamentable personal opinion than a reflection of limiting cultural conventions and our ambivalence about the admissible level of confidence a woman can have in her own talents.

Isaacson, indeed — despite disputing whether Ada deserves the title of “the world’s first computer programmer” commonly attributed to her — makes the appropriateness of celebrating her contribution clear:

Ada’s ability to appreciate the beauty of mathematics is a gift that eludes many people, including some who think of themselves as intellectual. She realized that math was a lovely language, one that describes the harmonies of the universe and can be poetic at times. Despite her mother’s efforts, she remained her father’s daughter, with a poetic sensibility that allowed her to view an equation as a brushstroke that painted an aspect of nature’s physical splendor, just as she could visualize the “wine-dark sea” or a woman who “walks in beauty, like the night.” But math’s appeal went even deeper; it was spiritual. Math “constitutes the language through which alone we can adequately express the great facts of the natural world,” she said, and it allows us to portray the “changes of mutual relationship” that unfold in creation. It is “the instrument through which the weak mind of man can most effectually read his Creator’s works.”

This ability to apply imagination to science characterized the Industrial Revolution as well as the computer revolution, for which Ada was to become a patron saint. She was able, as she told Babbage, to understand the connection between poetry and analysis in ways that transcended her father’s talents. “I do not believe that my father was (or ever could have been) such a Poet as I shall be an Analyst; for with me the two go together indissolubly,” she wrote.

But Ada’s most important contribution came from her role as both a vocal champion of Babbage’s ideas, at a time when society questioned them as ludicrous, and as an amplifier of their potential beyond what Babbage himself had imagined. Isaacson writes:

Ada Lovelace fully appreciated the concept of a general-purpose machine. More important, she envisioned an attribute that might make it truly amazing: it could potentially process not only numbers but any symbolic notations, including musical and artistic ones. She saw the poetry in such an idea, and she set out to encourage others to see it as well.

Trial model of Babbage's Analytical Engine, completed after his death (Science Museum)

In her 1843 Notes, a supplement appended to her translation of a paper on Babbage’s Analytical Engine, she outlined four essential concepts that would shape the birth of modern computing a century later. First, she envisioned a general-purpose machine capable not only of performing preprogrammed tasks but also of being reprogrammed to execute a practically unlimited range of operations — in other words, as Isaacson points out, she envisioned the modern computer.

Her second concept would become a cornerstone of the digital age — the idea that such a machine could handle far more than mathematical calculations; that it could be a Symbolic Medea capable of processing musical and artistic notations. Isaacson writes:

This insight would become the core concept of the digital age: any piece of content, data, or information — music, text, pictures, numbers, symbols, sounds, video — could be expressed in digital form and manipulated by machines. Even Babbage failed to see this fully; he focused on numbers. But Ada realized that the digits on the cogs could represent things other than mathematical quantities. Thus did she make the conceptual leap from machines that were mere calculators to ones that we now call computers.

Her third innovation was a step-by-step outline of “the workings of what we now call a computer program or algorithm.” But it was her fourth one, Isaacson notes, that was and still remains most momentous — the question of whether machines can think independently, which we still struggle to answer in the age of Siri-inspired fantasies like the movie Her. Ada wrote in her Notes:

The Analytical Engine has no pretensions whatever to originate anything. It can do whatever we know how to order it to perform. It can follow analysis; but it has no power of anticipating any analytical relations or truths.

In the closing chapter, titled “Ada Forever,” Isaacson considers the enduring implications of this question:

Ada might also be justified in boasting that she was correct, at least thus far, in her more controversial contention: that no computer, no matter how powerful, would ever truly be a “thinking” machine. A century after she died, Alan Turing dubbed this “Lady Lovelace’s Objection” and tried to dismiss it by providing an operational definition of a thinking machine — that a person submitting questions could not distinguish the machine from a human — and predicting that a computer would pass this test within a few decades. But it’s now been more than sixty years, and the machines that attempt to fool people on the test are at best engaging in lame conversation tricks rather than actual thinking. Certainly none has cleared Ada’s higher bar of being able to “originate” any thoughts of its own.

In encapsulating Ada’s ultimate legacy, Isaacson once again touches on our ambivalence about the mythologies of genius — perhaps even more so of women’s genius — and finds wisdom in her own words:

As she herself wrote in those “Notes,” referring to the Analytical Engine but in words that also describe her fluctuating reputation, “In considering any new subject, there is frequently a tendency, first, to overrate what we find to be already interesting or remarkable; and, secondly, by a sort of natural reaction, to undervalue the true state of the case.”

The reality is that Ada’s contribution was both profound and inspirational. More than Babbage or any other person of her era, she was able to glimpse a future in which machines would become partners of the human imagination, together weaving tapestries as beautiful as those from Jacquard’s loom. Her appreciation for poetical science led her to celebrate a proposed calculating machine that was dismissed by the scientific establishment of her day, and she perceived how the processing power of such a device could be used on any form of information. Thus did Ada, Countess of Lovelace, help sow the seeds for a digital age that would blossom a hundred years later.

Ada died of progressively debilitating uterine cancer in 1852, when she was thirty-six — the same age at which her father had died. She requested that she be buried in a country grave, alongside the father whom she never knew but whose poetical sensibility profoundly shaped her own genius of “poetical science.”

The Innovators goes on to trace Ada’s influence as it reverberates through the seminal work of a stable of technological pioneers over the century and a half since her death. Complement it with Ada’s spirited letter on science and religion.

12 NOVEMBER, 2014

The Mirror and the Meme: A 600-Year History of the Selfie

How glass, tin, and mercury converged on a Venetian island in the 15th century to fundamentally change the way we look at ourselves.

In 1977, long before the social web as we know it existed, Susan Sontag foresaw a new dawn of “aesthetic consumerism” sparked by photography’s social aspect. Today, nowhere is this phenomenon more glaring than in the ubiquitous selfie. But how did we end up with this peculiar Möbius strip of self-image, turning our gaze inward and outward at the same time?

In How We Got to Now: Six Innovations That Made the Modern World (public library | IndieBound) — which also gave us the mind-stretching story of how Galileo invented time — Steven Johnson peels back history’s curtains to reveal the unsuspected causal chains of innovation behind “that iconic, early-twenty-first-century act: snapping a selfie on your phone.” We might be apt to celebrate innovations like the evolution of photography and the rise of the social web as the impetus for the selfie revolution — but these, Johnson points out with his penchant for the unexpected sociocultural twist, can’t hold a candle to the true breakthrough that makes it all possible: glass. He explains:

Glass supports this entire network: we take pictures through glass lenses, store and manipulate them on circuit boards made of fiberglass, transmit them around the world via glass cables, and enjoy them on screens made of glass. It’s silicon dioxide all the way down the chain.

We already know that glass is a remarkable material that planted the seed for the innovation gap between East and West, but the role of glass in the journey of our self-image extends beyond the technology and into how we think about the human countenance itself. Even though the self-portrait is a fixture of Renaissance art and early modernism, Johnson points out that it practically didn’t exist as an artistic convention until the beginning of the 15th century. “People painted landscapes and royalty and religious scenes and a thousand other subjects,” he notes. “But they didn’t paint themselves.”

And then something happened — glassmakers on the Venetian island of Murano figured out how to combine glass with a new technological breakthrough in metallurgy, which allowed them to coat the back of a piece of glass with a medley of tin and mercury, producing a highly reflective surface. The result was the mirror, which forever changed how we see ourselves. Johnson writes:

For the first time, mirrors became part of the fabric of everyday life. This was a revelation on the most intimate of levels: before mirrors came along, the average person went through life without ever seeing a truly accurate representation of his or her face, just fragmentary, distorted glances in pools of water or polished metals.

Mirrors appeared so magical that they were quickly integrated into somewhat bizarre sacred rituals: During holy pilgrimages, it became common practice for well-off pilgrims to take a mirror with them. When visiting sacred relics, they would position themselves so that they could catch sight of the bones in the mirror’s reflection. Back home, they would then show off these mirrors to friends and relatives, boasting that they had brought back physical evidence of the relic by capturing the reflection of the sacred scene. Before turning to the printing press, Gutenberg had the start-up idea of manufacturing and selling small mirrors for departing pilgrims.

But the most momentous impact of the mirror, Johnson argues, was a secular one — in revolutionizing the art of seeing, it invariably revolutionized art itself:

Filippo Brunelleschi employed a mirror to invent linear perspective in painting, by drawing a reflection of the Florence Baptistry instead of his direct perception of it. The art of the late Renaissance is heavily populated by mirrors lurking inside paintings, most famously in Diego Velázquez’s inverted masterpiece, Las Meninas, which shows the artist (and the extended royal family) in the middle of painting King Philip IV and Queen Mariana of Spain. The entire image is captured from the point of view of two royal subjects sitting for their portrait; it is, in a very literal sense, a painting about the act of painting. The king and queen are visible only in one small fragment of the canvas, just to the right of Velázquez himself: two small, blurry images reflected back in a mirror. As a tool, the mirror became an invaluable asset to painters who could now capture the world around them in a far more realistic fashion, including the detailed features of their own faces.

'Las Meninas' by Diego Velázquez, 1656

Even Da Vinci extolled the creative value of the mirror in his notebooks:

When you wish to see whether the general effect of your picture corresponds with that of the object represented after nature, take a mirror and set it so that it reflects the actual thing, and then compare the reflection with your picture, and consider carefully whether the subject of the two images is in conformity with both, studying especially the mirror. The mirror ought to be taken as a guide.

Johnson considers the role of this breakthrough as a sensemaking device in an era when we were using glass to orient ourselves to the cosmos and, thanks to the mirror, to orient ourselves to ourselves:

At the exact moment that the glass lens was allowing us to extend our vision to the stars or microscopic cells, glass mirrors were allowing us to see ourselves for the first time. It set in motion a reorientation of society that was more subtle, but no less transformative, than the reorientation of our place in the universe that the telescope engendered.

[…]

The mirror played a direct role in allowing artists to paint themselves and invent perspective as a formal device; and shortly thereafter a fundamental shift occurred in the consciousness of Europeans that oriented them around the self in a new way, a shift that would ripple across the world (and that is still rippling).

The Hall of Mirrors at Versailles Palace

This rippling, Johnson points out, was a perfect example of the hummingbird effect at work as various sociocultural forces converged to unleash its full potential for transformation:

The self-centered world played well with the early forms of modern capitalism that were thriving in places like Venice and Holland (home to those masters of painterly introspection, Dürer and Rembrandt). Likely, these various forces complemented each other: glass mirrors were among the first high-tech furnishings for the home, and once we began gazing into those mirrors, we began to see ourselves differently, in ways that encouraged the market systems that would then happily sell us more mirrors…

The mirror doesn’t “force” the Renaissance to happen; it “allows” it to happen.

[…]

Without a technology that enabled humans to see a clear reflection of reality, including their own faces, the particular constellation of ideas in art and philosophy and politics that we call the Renaissance would have had a much more difficult time coming into being.

Peering into the history of the mirror and the future of its modern progeny — which includes, notably, the selfie — Johnson leaves the trajectory of its sociocultural impact open-ended:

The mirror helped invent the modern self, in some real but unquantifiable way. That much we should agree on. Whether that was a good thing in the end is a separate question, one that may never be settled conclusively.

How We Got to Now is an illuminating read in its totality. Complement it with 100 ideas that changed art.

Public domain photographs via Flickr Commons

21 OCTOBER, 2014

Craigslist Founder Craig Newmark on Trust, Integrity, Human Nature, and Why a Steady Moral Compass Is the Best Investment

“What surprises me, in a way, is how almost universally people are trustworthy and good.”

In 2007, Y Combinator founding partner Jessica Livingston set out “to establish a fund of experience that everyone can learn from” by interviewing some of the most successful entrepreneurs at the time — the founders and first employees of such celebrated companies as Apple, PayPal, Flickr, Adobe, and Firefox. The resulting conversations were published in the now-classic volume Founders at Work: Stories of Startups’ Early Days (public library), titled after the Paris Review’s iconic Writers at Work.

Today, in a culture that talks a great deal about “creating value” but seems to care very little about upholding values, and writes history with the same bias, I keep coming back to the most heartening interview in the volume — Livingston’s conversation with craigslist founder Craig Newmark, whose beloved lo-fi website began in 1994 as a hunch, became a humble side-project email list in 1995 highlighting interesting events in the San Francisco area, and turned into Newmark’s full-time labor-of-love business in 1999. In 2004, eBay purchased a 25% stake in the company from a former employee, but craigslist remains independent and privately owned, helping millions of people in several hundred cities around the world find everything from used couches to true love. Underpinning the site’s success is Newmark’s own idealism, his adamant refusal to surrender to cynicism or succumb to commercialism, and his unflinching faith in the human spirit.

Newmark’s most powerful tool as an entrepreneur and a human being is the very thing Kurt Vonnegut believed was the key to happiness — the knowledge that one has enough. Recounting a pivotal point at which advertisers began approaching him about running banner ads on his free site, Newmark gets to the heart of the values question:

I thought about my own values and I was thinking, “Hey, how much money do I need?” … So I figured I would just not do that.

At that point, I got the first inkling of what I now call my “moral compass.” I better understood it later—particularly since the presidential elections, because then I realized that people were claiming a moral high ground who actually didn’t practice what they preached, and it’s about time for people of goodwill to reassert their idea of what’s right and what’s wrong.

Newmark was able to stay true to his own values by making very deliberate choices about not letting outside interests interfere with his vision — specifically investors, who invariably bring their own financial interests and thus begin to warp values in favor of narrowly defined “value.” Newmark tells Livingston:

I’ve stepped away from a huge amount of money, and I’m following through.

[…]

I coasted on savings for several months… I funded it with my own time. In no form did we ever take investment money… For the most part, for the first few years, it was just putting my own time and energy into it. If I was billing for my own hours, it would have been a great deal of money.

And that energy was considerable — when Livingston asks whether craigslist garnered “a positive response pretty quickly,” Newmark, speaking to the idea that one should “expect anything worthwhile to take a long time,” responds:

Our traffic has always been slow but sure. We’re the tortoise, not the hare. Now and then we’ll get a surge of growth, but it’s been slow but steady.

At this intersection of firm values and steadfast dedication lies Newmark’s most essential insight. While “follow your gut” is a common platitude often dismissed with a scoff, especially in our culture of great impatience for any semblance of earnestness, there is something to be said for the difference between a throwaway aphorism and an ideal enacted in one’s own life as a “quiet, precise, judicious exercise of probity and care — with no one there to see or cheer.” Newmark’s greatest learning is very much the latter:

The biggest entrepreneurial lesson I’ve learned has been that you really do need to follow your instincts.

[…]

Trust your instincts and your moral compass… The deal is: we’re not pious about this. We try hard not to be sanctimonious. This is the way people really live; we just don’t talk about it. I’d prefer to be cynical and not talk about it, and yet, that’s real life.

Therein lies his most heartening conviction — the same one Isaac Asimov shared in his spectacular short meditation on cynicism and the human spirit. Newmark, like Asimov, speaks from a place of resolute humanism, echoing legendary graphic designer Milton Glaser’s memorable perspective on the universe. He tells Livingston:

What surprises me, in a way, is how almost universally people are trustworthy and good. There are problems, and sometimes people bicker, which is a pain in the ass, but people are good. No matter what your religious background, we share pretty much the same values. There are some minor differences that we disagree on, but the differences are at the 5 percent level. That’s pretty good.

Artwork from Sophie Blackall's illustrated craigslist missed connections.

After noting that the two most important factors in his company culture were an atmosphere of trust and a keen moral compass, Newmark considers how that reverberates throughout the craigslist community itself. When Livingston asks whether he ever worried about spammers and other ill-willed people trying to take advantage of the site, he answers:

We have a really good culture of trust on the site — of goodwill. You know, we’re finding that pretty much everyone out there shares, more or less, the same moral compass as we do and as my personal one. People are good. There are some bad guys out there, but they are a very tiny minority and our community is self-policing. People want other people to play fair, and that works… It works great in all sorts of ways, and it’s also an expression of our values. Mutual trust. This is kind of democracy in real life. Everyone wins, except for the bad guys.

Founders at Work is a trove of wisdom in its entirety, from Paul Graham’s characteristically contrarian and inspiring introduction to the remaining interviews with legendary entrepreneurs like Steve Wozniak, Caterina Fake, and Brewster Kahle.

Complement this particular excerpt with the question posed by Alan Watts — what would you do if money was no object? — which should underpin every entrepreneurial pursuit, then revisit this field guide to finding your purpose.

20 OCTOBER, 2014

The Hummingbird Effect: How Galileo Invented Timekeeping and Forever Changed Modern Life

How the invisible hand of the clock powered the Industrial Revolution and sparked the Information Age.

While we appreciate it in the abstract, few of us pause to grasp the miracles of modern life, from artificial light to air conditioning, or, as Steven Johnson puts it in the excellent How We Got to Now: Six Innovations That Made the Modern World (public library), “how amazing it is that we drink water from a tap and never once worry about dying forty-eight hours later from cholera.” Understanding how these everyday marvels first came to be, then came to be taken for granted, not only allows us to see our familiar world with new eyes — something we are wired not to do — but also lets us appreciate the remarkable creative lineage behind even the most mundane of technologies underpinning modern life. Johnson writes in the introduction:

Our lives are surrounded and supported by a whole class of objects that are enchanted with the ideas and creativity of thousands of people who came before us: inventors and hobbyists and reformers who steadily hacked away at the problem of making artificial light or clean drinking water so that we can enjoy those luxuries today without a second thought, without even thinking of them as luxuries in the first place… We are indebted to those people every bit as much as, if not more than, we are to the kings and conquerors and magnates of traditional history.

Johnson points out that, much like the evolution of bees gave flowers their colors and the evolution of pollen altered the design of the hummingbird’s wings, the most remarkable thing about innovations is the way they precipitate unanticipated changes that reverberate far and wide beyond the field or discipline or problem at the epicenter of the particular innovation. Pointing to the Gutenberg press — itself already an example of the combinatorial nature of creative breakthroughs — Johnson writes:

Johannes Gutenberg’s printing press created a surge in demand for spectacles, as the new practice of reading made Europeans across the continent suddenly realize that they were farsighted; the market demand for spectacles encouraged a growing number of people to produce and experiment with lenses, which led to the invention of the microscope, which shortly thereafter enabled us to perceive that our bodies were made up of microscopic cells. You wouldn’t think that printing technology would have anything to do with the expansion of our vision down to the cellular scale, just as you wouldn’t have thought that the evolution of pollen would alter the design of a hummingbird’s wing. But that is the way change happens.

Johnson terms these complex chains of influences the “hummingbird effect,” a play on the famous “butterfly effect” concept from chaos theory — Edward Lorenz’s metaphor for the idea that a change as imperceptible as the flap of a butterfly’s wings can result in an effect as grand as a hurricane far away several weeks later — but different in a fundamental way:

The extraordinary (and unsettling) property of the butterfly effect is that it involves a virtually unknowable chain of causality; you can’t map the link between the air molecules bouncing around the butterfly and the storm system brewing in the Atlantic. They may be connected, because everything is connected on some level, but it is beyond our capacity to parse those connections or, even harder, to predict them in advance. But something very different is at work with the flower and the hummingbird: while they are very different organisms, with very different needs and aptitudes, not to mention basic biological systems, the flower clearly influences the hummingbird’s physiognomy in direct, intelligible ways.

Under the “hummingbird effect,” an innovation in one field can trigger unexpected breakthroughs in wholly different domains, but the traces of those original influences often remain obscured. Illuminating them allows us to grasp the many dimensions of change, its complex and often unintended consequences, the multiple scales of experience that have always defined human history and, perhaps above all, to lend much-needed dimension to the flat myth of genius. Playing off the sentiment at the heart of Richard Feynman’s famous ode to a flower, Johnson writes:

History happens on the level of atoms, the level of planetary climate change, and all the levels in between. If we are trying to get the story right, we need an interpretative approach that can do justice to all those different levels.

[…]

There is something undeniably appealing about the story of a great inventor or scientist — Galileo and his telescope, for instance — working his or her way toward a transformative idea. But there is another, deeper story that can be told as well: how the ability to make lenses also depended on the unique quantum mechanical properties of silicon dioxide and on the fall of Constantinople. Telling the story from that long-zoom perspective doesn’t subtract from the traditional account focused on Galileo’s genius. It only adds.

Nundinal calendar, Rome. The ancient Etruscans developed an eight-day market week, known as the nundinal cycle, around the eighth or seventh century BC.

In fact, of the six such widely reverberating innovations that Johnson highlights, the one sparked by Galileo is the most fascinating because it captures so many dimensions of our eternal and eternally bedeviled relationship with time — our astoundingly elastic perception of it, the way it dictates our internal rhythms and our creative routines, its role in free will, and much more. Johnson tells an absorbing origin story the way only he can:

Legend has it that in 1583, a nineteen-year-old student at the University of Pisa attended prayers at the cathedral and, while daydreaming in the pews, noticed one of the altar lamps swaying back and forth. While his companions dutifully recited the Nicene Creed around him, the student became almost hypnotized by the lamp’s regular motion. No matter how large the arc, the lamp appeared to take the same amount of time to swing back and forth. As the arc decreased in length, the speed of the lamp decreased as well. To confirm his observations, the student measured the lamp’s swing against the only reliable clock he could find: his own pulse.

The swinging altar lamp inside Duomo of Pisa

That teenager, of course, was Galileo. Johnson explains the significance of that mythic moment:

That Galileo was daydreaming about time and rhythm shouldn’t surprise us: his father was a music theorist and played the lute. In the middle of the sixteenth century, playing music would have been one of the most temporally precise activities in everyday culture. (The musical term “tempo” comes from the Italian word for time.) But machines that could keep a reliable beat didn’t exist in Galileo’s age; the metronome wouldn’t be invented for another few centuries. So watching the altar lamp sway back and forth with such regularity planted the seed of an idea in Galileo’s young mind. As is so often the case, however, it would take decades before the seed would blossom into something useful.

'Portrait of Galileo Galilei' by Justus Sustermans, 1636

Indeed, Galileo’s mass experience stands as a spectacular testament to the usefulness of useless knowledge. Over the next two decades, he busied himself with becoming a professor of mathematics, tinkering with telescopes, and, as Johnson aptly puts it, “more or less inventing modern science” (and withstanding the pushback). And yet he kept the image of that swinging altar lamp on the back-burner of his mind. Eventually, as he grew increasingly enchanted with motion and dynamics, he decided to build a pendulum that would simulate what he had observed that distant day at the cathedral. His discovery confirmed his intuition — what determined the time it took the pendulum to swing wasn’t the size of the arc or the weight of the object, but merely the length of the string. Johnson cites Galileo’s excited letter to his peer Giovanni Battista Baliani:

The marvelous property of the pendulum is that it makes all its vibrations, large or small, in equal times.

Galileo's sketches for the pendulum clock
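In modern terms, the property Galileo noticed is captured by the small-angle pendulum formula T = 2π√(L/g): the period depends only on the string’s length and on gravity, not on the width of the swing or the weight of the bob. A minimal sketch of that relationship (the formula and constant are standard physics rather than anything from Johnson’s text, and the function name is purely illustrative):

```python
import math

def pendulum_period(length_m: float, g: float = 9.81) -> float:
    """Small-angle period of a simple pendulum: T = 2 * pi * sqrt(L / g).

    Galileo's claim of "equal times" for arcs of any size holds only
    approximately; for small swings the period is set by length alone.
    """
    return 2 * math.pi * math.sqrt(length_m / g)

# Neither the swing's width nor the bob's weight appears anywhere --
# only the length of the string changes the period.
for length_m in (0.25, 1.0, 4.0):
    print(f"L = {length_m:>4} m  ->  T = {pendulum_period(length_m):.2f} s")
```

Quadrupling the length merely doubles the period, and a pendulum roughly a metre long completes each swing in almost exactly one second, which is part of why it became such a natural timekeeper.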

In our present age of productivity, when our entire lives depend on accurate timekeeping — from our daily routines to our conference calls to financial markets and flights — it’s hard to imagine just how groundbreaking and downright miraculous the concept of measuring time accurately was in 16th-century Italy. And yet that’s precisely what it was — Italian towns then, Johnson points out, had clunky mechanical clocks that reflected a loose estimation of time, often losing twenty minutes a day, and had to be constantly corrected by sundial readings. Johnson writes:

The state of the art in timekeeping technology was challenged by just staying accurate on the scale of days. The idea of a timepiece that might be accurate to the second was preposterous.

Preposterous, and seemingly unnecessary. Just like Frederic Tudor’s ice trade, it was an innovation that had no natural market. You couldn’t keep accurate time in the middle of the sixteenth century, but no one really noticed, because there was no need for split-second accuracy. There were no buses to catch, or TV shows to watch, or conference calls to join. If you knew roughly what hour of the day it was, you could get by just fine.

Discus chronologicus, early 1720s, from Cartographies of Time.

This is where the wings of the hummingbird begin to flutter: The real tipping point in accuracy, Johnson points out in a twist, “would emerge not from the calendar but from the map” — which makes sense given our long history of using cartography to measure time. He explains:

This was the first great age of global navigation, after all. Inspired by Columbus, ships were sailing to the Far East and the newly discovered Americas, with vast fortunes awaiting those who navigated the oceans successfully. (And almost certain death awaiting those who got lost.) But sailors lacked any way to determine longitude at sea. Latitude you could gauge just by looking up at the sky. But before modern navigation technology, the only way to figure out a ship’s longitude involved two clocks. One clock was set to the exact time of your origin point (assuming you knew the longitude of that location). The other clock recorded the current time at your location at sea. The difference between the two times told you your longitudinal position: every four minutes of difference translated to one degree of longitude, or sixty-eight miles at the equator.

In clear weather, you could easily reset the ship clock through accurate readings of the sun’s position. The problem was the home-port clock. With timekeeping technology losing or gaining up to twenty minutes a day, it was practically useless on day two of the journey.
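The arithmetic in that passage is simple enough to make concrete: the earth turns 360 degrees in 24 hours, that is, 15 degrees per hour, or one degree every four minutes. Below is a minimal sketch of the conversion; the function name and the sign convention (positive meaning west of the home port) are illustrative assumptions rather than anything from the book:

```python
def longitude_offset_deg(home_port_time_h: float, local_time_h: float) -> float:
    """Degrees of longitude between ship and home port, from two clocks.

    The earth rotates 15 degrees per hour (360 degrees / 24 hours), so one
    degree corresponds to four minutes of clock difference. Positive values
    mean the ship is west of home (its local time lags behind).
    """
    hours_behind = home_port_time_h - local_time_h
    return hours_behind * 15.0

# If local solar noon arrives when the home-port clock reads 15:00,
# the ship is three hours behind -- roughly 45 degrees west of its origin.
print(longitude_offset_deg(home_port_time_h=15.0, local_time_h=12.0))  # 45.0
```

Which also makes vivid why a home-port clock drifting by twenty minutes a day was useless: that drift alone amounts to five degrees of longitude, hundreds of miles of error at the equator, after a single day at sea.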

This was an era when European royalty offered handsome bounties for specific innovations — the then-version of venture capital — incentivizing such scientific breakthroughs as Maria Mitchell’s comet discoveries and Johannes Hevelius’s star catalog. As the need to solve the navigation problem grew in urgency, the rewards offered for a solution grew in magnitude — and this was what resurfaced Galileo’s teenage vision for “equal time” all those years later. Johnson describes Galileo’s journey as a superb example of the “slow churn” of creativity, the value of cross-pollinating disciplines, and the importance of playing “the long game”:

[Galileo’s] astronomical observations had suggested that the regular eclipses of Jupiter’s moons might be useful for navigators keeping time at sea, but the method he devised was too complicated (and not as accurate as he had hoped). And so he returned, one last time, to the pendulum.

Fifty-eight years in the making, his slow hunch about the pendulum’s “magical property” had finally begun to take shape. The idea lay at the intersection point of multiple disciplines and interests: Galileo’s memory of the altar lamp, his studies of motion and the moons of Jupiter, the rise of a global shipping industry, and its new demand for clocks that would be accurate to the second. Physics, astronomy, maritime navigation, and the daydreams of a college student: all these different strains converged in Galileo’s mind. Aided by his son, he began drawing up plans for the first pendulum clock.

There is something so poetic about Galileo inventing split-second time for the public on a private scale of decades.

Over the century that followed, the pendulum clock, a hundred times more accurate than any preceding technology, became a staple of European life and forever changed our relationship with time. But the hummingbird’s wings continued to flap — accurate timekeeping became the imperceptible heartbeat beneath all technology of the Industrial Revolution, from scheduling the division of labor in factories to keeping steam-powered locomotives running on time. It was the invisible hand of the clock that first moved the market — a move toward unanticipated innovations in other fields. Without clocks, Johnson argues, the Industrial Revolution may have never taken off — or “at the very least, have taken much longer to reach escape velocity.” He explains:

Accurate clocks, thanks to their unrivaled ability to determine longitude at sea, greatly reduced the risks of global shipping networks, which gave the first industrialists a constant supply of raw materials and access to overseas markets. In the late 1600s and early 1700s, the most reliable watches in the world were manufactured in England, which created a pool of expertise with fine-tool manufacture that would prove to be incredibly handy when the demands of industrial innovation arrived, just as the glassmaking expertise producing spectacles opened the door for telescopes and microscopes. The watchmakers were the advance guard of what would become industrial engineering.

But the most radical innovation of clock time was the emergence of the new working day. Up until that point, people divided their days not into modular abstract units — after all, what is an hour? — but into a fluid series of activities:

Instead of fifteen minutes, time was described as how long it would take to milk the cow or nail soles to a new pair of shoes. Instead of being paid by the hour, craftsmen were conventionally paid by the piece produced — what was commonly called “taken-work” — and their daily schedules were almost comically unregulated.

Rather, they were self-regulated by shifting factors like the worker’s health or mood, the weather, and the available daylight during that particular season. The emergence of factories demanded a reliable, predictable industrial workforce, which in turn called for fundamentally reframing the human perception of time. In one particularly pause-giving parenthetical aside, Johnson writes:

The lovely double entendre of “punching the clock” would have been meaningless to anyone born before 1700.

Workers punching the time clock at the Rouge Plant of the Ford Motor Company

And yet, as with most innovations, the industrialization of time came with a dark side — one Bertrand Russell so eloquently lamented in the 1920s when he asked: “What will be the good of the conquest of leisure and health, if no one remembers how to use them?” Johnson writes:

The natural rhythms of tasks and leisure had to be forcibly replaced with an abstract grid. When you spend your whole life inside that grid, it seems like second nature, but when you are experiencing it for the first time, as the laborers of industrial England did in the second half of the eighteenth century, it arrives as a shock to the system. Timepieces were not just tools to help you coordinate the day’s events, but something more ominous: the “deadly statistical clock,” in Dickens’s Hard Times, “which measured every second with a beat like a rap upon a coffin lid.”

[…]

To be a Romantic at the turn of the nineteenth century was in part to break from the growing tyranny of clock time: to sleep late, ramble aimlessly through the city, refuse to live by the “statistical clocks” that governed economic life… The time discipline of the pendulum clock took the informal flow of experience and nailed it to a mathematical grid. If time is a river, the pendulum clock turned it into a canal of evenly spaced locks, engineered for the rhythms of industry.

Johnson goes on to trace the hummingbird flutterings to the emergence of pocket watches, the democratization of time through the implementation of Standard Time, and the invention of the first quartz clock in 1928, which boasted the unprecedented accuracy of losing or gaining only one thousandth of a second per day. He observes the most notable feature of these leaps and bounds:

One of the strangest properties of the measurement of time is that it doesn’t belong neatly to a single scientific discipline. In fact, each leap forward in our ability to measure time has involved a handoff from one discipline to another. The shift from sundials to pendulum clocks relied on a shift from astronomy to dynamics, the physics of motion. The next revolution in time would depend on electromechanics. With each revolution, though, the general pattern remained the same: scientists discover some natural phenomenon that displays the propensity for keeping “equal time” that Galileo had observed in the altar lamps, and before long a wave of inventors and engineers begin using that new tempo to synchronize their devices.

But the most groundbreaking effect of the quartz clock — the most unpredictable manifestation of the hummingbird effect in the story of time — was that it gave rise to modern computing and the Information Age. Johnson writes:

Computer chips are masters of time discipline… Instead of thousands of operations per minute, the microprocessor is executing billions of calculations per second, while shuffling information in and out of other microchips on the circuit board. Those operations are all coordinated by a master clock, now almost without exception made of quartz… A modern computer is the assemblage of many different technologies and modes of knowledge: the symbolic logic of programming languages, the electrical engineering of the circuit board, the visual language of interface design. But without the microsecond accuracy of a quartz clock, modern computers would be useless.

Theodor Nelson's pioneering 1974 book 'Computer Lib | Dream Machines,' an exploration of the creative potential of computer networks, from '100 Ideas that Changed the Web'

But as is often the case given the “thoroughly conscious ignorance” by which science progresses, new frontiers of knowledge only exposed what is yet to be reached. With the invention of the quartz clock also came the realization that the length of the day wasn’t as reliable as previously thought and the earth’s rotation wasn’t the most accurate tool for reaching Galileo’s measurement ideal of “equal time.” As Johnson puts it, “quartz let us ‘see’ that the seemingly equal times of a solar day weren’t nearly as equal as we had assumed” — the fact that a block of vibrating sand did a better job of keeping time than the sun and the earth, celebrated for centuries as the ultimate timekeepers, became the ultimate “deathblow to the pre-Copernican universe.”

What accurate timekeeping needed, ever since Galileo’s contemplation of the pendulum, was something that oscillated in the most consistent rhythm possible — and that’s what Niels Bohr and Werner Heisenberg’s elucidation of the atom’s inner workings in the early twentieth century finally provided. With its rhythmically spinning electrons, the smallest chemical unit became the greatest and most consistent oscillator ever known. When the first atomic clocks were built in the 1950s, they introduced a groundbreaking standard of accuracy, measuring time down to the nanosecond, a thousandfold improvement on the quartz clock’s microseconds.

Half a century later, this unprecedented precision is something we’ve come to take for granted — and yet it continues to underpin our lives with a layer of imperceptible magic. In one example, Johnson brings us full-circle to the relationship between timekeeping and map navigation where Galileo began:

Every time you glance down at your smartphone to check your location, you are unwittingly consulting a network of twenty-four atomic clocks housed in satellites in low-earth orbit above you. Those satellites are sending out the most elemental of signals, again and again, in perpetuity: the time is 11:48:25.084738 . . . the time is 11:48:25.084739. . . . When your phone tries to figure out its location, it pulls down at least three of these time stamps from satellites, each reporting a slightly different time thanks to the duration it takes the signal to travel from satellite to the GPS receiver in your hand. A satellite reporting a later time is closer than one reporting an earlier time. Since the satellites have perfectly predictable locations, the phone can calculate its exact position by triangulating among the three different time stamps. Like the naval navigators of the eighteenth century, GPS determines your location by comparing clocks. This is in fact one of the recurring stories of the history of the clock: each new advance in timekeeping enables a corresponding advance in our mastery of geography — from ships, to railroads, to air traffic, to GPS. It’s an idea that Einstein would have appreciated: measuring time turns out to be key to measuring space.
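The comparison of time stamps in that passage boils down to converting a signal’s flight time into a distance at the speed of light. Here is a minimal sketch under idealized assumptions (perfectly synchronized clocks; real receivers also solve for their own clock error, which is why four satellites are typically used rather than three); the function name is illustrative:

```python
C_M_PER_S = 299_792_458.0  # speed of light in vacuum, metres per second

def pseudorange_m(satellite_timestamp_s: float, receiver_time_s: float) -> float:
    """Distance implied by a satellite's broadcast time stamp.

    The signal took (receiver_time - satellite_timestamp) seconds to arrive,
    so the implied distance is that flight time multiplied by the speed of
    light. A later time stamp heard at the same instant means a shorter
    flight, i.e. a closer satellite -- the comparison the passage describes.
    """
    return (receiver_time_s - satellite_timestamp_s) * C_M_PER_S

# A signal that took 67 milliseconds to arrive implies a satellite roughly
# 20,000 km away; several such ranges, combined with the satellites' known
# positions, pin down the receiver's location.
print(f"{pseudorange_m(satellite_timestamp_s=0.0, receiver_time_s=0.067):,.0f} m")
```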

Therein lies the remarkable power and reach of the hummingbird effect, which Johnson condenses into an elegant concluding reflection:

Embedded in your ability to tell the time is the understanding of how electrons circulate within cesium atoms; the knowledge of how to send microwave signals from satellites and how to measure the exact speed with which they travel; the ability to position satellites in reliable orbits above the earth, and of course the actual rocket science needed to get them off the ground; the ability to trigger steady vibrations in a block of silicon dioxide — not to mention all the advances in computation and microelectronics and network science necessary to process and represent that information on your phone. You don’t need to know any of these things to tell the time now, but that’s the way progress works: the more we build up these vast repositories of scientific and technological understanding, the more we conceal them. Your mind is silently assisted by all that knowledge each time you check your phone to see what time it is, but the knowledge itself is hidden from view. That is a great convenience, of course, but it can obscure just how far we’ve come since Galileo’s altar-lamp daydreams in the Duomo of Pisa.

But perhaps the strangest thing about time is how each leap of innovation further polarized the scales on which it played out. As in the case of Galileo, who took six decades to master the minute, the same breakthroughs that gave atomic time its trailblazing accuracy also gave us radiation science and radiometric dating, which was essential in debunking the biblical myth and proving that the earth’s age was in the billions, not thousands, of years.

5,068-year-old bristlecone pine from Rachel Sussman's 'The Oldest Living Things in the World'

Pointing to the Long Now Foundation’s quest to bury a clock that ticks once every 10,000 years beneath some of the oldest living pines in the world — an effort to extract us from the toxic grip of short-termism and, in the words of Long Now founder Kevin Kelly, nudge us to think about “generational-scale questions and projects” — Johnson ends with a wonderfully poetic reflection:

This is the strange paradox of time in the atomic age: we live in ever shorter increments, guided by clocks that tick invisibly with immaculate precision; we have short attention spans and have surrendered our natural rhythms to the abstract grid of clock time. And yet simultaneously, we have the capacity to imagine and record histories that are thousands or millions of years old, to trace chains of cause and effect that span dozens of generations. We can wonder what time it is and glance down at our phone and get an answer that is accurate to the split-second, but we can also appreciate that the answer was, in a sense, five hundred years in the making: from Galileo’s altar lamp to Niels Bohr’s cesium, from the chronometer to Sputnik. Compared to an ordinary human being from Galileo’s age, our time horizons have expanded in both directions: from the microsecond to the millennium.

In the remainder of How We Got to Now, a remarkable and perspective-shifting masterwork in its entirety, Johnson goes on to examine with equal dimension and rigor the workings of the hummingbird effect through the invention and evolution of such concepts as sound, light, glass, sanitation, and cooling.

For more on the mysteries of time, see these seven revelatory perspectives from a variety of fields, then revisit the curious psychology of why time slows down when you’re afraid, speeds up as you age, and gets warped while you’re on vacation.
