“You can’t really, truly expect to explain design unless you explain intelligence.”
By Maria Popova
Sad days for the design world: We’ve lost Bill Moggridge (1943-2012) — visionary pioneer, designer of the world’s first laptop, director of the Cooper-Hewitt National Design Museum, and one infinitely kind man. No attempt to capture the full scope of his legacy and cultural imprint would be remotely adequate, but revisiting his inimitable insight into the essence, purpose, and sociocultural capacity of design is a poignant reminder of just what we’ve lost with his passing — and how much we’ve gained through his timeless wisdom.
In 2007, shortly after the publication of his now-iconic Designing Interactions, Moggridge gave an interview for Ambidextrous Magazine, in which he articulates with his signature blend of insight and irreverence some of the most critical issues that still keep design a culturally misunderstood discipline and design education a broken model.
[Academia is] all about explicit knowledge. And design, by definition — along with the other arts like poetry or writing — is mostly not so explicit. It’s mostly tacit knowledge. It has to do with people’s intuitions and harnessing the subconscious part of the mind rather than just the conscious. And the result is if you try and couch the respectability of a professor or some form of research grant in terms that are normal for science, then it looks very weak. And so you have to have a different attitude, really, in order to see the strength that it could offer or the value that it could offer. And that’s a big difficulty both in academia and in terms of foundations.
If you think about the structure of the mind, there just seems to be a small amount that is above the water—equivalent to an iceberg—which is the explicit part…And most academic subjects are designed to live in that explicit part that sticks out of the water. If you can find a way to harness, towards a productive goal, the rest of it, the subconscious [understanding], the tacit knowledge, the behavior — just doing it and the intuition — all those, then you can bring in the rest of the iceberg. And that is hugely valuable.
Every scientist is an intuitive person, and most ‘ahas’ come from intuition anyway. And we all know that we fall in love with things and that we’re interested in subjective qualitative values. It’s just whether you recognize it as having something that you can use in a respective environment or a respectable sense.
Ultimately, and perhaps precisely because it requires this kind of abstract knowledge and combinatorial creativity, Moggridge frames design as a form of intelligence:
I really don’t think you’re going to understand design and art until you understand intelligence [and how the brain works]. So you can dent it, you can sort of make things so there are interesting insights that will help people, and you can explain process, but you can’t really, truly expect to explain design unless you explain intelligence.
He leaves us with a seemingly simple but infinitely important reminder, wrapped in a hope for better design writing and education:
So few people seem to realize that everything’s designed. And until we get some good people telling the story, that’s probably going to continue to be the case. So I’d love it if there was a consciousness in the public mind that mathematics and reading and writing is not enough — you also need to learn how to do design. Because everything is designed, and the way our world exists around us depends on how well it’s designed.
Fortunately, we’ve had some really good people telling design’s story. But what tragedy to lose one of the best of them — Bill Moggridge, you will be missed.
Literature and science converge in a playful riff on a riff on a riff.
By Maria Popova
Remember the first poem published in a scientific journal? The one that turned out not to be the first? Reader Marco F. Barozzi ups the dramatic ante by pointing out in an email that while J. Storey’s may have been the first scientific paper written entirely in verse, verses already appeared in a work of the English physicist and mathematician Lewis F. Richardson (1881-1953), who pioneered the application of physics and computational mathematics to weather forecasting. In 1920, he used a quatrain as an epigraph of his paper “The supply of energy from and to Atmospheric Eddies,” published in Issue 686, Volume 97 of the journal Proceedings of the Royal Society of London.
After his studies of air turbulence led him to develop the Richardson criterion, a measure of the ratio of buoyant to mechanical turbulence, he delivered his breakthrough in a clever rhyme playing on “On Poetry: A Rhapsody,” a famous Jonathan Swift poem about fleas, and on Augustus De Morgan’s parody of Swift from A Budget of Paradoxes:
Big whorls have little whorls
That feed on their velocity,
And little whorls have lesser whorls
And so on to viscosity
The riff on Swift:
So, naturalists observe, a flea
Has smaller fleas that on him prey;
And these have smaller still to bite ’em;
And so proceed ad infinitum.
And the riff on De Morgan:
Great fleas have little fleas upon their backs to bite ’em,
And little fleas have lesser fleas, and so ad infinitum.
And the great fleas themselves, in turn, have greater fleas to go on;
While these again have greater still, and greater still, and so on.
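For reference, the criterion behind Richardson’s rhyme is usually written today in its gradient form (a standard modern formulation rather than Richardson’s original 1920 notation) as the ratio of buoyant suppression to shear production of turbulence:

```latex
\mathrm{Ri} \;=\; \frac{\dfrac{g}{\theta}\,\dfrac{\partial \theta}{\partial z}}
                       {\left(\dfrac{\partial u}{\partial z}\right)^{2}}
```

Here $g$ is gravitational acceleration, $\theta$ the potential temperature, and $u$ the horizontal wind speed, each varying with height $z$. When Ri is large, buoyancy damps the eddies; when it is small, wind shear keeps the cascade of “whorls” feeding on their velocity, just as the verse describes.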
Originality depends on new and striking combinations of ideas. It is obvious therefore that the more a man knows the greater scope he has for arriving at striking combinations. And not only the more he knows about his own subject but the more he knows beyond it of other subjects. It is a fact that has not yet been sufficiently stressed that those persons who have risen to eminence in arts, letters or sciences have frequently possessed considerable knowledge of subjects outside their own sphere of activity.
Harding goes on to give a number of examples: Pasteur was a bachelor of literature in addition to being a doctor of science; James Watt rested his mind from honing the steam engine with archeology and poetry; Immanuel Kant read classics, mathematics, physics, astronomy, metaphysics, law, geography, and travel; Goethe was a collector of art and science ephemera, and took a close interest in the engineering of canals, harbors, and tunnels; George Eliot was obsessed with philology:
Success depends on adequate knowledge: that is, it depends on sufficient knowledge of the special subject, and a variety of extraneous knowledge to produce new and original combinations of ideas. Technical skill must be so far developed that it is never a hindrance to the flow of ideas. The thinker does not sit down and say to himself: ‘now I am going to think out the relations between so and so.’ The process is not so much an active as a passive one. In short the thinker dreams over his subject.
One particularly interesting notion Harding puts forth is that of “fringe-ideas” — ideas on the periphery of the thinker’s particular inquiry, but resonant in tone and thus able to enhance and flow into the creative process:
[M]any ideas outside the subject become associated with it by a kind of interest association and acquire a similar tone. Thus they tend to become available at the same time as the ideas directly connected with the subject itself. The variety of interests tends to increase the richness of these extra ideas — ‘fringe-ideas’ — associated with the subject and thus to increase the possibilities of new and original combinations of thought.
The old-fashioned idea that in-born genius is enough by itself without a solid foundation of knowledge, is the reason why [famous creators] set themselves against the use of this term and their pupils against the state. Without the rock of knowledge genius has no foundation to make it durable. In the words of Eugene Delacroix: ‘Natural gifts unsupported by culture may be said to resemble the honeysuckle, charming in its grace, but without odour, that I see hanging from the trees in the forest.’
There is much to be said in favour of laying a work aside to mature; for one thing it gives the judgment time to operate; the mind is able to return to the work from time to time with a fresh outlook; and check it from many different angles. It follows also that if new ideas are to be set aside to develop and newly finished works left to ‘mature,’ there must be several things on hand at the same time in various stages of development. The continuity of attention is purposely shortened and interrupted partly on account of the rest this gives.
Harding goes on to prescribe the following method for capturing and harnessing ideas:
(i) The ideas occurring when in the glow of inspiration are (a) briefly noted down and (b) checked.
(ii) (a) The subject is worked upon immediately, the thinker being wholly absorbed by it to the exclusion for the time being of everything else, or (b) The subject is set aside to develop and is then worked upon after an interval of time has elapsed, (c) the first draft of the completed work or half of it perhaps is put aside to ‘mature’ for a while; then it is again revised before publication.
(iii) Working at two or more subjects concurrently.
(iv) Working up the imagination to the state of vision and sometimes an audition.
On the whole it appears that morning or night hours are the most favourable to the flow of ideas. It has been shown that a difficulty unsolvable the day before is sometimes solved in the morning upon waking. In fact the value of morning hours when the mind is fresh has long been recognized as a time to be consecrated to important work.
Night-time when awake is perhaps the best time of all for the flow of ideas…. The spiritual aloneness that comes over the thinker when the world sleeps, carrying with it the sense of detachment so essential to a creative thinker may account partly for the fascination and spell of working by night. It is, however, a spell, to be resisted since it may lead to practices dangerous alike to bodily and mental health: Byron, sometimes writing on Hollands and water, Schiller on strong coffee, wine-chocolate, old Rhenish, or Champagne, the poet Crabbe at one time on weak brandy and water and snuff, and Balzac on endless cups of black coffee.
Harding also points to the importance of bodily posture and the habit of motion that many creators cultivated: Dickens and Hugo were avid walkers during ideation; Burns often composed while “holding the plough”; Twain paced madly while dictating; Goethe, Scott, and Burns composed on horseback; Mozart preferred the back of a carriage; Lord Kelvin worked on his mathematical studies while traveling by train. Harding offers:
It is possible that the rhythmical movement of a carriage or train, of a horse and to a much lesser degree of walking, may produce on sensitive minds a slightly hypnotic effect conducive to that state of mind most favourable to the birth of ideas.
The true novelist, poet, musician, or artist is really a discoverer. Ideas — the theme of a plot, a poem, a picture, a theme of music — come to him as a gift. The idea, ‘the seed-corn’ as Brahms called it, he allows to develop naturally. There may come a point where it branches in one or many directions; he is free at this point to follow one or other. And it is here and here only that the judgment or choice of the true artist may legitimately be exercised. In fact the artist is in much the same position as a gardener growing prize rose trees, who in order to produce beautiful roses lops off unwanted shoots and suckers.
With its countless anecdotes from some of mankind’s most remarkable creators and its synthesis of common ground, An Anatomy of Inspiration is, if not a blueprint to true creativity, at the very least an invaluable lens on the nooks and crannies of the creative process.
When Henrietta Lacks (1920-1951), an African-American mother of five who migrated from the tobacco farms of Virginia to the poorest neighborhoods of Baltimore, died at the tragic age of 31 from cervical cancer, she didn’t realize she’d be the donor of cells that would create the HeLa immortal cell line — a line that didn’t die after a few cell divisions — making possible some of the most seminal discoveries in modern medicine. Though the tumor tissue was taken with neither her knowledge nor her consent, the HeLa cell was crucial in everything from the first polio vaccine to cancer and AIDS research. To date, scientists have grown more than 20 tons of HeLa cells.
Skloot weaves a fascinating and tender detective story about HeLa’s legacy through the discovery of Henrietta’s youngest daughter, Deborah, who didn’t know her mother but who always knew she wanted to be a scientist. As Skloot and Deborah, infinitely different yet united by the shared quest for answers, unravel one of the most absorbing mysteries of modern science, we also get a rich and sensitive tale about family, community, and the dark side of society’s capacity for exploiting its poorest and most vulnerable members. The book, one of the decade’s most excellent and ambitious science-and-so-much-more reads, is currently being made into an HBO movie by Oprah Winfrey and Alan Ball.
Good science is all about following the data as it shows up and letting yourself be proven wrong, and letting everything change while you’re working on it — and I think writing is the same way.” ~ Rebecca Skloot
In an interview with Skloot, David Dobbs offers a fascinating behind-the-scenes look at how the book came to life — a must-read for anyone interested in the intricacies of intelligent inquiry and storytelling, in science and in general.
THE BEGINNING OF INFINITY
Since time immemorial, mankind’s greatest questions — what is reality, what does it mean to be human, what is time, is there God — have endured as a pervasive frontier of intellectual inquiry through which we try to explain and make sense of the world, the pursuit of these elusive answers having germinated disciplines as diverse as philosophy and physics. But what place does explanation itself have in the universe and our understanding of it? That’s exactly what iconic physicist and quantum computation pioneer David Deutsch explores in The Beginning of Infinity: Explanations That Transform the World — an important and wildly illuminating new book on the nature and evolution of human knowledge. Fluidly switching between evolutionary biology, quantum physics, mathematics, philosophy, ancient history and more, Deutsch offers surprisingly — or, perhaps knowing his work, unsurprisingly — plausible answers to everything from why beauty exists to what is infinity.
Must progress come to an end — either in catastrophe or in some sort of completion — or is it unbounded? The answer is the latter. That unboundedness is the ‘infinity’ referred to in the title of this book. Explaining it, and the conditions under which progress can and cannot happen, entails a journey through virtually every fundamental field of science and philosophy. From each such field we learn that, although progress has no necessary end, it does have a necessary beginning: a cause, or an event with which it starts, or a necessary condition for it to take off and to thrive. Each of these beginnings is ‘the beginning of infinity’ as viewed from the perspective of that field. Many seem, superficially, to be unconnected. But they are all facets of a single attribute of reality, which I call the beginning of infinity.” ~ David Deutsch
In 2009, I had the pleasure of seeing Deutsch speak at TEDGlobal, where he delivered what was unequivocally the event’s most mind-bending talk, presenting a new way to explain explanation itself — a teaser for the book as he was in the heat of writing it. Stay on your toes and try to keep up:
Empiricism is inadequate because scientific theories explain the seen in terms of the unseen and the unseen, you have to admit, doesn’t come to us through the senses.” ~ David Deutsch
Bear in mind, this is no light beach book, nor is it an easy read, but it’s an incredibly lucid one, the kind of book that stays with you for your entire lifetime, insights from it finding their way, consciously or unconsciously, into every intellectual conversation you’ll ever have.
In Radioactive: Marie & Pierre Curie: A Tale of Love and Fallout, artist Lauren Redniss tells the story of Marie Curie — one of the most extraordinary figures in the history of science, a pioneer in researching radioactivity, a field the very name for which she coined, and not only the first woman to win a Nobel Prize but also the first person to win two Nobel Prizes, and in two different sciences — through the two invisible but immensely powerful forces that guided her life: radioactivity and love. Granted, the book was also atop my omnibus of the year’s best art and design books — but that’s because it’s truly extraordinary — a remarkable feat of thoughtful design and creative vision. To honor Curie’s spirit and legacy, Redniss rendered her poetic artwork in cyanotype, a 19th-century image printing process critical to the discovery of both X-rays and radioactivity itself — a cameraless photographic technique in which paper is coated with light-sensitive chemicals. Once exposed to the sun’s UV rays, this chemically-treated paper turns a deep shade of blue. The text in the book is a unique typeface Redniss designed using the title pages of 18th- and 19th-century manuscripts from the New York Public Library archive. She named it Eusapia LR, for the croquet-playing, sexually ravenous Italian Spiritualist medium whose séances the Curies used to attend. The book’s cover is printed in glow-in-the-dark ink.
Redniss tells a turbulent story — a passionate romance with Pierre Curie (honeymoon on bicycles!), the epic discovery of radium and polonium, Pierre’s sudden death in a freak accident in 1906, Marie’s affair with physicist Paul Langevin, her coveted second Nobel Prize — under which lie poignant reflections on the implications of Curie’s work more than a century later as we face ethically polarized issues like nuclear energy, radiation therapy in medicine, nuclear weapons and more.
Full review, with more images and Redniss’s TEDxEast talk, here.
THE PHYSICS BOOK
Einstein famously noted that the most incomprehensible thing about the world is that it’s comprehensible. In The Physics Book: From the Big Bang to Quantum Resurrection, 250 Milestones in the History of Physics, acclaimed science author Clifford Pickover offers a sweeping, lavishly illustrated chronology of comprehension by way of physics, from the Big Bang (13.7 billion BC) to Quantum Resurrection (> 100 trillion), through such watershed moments as Newton’s formulation of the laws of motion and gravity (1687), the invention of fiber optics (1841), Einstein’s general theory of relativity (1915), the first speculation about parallel universes (1956), the discovery of buckyballs (1985), Stephen Hawking’s Star Trek cameo (1993), and the building of the Large Hadron Collider (2009).
As the island of knowledge grows, the surface that makes contact with mystery expands. When major theories are overturned, what we thought was certain knowledge gives way, and knowledge touches upon mystery differently. This newly uncovered mystery may be humbling and unsettling, but it is the cost of truth. Creative scientists, philosophers, and poets thrive at this shoreline.” ~ W. Mark Richardson, ‘A Skeptic’s Sense of Wonder,’ Science
Pickover takes a wide-angle view of what physics actually is, encompassing everything from relativity to quantum mechanics to dark matter and beyond, in a spirit that honors the American Physical Society’s founding mission statement of 1899, which holds physics as “the most basic and fundamental science.” As much as it is about the great ideas of physics, the book is also about the great minds behind them, including Brain Pickings darlings Marie Curie, Albert Einstein, Richard Feynman, Stephen Hawking, and Erwin Schrödinger.
From the magnetic monopole to quasicrystals to dark matter, The Physics Book is an invaluable treasure trove of curated knowledge in an age when, as Andrew Zolli put it at the opening of PopTech 2011, “the scale of our knowledge is expanding faster than most of our ability to comprehend.” For once, it’s rather nice to make some of humanity’s greatest intellectual achievements feel contained and digestible.
Originally reviewed, with plenty more images, last month.
I HAVE LANDED
For 27 years, iconic evolutionary biologist and science historian Stephen Jay Gould contributed illuminating and absorbing essays on everything from Aristotle to zoology for the magazine Natural History, many collected in a series of anthologies, offering some of the most articulate science writing of our time and influencing public opinion on science in magnitude few other writers have achieved. This year marked the bittersweet reprint of I Have Landed: The End of a Beginning in Natural History — the tenth and final of these fantastic anthologies, featuring 31 of Gould’s essays and commemorating the centennial of his family’s arrival at Ellis Island. (The title comes from his grandfather’s diary entry on that day.) It was originally published in 2002, mere weeks after Gould passed away from cancer.
From a fascinating essay on Vladimir Nabokov’s lepidoptery poetically titled “No Science Without Fancy, No Art Without Facts” to a meditation on Freud’s evolutionary fantasy to a poignant scientific reflection on 9/11, the essays blend a head-spinning spectrum of serious scientific inquiry with the storytelling of fine fiction.
In fact, a big part of what makes Gould’s thinking so compelling and his writing so alluring is the eloquence with which he blends popular interest with deep scientific insight. (The very notion of a scientific essay faces a great deal of resistance among many scientists, who find the essay format to be inappropriate for science.) Of the balance, Gould writes:
I have come to believe, as the primary definition of these ‘popular’ essays, that the conceptual depth of technical and general writing should not differ, lest we disrespect the interest and intelligence of millions of potential readers who lack advanced technical training in science, but who remain just as fascinated as any professional, and just as well aware of the importance of science to our human and earthly existence.”
Gould closes his final essay for Natural History with this moving tribute to his grandfather, all the more profound in light of the author’s own passing shortly thereafter:
Dear Papa Joe, I have been faithful to your dream of persistence and attentive to a hope that the increments of each worthy generation may buttress the continuity of evolution. You could write those wondrous words right at the beginning of your journey, amidst all the joy and terror of inception. I dared not repeat them until I could fulfill my own childhood dream — something that once seemed so mysteriously beyond any hope of realization to an insecure little boy in a garden apartment in Queens — to
become a scientist and to make, by my own effort, even the tiniest addition to human knowledge of evolution and the history of life. But now, with my 300, so fortuitously coincident with the world’s new 1,000 and your own 100, perhaps I have finally won the right to restate your noble words and to tell you that their inspiration still lights my journey: I have landed. But I also can’t help wondering what comes next!”
With beautiful illustrations by graphic artist Dave McKean, Dawkins’ volume is as accessible as it is illuminating, covering a remarkable spectrum of subjects and natural phenomena — from who the very first person was to how earthquakes work to what dark matter is — in a way that infuses reality with the kind of fascination and whimsy we’re used to finding in myth and folklore. Each chapter begins with a famous myth from one of the world’s religions or folklore traditions, which Dawkins proceeds to myth-bust by examining the actual scientific processes and phenomena that these stories try to explain.
Here’s an introduction from Dawkins himself:
BBC has a great short segment, in which Dawkins explores the relationship between comfort and truth, and explains why evolution is the most magical, spellbinding story of all, more poetic than any fable or fairy tale:
When you think about it, here we are, we started off on this planet — this fragment of dust spinning around the sun — and in 4 billion years we gradually changed from bacteria into us. That is a spellbinding story.” ~ Richard Dawkins
The book comes with a companion immersive iPad app.
Field Notes on Science and Nature, one of five fascinating peeks inside the notebooks of great creators, offers an unprecedented look at the inner workings of scientific inquiry and observation. It’s as much a scientific travelogue as it is a celebration of traditional methodologies for making sense of our natural environment, its beautiful reproductions of original journal pages taking us from Baja California with eminent ornithologist Kenn Kaufman to the Serengeti with renowned mammalogist George Schaller.
Michael Canfield, who edited the book and is himself a biologist at Harvard, invites us to “peer over the shoulders of outstanding field scientists and naturalists” through their brilliant annotations and illustrations.
The twelve essays in Field Notes were written by professional naturalists from such diverse disciplines as anthropology, botany, ecology, entomology, and paleontology, and their enthusiasm and experience are contagious. For the amateur naturalists among us, the compilation also contains essays on “Note-Taking for Pencilophobes” and basic instructions on color theory and sketching.
E.O. Wilson articulates the book’s voyeuristic magic in its introduction:
If there is a heaven, and I am allowed entrance, I will ask for no more than an endless living world to walk through and explore. I will carry with me an inexhaustible supply of notebooks, from which I can send back reports to the more sedentary spirits (mostly molecular and cell biologists). Along the way I would expect to meet kindred spirits among whom would be the authors of the essays in this book.”
From Feynman’s childhood in Long Island to his work on the Manhattan Project to the infamous Challenger disaster, by way of quantum electrodynamics and bongo drums, the graphic narrative unfolds with equal parts humor and respect as it tells the story of one of the founding fathers of popular physics.
Colorful, vivid, and obsessive, the pages of Feynman exude the famous personality of the man himself, full of immense brilliance, genuine excitement for science, and a healthy dose of snark.
Originally featured, with more images, in October.
This year, Edge.org editor John Brockman launched a new series of anthologies curating 15 years’ worth of the most provocative thinking on major facets of science, culture, and intellectual life. First came The Mind, followed by Culture: Leading Scientists Explore Societies, Art, Power, and Technology — a treasure chest of insight true to the promise of its title, featuring essays and interviews by and with (alas, all-male) icons such as Brian Eno, George Dyson and Douglas Rushkoff, as well as Brain Pickings favorites like Denis Dutton, Stewart Brand, Clay Shirky and Dan Dennett. From the origin and social purpose of art to how technology shapes civilization to the Internet as a force of democracy and despotism, the 17 pieces exude the kind of intellectual inquiry and cultural curiosity that give progress its wings.
Here’s a modest sampling of the lavish cerebral feast you’ll find between the book’s covers.
In his 1997 meditation “A Big Theory of Culture”, music icon and deep-thinker Brian Eno explores what constitutes cultural value and how it comes about:
Nearly all of art history is about trying to identify the source of value in cultural objects. Color theories and dimension theories, golden means, all those sorts of ideas, assume that some objects are intrinsically more beautiful and meaningful than others. New cultural thinking isn’t like that. It says that we confer value on things. We create the value in things. It’s the act of conferring that makes things valuable. Now this is very important, because so many, in fact all fundamentalist ideas, rest on the assumption that some things have intrinsic value and resonance and meaning. All pragmatists work from another assumption: No, it’s us. It’s us who make those meanings.”
[It] is not some kind of ironclad doctrine that it is supposed to replace a heavy post-structuralism with something just as oppressive. What surprises me about the resistance to the application of Darwin to psychology is the vociferous way in which people want to dismiss it, not even to consider it.”
In “Social Networks Are Like the Eye” (2008), Harvard physician and sociologist Nicholas Christakis examines why networks form and how they operate:
The amazing thing about social networks, unlike other networks that are almost as interesting — networks of neurons or genes or stars or computers or all kinds of other things one can imagine — is that the nodes of a social network — the entities, the components — are themselves sentient, acting individuals who can respond to the network and actually form it themselves.”
In “Turing’s Cathedral” (2005), science historian George Dyson recalls his visit to the Google headquarters in the context of H. G. Wells’s 1938 prophecy:
I felt I was entering a 14th-century cathedral — not in the 14th century but in the 12th century, while it was being built […] The whole human memory can be, and probably in a short time will be, made accessible to every individual […] Wells foresaw not only the distributed intelligence of the World Wide Web, but the inevitability that this intelligence would coalesce, and that power, as well as knowledge, would fall under its domain.”
Thoughtfully curated to stimulate your keenest critical thinking — like, for instance, the juxtaposition of Jaron Lanier’s digital dystopianism and Clay Shirky’s optimistic retort — Culture expands both the scope of science and your comfort zone of intellectual inquiry.
THE MAN OF NUMBERS
Imagine a day without numbers — how would you know when to wake up, how to call your mother, how the stock market is doing, or even how old you are? We live our lives by numbers. So fundamental are they to our understanding of the world that we’ve grown to take them for granted. And yet it wasn’t always so. Until the 13th century, even simple arithmetic was accessible almost exclusively to European scholars. Merchants kept track of quantifiables using Roman numerals, performing calculations either by an elaborate yet widespread finger-counting procedure or with a clumsy mechanical abacus. But in 1202, a young Italian man named Leonardo da Pisa — known today as Fibonacci — changed everything when he wrote Liber Abbaci, Latin for Book of Calculation, the first arithmetic textbook of the West.
Keith Devlin tells his incredible and important story in The Man of Numbers: Fibonacci’s Arithmetic Revolution, tracing how Fibonacci revolutionized everything from education to economics by making arithmetic available to the masses. If you think the personal computing revolution of the 1980s was a milestone of our civilization, consider the personal computation revolution. And yet, da Pisa’s cultural contribution is hardly common knowledge.
The change in society brought about by the teaching of modern arithmetic was so pervasive and all-powerful that within a few generations people simply took it for granted. There was no longer any recognition of the magnitude of the revolution that took the subject from an obscure object of scholarly interest to an everyday mental tool. Compared with Copernicus’s conclusions about the position of Earth in the solar system and Galileo’s discovery of the pendulum as a basis for telling time, Leonardo’s showing people how to multiply 193 by 27 simply lacks drama.” ~ Keith Devlin
Though “about” mathematics, Fibonacci’s story is really about a great number of remarkably timely topics: gamification for good (Liber Abbaci brimmed with puzzles and riddles like the rabbit problem to alleviate the tedium of calculation and engage readers with learning); modern finance (Fibonacci was the first to develop an early form of present-value analysis, a method for calculating the time value of money perfected by iconic economist Irving Fisher in the 1930s); publishing entrepreneurship (the first edition of Liber Abbaci was too dense for the average person to grasp, so da Pisa released — bear in mind, before the invention of the printing press — a simplified version accessible to the ordinary traders of Pisa, which allowed the text to spread around the world); abstract symbolism (because numbers, as objective as we’ve come to perceive them, are actually mere commonly agreed upon abstractions); and even remix culture (Liber Abbaci is believed to have been the source for a great deal of arithmetic bestsellers released after the invention of the printing press).
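For the mathematically curious, two of the ideas above — the rabbit problem and present-value analysis — are simple enough to sketch in a few lines of modern code. A minimal Python illustration (in today’s notation, of course, not Fibonacci’s; the function names are my own):

```python
def rabbit_pairs(months):
    """Fibonacci's rabbit puzzle from Liber Abbaci: each month every mature
    pair of rabbits produces a newborn pair, and newborn pairs take one
    month to mature. Returns the total number of pairs after `months`
    months, starting from a single newborn pair."""
    mature, newborn = 0, 1
    for _ in range(months):
        # This month's newborns come from last month's mature pairs,
        # while last month's newborns join the mature population.
        mature, newborn = mature + newborn, mature
    return mature + newborn

def present_value(future_amount, rate, periods):
    """Discount a future sum back to today at a fixed rate per period —
    the modern form of the time-value-of-money idea Fibonacci anticipated."""
    return future_amount / (1 + rate) ** periods

# The pair counts month by month trace out the sequence now named after him.
print([rabbit_pairs(m) for m in range(1, 13)])

# 110 due in one year, discounted at 10%, is worth 100 today.
print(present_value(110, 0.10, 1))
```

The running pair counts reproduce the familiar Fibonacci sequence — each month’s total is the sum of the two totals before it.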
Above all, however, Fibonacci’s feat was one of storytelling — much like TED, he took existing ideas that were far above the average person’s competence and grasp, and used his remarkable expository skills to make them accessible and attractive to the common man, allowing these ideas to spread far beyond the small and self-selected circles of the scholarly elite.
“A book about Leonardo must focus on his great contribution and his intellectual legacy. Having recognized that numbers, and in particular powerful and efficient ways to compute with them, could change the world, he set about making that happen at a time when Europe was poised for major advances in science, technology, and commercial practice. Through Liber Abbaci he showed that an abstract symbolism and a collection of seemingly obscure procedures for manipulating those symbols had huge practical applications.” ~ Keith Devlin
For an added layer of fascination, there’s also a complementary ebook titled Leonardo and Steve, drawing a curious parallel between Fibonacci and Steve Jobs.
Originally featured, with a Kindle preview, in July.
FUTURE SCIENCE
What consumes the best and brightest minds working in science today? That’s exactly what literary agent Max Brockman explores in Future Science: Essays from the Cutting Edge — a fantastic anthology of short pieces by 19 first-rate researchers spanning everything from astronomy to virology to computer science, and a wealth in between. The provocative yet digestible essays are intended for the curious layperson, which Brockman reminds us in the introduction doesn’t come without risk: “If you’re an academic who writes about your work for a general audience, you’re thought by some of your colleagues to be wasting your time and perhaps endangering your academic career. For younger scientists (i.e., those without tenure), this is almost universally true.”
Given our optimism for the future and soft spot for intellectual anthologies, we’re certainly glad the contributors to Future Science took the chance. The result is a fascinating tour of the academy’s advance guard on, among other topics, why stress causes some people to crumble even as it spurs others on, what sense computer science can make of social media’s vast digital data, and how infinity has entered the realm of testable science. The breadth of subjects and their authors’ ability to make them accessible is thrilling — it’s like TED in book form.
“For much of human history, we have been explorers of other continents — examiners of rocks and regions ripe for habitation, the culmination being the Heroic Age of Antarctic exploration and the capstone being our flags and footprints on the surface of the Moon. But in the decades and centuries to come, exploration — both human and robotic — will increasingly focus on the ocean depths, of both our own ocean and the subsurface oceans believed to exist on at least five moons of the outer Solar System: Jupiter’s Europa, Ganymede, and Callisto and Saturn’s Titan and Enceladus. The total volume of liquid water on those worlds is estimated to be more than a hundred times the volume of liquid water on Earth.” ~ Kevin P. Hand, “On the Coming Age of Ocean Exploration”
“If humans are to succeed as a species, our collective shame over destroying other life-forms should grow in proportion to our understanding of their various ecological roles. Maybe the same attention to one another that promoted our own evolutionary success will keep us from failing the other species in life’s fabric and, in the end, ourselves.” ~ Jennifer Jacquet, “Is Shame Necessary?”
“This afternoon I received in the post a slim FedEx envelope containing four small vials of DNA. The DNA had been synthesized according to my instructions in under three weeks, at a cost of 39 U.S. cents per base pair (the rungs adenine-thymine or guanine-cytosine in the DNA ladder). The 10 micrograms I ordered are dried, flaky, and barely visible to the naked eye, yet once I have restored them in water and made an RNA copy of this template, they will encode a virus I have designed.” ~ William McEwan, “Molecular Cut and Paste: The New Generation of Biological Tools”
As you might have guessed, Brockman is the son of John Brockman, who masterminded Culture above.