Brain Pickings


10 APRIL, 2014

The Science of Smell: How the Most Direct of Our Senses Works


Why the 23,040 breaths we take each day are the most powerful yet perplexing route to our emotional memory.

“Get a life in which you notice the smell of salt water pushing itself on a breeze over the dunes,” Anna Quindlen advised in her indispensable Short Guide to a Happy Life. Susan Sontag listed “linen” and “the smell of newly mown grass” among her favorite things. “A man may have lived all of his life in the gray,” John Steinbeck wrote in his beautiful meditation on the meaning of life, “and then — the glory — so that a cricket song sweetens his ears, the smell of the earth rises chanting to his nose.” Why is it that smell lends itself to such poetic metaphors, sings to us so sweetly, captures us so powerfully?

That’s precisely what science historian Diane Ackerman explores in A Natural History of the Senses (public library), her 1990 prequel to the equally fantastic A Natural History of Love. Ackerman, who also happens to be a spectacular poet and the author of the gorgeous cosmic verses that Carl Sagan mailed to Timothy Leary in prison, paints the backdrop of this perplexing and unique sensory experience:

Our sense of smell can be extraordinarily precise, yet it’s almost impossible to describe how something smells to someone who hasn’t smelled it… We see only where there is light enough, taste only when we put things into our mouths, touch only when we make contact with someone or something, hear only sounds that are loud enough to hear. But we smell always and with every breath. Cover your eyes and you will stop seeing, cover your ears and you will stop hearing, but if you cover your nose and stop smelling, you will die.

Illustration by Wendy MacNaughton from 'The Essential Scratch and Sniff Guide to Becoming a Wine Expert.'

In fact, every breath we take in order to live is saturated with an extraordinary amount of olfactory information — a fact that is largely a matter of scale:

Each day, we breathe about 23,040 times and move around 438 cubic feet of air. It takes us about five seconds to breathe — two seconds to inhale and three seconds to exhale — and, in that time, molecules of odor flood through our systems. Inhaling and exhaling, we smell odors. Smells coat us, swirl around us, enter our bodies, emanate from us. We live in a constant wash of them. Still, when we try to describe a smell, words fail us like the fabrications they are…

The charm of language is that, though it is human-made, it can on rare occasions capture emotions and sensations that aren’t. But the physiological links between the smell and language centers of the brain are pitifully weak. Not so the links between the smell and the memory centers, a route that carries us nimbly across time and distance.

Indeed, that route is a greater shortcut to our cognition and psychoemotional circuitry than any of our other senses can offer. Ackerman outlines the singular qualities of our smell-sensation that set it apart from all other bodily functions:

Smell is the most direct of all our senses. When I hold a violet to my nose and inhale, odor molecules float back into the nasal cavity behind the bridge of the nose, where they are absorbed by the mucosa containing receptor cells bearing microscopic hairs called cilia. Five million of these cells fire impulses to the brain’s olfactory bulb or smell center. Such cells are unique to the nose. If you destroy a neuron in the brain, it’s finished forever; it won’t regrow. If you damage neurons in your eyes or ears, both organs will be irreparably damaged. But the neurons in the nose are replaced about every thirty days and, unlike any other neurons in the body, they stick right out and wave in the air current like anemones on a coral reef.

Illustration by Tomi Ungerer from 'The Cat-Hater's Handbook.'

That’s also what makes perfumes so powerful — if you’ve ever walked into a crowded room and instantly experienced a pang of emotion as you thought you smelled your ex, or your mother, or your third-grade teacher, you’ve witnessed first-hand the potency of smell as a trigger of emotional memory. Ackerman explains:

A smell can be overwhelmingly nostalgic because it triggers powerful images and emotions before we have time to edit them… When we give perfume to someone, we give them liquid memory. Kipling was right: “Smells are surer than sights and sounds to make your heart-strings crack.”

What’s perhaps most extraordinary is that scent lodges itself largely in the long-term memory system of the brain. And yet, we remain inept at mapping those links and associative chains when it comes to describing smells and their emotional echoes. To shed light on how perfumery plays into this paradox, Ackerman offers a taxonomy of the basic types of natural smells and how they became synthetically replicated, unleashing an intimate dance of art, science, and commerce:

All smells fall into a few basic categories, almost like primary colors: minty (peppermint), floral (roses), ethereal (pears), musky (musk), resinous (camphor), foul (rotten eggs), and acrid (vinegar). This is why perfume manufacturers have had such success in concocting floral bouquets or just the right threshold of muskiness or fruitiness. Natural substances are no longer required; perfumes can be made on the molecular level in laboratories. One of the first perfumes based on a completely synthetic smell (an aldehyde) was Chanel No. 5, which was created in 1922 and has remained a classic of sensual femininity. It has led to classic comments, too. When Marilyn Monroe was asked by a reporter what she wore to bed, she answered coyly, “Chanel No. 5.” Its top note — the one you smell first — is the aldehyde, then your nose detects the middle note of jasmine, rose, lily of the valley, orris, and ylang-ylang, and finally the base note, which carries the perfume and makes it linger: vetiver, sandalwood, cedar, vanilla, amber, civet, and musk. Base notes are almost always of animal origin, ancient emissaries of smell that transport us across woodlands and savannas.

And so we get to the actual science of smell — what actually produces an olfactory experience, and why we so often confuse it with taste:

We need only eight molecules of a substance to trigger an impulse in a nerve ending, but forty nerve endings must be aroused before we smell something. Not everything has a smell: only substances volatile enough to spray microscopic particles into the air. Many things we encounter each day — including stone, glass, steel, and ivory — don’t evaporate when they stand at room temperature, so we don’t smell them. If you heat cabbage, it becomes more volatile (some of its particles evaporate into the air) and it suddenly smells stronger. Weightlessness makes astronauts lose taste and smell in space. In the absence of gravity, molecules cannot be volatile, so few of them get into our noses deeply enough to register as odors. This is a problem for nutritionists designing space food. Much of the taste of food depends on its smell; some chemists have gone so far as to claim that wine is simply a tasteless liquid that is deeply fragrant. Drink wine with a head cold, and you’ll taste water, they say. Before something can be tasted, it has to be dissolved in liquid (for example hard candy has to melt in saliva); and before something can be smelled, it has to be airborne. We taste only four flavors: sweet, sour, salt, and bitter. That means that everything else we call “flavor” is really “odor.” And many of the foods we think we can smell we can only taste. Sugar isn’t volatile, so we don’t smell it, even though we taste it intensely. If we have a mouthful of something delicious, which we want to savor and contemplate, we exhale; this drives the air in our mouths across our olfactory receptors, so we can smell it better.

Illustration by Wendy MacNaughton from 'The Essential Scratch and Sniff Guide to Becoming a Wine Expert.'

The rest of A Natural History of the Senses is just as fascinating a read, diving deeper into the mysteries and miracles of smell and our other sensory faculties. Complement it with Ackerman’s A Natural History of Love and her impossibly wonderful love letter to the Solar System, The Planets: A Cosmic Pastoral.

Donating = Loving

Bringing you (ad-free) Brain Pickings takes hundreds of hours each month. If you find any joy and stimulation here, please consider becoming a Supporting Member with a recurring monthly donation of your choosing, between a cup of tea and a good dinner.

You can also become a one-time patron with a single donation in any amount.

Brain Pickings has a free weekly newsletter. It comes out on Sundays and offers the week’s best articles. Here’s what to expect. Like? Sign up.

08 APRIL, 2014

The Science of How Memory Works


What the four “slave” systems of the mind have to do with riding a bicycle.

“Whatever becomes of [old memories], in the long intervals of consciousness?” Henry James wistfully pondered upon turning fifty. “They are like the lines of a letter written in sympathetic ink; hold the letter to the fire for a while and the grateful warmth brings out the invisible words.” James was not alone in seeking to understand the seemingly mysterious workings of human memory — something all the more urgently fascinating in our age of information overload, where we’re evolving a new kind of “transactive memory.” But like other scientific mysteries of how the brain works — including what actually happens while we sleep and why some people are left-handed — memory continues to give scientists more questions than answers.

In The Guardian of All Things: The Epic Story of Human Memory (public library), technology writer Michael S. Malone takes a 10,000-year journey into humanity’s understanding of our great cognitive record-keeper, exploring both its power and its ongoing perplexity.

Illustration from 'Neurocomic,' a graphic novel about how the brain works.

One of the most astounding facts Malone points out is that memory — that is, the creation of memories — is the result of a biochemical reaction that takes place inside neurons, one particularly common among neurons responsible for our senses. Scientists have recently discovered that our short-term memory — also known as “working memory,” the kind responsible for the “chunking” mechanism that powers our pattern-recognition and creativity — is localized to a few specific areas of the brain. The left hemisphere, for instance, is mostly in charge of verbal and object-oriented tasks. Even so, scientists remain mystified by the specific distribution, retrieval, and management of memory. Malone writes:

One popular theory holds that short-term memory consists of four “slave” systems. The first is phonological, for sound and language that (when its contents begin to fade) buys extra time through a second slave system. This second operation is a continuous rehearsal system — as when you repeat a phone number you’ve just heard while running to the other room for your phone. The third system is a visuo-spatial sketch pad that, as the name suggests, stores visual information and mental maps. Finally, the fourth (and most recently discovered) slave is an episodic buffer that gathers all of the diverse information from the other slaves, and maybe other information from elsewhere, and integrates it into what might be described as a multimedia memory.

It’s worth noting that memory and creativity have a great deal in common — the combinatorial process of memory-making that Malone describes is remarkably similar to how creativity works: we gather ideas and information just by being alive and awake to the world, record some of those impressions in our mental sketch pad, then integrate the various bits into new combinations that we call our “own” ideas, a kind of “multimedia” assemblage of existing bits.
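The four “slave” systems described above lend themselves to a playful caricature in code. The toy Python class below is purely illustrative: every class, method, and constant in it is invented for this sketch, and it models Malone’s description, not any real cognitive-modeling library.

```python
from dataclasses import dataclass, field

@dataclass
class WorkingMemory:
    """A toy caricature of the four "slave" systems of short-term memory."""
    phonological: list = field(default_factory=list)   # sounds and language
    visuospatial: list = field(default_factory=list)   # images and mental maps

    def hear(self, sound):
        # New material enters the phonological store.
        self.phonological.append(sound)

    def see(self, image):
        # Visual information lands on the visuo-spatial sketch pad.
        self.visuospatial.append(image)

    def rehearse(self):
        # The rehearsal loop "buys extra time" by refreshing only the most
        # recent traces; the seven-item span is a nod to Miller's classic
        # estimate, not a claim from Malone's book.
        self.phonological = self.phonological[-7:]

    def episodic_buffer(self):
        # Integrates the slaves' contents into one "multimedia memory".
        return [(sound, image)
                for sound in self.phonological
                for image in self.visuospatial]
```

Pairing every retained sound with every retained image in `episodic_buffer` is a crude stand-in for the integration step — the point is only that the fourth system combines what the other three hold separately.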

Malone goes on to explore the inner workings of long-term memory — a substantially different beast, designed to keep our permanent mental record:

Chemically, we have a pretty good idea how memories are encoded and retained in brain neurons. As with short-term memory, the storage of information is made possible by the synthesis of certain proteins in the cell. What differentiates long-term memory in neurons is that frequent repetition of signals causes magnesium to be released — which opens the door for the attachment of calcium, which in turn makes the record stable and permanent. But as we all know from experience, memory can still fade over time. For that, the brain has a chemical process called long-term potentiation that regularly enhances the strength of the connections (synapses) between the neurons and creates an enzyme protein that also strengthens the signal — in other words, the memory — inside the neuron.

From the functional, Malone moves on to the structural organization of memory, where another dichotomy emerges:

Architecturally, the organization of memory in the brain is a lot more slippery to get one’s hands around (so to speak); different perspectives all seem to deliver useful insights. For example, one popular way to look at brain memory is to see it as taking two forms: explicit and implicit. Explicit, or “declarative,” memory is all the information in our brains that we can consciously bring to the surface. Curiously, despite its huge importance in making us human, we don’t really know where this memory is located. Scientists have, however, divided explicit memory into two forms: episodic, or memories that occurred at a specific point in time; and semantic, or understandings (via science, technology, experience, and so on) of how the world works.

Implicit, or “procedural” memory, on the other hand, stores skills and memories of how to physically function in the natural world. Holding a fork, driving a car, getting dressed — and, most famously, riding a bicycle — are all nuanced activities that modern humans do without really giving them much thought; and they are skills, in all their complexity, that we can call up and perform decades after last using them.

One of the most confounding pieces of the cognitive puzzle, however, is a form of memory known as emotional memory — a specialized system for cataloging our memories based on the emotions they evoke. It’s unclear whether it belongs to the explicit or implicit domain, or to both, and scientists are still seeking to understand whether it serves as a special “search function” for the brain. (What we do now know, however, is that sharpening “emotional recall” might be the secret to better memory.)

From all this perplexity emerges Malone’s bigger point, a somewhat reassuring testament to the idea that science, at its best, is always driven by “thoroughly conscious ignorance”:

What we do know is that — a quarter-million years after mankind inherited this remarkable organ called the brain — even with all of the tools available to modern science, human memory remains a stunning enigma.

The Guardian of All Things is a fascinating read in its entirety. Complement it with Joshua Foer’s quest to hack memory to superhuman levels and Henry James on aging and memory.


07 APRIL, 2014

Isaac Asimov on the Thrill of Lifelong Learning, Science vs. Religion, and the Role of Science Fiction in Advancing Society


“It’s insulting to imply that only a system of rewards and punishments can keep you a decent human being.”

Isaac Asimov was an extraordinary mind and spirit — the author of more than 400 science and science fiction books and a tireless advocate of space exploration, he also took great joy in the humanities (and once annotated Lord Byron’s epic poem “Don Juan”), championed humanism over religion, and celebrated the human spirit itself (he even wrote young Carl Sagan fan mail). Like many of the best science fiction writers, he was as exceptional at predicting the future as he was at illuminating some of the most timeless predicaments of the human condition. In a 1988 interview with Bill Moyers, found in Bill Moyers: A World of Ideas (public library) — the same remarkable tome that gave us philosopher Martha Nussbaum on how to live with our human fragility — Asimov explores several subjects that still stir enormous cultural concern and friction. With his characteristic eloquence and sensitivity to the various dimensions of these issues, he presages computer-powered lifelong learning and online education decades before it existed, weighs the question of how authors will make a living in a world of free information, bemoans the extant attempts of religious fundamentalism to drown out science and rational thought, and considers the role of science fiction as a beacon of the future.

The conversation begins with a discussion of Asimov’s passionate belief that when given the right tools, we can accomplish far more than what we can with the typical offerings of formal education:

MOYERS: Do you think we can educate ourselves, that any one of us, at any time, can be educated in any subject that strikes our fancy?

ASIMOV: The key words here are “that strikes our fancy.” There are some things that simply don’t strike my fancy, and I doubt that I can force myself to be educated in them. On the other hand, when there’s a subject I’m ferociously interested in, then it is easy for me to learn about it. I take it in gladly and cheerfully…

[What’s exciting is] the actual process of broadening yourself, of knowing there’s now a little extra facet of the universe you know about and can think about and can understand. It seems to me that when it’s time to die, there would be a certain pleasure in thinking that you had utilized your life well, learned as much as you could, gathered in as much as possible of the universe, and enjoyed it. There’s only this one universe and only this one lifetime to try to grasp it. And while it is inconceivable that anyone can grasp more than a tiny portion of it, at least you can do that much. What a tragedy just to pass through and get nothing out of it.

MOYERS: When I learn something new — and it happens every day — I feel a little more at home in this universe, a little more comfortable in the nest. I’m afraid that by the time I begin to feel really at home, it’ll all be over.

ASIMOV: I used to worry about that. I said, “I’m gradually managing to cram more and more things into my mind. I’ve got this beautiful mind, and it’s going to die, and it’ll all be gone.” And then I thought, “No, not in my case. Every idea I’ve ever had I’ve written down, and it’s all there on paper. I won’t be gone. It’ll be there.”

Page from 'Charley Harper: An Illustrated Life'

Asimov then considers how computers would usher in this profound change in learning and paints the outline of a concept that Clay Shirky would detail and term “cognitive surplus” two decades later:

MOYERS: Is it possible that this passion for learning can be spread to ordinary folks out there? Can we have a revolution in learning?

ASIMOV: Yes, I think not only that we can but that we must. As computers take over more and more of the work that human beings shouldn’t be doing in the first place — because it doesn’t utilize their brains, it stifles and bores them to death — there’s going to be nothing left for human beings to do but the more creative types of endeavor. The only way we can indulge in the more creative types of endeavor is to have brains that aim at that from the start.

You can’t take a human being and put him to work at a job that underuses the brain and keep him working at it for decades and decades, and then say, “Well, that job isn’t there, go do something more creative.” You have beaten the creativity out of him. But if from the start children are educated into appreciating their own creativity, then probably almost all of us can be creative. In the olden days, very few people could read and write. Literacy was a very novel sort of thing, and it was felt that most people just didn’t have it in them. But with mass education, it turned out that most people could be taught to read and write. In the same way, once we have computer outlets in every home, each of them hooked up to enormous libraries, where you can ask any question and be given answers, you can look up something you’re interested in knowing, however silly it might seem to someone else.

Asimov goes on to point out the flawed industrial model of education — something Sir Ken Robinson would lament articulately two decades later — and tells Moyers:

Today, what people call learning is forced on you. Everyone is forced to learn the same thing on the same day at the same speed in class. But everyone is different. For some, class goes too fast, for some too slow, for some in the wrong direction. But give everyone a chance, in addition to school, to follow up their own bent from the start, to find out about whatever they’re interested in by looking it up in their own homes, at their own speed, in their own time, and everyone will enjoy learning.

Later, in agreeing with Moyers that this revolution in learning isn’t merely for the young, Asimov adds:

That’s another trouble with education as we now have it. People think of education as something that they can finish. And what’s more, when they finish, it’s a rite of passage. You’re finished with school. You’re no more a child, and therefore anything that reminds you of school — reading books, having ideas, asking questions — that’s kid’s stuff. Now you’re an adult, you don’t do that sort of thing anymore…

Every kid knows the only reason he’s in school is because he’s a kid and little and weak, and if he manages to get out early, if he drops out, why he’s just a premature man.

Embroidered map of the infant Internet in 1983 by Debbie Millman

Speaking at a time when the Internet as we know it today was still an infant, and two decades before the golden age of online education, Asimov offers a remarkably prescient vision for how computer-powered public access to information would spark the very movement of lifelong learning that we’ve witnessed in the past decade:

You have everybody looking forward to no longer learning, and you make them ashamed afterward of going back to learning. If you have a system of education using computers, then anyone, any age, can learn by himself, can continue to be interested. If you enjoy learning, there’s no reason why you should stop at a given age. People don’t stop things they enjoy doing just because they reach a certain age. They don’t stop playing tennis just because they’ve turned forty. They don’t stop with sex just because they’ve turned forty. They keep it up as long as they can if they enjoy it, and learning will be the same thing. The trouble with learning is that most people don’t enjoy it because of the circumstances. Make it possible for them to enjoy learning, and they’ll keep it up.

When Moyers asks him to describe what such a teaching machine would look like — again, in 1988, when personal computers had only just begun to appear in homes — Asimov envisions a kind of Siri-like artificial intelligence, combined with the functionality of a discovery engine:

I suppose that one essential thing would be a screen on which you could display things… And you’ll have to have a keyboard on which you ask your questions, although ideally I would like to see one that could be activated by voice. You could actually talk to it, and perhaps it could talk to you too, and say, “I have something here that may interest you. Would you like to have me print it out for you?” And you’d say, “Well, what is it exactly?” And it would tell you, and you might say, “Oh all right, I’ll take a look at it.”

But one of his most prescient remarks actually has to do not with the mechanics of freely available information but with the ethics and economics of it. Long before our present conundrum of how to make online publishing both in the public interest and financially sustainable for publishers, Asimov shares with Moyers the all too familiar question he has been asking himself — “How do you arrange to pay the author for the use of the material?” — and addresses it with equal parts realism and idealism:

After all, if a person writes something, and this then becomes available to everybody, you deprive him of the economic reason for writing. A person like myself, if he was assured of a livelihood, might write anyway, just because he enjoyed it, but most people would want to do it in return for something. I imagine how they must have felt when free libraries were first instituted. “What? My book in a free library? Anyone can come in and read it for free?” Then you realize that there are some books that wouldn’t be sold at all if you didn’t have libraries.

(A century earlier, Schopenhauer had issued a much sterner admonition against the cultural malady of writing solely for material rewards.)

Painting of hell by William Blake from John Milton's 'Paradise Lost'

Asimov then moves on to the subject of science vs. religion — something he would come to address with marvelous eloquence in his memoir — and shares his concern about how mysticism and fundamentalism undercut society:

I’d like to think that people who are given a chance to learn facts and broaden their knowledge of the universe wouldn’t seek so avidly after mysticism.

[…]

It isn’t right to sell a person phony stock, and take money for it, and this is what mystics are doing. They’re selling people phony knowledge and taking money for it. Even if people feel good about it, I can well imagine that a person who really believes in astrology is going to have a feeling of security because he knows that this is a bad day, so he’ll stay at home, just as a guy who’s got phony stock may look at it and feel rich. But he still has phony stock, and the person who buys mysticism still has phony knowledge.

He offers a counterpoint and considers what real knowledge is, adding to history’s best definitions of science:

Science doesn’t purvey absolute truth. Science is a mechanism, a way of trying to improve your knowledge of nature. It’s a system for testing your thoughts against the universe and seeing whether they match. This works not just for the ordinary aspects of science, but for all of life.

Asimov goes on to bemoan the cultural complacency that has led to the decline of science in mainstream culture — a decline we feel even today more sharply than ever when, say, a creationist politician tries to stop a little girl’s campaign for a state fossil because such an effort would “endorse” evolution. Noting that “we are living in a business society” where fewer and fewer students take math and science, Asimov laments how we’ve lost sight of the fact that science is driven by not-knowing rather than certitude:

MOYERS: You wrote a few years ago that the decline in America’s world power is in part brought about by our diminishing status as a world science leader. Why have we neglected science?

ASIMOV: Partly because of success. The most damaging statement that the United States has ever been subjected to is the phrase “Yankee know-how.” You get the feeling somehow that Americans — just by the fact that they’re American — are somehow smarter and more ingenious than other people, which really is not so. Actually, the phrase was first used in connection with the atomic bomb, which was invented and brought to fruition by a bunch of European refugees. That’s “Yankee know-how.”

MOYERS: There’s long been a bias in this country against science. When Benjamin Franklin was experimenting with the lightning rod, a lot of good folk said, “You don’t need a lightning rod. If you want to prevent lightning from striking, you just have to pray about it.”

ASIMOV: The bias against science is part of being a pioneer society. You somehow feel the city life is decadent. American history is full of fables of the noble virtuous farmer and the vicious city slicker. The city slicker is an automatic villain. Unfortunately, such stereotypes can do damage. A noble ignoramus is not necessarily what the country needs.

(What might Asimov, who in 1980 voiced fears that the fundamentalists coming into power with President Reagan would turn the country even more against science by demanding that biblical creationism be given an equal footing with evolution in the classroom, have said if he knew that a contemporary television station can edit out Neil deGrasse Tyson’s mention of evolution?)

'The Expulsion of Adam and Eve from the Garden of Eden' by William Blake from John Milton's 'Paradise Lost'

But when Moyers asks the writer whether he considers himself an enemy of religion, Asimov answers in the negative and offers this beautifully thoughtful elaboration on the difference between the blind faith of religion and the critical thinking at the heart of science:

My objection to fundamentalism is not that they are fundamentalists but that essentially they want me to be a fundamentalist, too. Now, they may say that I believe evolution is true and I want everyone to believe that evolution is true. But I don’t want everyone to believe that evolution is true, I want them to study what we say about evolution and to decide for themselves. Fundamentalists say they want to treat creationism on an equal basis. But they can’t. It’s not a science. You can teach creationism in churches and in courses on religion. They would be horrified if I were to suggest that in churches they should teach secular humanism as an alternate way of looking at the universe or evolution as an alternate way of considering how life may have started. In the church they teach only what they believe, and rightly so, I suppose. But on the other hand, in schools, in science courses, we’ve got to teach what scientists think is the way the universe works.

He extols the “thoroughly conscious ignorance” at the heart of science as a much safer foundation of reality than dogma:

That is really the glory of science — that science is tentative, that it is not certain, that it is subject to change. What is really disgraceful is to have a set of beliefs that you think is absolute and has been so from the start and can’t change, where you simply won’t listen to evidence. You say, “If the evidence agrees with me, it’s not necessary, and if it doesn’t agree with me, it’s false.” This is the legendary remark of Omar when they captured Alexandria and asked him what to do with the library. He said, “If the books agree with the Koran, they are not necessary and may be burned. If they disagree with the Koran, they are pernicious and must be burned.” Well, there are still these Omar-like thinkers who think all of knowledge will fit into one book called the Bible, and who refuse to allow it is possible ever to conceive of an error there. To my way of thinking, that is much more dangerous than a system of knowledge that is tentative and uncertain.

Riffing off the famous and rather ominous Dostoevsky line that “if God is dead, everything is permitted,” Asimov revisits the notion of intrinsic vs. extrinsic rewards — echoing his earlier remark that good writing springs from intrinsic motives rather than external incentives, he argues that good-personhood is steered not by dogma but by one’s own conscience:

It’s insulting to imply that only a system of rewards and punishments can keep you a decent human being. Isn’t it conceivable a person wants to be a decent human being because that way he feels better?

I don’t believe that I’m ever going to heaven or hell. I think that when I die, there will be nothingness. That’s what I firmly believe. That’s not to mean that I have the impulse to go out and rob and steal and rape and everything else because I don’t fear punishment. For one thing, I fear worldly punishment. And for a second thing, I fear the punishment of my own conscience. I have a conscience. It doesn’t depend on religion. And I think that’s so with other people, too.

'The Rout of the Rebel Angels' by William Blake from John Milton's 'Paradise Lost'

He goes on to extend this conscience-driven behavior to the domain of science, which he argues is strongly motivated by morality and a generosity of spirit uncommon in most other disciplines, where ego consumes goodwill. (Mark Twain memorably argued that no domain was more susceptible to human egotism than religion.) Asimov offers a heartening example:

I think it’s amazing how many saints there have been among scientists. I’ll give you an example. In 1900, De Vries studied mutations. He found a patch of evening primrose of different types, and he studied how they inherited their characteristics. He worked out the laws of genetics. Two other guys worked out the laws of genetics at the same time, a guy called Karl Correns, who was a German, and Erich Tschermak von Seysenegg, who was an Austrian. All three worked out the laws of genetics in 1900, and having done so, all three looked through the literature, just to see what had been done before. All three discovered that in the 1860s Gregor Mendel had worked out the laws of genetics, and people hadn’t paid any attention then. All three reported their findings as confirmation of what Mendel had found. Not one of the three attempted to say that it was original with him. And you know what it meant. It meant that two of them, Correns and Tschermak von Seysenegg, lived in obscurity. De Vries is known only because he was also the first to work out the theory of mutations. But as far as discovering genetics is concerned, Mendel gets all the credit. They knew at the time that this would happen. That’s the sort of thing you just don’t find outside of science.

Moyers, in his typical perceptive fashion, then asks Asimov why, given how much the truth of science excites him, he is best-known for writing science fiction, and Asimov responds with equal insight and outlines the difference, both cultural and creative, between fiction in general and science fiction:

In serious fiction, fiction where the writer feels he’s accomplishing something besides simply amusing people — although there’s nothing wrong with simply amusing people — the writer is holding up a mirror to the human species, making it possible for you to understand people better because you’ve read the novel or story, and maybe making it possible for you to understand yourself better. That’s an important thing.

Now science fiction uses a different method. It works up an artificial society, one which doesn’t exist, or one that may possibly exist in the future, but not necessarily. And it portrays events against the background of this society in the hope that you will be able to see yourself in relation to the present society… That’s why I write science fiction — because it’s a way of writing fiction in a style that enables me to make points I can’t make otherwise.

Painting by Rowena Morrill

But perhaps the greatest benefit of science fiction, Moyers intimates and Asimov agrees, is its capacity to warm people up to changes that are inevitable but that seem inconceivable at the present time — after all, science fiction writers do have a remarkable record of getting the future right. Asimov continues:

Society is always changing, but the rate of change has been accelerating all through history for a variety of reasons. One, the change is cumulative. The very changes you make now make it easier to make further changes. Until the Industrial Revolution came along, people weren’t aware of change or a future. They assumed the future would be exactly like it had always been, just with different people… It was only with the coming of the Industrial Revolution that the rate of change became fast enough to be visible in a single lifetime. People were suddenly aware that not only were things changing, but that they would continue to change after they died. That was when science fiction came into being as opposed to fantasy and adventure tales. Because people knew that they would die before they could see the changes that would happen in the next century, they thought it would be nice to imagine what they might be.

As time goes on and the rate of change still continues to accelerate, it becomes more and more important to adjust what you do today to the fact of change in the future. It’s ridiculous to make your plans now on the assumption that things will continue as they are now. You have to assume that if something you’re doing is going to reach fruition in ten years, that in those ten years changes will take place, and perhaps what you’re doing will have no meaning then… Science fiction is important because it fights the natural notion that there’s something permanent about things the way they are right now.

Painting by William Blake from Dante's 'Divine Comedy'

Given that accepting impermanence doesn’t come easily to us, that stubborn resistance to progress and the inevitability of change is perhaps also what Asimov sees in the religious fundamentalism he condemns — dogma, after all, is based on the premise that truth is absolute and permanent, never mind that the cultural context is always changing. Though he doesn’t draw the link directly, in another part of the interview he revisits the problem with fundamentalism with words that illuminate the stark contrast between the cultural role of religion and that of science fiction:

Fundamentalists take a statement that made sense at the time it was made, and because they refuse to consider that the statement may not be an absolute, eternal truth, they continue following it under conditions where to do so is deadly.

Indeed, Asimov ends the conversation on a related note as he considers what it would take to transcend the intolerance that such fundamentalism breeds:

MOYERS: You’ve lived through much of this century. Have you ever known human beings to think with the perspective you’re calling on them to think with now?

ASIMOV: It’s perhaps not important that every human being think so. But how about the leaders and opinion-makers thinking so? Ordinary people might follow them. It would help if we didn’t have leaders who were thinking in exactly the opposite way, if we didn’t have people who were shouting hatred and suspicion of foreigners, if we didn’t have people who were shouting that it’s more important to be unfriendly than to be friendly, if we didn’t have people shouting that the people inside the country who don’t look exactly the way the rest of us look have something wrong with them. It’s almost not necessary for us to do good; it’s only necessary for us to stop doing evil, for goodness’ sake.

Bill Moyers: A World of Ideas is a remarkable tome in its entirety. Complement this particular sample-taste with Asimov on religion vs. humanism, Buckminster Fuller’s vision for the future of education, and Carl Sagan on science and spirituality.
