Brain Pickings

07 APRIL, 2014

Isaac Asimov on the Thrill of Lifelong Learning, Science vs. Religion, and the Role of Science Fiction in Advancing Society


“It’s insulting to imply that only a system of rewards and punishments can keep you a decent human being.”

Isaac Asimov was an extraordinary mind and spirit — the author of more than 400 science and science fiction books and a tireless advocate of space exploration, he also took great joy in the humanities (and once annotated Lord Byron’s epic poem “Don Juan”), championed humanism over religion, and celebrated the human spirit itself (he even wrote young Carl Sagan fan mail). Like many of the best science fiction writers, he was as exceptional at predicting the future as he was at illuminating some of the most timeless predicaments of the human condition. In a 1988 interview with Bill Moyers, found in Bill Moyers: A World of Ideas (public library) — the same remarkable tome that gave us philosopher Martha Nussbaum on how to live with our human fragility — Asimov explores several subjects that still stir enormous cultural concern and friction. With his characteristic eloquence and sensitivity to the various dimensions of these issues, he presages computer-powered lifelong learning and online education decades before they existed, weighs the question of how authors will make a living in a world of free information, bemoans the ongoing attempts of religious fundamentalism to drown out science and rational thought, and considers the role of science fiction as a beacon of the future.

The conversation begins with a discussion of Asimov’s passionate belief that when given the right tools, we can accomplish far more than what we can with the typical offerings of formal education:

MOYERS: Do you think we can educate ourselves, that any one of us, at any time, can be educated in any subject that strikes our fancy?

ASIMOV: The key words here are “that strikes our fancy.” There are some things that simply don’t strike my fancy, and I doubt that I can force myself to be educated in them. On the other hand, when there’s a subject I’m ferociously interested in, then it is easy for me to learn about it. I take it in gladly and cheerfully…

[What's exciting is] the actual process of broadening yourself, of knowing there’s now a little extra facet of the universe you know about and can think about and can understand. It seems to me that when it’s time to die, there would be a certain pleasure in thinking that you had utilized your life well, learned as much as you could, gathered in as much as possible of the universe, and enjoyed it. There’s only this one universe and only this one lifetime to try to grasp it. And while it is inconceivable that anyone can grasp more than a tiny portion of it, at least you can do that much. What a tragedy just to pass through and get nothing out of it.

MOYERS: When I learn something new — and it happens every day — I feel a little more at home in this universe, a little more comfortable in the nest. I’m afraid that by the time I begin to feel really at home, it’ll all be over.

ASIMOV: I used to worry about that. I said, “I’m gradually managing to cram more and more things into my mind. I’ve got this beautiful mind, and it’s going to die, and it’ll all be gone.” And then I thought, “No, not in my case. Every idea I’ve ever had I’ve written down, and it’s all there on paper. I won’t be gone. It’ll be there.”

Page from 'Charley Harper: An Illustrated Life'

Asimov then considers how computers would usher in this profound change in learning and paints the outline of a concept that Clay Shirky would detail and term “cognitive surplus” two decades later:

MOYERS: Is it possible that this passion for learning can be spread to ordinary folks out there? Can we have a revolution in learning?

ASIMOV: Yes, I think not only that we can but that we must. As computers take over more and more of the work that human beings shouldn’t be doing in the first place — because it doesn’t utilize their brains, it stifles and bores them to death — there’s going to be nothing left for human beings to do but the more creative types of endeavor. The only way we can indulge in the more creative types of endeavor is to have brains that aim at that from the start.

You can’t take a human being and put him to work at a job that underuses the brain and keep him working at it for decades and decades, and then say, “Well, that job isn’t there, go do something more creative.” You have beaten the creativity out of him. But if from the start children are educated into appreciating their own creativity, then probably almost all of us can be creative. In the olden days, very few people could read and write. Literacy was a very novel sort of thing, and it was felt that most people just didn’t have it in them. But with mass education, it turned out that most people could be taught to read and write. In the same way, once we have computer outlets in every home, each of them hooked up to enormous libraries, where you can ask any question and be given answers, you can look up something you’re interested in knowing, however silly it might seem to someone else.

Asimov goes on to point out the flawed industrial model of education — something Sir Ken Robinson would lament articulately two decades later — and tells Moyers:

Today, what people call learning is forced on you. Everyone is forced to learn the same thing on the same day at the same speed in class. But everyone is different. For some, class goes too fast, for some too slow, for some in the wrong direction. But give everyone a chance, in addition to school, to follow up their own bent from the start, to find out about whatever they’re interested in by looking it up in their own homes, at their own speed, in their own time, and everyone will enjoy learning.

Later, in agreeing with Moyers that this revolution in learning isn’t merely for the young, Asimov adds:

That’s another trouble with education as we now have it. People think of education as something that they can finish. And what’s more, when they finish, it’s a rite of passage. You’re finished with school. You’re no more a child, and therefore anything that reminds you of school — reading books, having ideas, asking questions — that’s kid’s stuff. Now you’re an adult, you don’t do that sort of thing anymore…

Every kid knows the only reason he’s in school is because he’s a kid and little and weak, and if he manages to get out early, if he drops out, why he’s just a premature man.

Embroidered map of the infant Internet in 1983 by Debbie Millman

Speaking at a time when the Internet as we know it today was still an infant, and two decades before the golden age of online education, Asimov offers a remarkably prescient vision for how computer-powered public access to information would spark the very movement of lifelong learning that we’ve witnessed in the past decade:

You have everybody looking forward to no longer learning, and you make them ashamed afterward of going back to learning. If you have a system of education using computers, then anyone, any age, can learn by himself, can continue to be interested. If you enjoy learning, there’s no reason why you should stop at a given age. People don’t stop things they enjoy doing just because they reach a certain age. They don’t stop playing tennis just because they’ve turned forty. They don’t stop with sex just because they’ve turned forty. They keep it up as long as they can if they enjoy it, and learning will be the same thing. The trouble with learning is that most people don’t enjoy it because of the circumstances. Make it possible for them to enjoy learning, and they’ll keep it up.

When Moyers asks him to describe what such a teaching machine would look like — again, in 1988, when personal computers had only just begun to appear in homes — Asimov envisions a kind of Siri-like artificial intelligence, combined with the functionality of a discovery engine:

I suppose that one essential thing would be a screen on which you could display things… And you’ll have to have a keyboard on which you ask your questions, although ideally I would like to see one that could be activated by voice. You could actually talk to it, and perhaps it could talk to you too, and say, “I have something here that may interest you. Would you like to have me print it out for you?” And you’d say, “Well, what is it exactly?” And it would tell you, and you might say, “Oh all right, I’ll take a look at it.”

But one of his most prescient remarks actually has to do not with the mechanics of freely available information but with the ethics and economics of it. Long before our present conundrum of how to make online publishing both in the public interest and financially sustainable for publishers, Asimov shares with Moyers the all too familiar question he has been asking himself — “How do you arrange to pay the author for the use of the material?” — and addresses it with equal parts realism and idealism:

After all, if a person writes something, and this then becomes available to everybody, you deprive him of the economic reason for writing. A person like myself, if he was assured of a livelihood, might write anyway, just because he enjoyed it, but most people would want to do it in return for something. I imagine how they must have felt when free libraries were first instituted. “What? My book in a free library? Anyone can come in and read it for free?” Then you realize that there are some books that wouldn’t be sold at all if you didn’t have libraries.

(A century earlier, Schopenhauer had issued a much sterner admonition against the cultural malady of writing solely for material rewards.)

Painting of hell by William Blake from John Milton's 'Paradise Lost'

Asimov then moves on to the subject of science vs. religion — something he would come to address with marvelous eloquence in his memoir — and shares his concern about how mysticism and fundamentalism undercut society:

I’d like to think that people who are given a chance to learn facts and broaden their knowledge of the universe wouldn’t seek so avidly after mysticism.

[…]

It isn’t right to sell a person phony stock, and take money for it, and this is what mystics are doing. They’re selling people phony knowledge and taking money for it. Even if people feel good about it, I can well imagine that a person who really believes in astrology is going to have a feeling of security because he knows that this is a bad day, so he’ll stay at home, just as a guy who’s got phony stock may look at it and feel rich. But he still has phony stock, and the person who buys mysticism still has phony knowledge.

He offers a counterpoint and considers what real knowledge is, adding to history’s best definitions of science:

Science doesn’t purvey absolute truth. Science is a mechanism, a way of trying to improve your knowledge of nature. It’s a system for testing your thoughts against the universe and seeing whether they match. This works not just for the ordinary aspects of science, but for all of life.

Asimov goes on to bemoan the cultural complacency that has led to the decline of science in mainstream culture — a decline we feel even today more sharply than ever when, say, a creationist politician tries to stop a little girl’s campaign for a state fossil because such an effort would “endorse” evolution. Noting that “we are living in a business society” where fewer and fewer students take math and science, Asimov laments how we’ve lost sight of the fact that science is driven by not-knowing rather than certitude:

MOYERS: You wrote a few years ago that the decline in America’s world power is in part brought about by our diminishing status as a world science leader. Why have we neglected science?

ASIMOV: Partly because of success. The most damaging statement that the United States has ever been subjected to is the phrase “Yankee know-how.” You get the feeling somehow that Americans — just by the fact that they’re American — are somehow smarter and more ingenious than other people, which really is not so. Actually, the phrase was first used in connection with the atomic bomb, which was invented and brought to fruition by a bunch of European refugees. That’s “Yankee know-how.”

MOYERS: There’s long been a bias in this country against science. When Benjamin Franklin was experimenting with the lightning rod, a lot of good folk said, “You don’t need a lightning rod. If you want to prevent lightning from striking, you just have to pray about it.”

ASIMOV: The bias against science is part of being a pioneer society. You somehow feel the city life is decadent. American history is full of fables of the noble virtuous farmer and the vicious city slicker. The city slicker is an automatic villain. Unfortunately, such stereotypes can do damage. A noble ignoramus is not necessarily what the country needs.

(What might Asimov, who in 1980 voiced fears that the fundamentalists coming into power with President Reagan would turn the country even more against science by demanding that biblical creationism be given an equal footing with evolution in the classroom, have said if he knew that a contemporary television station can edit out Neil deGrasse Tyson’s mention of evolution?)

'The Expulsion of Adam and Eve from the Garden of Eden' by William Blake from John Milton's 'Paradise Lost'

But when Moyers asks the writer whether he considers himself an enemy of religion, Asimov answers in the negative and offers this beautifully thoughtful elaboration on the difference between the blind faith of religion and the critical thinking at the heart of science:

My objection to fundamentalism is not that they are fundamentalists but that essentially they want me to be a fundamentalist, too. Now, they may say that I believe evolution is true and I want everyone to believe that evolution is true. But I don’t want everyone to believe that evolution is true, I want them to study what we say about evolution and to decide for themselves. Fundamentalists say they want to treat creationism on an equal basis. But they can’t. It’s not a science. You can teach creationism in churches and in courses on religion. They would be horrified if I were to suggest that in churches they should teach secular humanism as an alternate way of looking at the universe or evolution as an alternate way of considering how life may have started. In the church they teach only what they believe, and rightly so, I suppose. But on the other hand, in schools, in science courses, we’ve got to teach what scientists think is the way the universe works.

He extols the “thoroughly conscious ignorance” at the heart of science as a much safer foundation of reality than dogma:

That is really the glory of science — that science is tentative, that it is not certain, that it is subject to change. What is really disgraceful is to have a set of beliefs that you think is absolute and has been so from the start and can’t change, where you simply won’t listen to evidence. You say, “If the evidence agrees with me, it’s not necessary, and if it doesn’t agree with me, it’s false.” This is the legendary remark of Omar when they captured Alexandria and asked him what to do with the library. He said, “If the books agree with the Koran, they are not necessary and may be burned. If they disagree with the Koran, they are pernicious and must be burned.” Well, there are still these Omar-like thinkers who think all of knowledge will fit into one book called the Bible, and who refuse to allow it is possible ever to conceive of an error there. To my way of thinking, that is much more dangerous than a system of knowledge that is tentative and uncertain.

Riffing off the famous and rather ominous Dostoevsky line that “if God is dead, everything is permitted,” Asimov revisits the notion of intrinsic vs. extrinsic rewards — similarly to his earlier remark that good writing is motivated by intrinsic motives rather than external incentives, he argues that good-personhood is steered not by dogma but by one’s own conscience:

It’s insulting to imply that only a system of rewards and punishments can keep you a decent human being. Isn’t it conceivable a person wants to be a decent human being because that way he feels better?

I don’t believe that I’m ever going to heaven or hell. I think that when I die, there will be nothingness. That’s what I firmly believe. That’s not to mean that I have the impulse to go out and rob and steal and rape and everything else because I don’t fear punishment. For one thing, I fear worldly punishment. And for a second thing, I fear the punishment of my own conscience. I have a conscience. It doesn’t depend on religion. And I think that’s so with other people, too.

'The Rout of the Rebel Angels' by William Blake from John Milton's 'Paradise Lost'

He goes on to extend this conscience-driven behavior to the domain of science, which he argues is strongly motivated by morality and a generosity of spirit uncommon in most other disciplines, where ego consumes goodwill. (Mark Twain memorably argued that no domain was more susceptible to human egotism than religion.) Asimov offers a heartening example:

I think it’s amazing how many saints there have been among scientists. I’ll give you an example. In 1900, De Vries studied mutations. He found a patch of evening primrose of different types, and he studied how they inherited their characteristics. He worked out the laws of genetics. Two other guys worked out the laws of genetics at the same time, a guy called Karl Correns, who was a German, and Erich Tschermak von Seysenegg, who was an Austrian. All three worked out the laws of genetics in 1900, and having done so, all three looked through the literature, just to see what had been done before. All three discovered that in the 1860s Gregor Mendel had worked out the laws of genetics, and people hadn’t paid any attention then. All three reported their findings as confirmation of what Mendel had found. Not one of the three attempted to say that it was original with him. And you know what it meant. It meant that two of them, Correns and Tschermak von Seysenegg, lived in obscurity. De Vries is known only because he was also the first to work out the theory of mutations. But as far as discovering genetics is concerned, Mendel gets all the credit. They knew at the time that this would happen. That’s the sort of thing you just don’t find outside of science.

Moyers, in his typical perceptive fashion, then asks Asimov why, given how much the truth of science excites him, he is best-known for writing science fiction, and Asimov responds with equal insight and outlines the difference, both cultural and creative, between fiction in general and science fiction:

In serious fiction, fiction where the writer feels he’s accomplishing something besides simply amusing people — although there’s nothing wrong with simply amusing people — the writer is holding up a mirror to the human species, making it possible for you to understand people better because you’ve read the novel or story, and maybe making it possible for you to understand yourself better. That’s an important thing.

Now science fiction uses a different method. It works up an artificial society, one which doesn’t exist, or one that may possibly exist in the future, but not necessarily. And it portrays events against the background of this society in the hope that you will be able to see yourself in relation to the present society… That’s why I write science fiction — because it’s a way of writing fiction in a style that enables me to make points I can’t make otherwise.

Painting by Rowena Morrill

But perhaps the greatest benefit of science fiction, Moyers intimates and Asimov agrees, is its capacity to warm people up to changes that are inevitable but that seem inconceivable at the present time — after all, science fiction writers do have a remarkable record of getting the future right. Asimov continues:

Society is always changing, but the rate of change has been accelerating all through history for a variety of reasons. One, the change is cumulative. The very changes you make now make it easier to make further changes. Until the Industrial Revolution came along, people weren’t aware of change or a future. They assumed the future would be exactly like it had always been, just with different people… It was only with the coming of the Industrial Revolution that the rate of change became fast enough to be visible in a single lifetime. People were suddenly aware that not only were things changing, but that they would continue to change after they died. That was when science fiction came into being as opposed to fantasy and adventure tales. Because people knew that they would die before they could see the changes that would happen in the next century, they thought it would be nice to imagine what they might be.

As time goes on and the rate of change still continues to accelerate, it becomes more and more important to adjust what you do today to the fact of change in the future. It’s ridiculous to make your plans now on the assumption that things will continue as they are now. You have to assume that if something you’re doing is going to reach fruition in ten years, that in those ten years changes will take place, and perhaps what you’re doing will have no meaning then… Science fiction is important because it fights the natural notion that there’s something permanent about things the way they are right now.

Painting by William Blake from Dante's 'Divine Comedy'

Given that accepting impermanence doesn’t come easily to us, that stubborn resistance to progress and the inevitability of change is perhaps also what Asimov sees in the religious fundamentalism he condemns — dogma, after all, is based on the premise that truth is absolute and permanent, never mind that the cultural context is always changing. Though he doesn’t draw the link directly, in another part of the interview he revisits the problem with fundamentalism with words that illuminate the stark contrast between the cultural role of religion and that of science fiction:

Fundamentalists take a statement that made sense at the time it was made, and because they refuse to consider that the statement may not be an absolute, eternal truth, they continue following it under conditions where to do so is deadly.

Indeed, Asimov ends the conversation on a related note as he considers what it would take to transcend the intolerance that such fundamentalism breeds:

MOYERS: You’ve lived through much of this century. Have you ever known human beings to think with the perspective you’re calling on them to think with now?

ASIMOV: It’s perhaps not important that every human being think so. But how about the leaders and opinion-makers thinking so? Ordinary people might follow them. It would help if we didn’t have leaders who were thinking in exactly the opposite way, if we didn’t have people who were shouting hatred and suspicion of foreigners, if we didn’t have people who were shouting that it’s more important to be unfriendly than to be friendly, if we didn’t have people shouting that the people inside the country who don’t look exactly the way the rest of us look have something wrong with them. It’s almost not necessary for us to do good; it’s only necessary for us to stop doing evil, for goodness’ sake.

Bill Moyers: A World of Ideas is a remarkable tome in its entirety. Complement this particular sample-taste with Asimov on religion vs. humanism, Buckminster Fuller’s vision for the future of education, and Carl Sagan on science and spirituality.


01 APRIL, 2014

Why Look at Animals: John Berger on What Our Relationship with Our Fellow Beings Reveals About Us


“[Animals] are the objects of our ever-extending knowledge.”

“Erasing the awe-inspiring variety of sentient life impoverishes all our lives,” Joanna Bourke wrote in her fascinating chronicle of our understanding of what it means to be human, an awareness inextricably entwined with a parallel understanding of the animal experience of our fellow sentient beings. Jon Mooallem put it even more poignantly in his beautiful and bittersweet meditation on the fate of wildlife today: “Maybe you have to believe in the value of everything to believe in the value of anything.” And yet how readily we, as a civilization and as individuals, stop believing in the value of that awe-inspiring variety of sentient life.

Hardly anyone has addressed this disquieting cultural tendency with more dimension than John Berger, best-known for his brilliant 1972 critique of consumer culture, Ways of Seeing. In his essay “Why Look at Animals?,” part of the altogether fantastic 1980 anthology About Looking (public library), Berger examines the evolution of our relationship with animals and how they went from muses for the very first human art, as cave men and women adorned their stone walls with drawings of animals painted with animal blood, to spiritual deities to captive entertainment.

Chauvet cave drawings from '100 Diagrams That Changed the World.'

He opens with a poetic reminder of how it all began:

To suppose that animals first entered the human imagination as meat or leather or horn is to project a 19th century attitude backwards across the millennia. Animals first entered the imagination as messengers and promises. For example, the domestication of cattle did not begin as a simple prospect of milk and meat. Cattle had magical functions, sometimes oracular, sometimes sacrificial. And the choice of a given species as magical, tameable and alimentary was originally determined by the habits, proximity and “invitation” of the animal in question.

But there was also something else that drew us closer to our fellow beings as they went from our bonfires to our backyards to our beds — some other kind of singular comfort they offered. As any devoted pet-parent (to use a term rather telling in itself) can attest, a big part of what makes those bonds so intimate is the unconditional affection pets provide, a lack of conditions largely premised on their inability to speak, to talk back, in our human language, coupled with their capacity to speak directly to the soul. Berger writes:

With their parallel lives, animals offer man a companionship which is different from any offered by human exchange. Different because it is a companionship offered to the loneliness of man as a species. Such an unspeaking companionship was felt to be so equal that often one finds the conviction that it was man who lacked the capacity to speak with animals — hence the stories and legends of exceptional beings, like Orpheus, who could talk with animals in their own language.

Illustration by Isabelle Arsenault from 'Jane, the Fox and Me.'

Berger adds:

What were the secrets of the animal’s likeness with, and unlikeness from man? The secrets whose existence man recognized as soon as he intercepted an animal’s look.

In one sense the whole of anthropology, concerned with the passage from nature to culture, is an answer to that question.

Leo from Salvador Dalí's zodiac series, 1967.

The spiritual quality of that animal gaze, Berger reminds us, stretches much further back than the age of domestication — animals comprise eight of the twelve ancient signs of the zodiac, and the Greeks signified each of the twelve clock-hours of the day with an animal. But that representational capacity was also precisely what separated us from other animals:

What distinguished man from animals was the human capacity for symbolic thought, the capacity which was inseparable from the development of language in which words were not mere signals, but signifiers of something other than themselves. Yet the first symbols were animals. What distinguished men from animals was born of their relationship with them.

Berger cites Aristotle’s History of Animals, considered the first scientific work on the subject, in which the legendary philosopher anthropomorphizes animals by suggesting that they carry traces of our “human qualities and attitudes,” such as “fierceness, mildness or cross-temper, courage or timidity, fear or confidence, high spirits or low cunning, and, with regard to intelligence, something akin to sagacity.” That anthropomorphism, rooted in our systematic use of the animal as a metaphor, continued up until the 19th century. Berger laments:

In the last two centuries, animals have gradually disappeared. Today we live without them. And in this new solitude, anthropomorphism makes us doubly uneasy.

Art by Maira Kalman from 'The Big New Yorker Book of Dogs.'

Berger goes on to trace how animals went from caves to carts to cages. The Industrial Revolution gave us the internal combustion engine, which displaced draught animals from both streets and factories. But while this was undoubtedly an upgrade for both animal rights and human productivity, removing animals from our view was detrimental to our sense of shared everyday reality. Meanwhile, as urbanization and industrialization spread, the extinction of wildlife continued removing animals from that reality — more than that, it forcibly denied them the chance to share it with us and instead confined them to the artificial reality of the zoo. Berger draws an unsettling parallel:

This reduction of the animal, which has a theoretical as well as economic history, is part of the same process as that by which men have been reduced to isolated productive and consuming units. Indeed, during this period an approach to animals often prefigured an approach to man.

Alongside this cultural change emerged another significant shift — the rise of pets, which Virginia Woolf’s nephew argued were an extension of human fashion and vanity. Noting that at the time of his writing the United States was home to an estimated 40 million dogs, 40 million cats, 15 million cage birds and 10 million other pets, Berger contextualizes our compulsion for domestic animal companionship:

The practice of keeping animals regardless of their usefulness, the keeping, exactly, of pets (in the 16th century the word usually referred to a lamb raised by hand) is a modern innovation, and, on the social scale on which it exists today, is unique. It is part of that universal but personal withdrawal into the private small family unit, decorated or furnished with mementoes from the outside world, which is such a distinguishing feature of consumer societies.

[…]

Equally important is the way the average owner regards his pet. (Children are, briefly, somewhat different.) The pet completes him, offering responses to aspects of his character which would otherwise remain unconfirmed. He can be to his pet what he is not to anybody or anything else. Furthermore, the pet can be conditioned to react as though it, too, recognizes this. The pet offers its owner a mirror to a part that is otherwise never reflected. But, since in this relationship the autonomy of both parties has been lost (the owner has become the-special-man-he-is-only-to-his-pet, and the animal has become dependent on its owner for every physical need), the parallelism of their separate lives has been destroyed.

Cartoon by George Booth from 'The Big New Yorker Book of Cats.'

But beneath this spiritual role of pets in completing the human self lies a darker dynamic, one in which the notion of caretaking becomes an imbalance of power. Berger writes:

In the accompanying ideology, animals are always the observed. The fact that they can observe us has lost all significance. They are the objects of our ever-extending knowledge. What we know about them is an index of our power, and thus an index of what separates us from them. The more we know, the further away they are.

That dynamic was even more pronounced in the public zoo — a 19th-century innovation that came into existence as animals began to disappear from our daily lives. Emerging as an emblem of colonial power, where the capturing of animals became a trophy in the conquest of exotic lands, the zoo changed not only our relationship with animals, but also our very language. Berger cites the London Zoo Guide:

About 1867, a music hall artist called the Great Vance sang a song called Walking in the zoo is the OK thing to do, and the word ‘zoo’ came into everyday use. London Zoo also brought the word ‘Jumbo’ into the English language. Jumbo was an African elephant of mammoth size, who lived at the zoo between 1865 and 1882. Queen Victoria took an interest in him and eventually he ended his days as the star of the famous Barnum circus which travelled through America — his name living on to describe things of giant proportions.

Berger poignantly observes:

The zoo to which people go to meet animals, to observe them, to see them, is, in fact, a monument to the impossibility of such encounters. Modern zoos are an epitaph to a relationship which was as old as man.

Illustration from 'The Animal Fair' by Alice and Martin Provensen.

But perhaps the most bittersweet reflection on the changing role of animals in our lives comes from the domain of children — the same observation that sparked Jon Mooallem’s ode to wildlife as he watched his little girl play with stuffed animals the real versions of which would be extinct by the time she grew up. Berger writes:

Children in the industrialized world are surrounded by animal imagery: toys, cartoons, pictures, decorations of every sort. No other source of imagery can begin to compete with that of animals. The apparently spontaneous interest that children have in animals might lead one to suppose that this has always been the case. Certainly some of the earliest toys (when toys were unknown to the vast majority of the population) were animal. Equally, children’s games, all over the world, include real or pretended animals. Yet it was not until the 19th century that reproductions of animals became a regular part of the decor of middle class childhoods — and then, in this century, with the advent of vast display and selling systems like Disney’s — of all childhoods.

(Perhaps MoMA curator Juliet Kinchin put it best in her design history of childhood, where she observed that “children help us to mediate between the ideal and the real” — and nowhere is the disconnect between the two more dramatic than in children’s animal toys.)

Returning to the zoo, where animals have become isolated from each other and deprived of interaction between species, where they have come to rely helplessly on their keepers for survival, Berger draws yet another chilling parallel between the animal experience and human culture:

All sites of enforced marginalization — ghettos, shanty towns, prisons, madhouses, concentration camps — have something in common with zoos. But it is both too easy and too evasive to use the zoo as a symbol. The zoo is a demonstration of the relations between man and animals; nothing else. The marginalization of animals is today being followed by the marginalization and disposal of the only class who, throughout history, has remained familiar with animals and maintained the wisdom which accompanies that familiarity: the middle and small peasant. The basis of this wisdom is an acceptance of the dualism at the very origin of the relation between man and animal. The rejection of this dualism is probably an important factor in opening the way to modern totalitarianism.

Photograph by Sharon Montrose from her series 'Menagerie.'

The zoo, then, bears out Joanna Bourke’s admonition: by marginalizing and impoverishing the lives of animals, it does the same to our own. Berger concludes:

The zoo cannot but disappoint. The public purpose of zoos is to offer visitors the opportunity of looking at animals. Yet nowhere in a zoo can a stranger encounter the look of an animal. At the most, the animal’s gaze flickers and passes on. They look sideways. They look blindly beyond. They scan mechanically. They have been immunized to encounter, because nothing can any more occupy a central place in their attention. Therein lies the ultimate consequence of their marginalization… This historic loss, to which zoos are a monument, is now irredeemable for the culture of capitalism.

About Looking is well worth a read in its entirety. Complement it with Wild Ones: A Sometimes Dismaying, Weirdly Reassuring Story About Looking at People Looking at Animals in America, one of the best science books of 2013, then revisit these favorite books about animals.

Thanks, Raghava


31 MARCH, 2014

The Evolutionary Mystery of Left-Handedness and What It Reveals About How the Brain Works


From Medieval sword-fighters to Broca’s brains, or why the hand may hold the key to the link between creativity and mental illness.

“Sahara is too little price / to pay for thy Right hand,” Emily Dickinson wrote in a poem. “The right hand = the hand that is aggressive, the hand that masturbates,” Susan Sontag pondered in her diary in 1964. “Therefore, to prefer the left hand! … To romanticize it, to sentimentalize it!” The human hand has long carried cultural baggage, and yet we still struggle to unclutch from it the myths and reveal the realities.

The question of why some humans are left-handed — including such notable specimens as Plato, Charles Darwin, Carl Sagan, Debbie Millman, Stephen Jay Gould, Noam Chomsky, and Albert Einstein* — has perplexed scientists for centuries. For Southpaws themselves — the affectionate term for lefties — this biological peculiarity has been everything from a source of stigma to a point of pride. But at the heart of it remains an evolutionary mystery — one that Wired contributing editor David Wolman, himself (but of course) a lefty, sets out to investigate in A Left-Hand Turn Around the World: Chasing the Mystery and Meaning of All Things Southpaw (public library).

Wolman, who spoke about the oddity of handedness in a fantastic recent Radiolab episode about conflict, traces 200 years of biological and psychological perplexity as he scours the world for answers, from a Parisian science museum that houses pioneering French surgeon and scholar Paul Broca’s bottled brains to a castle in Scotland that hides clues to the heritability of left-handedness to the neuroscience labs of Berkeley to the golf courses of Japan.

To be sure, a Southpaw wasn’t always a mere scientific curiosity, let alone the “lefty superiority complex” which Wolman both notes and embodies — for centuries, it was the subject of superstition, which bestowed upon its owner a serious social curse. Wolman writes:

In the Western world, left-handedness has long been associated with the worst of the worst: sin, devil worship, Satan himself, and just an all-around bad position with God. Catholic schoolteachers used to tell students that left-handedness was “the mark of the Beast,” the Scots say a person with terrible luck must have been baptized by a left-handed priest, and Orthodox Jews wrap their left arms in the leather strap of tefillin as if to say, in the words of Rabbi Lawrence Kushner: “Here I am, standing with my dangerous side bridled, ready to pray.” The Bible is full of references to hands, and usually they are about God doing something benevolent and holy with his right hand. I’ll spare you the run-through and stick to a token example, like this one from Psalms 118: “The right hand of the Lord is exalted. The right hand of the Lord doeth valiantly.”

Carl Sagan, a lefty

While Carl Sagan once hypothesized that the cultural link between left-handedness and badness arose due to the left hand’s use for hygiene purposes in nonindustrialized countries, Wolman points out that the association has much deeper roots, including the very etymology of the word “left”:

The Anglo-Saxon lyft means weak or broken, and even modern dictionaries include such meanings for left as “defective,” “crippled,” “awkward,” “clumsy,” “inept,” and “maladroit,” the latter one borrowed from French, translated literally as “bad right.” Most definitions of left reduce to an image of doubtful sincerity and clumsiness, and the Latin word for left, sinister, is a well-known beauty. From this version springs my favorite term for left-handedness, “the bend sinister,” which Vladimir Nabokov used for the title of a book that has nothing to do with handedness.

Even today, our understanding of handedness is muddled by misconceptions. While it’s currently estimated that 10-12% of the human population is left-handed, the very definition of handedness is cause for confusion:

Most people presume the hand used for writing is the litmus test for determining whether someone is lefty or righty, and for anyone content to live with a pedestrian level of knowledge on the subject, this narrow reading will serve well enough. [And yet] everyday tasks, like throwing and eating, also influence the popular understanding of hand dominance, sometimes nearly as strongly as writing. These different behaviors lead immediately to a quintessential problem of handedness inquiries: how to define handedness itself. The definition of lefty or righty varies, sometimes to a frustrating degree, and that variation has troubled researchers who want to get a better handle on why it is that humans have hand preference and performance discrepancies in the first place, where these discrepancies come from, and why as a population we usually favor the right hand.

Also to be accounted for is the fact that many people are born with a natural inclination to write with the left hand but are schooled out of it for various reasons. (I myself, an adult righty, am among them. The most visceral evidence is found on my left thumb, whose pad an elongated scar splits vertically. It was inflicted while carving a watermelon jack-o-lantern at age six and accidentally flipping the knife the wrong way, pressing onto its edge rather than its blunt side. The fact that I held the knife in my left hand is cited to this day as indication of my natural left-handedness. The fact that I held the knife at all is cited as indication of questionable parenting.)

The most commonly used test for handedness is an imperfect inventory created in the early 1970s (the Edinburgh Handedness Inventory), which generates what researchers call a laterality quotient.
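The arithmetic behind such a quotient is simple. Here is a minimal sketch in Python, assuming the standard Edinburgh-style formula LQ = 100 × (R − L) / (R + L), where R and L count the tasks for which each hand is preferred; the responses below are illustrative, not drawn from Wolman's book:

```python
def laterality_quotient(right: int, left: int) -> float:
    """Edinburgh-style laterality quotient: LQ = 100 * (R - L) / (R + L).

    Runs from -100 (exclusively left-handed) to +100 (exclusively right-handed).
    """
    if right + left == 0:
        raise ValueError("at least one hand preference must be recorded")
    return 100 * (right - left) / (right + left)

# Illustrative responses to ten everyday tasks: "R" = right hand, "L" = left hand.
responses = ["R", "R", "R", "L", "R", "R", "L", "R", "R", "R"]
right, left = responses.count("R"), responses.count("L")

print(f"LQ = {laterality_quotient(right, left):+.0f}")  # prints: LQ = +60
```

A score near +100 marks a strong righty and a score near −100 a strong lefty; the mixed-handers who complicate the picture later in the book land in between, which is part of what makes the inventory imperfect.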

Still, even if we were able to codify handedness, the question of why the dichotomy exists in the first place remains open. But before diving into the various theories, Wolman offers an essential primer on how the brain works:

It’s a well-known aspect of the brain-body relationship that control of movement is crisscrossed. That is, the act of swatting at a buzzing mosquito with the right hand is controlled by the left side of the brain, or more specifically by a certain area of the left hemisphere known as the left motor cortex, which sends the necessary signals to muscles in the right arm. The reverse is true for actions carried out with the left hand, and all of this is irrespective of handedness. The hemisphere on the same side as a movement isn’t entirely silent … but for the most part motor control comes from the opposite hemisphere. This contralateral control, first described by Hippocrates himself, isn’t limited to hands. It applies to arms, legs, eyes, ears, and indeed almost all motor faculties, which is why people who’ve had a stroke or tumor on one side of the brain often experience partial or total paralysis on the opposite side of the body.

[…]

The left and right sides of the brain are physically quite distinct. The brain is made up of two mostly separate halves, each composed of billions and billions of neural connections. Yet despite popular notions to the contrary, left-handed people do not think in the right hemisphere of the brain, nor do right-handers think in the left hemisphere. The motor cortex, that part of each hemisphere cross-wired to control the other side of the body, is only one relatively minor aspect of this dizzyingly complex organ, and it says nothing or nearly nothing about a person’s thoughts or personality.

So what can the brain reveal about handedness? One of the first scientists to ponder the mystery of left-handedness was pioneering French surgeon Paul Broca, whom Wolman calls “the closest thing the religion of Southpaw has to a prophet.” In 1861, just two years after Darwin had published On the Origin of Species, Broca encountered two patients who stumped him profoundly. One was an epileptic man named Leborgne but known as “Tan,” nicknamed after the only syllable he was capable of uttering. Leborgne was able to understand spoken language but couldn’t articulate his thoughts in speech — something that perplexed Broca enormously, doubly so given that one of Leborgne’s first symptoms was a weakening of function in the right side of his body, which progressed to more loss of motor control and eventually the loss of sight and some of his mental faculties.

When Leborgne died at the age of 51, Broca decided to crack the mystery — literally. He dissected Leborgne’s brain and found a massive lesion in the left frontal cortex, likely due to a tumor. Broca concluded that this must somehow be related to Leborgne’s symptoms, of which the loss of speech was the most dramatic. But a direct link would take longer to establish.

Then came Lelong, an elderly patient who had ended up in Broca’s care after a fall, only able to utter a few words. When Lelong died two weeks later, Broca discovered a similarly dramatic lesion in the left side of his brain, which is preserved to this day in Paris.

But the most perplexing of Broca’s patients were the few who had either had damage to the left hemisphere but no difficulties with speech or who had lost their ability to speak but only had damage in the right hemisphere. This led Broca to conclude that for a minority of people, the speech centers were located in the right rather than left hemispheres, which he at first thought might be tied to handedness, but later surmised that both phenomena were anomalous exceptions and weren’t correlated. Still, his work was the first significant spark for the study of handedness and stirred quite the flurry within the scientific community. Wolman points out the clash between science and popular culture that ensued:

Because some people are an exception to the language-to-the-left rule, and because a similarly small proportion of people are left-handed, everyone and his cousin in the medical establishment figured the two must go hand in hand; lefties should have language lateralized to the right. What’s interesting about this conclusion is that few people in nineteenth-century Europe would have admitted to being left-handed. Detecting someone’s left-handedness would have been difficult, with eating, writing, and other major tasks all usually carried out with the right hand. What’s also interesting about this conclusion is that it’s wrong. Nearly 99 percent of right-handers have language located in the left hemisphere, and about 70 percent of lefties do. A different proportion, yes, but hardly the opposite; most lefty brains are like righty brains, at least as far as speech function is concerned. The rest either have language in the right hemisphere, or have it distributed more evenly between the two sides of the brain.

Paul Broca

Though Broca himself had gently dismissed the link between handedness and right-hemispheric speech dominance, he hadn’t gone out of his way to assert the dissociation, so the myth that lefties had their speech located in the right hemisphere persisted for nearly half a century. It wasn’t until WWI, when physicians began to notice that injured veterans who were lefties didn’t necessarily have right-hemispheric language localization, that the myth began to erode and the quest for new theories gained momentum.

Wolman points to several of the notable theories that followed, each in turn disproven by science but offering a valuable piece of the still-unfinished puzzle. One of the earliest proposed that handedness in humans was originally evenly distributed, but hand-to-hand battle in the ancient world killed off the lefties because they held the sword with their left hand and the shield in their right, thus leaving the heart much less protected than for righties, who held the shield on the left. As the lefties perished on the battlefield, so did their genes.

The theory was disproven for a number of reasons, including the fact that there is evidence of left-handedness long before the invention of the sword and shield as well as the biological reality that the heredity of handedness isn’t so straightforward.

A later theory proposed pretty much the opposite — that left-handedness gave warriors a competitive advantage “for much the same reason left-handed tennis players, boxers, or fencers have an advantage.” Wolman cites the example of a left-handed 16th-century Scottish warrior from the famous Kerr clan:

The lefty advantage exists because, in a world with far fewer lefties than righties, right-handed, or even left-handed, opponents have comparatively little practice facing off against left-handers. Lefty forehands are hit from that less familiar side, their stronger punches originate from that less comfortable side, and their opposing stance differs from what people are accustomed to, namely right-handed opponents. For Andrew Kerr, the matter was of far greater import than for, say, John McEnroe serving up another ace to the ad-court at Wimbledon; for Kerr, success with the sword was a matter of life and death.

[…]

French scientists recently hypothesized that fighting advantages for left-handers in prehistoric human societies ensured reproductive success. To test the idea, they looked at the homicide rates in various societies, wondering if the Southpaw advantage might be magnified within historically more violent populations. From the results, it looks like it was, with proportions of left-handedness ranging from 3.4 percent in an especially pacifist African community in Burkina Faso to 22.6 percent in a notoriously violent culture in South America. Southpaws aren’t more violent, of course, but may have had a survival advantage in societies that were.

Still, these hypotheses were never convincingly verified. Fast-forward to the late 1970s and early 1980s, when Harvard neurologist Norman Geschwind proposed a theory that endured for many years: he suggested that high levels of testosterone in the womb led to a minor mutation in the brain, which caused its organization to shift away from right-handedness. One point cited in support of his theory is the fact that there are slightly more male lefties than female ones.

An earlier womb theory had posited that the fetus’s orientation inside fosters its sense of stability, so handedness is based on whether the fetus uses the left or right hand for balance — the free hand, in most cases the right, can then move and explore, eventually becoming the dominant one.

Once again, neither theory found sufficient support. And so we return to the brain and genetics. Wolman cites the work of English researcher Marian Annett, who discovered the key to the messy heredity patterns of handedness:

In 1972, Annett … published a paper titled, “The Distribution of Manual Asymmetry,” which, although of little notice at the time, would later serve as the foundation for one of the most widely accepted explanations of human handedness. She called it the Right Shift Theory, and she later expanded it in a 1985 volume of the same name. Annett argues that whereas human handedness is comparable to the left- or right-side preferences exhibited by other creatures with hands, paws, feet, or what have you, the approximate 90 percent predominance of right-handedness in the human population sets us apart. All other animals have a 50-50 split between righties and lefties. According to Annett’s model, handedness in nature rests on a continuum, ranging from strong left, through mixed, and then to strong right-handedness. But for humanity the distribution of preference and performance is dramatically shifted to the right. Human bias to the right, Annett explains, was triggered by a shift to the left hemisphere of the brain for certain cognitive functions, most likely speech. . . . That momentous shift was caused by a gene.

Of course, that question had perplexed generations of scientists since Darwin — who, by the way, was a victim of the confounding heredity of handedness: his wife and father-in-law were lefties, but only two of Darwin and Emma’s ten children were. But Annett’s Right Shift Theory was the first systematic explanation for the genetics of handedness. Still, Wolman observes the complexities of genetics:

In many ways, the genes-versus-environment dichotomy is a misleading one because so often the two work hand in hand. Say, for instance, a gene or genes instructs for a certain amount of testosterone in the womb. If the level of that hormone varies and somehow influences the development of the fetus, should traits affected by the level of testosterone be dubbed genetic or environmental? One could argue that the biochemical conditions in the womb — the fetus’s surroundings — qualify as environmental factors, but those conditions are shaped by genetic instructions. Yet within the DNA of every cell of that newborn baby, there will be no information specific to the child’s conditions in the womb. Can we call that a genetic trait? Luckily, Annett’s theory supposes a less ambivalent role for a gene, or possibly a few genes. Inside the nuclei of nearly every cell in the body are left-twisting bundles of DNA that either do or do not contain what Annett has dubbed the “Right Shift factor.”

What this series of hypotheses reveals, more than anything, is just how little we know about the inner workings of the body — despite having sequenced the human genome, here we are still struggling to explain as seemingly simple a characteristic as handedness. And yet, Wolman argues, Annett’s research was groundbreaking and immensely valuable for three reasons: “the idea of a handedness continuum; the chance-based gene predicting not individual handedness but shifted population distribution; and finally the suggestion that people are not left-handed or right-handed, but right-handed or non-right-handed.”

Vintage artwork from 'A Visual History of Magic.'

Perhaps the most interesting theory, however, is a rather fringe proposition that ties handedness to “magical ideation” — one’s tendency to believe in metaphysical phenomena that aren’t scientifically verifiable, from supernatural forces to extrasensory perception to reincarnation and other concepts that wouldn’t hold up to Carl Sagan’s Baloney Detection Kit. Wolman cites New Zealand scholar Michael Corballis, who has written about the potential link between more brain symmetry — something found in lefties — and magical ideation:

Hemispheric asymmetry itself may lead to more decisive and controlled action, and perhaps a better ability to organize hierarchical processes, as in language, manufacture, and theory of mind. Those individuals who lack cerebral asymmetry [a.k.a. increased symmetry] may be more susceptible to superstition and magical thinking, but more creative and perhaps more spatially aware.

On the magical ideation scale — the measure of belief in such phenomena — lefties tend to score higher than righties. And yet, Wolman points out, “anecdotal evidence that lefties are highly represented in low-bullshit-tolerating professions such as journalism and science doesn’t exactly support this notion,” suggesting instead that the magical ideation hypothesis is best “recalibrated as a degree-not-direction descriptor.”

What makes this theory intriguing, however, isn’t its verifiability or lack thereof but what it reveals about our culture’s beliefs about creativity and mental illness, or cognitive abnormality. Wolman writes:

The magical ideation line of thinking loops back to creativity when we consider findings indicating an increased proportion of left-handers who suffer from such disorders as schizophrenia. With due acknowledgment once again to Corballis for synthesis of this idea, it’s plausible that schizophrenia and magical ideation sprout from similar neurological roots. Research demonstrating connections between mixed-handedness and either of these two conditions advances that plausibility.

[…]

Consider for a moment that there’s a thin, perhaps blurred line between genius and mental illness. What if some types of genius stem from the same aspect of the brain — or influence on the brain — as, say, magical ideation and schizophrenia, and that subtle variation in the arrangement of certain brain circuits determines the difference between the next da Vinci, the next graphology believer, the next Hendrix-like guitar god, or the next schizophrenic?

Albert Einstein's brain (Photograph: NMHM, Silver Spring, MD via Nature)

Wolman points to Kim Peek, the autistic “megasavant” on whom the film Rain Man is based, and perhaps most notably Albert Einstein, celebrated as “the quintessential modern genius”:

Examination of his brain after death showed unusual anatomical symmetry that … can mean above-normal interhemispheric connections. Then there’s the fact that Einstein’s genius is often linked with an imagination supercharged with imagery, a highly right hemisphere-dependent function. It was that kind of imagination that ignited questions leading eventually to the Theory of Relativity: what does a person on a moving train see compared with what a person standing still sees, and how would the body age if traveling near the speed of light in a spaceship compared to the aging process observed on Earth? Is it such a stretch to speculate that Einstein landed on the fortunate end of the same brain organization spectrum upon which other, less lucky individuals land in the mental illness category? And what if handedness too is influenced by this organizational crapshoot?

Illustration by Vladimir Radunsky for 'On a Beam of Light: A Story of Albert Einstein.'

The investigation is ever-ongoing, but Wolman offers a wealth of other pause-giving findings and theories in the rest of A Left-Hand Turn Around the World.

* Einstein’s handedness is somewhat a matter of debate. While he is often cited among history’s famous lefties, laterality scholars have surmised that he was mixed-handed — which is not to be confused with ambidextrous: mixed-handed people use the right hand for some things and the left for others, whereas the ambidextrous can use both hands equally well for most tasks.
