Human origins and human futures
Looking back at Darwin and ahead at genetic engineering.
In Darwin’s era, just 150 years ago, science was a vastly different, almost amateurish enterprise. It was largely the province of solitary gentlemen-scholars with little formal training and lots of leisure time. Charles Darwin (1809-1882), a scion of country gentry who attended university to prepare for the clergy, was one such man. He was a “natural historian” whose early dabbling in collecting beetles and studying rock formations eventually led, almost against his will, to the theories that made him famous in some quarters and notorious elsewhere. Blessed with a placid temperament and an amiable disposition, he was a most reluctant revolutionary.
Yet the great man himself might be astounded by what his successors have wrought. Today we are hurtling, willy-nilly, into a bio-medical Brave New World that changes almost daily. Research is directed by Ph.D.s in university or corporate labs, routinized, narrowly focused, and highly funded. The wonders that result, from hip replacements to gene therapy, offer so much promise that it’s easy to overlook their ethical implications.
Janet Browne’s two-volume life, Charles Darwin: Voyaging and Charles Darwin: The Power of Place, is a masterly work, one of the most formidable biographies of our time, at once intellectually sophisticated and engagingly readable. Here is Darwin as protean man—the most curious of human beings, the avid collector, the meticulous analyzer of data, the grand synthesizer, the religious agnostic, the self-absorbed husband, the doting father, the chronic invalid.
Browne’s most innovative achievement is to frame her subject within the parameters of the English elite. “Behind Darwin,” she writes, “lay the vast unacknowledged support system of the Victorian gentry, and beyond that the farflung network of imperial, colonial Britain. He was born a wealthy child in a socially secure, well-connected family. His father was a rich physician and his mother a daughter of Josiah Wedgwood, the potter. . . . He possessed many advantages in life, including an education at the best institutions Britain had to offer. The friends he made at Cambridge University proved influential figures over the years.”
Securing Darwin the position of naturalist on the Beagle, for instance, was the work of his Cambridge mentor, botanist John Henslow. Captain Robert FitzRoy wanted someone of gentlemanly breeding as travel companion, perhaps an understandable preference in light of the five years (1831-1836) they would share aboard a 90-foot craft with a crew of seventy-two mostly illiterate souls.
The 22-year-old Darwin was required to pay his own fare, the equivalent of two years at Cambridge, a burden taken on by his obliging father. Thus when the naturalist scoured Patagonia or the Galapagos for specimens, he was hardly a lone explorer but well-outfitted with horses, guides, and baggage handlers.
In 1859, after years of mustering evidence from the fossil record, embryology, comparative anatomy, and animal behavior, plus painstaking research of his own (including eight years on barnacles alone), Darwin published On the Origin of Species. What made his argument for evolution through natural selection persuasive, according to Browne, was an appealingly humane writing style and “a mountain of scrupulously considered data” made coherent by a breadth of vision.
Once again, Darwin’s social network proved indispensable. Prominent scientists like Charles Lyell in Britain and Asa Gray in America put aside any quibbles to lead the defense. Thomas Henry Huxley became his most ardent champion and shrewdest popularizer. “Without [Huxley] acting as a pugnacious bulldog to his stay-at-home Labrador,” Browne writes, “Darwin would have been lost in the ensuing conflict.”
Thus was framed the defining question of the Victorian era: Who has the authority to explain the origin of living things, theologians or scientists? A predictable corollary followed, sparked in 1871 by Darwin’s The Descent of Man: Were humans descended from apes, as Darwin came to believe, or fashioned by God?
Darwin’s wife, Emma, was a devout Christian. But under the pressure of his research, all the pillars of Darwin’s faith toppled one by one: belief in miracles, divine revelation, biblical authority, and, finally, the necessity of a deity. As self-description, he adopted Huxley’s term “agnostic”; whether God existed he considered “beyond the scope of man’s intellect.” Still, he was well aware that his rationalist universe worked according to its own laws. It had no need for a Divine Watchmaker who designed and regulated all of creation.
Unitarianism, Darwin’s grandfather Erasmus Darwin liked to say, was “a feather-bed to catch a falling Christian.” (Darwin’s mother, Susanna Wedgwood, had been a Unitarian, and he was tutored for a year by her minister; but the adult Darwin was always nominally affiliated with the Church of England, in deference to his wife.) Aptly and wittily put for his time, the definition is too narrow for today’s Unitarian Universalists, who espouse many varieties of humanism and theism.
Whatever Unitarian Universalists may think of Darwin’s agnosticism, they cannot escape the field his ideas revolutionized. Biological science affects us every day, most directly in our food, medical care, and environment. Increasingly, as reflected in genetics, it affects our very sense of who we are.
Every cell of every human being, geneticists inform us, has some 35,000 genes. These genes, made up of DNA, or deoxyribonucleic acid, amount to blueprints that tell the body how to perform. When normal, they allow the body to operate in a healthy manner. If they are structurally imperfect, they may cause disease or malfunction.
The Human Genome Project was the huge collective research project to identify all of those genes. The same kind of work has been done on other animals, including the great apes. The results are astonishing—about 98 percent similarity in genetic structure. But the real question is, how much does the overlap matter?
The title of Jonathan Marks’ witty and lucid study, What It Means To Be 98% Chimpanzee: Apes, People, and Their Genes, gets right to the heart of things. Humans, he notes, also share a high percentage of genes with many animals. What’s more, 35 percent of our genes are the same as those of the lowly daffodil. Does this make us 35 percent daffodil? Hardly. In the genes that do differentiate us from apes, he concludes, lies all the difference.
“Ultimately,” writes Marks, an evolutionary anthropologist, “the fallacy is not a genetic but a cultural one—our reduction of the important things in life to genetics.” He aims his biggest guns at reductionist theorists who interpret the genetic evidence to mean that we are nothing but smarter apes. Sociobiologists, for example, study the biological roots of human behavior, he writes, “whether or not they exist.” What they cannot prove—say, the inborn aggression of males—they merely assume.
Marks believes that sociobiologists, like Rousseau and other philosophers before them, are pursuing an impossible goal: the discovery of an elusive something called “human nature.” The great apes are contemporary analogs to the “noble savages” the Frenchman perceived in Native Americans. Trouble is, Marks writes, “there is no human nature outside culture. . . . If the human is like a cake, culture is like the eggs, not like the icing—it is an inseparable part, not a superficial glaze.”
However great his genius, Darwin’s understanding of how species thrive or become extinct through natural selection pales before that of today’s undergraduate biology students. Though he theorized that inherited traits were passed on by minute, unseen entities he called “gemmules” (sometimes “granules” or “germs”), the word “gene” was not in his vocabulary. Genetics was in its infancy, its mechanisms a mystery.
Bill McKibben’s latest venture into social commentary, Enough, explores the ethical fallout of a phenomenon Darwin would have found inconceivable: “unnatural selection,” that is, genetic engineering, or artificially changing the structure of genes. No scientist himself, McKibben, author of The End of Nature and The Age of Missing Information, is a visionary thinker who challenges our preconceptions of what is obvious, impossible, or foolish.
Because of the unprecedented pace of scientific change, McKibben believes, the question of who we are has never been more up for grabs. Make no mistake. He does not oppose “somatic gene therapy,” the introduction of a therapeutic gene into a patient’s cells to treat an illness. His objection is to “germline” genetic engineering, the modification of genes in the fertilized embryo to produce desired (and inheritable) characteristics, from eye color to height to intelligence. Germline engineering is prohibited in the United States under current research guidelines, but McKibben suggests that we are on the brink of what was once taboo.
Consider the designer-gene race we may someday face. Wouldn’t conscientious (and wealthy) parents gladly improve their unborn children’s intelligence or muscle mass if the Joneses next door had already upgraded theirs in embryo? Sibling rivalry will never be the same: “What if you had a second child five years after the first, and by that time upgrades were undeniably improved: How would you feel about the first kid? How would he feel about his new brother, the latest model?”
On the societal level, the implications are far worse—a caste system divided between what geneticist Lee Silver has dubbed the “GenRich” and the “Naturals.”
While it’s easy to outline possible dangers, it’s harder to show how the juggernaut can be stopped, especially as it delivers welcome medical breakthroughs. The day-to-day improvements—a treatment for sickle cell anemia, for instance—are enticing. But remember, that’s somatic gene therapy. What should we make of germline engineering promising us longer life? If scientists can extend the lifespan of nematodes sevenfold, what might they one day offer human beings?
“Techno-zealots,” McKibben’s term for his opponents, argue that it’s pointless to oppose germline engineering. If something is technologically feasible, it will be done. Trying to ban it is futile, a mere finger in the dike. Market pressures are too strong, scientific rivalries too intense, the intellectual urge to explore the unknown irresistible. And in modern consumer society, the author notes, there is no more seductive word than “choice.” Why be old-fashioned? Choose the sort of child you really want.
James Watson himself, Nobel Prize-winning co-discoverer of DNA’s double helix, exhorts us to plunge boldly ahead. To which McKibben replies, beware the aura of pontificating scientists. Experts in genetics aren’t necessarily authorities in ethics.
So, can the juggernaut be stopped? The author insists it can. We have the power to say no. By way of example, he cites several societies that chose to turn down available technology because it clashed with more crucial values. In the fifteenth century, China scrapped its formidable navy, which had ventured as far as the East African coast; a century later, Japan gave up firearms. In our time, McKibben finds inspiration in the fact that no nation has resorted to nuclear weapons since their initial use in 1945—and in two remarkable innovations in saying no: non-violent civil disobedience and preservation of wilderness as ecological necessity.
Technological efforts to perfect humanity run counter to the ethic of restraint counseled by philosophy and religion. To try to engineer human perfection is the ultimate in hubris. If daring to say no is quixotic, the consequences of doing nothing are worse.
“That inefficiency, that tension, that tug in different directions,” McKibben writes, “is what we call consciousness. It explains novel-writing and rock-climbing and church-going, and it explains both the difficulties and the glories of family and community and love. Machines don’t have that tension; the other animals move along that edge with inborn grace. Consciousness doesn’t make us better than robots or rhinoceri. It just makes us different. It just makes us human.”