
Is 80 the New 30?

Cartoon in The New Yorker: Two cavemen are shown chatting, one of whom is saying, “Something’s just not right—our air is clean, our water is pure, we all get plenty of exercise, everything we eat is organic and free-range, and yet nobody lives past thirty.”

Statistical distortions due to infanticide are not the only source of confusion concerning prehistoric longevity. As you might imagine, it’s not so easy to determine the age at death of a skeleton that’s been in the ground for thousands of years. For various technical reasons, archaeologists often underestimate the age at death. For example, archaeologists estimated the ages at death of skeletons taken from mission cemeteries in California. After the estimates had been made, written records of the actual ages at death were discovered. While the archaeologists had estimated that only about 5 percent had lived to age forty-five or beyond, the documents proved that roughly seven times as many (37 percent) of the people buried in these cemeteries were over forty-five years of age when they died.10 If estimates can be so far off on skeletons just a few hundred years old, imagine the inaccuracies with remains that are tens of thousands of years old.

One of the most reliable techniques archaeologists use to estimate age at death is dental eruption. They look at how far the molars have grown out of the jawbone, which indicates roughly how old a young adult was at death. But our “wisdom teeth” stop “erupting” in our early to mid-thirties, which means that archaeologists note the age at death of skeletons beyond this point as “35+.” This doesn’t mean that thirty-five was the age of death, but that the person was thirty-five or older. He or she may have been anywhere from thirty-five to one hundred years old. Nobody knows.

Somewhere along the line this notation system was mistranslated in the popular press, leaving the impression that our ancient ancestors rarely made it past thirty-five. Big mistake. A wide range of data sources (including, even, the Old Testament) point to a typical human life span of anywhere from seventy (“three score and ten”) to over ninety years.

In one study, scientists calibrated brain and body-weight ratios across different primates, arriving at an estimated life span of sixty-six to seventy-eight years for Homo sapiens.11 These numbers bear up under observation of modern-day foragers. Among the !Kung San, Hadza, and Aché (societies in Africa and South America), a female who lived to forty-five could be expected to survive another 20, 21.3, and 22.1 years, respectively.12 Among the !Kung San, most people who reached sixty could reasonably expect to live another ten years or so—active years of mobility and social contribution. Anthropologist Richard Lee reported that one in ten of the !Kung he encountered in his time in Botswana were over sixty years of age.13

As mentioned in previous chapters, it’s clear that overall human health (including longevity) took a severe hit from agriculture. The typical human diet went from extreme variety and nutritional richness to just a few types of grain, possibly supplemented by occasional meat and dairy. The Aché diet, for example, includes 78 different species of mammal, 21 species of reptiles and amphibians, more than 150 species of birds, and 14 species of fish, as well as a wide range of plants.14

In addition to the reduced nutritional value of the agricultural diet, the diseases deadliest to our species began their dreadful rampage when human populations turned to agriculture. Conditions were perfect: high-density population centers stewing in their own filth, domesticated animals in close proximity (adding their excrement, viruses, and parasites to the mix), and extended trade routes facilitating the movement of contagious pathogens from populations with immunity to vulnerable communities.15

When James Larrick and his colleagues studied the still relatively isolated Waorani Indians of Ecuador, they found no evidence of hypertension, heart disease, or cancer. No anemia or common cold. No internal parasites. No sign of previous exposure to polio, pneumonia, smallpox, chicken pox, typhus, typhoid, syphilis, tuberculosis, malaria, or serum hepatitis.16

This is not as surprising as it may seem, given that almost all these diseases either originated in domesticated animals or depend upon high-density population for easy transmission. The deadliest infectious diseases and parasites that have plagued our species could not have spread until after the transition to agriculture.

Table 3: Deadly Diseases from Domesticated Animals17

HUMAN DISEASE          ANIMAL SOURCE
Measles                Cattle (rinderpest)
Tuberculosis           Cattle
Smallpox               Cattle (cowpox)
Influenza              Pigs and birds
Pertussis              Pigs and dogs
Falciparum malaria     Birds

The dramatic increases in world population that paralleled agricultural development don’t indicate increased health, but increased fertility: more people living to reproduce, but lower quality of life for those who do. Even Edgerton, who repeatedly tells the longevity lie (Foragers’ “lives are short—life expectancy at birth ranges between 20 and 40 years …”), has to agree that, somehow, foragers managed to be healthier than agriculturalists: “Agriculturalists throughout the world were always less healthy than hunters and gatherers.” The urban populations of Europe, he writes, “did not match the longevity of hunter-gatherers until the mid-nineteenth or even twentieth century.”18

That’s in Europe. People living in Africa, most of Asia, and Latin America have still not regained the longevity typical of their ancestors and, thanks to chronic world poverty, global warming, and AIDS, it’s unlikely they will for the foreseeable future.

Once pathogens mutate into human populations from domesticated animals, they quickly migrate from one community to another. For these agents of disease, the initiation of global trade was a boon. Bubonic plague took the Silk Route to Europe. Smallpox and measles stowed away on ships headed for the New World, while syphilis appears to have hitched a ride back across the Atlantic, probably on Columbus’s first return voyage. Today, the Western world flutters into annual panics over avian flu scares emanating from the Far East. Ebola, SARS, flesh-eating bacteria, the H1N1 virus (swine flu), and innumerable pathogens yet to be named keep us all compulsively washing our hands.

While there were no doubt occasional outbreaks of infectious diseases in prehistory, it’s unlikely they spread far, even with high levels of sexual promiscuity. It would have been nearly impossible for pathogens to take hold in widely dispersed groups of foragers with infrequent contact between groups. The conditions necessary for devastating epidemics or pandemics just didn’t exist until the agricultural revolution. The claim that modern medicine and sanitation save us from infectious diseases that ravaged pre-agricultural people (something we hear often) is like arguing that seat belts and air bags protect us from car crashes that were fatal to our prehistoric ancestors.