From "Why the West Rules — For Now: The Patterns of History, and What They Reveal About the Future," by Ian Morris.
Chapter 1: Before East and West
What Is the West?
“When a man is tired of London,” said Samuel Johnson, “he is tired of life; for there is in London all that life can afford.” It was 1777, and every current of thought, every bright new invention, was energizing Dr. Johnson’s hometown. London had cathedrals and palaces, parks and rivers, mansions and slums. Above all, it had things to buy—things beyond the wildest imaginings of previous generations. Fine ladies and gentlemen could alight from carriages outside the new arcades of Oxford Street, there to seek out novelties like the umbrella, an invention of the 1760s that the British soon judged indispensable; or the handbag, or toothpaste, both of them products of the same decade. And it was not just the rich who indulged in this new culture of consumption. To the horror of conservatives, tradesmen were spending hours in coffee shops, the poor were calling tea a “necessary,” and farmers’ wives were buying pianos.
The British were beginning to feel they were not like other people. In 1776 the Scottish sage Adam Smith had called them “a nation of shopkeepers” in his Inquiry into the Nature and Causes of the Wealth of Nations, but he had meant it as a compliment; Britons’ regard for their own well-being, Smith insisted, was making everyone richer. Just think, he said, of the contrast between Britain and China. China had been “long one of the richest, that is, one of the most fertile, best cultivated, most industrious, and most populous, countries of the world,” but had already “acquired that full complement of riches which the measure of its laws and institutions permits it to acquire.” The Chinese, in short, were stuck. “The competition of the labourers and the interest of the masters,” Smith predicted, “would soon reduce them to the lowest rate which is consistent with common humanity,” with the consequence that “the poverty of the lower ranks of people in China far surpasses that of the most beggarly nations in Europe…Any carrion, the carcase of a dead dog or cat, for example, though half putrid and stinking, is as welcome to them as the most wholesome food to the people of other countries.”
Johnson and Smith had a point. Although the industrial revolution had barely begun in the 1770s, average incomes were already higher and more evenly distributed in England than in China. Long-term lock-in theories of Western rule often start from this fact: the West’s lead, they argue, was a cause rather than a consequence of the industrial revolution, and we need to look back further in time—perhaps much further—to explain it.
Or do we? The historian Kenneth Pomeranz, whose book The Great Divergence I mentioned in the introduction, insists that Adam Smith and all the cheerleaders for the West who followed him were actually comparing the wrong things. China is as big and as varied, Pomeranz points out, as the whole continent of Europe. We should not be too surprised, then, that if we single out England, which was Europe’s most developed region in Smith’s day, and compare it with the average level of development in the whole of China, England scores higher. By the same token, if we turned things around and compared the Yangzi Delta (the most developed part of China in the 1770s) with the average level of development across the whole of Europe, the Yangzi Delta would score higher. Pomeranz argues that eighteenth-century England and the Yangzi Delta had more in common with each other (incipient industrialism, booming markets, complex divisions of labor) than England did with underdeveloped parts of Europe or the Yangzi Delta did with underdeveloped parts of China—all of which leads him to conclude that long-term theorists get things back-to-front because their thinking has been sloppy. If England and the Yangzi Delta were so similar in the eighteenth century, Pomeranz observes, the explanation for Western rule must lie after this date, not before it.
One implication is clear: if we want to know why the West rules, we first need to know what “the West” is. As soon as we ask that question, though, things get messy. Most of us have a gut feeling about what constitutes “the West.” Some people equate it with democracy and freedom; others with Christianity; others still with secular rationalism. In fact, the historian Norman Davies has found no fewer than twelve ways that academics define the West, united only by what he calls their “elastic geography.” Each definition gives the West a different shape, creating exactly the kind of confusion that Pomeranz complains about. The West, says Davies, “can be defined by its advocates in almost any way that they think fit,” meaning that when we get right down to it, “Western civilization is essentially an amalgam of intellectual constructs which were designed to further the interests of their authors.”
If Davies is right, asking why the West rules means nothing more than arbitrarily picking some value to define the West, claiming that a particular set of countries exemplifies this value, then comparing that set with an equally arbitrary set of “non-Western” countries to reach whatever self-serving conclusions we like. Anyone who disagrees with our conclusions can simply choose a different value to exemplify Westernness, a different set of countries exemplifying it, and a different comparison set, coming—naturally—to a different but equally self-serving conclusion.
This would be pointless, so I want to take a different approach. Instead of starting at the end of the process, making assumptions about what count as Western values and then looking back through time to find their roots, I will start at the beginning and move forward through time until we reach a point at which we can see distinctive ways of life emerging in different parts of the world. I will then call the westernmost of these distinctive regions “the West” and the easternmost “the East,” treating West and East for what they are—geographical labels, not value judgments.
Saying we must start at the beginning is one thing; finding it is another altogether. As we will see, there are several points in the distant past at which scholars have been tempted to define East and West in terms of biology, rejecting the argument I made in the introduction that folks (in large groups) are all much the same and instead seeing the people in one part of the world as genetically superior to everyone else. There are also points when it would be all too easy to conclude that one region has, since time immemorial, been culturally superior to all others. We must look into these ideas carefully, because if we make a misstep here at the start, everything we conclude about the shape of the past, and therefore about the shape of the future, will be wrong too.
In the Beginning
Every culture has had its own story about how things started, but in the last few years astrophysicists have given us some new, scientific versions. Most experts now think time and space began over 13 billion years ago, although they do not agree on just how that happened. The dominant “inflationary” theory holds that the universe initially expanded faster than the speed of light from an infinitely dense and infinitely small point, while a rival “cyclical” theory argues that it blew up when a previous universe collapsed. Both schools agree that our universe is still expanding, but while inflationists say it will continue to grow, the stars will go out, and eventually infinite darkness and coldness will descend, cyclists claim it will shrink back on itself, explode again, and start another new universe.
It is hard to make much sense of these theories unless you have had years of advanced mathematical training, but fortunately our question does not require us to begin quite so early. There could be neither East nor West when there were no directions at all and when the laws of nature did not exist. Nor could East and West be useful concepts before our sun and planet took shape 4.5 billion years ago. Perhaps we can speak of East and West once the earth’s crust formed, or at least once the continents reached something like their current positions, by which point we are already into the last few million years. Really, though, all these discussions are beside the point: East and West cannot mean anything for the question in this book until we add another ingredient to the mix—humans.
Paleoanthropologists, who study early humans, like controversy even more than historians do. Their field is young and fast moving, and new discoveries constantly turn established truths on their heads. If you get two paleoanthropologists into a room they are likely to come out with three theories of human evolution, and by the time the door shuts behind them, all will be out of date.
The boundary between humans and prehumans is necessarily fuzzy. Some paleoanthropologists think that as soon as we see apes that could walk upright we should start speaking of humans. Judging from the fossilized remains of hip and toe bones, some East African apes began doing this 6 or 7 million years ago. Most experts, though, think this sets the bar too low, and standard biological classifications in fact define the genus Homo (“mankind” in Latin) by bundling together an increase in brain size from 400–500 cubic centimeters to roughly 630 (our own brains are typically about twice as big) with the first evidence for upright apes smashing stones together to create crude tools. Both processes began among bipedal East African apes around 2.5 million years ago. Louis and Mary Leakey, the famous excavators of Olduvai Gorge in Tanzania (Figure 1.1), named these relatively big-brained, tool-using creatures Homo habilis, Latin for “Handy Man.” (Until recently, paleoanthropologists, like most people, thought nothing of applying the word “man” to individuals of both sexes; that has changed, but by convention scientists still use single-sex names like Handy Man.)
East and West meant little when Homo habilis walked the earth—first, because these creatures lived entirely within the forests of East Africa, and no regional variations had yet developed, and second, because the expression “walked the earth” is actually overly generous. Handy Men had toes and ankles like ours, and certainly did walk, but their long arms suggest that they also spent a lot of time in trees. These were fancy apes, but not much more. The marks their stone tools left on animal bones show that Homo habilis ate meat as well as plants, but it looks like they were still quite low on the food chain. Some paleoanthropologists defend a man-the-hunter theory, seeing Homo habilis as smart and brave enough to kill game armed with nothing more than sticks and broken stones, but others (rather more convincingly) see in Homo habilis man-the-scavenger, following the real killers (like lions) around, eating the bits they didn’t want. Microscopic studies show that marks from Handy Man’s tools did at least get onto animal bones before those from hyenas’ teeth.
For 25,000 generations Handy Men scampered and swung through the trees in this little corner of the world, chipping stone tools, grooming each other, and mating. Then, somewhere around 1.8 million years ago, they disappeared. So far as we can tell this happened rather suddenly, although one of the problems in studying human evolution is the difficulty of dating finds precisely. Much of the time we depend on the fact that the layers of rock containing the fossil bones or tools may also contain unstable radioactive isotopes whose rate of decay is known, so that measuring the ratios between the isotopes gives dates for the finds. These dates, however, can have margins of error tens of thousands of years wide, so when we say the world of Homo habilis ended suddenly, “suddenly” may mean a few lifetimes or a few thousand lifetimes.
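The dating logic sketched above can be put as a small calculation. This is a simplified illustration only, not a method from the text: it assumes a single parent-daughter isotope pair, that the rock began with no daughter isotope, and that the system stayed closed; the potassium-argon numbers are my own illustrative choices.

```python
import math

def radiometric_age(parent_atoms, daughter_atoms, half_life_years):
    """Estimate a sample's age in years from the parent/daughter isotope ratio.

    Simplifying assumptions (real geochronology corrects for both):
    the sample started with zero daughter isotope, and no atoms
    entered or left the rock after it formed.
    """
    decay_constant = math.log(2) / half_life_years  # decays per atom per year
    # N_parent(t) = N0 * exp(-lambda * t), and daughter = N0 - N_parent,
    # so t = ln(1 + daughter/parent) / lambda.
    return math.log(1 + daughter_atoms / parent_atoms) / decay_constant

# Illustrative numbers: in a potassium-40 -> argon-40 system
# (half-life about 1.25 billion years), a sample in which roughly
# 0.1 percent of the parent has decayed comes out near 1.8 million
# years old -- the scale of the dates discussed above.
age = radiometric_age(parent_atoms=1000.0, daughter_atoms=1.0,
                      half_life_years=1.25e9)
```

The wide margins of error the text mentions come from measurement uncertainty in those isotope ratios: a tiny change in the measured daughter/parent ratio shifts the computed age by thousands of years.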
When Charles Darwin was thinking about natural selection in the 1840s and 1850s he assumed that it worked through the slow accretion of tiny changes, but in the 1970s the biologist Stephen Jay Gould suggested instead that for long periods nothing much happens, then some event triggers a cascade of changes. Evolutionists nowadays divide over whether gradual change (evolution by creeps, as its critics call it) or Gould’s “punctuated equilibrium” (evolution by jerks) is better as a general model, but the latter certainly seems to make most sense of Homo habilis’s disappearance. About 1.8 million years ago East Africa’s climate was getting drier and open savannas were replacing the forests where Homo habilis lived; and at just that point, new kinds of ape-men took Handy Man’s place.
I want to hold off putting a name on these new ape-men, and for now will just point out that they had bigger brains than Homo habilis, typically about 800 cc. They lacked the long, chimplike arms of Homo habilis, probably meaning that they spent nearly all their time on the ground. They were also taller. A million-and-a-half-year-old skeleton from Nariokotome in Kenya, known as the Turkana Boy, belongs to a five-foot-tall child who would have reached six feet had he survived to adulthood. As well as being longer, his bones were less robust than those of Homo habilis, suggesting that he and his contemporaries relied more on their wits and tools than on brute strength.
Most of us think that being smart is self-evidently good. Why, then, if Homo habilis had the potential to mutate in this direction, did they putter along for half a million years before “suddenly” morphing into taller, bigger-brained creatures? The most likely explanation lies in the fact that there is no such thing as a free lunch. A big brain is expensive to run. Our own brains typically make up 2 percent of our body weight but use up 20 percent of the energy we consume. Big brains create other problems too: it takes a big skull to hold a big brain—so big, in fact, that modern women have trouble pushing babies with such big heads down their birth canals. Women deal with this by in effect giving birth prematurely. If our babies stayed in the womb until they were almost self-sufficient (like other mammals), their heads would be too big for them to get out.
Yet risky childbirth, years of nurturing, and huge brains that burn up one fifth of our food intake are all fine with us—finer, anyway, than using the same amounts of energy to grow claws, more muscles, or big teeth. Intelligence is much more of a plus than any of these alternatives. It is less obvious, though, why a genetic mutation producing bigger brains gave ape-men enough advantages to make the extra energy costs worthwhile a couple of million years ago. If being smarter had not been beneficial enough to pay the costs of supporting these gray cells, brainy apes would have been less successful than their dumber relatives, and their smart genes would have quickly disappeared from the population.
Perhaps we should blame it on the weather. When the rains failed and the trees the ape-men lived in started dying, brainier and perhaps more sociable mutants might well have gained an edge over their more apelike relatives. Instead of retreating ahead of the grasslands, the clever apes found ways to survive on them, and in the twinkling of an eye (on the timescale of evolution) a handful of mutants spread their genes through the whole pool and completely replaced the slower-witted, undersized, forest-loving Homo habilis.
The Beginnings of East and West?
Whether because their home ranges got crowded, because bands squabbled, or just because they were curious, the new ape-men were the first such creatures to leave East Africa. Their bones have been found everywhere from the southern tip of the continent to the Pacific shores of Asia. We should not imagine great waves of migrants like something out of a cowboy movie, though; the ape-men were surely barely conscious of what they were doing, and crossing these vast distances required even vaster stretches of time. From Olduvai Gorge to Cape Town in South Africa is a long way—two thousand miles—but to cover this ground in a hundred thousand years (the length of time it apparently took) ape-men only needed, on average, to expand their foraging range by 35 yards each year. Drifting northward at the same rate would take them to the threshold of Asia, and in 2002 excavators at Dmanisi in the Republic of Georgia found a 1.7-million-year-old skull that combines features of Homo habilis and the newer ape-men. Stone tools from China and fossil bones from Java (then still joined to the Asian mainland) may be almost as old, implying that after leaving Africa the ape-men picked up speed, averaging a cracking pace of 140 yards per year.
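The averages in the paragraph above are easy to verify. A minimal check, with the caveat that the 8,000-mile Africa-to-Java distance is my own rough assumption chosen to match the "140 yards per year" figure; only the 2,000-mile Olduvai-to-Cape-Town leg is stated in the text:

```python
YARDS_PER_MILE = 1760

def yards_per_year(miles, years):
    """Average annual range expansion implied by a distance and a duration."""
    return miles * YARDS_PER_MILE / years

# Olduvai Gorge to Cape Town: about 2,000 miles in about 100,000 years,
# which works out to roughly 35 yards of new foraging range per year.
africa_rate = yards_per_year(2000, 100_000)

# If the Chinese and Javanese sites really are nearly as old as Dmanisi,
# the post-Africa leg (assumed here to be roughly 8,000 miles, covered in
# about 100,000 years) implies roughly 140 yards per year.
asia_rate = yards_per_year(8000, 100_000)
```

The point of the arithmetic is the one the text makes: distances that look heroic on a map dissolve into imperceptibly small annual movements once spread over geological stretches of time.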
We can only realistically expect to distinguish Eastern and Western ways of life after ape-men left East Africa, spreading through the warm, subtropical latitudes as far as China; and an East-West distinction may be just what we do find. By 1.6 million years ago, there are obvious Eastern and Western patterns in the archaeological record. The question, though, is whether these contrasts are important enough that we should imagine distinct ways of life lying behind them.
Archaeologists have known about these East-West differences since the 1940s, when the Harvard archaeologist Hallam Movius noticed that the bones of the new, brainy ape-men were often found in association with new kinds of flaked stone tools. Archaeologists called the most distinctive of these tools “Acheulean hand axes” (“ax” because they look like axheads, even though they were clearly used for cutting, poking, and pounding as well as chopping; “hand” because they were handheld, rather than being attached to sticks; and Acheulean after the small French town of St. Acheul, where they were first found in large numbers). Calling these tools works of art might be excessive, but their simple symmetry is often much more beautiful than Handy Men’s cruder flakes and chopping tools. Movius noticed that while Acheulean hand axes were common in Africa, Europe, and southwest Asia, none had been found in East or Southeast Asia. Instead, Eastern sites produced rougher tools much like the pre-Acheulean finds associated with Homo habilis in Africa.
If the so-called Movius Line (Figure 1.2) really does mark the beginning of separate Eastern and Western ways of life, it could also provide an astonishingly long-term lock-in theory—one holding that almost as soon as ape-men moved out of Africa, they divided between Western/technologically advanced/Acheulean hand ax cultures in Africa and southwest Asia and Eastern/technologically less advanced/flake-and-chopper cultures in East Asia. No wonder the West rules today, we might conclude: it has led the world technologically for a million and a half years.
Identifying the Movius Line, though, is easier than explaining it. The earliest Acheulean hand axes, found in Africa, are about 1.6 million years old, but there were already ape-men at Dmanisi in Georgia a hundred thousand years before that. The first ape-men clearly left Africa before the Acheulean hand ax became a normal part of their toolkit, carrying pre-Acheulean technologies across Asia while the Western/African region went on to develop Acheulean tools.
A quick glance at Figure 1.2, though, shows that the Movius Line does not divide Africa from Asia; it actually runs through northern India. This is an important detail. The first migrants left Africa before Acheulean hand axes were invented, so there must have been subsequent waves of migration out of Africa, bringing hand axes to southwest Asia and India. So we need to ask a new question: Why did these later waves of ape-men not take Acheulean technology even farther east?
The most likely answer is that rather than marking the boundary between a technologically advanced West and a less-advanced East, the Movius Line merely separates Western regions where access to the sort of stones needed for hand axes is easy from Eastern areas where such stones are rare and where good alternatives—such as bamboo, which is tough but does not survive for us to excavate—are easily available. According to this interpretation, as hand-ax users drifted across the Movius Line they gradually gave Acheulean tools up because they could not replace broken ones. They carried on producing choppers and flakes, for which any old pebble would do, but perhaps started using bamboo for tasks previously done with stone hand axes.
Some archaeologists think finds from the Bose Basin in south China support this thinking. About 800,000 years ago a huge meteor crashed here. It was a disaster on an epic scale, and intense fires burned millions of acres of forest. Before the impact, ape-men in the Bose Basin had used choppers, flakes, and (presumably) bamboo, like other East Asians; but when they returned after the fires they started making hand axes rather like the Acheulean ones—perhaps, the theory runs, because the fires had burned off all the bamboo, in the process exposing usable cobbles. After a few centuries, as the vegetation grew back, the locals gave up hand axes and went back to bamboo.
If this speculation is right, East Asian ape-men were perfectly capable of making hand axes when conditions favored these tools, but normally did not bother because alternatives were more easily available. Stone hand axes and bamboo tools were just two different tools for doing the same jobs, and ape-men all lived in much the same ways, whether they found themselves in Morocco or Malaya.
That makes reasonable sense, but, this being prehistoric archaeology, there are other ways of looking at the Movius Line too. So far I have avoided giving a name to the ape-men who used Acheulean hand axes, but at this point the name we give them starts to matter.
Since the 1960s most paleoanthropologists have called the new species that evolved in Africa about 1.8 million years ago Homo erectus (“Upright Man”) and have assumed that these creatures wandered through the subtropical latitudes to the shores of the Pacific Ocean. In the 1980s, however, some experts began focusing on subtle differences between Homo erectus skulls found in Africa and those found in East Asia. They suspected that they were in fact looking at two different species of ape-men. They coined a new name, Homo ergaster (“Working Man”), for those who evolved in Africa 1.8 million years ago and then spread all the way to China. Only when Homo ergaster reached East Asia, they suggested, did Homo erectus evolve from them. Homo erectus was therefore a purely East Asian species, distinct from the Homo ergaster who filled Africa, southwest Asia, and India.
If this theory is correct, the Movius Line was not just a trivial difference in tool types: it was a genetic watershed that split early ape-men in two. In fact, it raises the possibility of what we might call the mother of all long-term lock-in theories: that East and West are different because Easterners and Westerners are—and have been for more than a million years—different kinds of human beings.
The First Easterners: Peking Man
This technical debate over classifying prehistoric skeletons has potentially alarming implications. Racists are often eager to pounce on such details to justify prejudice, violence, and even genocide. You might feel that taking the time to talk about a theory of this kind merely dignifies bigotry; perhaps we should just ignore it. But that, I think, would be a mistake. Pronouncing racist theories contemptible is not enough. If we really want to reject them, and to conclude that people (in large groups) really are all much the same, it must be because racist theories are wrong, not just because most of us today do not like them.
Basically, we do not know whether there was just one kind of ape-man on earth around 1.5 million years ago—meaning that ape-men (in large groups) were all much the same from Africa to Indonesia—or whether there was one distinct species of Homo ergaster west of the Movius Line and another of Homo erectus east of it. Only further research will clear that question up. But we do know, without a shadow of doubt, that within the last million years distinct species of ape-men did evolve in East and West.
Geography probably had a lot to do with this. The ape-men that drifted out of Africa around 1.7 million years ago were well adapted to subtropical climes, but as they wandered northward, deeper into Europe and Asia, they had to face longer and harsher winters. Living in the open air, like their African ancestors, became increasingly impractical as they advanced toward a line roughly 40 degrees north of the equator (running from the top of Portugal to Beijing; see Figure 1.1). So far as we can tell, building huts and making clothes were beyond their mental capacities, but they could figure out one response: take shelter in caves. Thus were born the cavemen we all heard about as children.
Cave-dwelling was a mixed blessing for the ape-men, who regularly had to share space with bears and lion-sized hyenas whose teeth could crunch up bones. It was a godsend for archaeologists, though, because caves preserve prehistoric deposits well, allowing us to trace how the evolution of ape-men began diverging in the Eastern and Western parts of the Old World as different adaptations to the colder climates took hold.
For understanding Eastern ape-men, the most important site is Zhoukoudian near Beijing, right on the 40-degree line, occupied on-and-off from about 670,000 through 410,000 years ago. The story of its excavation is an epic in its own right, and forms the backdrop to part of Amy Tan’s excellent novel The Bonesetter’s Daughter. While European, American, and Chinese archaeologists were digging here between 1921 and 1937, the hills around the site became the front line in a brutal civil war among Nationalists, Communists, and assorted homegrown warlords. The excavators often worked to the sound of gunfire and had to dodge bandits and checkpoints to take their finds back to Beijing. The project finally collapsed when Japan invaded China, Zhoukoudian became a Communist base, and Japanese troops tortured and murdered three members of the team.
Matters then went from bad to worse. In November 1941, when war between Japan and the United States looked certain, a decision was taken to ship the finds to New York for safekeeping. Technicians packed them into two large crates to await collection in a car from the American embassy in Beijing. No one knows for sure if the car ever came, or where, if it did come, it took the crates. One story has it that Japanese soldiers intercepted the U.S. Marines escorting the finds at the very moment bombs started falling on Pearl Harbor, arrested them, and abandoned the priceless finds. Life was cheap in those dark days, and no one paid much attention to a few boxes of rocks and bones.
But all was not lost. The Zhoukoudian team had published their finds meticulously and had sent plaster casts of the bones to New York—an early example of the importance of backing up data. These show that by 600,000 years ago Peking Man (as the excavators dubbed the Zhoukoudian ape-men) had diverged from tall, lanky Africans like the Turkana Boy toward a stockier form, better suited to cold. Peking Men were typically around five feet three inches tall and less hairy than modern apes, though if you ran into one on Main Street it would certainly be disconcerting. They had short, wide faces, with low, flat foreheads, a heavy single eyebrow, and a big jaw with almost no chin.
Conversation with Peking Man would be a challenge. So far as we can tell, the basal ganglia (the parts of the brain that allow modern humans to combine a small number of mouth movements into an infinite number of utterances) of Homo erectus were poorly developed. The well-preserved skeleton of the Turkana Boy also has a neural canal (holding the spinal cord) only three quarters as wide as a modern human’s, suggesting that he could not control his breathing precisely enough to talk anything like we do.
That said, other finds suggest—indirectly—that ape-men in the Eastern Old World could communicate, after a fashion. In 1994 archaeologists on the little island of Flores near Java excavated what appeared to be 800,000-year-old stone tools. Eight hundred thousand years ago Flores was definitely an island, separated from the mainland by twelve miles of ocean; all of which seemed to mean that Homo erectus must have been able to communicate well enough to make boats, sail over the horizon, and colonize Flores. Other archaeologists, however, dismayed at the idea of boat-building Homo erectus, countered that perhaps these “tools” were not tools at all; maybe they were simply rocks bashed into misleading shapes by natural processes.
The argument could easily have deadlocked, as archaeological debates so often do, but in 2003 Flores yielded up even more astonishing discoveries. A deep sounding exposed eight skeletons, all dating around 16,000 BCE, all belonging to adults, and all under four feet tall. The first of Peter Jackson’s films of The Lord of the Rings had just come out, and journalists immediately labeled these prehistoric little people “hobbits,” after J.R.R. Tolkien’s furry-footed halflings. When animal populations are isolated on islands where there are no predators they quite often evolve into dwarf forms, and this is presumably how the “hobbits” came to be so small. To have shrunk to hobbit size by 16,000 BCE, though, ape-men must have colonized Flores many thousands of generations earlier—perhaps even as long as 800,000 years ago, as the stone tools found in 1994 suggest. The implication, once again, is that Homo erectus could communicate well enough to cross the sea.
The ape-men at Zhoukoudian, then, could probably make themselves understood much better than chimpanzees or gorillas, and the deposits from the cave suggest that they could also make fire at will. On at least one occasion Peking Men roasted a wild horse’s head. Cuts on the skull show they were after its tongue and brain, both rich in fats. They may have been fond of one another’s brains too: in the 1930s the excavators inferred cannibalism and even headhunting from bone-breakage patterns. A 1980s study of the plaster casts showed that most of the marks on the skulls were actually caused by the teeth of prehistoric giant hyenas rather than other Peking Men, but one skull—an additional fragment of which was excavated in 1966—definitely shows stone tool marks.
If instead of bumping into a Peking Man on a modern Main Street you could take a time machine back to Zhoukoudian half a million years ago, you would have a disorienting and alarming experience. You would see the cavemen communicating, perhaps with grunts and gestures, but you would not be able to talk to them. Nor could you get through to them by drawing pictures; there is no good evidence that art made any more sense to Homo erectus than it does to chimpanzees. The Peking Men that evolved in the Eastern Old World were very different from us.
The First Westerners: Neanderthals
But were Peking Men also different from the ape-men that were evolving in the Western Old World? The oldest finds from Europe, made in 1994 in a chain of caves at Atapuerca in Spain, date back about 800,000 years (roughly to the time that Homo erectus may have taken to boats and colonized Flores). In some ways, the Atapuerca finds were rather like those from Zhoukoudian: many of the bones were crisscrossed with cut marks from stone tools exactly like those that butchery would produce.
The hints of cannibalism grabbed headlines, but paleoanthropologists were even more excited by the ways in which Atapuerca differed from Zhoukoudian. The Atapuerca skulls had bigger brain cavities than those of Homo erectus and rather modern-looking noses and cheekbones. The paleoanthropologists concluded that a new species was emerging, which they called Homo antecessor (“Ancestral Man”).
Homo antecessor helped make sense of a string of finds going back to 1907, when workmen had turned up a strange jawbone in a sandpit in Germany. The species it belonged to, named Heidelberg Man after the nearby university town, looked much like Homo erectus but had a head more like ours, with a high, rounded skull and a brain of about 1,000 cc—much bigger than the 800 cc average for Homo erectus. It looks as if the pace of evolutionary change accelerated all across the Old World after 800,000 years ago as ape-men entering the cold north encountered wildly different climates where random genetic mutations could flourish.
Here at last we have some incontrovertible facts. By 600,000 years ago, when Heidelberg Man came onto the scene and Peking Man ruled the roost at Zhoukoudian, there were definitely different species of Homo in the Eastern and Western parts of the Old World: in the East the small-brained Homo erectus and in the West the larger-brained Homo antecessor and Heidelberg Man.
When it comes to brains, size is not everything. Anatole France won the Nobel Prize for literature in 1921 with a brain no bigger than Heidelberg Man’s. Yet Heidelberg Man does seem to have been a lot smarter than earlier ape-men or contemporary Peking Man. Before Heidelberg Man showed up, stone tools had barely changed for a million years, but by 500,000 BCE Heidelberg Man was making thinner and therefore lighter versions, striking more delicate flakes using soft (probably wood) hammers as well as just banging rocks together. This suggests better hand-eye coordination. Heidelberg Men and Women also made more specialized tools and began preparing specially shaped stone cores from which they could strike further tools at will, which must mean that they were just a lot better than Homo erectus at thinking about what they wanted from the world and how to get it. The very fact that Heidelberg Man could survive at Heidelberg, well north of the 40-degree line, is itself evidence of a smarter ape-man.
Zhoukoudian’s occupants changed little between 670,000 and 410,000 years ago, but Western ape-men continued evolving across this period. If you crawl several hundred yards into the dank Spanish caves at Atapuerca, mostly on your belly and sometimes using ropes, you come to a forty-foot drop into the aptly named Pit of Bones—the densest concentration of ape-man remains ever found. More than four thousand fragments have been recovered here since the 1990s, dated between 564,000 and 600,000 years ago. Most belong to teenagers or young adults. What they were doing so far beneath the earth remains a mystery, but like the older Atapuerca deposit, the Pit of Bones has remarkably diverse human remains. The Spanish excavators classify most of them as Heidelberg Man, but many foreign scholars think they look more like yet another species—the Neanderthals.
These most famous of cavemen were first recognized in 1856, when quarry workers in the Neander Valley (Tal or Thal in German) showed a local schoolteacher a skullcap and fifteen bones they had found (excavations in the 1990s recovered a further sixty-two fragments from the workers’ waste dump). The teacher showed them to an anatomist, who, with impressive understatement, pronounced them “pre-Germanic.”
The Atapuerca finds suggest that Neanderthals emerged gradually across a quarter of a million years. Rather than climate change or expansion into new areas providing conditions for a few mutants to out-breed and replace Heidelberg Man, this may have been a case of genetic drift, with many different kinds of ape-men developing alongside one another. “Classic” Neanderthals appeared by 200,000 years ago and within another hundred thousand years spread over much of Europe and east into Siberia, though so far as we know they did not reach China or Indonesia.
Just how much did Neanderthals differ from Peking Men? They were typically about the same height as Eastern ape-men and were even more primitive-looking, with sloping foreheads and weak chins. They had big front teeth, often worn down from use as tools, set in forward-thrust faces with large noses, the latter perhaps an adaptation to the cold air of Ice Age Europe. Neanderthals were more heavily built than Peking Men, with broader hips and shoulders. They were as strong as wrestlers, had the endurance of marathon runners, and seem to have been ferocious fighters.
Despite having much heavier bones than most ape-men, Neanderthals got injured a lot; the closest modern parallel to their bone-breakage patterns, in fact, comes from professional rodeo riders. Since there were no bucking broncos to fall off a hundred thousand years ago (horses would not be domesticated until around 4000 BCE), paleoanthropologists are confident that Neanderthals got hurt fighting—with one another and with wild animals. They were dedicated hunters; analysis of nitrogen isotopes from their bones shows that they were massively carnivorous, getting an amazing proportion of their protein from meat. Archaeologists had long suspected that Neanderthals got some of their meat by eating one another, just like Peking Man, and in the 1990s finds in France proved this beyond a doubt. The bones of half a dozen Neanderthals were found mixed with those of five red deer. The ape-men and deer had been treated exactly the same way: first they were cut into pieces with stone tools, then the flesh was sliced off their bones, and finally their skulls and long bones were smashed to get at their brains and marrow.
The details I have emphasized so far make Neanderthals sound not so different from Peking Men, but there is more to the story than this. For one thing, Neanderthals had big brains—even bigger brains than ours, in fact, averaging around 1,520 cc to our 1,350 cc. They also had wider neural canals than the Turkana Boy, and these thick spinal cords gave them more manual dexterity. Their stone tools were better made and more varied than Peking Men’s, with specialized scrapers, blades, and points. Traces of tar on a stone point found embedded in a wild ass’s neck in Syria suggest that it had been a spearhead attached to a stick. Wear patterns on tools suggest that Neanderthals used them mostly for cutting wood, which rarely survives, but at the waterlogged German site of Schöningen four beautifully carved seven-foot-long spears turned up near heaps of wild horse bones. The spears were weighted for thrusting, not throwing; for all their smartness, Neanderthals may not have been coordinated enough to use missile weapons.
The need to get up close to scary animals may account for Neanderthals’ rodeo-rider injuries, but some finds, especially from Shanidar Cave in Iraq, hint at entirely different qualities. One skeleton showed that a man had survived with a withered arm and deformed legs for years, despite losing his right forearm and left eye (in her bestselling novel The Clan of the Cave Bear, Jean Auel based her character Creb—the disabled spiritual leader of a Neanderthal band living in Crimea—on this skeleton). Another man at Shanidar had crippling arthritis in his right ankle, but also managed to get by, at least until a stab wound killed him. Having bigger brains doubtless helped the weak and injured to help themselves; Neanderthals could definitely make fire at will and could probably turn animal skins into clothes. All the same, it is hard to see how the Shanidar men could have coped without help from able-bodied friends or family. Even the most austere scientists agree that Neanderthals—by contrast with all earlier kinds of Homo and their contemporaries at Zhoukoudian—showed something we can only call “humanity.”
Some paleoanthropologists even think that Neanderthals’ big brains and wide neural canals allowed them to talk more or less like us. Like modern humans they had hyoid bones, which anchor the tongue and let the larynx make the complex movements needed for speech. Other scholars disagree, though, noting that Neanderthal brains, while big, were longer and flatter than ours, and that the speech areas were probably less developed. They also point out that although the relevant areas survive on the bases of only three skulls, it looks as if Neanderthals’ larynxes were very high in their necks, meaning that despite their hyoid bones they could vocalize only a narrow range of sounds. Maybe they could just grunt single syllables (what we might call the “me Tarzan, you Jane” model), or maybe they could express important concepts—“come here,” “let’s go hunting,” “let’s make stone tools/dinner/love”—by combining gestures and sounds (the Clan of the Cave Bear model, where Neanderthals have an elaborate sign language).
In 2001 it began to look like genetics might settle things. Scientists found that one British family that for three generations had shared a speech disorder called verbal dyspraxia also shared a mutation on a gene called FOXP2. This gene, it turned out, codes for a protein influencing how the brain processes speech and language. This does not mean that FOXP2 is “the language gene”: speech is a bewilderingly complex process involving countless genes working together in ways we cannot yet fathom. FOXP2 came to geneticists’ attention because sometimes it just needs one thing to go wrong for a whole system to crash. A mouse chews through a two-cent wire and my twenty-thousand-dollar car won’t start; FOXP2 malfunctions and the brain’s elaborate speech networks seize up. All the same, some archaeologists suggested, maybe random mutations producing FOXP2 and related genes gave modern humans linguistic skills that earlier species, including Neanderthals, lacked.
But then the plot thickened. As everyone now knows, deoxyribonucleic acid—DNA—is the basic building block of life, and in 2000 geneticists sequenced the modern human genome. What is less well known is that back in 1997, in a scene reminiscent of Jurassic Park, scientists in Leipzig, Germany, extracted ancient DNA from the arm of the original Neanderthal skeleton found in the Neander Valley in 1856. This was an extraordinary feat, since DNA begins breaking down immediately upon death, and only tiny fragments survive in such ancient material. The Leipzig team is not about to clone cavemen and open a Neanderthal Park, so far as I know, but in 2007 the process of sequencing a draft of the Neanderthal genome (which was completed in 2009) produced a remarkable discovery—that Neanderthals also had the FOXP2 gene.
Maybe this means that Neanderthals were as chatty as us; or maybe that FOXP2 was not the key to speech. One day we will surely know, but for now all we can do is observe the consequences of Neanderthals’ interactions. They lived in bigger groups than earlier types of ape-men, hunted more effectively, occupied territories for longer periods, and cared about one another in ways earlier ape-men could not.
They also deliberately buried some of their dead, and perhaps even performed rituals over them—the earliest signs of that most human quality of all, a spiritual life, if we are interpreting the evidence correctly. At Shanidar, for instance, several bodies had definitely been buried, and the soil in one grave contained high concentrations of pollen, which might mean that some Neanderthals laid a loved one’s body on a bed of spring flowers. (Rather less romantically, some archaeologists point out that the grave was honeycombed with rat burrows, and that rats often carry flowers into their lairs.)
In a second case, at Monte Circeo near Rome, construction workers in 1939 exposed a cave that had been sealed by a rockfall fifty thousand years ago. They told archaeologists that a Neanderthal skull sat on the floor in the middle of a circle of rocks, but because the workers moved the skull before experts saw it, many archaeologists harbor doubts.
Finally, there is Teshik-Tash in Uzbekistan. Here the Russian archaeologist Alexei Okladnikov found the skeleton of a boy encircled, he said, by five or six pairs of wild goat horns. However, the deposits at Teshik-Tash are full of goat horns, and Okladnikov never published plans or photographs of the finds to convince skeptics that these particular ones were in a meaningful pattern.
We need clearer evidence to lay this question to rest. Personally, I suspect that there is no smoke without fire, and that Neanderthals did have some kind of spiritual life. Perhaps they even had medicine women and shamans like Iza and Creb in The Clan of the Cave Bear. Whether that is right or not, though, if the time machine I invoked earlier could transport you to Shanidar as well as to Zhoukoudian, you would see real behavioral differences between Eastern Peking Man and Western Neanderthals. You would also be hard-pressed to avoid concluding that the West was more developed than the East. This may already have been true 1.6 million years ago, when the Movius Line took shape, but it was definitely true a hundred thousand years ago. Again the specter of a racist long-term lock-in theory rears its head: Does the West rule today because modern Europeans are the heirs of genetically superior Neanderthal stock, while Asians descend from the more primitive Homo erectus?
Historians like giving long, complicated answers to simple questions, but this time things really do seem to be straightforward. Europeans do not descend from superior Neanderthals, and Asians do not descend from inferior Homo erectus. Starting around seventy thousand years ago, a new species of Homo—us—drifted out of Africa and completely replaced all other forms. Our kind, Homo sapiens (“Wise Man”), did interbreed with Neanderthals in the process. Modern Eurasians share 1 to 4 percent of their genes with the Neanderthals, but everywhere from France to China it is the same 1 to 4 percent. The spread of modern humans wiped the slate clean. Evolution of course continues, and local variations in skin color, face shape, height, lactose tolerance, and countless other things have appeared in the two thousand generations since we began spreading across the globe. But when we get right down to it, these are trivial. Wherever you go, whatever you do, people (in large groups) are all much the same.
The evolution of our species and its conquest of the planet established the biological unity of mankind and thereby the baseline for any explanation of why the West rules. Humanity’s biological unity rules out race-based theories. Yet despite the overwhelming importance of these processes, much about the origins of modern humans remains obscure. By the 1980s archaeologists knew that skeletons more or less like ours first appeared around 150,000 years ago on sites in eastern and southern Africa. The new species had flatter faces, more retracted under their foreheads, than earlier ape-men. They used their teeth less as tools, had longer and less muscular limbs, and had wider neural canals and larynxes positioned better for speaking. Their brain cavities were a little smaller than Neanderthals’ but their skullcaps were higher and more domed, leaving room for bigger speech and language centers and stacked layers of neurons that could perform massive numbers of calculations in parallel.
The skeletons suggested that the earliest Homo sapiens could walk the walk just like us, but—oddly—the archaeology suggested that for a hundred thousand years they stubbornly refused to talk the talk. Homo sapiens tools and behavior looked much like those of earlier ape-men, and—again like other ape-men, but utterly unlike us—early Homo sapiens seemed to have had just one way of doing things. Regardless of where archaeologists dug in Africa, they kept coming up with the same, not particularly exciting, kinds of finds. Unless, that is, they excavated Homo sapiens sites less than fifty thousand years old. On these younger sites Homo sapiens started doing all kinds of interesting things, and doing them in lots of different ways. For instance, archaeologists identify no fewer than six distinct styles of stone tools in use in Egypt’s Nile Valley between 50,000 and 25,000 BCE, whereas before then a single fashion prevailed from South Africa to the shores of the Mediterranean.
Humans had invented style. Chipping stone tools this way, rather than that way, now marked a group off as different from their neighbors; chipping them a third way marked a new generation as different from their elders. Change remained glacial by the standards we are used to, when pulling out a four-year-old cell phone that can’t make movies, locate me on a map, or check e-mail makes me look like a fossil, but it was meteoric compared to all that had gone before.
As any teenager coming home with hair dyed green or a new piercing will tell you, the best way to express yourself is to decorate yourself, but until fifty thousand years ago, it seemed that almost no one had felt this way. Then, apparently, almost everyone did. At site after site across Africa after 50,000 BCE archaeologists find ornaments of bone, animal tooth, and ivory; and these are just the activities that leave remains for us to excavate. Most likely all those other forms of personal adornment we know so well—hairstyles, makeup, tattoos, clothes—appeared around the same time. A rather unpleasant genetic study has suggested that human body lice, which drink our blood and live in our clothes, evolved around fifty thousand years ago as a little bonus for the first fashionistas.
“What a piece of work is a man!” gasps Hamlet when his friends Rosencrantz and Guildenstern come to spy on him. “How noble in reason! how infinite in faculty! in form and moving how express and admirable! in action how like an angel! in apprehension how like a god!” And in all these ways, how unlike an ape-man. By 50,000 BCE modern humans were thinking and acting on a whole different plane from their ancestors. Something extraordinary seemed to have happened—something so profound, so magical, that in the 1990s it moved normally sober scientists to flights of rhetoric. Some spoke of a Great Leap Forward; others of the Dawn of Human Culture or even the Big Bang of Human Consciousness.
But for all their drama, these Great Leap Forward theories were always a little unsatisfactory. They required us to imagine not one but two transformations, the first (around 150,000 years ago) producing modern human bodies but not modern human behavior, and the second (around 50,000 years ago) producing modern human behavior but leaving our bodies unchanged. The most popular explanation was that the second transformation—the Great Leap—began with purely neurological changes that rewired the brain to make modern kinds of speech possible, which in turn drove a revolution in behavior; but just what this rewiring consisted of (and why there were no related changes to skulls) remained a mystery.
If there is anywhere that evolutionary science has left room for supernatural intervention, some superior power breathing a spark of divinity into the dull clay of ape-men, surely it is here. When I was (a lot) younger I particularly liked the story that opens Arthur C. Clarke’s science-fiction novel 2001: A Space Odyssey (and Stanley Kubrick’s memorable, if hard to follow, movie version). Mysterious crystal monoliths drop from outer space to Earth, come to upgrade our planet’s ape-men before they starve into extinction. Night after night Moon-Watcher, the alpha ape-man in one band of earthlings, feels what Clarke calls “inquisitive tendrils creeping down the unused byways of his brain” as a monolith sends him visions and teaches him to throw rocks. “The very atoms of his simple brain were being twisted into new patterns,” says Clarke. And then the monolith’s mission is done: Moon-Watcher picks up a discarded bone and brains a piglet with it. Depressingly, Clarke’s vision of the Big Bang of Human Consciousness consists entirely of killing things, culminating in Moon-Watcher murdering One-Ear, the top ape-man in a rival band. Next thing the reader knows, we are in the space age.
Clarke set his 2001 moment 3 million years ago, presumably to account for the invention of tools by Homo habilis, but I always felt that the place where a good monolith would really do some work was when fully modern humans appeared. By the time I started studying archaeology in college I had learned not to say things like that, but I couldn’t shake the feeling that the professionals’ explanations were less compelling than Clarke’s.
The big problem archaeologists had in those far-off days when I was an undergraduate was that they simply had not excavated very many sites dating between 200,000 and 50,000 years ago. As new finds accumulated across the 1990s, though, it began to become clear that we did not need monoliths after all; in fact, the Great Leap Forward itself began to dissolve into a series of Baby Steps Forward, spread across tens of thousands of years.
We now know of several pre-50,000-BCE sites with signs of surprisingly modern-looking behavior. Take, for instance, Pinnacle Point, a cave excavated in 2007 on the South African coast. Homo sapiens moved in here about 160,000 years ago. This is interesting in itself: earlier ape-men generally ignored coastal sites, probably because they could not work out how to find much food there. Yet Homo sapiens not only headed for the beach—distinctly modern behavior—but when they got there they were smart enough to gather, open, and cook shellfish. They also chipped stones into the small, light points that archaeologists call bladelets, perfect as tips for javelins or arrows—something that neither Peking Man nor Europe’s Neanderthals ever did.
On a handful of other African sites people engaged in different but equally modern-looking activity. About a hundred thousand years ago at Mumbwa Cave in Zambia people lined a group of hearths with stone slabs to make a cozy nook where it is easy to imagine them sitting around telling stories, and at dozens of sites around Africa’s coasts, from its southern tip to Morocco and Algeria in the north (and even just outside Africa, in Israel), people were sitting down and patiently cutting and grinding ostrich eggshells into beads, some of them just a quarter of an inch across. By ninety thousand years ago people at Katanda in the Congo had turned into proper fishermen, carving harpoons out of bone. The most interesting site of all, though, is Blombos Cave on Africa’s southern coast, where in addition to shell beads, excavators found a 77,000-year-old stick of ocher (a type of iron ore). Ocher can be used for sticking things together, waterproofing hides, and all kinds of other tasks; but in recent times it has been particularly popular for drawing, producing satisfyingly bold red lines on tree bark, cave walls, and people’s bodies. Fifty-seven pieces of ocher turned up at Pinnacle Point, and by 100,000 BCE it shows up on most African sites, which probably means that early humans liked drawing. The truly remarkable thing about the Blombos ocher stick, though, is that someone had scratched a geometric pattern on it, making it the world’s oldest indisputable work of art—and one made for producing more works of art.
At each of these sites we find traces of one or two kinds of modern behavior, but never of the whole suite of activities that becomes familiar after 50,000 BCE. Nor is there much sign yet that the modern-looking activities were cumulative, building up gradually until they took over. But archaeologists are already beginning to feel their way toward an explanation for the apparent baby steps toward fully modern humanity, driven largely by climate change.
Geologists realized back in the 1830s that the miles-long, curving lines of rubble found in parts of Europe and North America must have been created by ice sheets pushing debris before them (not, as had previously been thought, by the biblical flood). The concept of an “ice age” was born, although another fifty years passed before scientists understood exactly why ice ages happen.
Earth’s orbit around the sun is not perfectly round, because the gravity of other planets also pulls on us. Over the course of a hundred thousand years our orbit goes from being almost circular (as it is now) to being much more elliptical, then back again. Earth’s tilt on its axis also shifts, on a 41,000-year rhythm, as does the way the planet wobbles around this axis, this time on a 22,000-year scale. Scientists call these Milankovich cycles, after a Serbian mathematician who worked them out, longhand, while interned during World War I (this was a very gentlemanly internment, leaving Milankovich free to spend all day in the library of the Hungarian Academy of Sciences). The patterns combine and recombine in bewilderingly complex ways, but on a roughly hundred-thousand-year schedule they take us from receiving slightly more solar radiation than the average, distributed slightly unevenly across the year, to receiving slightly less sunlight, distributed slightly more evenly.
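The way these cycles combine and recombine can be sketched numerically. The toy script below is not a climate model—the equal weights and pure cosines are my own simplifying assumptions—but it sums one cosine for each of the three periods mentioned in the text, showing how signals that each repeat exactly on their own add up to a pattern that never quite repeats:

```python
import math

# Periods in years for the three cycles named in the text:
# orbit shape, axial tilt, and axial wobble.
PERIODS = (100_000, 41_000, 22_000)

def forcing(year):
    """Sum one unit cosine per cycle; the total peaks only when all three align."""
    return sum(math.cos(2 * math.pi * year / p) for p in PERIODS)

print(round(forcing(0), 3))        # all three aligned: the maximum, 3.0
print(round(forcing(100_000), 3))  # one orbit-shape cycle later: far from 3.0
```

Because the three periods share no convenient common multiple, the combined signal drifts in and out of alignment on a roughly hundred-thousand-year schedule rather than repeating cleanly.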
None of this would matter much except for the way Milankovich cycles interact with two geological trends. First, over the last 50 million years continental drift has pushed most land north of the equator, and having one hemisphere mostly land and the other mostly water amplifies the effects of seasonal variations in solar radiation. Second, volcanic activity has declined across the same period. There is (for the time being) less carbon dioxide in our atmosphere than there was in the age of the dinosaurs, and because of this the planet has—over the very long run and until very recently—steadily cooled.
Through most of Earth’s history the winters were cold enough that it snowed at the poles and this snow froze, but normally the sun melted this ice every summer. By 14 million years ago, however, declining volcanic activity had cooled Earth so much that at the South Pole, where there is a large landmass, the summer sun no longer melted the ice. At the North Pole, where there is no landmass, ice melts more easily, but by 2.75 million years ago temperatures had dropped enough for ice to survive year-round there, too. This had huge consequences, because now whenever Milankovich cycles gave Earth less solar radiation, distributed more evenly across the year, the North Pole ice cap would expand onto northern Europe, Asia, and America, locking up more water, making the earth drier and the sea level lower, reflecting back more solar radiation, and reducing temperatures further still. Earth then spiraled down into an ice age—until the planet wobbled, tilted, and rotated its way back to a warmer place, and the ice retreated.
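The downward spiral described above is a classic positive feedback, and its logic can be caricatured in a few lines of code. Every number here is invented purely for illustration; only the shape of the loop—ice reflects sunlight, and a colder planet grows more ice—comes from the text:

```python
def run(years, ice, sunlight):
    """Toy ice-albedo feedback: ice fraction runs from 0 (ice-free) to 1 (frozen)."""
    for _ in range(years):
        cooling = 0.5 * ice - sunlight          # reflected sunlight vs. incoming sun
        ice = min(1.0, max(0.0, ice + 0.1 * cooling))
    return ice

print(run(50, ice=0.1, sunlight=0.02))  # weak sun: ice spirals upward
print(run(50, ice=0.1, sunlight=0.20))  # stronger sun: the spiral reverses
```

With solar input low, each increment of ice causes further cooling and still more ice; raise the input past the break-even point and the very same loop melts the ice away again, as when the planet wobbles, tilts, and rotates its way back to a warmer place.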
Depending on how you count, there have been between forty and fifty ice ages, and the two that spanned the period from 190,000 through 90,000 BCE—crucial millennia in human evolution—were particularly harsh. Lake Malawi, for instance, contained just one-twentieth as much water in 135,000 BCE as it does today. The tougher environment must have changed the rules for staying alive, which may explain why mutations favoring braininess began flourishing. It may also explain why we have found so few sites from this period; most protohumans probably died out. Some archaeologists and geneticists in fact estimate that around 100,000 BCE there were barely twenty thousand Homo sapiens left alive.
If this new theory is correct, the population crisis would have done several things at once. On the one hand, by shrinking the gene pool it would have made it easier for mutations to flourish; but on the other, if Homo sapiens bands became smaller they would die out more easily, taking any advantageous mutations with them. If (as seems likely from the tiny number of sites known from this period) there were also fewer bands, groups would meet less often and have less chance to pool their genes and knowledge. We should probably imagine that for a hundred thousand years tiny bands of protohumans eked out livings in Africa in unfriendly and unpredictable environments. They did not meet, interbreed, or exchange goods and information very often. Genetic mutations flourished in these isolated pockets of people, some producing humans very like us, some not. Some groups figured out harpoons, many made beads, but most did neither, and the specter of extinction haunted them all.
These were dark days for Homo sapiens, but around seventy thousand years ago their luck changed. Eastern and southern Africa became warmer and wetter, which made hunting and gathering easier, and humans reproduced as rapidly as their food sources. Modern Homo sapiens had been evolving for a good hundred thousand years, with a lot of trial, error, and extinctions, but when the climate improved, those populations with the most advantageous mutations took off, outbreeding less brainy humans. There were no monoliths; no Great Leap Forward; just a lot of sex and babies.
Within a few thousand years early humans reached a tipping point that was as much demographic as biological. Instead of dying out so often, bands of modern humans grew big enough and numerous enough to stay in regular contact, pooling their genes and know-how. Change became cumulative and the behavior of Homo sapiens diverged rapidly from that of other ape-men. And once that happened, the days of biological distinctions between East and West were numbered.
Out of Africa—Again
Climate change is rarely simple, and while Homo sapiens’ homelands in eastern and southern Africa were getting wetter seventy thousand years ago, North Africa was drying out. Our ancestors, multiplying rapidly in their home ranges, chose not to spread in that direction; instead, little bands wandered from what is now Somalia across a land bridge to southern Arabia, and then to Iran (Figure 1.3). At least, this is what we think they must have done. There has been relatively little archaeological exploration in South Asia, but we have to assume bands of modern humans moved this way, because by 60,000 BCE they had reached Indonesia, taken to boats, crossed fifty miles of open water, and wandered as far as Lake Mungo in southern Australia. The colonists moved fifty times faster than Homo erectus/ergaster had done when they left Africa, averaging more than a mile a year compared to the earlier ape-men’s thirty-five yards.
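The "fifty times faster" figure is simply the ratio of the two paces, and it can be checked with the text's own round numbers (a bit over a mile a year against thirty-five yards a year):

```python
YARDS_PER_MILE = 1760

# Paces from the text: modern-human colonists vs. the earlier ape-men.
sapiens_pace = YARDS_PER_MILE   # roughly a mile a year, in yards
erectus_pace = 35               # thirty-five yards a year

print(round(sapiens_pace / erectus_pace))  # 50
```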
Between fifty thousand and forty thousand years ago a second wave of migrants probably moved through Egypt into southwest and central Asia, spreading from there into Europe. Clever enough to make themselves delicate blades and bone needles, these modern humans cut and sewed fitted clothing and built houses out of mammoth tusks and skins, turning even the frigid wastes of Siberia into a home. Around 15,000 BCE humans crossed the land bridge linking Siberia and Alaska and/or sailed in short hops along its edge. By 12,000 BCE they had left coprolites (scientist-speak for dung) in caves in Oregon and seaweed in the mountains of Chile. (Some archaeologists think humans also crossed the Atlantic along the edge of ice sheets then linking Europe and America, though as yet this remains speculative.)
The situation in East Asia is less clear. A fully modern human skull from Liujiang in China may be 68,000 years old, but there are some technical problems with this date, and the oldest uncontroversial remains date back only to around 40,000 BCE. More digging will settle whether modern humans reached China relatively early or relatively late, but they certainly reached Japan by twenty thousand years ago.
Wherever the new humans went, they seem to have wrought havoc. The continents where earlier ape-men had never set foot were teeming with giant game when Homo sapiens arrived. The first humans to enter New Guinea and Australia encountered four-hundred-pound flightless birds and one-ton lizards; by 35,000 BCE these were extinct. The finds from Lake Mungo and a few other sites suggest that humans arrived around 60,000 BCE, meaning that humans and megafauna coexisted for twenty-five millennia, but some archaeologists dispute the dates, putting humanity’s arrival just forty thousand years ago. If they are right, the great beasts disappeared suspiciously quickly after humans arrived. In the Americas, the first human colonists fifteen thousand years ago met camels, elephants, and huge ground sloths; within four thousand years these, too, were all extinct. The coincidence between the coming of Homo sapiens and the going of the giant animals is, to say the least, striking.
There is no direct evidence that humans hunted these animals to extinction or drove them off their ranges, and alternative explanations for the extinctions (like climate change or comet explosions) abound. But there is less debate over the fact that when modern humans entered environments already occupied by ape-men, the ape-men became extinct. Modern humans had entered Europe by 35,000 BCE, and within ten thousand years Neanderthals had vanished everywhere except the continent’s mountainous fringes. The latest Neanderthal deposits known to us, from Gibraltar in southern Spain, date to around 25,000 BCE. After dominating Europe for 150,000 years, the Neanderthals simply disappeared.
The details of how modern humans replaced ape-men, though, are crucial for deciding whether racial explanations for Western rule make sense. We do not know, yet, whether our ancestors actively killed less intellectually gifted species or just outcompeted them for food. At most sites, modern human deposits simply replace those associated with Neanderthals, suggesting that the change was sudden. The main exception is Reindeer Cave in France, where phases of Neanderthal and modern human occupation apparently alternated between 33,000 and 35,000 years ago, and the Neanderthal layers contain stone foundations for huts, bone tools, and necklaces of animal teeth. The excavators suggested that Neanderthals learned from modern humans and were moving toward a Dawn of Neanderthal Consciousness. Several finds of ocher on Neanderthal sites in France (twenty pounds of it in one cave) may point the same way.
It is easy to imagine heavily muscled, low-browed Neanderthals watching the quicker, talkative newcomers painting their bodies and building huts, then struggling to repeat these actions with their clumsy fingers, or perhaps trading freshly killed meat for jewelry. In The Clan of the Cave Bear, Jean Auel imagined modern humans contemptuously chasing off Neanderthal “Flatheads,” while Neanderthals just tried to stay out of the way of “the Others”—except, that is, for Ayla, an orphaned five-year-old human girl whom the Neanderthal Cave Bear clan adopt, with transformative results. It is all fantasy, of course, but it is as plausible as anyone else’s guess (unless we follow those unromantic archaeologists who point out that sloppy excavation is the most economical explanation for the interleaved Neanderthal and human deposits at Reindeer Cave, meaning that there is no direct evidence for Flatheads learning from Others).
The bottom line is sex. If modern humans replaced Neanderthals in the Western Old World and Homo erectus in the Eastern regions without interbreeding, racist theories tracing contemporary Western rule back to prehistoric biological differences must be wrong. But was that what happened?
In the heyday of so-called scientific racism in the 1930s, some physical anthropologists insisted that modern Chinese people were more primitive than Europeans because their skulls had similarities (small ridges on top, relatively flat upper faces, nonprotruding jaws, shovel-shaped incisors) to those of Peking Man. So, too, these anthropologists pointed out, the skulls of Australia’s indigenous peoples had similarities—ridges around the back for attaching neck muscles, shelflike brows, receding foreheads, large teeth—with those of Indonesian Homo erectus a million years ago. Modern Easterners, these (Western) scholars concluded, must have descended from these more primitive ape-men, while Westerners descended from the more advanced Neanderthals; and that might well explain why the West rules.
No one puts things so crudely today, but if we are serious about asking why the West rules we have to confront the possibility that Homo sapiens interbred with premodern peoples, and that Eastern populations remain biologically less advanced than Western. We will never be able to excavate copulating cavemen to see whether Homo sapiens merged their genes with Neanderthals in the West and with Peking Man in the East, but fortunately we do not need to, because we can observe the consequences of their trysts in our own bodies.
Each of us has inherited our DNA from all the ancestors we ever had, which means that in theory geneticists could compare the DNA of everyone alive and draw a family tree going back to humanity’s most recent shared ancestor. In practice, though, the fact that half the DNA in your body comes from your mother’s line and half from your father’s makes disentangling the information as difficult as unscrambling an egg.
Geneticists found a clever way around this problem by focusing on mitochondrial DNA. Rather than being reproduced sexually, like most DNA, mitochondrial DNA is transmitted solely by women (men inherit mitochondrial DNA from their mothers but do not pass it on). Once upon a time we all had the same mitochondrial DNA, so any difference between the mitochondrial DNA in my body and that in yours must be the result of random mutations, not sexual mixing.
In 1987 a team led by the geneticist Rebecca Cann published a study of mitochondrial DNA in living people from all over the world. They distinguished about 150 types within their data and realized that no matter how they shuffled the statistics, they kept getting three key results: first, that there is more genetic diversity in Africa than anywhere else; second, that the diversity in the rest of the world is just a subset of the diversity within Africa; and third, that the deepest—and therefore oldest—mitochondrial DNA lineages all come from Africa. The conclusion was unavoidable: the last female ancestor shared by everyone in the world must have lived in Africa—African Eve, as she was immediately dubbed. As Cann and her colleagues observed, she was “one lucky mother.” Using standard estimates of mutation rates in mitochondrial DNA, they concluded that Eve lived 200,000 years ago.
Throughout the 1990s paleoanthropologists argued over the Cann team’s conclusions. Some questioned their methods (there are thousands of ways to arrange the scores, in theory all equally valid) and others their evidence (most of the “Africans” in the original study were actually African-Americans), but no matter who redid the samples or the numbers, the results came out much the same. The only real change was to push Eve’s lifetime closer to 150,000 years ago. To clinch matters, African Eve got company at the end of the 1990s when technical advances allowed geneticists to examine nuclear DNA on the Y chromosome. Like mitochondrial DNA, this is reproduced asexually, but is transmitted only through the male line. The studies found that Y-chromosome DNA also has the greatest variety and deepest lineages in Africa, pointing to an African Adam living between sixty thousand and ninety thousand years ago, and an origin for non-African variants around fifty thousand years ago. In 2010, geneticists added one more detail: immediately after they left Africa, Homo sapiens copulated enough with Neanderthals to pick up a trace of their DNA, and they then spread this mix across the rest of the planet.
But some paleoanthropologists remain unconvinced, insisting that genetics counts for less than the skeletal similarities they see between Western Homo sapiens and Neanderthals and between Eastern Homo sapiens and Homo erectus. In place of the out-of-Africa model they propose a “multiregional” model. Maybe, they concede, the initial Baby Steps Forward did happen in Africa, but population movements between Africa, Europe, and Asia then promoted such rapid gene flows that beneficial mutations in one place spread everywhere within a few thousand years. As a result, slightly different kinds of modern humans evolved in parallel in several parts of the world. That would explain both the skeletal and the genetic evidence, and would also mean that Easterners and Westerners really are biologically different.
Like so many theories, multiregionalism can cut two ways, and some Chinese scientists have insisted that China is exceptional because—as the China Daily newspaper puts it—“modern Chinese man originated in what is present-day Chinese territory rather than Africa.” Since the late 1990s, though, the evidence has tipped steadily against this idea. There has been relatively little analysis of ancient DNA in East Asia, and still less that offers cheer to the multiregionalists. The authors of one Y-chromosome study even conclude that “the data do not support even a minimal in situ hominid contribution to the origin of anatomically modern humans in East Asia.” In Europe, initial studies of Neanderthal mitochondrial DNA found zero overlap with human mitochondrial DNA (whether found in 24,000-year-old skeletons or in living, breathing Europeans), suggesting that Neanderthals and Homo sapiens did not—perhaps could not—interbreed at all. The unraveling of the full Neanderthal genome has now shown that this went too far, and that Neanderthals did once inspire enough passion among Homo sapiens to make a small mark on our DNA; but it also showed that that mark is exactly the same all the way from France to China. Everywhere in Eurasia, people (in large groups) are all much the same.
The debate over multiregional origins drags on, and as recently as 2007 new finds from Zhoukoudian and from Xuchang were being trumpeted as showing that modern humans must have evolved from Homo erectus in China. Even as the publication announcing these finds was being printed, however, other scholars drove what looks to be the final nail into the multiregionalist coffin. Their sophisticated multiple-regression analysis of measurements from more than six thousand skulls showed that when we control for climate, the variations in skull types around the world are in fact consistent with the DNA evidence. Our dispersals out of Africa in the last sixty thousand years wiped the slate clean of all the genetic differences that had emerged over the previous half million years.
Racist theories grounding Western rule in biology have no basis in fact. People, in large groups, are much the same wherever we find them, and we have all inherited the same restless, inventive minds from our African ancestors. Biology by itself cannot explain why the West rules.
So if the racial theories are wrong, where did East and West begin? The answer has seemed obvious to many Europeans for more than a hundred years: even if biology does not enter into it, they have confidently asserted, Europeans have just been culturally superior to Easterners ever since there were such things as modern humans. The evidence that convinced them began to appear in 1879. Charles Darwin’s On the Origin of Species, published two decades earlier, had made fossil-hunting a respectable hobby for gentlemen, and like so many of his class, Don Marcelino Sanz de Sautuola took to looking for cavemen on his estates in northern Spain. One day, with his daughter in tow, he visited the cave of Altamira. Archaeology is not much fun for eight-year-olds, so while Sautuola fixed his eyes on the ground, little Maria ran around playing games. “Suddenly,” she told an interviewer many years later, “I made out forms and figures on the roof.” She gasped: “Look, Papa, bulls!”
All archaeologists dream of an “Oh my God” moment—the instant of absolute disbelief, when time stands still and everything falls away in the face of the unbelievable, awe-inspiring discovery. Not many archaeologists actually have one, and maybe no archaeologist ever had one quite like this. Sautuola saw bison, deer, layer upon layer of multicolored animals covering twenty feet of the cave’s ceiling, some curled up, some cavorting, some leaping gaily (Figure 1.4). Each was beautifully, movingly rendered. When Picasso visited the site years later, he was stunned. “None of us could paint like that,” he said. “After Altamira, all is decadence.”
Sautuola’s first reaction was to laugh, but quickly he became “so enthusiastic,” Maria recalled, “that he could hardly speak.” He gradually convinced himself that the paintings really were ancient (the latest studies suggest some are more than 25,000 years old). Back in 1879, though, no one knew this. In fact, when Sautuola presented the site at the International Congress of Anthropology and Prehistoric Archaeology in Lisbon in 1880, the professionals laughed him off the stage. Everyone knew that cavemen could not produce such art; Sautuola, they agreed, was either a liar or a sucker. Sautuola took this—rightly—as an attack on his honor. He died a broken man eight years later. His “Oh my God” moment ruined his life.
Not until 1902 did Sautuola’s main critic actually visit Altamira and publicly recant, and since then several hundred prehistoric painted caves have been found. Chauvet Cave in France, one of the most spectacular of all, was discovered as recently as 1994, so well preserved that it looked like the artists had just stepped out for a quick bite of reindeer and would be back at any moment. One of the paintings at Chauvet is thirty thousand years old, making it one of the earliest traces of modern humans in western Europe.
Nothing quite like these cave paintings has been found anywhere else in the world. The modern human migration out of Africa had swept away all distinctions created by the Movius Line and all biological divergences between earlier species of ape-men; but should we locate the true beginning of a special (and superior) Western tradition thirty thousand years ago in a uniquely creative culture that filled northern Spain and southern France with prehistoric Picassos?
The answer, perhaps surprisingly, lies in the frozen wastes of Antarctica. Every year snow falls there, burying previous snows, and compressing them into thin layers of ice. These layers are like a chronicle of ancient weather. By separating them, climatologists can measure their thickness, telling us how much snow fell; establish the balance between isotopes of oxygen, revealing temperatures; and compare the amounts of carbon dioxide and methane, illuminating greenhouse effects. But drilling cores through the ice sheets is one of the toughest assignments in science. In 2004 a European team finished extracting an Antarctic core almost two miles deep, going back an astonishing 740,000 years, to the days when Neanderthals were still a twinkle in some ape-man’s eye. The scientists did this despite temperatures that plunged to -58°F in winter and never got above -13°F, being forced to start over when the drill jammed in 1999, and having to use a plastic bag filled with ethanol as a makeshift drill bit for the final hundred yards.
The results these supermen and -women of science extracted from the ice make one thing very clear: the world the Altamira artists lived in was cold. Temperatures had started tumbling again after modern humans left Africa, and around twenty thousand years ago—when more artists were daubing ocher and charcoal on cave walls than ever before or since—the last ice age reached its chilling climax. Average temperatures stood 14°F below those of recent times. That made a staggering difference. Mile-thick glaciers covered northern Asia, Europe, and America, locking up so much water that the sea level was more than three hundred feet lower than today. You could have walked from Africa to England, Australia, or America without ever laying eyes on the sea. Not that you would have wanted to visit many of these places; at the edges of the glaciers winds howled and dust storms raged across vast arid steppes, frigid in winter and barren in summer. Even in the least forbidding regions, within 40 degrees of the equator, short summers, meager rainfall, and reduced levels of carbon dioxide in the air limited plant growth and kept animal (including human) populations low. Things were as bad as in the worst days before modern humans left Africa.
Life was easier in what are now the tropics than it was in Siberia, but wherever archaeologists look, they find that people adapted to the Ice Age in rather similar ways. They lived in tiny bands. In colder environments, a dozen people was a big group; in the milder regions, twice that many might stick together. They learned when different plants ripened and where to find them; when animals migrated ahead of the seasons and where they could intercept them; and they followed both around the landscape. Those who did not learn these things starved.
Such tiny bands would have struggled to reproduce themselves. Like modern hunter-gatherers in marginal environments, they must have come together from time to time to exchange marriage partners, trade goods, tell stories, and perhaps speak to their gods, spirits, and ancestors. These gatherings would have been the most exciting social events on the calendar. We are guessing, of course, but many archaeologists think these festival days lie behind western Europe’s spectacular cave paintings: everyone put on their best skins and beads, painted their faces, and did what they could to decorate their holy meeting places, making them truly special.
The obvious question, though, is why—if these hard facts of life applied all across Africa, Asia, and Europe—we find such spectacular cave paintings only in western Europe. The traditional answer, that Europeans were more culturally creative than anyone else, seems to make a lot of sense, but we might do better to turn the question around. The history of European art is not a continuous catalogue of masterpieces running from Chauvet to Chagall; the cave paintings died out after 11,500 BCE and many millennia passed before we know of anything to equal them.
Looking for the roots of Western rule in a thirty-thousand-year tradition of European creativity is obviously mistaken if this tradition in fact dried up for thousands of years. Perhaps we should ask instead why the cave paintings ended, because once we do so it starts to look like the astonishing finds from prehistoric Europe have as much to do with geography and climate as with any special Western culture.
Through most of the Ice Age, northern Spain and southern France were excellent hunting grounds, where herds of reindeer migrated from summer to winter pastures and back again. But when temperatures started rising about fifteen thousand years ago (more on this in Chapter 2) the reindeer stopped migrating this far south in winter, and the hunters followed them northward.
It cannot be a coincidence that western European cave painting declined at just the same time. Fewer and fewer artists crawled under the ground with their animal-fat lamps and sticks of ocher. Sometime around 13,500 years ago the very last artist walked away. He or she probably did not realize it, but on that day the ancient tradition died. Darkness fell in the caves, and for millennia only bats and dripping water disturbed their tomblike silence.
Why did beautiful cave paintings not move steadily northward across Europe after 11,500 BCE as hunters followed the retreating reindeer? Probably for the very good reason that northern European hunters did not have such convenient caves to paint. Northern Spain and southern France have a tremendous number of deep limestone caves; northern Europe has far fewer. The efforts prehistoric peoples made to decorate their meeting places rarely survived for us to find unless hunting grounds coincided with deep caves. Whenever this happy coincidence failed to arise, people must have gathered nearer to or even above the surface. Exposed to wind, sun, and rain for twenty thousand years, few traces of their artwork survive.
“Few traces” is not the same as “no traces,” though, and sometimes we get lucky. At the wonderfully named Apollo 11 Cave in Namibia, slabs of stone with drawings of rhinos and zebras peeled off the wall, fell to the floor, and were preserved under deposits that formed between 19,000 and 26,000 years ago, and some Australian examples are even older. At Sandy Creek, mineral deposits that built up over part of a carving on a cave wall can be dated to about 25,000 years ago and fragments of pigment are 26,000 to 32,000 years old, while at Carpenter’s Gap part of a painted cave wall fell into 40,000-year-old occupation debris, making it even earlier than Chauvet.
None of the African or Australian examples compares aesthetically with the best French and Spanish work, and there are quite a few deep caves outside western Europe that do not have paintings (like Zhoukoudian, reoccupied twenty thousand years ago). It would be silly to claim that all humans put equal effort into cave art, let alone that all artistic traditions are equally successful. But given the preservation issues and the fact that archaeologists have been looking longer and harder in Europe than anywhere else, the survival of anything at all on other continents suggests that all modern humans, everywhere, shared the urge to create art. Where the conditions for cave painting were not so good as in western Europe, people may have put their energy into other media.
Figure 1.5 shows nicely that while cave art clusters in western Europe, stone, clay, and bone models of humans and animals are more common farther east. If the economics of publishing allowed it, I could show pictures of dozens of quite extraordinary figurines, found everywhere from Germany to Siberia. Since it does not, I will limit myself to the most recent discovery, found in 2008 at Hohle Fels in Germany (Figure 1.6)—a two-inch-tall statuette of a woman with no head but with gigantic breasts, carved 35,000 years ago from mammoth ivory. Around the same date hunters at Malaya Síya near Lake Baikal in Siberia—surely one of the most inhospitable spots on earth—took time to engrave pictures of animals on bones; and by 25,000 BCE groups up to 120 strong were gathering in huts of mammoth bone and skin at Dolní Vestonice in the Czech Republic, where they made thousands of clay figurines of animals and, again, large-breasted women. In East Asia the artistic record remains thin, but the earliest find—a tiny model bird carved perhaps fifteen thousand years ago from a deer antler, discovered at Xuchang in 2009—seems so sophisticated that we can be confident that future excavations will reveal a flourishing Ice Age artistic tradition in China, too.
Ice Age humans outside western Europe, lacking the conditions that made Chauvet and Altamira what they were, apparently found other outlets for their creativity. There is precious little evidence that earlier ape-men felt any creative urges at all, but imagination seems to be hardwired into Homo sapiens. By fifty thousand years ago humans had the mental faculties to seek meaning in the world and the skills to represent these meanings in art and (probably, though we cannot observe it) poetry, music, and dance. Once again, people (in large groups) all seem to be much the same, wherever we find them. For all its splendor, Altamira did not make the West different from the rest.
Technological, intellectual, and biological differences accumulated for more than a million and a half years after the first ape-men left Africa, dividing the Old World into a Neanderthal/Homo sapiens West and a Homo erectus East. Around a hundred thousand years ago the West was characterized by relatively advanced technology and even hints of humanity, while the East looked increasingly backward; but when fully modern humans moved out of Africa sixty thousand years ago they swept all this away. By the time the last ice age reached its climax twenty thousand years ago, “east” and “west” were just directions in which the sun rose and set. Far more united the little bands of humans scattered from Britain to Siberia—and (relatively) soon to cross over into America—than divided them. Each band foraged and hunted, roaming over huge areas as plants ripened and animals came and went. Each must have known its territory intimately and have told stories about every rock and tree; each had its own art and traditions, tools and weapons, spirits and demons. And each surely knew that their gods loved them, because they were, in spite of everything, still alive.
Humans had come as far as they were likely to in such a cold, dry world; and there, we must suspect, things would have stayed, had the earth not wobbled under their feet.
Excerpted from WHY THE WEST RULES—FOR NOW: THE PATTERNS OF HISTORY, AND WHAT THEY REVEAL ABOUT THE FUTURE by Ian Morris, published in October 2010 by Farrar, Straus and Giroux. Copyright © 2010 by Ian Morris. All rights reserved.
This program aired on November 29, 2010.