Suddenly Smarter

After living simply for hundreds of thousands of years, early humans burst forth with an explosion of innovation. What set off our cultural big bang? Richard Klein suggests a maverick explanation.

July/August 2002

A handful of beads marks a watershed in human history. Their maker shaped the crude, circular pieces from fragments of ostrich eggshell, thinning each one and drilling a hole through the center. Many of them broke before they were finished. An unknown Stone Age artisan spent hours crafting these decorations rather than searching for food, tending children or making tools.

The fragile beads hail from a Kenyan site called Enkapune Ya Muto, or Twilight Cave. Crafted around 40,000 years ago, they appear to be the earliest known jewelry. But some anthropologists think they are much more. The people of Twilight Cave may have exchanged them as ritual gifts or tokens—making the Cheerio-like objects the oldest known example of symbolism.

If the beads were among humanity’s first symbols, says anthropology professor Richard G. Klein, they represent one of the most important revolutions in our species’ career: the dawning of modern human behavior.

Forget about upheavals like the Russian and French Revolutions, which produced mere changes of costume. Forget about the construction of the first cities or the introduction of the internal combustion engine. The revolution that made the biggest difference occurred on the savanna of East Africa roughly 45,000 years ago, Klein and others maintain.

“Communicating with symbols provides an unambiguous sign of our modernity,” says Klein, an eminent archaeologist who has taught at Stanford for nine years. “Once symbols appear, we know we’re dealing with people like us: people with advanced cognitive skills who could not only invent sophisticated tools and weapons and develop complex social networks for mutual security, but could also marvel at the intricacies of nature and their place in it; people who were self-aware.”

Most—though not all—anthropologists agree that human culture, imagination and ingenuity suddenly flowered around 45,000 years ago. The evidence ranges from fantastic cave paintings and elaborate graves to the first fishing equipment and sturdy huts. And whether scientists call it the great leap forward, the dawn of culture or civilization’s big bang, they agree that the change was momentous, giving humans the cohesion and adaptability to expand their range into Europe, Asia, and eventually Australia and the Americas. “In its wake,” Klein says, “humanity was transformed from a relatively rare and insignificant large mammal to something like a geologic force.”

But what turned humans into a planetary power? Lacking evidence to answer that question, we can only guess at the cause or causes. Among the dozen or so experts focusing on “the leap,” most favor cultural, social or demographic explanations, Klein says. They speculate, for example, that humans suddenly crossed a threshold of creativity after a long, slow buildup in population, or that a radical population boom set off a maelstrom of competition between groups, inspiring rapid innovation.

A few researchers reject the whole notion of a sudden behavioral revolution, arguing instead for slow cultural evolution. Some of the so-called hallmarks of modernity, they say, showed up tens of thousands of years earlier. Noting that the brain reached its full size at least 130,000 years ago, these anthropologists think humans had all the intellect they needed from then on and that modern advances arose one by one over a vast period of time.

Klein suggests a third possibility—a strictly neurological scenario that has gained few followers in a field of study dominated by cultural explanations, he says. Humanity’s big bang, he speculates, was sparked not by an increase in brain size but by a sudden increase in brain quality. Klein thinks a fortuitous genetic mutation may have somehow reorganized the brain around 45,000 years ago, boosting the capacity to innovate. “It’s possible this change produced the modern ability for spoken language,” he says.

Clearly, speech eases communication. But it also fosters something less obvious and equally important. Spoken language, Klein says, “allows people to conceive and model complex natural and social circumstances entirely within their minds.”

In other words, it makes us human.

The question of when and how humans became modern has enthralled Klein, 61, from the beginning of his career. A Chicago native who did his undergraduate work at the University of Michigan and graduate work at the University of Chicago, he excavated Spanish sites during the regime of Francisco Franco and studied Neanderthal artifacts in the Soviet Union before moving on to South Africa during apartheid. (“You might think I’ve followed totalitarian regimes around the world,” he jokes.) He still digs in South Africa every summer, reconstructing ancient human ecology at a 300,000-year-old site. He has authored what is considered the definitive text on human evolution, The Human Career (University of Chicago Press, 1989, 2nd ed. 1999), as well as The Dawn of Human Culture (Wiley, 2002), a livelier account summarizing 150 years’ worth of archaeological finds and explaining the mutation proposal. “Richard Klein is exceptionally well informed about all aspects of archaeology and immensely concerned about the health of the discipline as a whole,” says Frank Brown, dean of the College of Mines at the University of Utah, who studies the geologic antiquity and environment of very early people in Africa.

In particular, Klein’s colleagues regard him as a master of “faunal analysis”—reading details of human behavior, population size and social organization from animal bones found at ancient campsites, caves and other locales, says anthropologist John Yellen, director of archaeological programs for the National Science Foundation.

People who work with him say he is witty, thoughtful and gentle. “One characteristic that sets him apart is his unpretentiousness,” says graduate student Tim Weaver, MA ’98. Brown adds: “Were he not so well-meaning and so intent on educating everyone about archaeology, his store of knowledge could be intimidating. Instead, one simply feels that they’ve met a master of a subject who emanates confidence and goodwill.”

In fact, the entire debate over “the leap” has been marked by goodwill and collegiality—a rarity in the field of human evolution. “Everybody in this dispute is generous and polite,” Yellen says. But they’re also adamant in their views.

Virtually no one who studies human origins disputes this: people who looked a lot like us were roaming Africa and the Middle East at least 115,000 years ago. Scientists usually lump them together as “early modern” humans, although a clearer description might be “physically modern.” Unlike their ancestors with jutting brows and sloping foreheads, these people had flatter faces and steeper foreheads like ours. Their brains were as big as ours and, like us, they stood tall on long limbs.

But anthropologists don’t all agree that these physically modern humans shared our mental equipment. Because brain size provides no guide to intelligence, Klein says, researchers must divine cognitive ability from tools, campsites, fragments of ancient garbage, and other remains. “The question is: when does ‘modern behavior’ appear in the archaeological record and what is the cause?” Yellen says. By modern behavior, scientists mean a long list of capabilities, including clever new tools made of stone and bone, and a wealth of ritual, symbolism and art (see chart).

To witness the contrast between premodern and modern ways of life, Klein says, sift through the remains from caves along the southern coast of South Africa. Simple Stone Age hunter-gatherers began camping here around 120,000 years ago and stayed on until around 60,000 years ago, when a punishing drought made the region uninhabitable. They developed a useful tool kit featuring carefully chipped knives, choppers, axes and other stone implements. Animal bones from the caves show that they hunted large mammals like eland, a horse-sized antelope. They built fires and buried their dead. These people, along with the Neanderthals then haunting the caves of Europe, were the most technologically adept beings of their time.

However, Klein says, there were just as many things they couldn’t manage, despite their modern-looking bodies and big brains. They didn’t build durable shelters. They almost never hunted dangerous but meaty prey like buffalo, preferring the more docile eland. Fishing was beyond their ken. They rarely crafted tools of bone, and they lacked cultural diversity. Perhaps most important, they left no indisputable signs of art or other symbolic thought.

Later inhabitants of the same caves, who moved in around 20,000 years ago, displayed all these talents and more.

What happened in between?

The burst of modern behavior—like other momentous happenings in our evolution—arose not in South Africa, Klein says, but in East Africa, which was wetter during the drought. Around 45,000 years ago, he believes, a group of simple people in East Africa began to behave in new ways and rapidly expanded in population and range. With better weapons, they broadened their diet to include more challenging and nutritious prey. With their new sense of aesthetic, they made the first clearly identifiable art. And they freed themselves to wander beyond the local watering hole—setting the stage for long-distance trade—with contrivances like canteens and the delicately crafted eggshell beads, which may have functioned as “hostess gifts” to cement goodwill with other clans.

Dramatic evidence of a surge in ingenuity and adaptability comes from a wave of human migration around 40,000 to 35,000 years ago. Fully modern Africans made their way into Europe, Klein says, where they encountered the Neanderthals, cave dwellers who had lived in and around Europe for more than 200,000 years. The lanky Africans, usually called Cro-Magnons once they reached Europe, were more vulnerable to cold than the husky Neanderthals. Yet they came, saw and conquered in short order, and the Neanderthals vanished forever.

Compare that with an earlier migration around 100,000 years ago, in which the Neanderthals eventually prevailed. Physically—but not yet behaviorally—modern Africans took advantage of a long warm spell to expand northward into Neanderthal territory in the Middle East, only to scuttle south again when temperatures later plunged. The critical difference between the two migrations? The earlier settlers apparently lacked the modern ability to respond to change with new survival strategies, such as fitted garments, projectile weapons and well-heated huts.

Exactly what finished off the Neanderthals remains a mystery. They don’t seem to have blended into human groups, says Klein, who sees the two as separate species. Although humans may have mingled with Neanderthals on occasion, genetic or cultural swapping seems to have been rare. DNA studies show that “there are no Neanderthal genes in my body or yours,” he says.

Klein doesn’t rule out the possibility of fighting, though he says there is no evidence of that. On the whole, he believes, competition between fully modern humans and Neanderthals was probably less direct. As the smarter, more abundant humans reproduced vigorously and expanded their range, they snatched up the best food and shelter. Lacking the versatility and social cohesion of modern humans, the Neanderthals grew hungrier, weaker and fewer until the last of them succumbed.

Whatever the cause, they died out very quickly—in a few thousand years or less—and “it probably wasn’t pretty,” Klein says.

By 30,000 years ago, everyone on earth was fully modern.

The world-shaking transformation of our species probably boils down to a tiny genetic glitch, Klein asserts. He developed his maverick notion as “the simplest, most parsimonious explanation for the available archaeological evidence,” he says. “I propose it only because it seems to be far more plausible and to explain more than the alternatives.”

Genes mutate all the time, Klein notes. Mutations can be useful, harmful or neutral in their effects. In large populations, even helpful mutations tend to get “swamped” by nonmutant genes and vanish over time. But Klein’s proposed mutation would have arisen in a small population, where its bearers could enjoy a survival advantage potent enough to maximize their offspring and spread the new trait like wildfire.

A mutation that improved the organization of the brain, Klein says, could give humans the ability to “conceive, create and communicate in symbols,” soon leading to speech. The symbolic brain would set the stage for future cultural revolutions, from the inception of agriculture about 10,000 years ago to the development of the World Wide Web in our lifetime.

Once our ancestors’ superior brains made them much more adaptable to change, cultural advancement rather than biological evolution became the predominant force shaping our future, Klein says. Before “the leap,” cultural advances paralleled major changes in anatomy. (For instance, crude stone tools first appeared some 2.5 million years ago along with a surge in brain size.) In the last 45,000 years, however, our bodies have changed little, while our culture has churned at an ever-increasing pace. Our flexible, enterprising brains enable us to adapt to almost any conditions and live almost anywhere, from Phoenix to Vladivostok, without changing physically.

“Arguably, this was the most significant mutation in the human evolutionary series,” Klein writes in The Dawn of Human Culture, “for it produced an organism that could alter its behavior radically without any change in its anatomy and that could cumulate and transmit the alterations at a speed that anatomical innovation could never match.”

If, by chance, the mutation had arisen in Stone Age Europe, he says, “students of human evolution today would be Neanderthals marveling at the peculiar people who used to live in Africa and then abruptly disappeared.”

Klein counters skeptics by focusing on scientific evidence. To those who question whether a minute genetic alteration could so powerfully change our behavior, he points to the recent discovery of a gene that apparently enables us to understand language. Last fall, geneticists at Oxford reported that people born with a mutant (in this case harmful) version of this gene struggle to comprehend spoken or written language, even though they usually score in the normal range on tests of nonverbal intelligence.

To scholars who argue against any sudden leap, claiming that modern behavior emerged slowly, beginning long before the so-called dawn of culture, Klein cites the scarcity of artifacts supporting that idea and questions their presumed age. In one case, a 90,000-year-old site in the Democratic Republic of the Congo turned up barbed harpoons finely wrought of bone. In another, the South African cave of Blombos divulged what some believe is evidence of symbolic thought dating back 77,000 years—matchbox-sized slabs of soft, red ocher covered with crosshatching, which might conceivably be artful engraving. Archaeologist Sally McBrearty of the University of Connecticut, a vociferous proponent of the “slow dawning” argument, interprets such finds as evidence that modern innovations did not come in a rush but instead developed over many thousands of years. McBrearty believes that people living at least 130,000 years ago—and perhaps as far back as 260,000 years ago—boasted the same mental equipment we have today. Competition between groups, driven by an ever-growing human population, may have been the mother of invention, she says, forcing people to develop better tools, take on more dangerous prey and perhaps create art to mark their homelands and their identity.

Klein, in turn, argues that the harpoons don’t show as much wear as animal bones from the same sediment layers, which suggests they might be strays from younger layers above. Radiocarbon dating is frequently unreliable for very early artifacts, he adds, “because radiocarbon takes us back only 40,000 years at most.” More important, if people living 90,000 years ago were creative enough to fashion tools of bone and to express themselves through art, why do we find so few examples dating back more than 45,000 years? The conclusion drawn from multiple sites and drawers full of artifacts—that people were more primitive before 45,000 years ago—cannot be overturned by “a few isolated ‘masterpieces,’ perhaps the work of an occasional premodern Leonardo,” Klein says.

And to the majority of his colleagues engaged in this debate—who agree with the “great leap” premise but tend to favor sociodemographic explanations over a genetic trigger—Klein says he will accept such a view if and when good evidence emerges. “Conceivably, the behavioral change 50,000 to 40,000 years ago reflects merely the crossing of a demographic and sociocultural threshold that stimulated humans to realize a long-dormant behavioral potential,” he wrote in the January 2000 issue of Evolutionary Anthropology. “However, there is no compelling explanation for why such demographic or sociocultural change occurred and especially for why it occurred 50,000 to 40,000 years ago.” He counters the suggestion of a massive population boom, for instance, by arguing that populations in most of Africa appear to have shrunk, not grown, during the drought that parched the continent about 60,000 to 30,000 years ago.

The mutation proposal may be logical and plausible, but it has an inherent disadvantage. “At present, it isn’t testable,” Klein explains, since fossils don’t record details of brain structure or tell us when speech began.

On the other hand, it could be invalidated if archaeologists in Africa discovered numerous “modern” artifacts indisputably dating back much more than 45,000 years, or if they found signs of a sudden demographic shift that could have changed our behavior forever.

But even if such evidence exists, finding it won’t be easy. Compared with the archaeological record for Europe—where researchers have been poking around for more than 100 years—Africa’s record is skimpy, Klein says. And funds are scarce for filling the gaps, “mainly because [the research] has no economic, medical or political implications.” Plenty of other obstacles stand in the way of those who want to work in Africa, from political instability to lack of roads. What’s more, the small populations of very early times, coupled with chancy conditions for artifact preservation, make African evidence hard to locate. For instance, no known African site displays lush paintings like those in French caves such as Lascaux and Chauvet—not because the first fully modern Africans weren’t capable of such feats, but because Africa doesn’t have deep caves, which preserve fragile artwork by functioning as refrigerated vaults, Klein says.

Despite the challenges, anthropologists on every side concur that more research could yield important clues. “Better answers can come if we get a better African record,” Klein says. “And if we can identify some of the genes behind modern cognition and communication and date them [by analyzing DNA from living people], that would also help.”

The answer to one of our biggest mysteries, it seems, may lie hidden in the dust of an African cave—or perhaps in the folds of the human brain.


Mitchell Leslie of Albuquerque, N.M., is a frequent contributor to Stanford and the journal Science.
