The BBC has a news report up gathering reactions to a new PLoS ONE paper, The Later Stone Age Calvaria from Iwo Eleru, Nigeria: Morphology and Chronology. This paper reports on remains found in Nigeria which date to ~13,000 years B.P. that exhibit a very archaic morphology. In other words, they may not be anatomically modern humans. A few years ago this would have been laughed out of the room, but science moves. Here is Chris Stringer in the BBC piece:
“[The skull] has got a much more primitive appearance, even though it is only 13,000 years old,” said Chris Stringer, from London’s Natural History Museum, who was part of the team of researchers.
“This suggests that human evolution in Africa was more complex… the transition to modern humans was not a straight transition and then a cut off.”
Prof Stringer thinks that ancient humans did not die away once they had given rise to modern humans.
They may have continued to live alongside their descendants in Africa, perhaps exchanging genes with them, until more recently than had been thought.
The Pith: We are now moving from the human genome project, to the human genomes project. As more and more full genomes of various populations come online new methods will arise to take advantage of the surfeit of data. In this paper the authors crunch through the genomes of half a dozen individuals to make sweeping inferences about the history of the human species over the past few hundred thousand years.
Since the integration of evolution and genetics in the early years of the 20th century there have been several revolutions in our ability to perceive the underlying variation which is the raw material and result of evolutionary genetics. The understanding that DNA was the concrete substrate of Mendelian genetics, and the rise to prominence of molecular genetic techniques in understanding evolution in the 1970s and 1980s, was one key transition. No longer were geneticists simply tracking the coat colors of mice or the visible mutations of fruit flies. In the 1990s the uniparental loci, the maternal and paternal lineages as inferred from the mtDNA and Y chromosomes, came into their own. Finally, the 2000s saw the post-genomic era, and researchers routinely began analyzing data sets of hundreds of thousands of single nucleotide polymorphisms (SNPs), genetic variants, in hundreds of individuals.
In this decade some of the promise of the Human Genome Project will finally ripen, in that whole genomes are going to be used more and more in analyses. This is exciting, but there are some obvious issues. The human genome has ~3 billion base pairs, vs. the 1 million or fewer markers you might manipulate per individual in data sets focused on SNPs. There are some things for which a full human genome is overkill. You don’t need a full genomic sequence to ascertain your identity as a member of a particular geographic race. Not only can visual inspection usually suffice to reassure you as to your background, but depending on the scale of granularity you want, a random SNP set on the order of ~10,000 should suffice, or as few as 25 ancestrally informative markers! But if you want to ascertain mutation rates within families with precision and confidence, you do need the full genome.
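The claim that a few dozen ancestrally informative markers can suffice is easy to illustrate. Below is a minimal sketch, in Python, of likelihood-based ancestry assignment: given allele frequencies that differ sharply between two reference populations, even a handful of markers pushes the likelihood strongly toward one population. The marker names and frequencies here are hypothetical placeholders, not values from any real AIM panel.

```python
import math

# Hypothetical allele frequencies of the reference allele at four
# ancestry-informative markers (AIMs) in two reference populations.
# Real panels use empirically estimated frequencies at many more loci.
AIM_FREQS = {
    "marker1": {"popA": 0.95, "popB": 0.10},
    "marker2": {"popA": 0.85, "popB": 0.20},
    "marker3": {"popA": 0.90, "popB": 0.15},
    "marker4": {"popA": 0.80, "popB": 0.25},
}

def log_likelihood(genotypes, pop):
    """Log-likelihood of observed genotypes (count of the reference
    allele: 0, 1, or 2) assuming Hardy-Weinberg proportions in `pop`."""
    ll = 0.0
    for marker, count in genotypes.items():
        p = AIM_FREQS[marker][pop]
        # Binomial probability of `count` copies of the reference allele
        ll += math.log(math.comb(2, count) * p**count * (1 - p) ** (2 - count))
    return ll

def classify(genotypes):
    """Assign the individual to whichever population has the higher likelihood."""
    return max(("popA", "popB"), key=lambda pop: log_likelihood(genotypes, pop))

# An individual carrying mostly popA-typical alleles is assigned to popA
individual = {"marker1": 2, "marker2": 2, "marker3": 1, "marker4": 2}
print(classify(individual))  # popA
```

With frequencies this divergent, four markers already give an unambiguous answer; with more realistic, less extreme frequency differences you need more markers, which is why real panels run to ~25 or more.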
The media is reporting rather breathlessly a new find out of Arabia which seems to push much further back the presence of anatomically modern humans in this region (more accurately, the archaeology was so sparse that earlier assessments of human habitation seem to have been made in a vacuum, resting on an absence of evidence). Here is the major objection:
This idea is at odds with a proposal advanced by Richard Klein, a paleoanthropologist at Stanford University, that the emergence of some social or behavioral advantage — like the perfection of the faculty for language — was required for modern humans to overcome the surrounding human groups. Some kind of barrier had to be surmounted, it seems, or modern humans could have walked out of Africa 200,000 years ago.
Dr. Klein said that the Uerpmann team’s case for an earlier out-of-Africa expansion was “provocative, but in the absence of human remains, it’s not compelling.”
The stone tools of this era are all much alike, and it is hard to tell whether early modern humans or Neanderthals made them. At the sites of Skhul and Qafzeh in what is now Israel, early modern humans were present around 100,000 years ago and Neanderthals at 60,000 years, but archaeologists cannot distinguish their stone tools, Dr. Klein said.
A warmer and wetter climate around this time let modern humans get as far as Israel but apparently no farther, and the new findings from Jebel Faya could represent a second limited excursion. But in this case, it is Africa that is expanding, or at least the African ecological zone, and not modern humans, Dr. Klein said. “The key issue is whether this is an early out-of-Africa movement, but if so, it was far more limited than the modern human expansion to Eurasia roughly 45,000 years ago,” he said.
Image credit: Matthias Kabel
In The Dawn of Human Culture Richard Klein argued that modern humans as we understand them today, protean and highly cultural creatures, are a product of a biological change which reordered our cognitive faculties. Klein pinpoints this change to the “Great Leap Forward” ~50,000 years ago. But there is a large gap in time between anatomically modern humans, who were resident in Africa ~200,000 years ago, and behaviorally modern humans, who engage in the symbolic cultural production which we perceive to be the hallmark of humanity. As against this particular model there have always been “gradualists,” who argue that there was no discontinuous biological change which resulted in the shift toward hyperactive cultural production. Stephen Oppenheimer makes the case for this in his book The Real Eve. Oppenheimer suggests that there was a gradual and cumulative cultural evolution. He argues that a proper analogy might be the rate of cultural change in the 20th century vs. that in the 17th century. Obviously we know that genetic evolution cannot explain most of the difference in rate of change across the two eras, but the archaeological remains of the two periods would look so starkly different to a third-party observer that, absent any other information, a biological rationale would seem plausible.
I have no particular brief for either position in this post. I assume that both the biological and cultural models are too extreme now. The long term persistence of the Oldowan culture in much of the world implies to me that there may have been a biological chasm between hominin groups, and that the Oldowan “culture” was somehow biologically encoded. And yet I am not convinced that the gap between our Neandertal and neo-African ancestors was as great as Klein would have us believe. So now to the paper. First, let’s look at the abstract:
My post The paradigm is dead, long live the paradigm! expressed to some extent my befuddlement at the current state of human evolutionary genetics and paleoanthropology. After the review of the paper on possible elevated admixture with Neandertals at the dystrophin locus a friend emailed, “Remember when we thought everything would be so simple once we could finally see this stuff?” Indeed I do remember. The fact that things aren’t simple is very exhilarating, but it is also a major blow to theoretical clarity. Science is after all not a collection of facts, but it is in part facts which one can sieve through an analytic framework.
In hindsight, with the relative robustness of ancient DNA results, we can make some assessments about the role of human bias within particular heuristic frameworks over the past generation. From the mid-1980s up until 2000 it was victory after victory for the Out-of-Africa with total replacement model. The rise of mtDNA and Y chromosomal lineage studies seemed to buttress the idea of common descent from neo-Africans within the last 100-200,000 years for all human populations. There wasn’t much of a perturbation from this march toward paradigm ascendancy in the aughts, except that there was now a trickle of papers which claimed to find phylogenetic “long branches” in the human genome. The 2006 Evans et al. paper, Evidence that the adaptive allele of the brain size gene microcephalin introgressed into Homo sapiens from an archaic Homo lineage, was probably the one that made the biggest media splash. But these were inferences. Subsequent analysis of the draft Neandertal genome seems to suggest that in fact the microcephalin allele in question did not introgress.
Case closed? Obviously not. Now we’re in a different era. The Evans et al. paper may have been wrong in the specifics, but its general framework seems likely to have been validated: there are genetic lineages in the modern human genome which are not derived from the neo-Africans. But let us remember that the overwhelming majority of the human genome is neo-African. A reasonable interval for non-Africans is 90-99% neo-African. But a non-trivial minority has introgressed or admixed from other lineages. Out-of-Africa is mostly correct, but in some ways so is Multiregionalism. But how do we describe this? “Weighted multiregionalism”? “Mostly Out-of-Africa”? The old terms were nice because they were punchy and precise. If you look at Multiregionalism or Out-of-Africa in Wikipedia the newest results are noted, but it doesn’t seem that they’ve been integrated into the analytic narrative. Yet.
Fossils matter. Fossils are evidence. That was Milford Wolpoff’s refrain in the 1987 NOVA documentary which heralded the long cresting of mitochondrial Eve and Out of Africa. Fossils remain highly relevant and important when it comes to deeper time phylogenetic relationships, but it does seem that they have only served to supplement the genetic data when it comes to recent human origins (e.g., to calibrate and fine-tune molecular clocks). The paleoanthropologist Tim White, whose own position on human origins is at odds with Milford Wolpoff’s, nevertheless felt the need to reiterate the relevance of fossils at a conference several years ago where most of the participants were geneticists (we received a preview of Ardi). Chris Stringer, who advocated for an Out of Africa model before Allan Wilson and his students roiled the academic waters, often seems to have been relegated to nothing more than an adjunct to the molecular biologists in the public mind despite his priority. I think we are at a turning point, and must acknowledge that the study of recent human origins can no longer remain a one-horse buggy. Genetics itself, in the form of ancient DNA research as well as more powerful analytic techniques utilizing larger autosomal data sets, has overturned and challenged the old conventional wisdom gleaned from trusting inferences derived from the patterns of variation of extant populations.
There is a new paper in PNAS on remains from China which re-orders and muddles our understanding of the emergence of anatomical and behavioral modernity in Eurasia. Human remains from Zhirendong, South China, and modern human emergence in East Asia:
The 2007 discovery of fragmentary human remains (two molars and an anterior mandible) at Zhirendong (Zhiren Cave) in South China provides insight in the processes involved in the establishment of modern humans in eastern Eurasia. The human remains are securely dated by U-series on overlying flowstones and a rich associated faunal sample to the initial Late Pleistocene, >100 kya. As such, they are the oldest modern human fossils in East Asia and predate by >60,000 y the oldest previously known modern human remains in the region. The Zhiren 3 mandible in particular presents derived modern human anterior symphyseal morphology, with a projecting tuber symphyseos, distinct mental fossae, modest lateral tubercles, and a vertical symphysis; it is separate from any known late archaic human mandible. However, it also exhibits a lingual symphyseal morphology and corpus robustness that place it close to later Pleistocene archaic humans. The age and morphology of the Zhiren Cave human remains support a modern human emergence scenario for East Asia involving dispersal with assimilation or populational continuity with gene flow. It also places the Late Pleistocene Asian emergence of modern humans in a pre-Upper Paleolithic context and raises issues concerning the long-term Late Pleistocene coexistence of late archaic and early modern humans across Eurasia.
I read the paper, and I really didn’t understand anything between the introduction and discussion. Mostly because it was a detailed exploration of anatomical details, and I’ve never taken an anatomy class. I basically rely on people like John Hawks to tell me what’s going on in that domain. He hasn’t blogged the paper (well, as of this writing), but he did give an assessment to National Geographic:
Food is a fraught topic. In How Pleasure Works Paul Bloom alludes to the thesis that while conservatives fixate on sexual purity, liberals fixate on culinary purity. For example, is it organic? What is the sourcing? Is it “authentic”? Obviously one can take issue with this characterization, especially its general class inflection (large swaths of the population buy what they can afford). Additionally, I doubt Hindus, Muslims and Jews who take a deep interest in the provenance, preparation, and substance of their food are liberals. What Bloom is noticing is actually a general human preoccupation which somehow has taken on a strange political valence in the United States. Somehow being conservative in this country has become aligned with a satisfaction with the mass-produced goods of the agricultural-industrial complex.* Some conservatives such as Rod Dreher have pushed back against this connotation, lengthily in his book Crunchy Cons.
Stepping away from politics, we are a diet obsessed culture broadly. Apparently Christina Hendricks is going on a diet, her aim being to lose 30 pounds. Diet fads come and go. The Atkins approach has faded of late, with the Paleolithic diet coming into fashion. A totally separate market segment, that of raw food, remains robustly popular. This was obvious when Richard Wrangham came out with Catching Fire: How Cooking Made Us Human; raw food enthusiasts would call in to talk shows where he was a guest, sometimes irritated that Wrangham was claiming that cooking was central to the emergence of modern humanity. His contention that raw food practitioners are lean precisely because they extract less from their nutritional intake, given the relatively coarse character of what they consume, was clearly discomfiting to many of them. This is because it is at variance with some of the rationale for their diet. They are not cooking the food in part because they believe that cooking removes a great deal of nutritive value.
I was thinking about this while reading What is Global History? Offhand the author mentions bread-making as early as 20,000 years ago in the process of asserting that many of the preconditions for an agricultural mode of production were already in existence before the end of the last Ice Age. I was surprised by this fact, having never encountered it before. Unfortunately there wasn’t a footnote which I could follow up on, so I thought no more of it. Imagine my curiosity when I stumbled upon this paper in PNAS, Thirty thousand-year-old evidence of plant food processing:
The German magazine Der Spiegel has a rather thick new article out reviewing the latest research which is starting to reintroduce the concept of mass folk wanderings into archaeology. The title is How Middle Eastern Milk Drinkers Conquered Europe. In the story you get a good sense of the recent revision of the null model, once dominant within archaeology, that the motive forces of history manifested through the flow of pots and not people. This viewpoint came to ascendancy after World War II, and succeeded an older method of interpretation which presumed a tight correlation between race and culture. It repudiated the idea that the flow and change of pottery styles and extant patterns of linguistic dialects may have been markers for the waxing and waning of peoples.
Obviously a pots-not-people model had some major exceptions even during its heyday. The demographic explosion of European peoples after 1492, and especially the Anglo peoples after 1700, occurred within the light of history. Even if it hadn’t, it would be ludicrous on the face of it to assert that the modern American population was derived from the indigenous populations, and that they had simply adopted the language, religion and folkways of the British conquerors of North America. But outside the presumed aberration of the European imperialist and colonial venture of the modern era the details on the ground were obscure enough that a model could be imposed from without.
When I talk about sexual selection I usually make sure to have an accompanying visual of a peacock to go with the post. But really I could have used a dandy as an illustration, or perhaps in our day & age “The Situation”. Unlike the peacock much of what passes for human “plumage” is not a result of native biological processes, but rather refashioning the materials of other organisms or synthetics into a sort of second skin (or skins, with all our layers). In other words, clothes. These artificialities are so essential to our own identity as individuals that they often mark out our tribal affiliations, in pre-modern and post-modern contexts. Whole industries exist to cater to both our utilitarian needs and aesthetic sensibilities in regards to how we dress ourselves. The definition of a cyborg usually connotes a synthesis of the biological with the electronic. Perhaps that is because our artificial extensions in the form of clothes have so seamlessly merged with our self-images that it rarely occurs to us that we are already merged entities. If you encountered many of your acquaintances or friends naked not only would embarrassment ensue, but I suspect one might initially not recognize them. A naked physique without the distinctive aspects of clothing one associates with someone strips away individual identity.
But clothing has not been the eternal condition of man; recall that Eve met the fig leaf after an unfortunate sequence of events. In all likelihood our common ancestor with bonobos and chimpanzees was predominantly hirsute, as are most mammals. A mammal without fur is like a fish without scales or a bird without feathers. Not impossible, but atypical. But at some point we did lose our fur. When? A 2004 paper offered up an intriguing possibility, that ~1.2 million years ago our lineage became hairless. How did they come to this inference? The authors noted that the consensus sequence of the MC1R locus in humans among dark skinned peoples coalesced back to this period (i.e., the last common ancestor of the MC1R genes which exhibit the ancestral type, which confers dark skin). Once our ancestors lost their fur they would have been exposed to solar radiation, and so the necessity of dark skin arose. When did this naked dark ape cover his shame? (yes, I censored one of the images above to make this post “work safe” above the fold) A new paper in Molecular Biology & Evolution offers up a precise date using another proxy, Origin of clothing lice indicates early clothing use by anatomically modern humans in Africa:
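The molecular-clock logic behind this sort of coalescence dating can be sketched in a few lines. Under a neutral clock, two lineages each accumulate mutations independently after splitting from their common ancestor, so the expected number of pairwise differences is twice the per-lineage rate times the elapsed time; inverting that gives the time to the most recent common ancestor. The numbers below are illustrative placeholders, not the actual MC1R data or the 2004 paper's rate estimates.

```python
def tmrca_years(pairwise_diffs, seq_length, mu_per_site_per_year):
    """Time to most recent common ancestor of a pair of sequences.

    Differences accumulate on both lineages after the split, so the
    expectation is diffs = 2 * T * mu * L, hence T = diffs / (2 * mu * L).
    """
    return pairwise_diffs / (2.0 * mu_per_site_per_year * seq_length)

# Illustrative (hypothetical) values: 3 differences over a 1,000 bp
# locus, with a mutation rate of 1.25e-9 per site per year.
print(tmrca_years(3, 1000, 1.25e-9))  # ~1.2 million years
```

The real analyses are more involved (they use coalescent models across many sampled alleles, not a single pairwise comparison), but the core conversion from observed variation to elapsed time is this simple ratio.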
One aspect of human demographic expansions seems to be the fact that we often model them as a constant diffusion process, when in reality there were likely pulses (economic historians can conceive of this as the periodic gaps between land and labor factor inputs). I don’t know much about the human movements prior to H. sapiens sapiens, and from what I can gather the fossil remains are too sparse to be too wedded to a specific model, but it seems clear that anatomically modern human expansion occurred through a series of rapid outward sweeps which would periodically reach a “natural barrier.” Modern humans reached the Solomon Islands ~30,000 years ago, after which there was stasis for ~25,000 years. Only with the Austronesian expansion did humanity push past the Solomons. And this was no baby-step, ultimately the Austronesians went as far as the Hawaiian islands and Easter Island.
I just stumbled onto two amusing articles, Ancient legends once walked among early humans?, and The discovery of material evidence of a distinct hominin lineage in Central Asia as recently as 30,000 years ago is no surprise. The second is a letter from a folklorist:
Sir, The discovery of material evidence of a distinct hominin lineage in Central Asia as recently as 30,000 years ago (report, Mar 25) does not come as a surprise to those who have looked at the historical and anecdotal evidence of “wild people” inhabiting the region. The evidence stretches from Herodotus to the present day. The Russian historian Boris Porshnev suggested that they are relict Neanderthals, although the lack of evidence of material culture suggests a type closer to Homo erectus.
* There was a sharp spike in cranial capacity ~200,000 years ago, on the order of 30%
* And, that the large brain was not deleterious despite its large caloric footprint (25% of our calories service the brain) because the “environment of early humans was so clement and rich in resources”
Hawks refutes the first by simply reposting the chart above (x axis = years before present, y axis = cranial capacity). It’s rather straightforward. I don’t know the paleoanthropology with any great depth, but the gradual rise in hominin cranial capacity has always been a “mystery” waiting to be solved (see Grooming, Gossip, and the Evolution of Language and The Mating Mind: How Sexual Choice Shaped the Evolution of Human Nature). Blakemore may have new data, but as they say, “bring it.” Until then the consensus is what it is (the hominins with the greatest cranial capacities, for what it’s worth, were Neandertals, and even anatomically modern humans have tended toward smaller cranial capacities since the end of the last Ice Age, along with a general trend toward smaller body size).
Last week Nature published a paper reporting a new ‘branch’ of the hominin evolutionary bush which may have been coexistent with modern humans and Neandertals. I recommend The Atavism, Carl and John Hawks on this story. Interesting times.