A phylogenetic tree is an essential tool for understanding the broad scope of natural history, placing particular lineages in specific evolutionary contexts of relatedness. Such trees range from Ernst Haeckel’s classical attempt, depicting relationships which biologists derived from intuition within the framework of a grand evolutionary scheme, all the way down to modern methods implemented in software packages such as MrBayes, which many frankly utilize in a “turnkey” manner. These trees are abstractions, in that they reduce a wide range of phenomena to schematic representations which impart aspects of particular interest in a stylized form. This matters because the actual phenomena may be more complex than what is being represented. A simple illustration of what I’m getting at is the long history of phylogenetics and phylogeography utilizing mitochondrial DNA (mtDNA) lineages. Because mtDNA is copious in comparison to nuclear DNA, it is easy to obtain. And, as it does not recombine and is inherited in a haploid fashion (mother to daughter), it makes the inference of gene trees much easier. The key problem is that the genealogy of this particular sequence is used to infer aspects of population history, when it may not represent the history of other regions of the genome very well. Different genes may have different histories.
There was a recent case in Ireland of a young Roma girl with blonde hair and blue eyes being removed from her home, on the suspicion that she was not in fact the biological child of her presumed parents (who, like most Roma, are reportedly of dark complexion, hair, and eyes). I even saw a report that a hospital was consulted on the probability of such an outcome, and said it would be “extremely unusual.” It turns out that DNA tests confirmed that the girl was the biological child of the putative parents. And of course all this has to be understood in light of the case of “Maria” in Greece, a little blonde girl who turned out not to be the biological child of the two Roma who claimed her as their daughter (it looks like there was welfare fraud in that case).
My initial response to the Irish case was that the consultant should be fired, because in an admixed population like the Roma it shouldn’t be that unusual to have offspring who deviate a great deal from the parental phenotype. This prompted some interesting reactions. First, there were those who seem blissfully ignorant of the fact that the Roma are an admixed population. That’s easy enough to resolve, as scientific papers using genome-wide data have been published on this issue. Second, there are claims that only a very small fraction of Roma have blonde hair and blue eyes (on the order of less than 1%). The latter may be a defensible claim, though not indisputably so.
Before we move on I have to clarify that there is a distinction between “Roma” and “Romani.” The latter refers broadly to the populations across Europe which were referred to as “Gypsy,” while the former denotes a set of populations with a center of distribution in Southeast Europe, in particular in the Balkans. In much of Northern and Western Europe there are now two populations of Romani with very distinct histories (and genetics): the Roma who have recently arrived from Southeast Europe, and the various non-Roma groups who have a very long history in their nations of residence (e.g., Finnish Kale).
In terms of various traits, we know a fair amount about the genetics of pigmentation in humans. Though fine-grained individual predictive models remain coarse, most of the genes which have large effects on population-scale differences are now well characterized. This allows me to produce a reasonably plausible model to give you an intuition for why brown-skinned populations can generate a wide range of outcomes in realized phenotype.
Imagine five loci rank-ordered by effect size: gene 1, gene 2, gene 3, gene 4, and gene 5. Each gene comes in two flavors, two alleles. One is a “dark” allele (produces dark pigmentation) and the other is a “light” allele. From these you can generate a distribution of complexion measured by a “melanin index” (it’s dependent on reflectance). Assume each allele at each gene contributes a value to the aggregate melanin index like so (dark allele first, light allele second):
Gene 1: dark allele = 30, light allele = 2
Gene 2: dark allele = 15, light allele = 1
Gene 3: dark allele = 10, light allele = 1
Gene 4: dark allele = 5, light allele = 2
Gene 5: dark allele = 5, light allele = 0
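To make this concrete, here is a minimal sketch of the model, assuming a purely additive scheme (each allele simply adds its value to the total) and interpreting the two numbers at each gene as the dark and light allele contributions. It enumerates every possible child of two parents who are heterozygous at all five loci, which is the situation an admixed population can plausibly produce:

```python
import itertools

# Dark/light allele contributions to the melanin index at each of the five
# hypothetical genes above (first number = "dark" allele, second = "light"
# allele). The purely additive model is a simplifying assumption.
LOCI = {
    "gene1": (30, 2),
    "gene2": (15, 1),
    "gene3": (10, 1),
    "gene4": (5, 2),
    "gene5": (5, 0),
}

def melanin_index(genotype):
    """Sum the contributions of both alleles at every locus (0 = dark, 1 = light)."""
    return sum(LOCI[gene][allele]
               for gene, pair in genotype.items()
               for allele in pair)

# Two parents heterozygous (one dark, one light allele) at all five loci.
# A child inherits one allele per locus from each parent, so at each locus
# the child is (dark,dark), (dark,light), (light,dark), or (light,light)
# with equal probability. Enumerate all 4^5 = 1024 equally likely outcomes.
outcomes = []
for combo in itertools.product([(0, 0), (0, 1), (1, 0), (1, 1)], repeat=5):
    genotype = dict(zip(LOCI, combo))
    outcomes.append(melanin_index(genotype))

print(min(outcomes))                  # 12: the all-light child (prob 1/1024)
print(max(outcomes))                  # 130: the all-dark child (prob 1/1024)
print(sum(outcomes) / len(outcomes))  # 71.0: the parents' own index
```

Even though both parents sit at an intermediate index of 71, roughly one child in a thousand lands at the lightest possible value. That is the point: a population segregating for both dark and light alleles at several pigmentation loci will occasionally throw offspring far lighter (or darker) than either parent.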
First, “Is it okay to introduce non-human DNA into our genome?” The premise is false: a substantial proportion of the human genome is derived from viruses. Lateral gene transfer in complex organisms is not unknown, and may sometimes be quite functional (arguably endosymbiosis and the mitochondrion is the classic case, but that’s so far back in the past that people aren’t shocked by it). Second, the piece also asks, “Should we biologically enhance non-human animals?” Last I checked, selection was a biological process. Domestication events have radically changed many organisms. The io9 piece spends some time on the possible Uplift of other species, but as a matter of reality coexistence with humans tends to reduce the intelligence of domestic animals (they offload many tasks to us). The narrow exception is the case of dogs. Yes, they are less intelligent than wolves, but they excel at reading human social cues. We’ve modified them to be our perfect companion animals!
By now you have probably seen the articles about how a new skull has transformed our understanding of the human family tree. The original paper is at Science, A Complete Skull from Dmanisi, Georgia, and the Evolutionary Biology of Early Homo. More colorfully you might say that this publication burns down the “bushy” model of human origins, where you have a complex series of bifurcations and local regional diversity, and then rapid extinction with the rise of H. sapiens sapiens ~50,000 years ago. In general I’m more in agreement with those plant geneticists who are skeptical of excessive fixation on the concept of species, so this is not a shock to me. To me a species concept is not a thing, but an instrument to a thing (i.e., I’m interested in population genetics and phylogenetics). The reason these sorts of findings overturn the orthodoxy has more to do with human cognitive intuitions about how things are categorized than with the reality of how nature arranges itself.
I should be careful about being flip on this issue. As recently as the mid-aughts (see Mutants) the details of this trait were not entirely understood. Today the nature of its inheritance in various populations is well understood, and a substantial proportion of its evolutionary history is also known with reasonable clarity, as far as these things go. The 50,000-foot perspective is this: we lost our fur millions of years ago and developed dark skin, and many of us lost our pigmentation after we left Africa ~50,000 years ago (in fact, it seems likely that hominins in the northern latitudes were always diverse in their pigmentation).
“There were giants in the earth in those days…when the sons of God came in unto the daughters of men, and they bare children to them, the same became mighty men which were of old, men of renown.” -Genesis 6:4
Seven years ago I wrote a short post, Why patriarchy?, which attempted to present a concise explanation for the ubiquity of what we might term patriarchy in complex societies (i.e., not “small-scale societies”). Broadly speaking, my conjecture is that the social and political dominance of (proportionally) small groups of males over the past several thousand years is an example of “evoked culture.” The higher population densities in agricultural societies produced a relative surfeit of accessible marginal surplus, which could be given over to supporting non-peasant classes who specialized in trade, religion, and war, all of which were connected. This new economic and cultural context served to trigger a reorganization of the typical distribution of power relations in human societies, because of the responses of the basic cognitive architecture our species inherited from Paleolithic humans. Agon, or intra-specific competition, has always been part of the game of human socialization. The scaling up and channeling of this instinct in bands of males totally transformed human societies (another dynamic is the elaboration of cooperative structures, though this often manifests as agonistic competition between coalitions of humans).
Six months back there was a lot of discussion of Y haplogroup A00, An African American Paternal Lineage Adds an Extremely Ancient Root to the Human Y Chromosome Phylogenetic Tree. Now there’s an attempt by some of the researchers on that paper to raise money to collect more samples. Which of Cameroon’s peoples have members of haplogroup A00?:
This is Round Two of our fundraising for our groundbreaking research on the world’s earliest-branching Y-chromosome lineage, A00. Its origins lie in the earliest days of humanity’s emergence, the exact time very much in debate, but almost surely over 200,000 years ago. We first discovered it in early 2012, when the results of the Perrys’ Y-DNA tests were unlike anything seen before. We learned that they have matches among some of the diverse peoples of Southwest Cameroon. The new samples to be collected by Matthew in his homeland will allow us to learn much more about A00.
“And he will be a wild man; his hand will be against every man, and every man’s hand against him; and he shall dwell in the presence of all his brethren.” – Genesis 16:12
By now you may have seen or read two important papers which just came out in Science, 2000 Years of Parallel Societies in Stone Age Central Europe, and Ancient DNA Reveals Key Stages in the Formation of Central European Mitochondrial Genetic Diversity. The details have been extensively explored elsewhere. If you don’t have academic access I highly recommend the supplement of the second paper. It’s also very illuminating if you don’t have a good grasp of the nuts and bolts of archaeology (I do not). I can’t, for example, confirm whether the merging strategies of different archaeological cultures were appropriate or not, because I’m not totally clear in my own head about the nature of these distinct archaeological ‘cultures’ (the quotation marks because archaeologists infer culture from material remains, and so these may not be cultures in the sense we understand culture). But the overall finding is clear: in ancient Europe thousands of years ago there were multiple demographic replacements and amalgamations. The post-World War II thesis in archaeology that one could not infer changes in demographic character from material remains (because the latter can diffuse purely through memetic means) seems to be false. The correspondence is surprisingly tight.
Update: Just watch this movie.
No time to write about it now, but check Science Magazine this afternoon (in a few hours from this posting) for a major paper on ancient mtDNA, and the striking correlation between changes in lineage frequencies and cultures that they discovered. Turns out that when you peel back the palimpsest it is much more complicated and surprising than we’d have thought. National Geographic, which funded the project, already has a post out on it:
What they found was that the shift in the frequency of DNA lineages closely matched the changes and appearances of new Central European cultures across time. In other words, the people who lived in Central Europe 7,000 years ago had different DNA lineages than those who lived there 5,000 years ago, and different again from those who lived there 3,500 years ago. Central Europe was a dynamic place during the Bronze Age, and the genetic composition of the people who lived there demonstrates that there was nothing static about European prehistory.
Genographic Project Director and National Geographic Explorer-in-Residence Spencer Wells expounds: “spanning a period from the dawn of farming during the Neolithic period through to the Bronze Age, the [genetic] data from the archaeological remains reveals successive waves of migration and population replacement – genetic ‘revolutions’ that combined to create the genetic patterns we see today.”
I hope this doesn’t lead to a new simplicity to replace the old one of no migration.
I didn’t go that route…I’ve been writing for 10+ years now, and long time readers can probably attest to the fact that I’ve become more and more focused on genetics as time goes by. This is due to the reality that I really like genetics. Really. The friend with whom I was having the conversation about our various interests admitted she couldn’t even imagine an alternative universe version of me who would nerd out on neuroscience. That would be a bizarro-world Razib.
A few weeks ago, Peter Turchin, who runs the excellent Social Evolution Forum, published a paper in PNAS, War, space, and the evolution of Old World complex societies. It is open access, so you can read it yourself. And Turchin himself has responded to questions at length. You might also check out my reviews of two of his books, as well as my interview with him. From all that it should be clear that broadly speaking I am enthusiastic about Turchin’s project, Cliodynamics. Historical details matter, but Turchin’s aim is to establish a broad brush framework of the coarse dynamics which shape the scaffold for specific aspects of historical process. Without this background all we have are cluttered details.
A few years ago Greg Cochran mentioned to me how he perceives the two child family to be the new bourgeois normal, enforced by the professional class and blue-haired ladies alike (this impression is informed by the fact that he has more than two children). This seems to align with my own general sense, but then again how normal is my socioeconomic milieu? So I decided to look at the General Social Survey. I limited the sample to non-Hispanics whites age 45 and over, constrained to the interval 2000-2012,* and broke the data into male and female classes. I crossed the number of children, binned 0, 1, 2, 3, 4, and 5+, with the highest educational attainment of the individual.** In other words I limited the data set, and looked at how the number of children of individuals varied as a function of education.
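The cross-tabulation itself is straightforward; here is a minimal sketch of the binning-and-crossing step with a handful of made-up respondents (the education labels and counts are placeholders, not actual GSS records):

```python
from collections import Counter

def bin_children(n):
    """Bin number of children as in the post: 0-4 kept as-is, 5 or more collapsed to '5+'."""
    return "5+" if n >= 5 else str(n)

# Hypothetical (education, number of children) records -- toy data only.
respondents = [
    ("High school", 3), ("High school", 2), ("Bachelor", 2),
    ("Bachelor", 1), ("Graduate", 0), ("Graduate", 2),
    ("High school", 6), ("Bachelor", 2),
]

# Cross the binned number of children with educational attainment.
table = Counter((edu, bin_children(kids)) for edu, kids in respondents)
for (edu, nbin), count in sorted(table.items()):
    print(f"{edu:12s} {nbin:>2s} {count}")
```

With real survey data you would also weight each respondent and filter on age, ethnicity, and survey year, but the core of the analysis is just this kind of two-way count.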
I am old enough to remember card catalogs. They did not make me happy. As a small child I noticed omissions and incorrect classifications so often that for long periods of time I would simply avoid the catalog, and methodically consume books from whole sections of the public library in line with my preferences through tedious manual browsing. I am also old enough to remember when the internet was still primitive in its data organization and storage capacity (i.e., pre-Google, pre-Wikipedia), and the library was the first, last, and best recourse for retrieving data. When Braveheart was released in 1995 I ran down to the local university library to see if I could find more about the protagonist’s biography than was present in Britannica. By chance there was a book available on the life and times of William Wallace, but it was checked out, and there were more than 10 holds ahead of me! This was not an uncommon occurrence in the age before the data-rich internet. The reality is that what I wanted to know about Wallace is probably found in the Wikipedia entry, but then there was no Wikipedia! These are just a few of the reasons that I have little patience for neo-Luddites such as Nicholas Carr. When I read Carr’s “old man” jeremiads I always wonder, “son, were you even around back in my day?”*
There has been a lot of attention to Erika Check Hayden’s piece Ethics: Taboo genetics, at least judging by people commenting on my Facebook feed. In some ways this is not an incredibly empirically grounded argument, because the biological basis of complex traits is going to be rather difficult to untangle on a gene-by-gene basis. In other words, this isn’t a clear and present “concern.” The heritability of many behavioral traits has long been known. This is not revolutionary, though for cultural reasons many well-educated people are totally surprised when confronted with data that many traits, such as intelligence and personality, have robust heritabilities* (the proportion of trait variation explained by variation in genes across the population). The literature reviewed in The Nurture Assumption makes clear that a surprising proportion of the contribution parents make to their offspring is through their genetic composition, and not their modeled example. You wouldn’t know this if you read someone like Brian Palmer of Slate, who seems to be getting paid to reaffirm the biases of the current age among the smart set (pretty much every single one of his pieces that touches upon genetics is larded with phrases which could have been written by a software program designed to soothe the concerns of the cultural Zeitgeist). But the new genomics is confirming the broad outlines of the findings from behavior genetics. There’s nothing really to see there. The bigger issue of any interest is normative: the values we hold dear as a culture.
Matter has a very long feature by my friend Virginia Hughes, Uprooted, on how personal genomics is changing, and sometimes disrupting, family relationships. I sat in on one session at the Consumer Genetics Conference last week, and an audience member expressed worry about how genetic results might cause family disruption. This individual was actually a faculty member who wanted to introduce personal genomics into the classroom as a way to educate, but was wary of these sorts of side effects. Even neglecting the reality that paternity uncertainty is likely far less pervasive among the sorts whose parents would be enrolling their offspring at universities in the Boston area, these worries always have to be predicated on the fact that even dodging this ethical gray zone in the specific case only delays the near-future inevitable. Unless medical authorities ubiquitously and invariably shield this sort of information from the relevant parties, the widespread adoption of genetic analysis as a consumer product will result in its exposure. Though it may seem crazy, preemptive testing of all offspring to ascertain the biological relatedness of putative parents may simply be the best way to defuse this ticking time bomb.