Prof Dr M Iqbal Choudhary, Director of the International Centre for Chemical and Biological Sciences (ICCBS) at Karachi University (KU), disclosed on Thursday that former Higher Education Commission (HEC) chairman Prof Dr Atta-ur-Rehman is the first Pakistani whose genome has been mapped by Pakistani scientists, at a cost of $40,000 and in just 10 months.
China contributed $20,000 toward the total cost of the genome project. Pakistani and Indian genomes have more similarities with each other than with others, he said, while speaking at a press conference held on Thursday at the Dr Panjwani Center for Molecular Medicine & Drug Research (PCMD), Karachi University (KU).
“Dr Atta has become the first Muslim with this distinction, and the third on a list of renowned people in the world whose genomes have been mapped by scientists. The first two are Prof Watson and Dr Venter (2007), while the others are unnamed.
Every year or so there seems to be a redating of a key fossil in human evolution. It’s nice to see scientific self-correction in action. Soon after Neandertals got a little older, casting doubt on their supposedly long coexistence with modern humans, we now have a redating of Homo erectus soloensis from Java to about 150-550 thousand years ago, certainly long before there were any anatomically modern humans in the area.
I think Dienekes is jumping the gun a bit in terms of the solidity of any given finding in knocking down prior consensus. That being said, the very young ages for Southeast Asian H. erectus, on the order of ~30-50,000 years B.P., always seemed strange to me. The paper Dienekes is referring to, The Age of the 20 Meter Solo River Terrace, Java, Indonesia and the Survival of Homo erectus in Asia, is rather technical on the earth science side, as it involves dating and interpreting confounds in the stratigraphy. But this section of the discussion gets to the gist of the matter if you can’t follow the details of fossil dating:
If the middle Pleistocene 40Ar/39Ar ages better reflect the age of the Solo River 20 meter terrace deposits and hominins, the site of Ngandong remains a relatively late source of H. erectus; however, these H. erectus would not be the contemporaries of Neandertals and modern humans, and their chronology would widen the gap between the last surviving H. erectus and the population from Flores – whose source population has been argued to be Indonesian H. erectus…although this point is contested…Instead, the Ngandong hominins would be contemporaries of the H. heidelbergensis from Atapuerca, Spain and elsewhere in Europe, and, possibly the archaic H. sapiens specimen from Bodo (Ethiopia), which might favor arguments that they are more closely affiliated with these taxa and differ from H. erectus…Such ages for Ngandong would suggest that a series of geographically relatively isolated lineages of hominins lived during the middle Pleistocene.
… First, I don’t know whom the company thinks it’s kidding; Google+ is obviously a direct competitor to Facebook. Given the large overlap in functionality, I can’t imagine that many people will use Google+ and Facebook simultaneously. For most of us, it will be one or the other. Google+’s success, then, will rest in large part on Google’s ability to convince people to ditch Facebook for the new site. For that, Google+ will have to offer some compelling view of social networking that’s substantially different from what’s available on Facebook. And that’s where Google+ baffles me. What is so compelling about Google+ that I can’t currently get on Facebook or Twitter? Or Gmail, for that matter? At the moment, I can’t tell….
But circles are nothing new. Facebook has offered several ways to break your network into smaller chunks for many years now, and it has worked constantly to refine them. And you know what? Almost no one uses those features. Only 5 percent of Facebookers keep “Lists,” Facebook’s first attempt at letting people categorize their friends. Recognizing that “Lists” weren’t great, last year the site unveiled a new way to manage your friends, called “Groups.” I was optimistic that “Groups” would help to compartmentalize Facebook, but from what I can tell, few people use that feature, either.
Since Google+ is not yet ready for “prime time” I’m not going to judge it too much. The interface feels a lot zippier and more fluid than Facebook’s, but that might just be because there are hundreds of millions of people using Facebook. Unlike Manjoo I do think that the idea of “circles” is not without merit. I tried Facebook’s Lists, and it just plain didn’t work the way it was supposed to, so I gave up. Right now I, along with others, slice and dice my online voice across different platforms: Twitter for public interaction, Facebook for semi-public interaction.
When you have friends you know through science blogging, transhumanism, right-wing politics, and high school, not to mention cousins who were raised in the Tablighi subculture, Facebook’s one-size-fits-all tendency of throwing them into a big pot has been kind of suboptimal. Then again, most people probably don’t manifest as much dilettantism as I do, leaving them with a much better “sorted” social set.
I will say though that Google+ doesn’t seem as patently useless as Wave and Buzz were. But if you haven’t gotten an invite, you aren’t missing out on much. There is no way this should warrant the hysteria which was the norm when Gmail first rolled out and required invites.
That’s what Ann Patchett is claiming. More specifically, there are no bricks & mortar institutions which specialize in selling new books. There are places you can get used books in the city of Nashville. To remedy the situation Patchett is opening up a bookstore herself. She asserts that “…we’ve got to get back to a 3000-square-foot store and not 30,000. Amazon is always going to have everything – you can’t compete with that. But there is, I believe, still a place for a store where people read books.”
I recall going to a Barnes & Noble when I was in Nashville in the summer of 2004. Here’s some demographic data: “As of the 2010 census, the balance population was 601,222. The 2000 population was 545,524.” The details here are a bit muddy because parts of Davidson county are included with the Nashville total, but you get a general sense of how substantial the population of this city is. As a point of comparison Eugene, OR, has a population of 156,185, and 29 Yelp hits for bookstores. Nashville has 46 results.
Back to Patchett’s claim, I think there is something there. I don’t know how it’s going to shake out in the details. But consider the fact that it is far cheaper to brew your own coffee at home, but more and more people are frequenting shops which sell coffee at a much higher per unit cost. Obviously people are going for the experience. The main issue with bookstores is that the per unit cost of a book is higher than even a fancy drink at most coffee shops.
John Winthrop, ~1600. Mitt Romney, 2008 – image credit, Jessica Rinaldi
Recently Megan McArdle had a post up where she expressed curiosity as to why “futurists” circa 1900 had a tendency not to imagine revolutions in clothing style which might have been anticipated to occur over the next few decades. You also see this in Star Trek in the 1960s, where faux-future fashion was clearly based on the trends of the day, from the beehive hair to miniskirts. So I thought this comment was of interest:
I don’t know the answer, but I don’t know that they were wrong to do it. Keeping fashions exactly the same as the present generally winds up with more in common with the actual future than deliberate “future” fashions. A fair number of men still wear ties, and on rare occasions a few even wear tailcoats; rather fewer wear silver jumpsuits.
There have been a few counters to extreme fashions in media SF: “Blade Runner”‘s lead wore the same trenchcoat as his noir forebears; “Babylon 5” went for modified business suits and moderate variations on military uniforms; the “Battlestar Galactica” reimagining was pretty much straight conservative turn-of-the-millennium wear despite being in a far different time. How have those worn versus the approach taken by “Star Trek” or the 2015 segment of the “Back to the Future” movies?
I’m not sure that I accept this case as airtight, but it is certainly true in the specifics. Though I just saw some clips of Running Man for the first time on YouTube, I viewed Blade Runner a few years back for the second time and was struck by how little it had dated in regards to fashion sense, at least in any very noticeable manner. It got me thinking about the nature of cultural evolution even then.
Back when this sort of thing was cutting edge mtDNA haplogroup J was a pretty big deal. This was the haplogroup often associated with the demic diffusion of Middle Eastern farmers into Europe. This was the “Jasmine” clade in Seven Daughters of Eve. A new paper in PLoS ONE makes an audacious claim: that J is not a lineage which underwent recent demographic expansion, but rather one which has been subject to a specific set of evolutionary dynamics which have skewed the interpretations due to a false “molecular clock” assumption. By this assumption, I mean that mtDNA, which is passed down in an unbroken chain from mother to daughter, is by and large neutral to forces like natural selection and subject to a constant mutational rate which can serve as a calibration clock to the last common ancestor between two different lineages. Additionally, mtDNA has a high mutational rate, so it accumulates lots of variation to sample, and, it is copious, so easy to extract. What’s not to like?
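To make the clock assumption concrete, here is a toy sketch of how a constant, neutral mutation rate turns pairwise sequence differences into a divergence date. This is my own illustration with invented numbers, not figures from the paper, and it takes for granted exactly the assumptions the paper is challenging:

```python
# Toy molecular-clock estimate. Under the (here, contested) assumption
# that mtDNA mutations accumulate neutrally at a constant rate, the
# divergence time between two lineages is the per-site difference
# divided by twice the mutation rate, since both lineages accumulate
# mutations independently after splitting.

def tmrca_years(pairwise_diffs, seq_length, mu_per_site_per_year):
    """Estimate time to most recent common ancestor in years."""
    d = pairwise_diffs / seq_length        # differences per site
    return d / (2 * mu_per_site_per_year)  # t = d / (2 * mu)

# Illustrative inputs only: 30 differences over a 16,569 bp
# mitochondrial genome, with an assumed rate of ~1.7e-8 per site
# per year.
print(round(tmrca_years(30, 16569, 1.7e-8)))
```

If selection or rate variation skews the clock, as the paper argues for haplogroup J, the same arithmetic yields systematically wrong dates, which is the crux of the reinterpretation.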
In the comments below Antonio pointed me to this working paper, What Do DNA Ancestry Tests Reveal About Americans’ Identity? Examining Public Opinion on Race and Genomics. I am perhaps being a bit dull, but I can’t figure out where its latest version is found online (I stumbled upon what looks like another working paper version on one of the authors’ websites). Here’s the abstract:
Genomics research will soon have a deep impact on many aspects of our lives, but its political implications and associations remain undeveloped. Our broad goal in this research project is to analyze what Americans are learning about genomic science, and how they are responding to this new and potentially fraught technology.
We pursue that goal here by focusing on one arena of the genomics revolution — its relationship to racial and ethnic identity. Genomic ancestry testing may either blur racial boundaries by showing them to be indistinct or mixed, or reify racial boundaries by revealing ancestral homogeneity or pointing toward a particular geographic area or group as likely forebears. Some tests, or some contexts, may permit both outcomes. In parallel fashion, genomic information about race can emphasize its malleability and social constructedness or its possible biological bases. We posit that what information individuals choose to obtain, and how they respond to genomic information about racial ancestry will depend in part on their own racial or ethnic identity.
We evaluate these hypotheses in three ways. The first is a public opinion survey including vignettes about hypothetical individuals who received contrasting DNA test results. Second is an automated content analysis of about 5,500 newspaper articles that focused on race-related genomics research. Finally, we perform a finer-grained, hand-coded, content analysis of about 700 articles profiling people who took DNA ancestry tests.
Three major findings parallel the three empirical analyses. First, most respondents find the results of DNA ancestry tests persuasive, but blacks and whites have very different emotional responses and effects on their racial identity. Asians and Hispanics range between those two poles, while multiracials show a distinct pattern of reaction. Second, newspaper articles do more to teach the American reading public that race has a genetic component than that race is a purely social construction. Third, African Americans are disproportionately likely to react with displeasure to tests that imply a blurring of racial classifications. The paper concludes with a discussion, outline of next steps, and observations about the significance of genomics for political science and politics.
The image above is adapted from the 2010 paper A Predominantly Neolithic Origin for European Paternal Lineages, and it shows the frequencies of Y chromosomal haplogroup R1b1b2 across Europe. As you can see, as you approach the Atlantic the frequency converges upon ~100%. Interestingly the fraction of R1b1b2 is highest among populations such as the Basque and the Welsh. This was taken by some researchers in the late 1990s and early 2000s as evidence that the Welsh adopted a Celtic language, prior to which they spoke a dialect distantly related to Basque. Additionally, the assumption was that the Basques were the ur-Europeans, descendants of the Paleolithic populations of the continent both biologically and culturally, so that the peculiar aspects of the Basque language were attributed by some to its ancient Stone Age origins.
As indicated by the title, the above paper overturned such assumptions, and rather implied that the origin of the R1b1b2 haplogroup was in the Near East, associated with the expansion of Middle Eastern farmers from the eastern Mediterranean toward western Europe ~10,000 years ago. Instead of the high frequency of R1b1b2 being a confident peg for the dominance of Paleolithic rootedness of contemporary Europeans, as well as the spread of farming mostly through cultural diffusion, now it had become a linchpin for the case that Europe had seen one, and perhaps more than one, demographic revolutions over the past 10,000 years.
This is made very evident in the results from ancient DNA, which are hard to superimpose upon a simplistic model of a two way admixture between a Paleolithic substrate and a Neolithic overlay. Rather, it may be that there were multiple pulses into a European cul-de-sac since the rise of agriculture from different starting points. We need to be careful of overly broad pronouncements at this point, because as they say this is a “developing” area. But, I want to go back to the western European fringe for a moment.
The DNA ancestry testing industry is more than a decade old, yet details about it remain a mystery: there remain no reliable, empirical data on the number, motivations, and attitudes of customers to date, the number of products available and their characteristics, or the industry customs and standard practices that have emerged in the absence of specific governmental regulations. Here, we provide preliminary data collected in 2009 through indirect and direct participant observation, namely blog post analysis, generalized survey analysis, and targeted survey analysis. These include the first available data on the attitudes of individuals who have and have not had their own DNA ancestry tested, as well as of individuals who are members of DNA ancestry-related social networking groups. In a new and fluid landscape, the results highlight the need for empirical data to guide policy discussions and should be interpreted collectively as an invitation for additional investigation of (1) the opinions of individuals purchasing these tests, individuals obtaining these tests through research participation, and individuals not obtaining these tests; (2) the psychosocial and behavioral reactions of individuals obtaining their DNA ancestry information with attention given both to expectations prior to testing and the sociotechnical architecture of the test used; and (3) the applications of DNA ancestry information in varying contexts.
If anyone wants the paper, email me, I can send you a copy. But really it’s just kind of dated because the information was collected in 2009, before the massive increase in 23andMe’s customer base which began in the spring of 2010. Additionally, “genome blogging” really hadn’t started much at that point.
In terms of the reactions to ancestry analysis, my personal experience after doing analysis on hundreds of people (most in public for AAP, but some in private) is that most are pretty calm about whatever they find out. On occasion you run into a stubborn person who is basically going to fix upon a really implausible explanation for a particular ancestral slice rather than the lowest hanging fruit. But there was one individual who had a freak out when their results were published, because they did not accord with family beliefs. I was kind of confused, and checked their results against their self-reported ethnicity. Weirdly the results were exactly what I would have expected from the self-reported ethnicity, so it was a really strange reaction.
On several occasions I’ve gotten into discussions with geneticists about the possibility of reconstructing someone’s facial structure from genes alone. Combined with advances in pigmentation prediction by genetics, this could put the sketch artist out of business! But all that raises the question: how heritable are facial features anyhow? Impressionistically we know that features are broadly heritable. This isn’t a tenuous supposition; you see the resemblance over and over across families. All that being said, what are the specific quantitative heritability estimates? How do they relate to other traits we’re interested in? This review from the early 1990s seems to have what I’m looking for, The Role of Genetics in Craniofacial Morphology and Growth. Below is a table which shows averaged heritabilities for a range of facial quantitative traits from a large number of studies:
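As an aside, one classic way heritabilities like these are estimated is parent-offspring regression: under a simple additive model, the slope of offspring trait values on midparent values estimates narrow-sense heritability. A toy sketch, with the trait and every measurement invented for illustration (none of this is from the review):

```python
# Toy parent-offspring regression. Under an additive model, the
# least-squares slope of offspring values on midparent values
# (the average of the two parents) estimates narrow-sense h^2.

def heritability_midparent(midparent, offspring):
    """Slope of offspring on midparent values (least squares)."""
    n = len(midparent)
    mx = sum(midparent) / n
    my = sum(offspring) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(midparent, offspring)) / n
    var = sum((x - mx) ** 2 for x in midparent) / n
    return cov / var

# Invented nose-bridge lengths (mm) for six midparent/offspring pairs.
mid = [48.0, 50.5, 52.0, 49.5, 53.0, 51.0]
off = [49.0, 50.3, 51.8, 49.8, 52.4, 50.9]
print(round(heritability_midparent(mid, off), 2))
```

Real craniofacial studies use far larger samples and often twin designs, but the regression logic above is the simplest version of what sits behind a table of averaged heritabilities.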
BusinessWeek‘s The Rise and Inglorious Fall of Myspace is a compelling read. But a huge piece of the puzzle which I thought was omitted was that Myspace was incubated in the short term bottom-feeder world to begin with, so the later fixation on revenue now rather than a long term vision may simply have been part of its original DNA. See this Planet Money podcast, MySpace Was Born Of Total Ignorance. Also Porn And Spyware, for what I’m talking about. As it is, in the BusinessWeek piece Chris DeWolfe just tries to blame News Corp. Remember that DeWolfe and Tom Anderson sold out to Rupert Murdoch, while Mark Zuckerberg was uninterested in an immediate cash windfall. As far as the long term impact of Myspace, I notice that the Urban Dictionary entry for ‘myspace angle’ is still more fleshed out than ‘facebook angle’, so the word “myspace” might still get preserved in this manner. In this way Myspace may resemble the audio cassette, which is still haunting our culture as the “mixtape”. Not surprisingly some young people are totally unaware that the tape portion actually refers to cassette tapes, since that technology was before their time.
In my long post below, Celts to Anglo-Saxons, in light of updated assumptions, I had a “cartoon” demographic model in mind which I attempted to sketch out in words. But sometimes prose isn’t the best in terms of precision, and almost always lacks in economy.
In particular I wanted to emphasize how genes and memes may transmit differently, and, the importance of the steps of going between A to Z in determining the shape of things in the end state. To illustrate more clearly what I have in mind I thought it might be useful to put up a post with my cartoon model in charts and figures.
First, you start out with a large “source” population and a smaller “target” population. Genetically only the migration from the source to the target really has an effect, because the source is so huge that migration from the target is irrelevant. So we’ll be focusing on the impact upon the target of migration both genetically and culturally.
To simplify the model we’ll imagine a character, whether genetic or memetic, where the source and target are absolutely different at t = 0, or generation 1. Also, these are discrete generations, and the population is fixed, so you can presume that it’s at carrying capacity. Migration of the outsiders into the target population from the source means less of the original native population in absolute terms (to be realistic this is bidirectional, so people are leaving the target too, but that’s not our concern here).
There are two time series which illustrate the divergent dynamics on both the genetic and memetic dimensions. In one series you see gradual and continuous migration from the source to the target population over 13 generations. In another there are two generations of massive migration, before and after which there is no migration. For the genetic character, imagine disjoint allele frequencies at generation 1. So at generation 1 the target population is at 100% for allele A, while the source is at 100% for allele B. Therefore migration from the source to the target results in a decrease in the proportion of allele A, which is what is being measured on the y-axis. For the memetic character, imagine that it’s language. So at generation 1 100% in the target zone speak language A, while everyone in the source zone speaks language B. Again, the frequency on the y-axis is of the proportion who speak language A in the target zone.
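The genetic side of this cartoon model can be sketched in a few lines. The migration rates below are made up for illustration, not fitted to anything:

```python
# Minimal sketch of the genetic dimension of the cartoon model.
# Each generation a fraction m of the fixed-size target population is
# replaced by migrants from the source. The source is fixed for allele
# B, so each generation the frequency of allele A in the target is
# simply multiplied by (1 - m).

def allele_a_trajectory(migration_by_generation):
    """Frequency of allele A in the target each generation, starting at 1.0."""
    freq, traj = 1.0, [1.0]
    for m in migration_by_generation:
        freq *= (1.0 - m)
        traj.append(freq)
    return traj

# Gradual regime: 5% replacement every generation for 13 generations.
gradual = allele_a_trajectory([0.05] * 13)

# Pulse regime: two generations of massive (30%) migration, no
# migration before or after.
pulse = allele_a_trajectory([0.0, 0.30, 0.30] + [0.0] * 10)

print(round(gradual[-1], 3), round(pulse[-1], 3))
```

The two regimes can end at similar allele frequencies, but the shapes of the curves differ: a smooth decay versus a step. The memetic character need not follow either curve, since language adoption in the target can lag, then flip wholesale, which is exactly the divergence between genes and memes the charts are meant to show.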
Over the past week there have been three posts which I’ve put up which are related. Two of them have a straightforward relation, Britons, English, Germans, and collective action and Britons, English, and Dutch. But the third might not seem related to the other two, We stand on the shoulders of cultural giants, but it is. When we talk about things such as the spread of language through “elite emulation” or “population replacement” they’re rather vague catchall terms. We don’t decompose them mechanistically into their components to explore whether they can explain what they purport to explain. Rather, we take these phenomena for granted in a very simplistic black box fashion. We know what they’re describing on the face of it. “We” here means people without a background in sociolinguistics, obviously.
To give an example of the pitfall of this method, in much of Rodney Stark’s work on sociology of religion (the production before his recent quasi-apologetic material) his thinking was crisp and logical, but the psychological models were intuitive and naive and tended to get little input from the latest findings in cognitive science. In One True God he actually offers an explanation for why Christian Trinitarianism is psychologically more satisfying than the starker monotheism of the Jews and Muslims, or the more elaborated diffuse polytheism which predates monotheism. All I will say is whether you are convinced or not, Stark’s argument has some logical coherency and a level of plausibility, until you explore the literature in cognitive science on conceptualization of supernatural agents. The psychological literature as outlined in Theological Incorrectness indicates quite clearly that no matter the explicit philosophical nature of God as outlined in a given religion, cognitively the human mind has strong constraints in terms of how it represents abstractions, so that the vast majority of believers conceptualize the godhead in an invariant manner. To be more clear about it, even though Jews and Muslims are strict monotheists and some Hindus conceive of themselves as polytheists,* their concrete mental image of the divine doesn’t vary much from person to person and religion to religion. As a practical matter Hindus who may accept the reality of a nearly infinite number of gods on paper still exhibit personal devotion to only a few. Jews and Muslims who are strict monotheists nevertheless may have cults of saints and lesser supernatural agents in their mental universe. There is a difference between saying you accept the reality of millions of gods, and actually being able to mentally focus on millions of gods. 
The latter is not possible, and that has real world consequences, in that the concrete difference between an avowed polytheist and a monotheist in terms of mental state is minimal at most. So the psychological contrasts which Stark assumes motivate higher order social differences turn out to be superficial word games in pure cognitive terms.**
Similarly, we know intuitively what “elite emulation” means. It’s self-evident in that the mass of the population emulates the elite in terms of their folkways. But how does this really play out? The description is just a description, it doesn’t elaborate on the process of how you get from A to Z. When I try to find references in the ethnographic literature, generally what I encounter is an aside. What has been gnawing at me are cases like the Bulgar assimilation into the Slavic substrate, and the Magyar assimilation of their own Slavic and Latinate substrate. What distinguishes these two cases? They’re two instances of mobile populations from the western margins of Inner Asia erupting into the ecumene. Even if they were not pure horse-nomads in the vein of the Huns, they were clearly among the last of this class of peoples to force themselves into the heart of Europe after the fall of Rome because of their obligate male militarization and mobility. In the case of the Bulgarians all that remains of their distinctive identity as a mobile Turkic population is their ethnonym. In contrast, the Magyars imposed their Ugric language upon the population which they dominated. Modern Hungarians don’t seem to be genetically different from what you’d expect based on geography. This is in contrast with Anatolian Turks, who do seem to have a minority East Asian element. The emergence of a dominant Magyar ethnicity on the Hungarian plain in the early medieval period then is clearly an instance of elite emulation if there ever was one, in contrast to the absorption of the Bulgars into their substrate. But this is just a description, it doesn’t tell us why elite emulation worked in one zone, but not in another.
Within each 4-year period, participants gained an average of 3.35 lb (5th to 95th percentile, −4.1 to 12.4). On the basis of increased daily servings of individual dietary components, 4-year weight change was most strongly associated with the intake of potato chips (1.69 lb), potatoes (1.28 lb), sugar-sweetened beverages (1.00 lb), unprocessed red meats (0.95 lb), and processed meats (0.93 lb) and was inversely associated with the intake of vegetables (−0.22 lb), whole grains (−0.37 lb), fruits (−0.49 lb), nuts (−0.57 lb), and yogurt (−0.82 lb) (P≤0.005 for each comparison). Aggregate dietary changes were associated with substantial differences in weight change (3.93 lb across quintiles of dietary change). Other lifestyle factors were also independently associated with weight change (P<0.001), including physical activity (−1.76 lb across quintiles); alcohol use (0.41 lb per drink per day), smoking (new quitters, 5.17 lb; former smokers, 0.14 lb), sleep (more weight gain with <6 or >8 hours of sleep), and television watching (0.31 lb per hour per day).
I took the results when they controlled for other variables and filtered them all so that their p-values were 0.001 or less (in fact, of the ones below only “sweets and desserts” is p-value 0.001, all the others are below that). Nothing too surprising, but the magnitude of effect of french fries was pretty large:
In reading The cultural niche: Why social learning is essential for human adaptation in PNAS I couldn’t help but think back to a conversation I had with a few old friends in Evanston in 2003. They were graduate students in mathematics at Northwestern, and at one point one of them expressed some serious frustration at the fact that so many of the science and business students in his introductory calculus courses simply wanted to “learn” a disparate set of techniques, rather than understand calculus. The reality of course is that the vast majority of people who ever encounter calculus aim to learn it for reasons of utility, not so that they can grok the fundamental theorem of calculus. With the proliferation of tools such as Mathematica and powerful portable calculators fewer and fewer people are getting their hands dirty with calculus in an analytic sense, and more often see it as simply a “requirement” which they have to pass.
Calculus, and mathematics generally, is a clean and crisp human invention. In the late 17th century Isaac Newton and Gottfried Leibniz originated calculus as we understand it. Later thinkers extended their work. But for the vast majority of humans who have ever learned calculus it is simply a “black box” set of techniques which work rather magically. They did not contribute anything new to the body of knowledge which they drew upon. Mathematics is part of our cultural patrimony, we implicitly stand upon the shoulders of giants without apology. Such is to be human.
Also at the meeting, researchers led by geneticist Angela Graefen of the Institute for Mummies and the Iceman reported that they have succeeded in sequencing the Iceman’s whole genome, despite the highly fragmented nuclear DNA. The genome has already revealed some surprises. One preliminary finding shows that the Iceman probably had brown eyes rather than the blue eyes found in many facial reconstructions done by artists. Graefen and her colleagues are also examining the DNA to see if Ötzi possessed genetic predispositions to diseases such as arthritis, which other researchers have diagnosed based on radiological and other evidence.
I’m assuming we’ll know a whole lot more before the end of summer. So I’m going to go out on a limb and make a prediction based on what I suspect about the southern European genetic landscape ~5,000 years ago: Ötzi will be more like contemporary West Asian people, Georgians, Armenians, etc., than modern north Italians and south Germans are. Right or wrong, I hope the results will be interesting!
Today I took some time out to see Cave of Forgotten Dreams. My main reaction is that I really would have appreciated less verbal exposition from Werner Herzog. The most gripping portions of the film were invariably those which focused on the cave art with no commentary. These were also the scenes where Herzog seems to have leveraged 3D technology the most seamlessly. In contrast many of the outdoor shots were really disorienting, especially where there were trees in the background. There was a shot near the end where a person was approaching the camera with branches framing the background, and it was so jarring that I just looked away.
Not surprisingly all of the standard “talking heads” exposition really didn’t benefit at all from the 3D; it just distracted you while people were trying to explain various archaeological details. Speaking of which, I wish there were more of the scientific background here. Obviously that’s going to be my bias. Herzog naturally provides a spiritual rationale for why the original artists did what they did. That seems more plausible than anything else, but I still assume that the probability of any given hypothesis we offer being correct is going to be low. We just don’t know enough about the human past, and what might have motivated Paleolithic peoples.
If you do go to see Cave of Forgotten Dreams, don’t get discouraged in the middle section when the director allows the researchers to talk at length. The best scenes, with the least verbal and visual clutter within the cave, are near the end of the film. If you can, it might be most rational to purchase the ticket, go do something else, and show up for the last 30 minutes.
Epigenetics is making it “big time”: Slate has a review up of the new book Epigenetics: The Ultimate Mystery of Inheritance. In case you don’t know, epigenetics, in terms of “what it means/why it matters,” holds out the promise of breaking out of the genes → trait conveyor belt, instead positing genes → trait → experience → genes, and so forth. Or perhaps more accurately genes → trait × experience → genes. Epigenetics has obviously long been an overlooked biological phenomenon. But, I think the same could be said for the ubiquity of asexual reproduction and unicellularity! Life science exhibits anthropocentrism. That’s why there’s human genetics, and biological anthropology. My own concern is that epigenetics will give some a license to posit that the old models have been overthrown, when in fact in many cases they have been modified on the margin. Especially at the level of organisms which we’re concerned about: human-scaled eukaryotes. Humans most of all.
The last paragraph in the review highlights the hope, promise, and perils of epigenetics in regards to social relevance: