In some quarters it is now “conventional wisdom” that Google Glass is going to seem dorky and laughable at first. But it’s probably just the pre-alpha version of the type of technology which seems inevitable (and is familiar to anyone who has read cyberpunk science fiction).
Recently Bill Maher ripped into CSU San Bernardino professor Brian Levin for drawing a ridiculous equivalence between Christian extremism and Islamic extremism. The problem, which Maher pinpoints, is that Islamic extremism is not that extreme. By this, I mean that Islamic extremism (e.g., the Muslim Brotherhood) has much greater broad-based support than Christian extremism (e.g., Christian Reconstruction). The difference is that you’ve heard of the Muslim Brotherhood, while far fewer have heard of Christian Reconstructionists. That’s because the former has democratic support as the ruling party of a populous Muslim country.
A few days ago I came across this four-year-old article in The Wall Street Journal on the Naga Bhut Jolokia “ghost” pepper, which is reportedly hotter than the habanero. Since none of the local grocers carry the ghost pepper, I went online and purchased some seeds. But I also ordered Dave’s Ghost Pepper Naga Jolokia Hot Sauce and Volcano Dust Bhut Jolokia Powder. The latter was spicy, but it actually wasn’t that potent. I’d expected a lot more; think cayenne powder on steroids. On the other hand, the hot sauce was hot. And unlike Dave’s Insanity Sauce, there were flavors besides the heat which one could discern. Unlike Dave’s Insanity, Dave’s Ghost Pepper doesn’t taste like it came out of a chemical plant. I heartily recommend it.
Over at Slate there is a piece out which is being shared on Facebook a fair amount, Thesis Hatement: Getting a literature Ph.D. will turn you into an emotional trainwreck, not a professor. As a contrast I think it is useful to read this other piece in Slate, Is a Science Ph.D. a Waste of Time? Don’t feel too sorry for graduate students. It’s worth it. But I want to focus on one aspect of the Slate rant:
After reading Nature’s Oracle (yes, a lengthy review will be up soonish) I am even more struck by how evolutionary process suffused W. D. Hamilton’s whole worldview. This resulted in some peculiar conflicts over his career with those who wished to partition evolutionary and biological processes away from the domain of humans. Of course Hamilton himself focused for most of his scientific life on non-human phenomena in the specific details (e.g., the utilization of hymenoptera to illustrate inclusive fitness), but he always believed that his evolutionary insights were general. This makes sense in light of his idolization of R. A. Fisher, for whom evolutionary genetics was a practical science (he was a eugenicist). One of the biographical details which receives great attention in Nature’s Oracle is Hamilton’s ill-timed approach to an anthropology department in the early 1960s in the interest of pursuing graduate work on the evolution of social behavior. It was a reflection of his absolute naivete as to the political climate of the period.
The word “discourse” is used a lot now. I’ve seen it bleeding out of academic writing into general-interest news. So I thought I’d compare it with a few other terms in Google’s Ngram Viewer, which surveys mentions in books. The results were mildly interesting.
There is a blog post going around about the strange interaction of a freelancer with an editor at The Atlantic. The short of it is that the bargaining position of labor has deteriorated a great deal for some individuals over the past 10 years. Matt Yglesias put up a flip response, defending writing for free. This might seem rich coming from an individual who has a well compensated staff position at Slate, but Matt’s response is that he did write for free/low pay for years.
Now that I have a daughter I do reflect a bit more on what the purpose of my life is, because at some point I want to talk to her about the purpose of her life. There is a little bit of irony in this insofar as now she is a primary purpose of my life! But in any case, though Chris Rock’s raison d’être speaks to me, my job is also to make sure that my daughter doesn’t become a C.P.A. Certain professions, such as dentistry or accountancy, are honorable. But there are enough people who want to enter those financially lucrative professions as it is. In a world of such absolute affluence we can afford the luxury of the life of the mind. Aristotle’s father was a physician, no doubt a good man. But his memory persists only because of the incandescent brilliance of his son, who ventured into wide intellectual waters.
Speaking of Aristotle, Aristotle Onassis is reputed to have said that “If women didn’t exist, all the money in the world would have no meaning.” Point taken, and I think there’s a great deal of truth in this. But let me rephrase it: if books didn’t exist, all the time in the world would have no meaning. To many this sort of assertion would seem strange, but I suspect among my readership it is comprehensible. And by books I don’t mean to imply paper and ink and binding, I mean the information encoded within those books.
With that out of the way, I thought I would share an email from a long time reader (though only very rarely a correspondent). I don’t necessarily agree with everything stated here obviously, and I hope that the comments don’t devolve into discussions of the nature of East Asian society. I didn’t feel comfortable expurgating that aspect just because some might object to it. Rather, the point is to consider how one might find a place to flourish and be nurtured socially in one’s intellectual explorations.
By most material measures we’re doing better as a species than we ever have. That is, in an absolute sense. But a lot of human life is about relative prosperity. I recall hearing once that role playing games which emphasized egalitarianism, with no “winners” or “losers,” often had a difficult time gaining users. We are a cooperative species, but we’re also a competitive species. The idea of a rising tide lifting all boats is appealing, but so is the idea that one needs to have a larger McMansion than the Joneses. Non-zero-sum interactions are splendid, but as social organisms we evolved to a great extent in a world dominated by zero-sum games. Our rationality counsels that we trust in reason’s logic, but our emotions drive us toward cognitive biases such as loss aversion.
Three articles in The New York Times prompt me to reflect on the shortsightedness of modern life in the developed and aspiring developed world. First, It Takes a B.A. to Find a Job as a File Clerk. Basically the transformation of college into the new high school. Second, As Families Change, Korea’s Elderly Are Turning to Suicide. The focus of this article is how modern economic and social tumult are tearing apart the fabric of South Korean life. But it also focuses on the mad scramble for the “best” education which drives many to penury: ‘Some parents, the “edu poor,” drained their savings to pay for cram schools that operate after regular school and on weekends.’ Finally, In China, Families Bet It All on College for Their Children. This despite the fact that there is a surfeit of graduates in many areas.
Jonathan Eisen and Michael Eisen both have posts up about the suicide of their father, and its relation to the recent death of Aaron Swartz. Like many people I was depressed when I heard what had happened. I never met Swartz, nor had any interactions with him online. In terms of specifics our views differed on a range of issues. But, I admired his ferocity, intensity, and clear and obvious genuine commitment to the life of the mind. Whatever disagreements we may have with Swartz’s specific commitments, I suspect many people strive to throw themselves into their passions with the follow through that Swartz exhibited. Swartz’s suicide has made me reflect on the role of the institutional academy in our society, and what ends it pursues. But my thoughts are inchoate, so I leave you with the links to the Eisens’ posts, who can draw upon more relevant personal experiences.
I find the photo above of John Quincy Adams striking because it is of a man who was born in 1767. The era of the Revolutionary War is one of paintings (albeit, not contemporary ones). And yet here we look upon the face of an old man who was alive and self-aware during that period, and who grew into adulthood while the Founders still flourished. The photograph is of poor quality and lacking in color. Arguably it transmits less precise detail of the features of John Quincy Adams than a painted portrait, but photographs capture something ineffable (or more accurately they replicate physical details which we are unable to elucidate verbally, but which are recognized by our innate cognitive system). John Quincy Adams is long dead, but the verisimilitude of the image brings him back to life in some way due to the reflexes awakened in my brain. I see the man, so the man must be.
The power of photographic technology should make us wary of those claims that science and technology drain the wonder from the world by making the mysterious comprehensible, and the numinous prosaic. Our lives are magical, we simply don’t know it.
By now you will have seen the Facebook generated map of NFL fan distributions by county. The map itself is fascinating for two reasons. Substantively it illustrates the power of state lines and regionalism. The latter is not surprising, but I suspect that the former may be. The boundary between the territory of the Buffalo Bills and New York Giants may not surprise you, but it aligns almost perfectly with where people shift from saying “pop” to “soda.” On the other hand you have strange phenomena such as northern Mississippi and Alabama’s attachment to a New Orleans team which is much more distant than the Tennessee Titans (granted, the Titans are relatively new).
But a deeper, more “meta” point is that some of the most cutting edge and data swollen social science now occurs in the private sector. Facebook and Google are obvious cases. But credit rating and marketing firms also have a very deep understanding of your behaviors, and how you should behave. I do think people should be somewhat concerned about this, but I also suspect that if a really blockbuster fact was discovered “in house,” it would leak soon enough.
| Era | Technology |
|---------|----------------|
| 0 AD | The codex |
| 1500 AD | Printing press |
| 2000 AD | The internet |
There has been an issue I have wanted to bring up, but my thoughts have been rather inchoate. If you read this blog closely it won’t surprise you that in general my idealistic sympathies with regard to “access” to scientific publications are in line with Michael Eisen‘s. He (and others) do a good enough job in this area that I don’t feel like I have much to add, aside from cheering, or noting an open access success now and then.
Recently David Attenborough made the news because he expressed some old fashioned population alarmism. I say old fashioned because we’ve come a long way since Paul Ehrlich’s Population Bomb was published. It’s been 44 years since the original edition, and it hasn’t aged well. Not only is the world healthier and wealthier than it’s ever been, but population growth is likely to taper off toward stabilization by the mid-21st century. If there are resource scarcity issues it won’t be because of human numbers, it will be because of the unsustainability of per capita consumption. And that doesn’t take into account technological change and innovation. Agricultural inputs aren’t static.
The real issue here is one of values. Reduced biodiversity is probably unavoidable as humans exploit more and more of the world to maintain their lifestyles. A “population bomb” in the sense of the impending end of civilization is probably not a good medium term (i.e., ~50 years) prediction. But for large to medium sized non-human organisms we are a bomb or plague. The irony here is that a concern for the environment is to a great extent a post-materialist value, which emerges in the wake of the affluence which may be the greatest threat to biodiversity….
Over at ScienceDaily there is a report on a new paper on affirmative action and academia, Understanding the Impact of Affirmative Action Bans in Different Graduate Fields of Study. The paper is gated, but the regression model used really doesn’t seem to do much more than confirm intuition. The descriptive details are more interesting and straightforward.
It looks like law school applications are finally declining precipitously. The specific issue here is that it’s not necessarily easy to leverage a non-elite law school degree into a lucrative career (see the bimodal distribution of law school graduate pay), which makes servicing student loans (which cannot be discharged in bankruptcy) manageable. This is layered on top of the fact that many non-elite law schools seem to have been engaged in de facto marketing fraud, cooking the books on the prospects of their graduates for years. There have been many who have criticized Paul Campos of The Law School Scam, but I have plenty of anecdata to support his assertions in a qualitative sense. If you lack quantitative skills but have above average, though not stellar, verbal skills, then loading up on $100,000+ of debt in law school is not a path to riches (assuming you lack connections and are not on track to simply take over your family firm).
After last week’s post on e-books I started reading some of the interactions that Nicholas Carr was having with others. This post, which mostly consists of exchanges between Carr and Clay Shirky, has to be read to be believed. Shirky’s comment “as usual your remarks defy a simple reply” encapsulates my own reaction to Carr. The more I read from him the less persuaded and the more skeptical I become of his contentions. Carr deploys analogies like a lawyer holding forth to a dull jury in classic cinematic fashion. Upon further inspection the point is often facile, but there is a superficial gleam of plausibility which might convince those not so mentally endowed and eager to swallow his tendentious propositions whole.
Nicholas G. Carr, purveyor of high-brow neo-ludditism and archeo-utopianism, has a piece out in The Wall Street Journal, Don’t Burn Your Books—Print Is Here to Stay. The subtitle is “The e-book had its moment, but sales are slowing. Readers still want to turn those crisp, bound pages.” Here are some of his rancid chestnuts of un-wisdom:
… Hardcover books are displaying surprising resiliency. The growth in e-book sales is slowing markedly. And purchases of e-readers are actually shrinking, as consumers opt instead for multipurpose tablets. It may be that e-books, rather than replacing printed books, will ultimately serve a role more like that of audio books—a complement to traditional reading, not a substitute.
What’s more, the Association of American Publishers reported that the annual growth rate for e-book sales fell abruptly during 2012, to about 34%. That’s still a healthy clip, but it is a sharp decline from the triple-digit growth rates of the preceding four years.
The initial e-book explosion is starting to look like an aberration… 2012 survey by Bowker Market Research revealed that just 16% of Americans have actually purchased an e-book and that a whopping 59% say they have “no interest” in buying one.
From the start, e-book purchases have skewed disproportionately toward fiction, with novels representing close to two-thirds of sales…Screen reading seems particularly well-suited to the kind of light entertainments that have traditionally been sold in supermarkets and airports as mass-market paperbacks.
Readers of weightier fare, including literary fiction and narrative nonfiction, have been less inclined to go digital. They seem to prefer the heft and durability, the tactile pleasures, of what we still call “real books”—the kind you can set on a shelf.
…In fact, according to Pew, nearly 90% of e-book readers continue to read physical volumes. The two forms seem to serve different purposes.
Having survived 500 years of technological upheaval, Gutenberg’s invention may withstand the digital onslaught as well. There’s something about a crisply printed, tightly bound book that we don’t seem eager to let go of.