Two weeks ago I saw the film Mud. It’s one of the few “serious” movies I’ve watched over the past two years. I can’t tell you what Pacific Rim was really about despite having viewed it five days ago (aside from the striking fact that two of the protagonists were played by bizarrely similar-looking actors). And yet aspects of Mud have stuck with me. Why? It’s not because of the plot, which was laughably implausible. Or the development of the characters, which I found a bit overwrought or cliche in most (though not all) cases. Rather, I am still reflecting upon the depiction of the main young protagonist, a fourteen-year-old played by Tye Sheridan, and the landscape upon which he is “coming of age,” the central theme of the film. The specific details of the concerns of a teenage boy navigating newfound feelings toward the opposite sex, an unstable family life, and a pedestrian rural milieu are not novel. Rather, it was the whole portrait which I think warrants further exploration here in 2013.
Today I was missing my daughter, so I decided to Skype with her on my phone. The phone has a camera which can record video, so I can talk to her, and if she gets bored I’ll show her something besides my face. I take this for granted, but it is interesting to reflect that my “video phone” is actually just a regular phone on which I installed a third-party application to enable two-way video calls. It’s a banal and marginal use for the device. Information technology is far more ubiquitous than the occasional video conference.
With the imminent demise of Google Reader there’s a lot of talk about how this is a death blow for RSS. I don’t really get this. Does anyone remember the stuff about “the death of comments” in the late 2000s? E.g.:
It’s sad and disappointing but the death of blog comments may be near. It’s getting harder and harder to fight against the hordes of spammers and mediocrity and animosity out there.
That’s from 2007. Granted, many blogs and media organizations have worthless comments sections. But not all by any stretch. And arguably technology like Disqus has made comments more, not less, relevant, due to features like “up voting” (I’m aware that Slashdot had this a long time ago!). Around the same time there was also the “death of email”. Like blog comments, email is still around.
After my last post on the inevitable nature of the book’s shift toward electronic formats, I revisited the data which highlights the decline in sales of e-readers. Some of this is probably competition with tablets. But I’ve had the same Kindle for two and a half years. I got a newer version of the Kindle for my wife, but have seen no need for me to upgrade (and I got a Kindle Fire for my daughter). Why? The point of e-readers is the content, not the delivery. This reiterates that “e-books” aren’t revolutionary, they’re evolutionary, and the fixation on technology is going to be transient. A true revolution in information transmission and delivery would be a direct data port, which would transform “publishing” in a much deeper fashion than the digitization of type and script.
Last weekend I was at the Singularity Summit for a few days. There were interesting speakers, but the reality is that quite often a talk given at a conference has been given elsewhere, and there isn’t going to be much “value-add” in the Q & A, which is often limited and constrained. No, the point of the conference is to meet interesting people, and there were some conference goers who didn’t go to any talks at all, but simply milled around the lobby, talking to whoever they chanced upon.
I spent a lot of the conference talking about genomics, and answering questions about genomics, if I thought I could give a precise, accurate, and competent answer (e.g., I dodged any microbiome-related questions because I don’t know much about that). Perhaps more curiously, in the course of talking about personal genomics, issues relating to my daughter’s genotype came to the fore, and I would ask if my interlocutor had seen “the lion.” By the end of the conference a substantial proportion of the attendees had seen the lion.
This included a polite Estonian physicist. I spent about 20 minutes talking to him and his wife about personal genomics (since he was a physicist he grokked abstract and complex explanations rather quickly), and eventually I had to show him the lion. But during the course of the whole conference he was the only one who had a counter-response: he pulled up a photo of his five children! Touché! Only as I was leaving did I realize that I’d been talking the ear off of Jaan Tallinn, the lead developer of Skype. For much of the conference Tallinn stood like an impassive Nordic sentinel, engaging in discussions with half a dozen individuals in a circle (often his wife was at his side, though she often engaged people by herself). Some extremely successful and wealthy people manifest a certain reticence, rightly suspicious that others may attempt to cultivate them for personal advantage. Tallinn seems to be immune to this syndrome. His manner and affect resemble those of a graduate student. He was there to learn and listen, and was exceedingly patient even with the sort of monomaniacal personality which dominated conference attendees (I plead guilty!).
For many, IVF smacked of a moral overstep — or at least of a potential one. In a 1974 article headlined “The Embryo Sweepstakes,” The New York Times considered the ethical implications of what it called “the brave new baby”: the child “conceived in a test tube and then planted in a womb.” (The scare phrase in that being not “test tube” so much as “a womb” and its menacingly indefinite article.) And no less a luminary than James Watson — yes, that James Watson — publicly decried the procedure, telling a Congressional committee in 1974 that a successful embryo transplant would lead to “all sorts of bad scenarios.”
Specifically, he predicted: “All hell will break loose, politically and morally, all over the world.”
The past is not always prologue, but it’s very instructive to look at newspapers from a given time period and see what the public mood was. Fear is a natural human reaction to new technology. My general bias is that technology itself usually isn’t as disruptive as social innovation. That being said, when technology is genuinely revolutionary it can have a much bigger impact than social or institutional shifts.
There’s a wide-ranging story in LA Weekly on the decline of 35mm film. It covers a lot of angles, but this one issue jumped out at me:
No wonder, then, that directors like Christopher Nolan worry that if 35mm film dies, so will the gold standard of how movies are made. Film cameras require reloading every 10 minutes. They teach discipline. Digital cameras can shoot far longer, much to the dismay of actors like Robert Downey Jr. — who, rumor has it, protests by leaving bottles of urine on set.
“Because when you hear the camera whirring, you know that money is going through it,” Wright says. “There’s a respectfulness that comes when you’re burning up film.”
This particular variant of critique of new technologies is very old. It is famously well known that writing and printing both ushered in warnings that these were simply crutches, and might diminish mental acuity. But I’m 99% sure that when the bow and arrow became common, some hunters warned that the skills and traditions associated with the atlatl would decay. The piece highlights some genuine advantages of analog over digital. But I do not think making filming more difficult is an advantage, to state the obvious.
I found out today that a private equity firm has purchased the majority of the Yellow Pages from AT&T. Which prompts me to ask: when was the last time you used the yellow pages? A pay phone? In a similar vein, Google And The Death Of Getting Lost. In 10 years (2001 to 2011) wireless penetration in the USA went from ~40 percent to ~100 percent.* This is the difference between arranging a rendezvous ahead of time in precise detail, and being confident that you can just end it with “I’ll call you.”
Image credit: Wikipedia
* This is actually calculated by comparing the number of phones to people. Since some people have multiple phones, and businesses purchase them for their employees, “real” penetration is somewhat less than this. I suspect that it is a larger underestimate for 2001, as a larger proportion of phones were probably business-related.
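The arithmetic behind that caveat can be sketched in a few lines. All the figures below are hypothetical, purely to show why a subscriptions-to-people ratio overstates the share of people who actually carry a phone:

```python
# Why "penetration" (subscriptions / population) overstates real coverage.
# Every number here is made up for illustration.

subscriptions = 320_000_000   # active wireless subscriptions (hypothetical)
population = 310_000_000      # total population (hypothetical)

# The naive figure compares phone lines to people, so it can exceed 100%.
naive_penetration = subscriptions / population

# If some people hold multiple lines (say, a personal and a business phone),
# the share of people with at least one phone is lower than the naive figure.
second_lines = 40_000_000     # extra lines held by multi-line owners (hypothetical)
people_with_phone = subscriptions - second_lines

real_penetration = people_with_phone / population

print(f"naive: {naive_penetration:.1%}, real: {real_penetration:.1%}")
```

Since business lines were presumably a larger share of all phones in 2001, the gap between the naive and “real” figures would have been wider then.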
Most readers know that I’ve been tracking Google Trends data on Facebook for years. Now, as of January 1, 2012, it seems pretty obvious that in the international aggregate this was the year that Facebook finally hit saturation in terms of “mindshare.”
But there are interesting international differences.
Hominin increase in cranial capacity, courtesy of Luke Jostins
A few years ago a statistical geneticist at Cambridge’s Sanger Institute, Luke Jostins, posted the chart above using data from fossils on cranial capacity of hominins (the human lineage). As you can see there was a gradual increase in cranial capacity until ~250,000 years before the present, and then a more rapid increase. I should also note that from what I know about the empirical data, mean human cranial capacity peaked around the Last Glacial Maximum. Our brains have been shrinking, even relative to our body sizes (we’re not as large as we were during the Ice Age). But that’s neither here nor there. In the comments Jostins observes:
The data above includes all known Homo skulls, but none of the results change if you exclude the 24 Neandertals. In fact, you see the same results if you exclude Sapiens but keep Neandertals; the trends are pan-Homo, and aren’t confined to a specific lineage….
It’s been a while…what’s going on with Google+? I think we can conclude it isn’t a Facebook killer in anything like the medium term. After moving away from Facebook I started posting again because almost all of my friends in “flesh space” simply don’t use Google+. Rather, Google+ has become a more elaborate extension of my Twitter circle. I’ve got over 2,100 people who’ve added me to their Google+ circles, and only 1,600 Twitter followers. But I don’t post much on Google+ at this point. For someone with my amount of time and interest, two social networks seem optimal in terms of complements. That being said, as many have noted, Google+ is more than just a social networking platform. Rather, it has to be understood as an extension of giving your whole suite of Google services an identity, specificity, and personality.
With all that said, Facebook is starting to get a little too busy for my taste. Does anyone else feel the same way? Mark Zuckerberg has known how much to “push it” for years, but my own suspicion is that he has to be very careful of a rapid implosion of usage due to feedback loops if he moves beyond a particular threshold.
A few stray thoughts, which might be worth having a discussion about. Unless one wants to go Soylent Green or Logan’s Run, both the proponents of stable/declining world population and continued growth have to look to technology. More people means more economic productivity is needed to keep everyone afloat ahead of the Malthusian trap. But even if the population stabilizes, there is still the major problem of the rising dependency fraction due to aging. The only way that we can keep up is by increasing the productivity of the workforce. This is especially going to be an issue in a nation like China because of the one-child policy (which practically turned out to be a 1.5-child policy). Either “working age” people have to work more productively, or health care has to reduce late-in-life morbidity so that people can work longer and keep the ratio of retirees to workers reasonable.
Secondarily, I’m kind of getting sick of the fact that everyone’s battery is dying. My battery is dying, your battery is dying. “Hey, can I call you later? My battery is dying.” With the rising penetration of smartphones, batteries are dying all over the place. I remember a time, back in 2006, when I must have been charging my phone once a week or something! Those were the days. I know that smartphone technology is a step forward, but it goes to show how difficult it is to make a good battery, insofar as we’ve taken a massive step backward in terms of the battery life that we had come to expect. There’s a creeping element of zero sum in all of this; more features means less juice per feature.
I use Google Trends a lot, but I don’t necessarily know if it’s telling me anything useful. So I decided to see if it might correlate well with browser share data. I know that W3Schools has been tracking their own stats for years, so I took their data from September of 2008 to September of 2011, and plotted the browser share. Below it are some trends from Google. Notice the pattern for Firefox and Chrome in particular.
You’ve wondered I’m sure. I have. Why are restaurant websites so horrifically bad?:
…The rest of the Web long ago did away with auto-playing music, Flash buttons and menus, and elaborate intro pages, but restaurant sites seem stuck in 1999. The problem is getting worse in the age of the mobile Web—Flash doesn’t work on Apple’s devices, and while some of these sites do load on non-Apple smartphones, they take forever to do so, and their finicky…
I did get a plausible-sounding explanation of the design process from Tom Bohan, who heads up Menupages, the fantastic site that lists menus of restaurants in several large cities. “Say you’re a designer and you’ve got to demo a site you’ve spent two months creating,” Bohan explains. “Your client is someone in their 50s who runs a restaurant but is not very in tune with technology. What’s going to impress them more: Something with music and moving images, something that looks very fancy to someone who doesn’t know about optimizing the Web for consumer use, or if you show them a bare-bones site that just lists all the information? I bet it would be the former—they would think it’s great and money well spent.”
Not coincidentally, designers make more money to create a complicated, multipage Flash site than one that tells you everything you want to know on one page….
I can comprehend the reliance on old-school designers who overcharge for a series of static pages if you need fine-grained control of a visual look & feel which aligns with your class and elegance. But a lot of high-end restaurant websites look like they were outsourced to the Insane Clown Posse and their stylists. Most fine-dining American eateries I’ve been to tend to avoid, in their meatspace ambiance, the bright and flashy aesthetic you might find at IHOP or Red Robin, but that seems less assured when it comes to their cyberface.
Carl pointed me to this really strange interview in New Scientist, Susan Greenfield: Living online is changing our brains. If you removed it from the New Scientist website and put it on The Onion it wouldn’t really need much editing. Some of the things Susan Greenfield says make you scratch your head. First paragraph:
You think that digital technology is having an impact on our brains. How do you respond to those who say there’s no evidence for this?
When people say there is no evidence, you can turn that back and say, what kind of evidence would you imagine there would be? Are we going to have to wait for 20 years and see that people are different from previous generations? Sometimes you can’t just go into a lab and get the evidence overnight. I think there are enough pointers that we should be talking about this rather than stressing about not being able to replicate things in a lab instantly.
Happy-slapping? Seriously? That was so mid-2000s. It’s going to be really hard to escape the oncoming rush of the “wall of information” in the near future. If it drives our world insane, there are always the residents of North Sentinel Island.
Well-known coder and activist Aaron Swartz was arrested Tuesday, charged with violating federal hacking laws for downloading millions of academic articles from a subscription database service that MIT had given him access to. If convicted, Swartz faces up to 35 years in prison and a $1 million fine.
Swartz, the 24-year-old executive director of Demand Progress, has a history of downloading massive data sets, both to use in research and to release public domain documents from behind paywalls. Swartz, who was aware of the investigation, turned himself in Tuesday.
The grand jury indictment accuses Swartz of evading MIT’s attempts to kick his laptop off the network while downloading more than four million documents from JSTOR, a not-for-profit organization that provides searchable, digitized copies of academic journals. The scraping, which took place from September 2010 to January 2011 via MIT’s network, was invasive enough to bring down JSTOR’s servers on several occasions.
This is great. At a minimum Google+ could become like the Chrome browser. It might not attain a dominant market share position (though Chrome already has a higher share than IE on this site, and others, with a tech-savvy audience), but it could push the edge of innovation. I don’t have a problem with Facebook, but with the collapse of MySpace years ago it has had a de facto monopoly in the general social networking space.
Finally, an anecdatum: a friend noticed that six of his contacts deactivated their Facebook accounts in the past few days. He didn’t know why, but there’s a high probability that these may be the types who just want to start over like Ezra Klein suggested.
Amazon: Kindle books outselling all print books. This is more something I’d put on Pinboard, but it warrants noting more prominently. The figure itself isn’t important, but it is a marker for a silent transition occurring as we shift mediums.
I experienced a very strange and perhaps illuminating dream last night. I’ve had an HTC Evo 4G since Christmas (for what it’s worth, Sprint’s customer service has been horrible, but the phone itself is great). Before that I had phones with internet access, but which were more primitive. At this point I probably can’t imagine what life was like without a phone like the Evo. That was evident in my dream.
Here’s what happened. Apparently I had left the phone in my pocket while doing laundry, which meant that it was damaged. For reasons which the dream-gods did not explain to me, as a replacement I received an old-school gray Nokia of some sort, the likes of which I hadn’t encountered since the mid-2000s. Here’s the kicker: I looked at my replacement phone, and wondered out loud, “OK, so what am I supposed to do with this phone? All it can do is call people. I don’t even like calling people!”
Obviously I still refer to my phone as a phone. But at this point I don’t see its primary role as sending and receiving phone calls! (in fact, if I want more reliable voice I’ll probably go with Skype due to issues with reception) I am reminded of the origin of the term ‘stationery’.
In 2006-2007 I worked at a firm which had its own web application, and “web 2.0” was a big term in the marketing materials. This article in DealB%k, Is It a New Tech Bubble? Let’s See if It Pops, made me wonder what happened to that term.
Here’s Google Trends: