Today I was missing my daughter, so I decided to Skype with her on my phone. The phone has a camera which can record video, so I can talk to her, and if she gets bored I’ll show her something besides my face. I take this for granted, but it is interesting to reflect that my “video phone” is actually just a regular phone on which I installed a third-party application to enable two-way video calls. It’s a banal and marginal use for the device. Information technology is far more ubiquitous than the occasional video conference.
With the imminent demise of Google Reader there’s a lot of talk about how this is a death blow for RSS. I don’t really get this. Does anyone remember the stuff about “the death of comments” in the late 2000s? E.g.:
It’s sad and disappointing but the death of blog comments may be near. It’s getting harder and harder to fight against the hordes of spammers and mediocrity and animosity out there.
That’s from 2007. Granted, many blogs and media organizations have worthless comments sections. But not all, by any stretch. And arguably technology like Disqus has made comments more, not less, relevant, due to features like “up voting” (I’m aware that Slashdot had this a long time ago!). Around the same time there was also talk of the “death of email.” Like blog comments, email is still around.
One of the topics that occasionally crops up in personal conversations with friends is the issue of the rate of technological change. Yet the longer I live, the more I feel that many of these discussions are predicated on the punctuated and precise emergence of technologies at a specific point in time (e.g., the web in 1995). Consider the “smart phone,” or more accurately, the phone as we understand it today. When the iPhone came out it was criticized for not being quite so radical or revolutionary, but I think the idea of the smart phone with a data plan has transformed the way we live our lives. It’s just not as sexy as more salient technologies. Sometimes there are even technologies which are obviously radical, but whose importance seems to bleed only gradually into our lives. Within the next 5 years I assume that civilian “drones” will become ubiquitous and banal, whether we like them or not.
The rise of drones has the potential to radically centralize power and control. 3-D printing, on the other hand, pushes in the other direction. The apotheosis of this idea is a firm called Defcad, which made a splash at South by Southwest. Defcad emerged out of conflicts in the “Maker” subculture. Below is the introductory video of the founder:
After my last post on the inevitable nature of the shift of the book toward electronic formats, I revisited the data which highlights the decline in sales of e-readers. Some of this is probably competition with tablets. But I’ve had the same Kindle for two and a half years. I got a newer version of the Kindle for my wife, but have seen no need to upgrade myself (and I got a Kindle Fire for my daughter). Why? The point of e-readers is the content, not the delivery. This reiterates that “e-books” aren’t revolutionary, they’re evolutionary, and the fixation on the technology is going to be transient. A true revolution in information transmission and delivery would be a direct data port, which would transform “publishing” in a much deeper fashion than the digitization of type and script.
I’ve had a Kindle for a few years now. I read a lot on it. And yet I observed something recently: I’ve stopped going to the library much. This is a big deal for me…probably since the age of 7 I’ve clocked at least one visit to a public library per week. I never turned books in past due, because of the frequency with which I patronized the public or university libraries I’ve had access to. Until recently. Now on occasion books go overdue, because I don’t go very often.
In the short term the Kindle has been a boon. But I’m not sure if it’s good for us in the long term. I’d rather pay more for a device which allowed easier use of different formats, as well as looser distribution policies.
Even Twitter? Can Twitter be declining? Over at the Atlantic‘s Technology Channel I note that my own Twitter conversations are not quite as dynamic as they once were, and speculate about why that might be. I didn’t say this in the post, but I wonder whether it might have something to do with people who enjoy online conversations also enjoying new tools and toys: perhaps we get tired of Twitter not because it has a deficiency, but just because it’s been around a while. I’m not suggesting this in lieu of the explanations I offer there, but in addition to them.
I think this is an artifact of the fact that Alan Jacobs seems to have been a very early Twitter adopter. Here’s Google Trends for the USA for searches for Twitter:
Last weekend I was at the Singularity Summit for a few days. There were interesting speakers, but the reality is that quite often a talk given at a conference has been given elsewhere, and there isn’t going to be much “value-add” in the Q & A, which is often limited and constrained. No, the point of the conference is to meet interesting people, and there were some conference goers who didn’t go to any talks at all, but simply milled around the lobby, talking to whoever they chanced upon.
I spent a lot of the conference talking about genomics, and answering questions about genomics, if I thought I could give a precise, accurate, and competent answer (e.g., I dodged any microbiome-related questions because I don’t know much about that). Perhaps more curiously, in the course of talking about personal genomics, issues relating to my daughter’s genotype came to the fore, and I would ask if my interlocutor had seen “the lion.” By the end of the conference a substantial proportion of the attendees had seen the lion.
This included a polite Estonian physicist. I spent about 20 minutes talking to him and his wife about personal genomics (since he was a physicist he grokked abstract and complex explanations rather quickly), and eventually I had to show him the lion. But during the course of the whole conference he was the only one who had a counter-response: he pulled up a photo of his 5 children! Touché! Only as I was leaving did I realize that I’d been talking the ear off of Jaan Tallinn, the lead developer of Skype. For much of the conference Tallinn stood like an impassive Nordic sentinel, engaging in discussions with half a dozen individuals in a circle (his wife was often at his side, though she also engaged people by herself). Some extremely successful and wealthy people manifest a certain reticence, rightly suspicious that others may attempt to cultivate them for personal advantage. Tallinn seems to be immune to this syndrome. His manner and affect resemble those of a graduate student. He was there to learn and listen, and was exceedingly patient even with the sort of monomaniacal personality which dominated conference attendees (I plead guilty!).
I didn’t even notice this when it happened: Founders of Diaspora, Intended as the Anti-Facebook, Move On. I had been skeptical about the project’s prospects, though, after one of the co-founders committed suicide. One of the reasons I took an interest is that I gave $50 to the project when it first made a media splash…but honestly I always thought the chances of success were pretty low. The chances of many worthwhile endeavors are low.
For many, IVF smacked of a moral overstep — or at least of a potential one. In a 1974 article headlined “The Embryo Sweepstakes,” The New York Times considered the ethical implications of what it called “the brave new baby”: the child “conceived in a test tube and then planted in a womb.” (The scare phrase in that being not “test tube” so much as “a womb” and its menacingly indefinite article.) And no less a luminary than James Watson — yes, that James Watson — publicly decried the procedure, telling a Congressional committee in 1974 that a successful embryo transplant would lead to “all sorts of bad scenarios.”
Specifically, he predicted: “All hell will break loose, politically and morally, all over the world.”
The past is not always prologue, but it’s very instructive to look at newspapers from a given time period and see what the public mood was. Fear is a natural human reaction to new technology. My general bias is that technology itself usually isn’t as disruptive as social innovation. That being said, when technology is genuinely revolutionary it can have a much bigger impact than social or institutional shifts.
There’s a wide-ranging story in LA Weekly on the decline of 35mm film. It covers a lot of angles, but this one issue jumped out at me:
No wonder, then, that directors like Christopher Nolan worry that if 35mm film dies, so will the gold standard of how movies are made. Film cameras require reloading every 10 minutes. They teach discipline. Digital cameras can shoot far longer, much to the dismay of actors like Robert Downey Jr. — who, rumor has it, protests by leaving bottles of urine on set.
“Because when you hear the camera whirring, you know that money is going through it,” Wright says. “There’s a respectfulness that comes when you’re burning up film.”
This particular variant of the critique of new technologies is very old. It is well known that writing and printing both ushered in warnings that these were simply crutches, and might diminish mental acuity. But I’m 99% sure that when the bow & arrow became common, some hunters warned that the skills and traditions associated with the atlatl would decay. The piece highlights some genuine advantages of analog over digital. But I do not think making filming more difficult is an advantage, to state the obvious.
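As a back-of-the-envelope check on the quoted reload figure, here is a minimal sketch, assuming standard 35mm values (16 frames per foot, 24 frames per second) and a 1,000-foot magazine; actual magazine sizes vary, so treat this as illustrative arithmetic rather than anything from the LA Weekly piece:

```python
# Back-of-the-envelope: how long does a 35mm film magazine last?
# Standard 35mm values: 16 frames per foot, 24 frames per second.
FRAMES_PER_FOOT = 16
FRAMES_PER_SECOND = 24
MAGAZINE_FEET = 1000  # a common magazine size; 400 ft loads are also typical

feet_per_minute = FRAMES_PER_SECOND * 60 / FRAMES_PER_FOOT  # 90 ft/min
runtime_minutes = MAGAZINE_FEET / feet_per_minute

print(f"A {MAGAZINE_FEET} ft magazine runs ~{runtime_minutes:.0f} minutes")
# => ~11 minutes, consistent with the "reloading every 10 minutes" quoted above
```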
Large companies have overheads, a necessary evil, you say. Overheads need to be managed. And managed they are: Group Managers, Program managers, General managers, together with ‘Senior’ flavours of those and a whole new breed of directors, stakeholders, business owners, relationship leads coupled with their own countless derivatives.
All those meeting-goers are not making anything. Deciding upon and making something is hard. And if this onerous activity has to be done, then hire external consultants for it. It’s easier and less risky.
There is no creative tension, no vision these days. Left to Microsoft’s hands we’d still be toiling on overheating Vista desktops.
This company is becoming the McDonalds of computing. Cheap, mass products, available everywhere. No nutrients, no ideas, no culture. Windows 8 is a fine example. The new Metro interface displays nonstop, trivial updates from Facebook, Twitter, news sites and stock tickers. Streams of raw noise distract users from the moment they login.
A comment below prompted me to recheck the browser stats on the web. People are now starting to give Google crap for not having really hit the jackpot on anything since Gmail, especially after the flubs with Google Wave and Buzz, and the mixed reviews at best for Google+. But it looks like Chrome may actually reach a plurality this year. Back in the day (i.e., the 1990s) control of the majority browser share was actually a big deal. My earlier hunch that eventually Chrome would start eating into IE’s user base more than Firefox’s seems to be panning out.
Here’s a similar chart from the w3schools website (because it’s a tech-oriented site IE automatically suffers a penalty, but the overall trends are similar):
I found out today that a private equity firm has purchased the majority of the Yellow Pages from AT&T. Which prompts me to ask: when was the last time you used the Yellow Pages? A pay phone? In a similar vein, Google And The Death Of Getting Lost. In 10 years (2001 to 2011) wireless penetration in the USA went from ~40 percent to ~100 percent.* This is the difference between arranging a rendezvous ahead of time in precise detail, and being confident that you can just end it with “I’ll call you.”
Image credit: Wikipedia
* This is actually calculated by comparing the number of phones to people. Since some people have multiple phones, and businesses purchase them for their employees, “real” penetration is somewhat less than this. I suspect that the overestimate is larger for 2001, as a larger proportion of phones were probably business-related.
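To make the footnote concrete, here is a minimal sketch with hypothetical numbers (nothing below comes from actual FCC or industry data; the counts are purely illustrative):

```python
# Wireless "penetration" is usually reported as subscriptions / population,
# which overstates the share of people who actually carry a phone once
# second devices and business lines are counted.
# All numbers below are hypothetical, for illustration only.

population = 100
subscriptions = 100   # includes second phones and business-issued lines
extra_lines = 15      # subscriptions beyond a person's first phone

reported_penetration = subscriptions / population        # 1.00 -> "100%"
unique_users = subscriptions - extra_lines
real_penetration = unique_users / population             # 0.85 -> "85%"

print(f"Reported: {reported_penetration:.0%}, real: {real_penetration:.0%}")
```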
Google+ became the fastest growing social network within months of its debut last June, but a recent study casts doubt on whether most of its users are spending much time on the site.
According to ComScore, users spent an average of just 3.3 minutes on Google+ in the month of January, a decline from its recent figures and a tiny sliver of Facebook’s total.
I accept the argument of friends that G+ and Facebook are fundamentally different, and that Google’s aim here is not to replicate Facebook. But I also think that this is well short of what Google was intending for G+ at this stage; otherwise they would surely have quashed the media bubble and hyperbole which crested last summer. G+ is obviously much better than Buzz. But that’s a low bar.
Apple executives say that going overseas, at this point, is their only option. One former executive described how the company relied upon a Chinese factory to revamp iPhone manufacturing just weeks before the device was due on shelves. Apple had redesigned the iPhone’s screen at the last minute, forcing an assembly line overhaul. New screens began arriving at the plant near midnight.
A foreman immediately roused 8,000 workers inside the company’s dormitories, according to the executive. Each employee was given a biscuit and a cup of tea, guided to a workstation and within half an hour started a 12-hour shift fitting glass screens into beveled frames. Within 96 hours, the plant was producing over 10,000 iPhones a day.
“The speed and flexibility is breathtaking,” the executive said. “There’s no American plant that can match that.”
The story emphasizes that labor costs are not the primary issue here. There is the natural discussion of skill levels, and the sheer number of Chinese workers coming online. But there simply is no way that Foxconn City could exist in the United States today. There is no way I can deny the massive quality of life improvements in China over the past generation. But the flip side of this is that a way of life has now emerged organically in places like Shenzhen which is rather reminiscent of late 19th and early 20th century dystopian visions of the industrial future.
Believe it or not, I am probably mildly skeptical about the possibilities for the 21st century as a canvas for human flourishing. That is one reason I like to emphasize the positive: it is important for me not to get caught up in my own bias. Over the last two human generations (~50 years) mean world life expectancy has gone from ~53 to ~69. This is easy for me to forget concretely, because I come from a relatively long-lived family. Though all were born in British India and died in Bangladesh, my grandparents lived to the ages of 75, 100, 80, and 80. My grandparent who died at the age of 75 still lived 25 years longer than life expectancy in Bangladesh in the year he died.
Today I see a headline in The New York Times, Majority of Chinese Now Live in Cities. For some reason I was prompted to look up the Wikipedia entry for Shenzhen, a city of 350,000 in 1982, which is now at 10 million. The image below of Shenzhen captures for me the poignant banality of the future present. On the one hand it is nothing special, a typical “world city” skyline. But there is also an aspect redolent of the soft-focus depictions of the cities of the future in the children’s books I would read in the 1980s. The photo is proof of nothing. Rather, it is an illustration of fact.
Out of curiosity, how many readers are switching mostly to Kindle books? I find myself doing this, not for any ideological or conscious reason. Rather, cost and portability are both major upsides of the Kindle. I also find that “impulse buys” are easier for me on the Kindle (I purchased The Great Sea and Civilization: The West and the Rest, the latter mostly to see if Pankaj Mishra actually did read the book). The Kindle has been around for a few years, but it looks like web traffic related to it is still increasing radically. I compare it to the iPad below.
Most readers know that I’ve been tracking Google Trends data on Facebook for years. Now, on January 1, 2012, it seems pretty obvious that in the international aggregate this past year was when Facebook finally hit saturation in terms of “mindshare.”
But there are interesting international differences.
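For anyone who wants to replicate this kind of tracking, here is a minimal sketch using the unofficial pytrends library — a third-party scraper, not an official Google API. The library, the timeframe, and the country codes chosen are my assumptions for illustration, not something from the original posts:

```python
# Sketch: pull Google Trends search interest for "Facebook", worldwide and
# per country, via the unofficial pytrends library (pip install pytrends).
from pytrends.request import TrendReq

pytrends = TrendReq(hl="en-US")

# Worldwide interest over time (the "international aggregate" curve)
pytrends.build_payload(["Facebook"], timeframe="all", geo="")
worldwide = pytrends.interest_over_time()
print(worldwide.tail())

# Compare a few countries to see the international differences
for country in ["US", "BR", "IN"]:  # illustrative choices
    pytrends.build_payload(["Facebook"], timeframe="all", geo=country)
    latest = pytrends.interest_over_time()["Facebook"].iloc[-1]
    print(country, latest)
```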