Believe it or not, I am probably mildly skeptical about the possibilities of the 21st century as a canvas for human flourishing. That is one reason I like to emphasize the positive: it is important for me not to get caught up in my own bias. Over the last two human generations (50 years) mean world life expectancy has gone from ~53 to ~69. This is easy for me to forget concretely because I come from a relatively long-lived family. Though all were born in British India and died in Bangladesh, my grandparents lived to ages of 75, 100, 80, and 80. The grandparent who died at the age of 75 still lived 25 years longer than life expectancy in Bangladesh in the year he died.
Today I see a headline in The New York Times, Majority of Chinese Now Live in Cities. For some reason I was prompted to look up the Wikipedia entry for Shenzhen, a city of 350,000 in 1982, which is now at 10 million. The image below of Shenzhen captures for me the poignant banality of the future present. On the one hand it is nothing special, a typical “world city” skyline. But there is also an aspect redolent of the soft-focus depictions of the cities of the future in the children’s books I would read in the 1980s. The photo is proof of nothing. Rather, it is an illustration of fact.
Out of curiosity, how many readers are switching mostly to Kindle books? I find myself doing this. Not for any ideological or conscious reason. Rather, cost and portability are both major upsides of the Kindle. I also find that “impulse buys” are easier for me on the Kindle (I purchased The Great Sea and Civilization: The West and the Rest, the latter mostly to see if Pankaj Mishra actually did read the book). The Kindle has been around for a few years, but it looks like web traffic related to it is still increasing radically. I compare it to the iPad below.
Most readers know that I’ve been tracking Google Trends data on Facebook for years. Now, on January 1, 2012, it seems pretty obvious that in the international aggregate this was the year that Facebook finally hit saturation in terms of “mindshare.”
But there are interesting international differences.
This is probably old news to you, and I’ve read about Digg’s problems in the tech media, but I just realized how much reddit has eclipsed Digg in referral traffic. I’ve always gotten way more attention from reddit (some science bloggers have told me that reddit readers are a “smarter set”), but when I did get Digg bumps they were often of greater magnitude. No more. Not only are referrals from Digg much rarer than they used to be, but they aren’t as significant as those from reddit.
So of course I checked out Google Trends:
There’s been extensive reporting in the media on the rise of Chrome, and the decline of Firefox, based on StatCounter data. I’ve got access to analytics for four weblogs, one of them going back to 2006. I see the same trend. It’s real. What I don’t understand is the lack of acknowledgment of the continued stagnation and decline of the Internet Explorer franchise. The magnitude of the downward slope of IE usage is about twice as large as that of Firefox. There is presumably a floor of IE usage among those who don’t download Firefox, Chrome, etc., when they get their Windows machine. But I would contend there is also a floor of Firefox users, who are attached to particular extensions and features which are unique to Firefox. And these Firefox users are probably much more fundamentally loyal to their product than IE users. So it will be interesting to observe the long-term trend, and see whether Chrome eats into IE or Firefox usage more in the future.
I hadn’t given the issue much thought, but that’s what Randall Parker asserted in the comments below.
First, let’s look at Google Trends search traffic with Facebook as well:
Facebook dwarfs twitter, so you can’t make out twitter’s trend at all. So here it is with only twitter:
It’s been a while…what’s going on with Google+? I think we can conclude it isn’t a Facebook killer in anything like the medium term. After moving away from Facebook I started posting there again because almost all of my friends in “flesh space” simply don’t use Google+. Rather, Google+ has become a more elaborate extension of my twitter circle. I’ve got over 2,100 people who’ve added me to their Google+ circles, and only 1,600 twitter followers. But I don’t post much on Google+ at this point. For someone with my amount of time and interest, two social networks seem optimal as complements. That being said, as many have noted, Google+ is more than just a social networking platform. Rather, it has to be understood as an extension of giving your whole suite of Google services an identity, specificity, and personality.
With all that said, Facebook is starting to get a little too busy for my taste. Does anyone else feel the same way? Mark Zuckerberg has known how much to “push it” for years, but my own suspicion is that he has to be very careful of a rapid implosion of usage due to feedback loops if he moves beyond a particular threshold.
CNNMoney reports that Ilya Zhitomirskiy, one of four co-founders of the social networking site Diaspora, died over the weekend, and that suicide was the likely cause of death. He was 22.
I gave some money to Diaspora. It seems like it didn’t pan out. So? But that’s easy for me to say; I just gave a little money. I suspect many of us have faced with panic the likelihood of failure. Personalities differ in how we process that failure. I remember in college reading the story of a kid who killed himself because of shame over a credit card debt on the order of a few thousand dollars! We make a big deal today about how failure is critical to ultimate success, but we don’t put enough spotlight on the day-to-day toll that failure necessarily takes on many people….
Slate has an interesting retrospective on why Second Life never fulfilled the hype. My own caution was rooted in an argument from a tech journalist who pointed out that the exact same things stated about Second Life were once stated about MUDs. He simply repeated quotes from stories in the early to mid-1990s and compared them to those in 2006 to illustrate how the same passages were being recycled again. He knew about the power of the hype, because he participated in the first wave before it faded. Of course Second Life was much more sophisticated than any MUD, but it struck me that when the same reasoning is applied to a more perfected version of the phenomenon the same outcome may ensue.
In other news, Facebook’s plateau continues….
A few stray thoughts, which might be worth having a discussion about. Unless one wants to go Soylent Green or Logan’s Run, both the proponents of a stable/declining world population and those of continued growth have to look to technology. More people means more economic productivity is needed to keep everyone afloat ahead of the Malthusian trap. But even if the population stabilizes, there is still the major problem of the rising dependency fraction due to aging. The only way that we can keep up is by increasing the productivity of the work force. This is especially going to be an issue in a nation like China because of the one-child policy (which practically turned out to be a 1.5-child policy). Either “working age” people have to work more productively, or health care has to reduce late-in-life morbidity so that people can work longer and keep the ratio of retirees to workers reasonable.
Secondarily, I’m kind of getting sick of the fact that everyone’s battery is dying. My battery is dying, your battery is dying. “Hey, can I call you later? My battery is dying.” With the rising penetration of smartphones, batteries are dying all over the place. I remember a time, back in 2006, when I must have been charging my phone once a week or something! Those were the days. I know that smartphone technology is a step forward, but it goes to show how difficult it is to make a good battery, insofar as we’ve taken a massive step backward in terms of the battery life that we had come to expect. There’s a creeping element of zero sum in all of this; more features means less juice per feature.
John McCarthy has died. Sadly, I was expecting this; I was told by someone in Stanford’s computer science department that McCarthy was still teaching courses in 2008, but that he was in obviously bad health. One of the major downsides of the incredible information flow in the internet age is that you often hear through the grapevine that eminent so-and-so is ill, and have to prepare yourself years ahead for the inevitable. We all die, but it seems starker in the case of those individuals who have grasped upon a fragment of the sort of immortality given to Gilgamesh.
In the early to mid-2000s I had some conversations and arguments with McCarthy about the history of Islam and the politics of the Middle East (in hindsight I knew a lot more about the former than I did the latter). He followed Gene Expression now and then in the course of his meanderings around the web. Initially I did not make the connection that this was the John McCarthy, which was especially ironic in that I was playing at learning Lisp at that moment! Outside of his domains of almost godlike achievement I have to say that McCarthy was a relatively no-nonsense down to earth person from what little I could gather. He was curious about what he didn’t know, and if you weren’t aware that he was one of the most accomplished computer scientists in the world he didn’t seem too keen on cluing you in. My own overall impression was that he was a deep pragmatist and skeptic.
I use Google Trends a lot, but I don’t necessarily know if it’s telling me anything useful. So I decided to see if it might correlate well with browser share data. I know that W3Schools has been tracking their own stats for years, so I took their data from September of 2008 to September of 2011, and plotted the browser share. Below it are some trends from Google. Notice the pattern for Firefox and Chrome in particular.
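The kind of sanity check described above can be sketched numerically: line up a browser’s share series with its Google Trends index over the same months and compute a correlation coefficient. The figures below are invented for illustration, not the actual W3Schools or Trends numbers; this is a minimal Python sketch, not the analysis behind the plots.

```python
# Pearson correlation between two monthly series, e.g. a browser's
# share of visits (%) and its Google Trends search-interest index.
# All values below are hypothetical, for illustration only.

def pearson_r(xs, ys):
    """Plain Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Hypothetical monthly share (%) for a rising browser, and a
# hypothetical Trends index for the same months:
share = [4, 6, 9, 13, 17, 22]
trends = [10, 15, 22, 30, 41, 55]

print(round(pearson_r(share, trends), 3))  # → 0.998
```

A coefficient near 1 for Chrome-like series (and a negative one for a declining browser) would suggest the Trends curve is tracking something real about usage, which is the pattern the plots below hint at.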
TechCrunch has a post up on the declining public usage of Google+. It’s been several months since I’ve been “using” Google+. I put usage in quotes because I am not a big active poster on twitter, Facebook, or Google+. But I do participate passively a fair amount. At this point I can say that Google+ is turning into a very different beast from Facebook for me. I have 70 people in a circle labeled “Friends,” but well over 700 in another labeled “Internet.” The latter category is made up of individuals whom I basically don’t know, but who usually know of me. Recall that I purposely limited the number of individuals whom I invited to Google+. So I’ve been passive the whole time. At this point I suspect that within ~3-4 months, at current rates, I will have more people in my Google+ circles than followers on twitter.
And remember, I raised the funds to defray the cost of a genotyping kit via Google+. That’s worth something. I didn’t get any response on twitter or Facebook. Why? I think because Facebook is strongly biased toward people who I know in real life, not all of whom share my obsession with personal genomics. My twitter followers exhibit a stronger concordance of interests, but still far less than those people who sought me out on Google+.
You’ve wondered I’m sure. I have. Why are restaurant websites so horrifically bad?:
…The rest of the Web long ago did away with auto-playing music, Flash buttons and menus, and elaborate intro pages, but restaurant sites seem stuck in 1999. The problem is getting worse in the age of the mobile Web—Flash doesn’t work on Apple’s devices, and while some of these sites do load on non-Apple smartphones, they take forever to do so, and their finicky…
I did get a plausible-sounding explanation of the design process from Tom Bohan, who heads up Menupages, the fantastic site that lists menus of restaurants in several large cities. “Say you’re a designer and you’ve got to demo a site you’ve spent two months creating,” Bohan explains. “Your client is someone in their 50s who runs a restaurant but is not very in tune with technology. What’s going to impress them more: Something with music and moving images, something that looks very fancy to someone who doesn’t know about optimizing the Web for consumer use, or if you show them a bare-bones site that just lists all the information? I bet it would be the former—they would think it’s great and money well spent.”
Not coincidentally, designers make more money to create a complicated, multipage Flash site than one that tells you everything you want to know on one page….
I can comprehend the reliance on old-school designers who overcharge for a series of static pages if you need fine-grained control of the visual look & feel to align with your class and elegance. But a lot of high-end restaurant websites look like they were outsourced to the Insane Clown Posse and their stylists. Most fine-dining American eateries I’ve been to tend to avoid the bright and flashy aesthetic you might find at IHOP or Red Robin in their meatspace ambiance, but that seems less assured when it comes to their cyberface.
This doesn’t mean that we should stop socializing on the web. But it does suggest that we reconsider the purpose of our online networks. For too long, we’ve imagined technology as a potential substitute for our analog life, as if the phone or Google+ might let us avoid the hassle of getting together in person.
But that won’t happen anytime soon: There is simply too much value in face-to-face contact, in all the body language and implicit information that doesn’t translate to the Internet. (As Mr. Glaeser notes, “Millions of years of evolution have made us into machines for learning from the people next to us.”) Perhaps that’s why Google+ traffic is already declining and the number of American Facebook users has contracted in recent months.
These limitations suggest that the winner of the social network wars won’t be the network that feels the most realistic. Instead of being a substitute for old-fashioned socializing, this network will focus on becoming a better supplement, amplifying the advantages of talking in person.
For years now, we’ve been searching for a technological cure for the inefficiencies of offline interaction. It would be so convenient, after all, if we didn’t have to travel to conferences or commute to the office or meet up with friends. But those inefficiencies are necessary. We can’t fix them because they aren’t broken.
Carl pointed me to this really strange interview in New Scientist, Susan Greenfield: Living online is changing our brains. If you removed it from the New Scientist website and put it in The Onion it wouldn’t really need much editing. Some of the things Susan Greenfield says make you scratch your head. First paragraph:
You think that digital technology is having an impact on our brains. How do you respond to those who say there’s no evidence for this?
When people say there is no evidence, you can turn that back and say, what kind of evidence would you imagine there would be? Are we going to have to wait for 20 years and see that people are different from previous generations? Sometimes you can’t just go into a lab and get the evidence overnight. I think there are enough pointers that we should be talking about this rather than stressing about not being able to replicate things in a lab instantly.
Happy-slapping? Seriously? That was so mid-2000s. It’s going to be really hard to escape the oncoming rush of the “wall of information” in the near future. If it drives our world insane, there are always the residents of North Sentinel Island.
I am only being added to Google+ “circles” at a clip of half a dozen per day. This is off the peak of nearly 20 or so per day a little over a week ago. I’m now at nearly 500 people in my Google circles, though only 5 were individuals whom I added proactively. I honestly have no idea who 2/3 of these people are, though it seems that most of them know me through my blogs. Perhaps ~75 are people I know rather well, though fewer than 50 are people I’ve met in real life (many of these only once or twice). In contrast, on Facebook there are hundreds of people I’ve met and know in real life. Very few of my college or high school friends have “added me” to their circles. In contrast, the people who I am socially engaged with currently have added me. It’s like Google+ is a vast and shallow circle extending outward into my present social space, both explicit (people I know) and implicit (those who know me through my web presence). In contrast, Facebook has more historical depth. Though it’s been around a lot longer too, so the comparison isn’t fair.
Well-known coder and activist Aaron Swartz was arrested Tuesday, charged with violating federal hacking laws for downloading millions of academic articles from a subscription database service that MIT had given him access to. If convicted, Swartz faces up to 35 years in prison and a $1 million fine.
Swartz, the 24-year-old executive director of Demand Progress, has a history of downloading massive data sets, both to use in research and to release public domain documents from behind paywalls. Swartz, who was aware of the investigation, turned himself in Tuesday.
The grand jury indictment accuses Swartz of evading MIT’s attempts to kick his laptop off the network while downloading more than four million documents from JSTOR, a not-for-profit company that provides searchable, digitized copies of academic journals. The scraping, which took place from September 2010 to January 2011 via MIT’s network, was invasive enough to bring down JSTOR’s servers on several occasions.
Over the past few weeks I’ve seen several media stories profiling the rise of Google+ by noting that hoopla also greeted Google Wave and Google Buzz before their expiration as “It” technologies. This caveat was probably more true of Google Wave, which heralded the revolution that no one seemed anxious for (“what if we designed email now?!?!?!”). Buzz was a public relations disaster from its inception. When I first posted on Google+ I asserted that it was not in the same category as Wave or Buzz, and I meant that in a good way. By that I meant that after taking Google+ for a test drive I thought I’d stick around for at least a bit. I didn’t get that sense with Wave, and I proactively shut down Buzz in my Gmail account. But that’s an N of 1, me. Over the past few weeks, though, friends have been joining Google+, and real conversations have been starting. I’ve consciously avoided adding anyone to my Google+ circles proactively; rather, I have been reciprocally adding them. I’m at 300+ now. Right now the people in my circles are much closer in profile to my twitter account than my Facebook. That’s probably not typical, as I am a quasi-public individual (looking at who I share in common with those I’m adding to my circles, it seems that some of my journalist friends and acquaintances are replicating their twitter followings as well, and that’s how people are finding me).
In any case, I have some non-anecdotal data that Google+ is not replicating the paths of Wave or Buzz. Google Trends. It’s early yet, so I don’t think that Google+ has “peaked” in terms of news or search by any means (if it’s successful), but it’s already surpassed the other two offerings:
– I have invited 5 people to the service (as per their requests). And yet I have 163 people in my circles. Right now the rate of people adding me to their circles is increasing.
– At least half the people I don’t even recognize at all. Most of these are obviously people who know me from the blogs I’m associated with in some capacity judging by people we have in common. A grand total of 1 person is someone I know from high school, and this individual I actually got to know much better at university. Otherwise, people I recognize and know tend to be bloggers and my friends in “real life” currently, topped off with a few friends from college. In contrast, Facebook is stacked with a lot of my friends and acquaintances from high school (as well as random people I met at conferences over the years or something).
– I still don’t know how to really use the service or see any strong components of functionality which gives it a comparative advantage over Facebook besides the relative transparency of circles vs. Facebook groups and lists.
Right now I’d say that Google+ does very little, but what it does it does smoothly. Facebook does a lot, but much of the implementation is kludgy. But for someone like me I think Google+’s future role may actually be to replace twitter, judging by how many of the people adding me I vaguely recognize only from twitter!