Earlier today I saw a conversation with William Gibson, the inaugural event of this year’s Chicago Humanities Festival. It took place on the set of an ongoing play on Northwestern University’s campus, mostly cleared off for the event save for two pay phones. This reminder of our technological past joined forces with persistent microphone problems to provide an odd dys-technological backdrop to a conversation about the way our lives are changing under the tremendous force of technological change.
Some of Gibson’s most fascinating comments were about how our era will be regarded by people in the far future. If the Victorians are known for their denial of the reality of sex, Gibson said, we will be known for our odd fixation with distinguishing the real from the virtual.

This comment resonated with me on many levels. Just a couple of weeks before, I had lunch with Craig Mundie, the head of Microsoft Research, prior to a talk he gave at Northwestern. He told us about some new directions they are taking with one of their hottest products, the Kinect, a camera for the Xbox gaming system that can see things in 3D. One new endeavor lets you create 3D avatars that move and talk as you do, in real time, so you can have very realistic virtual meet-ups; this is now available on the Xbox as Avatar Kinect. The second, called Kinect Fusion, generates 3D models of the world around you in real time as you sweep the Kinect around by hand. With such a model of your surroundings, you can start to meld real and virtual in some very fun ways.

In one of his demos, Mundie waved a Kinect around a clay vase on a nearby table. We instantly got an accurate 3D model up on the screen – exciting and impressive from a $150 gizmo. I’ve had to create 3D models of stuff in my own research, and that involved hardware about 100 times more expensive. Even more impressive, Mundie next had the projected 3D model of the vase start to spin, then stuck his hands out in front of the Kinect and used movements of his hands to sculpt it, potter-like. It was wild. All that was needed to complete the trip was a quick 3D print of the result. Further demos showed other ways in which the line between reality and virtuality is being blurred, and it all brought me back to the confluence of real and virtual worlds so well envisioned by the show I advised during its brief life, Caprica.
Gibson’s right. We haven’t yet moved beyond our need to mark what belongs to which world, digital or physical, so we constantly consecrate the distinction with our language. Ironically, some of that very language was created by him: “cyberspace,” a word Gibson coined in his 1982 story “Burning Chrome.” During today’s conversation, led by fellow faculty member and author Bill Savage, Gibson said he’s less interested in the word’s rise than in seeing it die out. He sees its use as a hallmark of our distancing ourselves from who we are as mediated by computer technology. He thinks the term is starting to fall out of use, and he’s happy about that — in his view, there’s no need for a word for a space whose coordinates we are constantly moving through, as we do each time we log on to Twitter, Facebook, Google+, and other digital extensions of self. It’s not cyberspace anymore: it’s our space.
It seemed inevitable that a question about The Singularity would be put to Gibson in the Q&A. Sure enough, it was the final note, and Gibson dispatched it with typical incisiveness. The Singularity, he said, is the Geek Rapture. The world will not change in that way. Like our gradual entrance into cyberspace, now complete enough that marking this world with a separate term seems quaint, Gibson said we will eventually find ourselves sitting on the other side of a whole bunch of cool hardware. But he feels that our belief that it will be a sudden, quasi-religious transformation (perhaps with Cylon guns blazing?) is positively 4th-century in its thinking.
The cover of The Postmortal is one of the coolest images I’ve seen in a long time. Death impaled by his own scythe – be not proud, indeed.
The idea behind Drew Magary’s great new book is simple: aging, as it turns out, is caused by one gene. Shut that gene off and you stop aging; accidents and disease are still a problem, but you’ve cured death by natural causes. Any person who gets the Cure simply stops aging. People don’t become younger, they just don’t get older, frozen at their “Cure age.” What happens next?
In an effort to find out, Magary takes us through the life of John Farrell, a New York lawyer who gets the Cure for aging at the age of 29 in the year 2019. From that point on, things go rather poorly for John and the rest of humanity. As one might expect, curing aging doesn’t cure social ills, over-population, ennui, or a host of other human hangups. Mark Frauenfelder has an excellent synopsis of the book over at boingboing.net, and I share his opinions about the book’s bleak tone and high quality.
Magary’s argument throughout the text is essentially this: death creates meaning. Not mortality, but the guaranteed natural death that comes with aging. The certainty that no matter what you do or how you live your life, you will be born, mature, grow old, and die is what creates human meaning. Magary has a point: from the riddle of the Sphinx to Tyler Durden to the final books of Harry Potter, aging and death seem to be at the epicenter of human thought. And I don’t deny that at any moment any one of us could meet a tragic end. Life is precious in part because it is not meant to last.
But here is where I struggle. The Postmortal is not about a post-mortal society; it is about a post-aging society. Lots and lots of people die in Magary’s vision. In fact, he seems to argue that in the absence of natural death, people will not only seek death out but will manufacture the circumstances that produce it, and thereby create meaning. It is only when Farrell’s life is most in peril that he finds purpose in existence. But Farrell is never immortal; no one is. So my question is: is the process of aging as meaningful as the condition of being mortal?
Philip Ball’s new book, Unnatural: The Heretical Idea of Making People, gets into the mythological underpinnings of our concerns about making people. Nature‘s Chris Mason reviews [gated] Unnatural and makes a striking observation:
Even today, Ball points out, societal and cultural debate is pervaded by the belief that technology is intrinsically perverting and thus carries certain penalty. Views that human cloning will be used for social engineering, eradicating one gender or resurrecting undesirable figures from the past, for example, all reflect age-old fears about the consequences of meddling in the ‘unnatural’. Ball warns that, as there is no global ban on human reproductive cloning, there is a strong chance that it will happen. It is thus likely to become a de facto reality without the well-informed debate it deserves.
Let’s unpack that little nugget, because it contains two very important points.
The first point is that many of our fears about advancing science and biotechnology related to the body trigger fundamental, core cultural fears. Leon Kass calls this the “Yuck” reaction or, more eloquently, “The Wisdom of Repugnance.” Kass’ argument is that we are naturally repelled by abhorrent ideas, like torturing babies and eating people. As regular readers of Science Not Fiction know, eating people isn’t always bad.
Well, as it turns out, Leon Kass’ argument that we should trust our gut when it says, “yuck!” is a pretty terrible way to do ethics. Why? Because what is “yuck” to me might be “yum” to you. And we’re back to not knowing if doing something ethically questionable, like cloning people, is morally permissible. Unnatural at least explains why so many people say “yuck” to modifying humans; it is a lesson we’ve been told over and over for millennia in myths and religion.
The second point is that we should be discussing these ideas like rational adults. Biotechnology is progressing so rapidly, and in so many directions, that its course is unpredictable. I make lots of educated guesses and suppositions, but none of what I write here is a prediction or a guarantee. My interest is in figuring out whether or not something like cloning is ethically permissible if we’re ever able to do it. As Ball notes, there is no current global ban on cloning. There is, as it stands, no global ban on most of the transhumanist issues, from eugenics to cognitive enhancers to A.I. to nano-implants. These possible technologies strain the very foundations of many of our philosophies and cultural institutions. If the lack of a global ban means the technology is likely inevitable, we had better figure out how to go about things correctly.
Debate and discussion are essential to making good decisions. Recognizing our old, deep-seated prejudices and biases, such as those against technology and making people, is equally essential. Simply because something is unnatural does not mean it is immoral. But that’s where the discussion starts, not where it ends.
Image of Book Cover via Bodley Head
The chasm between science and the humanities is nowhere more blatant than in the lack of work on how science fiction is reprocessed and used by those of us securely strapped into the laboratory. It’s a topic that attracts some heat: some scientists react to suggestions of inspiration flowing from preceding sci-fi into their creations with all the enthusiasm of a freshman accused of buying a midterm essay off the internet. In Colin Milburn’s new work on ways of thinking about this interaction, he refers to Richard Feynman’s 1959 lecture “There’s Plenty of Room at the Bottom,” a key event in the history of nanotechnology. In it, Feynman describes a pantograph-inspired mechanism for manipulating molecules. It turns out that he most likely got this idea from the story “Waldo” by Robert Heinlein, who in turn probably got it from another science fiction story by Edmond Hamilton. Rejecting the suggestion of influence, chemist Pierre Laszlo writes: “Feynman’s fertile imagination had no need for an outside seed. This particular conjecture [about a link between Feynman and Heinlein] stands on its head Feynman’s whole argument. He proposed devices at the nanoscale as both rational and realistic, around the corner so to say. To propose instead that the technoscience, nanotechnology, belongs to the realm of science-fictional fantasy is gratuitous mythology, with a questionable purpose.”
Ray Bradbury is the last of the great early titans of science fiction still living, now that Isaac Asimov and Arthur C. Clarke have passed. He said he’s attended every Comic-Con since the first one, when he went to the El Cortez Hotel and spoke to a few of the 300 attendees that year. These days, 125,000 people turn out for Comic-Con every year, and I had to wait 30 minutes to get in to see Bradbury speak. He’ll be 90 in August, and he’s hard of hearing, but he’s still sharp, and he’s forgotten nothing.
The Bradbury panel featured Bradbury talking to his biographer, Sam Weller. I’m just going to share select quotes from his remarks. These are in order, but incomplete.
“The Internet to me is a great big goddamn stupid bore.”
“I got a call from a man who wanted to publish my books on the Internet. I told him, prick up your ears and go to hell.”
[Bradbury has met most, if not all, of the Apollo and Gemini astronauts.]
“All those astronauts had read The Martian Chronicles. When they were young men, they read my books and decided they wanted to become astronauts.”
“Dragons? Awesome. Napoleonic wars? Awesome. Together? Even more awesome.” So said Naomi Novik in kicking off yesterday’s Comic-Con panel on combining genres. Novik was so happy with that particular mishmash that she used it in her Temeraire series, which reared its dragony head for the sixth time with the publication of Tongues of Serpents this month.
All of the authors on the panel write in genre-bending styles, but they use the technique differently, and their reasons for doing it vary, too. Novik said her motivation for crossing the streams was simple: “It’s absolutely for short attention spans. The Reese’s Peanut Butter Cup theory.”
Where do budding, even experienced, science-fiction writers learn about the science behind the science fiction? Going back to school and getting a university degree in a scientific discipline is an option, but that’s going to take quite a while. You could short-circuit the process by spending a week at Launch Pad at the University of Wyoming!
Launch Pad 2010 Attendees
Launch Pad is a free, NASA-funded workshop for established writers held in beautiful, high-altitude Laramie, Wyoming. Launch Pad aims to give attendees a “crash course” in modern astronomy through guest lectures and observation sessions on the University of Wyoming’s professional telescopes.
The workshop’s mission is to:
…teach writers of all types about modern science, primarily astronomy, and in turn reach their audiences. We hope to both educate the public and reach the next generation of scientists.
If you’re anything like me, then you probably uttered an audible groan of disdain upon first laying eyes on the title of this book. In a literary landscape already awash in guides on surviving the coming zombie/robot/(insert your own term) apocalypse, the last thing I wanted to read was yet another piece of cloying, pseudo-scientific babble.
I felt compelled to give it a chance, however, after flipping to the authors’ page and realizing, to my great relief, that I was dealing with actual scientists. Scientists with a wry sense of humor and a penchant for science fiction, as I soon found out. Having listened to (or slept through) my fair share of biology lectures during college, I was curious to see how they would approach such a complex topic–and, more importantly, how helpful their “tips” would turn out to be. I’m happy to report that not only have they written one of the most entertaining, succinct guides to biotechnology and cloning, but they have also provided an exhaustive guide on how to best your clone—surely a pressing question for anyone reading this blog.
Before Atlanta-based writer Robert Venditti had a publisher for his graphic novel, Surrogates, Bruce Willis topped his rather fantastical wish list of actors to play the lead. Seven years later, guess who’s starring in the film version.
Surrogates—which opens September 25—features a world where people jack into robotic avatars and send the bots out into the world in their stead (trailer here). Not only was this Venditti’s freshman graphic novel, but it’s also publisher Top Shelf’s first credit as a film producer.
“Bruce Willis is one of the few actors who can do the action sequences and personal moments,” Venditti told me during a break from signing his novel at Comic-Con. “A big theme in the book is the relationship the main character has with his wife. He’s a police detective who can do his job without worrying about its hazards. He’ll go home to his wife and she’ll only interact with him through her surrogate, because she’s uncomfortable with aging. So it’s a strain on their marriage.”
The ubiquity and rapid evolution of technology have made science fiction one of the hardest genres to master. In Friday’s Comic-Con panel “Building Tomorrow’s Technology,” moderator Steve Saffel, a New York editor and publishing consultant, and four sci-fi novelists explored how present technology and the availability of natural resources affect how we imagine the future.
“There was a day and time when authors didn’t worry about making technology work. You just had to have the spaceship work,” said Saffel. “These days, technology is changing at such a rapid rate that the science-fiction writer has to compete with reality in a way they didn’t before. People also understand technology more than in the past, so if it isn’t right, the reader will spot it.”
The panelists—Greg Bear (City at the End of Time), David Williams (Burning Skies), Dani and Eytan Kollin (The Unincorporated Man) and Kirsten Imani Kasai (Ice Song)—cited alternative energy sources, environmental decay, eventual development of quantum computing, and man/machine interfaces in military and biotech arenas as technologies with the most impact on society.
“Biotech is transforming everything,” said Bear. “It has resulted in the removal of the middleman between audience and creator. But removing teachers and experts from the throne is not always such a good thing.”