It’s good to be back to blogging after a brief hiatus. As part of my return to some minimal level of leisure, I was finally able to watch the movie Moon (directed and co-written by Duncan Jones) and I’m glad that I did. (Alert: many spoilers ahead). Like all worthwhile art, it leaves nagging questions to ponder after experiencing it. It also gives me another chance to revisit questions about how technology may change our sense of identity, which I’ve blogged a bit about in the past.
A brief synopsis: Having run out of energy on Earth, humanity has gone to the Moon to extract helium-3 for powering the home planet. The movie begins with shots outside of a helium-3 extraction plant on the Moon. It’s a station manned by one worker, Sam, and his artificial intelligence helper, GERTY. Sam starts hallucinating near the end of his three-year contract, and during one of these hallucinations drives his rover into a helium-3 harvester. The collision causes the cab to start losing air and we leave Sam just as he gets his helmet on. Back in the infirmary of the base station, GERTY awakens Sam and asks if he remembers the accident. Sam says no. Sam starts to get suspicious after overhearing GERTY being instructed by the station’s owners not to let Sam leave the base.
So Sam tricks GERTY into letting him go out of the station in one of the rovers. He finds the first Sam, who has crashed, and brings him back to nurse him to health. The new Sam decides that the chronic communication difficulties—which have only permitted pre-recorded messages from his wife and daughter waiting on Earth for his return—might be an elaborate deception. He goes far enough off base to get outside the range of the jamming antennas and calls home to Earth, discovering that his daughter, an infant in the pre-recorded messages, is now a teenager, that his wife has died—and that her father, Sam, is there on Earth.
The sinister truth of the helium-3 base is now fully disclosed. The “first” Sam was himself a clone (copied in every respect, memories included, not simply a genetic clone). Evidently, the copying occurred early in Sam 1’s stay at the station. Each clone is awakened with the thought of returning home to his family in three years. What actually happens at the end of those three years is that the clone is incinerated in the return capsule and a new clone is awakened, beginning the cycle anew.
Near the end of the film comes a striking moment. The Sam who nearly died in the earlier crash has grown increasingly sick and will die soon. The two Sams realize that the bosses of the station are coming to kill them both and activate a new clone. They hatch a plan in which one of them returns to Earth in one of the helium-3 delivery shuttles. After newly awakened Sam tells dying Sam that he deserves to go back—“you did the three years”—dying Sam disagrees, and tells new Sam that he should return to Earth, because dying Sam is too sick to make it. This is a really powerful moment in the film, and our feelings about it are helpful in untangling our own tangled thoughts about identity and death.
Dying Sam’s sacrifice seems less significant than, say, me telling an unrelated co-worker to take the capsule home. There are suggestive biological resonances to this feeling. Think of how, in social insects like bees, individuals give up the right to reproduce in order to facilitate the genetic continuity of close relatives. So, would the fact that you have a copy of yourself—one that diverged from you even quite some time ago (in this case, three years of solitude on a Moon base)—ease your anxiety about dying?
Consider the following thought experiment. Rather than three-year stints, the clones of Moon get replaced on a 24-hour cycle. You fall asleep. Your memories and any other physical changes from the “base copy” get noted and propagated to a new clone. You are then, in Moon-like fashion, vaporized, and in the morning, a new clone is awakened after these changes have been “installed.” You awake, none the wiser for this change in body. Consciousness is not continuous, of course, and discontinuities such as sleep are natural places where we can do the “body change” business with minimal mess (not unlike what was depicted in the fantastic sci-fi film Dark City). The gap between what actually happens in sleep and this scenario seems too small to quibble over. Or is it?
As experiences and other physical changes separate you from your base clone over weeks, months, and years, your ability to separate your own identity from that of the clone grows in step. It is like a core scene in the play “On Ego,” in which a Star Trek-like teleporter fails to vaporize the original version of the protagonist, so that two protagonists now exist. From that moment forward, what was once one person is now two people, with increasingly divergent senses of self and experiences.
Your sense of how much you would sacrifice for your copy might be a good test for how different you feel from him or her. Your sense of how much comfort you would feel in dying, knowing that this other version of you lives on, might be another good test for how much of your identity has leaked out of the lump of tissue that has hitherto conveniently been bounded off by your jacket of skin. Perhaps in the first few days after such a teleporter accident, you would feel you could give up your life for your copy (and be relaxed about the idea of dying so that one of you can go on); after a few weeks, maybe something less than your life; and after some years have passed, perhaps you’d feel you could sacrifice nothing more than you would for a close friend. (Topic for a future movie and post: Does forming a close friendship involve blurring and merging of your two identities?)
Here are some final thought experiments for you to puzzle over. The great anthropologist Mary Douglas wrote in her paper “The Forensic Self”:
[In] western culture, whatever we say seriously about persons and selfhood needs to some extent to be compatible with what a jury in a court of law will accept.
While doing graduate work in philosophy with Ian Hacking many years ago, I applied this idea to the issue of multiple personality disorder (MPD), to see how the judicial system dealt with MPD defenses. The courts have mostly taken a view put most eloquently by Judge Birdsong in the case of Georgia v. Kirkland: “…we will not begin to parcel criminal accountability out among the various inhabitants of the mind.”
Rather than MPD, let’s see where we get when we apply Douglas’ insight to the problem of multiple person disorder: having multiple copies of yourself present at once. What if, just prior to copying, you had formed a criminal intent? Because of slightly different post-copying experiences, one of you now decides to stop the other. Would it be ethical to kill your copy? What would ethics require of how the two of you treat one another? After all, we have sometimes odd ideas of what we are allowed to do to ourselves: yes to smoking ourselves to death, no to elective limb amputations. These confusions would only be amplified by the peculiar situation of having multiple person disorder. Or by being the victim of a sinister plot by Lunar Industries on the Moon.