Try to picture a time machine.
You probably envisioned a tricked-out DeLorean or, perhaps, a blue, spinning phone booth, right? But today, time travel isn’t so much about fast cars or alien technology as it is about tweaking our perception of reality. In fact, if you’re reading this on a tablet, you’re holding a time machine of sorts in your hands right now.
Of course, your iPad won’t actually transport you back in time, but it can serve as a window into another world. Imagine visiting the Parthenon, for example, and when you point your iPad toward the crumbled structure, you see the majestic building, but as it was thousands of years ago. You can even walk toward and around the structure, and so long as you’re peering through the tablet, it’s as if you were walking through the past.
This immersive experience, called augmented reality, has captivated archaeologist Stuart Eve, who is trying to change the way we learn history through the five senses. He’s working on augmented-reality technology that not only visually recreates ancient ruins, but also gives you a sense of what they smelled and sounded like.
Eye-tracking devices sound more like expensive pieces of scientific research equipment than joysticks. Yet if the announcements about the latest Assassin’s Creed game are anything to go by, eye tracking is set to become a commonplace feature of how we interact with computers, and particularly games.
Eye trackers provide computers with a user’s gaze position in real time by tracking the position of their pupils. The trackers can either be worn directly on the face, like glasses, or placed in front of the user, for example beneath a computer monitor.
Eye trackers are usually composed of cameras and infrared lights that illuminate the eyes. Although infrared light is invisible to the human eye, the cameras use it to generate a grayscale image in which the pupil is easily recognizable. From the position of the pupil in the image, the eye tracker’s software can work out where the user’s gaze is directed, whether that’s at a computer screen or out into the world.
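The mapping from a pupil position in the camera image to a gaze point on the screen is typically learned during a short calibration phase, in which the user looks at known on-screen targets. As a rough illustration (not the method of any particular commercial tracker), a minimal version of this idea can be sketched as an affine map fitted by least squares; the calibration values below are made-up numbers for a hypothetical 1920×1080 screen:

```python
# Minimal sketch: map pupil positions (camera-image pixels) to screen
# coordinates via an affine fit learned from calibration points.
# All coordinates here are invented for illustration.
import numpy as np

def fit_affine(pupil_pts, screen_pts):
    """Least-squares affine map from pupil (camera) space to screen space."""
    pupil = np.asarray(pupil_pts, dtype=float)
    screen = np.asarray(screen_pts, dtype=float)
    # Append a constant column so the fit includes a translation term.
    A = np.hstack([pupil, np.ones((len(pupil), 1))])
    coeffs, *_ = np.linalg.lstsq(A, screen, rcond=None)
    return coeffs  # shape (3, 2): 2x2 linear part plus translation row

def gaze_from_pupil(coeffs, pupil_xy):
    """Estimate the on-screen gaze point for one pupil detection."""
    x, y = pupil_xy
    return np.array([x, y, 1.0]) @ coeffs

# Calibration: pupil positions recorded while the user fixated the
# four screen corners of a 1920x1080 display (hypothetical values).
pupil_calib = [(100, 80), (220, 80), (100, 160), (220, 160)]
screen_calib = [(0, 0), (1920, 0), (0, 1080), (1920, 1080)]
coeffs = fit_affine(pupil_calib, screen_calib)

print(gaze_from_pupil(coeffs, (160, 120)))  # pupil at calibration centre -> [960. 540.]
```

Real trackers refine this considerably, compensating for head movement and using corneal reflections from the infrared lights, but the core step is the same: a calibrated geometric mapping from eye image to gaze point.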
But what’s the use? Well, our eyes can reveal a lot about a person’s intentions, thoughts and actions, as they are good indicators of what we’re interested in. In our interactions with others we often subconsciously pick up on cues that the eyes give away. So it’s possible to gather this unconscious information and use it to better understand what the user is thinking, their interests and habits, or to enhance the interaction between them and the computer they’re using.
This article was originally published on The Conversation.
In 1959, John Howard Griffin, a white American writer, underwent medical treatments to change his skin appearance and present himself as a black man. He then traveled through the segregated US south to experience the racism endured daily by millions of black Americans. This unparalleled life experiment provided invaluable insights into how the change in Griffin’s own skin color triggered negative and racist behaviors from his fellow Americans.
But what about the changes that Griffin himself might have experienced? What does it mean to become someone else? How does this affect one’s self? And how can this affect one’s stereotypes, beliefs and racial attitudes? That was the key question that my colleagues and I set out to answer in a series of psychological experiments that looked at the link between our bodies and our sense of who we are.
Mark Changizi is an evolutionary neurobiologist and director of human cognition at 2AI Labs. He is the author of The Brain from 25,000 Feet, The Vision Revolution, and his newest book, Harnessed: How Language and Music Mimicked Nature and Transformed Ape to Man.
Also check out his related commentary on a promotional video for Project Glass, Google’s augmented-reality project.
Experience happens here—from my point of view. It could happen over there, or from a viewpoint of an objective nowhere. But instead it happens from the confines of my own body. In fact, it happens from my eyes (or from a viewpoint right between the eyes). That’s where I am. That’s consciousness central—my “soul.” In fact, a recent study by Christina Starmans at Yale showed that children and adults presume that this “soul” lies in the eyes (even when the eyes are positioned, in cartoon characters, in unusual spots like the chest).
The question I wish to raise here is whether we can teleport our soul, and, specifically, how best we might do it. I’ll suggest that we may be able to get near-complete soul teleportation into the movie (or video game) experience, and we can do so with some fairly simple upgrades to the 3D glasses we already wear in movies.
Consider for starters a simple sort of teleportation: the “rubber arm illusion.” If you place your arm under a table, out of view, and a fake rubber arm rests on the table where your arm usually would be, an experimenter who strokes the rubber arm while simultaneously stroking your real arm in the same spot will trick your brain into believing that the rubber arm is yours. Your arm, or your arm’s “soul,” has “teleported” from your real body under the table into a rubber arm sitting well outside your body.
The same basic trick can transport the rest of the body. If you wore a virtual-reality suit able to touch you in a variety of spots with actuators, you could be presented with a virtual, movie-like experience in which you see your virtual body being touched while the bodysuit simultaneously touches your real body in those same spots. Pretty soon your entire body has teleported itself into the virtual body.
And… Yawn, we all know this. We saw James Cameron’s Avatar, after all, which uses this as the premise.
My question here is not whether such self-teleportation is possible, but whether it may be possible to actually do this in theaters and video games. Soon.