What’s the news: In a study published last week, researchers showed they could reconstruct video clips by watching viewers’ brain activity. The video of the study’s results, below, is pretty amazing, showing the original clips and their reconstructions side by side. How does it work, and does it mean mind-reading is on its way in?
Once again, scientists are trying to read your mind. Specifically, they are using fMRI (functional magnetic resonance imaging) to see what areas of the brain people use to process numbers, and even to determine what number a person just viewed.
Test subjects were shown images with either an amount of something—in this case a bunch of dots—or a numeral like 2, 4, or 6. Scientists suspected that our brains use overlapping areas to process quantities and their symbolic representations; however, the findings suggest that people process the fundamental idea of a quantity differently from the way they process a symbol representing that quantity [Science News]. When a test subject looked at two dots and later at the number 2, different areas of the brain were activated, researchers report in Current Biology.
Yesterday, Honda Research Institute revealed the latest trick from its Asimo robot: It can now respond to commands issued only as thoughts. The Japanese carmaker ran a video of a man imagining four simple movements – raising his right hand, raising his left hand, running and eating – that were then duplicated by Asimo, the company’s humanoid robot. Honda said the technology was not ready for a live demonstration because the test subject might get distracted. A previous demonstration in 2006 required the test subjects to lie motionless in an MRI scanner in order to pick up the signals [Financial Times].
The mind-reading system is non-invasive, meaning that the controller doesn’t have electrodes implanted in his head. Researchers used a specialized helmet instead, which is the first “brain-machine interface” to combine two different techniques for picking up activity in the brain. Sensors in the helmet detect electrical signals through the scalp in the same way as a standard EEG (electroencephalogram). The scientists combined this with another technique called near-infrared spectroscopy, which can be used to monitor changes in blood flow in the brain [The Guardian]. A software program then integrates the two signals and transmits a command to the robot.
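Honda has not published the software behind this system, but the basic idea described above—merge features from two sensing modalities, then map the combined signal to one of a few known commands—can be sketched generically. Everything below (feature counts, the nearest-centroid rule, the synthetic data) is illustrative, not Honda's implementation:

```python
# Hypothetical sketch of fusing two signal modalities (EEG + NIRS) into a
# single command classifier, in the spirit of the helmet described above.
# Feature sizes, the nearest-centroid rule, and the data are made up.
import random

COMMANDS = ["right_hand", "left_hand", "run", "eat"]

def fuse_features(eeg, nirs):
    """Concatenate the two modalities into one feature vector."""
    return list(eeg) + list(nirs)

def nearest_command(features, centroids):
    """Pick the command whose stored prototype is closest (squared distance)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda cmd: dist(features, centroids[cmd]))

# Synthetic "training" prototypes: one average feature vector per command.
random.seed(0)
centroids = {cmd: [random.gauss(i, 0.1) for _ in range(6)]
             for i, cmd in enumerate(COMMANDS)}

# A new trial: three EEG features plus three NIRS features, landing near
# the prototype for imagined running (index 2, values near 2.0).
trial = fuse_features([2.0, 2.1, 1.9], [2.05, 1.95, 2.0])
print(nearest_command(trial, centroids))
```

In practice the hard part is the signal processing upstream of this step—the classifier only works because the four imagined movements produce distinguishable patterns across the combined EEG and blood-flow readings.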
Brain scans can pick up the distinct pattern of brain waves that occur when a person’s attention lapses, and can therefore predict when the person is about to make a mistake, according to a small new study.
Using magnetoencephalography (MEG), researchers monitored the oscillations in brain activity for 14 test subjects. Each subject was asked to take part in a monotonous test in which a random number from one to nine flashed on a screen every two seconds, and they were asked to tap a button as soon as any number except five appeared. The test was so boring that even when a five showed up, the subjects spontaneously hit the button an average of 40% of the time [BBC News].
About a second before subjects committed an error, brain waves in two regions spiked: alpha wave activity in the occipital region was about 25 percent greater than usual, and in the sensorimotor cortex there was a corresponding increase in the brain’s mu wave activity.
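Turned into a rule, the finding above amounts to: flag an upcoming lapse when occipital alpha power runs roughly 25 percent above its baseline and sensorimotor mu power is elevated in step. The sketch below is a hypothetical illustration of that rule—the 1.25 factor comes from the reported figure, but the baselines and sample values are invented:

```python
# Illustrative error-prediction rule based on the two band-power spikes
# described above. The 25% alpha threshold is from the study's reported
# figure; the baselines and sample readings below are made up.
def lapse_predicted(alpha_power, mu_power, alpha_baseline, mu_baseline):
    """True if both bands are elevated enough to predict an error."""
    return (alpha_power >= 1.25 * alpha_baseline and
            mu_power > mu_baseline)

# Baselines come from quiet, attentive periods; then check a window of
# activity about a second before a button press.
print(lapse_predicted(alpha_power=13.0, mu_power=6.5,
                      alpha_baseline=10.0, mu_baseline=6.0))  # elevated
print(lapse_predicted(alpha_power=10.2, mu_power=6.1,
                      alpha_baseline=10.0, mu_baseline=6.0))  # normal
```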
Using not much more than a brain scanner, scientists have successfully found a way to read people’s minds—or at least, certain thoughts in them. A London research team was able to determine where its volunteer subjects were located, in a computer-generated virtual environment, by using fMRI scanning to analyze activation patterns in the hippocampus area of their brains. After correlating this information with the subjects’ movements, the researchers found that they could accurately predict their subjects’ locations based solely on the scanner read-out.
The findings may bring scientists one step closer to understanding the workings of the hippocampus, the part of the brain responsible for short-term memory and spatial relationships but which has been too disorganized for scientists, until now, to decode. Led by Eleanor Maguire, the researchers focused on groups of neurons identified by Maguire in an earlier study of London taxi drivers, whose hippocampi were hyperdeveloped by years of mental navigation through the city’s mazelike streets…. The results “are an intriguing first step toward using fMRI to read out information about visuo-spatial scenes,” [Wired] said Arne Ekstrom, a California-based neuroscientist.
Researchers have previously managed to achieve a limited degree of mind-reading by using electroencephalograms and infrared sensors—in addition to familiar dubious psychic methods—so it was probably inevitable that they would accomplish a similar trick with functional magnetic resonance imaging, or fMRI, the brain-scanner of choice in modern neuroscience. Researchers at Vanderbilt University have shown they can use fMRI to determine which of two images people are thinking about.
Six subjects were given different patterns to look at—one with horizontal stripes, the other with vertical ones. An fMRI scanner was used throughout the experiment to monitor brain activity in four different early visual areas—the first areas of the brain to receive and process visual signals, which were previously thought to have no role in higher cognitions…. Researchers knew that early visual areas could process in fine detail visual signals from the eye, but thought these areas could not retain information [CBC News]. After the images were removed from view, the subjects were instructed to think back to one in particular. The scientists were then able to tell, with 80 percent accuracy, which the subjects were thinking of.
In a new experiment, researchers didn’t have to ask their test subjects whether they’d prefer coffee or tea; instead, they just read their minds. With a nifty bit of technical wizardry, researchers beamed near-infrared light at the volunteers’ foreheads while asking them to mentally decide which of two beverages they liked better. By examining how the light was absorbed by the volunteers’ brain tissue, researchers were able to predict a person’s preference with 80 percent accuracy.
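The physics behind "examining how the light was absorbed" is typically the modified Beer-Lambert law: a change in how much near-infrared light makes it back to the detector is converted into a change in hemoglobin concentration, a proxy for local blood flow. The coefficient values below are illustrative placeholders, not this study's calibration:

```python
# How NIRS turns light absorption into a blood-flow signal, via the
# modified Beer-Lambert law. Coefficient values here are placeholders
# for illustration, not the study's calibration.
import math

def optical_density_change(intensity_baseline, intensity_now):
    """Delta-OD = log10(I_baseline / I_now); more absorption -> positive."""
    return math.log10(intensity_baseline / intensity_now)

def concentration_change(delta_od, epsilon, path_cm, dpf):
    """Invert the law: dC = dOD / (epsilon * d * DPF)."""
    return delta_od / (epsilon * path_cm * dpf)

# Detected light dims from 100 to 95 units as oxygenated blood increases
# under the sensor, yielding a small positive concentration change.
d_od = optical_density_change(100.0, 95.0)
print(round(concentration_change(d_od, epsilon=1.2, path_cm=3.0, dpf=6.0), 5))
```

The preference prediction then works on top of signals like this one: patterns of concentration change across the forehead, recorded while the volunteer deliberates, feed a classifier trained on trials where the answer was known.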
Lead researcher Tom Chau says he hopes a similar device can one day help people with severe cerebral palsy or neuromuscular conditions that keep them paralyzed in unresponsive bodies. “Basically their mind is alert,” he said. “This is kind of the compelling argument behind the work, that these individuals are cognitively capable – they’re aware of their surroundings, they understand what’s going on – but they have no means of communicating their intentions or preferences to the outside world” [Canadian Press].
Coauthor Sheena Luu adds that the device could use simple preferences to build up to larger decisions and thoughts. “If we limit the context – limit the question and available answers, as we have with predicting preference – then mind-reading becomes possible” [The Register], she says.
At the Tokyo Game Show today, a video game player strapped on a headset and, without touching a conventional video game controller, began to kill zombies–using only the power of his mind. Called “Judecca,” the PC game is a zombie-killing first-person shooter, though it’s really just a proof of concept and not something that’s going to be available to consumers anytime soon [Fox News].
The game was produced by leading Japanese studio Square Enix and makes use of a brainwave-sensing headset from NeuroSky, a San Jose-based company that develops biometric sensors and similar products aimed at the consumer market [PC World]. NeuroSky’s “Mindset” device looks like a big pair of headphones with a single electrode that wraps around to tap the wearer’s forehead; the company says the device uses electroencephalography (EEG) to measure the current state of relaxation or concentration of players.
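NeuroSky's headsets report smoothed 0–100 concentration and relaxation scores rather than raw brain waves, so a game can gate actions on a simple threshold. The sketch below is hypothetical—the threshold and action names are not Judecca's actual logic—but it shows the shape of the mapping:

```python
# Hypothetical sketch of driving a game action from a single-electrode
# headset's 0-100 concentration score. The threshold and action names
# are illustrative; this is not Judecca's actual control logic.
def game_action(attention, threshold=60):
    """Fire only while the player's concentration clears the threshold."""
    return "fire" if attention >= threshold else "hold"

# A few ticks of (made-up) attention readings from the headset.
for score in [35, 58, 72, 90]:
    print(score, game_action(score))
```

Designing around a slow, noisy one-dimensional score is exactly why such demos gate a single action on sustained concentration rather than trying to replace a full controller.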