Think you’re good at classic arcade games such as Space Invaders, Breakout and Pong? Think again.
In a groundbreaking paper published yesterday in Nature, a team of researchers led by DeepMind co-founder Demis Hassabis reported developing a deep neural network that was able to learn to play such games at an expert level.
What makes this achievement all the more impressive is that the program was not given any background knowledge about the games. It just had access to the score and the pixels on the screen.
It didn’t know about bats, balls, lasers or any of the other things we humans need to know about in order to play the games.
But by playing lots and lots of games many times over, the computer learned first how to play, and then how to play well.
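The learn-by-playing loop can be made concrete with a small sketch. What follows is a minimal tabular Q-learning example on a made-up toy task (a six-cell corridor that scores 1 when the agent reaches the end); everything in it is an illustrative assumption. DeepMind's actual system was a deep Q-network reading raw screen pixels, far beyond this toy, but the spirit of learning from score alone is the same:

```python
import random

# Toy task: walk along a 1-D corridor of 6 cells; reaching the last cell scores 1.
N_STATES = 6
ACTIONS = [-1, +1]                 # step left or step right
ALPHA, GAMMA, EPSILON = 0.5, 0.9, 0.1

# Value table: the agent's estimate of long-run score for each (state, action).
Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

def step(state, action):
    """Environment: returns (next_state, reward, done)."""
    nxt = min(max(state + action, 0), N_STATES - 1)
    done = nxt == N_STATES - 1
    return nxt, (1.0 if done else 0.0), done

random.seed(0)
for _ in range(500):               # play the "game" many times over
    s, done = 0, False
    while not done:
        if random.random() < EPSILON:      # explore occasionally
            a = random.choice(ACTIONS)
        else:                              # otherwise act greedily (random tie-break)
            a = max(ACTIONS, key=lambda x: (Q[(s, x)], random.random()))
        s2, r, done = step(s, a)
        # Nudge the estimate toward reward plus discounted future value.
        best_next = max(Q[(s2, x)] for x in ACTIONS)
        Q[(s, a)] += ALPHA * (r + GAMMA * best_next - Q[(s, a)])
        s = s2

# After training, the greedy policy heads right from every non-terminal cell.
policy = {s: max(ACTIONS, key=lambda a: Q[(s, a)]) for s in range(N_STATES - 1)}
```

Early episodes wander almost at random; once the end of the corridor is stumbled upon, the value of moving right propagates backwards, and play improves, first to competent, then to optimal.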
Ceres is the largest object in the asteroid belt, and NASA’s Dawn spacecraft will arrive there in March. Pluto is the largest object in the Kuiper belt, and NASA’s New Horizons spacecraft will arrive there on July 15.
These two events will make 2015 an exciting year for solar system exploration and discovery. But there is much more to this story than mere science. I expect 2015 will be the year when general consensus, built upon our new knowledge of these two objects, will return Pluto and add Ceres to our family of solar system planets.
The efforts of a very small clique of Pluto-haters within the International Astronomical Union (IAU) plutoed Pluto in 2006. Of the approximately 10,000 internationally registered members of the IAU in 2006, only 237 voted in favor of the resolution redefining Pluto as a “dwarf planet” while 157 voted against; the other 9,500 members were not present at the closing session of the IAU General Assembly in Prague at which the vote to demote Pluto was taken. Yet Pluto’s official planetary status was snatched away.
Ceres and Pluto are both spheroidal objects, like Mercury, Earth, Jupiter and Saturn. That’s part of the agreed-upon definition of a planet. They both orbit a star, the Sun, like Venus, Mars, Uranus and Neptune. That’s also part of the widely accepted definition of a planet.
Eye tracking devices sound a lot more like expensive pieces of scientific research equipment than joysticks – yet if recent announcements about the latest Assassin’s Creed game are anything to go by, eye tracking will become a commonplace feature of how we interact with computers, and particularly games.
Eye trackers provide computers with a user’s gaze position in real time by tracking the position of their pupils. The trackers can either be worn directly on the user’s face, like glasses, or placed in front of them, such as beneath a computer monitor for example.
Eye trackers are usually composed of cameras and infrared lights to illuminate the eyes. Although infrared light is invisible to the human eye, the cameras can use it to generate a grayscale image in which the pupil is easily recognizable. From the position of the pupil in the image, the eye tracker’s software can work out where the user’s gaze is directed – whether that’s on a computer screen or looking out into the world.
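That last mapping step, from pupil position in the camera image to a point on the screen, can be sketched in code. The pupil coordinates, calibration points and simple per-axis affine model below are all illustrative assumptions (real trackers use richer models and track both eyes), but the principle of calibrating image coordinates to screen coordinates is the same:

```python
def fit_affine(pupil_vals, screen_vals):
    """Least-squares fit of screen = a * pupil + b for one axis."""
    n = len(pupil_vals)
    mean_p = sum(pupil_vals) / n
    mean_s = sum(screen_vals) / n
    cov = sum((p - mean_p) * (s - mean_s) for p, s in zip(pupil_vals, screen_vals))
    var = sum((p - mean_p) ** 2 for p in pupil_vals)
    a = cov / var
    return a, mean_s - a * mean_p

# Calibration: the user fixates known screen points while we record the
# pupil position the camera reports (the numbers here are made up).
pupil_x  = [210, 320, 430]     # pupil x-coordinate in image pixels
screen_x = [0, 640, 1280]      # corresponding on-screen x-coordinate
ax, bx = fit_affine(pupil_x, screen_x)

# Afterwards, any new pupil reading maps to a gaze estimate on screen.
gaze_x = ax * 265 + bx         # pupil halfway between the first two dots
print(round(gaze_x))           # → 320
```

The same fit is done independently for the vertical axis; a short calibration routine showing the user a handful of dots is why most consumer eye trackers ask you to "follow the dot" before use.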
But what’s the use? Well, a person’s eyes can reveal a lot about their intentions, thoughts and actions, as they are good indicators of what they’re interested in. In our interactions with others we often subconsciously pick up on cues that the eyes give away. So it’s possible to gather this unconscious information and use it to better understand what the user is thinking, their interests and habits, or to enhance the interaction between them and the computer they’re using.
I have always been in awe of the night sky, trying to comprehend the vastness of space and the countless wonders it contains. But I have always felt a certain dissatisfaction with only being able to see it at a distance.
One day I imagine that humanity will be able to visit other planets in the solar system, and venture even further to other stars, but this has always seemed very far away. That’s the reason why I applied for the Mars One mission, aimed at starting a human colony on Mars – it seemed like a real opportunity to get closer to the rest of the night sky, to give me a chance to be a part of taking humanity into the stars.
Mars is, in a way, the perfect stepping stone into the rest of the universe. Despite its inhospitable conditions, it has a day-night cycle only 39 minutes longer than Earth’s. Unlike the moon, it is resource-rich: its soil holds water and its atmosphere nitrogen. Mars does not suffer from the sweltering heat and toxic atmosphere found on Venus, which lies closer to the sun, yet it still receives enough sunlight to generate solar power.
I was seduced by infinity at an early age. Georg Cantor’s diagonal proof that some infinities are bigger than others mesmerized me, and his infinite hierarchy of infinities blew my mind. The assumption that something truly infinite exists in nature underlies every physics course I’ve ever taught at MIT—and, indeed, all of modern physics. But it’s an untested assumption, which raises the question: Is it actually true?
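For readers who have not seen it, the diagonal argument can be stated in a few lines (a standard textbook presentation, not the essay’s own notation):

```latex
% Cantor's diagonal argument: no list $s_1, s_2, s_3, \dots$ of infinite
% binary sequences can contain every such sequence. Given any list, define
\[
  d = (d_1, d_2, d_3, \dots), \qquad d_n = 1 - s_n(n),
\]
% where $s_n(n)$ is the $n$-th digit of the $n$-th listed sequence. For
% every $n$, $d$ differs from $s_n$ at position $n$, so $d$ appears
% nowhere on the list. Hence the set of such sequences is uncountable:
% a strictly larger infinity than the countable natural numbers.
```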
Gold is a modern expression of love, and every Valentine’s Day thousands of shoppers browse boutique windows full of the stuff. Over 90 countries mine the gold that is fashioned into jewelry, with China currently topping the exporter tables (though as illegal exports are rampant in some countries, exact figures are hard to pin down).
South American countries are also major gold producers, particularly Perú, which ranks variously fifth or sixth worldwide. A global gold rush over the last decade has seen a boom in South American mining. But this has led to a specific problem — it is now financially viable to extract gold deposits from areas which were previously unprofitable, such as under tropical forests, resulting in growing damage to one of the planet’s most vulnerable ecosystems.
“If at first the idea is not absurd, then there is no hope for it,” Albert Einstein reportedly said. I’d like to broaden the definition of addiction—and also retire the scientific idea that all addictions are pathological and harmful.
Since the beginning of formal diagnostics more than fifty years ago, the compulsive pursuit of gambling, food, and sex (known as non-substance rewards) has not been regarded as addiction. Only abuse of alcohol, opioids, cocaine, amphetamines, cannabis, heroin, and nicotine has been formally regarded as addiction. This categorization rests largely on the fact that substances activate basic “reward pathways” in the brain associated with craving and obsession and produce pathological behaviors. Psychiatrists work within this world of psychopathology—that which is abnormal and makes you ill.
What if babies could tell us what they want, before they start crying for it? Enter baby signing, a system of symbolic hand gestures for key words such as “milk,” “hot” and “all gone” that are taught to hearing babies as a way to communicate before they can talk.
The sign for milk, for example, is made by opening and closing the hand, while the sign for “more” is made by tapping the ends of the fingers together.
Now new research has reported that it’s even possible for babies to learn these signs just from viewing videos at home. The study found that babies learned to produce baby signs just as well from a video as they did if they were taught by their parents.
Yet only those babies who had been taught the signs by a parent showed evidence of understanding what the signs meant. The bigger question is whether these findings should be taken as encouragement to teach babies to sign, and what impact signing has on child development.
It’s difficult to deny that humans began as Homo sapiens, an evolutionary offshoot of the primates. Nevertheless, for most of what is properly called “human history” (that is, the history starting with the invention of writing), most of Homo sapiens have not qualified as “human”—and not simply because they were too young or too disabled.
In sociology, we routinely invoke a trinity of shame—race, class, and gender—to characterize the gap that remains between the normal existence of Homo sapiens and the normative ideal of full humanity. Much of the history of social science can be understood as either directly or indirectly aimed at extending the attribution of humanity to as much of Homo sapiens as possible. It’s for this reason that the welfare state is reasonably touted as social science’s great contribution to politics in the modern era. But perhaps membership in Homo sapiens is neither sufficient nor even necessary to qualify a being as “human.” What happens then?
Last week, BICEP2 scientists — who in March announced evidence of cosmic inflation, a potentially Nobel-worthy find — threw handfuls of dust on the grave of their own results. The official paper, just published on the BICEP website, tells the story of how they mistook cosmic dust for “primordial gravitational waves,” and why everybody needs to calm down and stop trying to bury inflation, too.
Just 10⁻³⁵ seconds after the Big Bang, cosmologists (or at least most of them) believe the universe expanded in hyperdrive — faster than it ever has since and faster than it ever will again. This ballooning, called inflation, smoothed everything out. It turned the cosmos into the roughly homogeneous place we see today, and perhaps created other universes that add up to the sci-fi-sounding “multiverse.”
But it’s difficult to find direct evidence that inflation actually happened (after all, it was a long time ago). That’s where B-modes, which the BICEP2 team saw, come in.