Computer Learns to Take Over Virtual Worlds by Doing What Most of Us Don't: Reading the Manual

By Joseph Castro | July 13, 2011 3:53 pm

Screenshot of Civilization IV, a later version of the game that MIT's computer played.

What’s the News: Many video gamers scoff at the idea of actually reading the instruction manual for a game. But a manual can not only teach you how to play a game, it can also give you the basics of language—that is, if you’re a machine-learning computer. Researchers at MIT’s Computer Science and Artificial Intelligence Lab have now designed a computer system that can learn the meaning of certain words by playing complex games like Civilization II and comparing on-screen information to the game’s instruction manual.

How the Heck:

  • The researchers, led by computer scientist Regina Barzilay, began by giving their machine-learning system very basic knowledge about Civilization II, such as the actions it could take (moving the cursor, clicking, etc.). The computer also had access to the words and other information that popped up on-screen (though it didn't understand what the text and objects meant), and it knew when it won or lost a game. At this stage, the computer's behavior was mostly random, yet it still won 46 percent of its games.
  • The researchers then augmented the system so that it could use the game's manual to develop strategies. When a word like "river" popped up during gameplay, the computer searched for it in the instructions and analyzed the surrounding text. From this, it made assumptions about which actions the words corresponded to, giving greater weight to ideas that consistently produced good results and discarding those associated with poor ones. Its winning percentage jumped to 79 percent.
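The weighting scheme the bullets describe can be pictured as a simple reinforcement loop: tie each on-screen word to candidate actions, and strengthen or weaken those ties based on whether the resulting games are won or lost. The sketch below is a toy illustration of that idea only, not the researchers' actual algorithm; the class and method names are invented for this example.

```python
import random

class WordActionLearner:
    """Toy sketch: associate manual words with game actions and
    reweight those associations from win/loss feedback.
    (Illustrative only; not the MIT system.)"""

    def __init__(self, actions):
        self.actions = list(actions)
        # weights[word][action]: how strongly a word suggests an action
        self.weights = {}

    def choose_action(self, word):
        """Sample an action for an on-screen word, favoring higher weights."""
        w = self.weights.setdefault(word, {a: 1.0 for a in self.actions})
        total = sum(w.values())
        r = random.uniform(0, total)
        for action, weight in w.items():
            r -= weight
            if r <= 0:
                return action
        return self.actions[-1]

    def feedback(self, word, action, won):
        """Reinforce associations that led to wins; penalize losses."""
        w = self.weights.setdefault(word, {a: 1.0 for a in self.actions})
        w[action] *= 1.5 if won else 0.5


learner = WordActionLearner(["move", "build", "attack"])
learner.feedback("river", "build", won=True)
learner.feedback("river", "attack", won=False)
# "build" is now weighted above "attack" when "river" appears on screen
```

Over many games, consistently helpful word-action pairs accumulate weight while misleading ones fade, which is the intuition behind the jump from 46 to 79 percent.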

What’s the Context:

  • Two years ago, Barzilay conducted a similar experiment where she had her machine-learning system install software on a Windows PC by using instructions available on Microsoft’s website. The system carried out 80 percent of the steps that a person using the same instructions would.

The Future Holds:

  • For most complex games that allow players to compete against computer opponents, programmers must develop and code various strategies for the computer to follow. The researchers say that programmers will soon be able to use their system to automatically create those algorithms (via MIT News).
  • The team is currently trying to equip robotic systems with their "meaning-inferring algorithms."


Image: Flickr/yoppy

  • amphiox

    Who or what was the computer playing against? A human player, or the game's own opponent AI? (In the second case, wouldn't that count as the computer playing against itself?) And if the AI, at what level?

    A 79% victory rate against the programmed opponent AI would suggest that this self-learning algorithm could be incorporated into the game to make a more challenging AI opponent for the human player. (And could the AI be programmed over time to learn in contests against the human player to become ever more adept at playing that human player?)

  • Awnshegh

    I played Civ 2 to death. The original manual was a behemoth with some serious instructions on how the game should be played in order to win. It's no surprise that reading them would provide better results in games – the surprise is having a separate AI (outside the game code) that can put 2 and 2 together and actually use this information.

    Will it change gaming AI? Maybe. Remember, gaming AI is built on very simple yet lengthy branching scripts for decision making and pathing. This is a paradigm shift, with results that aren't necessarily in line with what the programmers may want. Where's the tweakability if they want the AI to be easy, medium, or hard?

    So what will it change? Personally I think this will become a gaming QA dream. You could build an external AI to run the game through hundreds or thousands of iterations of games to hunt out bugs, AI quirks (within the game) and even find balancing issues which currently take weeks or months of testing.

  • Naveed

    Very awesome. Can't wait until this gets into real video games; it will make games much more fun and challenging.

    amphiox, it sounds like the AI they used played as the player, since the article mentions teaching it about mouse movements and what shows up on the screen (the built-in AI doesn't see any of this). Also, in Civ 2 and many other games, developers write only one AI and give the player or the AI bonuses and handicaps depending on the difficulty level. Hopefully they used a normal or higher difficulty to show the superiority of this new AI.

    The AI right now is one of the areas in gaming where there really needs to be some improvement. The AI can never stack up to a player on an even playing field and usually has some quirks/flaws that can be exploited.

  • FlaGator

    I wonder what difficulty level the computer was playing against. I still find it near impossible to beat the CivIV AI at Noble level.

  • hopeful

    It would be outstanding for AI of this level to be built into Civilization; however, I for one don't look forward to the 2 TB download to get the AI opponent!

    That said, without knowing anything about the game at all, a win ratio of 46% is damn impressive. Or worrying, if you’re a dedicated Civ player!

  • Wesley

    What I don’t understand is how almost-random play can get the computer a 46% win rate. Are the humans playing randomly too?

  • Kevin R. Bridges

    I think it’s interesting that a randomly acting program won 46 percent of the time.

  • Anthony

    And here we thought SkyNet would be the end of us.



80beats is DISCOVER's news aggregator, weaving together the choicest tidbits from the best articles covering the day's most compelling topics.