It's one thing to teach a computer how to play a game. It's something else entirely to have it learn the game on its own by reading the manual.
Some rather smart folks over at MIT wanted to see how well a computer could learn to comprehend new tasks from reading text. As a test, they had it learn to play Civilization.
The computer started out with just the basic ability to move the mouse cursor and click on things. It could see the words on the screen, and by playing the game it learned what those words meant. In effect, it picked up basic English by observing what the words did in Civilization.
After playing for a while, the computer was winning 46% of its matches, which is more than some human players (like me) can say. To see how much it really understood, the researchers then let it read the game's manual. By combining what it already knew about those words from playing with this new knowledge, it raised its win rate to 79%. It also followed the same steps that 80% of human players did, and it won more games than another computer that relied on conventional AI methods.
S. R. K. Branavan, a graduate student on the project, said: “Games are used as a test bed for artificial-intelligence techniques simply because of their complexity. Every action that you take in the game doesn’t have a predetermined outcome, because the game or the opponent can randomly react to what you do. So you need a technique that can handle very complex scenarios that react in potentially random ways.”
Branavan also explained that game manuals have “very open text. They don’t tell you how to win. They just give you very general advice and suggestions, and you have to figure out a lot of other things on your own.”
This is some pretty heavy stuff, but it's very cool. The MIT team is already looking at ways to apply this new method of AI training to robotics research.