A Robot that Follows Natural-Language Directions
Prof. Andy Haas
SUNY Albany
February 12, 2008
5pm-6pm
Abstract
If robots are to be useful in everyday life, they need to speak English. This is not merely a practical issue; it raises a central question in cognitive science: how do we connect words to the world around us? Most research on natural language processing is words-in, words-out, so the problem of connecting language to the world never arises. If robots are to follow directions given in human language, that is exactly the problem we must face.

I will describe a simulated robot that travels through a virtual office building. Computer graphics allow human beings to see this virtual building as well. I have collected a corpus of sets of directions, each tagged with the path it describes. This tagging provides strong, detailed evidence about the meaning of the directions. On the basis of this data, I have built a robot that follows such directions.