Robot Face


We want to continue experiments using an animated face displayed on a screen mounted on the social robot's head. This gives us more control over emotional expression through combinations of mouth, eyelid and eyebrow movements, together with fine-grained, controllable directional eye movements. This customizable face requires further implementation, and evaluation using Mechanical Turk.
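As a concrete illustration, here is a minimal sketch of how such a face might be parameterized. The class, field names, value ranges and expression presets are assumptions made for illustration, not the project's actual implementation.

```python
from dataclasses import dataclass, replace

@dataclass
class FacePose:
    """One hypothetical set of facial control parameters."""
    mouth_open: float    # 0.0 (closed) .. 1.0 (fully open)
    mouth_curve: float   # -1.0 (frown) .. 1.0 (smile)
    eyelid_open: float   # 0.0 (shut) .. 1.0 (wide open)
    brow_raise: float    # -1.0 (furrowed) .. 1.0 (raised)
    gaze_yaw: float      # eye direction in degrees, negative = left
    gaze_pitch: float    # eye direction in degrees, negative = down

# Hypothetical presets combining mouth, eyelid and eyebrow
# parameters into emotional expressions.
EXPRESSIONS = {
    "happy":     FacePose(0.3,  0.9, 0.8,  0.2, 0.0, 0.0),
    "surprised": FacePose(0.7,  0.0, 1.0,  0.9, 0.0, 0.0),
    "sad":       FacePose(0.1, -0.7, 0.5, -0.3, 0.0, -10.0),
}

def look_at(pose: FacePose, yaw: float, pitch: float) -> FacePose:
    """Return a copy of the pose with only the eyes redirected."""
    return replace(pose, gaze_yaw=yaw, gaze_pitch=pitch)

# e.g. a happy face glancing toward a door 25 degrees to the robot's right
cue = look_at(EXPRESSIONS["happy"], yaw=25.0, pitch=0.0)
```

Keeping the gaze direction independent of the emotional preset, as in this sketch, is what lets the same expression be combined with different directional eye movements.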

We will study human reactions to the face in lab conditions, using both human observation and sub-millimeter tracking of head and body orientation with our high-precision motion capture camera system. When the robot gives directional instructions, or talks about objects at specific spatial locations, we can see how users react to the robot's head or eye movements. For instance, when directing users to a particular door, and indicating which door with a head or eye gesture, we can track the user's movements with extreme accuracy. We can use this to determine whether people consciously or subconsciously follow the movements of the robot's head, eyes, or both.
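To make that analysis concrete, here is a minimal sketch of one way gaze-following could be scored from motion capture output, assuming the system exports per-frame head yaw for each participant. The function name, data layout, frame rate, window length and threshold are all illustrative assumptions, not the project's actual pipeline.

```python
import numpy as np

def followed_cue(head_yaw: np.ndarray, cue_frame: int,
                 cue_direction: float, window: int = 120,
                 threshold_deg: float = 5.0) -> bool:
    """True if the participant's head turned toward the cued direction
    within `window` frames of the robot's head/eye gesture.

    head_yaw: per-frame head yaw in degrees (assumed mocap export).
    cue_direction: signed direction of the robot's gesture in degrees.
    """
    baseline = head_yaw[cue_frame]
    segment = head_yaw[cue_frame:cue_frame + window]
    # Signed head rotation relative to the moment of the cue.
    delta = segment - baseline
    # Largest excursion within the window, keeping its sign.
    peak = delta[np.argmax(np.abs(delta))]
    # Followed the cue if the head moved far enough, the same way.
    return abs(peak) >= threshold_deg and np.sign(peak) == np.sign(cue_direction)

# e.g. 120 frames ≈ 1 s at a 120 Hz capture rate; cue points 25° right.
yaw_track = np.zeros(600)
yaw_track[300:] = np.linspace(0, 20, 300)  # head turns right after the cue
print(followed_cue(yaw_track, cue_frame=300, cue_direction=+25.0))  # True
```

Distinguishing conscious from subconscious following would need more than this single threshold, e.g. comparing response latencies, but a per-trial score like this is a plausible starting point for the mocap data.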

