Head Pose Estimation


Changes in head orientation could be a factor in choosing people for interaction. If, on making an interaction attempt, we see no deviation either of trajectory or head orientation, we can assume that we have failed to engage that person, and move on to our next candidate.
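To make that decision rule concrete, here is a minimal sketch; the threshold values and the per-person tracking interface (`heading_before`, `head_yaw_after`, and so on) are hypothetical, chosen only to illustrate the idea rather than describe our actual pipeline.

```python
# Hypothetical engagement check: if neither walking heading nor head yaw changes
# by more than an assumed threshold after an attempt, treat the attempt as failed.
def engagement_failed(track, heading_thresh_deg=15.0, yaw_thresh_deg=20.0):
    """`track` is assumed to expose heading and head-yaw readings (in degrees)
    recorded shortly before and after the robot's interaction attempt."""
    d_heading = abs(track.heading_after - track.heading_before)
    d_yaw = abs(track.head_yaw_after - track.head_yaw_before)
    return d_heading < heading_thresh_deg and d_yaw < yaw_thresh_deg
```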

We want to examine head pose estimation using affordable, on-board robot sensing, such as RGB-D cameras, laser scanners, and standard camera data. Models that estimate head orientation in real time from camera data already exist, such as https://github.com/lincolnhard/head-pose-estimation, and we want to investigate how well these models hold up in high-traffic scenarios. The ability to estimate head pose will also be used in evaluation: head movements by human subjects in the direction of the robot can serve as evidence that attempts to attract their attention are successful.
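As a rough illustration, the kind of approach taken in the linked repository can be sketched with a few dlib and OpenCV calls: detect facial landmarks, match a handful of them to a generic 3D face model, and recover orientation by solving the 2D-3D correspondence with `solvePnP`. The sketch below follows that recipe; the landmark model file, the generic 3D model points, and the camera intrinsics guessed from the frame size are all assumptions for illustration, not the repository's exact code.

```python
import cv2
import dlib
import numpy as np

# Approximate 3D coordinates (mm) of six landmarks on a generic head model.
MODEL_POINTS = np.array([
    (0.0, 0.0, 0.0),           # nose tip
    (0.0, -330.0, -65.0),      # chin
    (-225.0, 170.0, -135.0),   # left eye, left corner
    (225.0, 170.0, -135.0),    # right eye, right corner
    (-150.0, -150.0, -125.0),  # left mouth corner
    (150.0, -150.0, -125.0),   # right mouth corner
], dtype=np.float64)

# Indices of the corresponding points in dlib's 68-landmark scheme.
LANDMARK_IDS = [30, 8, 36, 45, 48, 54]

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

def estimate_head_pose(frame):
    """Return (rotation_vector, translation_vector) for the first detected face, or None."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector(gray, 0)
    if not faces:
        return None

    shape = predictor(gray, faces[0])
    image_points = np.array(
        [(shape.part(i).x, shape.part(i).y) for i in LANDMARK_IDS], dtype=np.float64
    )

    # Approximate the camera intrinsics from the frame size; ignore lens distortion.
    h, w = frame.shape[:2]
    camera_matrix = np.array(
        [[w, 0, w / 2], [0, w, h / 2], [0, 0, 1]], dtype=np.float64
    )
    dist_coeffs = np.zeros((4, 1))

    ok, rvec, tvec = cv2.solvePnP(
        MODEL_POINTS, image_points, camera_matrix, dist_coeffs,
        flags=cv2.SOLVEPNP_ITERATIVE,
    )
    return (rvec, tvec) if ok else None
```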

With the robot situated in busy, public-use areas, we will evaluate the accuracy of head pose estimation for both stationary humans and passers-by. We already annotate head orientation manually for existing experimental work, so these annotations can serve as a gold-standard measure.
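One way such a comparison might be scored (assuming estimated poses come from a `solvePnP`-style pipeline like the sketch above, and that the manual annotations are stored as per-frame yaw angles; both are illustrative assumptions) is to reduce each estimated pose to a yaw angle and report mean absolute error against the gold standard:

```python
import cv2
import numpy as np

def yaw_degrees(rvec, tvec):
    """Reduce a solvePnP pose to a single yaw angle (degrees) via Euler decomposition."""
    R, _ = cv2.Rodrigues(rvec)
    euler = cv2.decomposeProjectionMatrix(np.hstack((R, tvec)))[6]  # pitch, yaw, roll
    return float(euler[1])

def yaw_mae(estimated_yaws, annotated_yaws):
    """Mean absolute yaw error (degrees) between model estimates and manual annotations."""
    return float(np.mean(np.abs(np.asarray(estimated_yaws) - np.asarray(annotated_yaws))))
```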
