Robotic assistance in indoor navigation for people who are blind - Robotics Institute Carnegie Mellon University

Aditi Kulkarni, Allan Wang, Lynn Urbina, Aaron Steinfeld, and M. Bernardine Dias
Conference Paper, Proceedings of the 11th ACM/IEEE International Conference on Human-Robot Interaction (HRI '16), pp. 461-462, March 2016

Abstract

In this paper, we describe the process of making a robot useful as a guide for people who are blind or visually impaired. For this population, a robot's interactive audio capability is especially important. We introduce features that help the robot sound more natural and make interaction more comfortable. We first address speaker placement, which helps the user judge the robot's size and distance. After the initial meeting, the robot retains user data so that communication evolves with every interaction. The robot also asks users whether they need to rest at intervals chosen according to the user's age and the distance they need to cover. On subsequent visits, this information is used to make the interaction more natural and customized to each individual user.
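The rest-prompt behavior described in the abstract can be sketched as a simple heuristic that adjusts the prompt interval by user age and route distance. The function name and all thresholds below are illustrative assumptions, not values from the paper:

```python
# Hypothetical sketch of the rest-prompt heuristic: pick how often the
# robot offers a rest break, based on the user's age and the distance
# still to be covered. All thresholds are illustrative assumptions.

def rest_interval_minutes(age: int, distance_m: float) -> float:
    """Return how often (in minutes) the robot should offer a rest break."""
    interval = 15.0           # assumed baseline for a short, easy route
    if distance_m > 500:      # longer routes: offer breaks more often
        interval -= 5.0
    if age >= 65:             # older users: offer breaks more often
        interval -= 5.0
    return max(interval, 5.0) # never prompt more often than every 5 minutes

# Example: an older user on a long route gets the most frequent prompts.
print(rest_interval_minutes(70, 800.0))  # → 5.0
```

In a real system these parameters would come from the per-user profile the robot retains between visits, so the prompting schedule can be refined after each interaction.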

BibTeX

@conference{Kulkarni-2016-122811,
author = {Aditi Kulkarni and Allan Wang and Lynn Urbina and Aaron Steinfeld and M. Bernardine Dias},
title = {Robotic assistance in indoor navigation for people who are blind},
booktitle = {Proceedings of 11th ACM/IEEE International Conference on Human-Robot Interaction (HRI '16)},
year = {2016},
month = {March},
pages = {461-462},
}