
A Multi-modal Sensor Array for Safe Human-Robot Interaction and Mapping

Colette Abah, Andrew L. Orekhov, Garrison L. H. Johnston, Peng Yin, Howie Choset, and Nabil Simaan
Conference Paper, Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), pp. 3768-3774, May 2019

Abstract

In the future, human-robot interaction will include collaboration in close quarters where the environment geometry is partially unknown. As a means for enabling such interaction, this paper presents a multi-modal sensor array capable of contact detection and localization, force sensing, proximity sensing, and mapping. The sensor array integrates Hall effect and time-of-flight (ToF) sensors in an I²C communication network. The design, fabrication, and characterization of the sensor array for a future in-situ collaborative continuum robot are presented. Possible perception benefits of the sensor array are demonstrated for accidental contact detection, mapping of the environment, selection of admissible zones for bracing, and constrained motion control of the end effector while maintaining a bracing constraint with an admissible rolling motion.
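To illustrate the multi-modal idea described above, the sketch below fuses a per-element ToF range reading with a Hall-effect reading into a simple taxel state (clear / proximity / contact). This is a minimal illustration, not the paper's implementation: the `Taxel` structure, threshold values, and units are assumptions introduced here for the example.

```python
# Hedged sketch: combining ToF proximity and Hall-effect contact readings
# into per-taxel states. Thresholds and units are illustrative assumptions,
# not values from the paper.
from dataclasses import dataclass
from enum import Enum

class TaxelState(Enum):
    CLEAR = "clear"          # nothing nearby
    PROXIMITY = "proximity"  # obstacle within ToF warning range
    CONTACT = "contact"      # Hall sensor reports skin deflection

@dataclass
class Taxel:
    tof_mm: float    # time-of-flight range reading, millimetres
    hall_mT: float   # Hall-effect flux change, millitesla

def classify(taxel, prox_mm=100.0, contact_mT=1.0):
    """Classify one sensing element; a detected contact overrides proximity."""
    if abs(taxel.hall_mT) >= contact_mT:
        return TaxelState.CONTACT
    if taxel.tof_mm <= prox_mm:
        return TaxelState.PROXIMITY
    return TaxelState.CLEAR

# Three hypothetical taxels along the robot body:
readings = [Taxel(250.0, 0.0), Taxel(80.0, 0.1), Taxel(60.0, 2.5)]
states = [classify(t) for t in readings]
```

In a real system each taxel would be polled over the I²C bus; here the readings are hard-coded so the classification logic can be followed in isolation.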

BibTeX

@conference{Abah-2019-119950,
author = {Colette Abah and Andrew L. Orekhov and Garrison L. H. Johnston and Peng Yin and Howie Choset and Nabil Simaan},
title = {A Multi-modal Sensor Array for Safe Human-Robot Interaction and Mapping},
booktitle = {Proceedings of the IEEE International Conference on Robotics and Automation (ICRA)},
year = {2019},
month = {May},
pages = {3768--3774},
}