Towards collaborative robots with sensory awareness: preliminary results using multi-modal sensing

Andrew L. Orekhov, G. L. Johnston, Colette Abah, Howie Choset, and Nabil Simaan
Workshop Paper, ICRA '19 Workshop on "Physical human-robot interaction: a design focus", May, 2019

Abstract

Current robotic systems are unable to achieve safe operation and mapping in confined spaces using intrinsic sensory data alone. Recent advancements in sensing technologies, in terms of miniaturization and affordability, have enabled the creation of a new multi-modal sensory robot skin that can potentially achieve mapping and allow safe operation. Such robot skins may also benefit users during telemanipulation, which is often hindered by limited situational awareness. In this paper, we identify the potential benefits of such robots within the context of collaborative manufacturing in confined spaces and present experimental testing of a 4 degree-of-freedom test platform with multiple sensing disks. We demonstrate the use of the sensing skin for bracing against the environment and for avoiding collision with a human and/or the environment using proximity sensing. We also show physical human-robot interaction using Hall-effect contact sensors. We believe such robots, with intrinsic distributed sensing along their entire length, can enable a variety of applications in the manufacturing and search-and-rescue domains.

BibTeX

@workshop{Orekhov-2019-130982,
author = {Andrew L. Orekhov and G. L. Johnston and Colette Abah and Howie Choset and Nabil Simaan},
title = {Towards collaborative robots with sensory awareness: preliminary results using multi-modal sensing},
booktitle = {Proceedings of ICRA '19 Workshop on "Physical human-robot interaction: a design focus"},
year = {2019},
month = {May},
}