Sensor Fusion for Human Safety in Industrial Workcells

Paul Rybski, Peter Anderson-Sprecher, Daniel Huber, Christopher Niessl, and Reid Simmons
Conference Paper, Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 3612–3619, October 2012

Abstract

Current manufacturing practices require complete physical separation between people and active industrial robots. These precautions ensure safety, but are inefficient in terms of time and resources, and place limits on the types of tasks that can be performed. In this paper, we present a real-time, sensor-based approach for ensuring the safety of people in close proximity to robots in an industrial workcell. Our approach fuses data from multiple 3D imaging sensors of different modalities into a volumetric evidence grid and segments the volume into regions corresponding to background, robots, and people. Surrounding each robot is a danger zone that dynamically updates according to the robot's position and trajectory. Similarly, surrounding each person is a dynamically updated safety zone. A collision between danger and safety zones indicates an impending actual collision, and the affected robot is stopped until the problem is resolved. We demonstrate and experimentally evaluate the concept in a prototype industrial workcell augmented with stereo and range cameras.
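To make the pipeline concrete, below is a minimal sketch in Python of the two steps the abstract describes: accumulating occupancy evidence from multiple sensors into a shared volumetric grid, and checking whether a robot's danger zone overlaps a person's safety zone. It assumes a fixed-resolution voxel grid and uses SciPy's binary dilation to grow segmented regions into zones; all names, grid dimensions, evidence weights, and margins here are illustrative assumptions, not values from the paper.

import numpy as np
from scipy import ndimage

# Illustrative grid parameters (not from the paper).
GRID_SHAPE = (50, 50, 30)   # workcell voxels at some fixed resolution
HIT, MISS = 0.85, -0.4      # log-odds evidence increments per reading

def fuse(grid, occupied, free):
    """Accumulate log-odds occupancy evidence from one sensor.

    `occupied` and `free` are (N, 3) integer arrays of voxel indices,
    obtained by projecting that sensor's 3D returns (and the free-space
    rays to them) into the shared grid.
    """
    grid[tuple(occupied.T)] += HIT
    grid[tuple(free.T)] += MISS
    np.clip(grid, -5.0, 5.0, out=grid)  # bound evidence so the grid stays responsive

def expand_zone(mask, margin):
    """Grow a segmented region by `margin` voxels to form a zone."""
    return ndimage.binary_dilation(mask, iterations=margin)

def zones_collide(robot_mask, person_mask, robot_margin, person_margin):
    """True if any voxel lies in both a danger zone and a safety zone."""
    danger = expand_zone(robot_mask, robot_margin)
    safety = expand_zone(person_mask, person_margin)
    return np.any(danger & safety)

# Usage: fuse one toy sensor reading, then test two nearby segmented blobs.
evidence = np.zeros(GRID_SHAPE, dtype=np.float32)
fuse(evidence, np.array([[22, 22, 5]]), np.empty((0, 3), dtype=int))

robot = np.zeros(GRID_SHAPE, dtype=bool)
person = np.zeros(GRID_SHAPE, dtype=bool)
robot[20:25, 20:25, 0:10] = True    # voxels segmented as robot
person[27:30, 20:25, 0:10] = True   # voxels segmented as person
if zones_collide(robot, person, robot_margin=2, person_margin=2):
    print("Impending collision: stop the affected robot")

In the paper the margins are dynamic, growing with the robot's position and trajectory and with the person's motion; the fixed dilation above stands in for that adaptive step.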

BibTeX

@conference{Rybski-2012-7603,
author = {Paul Rybski and Peter Anderson-Sprecher and Daniel Huber and Christopher Niessl and Reid Simmons},
title = {Sensor Fusion for Human Safety in Industrial Workcells},
booktitle = {Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)},
year = {2012},
month = {October},
pages = {3612--3619},
}