Force and Vision Resolvability for Assimilating Disparate Sensory Feedback

Bradley Nelson and Pradeep Khosla
Tech. Report, CMU-RI-TR-95-11, Robotics Institute, Carnegie Mellon University, March, 1995

Abstract

Force and vision sensors provide complementary information, yet they are fundamentally different sensing modalities. This implies that traditional sensor integration techniques, which require common data representations, are not appropriate for combining the feedback from these two disparate sensors. In this paper, we introduce the concept of vision and force sensor resolvability as a means of comparing the ability of the two sensing modes to provide useful information during robotic manipulation tasks. By monitoring the resolvability of the two sensing modes with respect to the task, the information provided by the disparate sensors can be seamlessly assimilated during task execution. A nonlinear force/vision servoing algorithm that uses force and vision resolvability to switch between sensing modes is proposed. The advantages of the assimilation technique are demonstrated during contact transitions between a stiff manipulator and a rigid environment, a system configuration that easily becomes unstable when force control alone is used. Experimental results show that the proposed nonlinear controller makes robust contact transitions while simultaneously satisfying the conflicting task requirements of achieving fast approach velocities, maintaining stability, minimizing impact forces, and suppressing bounce between contact surfaces.
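
The report itself is not reproduced on this page, but the mode-switching idea described in the abstract can be sketched in code. The sketch below is an illustration only: the resolvability measures, threshold, and function names are hypothetical stand-ins chosen for clarity, not the formulation derived in the report.

import numpy as np

def vision_resolvability(image_jacobian):
    # Illustrative measure: the smallest singular value of the image Jacobian
    # indicates how well small task-space motions can be resolved by vision.
    return np.linalg.svd(image_jacobian, compute_uv=False).min()

def force_resolvability(contact_force, noise_floor=0.5):
    # Illustrative measure: force feedback resolves task error only once the
    # measured contact force rises above the sensor noise floor.
    return max(0.0, np.linalg.norm(contact_force) - noise_floor)

def select_sensing_mode(image_jacobian, contact_force, force_gate=0.0):
    # Nonlinear switching: servo on vision during the approach, and hand
    # control over to force feedback as soon as it becomes resolvable,
    # i.e. as soon as contact is detected.
    if force_resolvability(contact_force) > force_gate:
        return "force"
    return "vision"

In this reading of the abstract, vision guides the fast approach while force feedback is unresolvable (no contact), and the controller switches to force servoing at the moment contact makes force information useful, which is what allows the conflicting requirements of speed, stability, and low impact force to be balanced.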

BibTeX

@techreport{Nelson-1995-13852,
author = {Bradley Nelson and Pradeep Khosla},
title = {Force and Vision Resolvability for Assimilating Disparate Sensory Feedback},
year = {1995},
month = {March},
institution = {Carnegie Mellon University},
address = {Pittsburgh, PA},
number = {CMU-RI-TR-95-11},
}