Integrating Sensor Placement and Visual Tracking Strategies
Abstract
Real-time visual feedback is an important capability that many robotic systems must possess if they are to operate successfully in dynamically varying and imprecisely calibrated environments. An eye-in-hand system is a common technique for providing camera motion in order to increase the working region of a visual sensor. Although eye-in-hand robotic systems have been well studied, several deficiencies in proposed systems make them inadequate for actual use. Typically, such systems fail if the manipulator passes through a singularity or reaches a joint limit. Objects being tracked can be lost if they become defocused or occluded, or if features on the objects leave the camera's field of view. In this paper, a technique is introduced for integrating a visual tracking strategy with dynamically determined sensor placement criteria. This allows the system to automatically determine, in real time, proper camera motion for tracking objects successfully while accounting for the undesirable, but often unavoidable, characteristics of camera-lens and manipulator systems. The sensor placement criteria considered include focus, field of view, spatial resolution, manipulator configuration, and a newly introduced measure called resolvability. Experimental results are presented.
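The abstract describes combining several sensor placement criteria to select camera motion in real time, but does not spell out how the criteria are aggregated. The sketch below is a minimal illustration, assuming each criterion (focus, field of view, spatial resolution, manipulator configuration, resolvability) can be reduced to a normalized score and combined by a weighted sum; the function names, weights, and selection-by-enumeration scheme are illustrative assumptions, not the paper's formulation.

```python
import numpy as np

def combined_placement_score(focus, fov, resolution, manip_config, resolvability,
                             weights=(0.2, 0.3, 0.15, 0.15, 0.2)):
    """Hypothetical weighted sum of placement criteria, each normalized to [0, 1].

    Higher is better. The weights are illustrative placeholders, not values
    from the paper.
    """
    scores = np.array([focus, fov, resolution, manip_config, resolvability])
    return float(np.dot(np.asarray(weights), scores))

def choose_camera_motion(candidate_motions, evaluate_criteria):
    """Pick the candidate camera motion with the highest predicted placement score.

    candidate_motions: iterable of camera velocity commands (e.g. 6-vectors).
    evaluate_criteria: callable mapping a motion to the five criterion scores,
                       e.g. by predicting defocus, feature visibility, and the
                       resulting manipulator configuration one step ahead.
    """
    best_motion, best_score = None, -np.inf
    for motion in candidate_motions:
        score = combined_placement_score(*evaluate_criteria(motion))
        if score > best_score:
            best_motion, best_score = motion, score
    return best_motion, best_score
```

In such a scheme, the tracking controller would call choose_camera_motion at each servo cycle, so that camera motion degrades gracefully (e.g. sacrificing some resolution to keep features in view) rather than failing outright near joint limits or occlusions.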
BibTeX
@conference{Nelson-1994-13683,
  author    = {Bradley Nelson and Pradeep Khosla},
  title     = {Integrating Sensor Placement and Visual Tracking Strategies},
  booktitle = {Proceedings of (ICRA) International Conference on Robotics and Automation},
  year      = {1994},
  month     = {May},
  volume    = {2},
  pages     = {1351--1356},
}