
Robotic contour following based on visual servoing

E. C. Maniere, P. Couvignou, and Pradeep Khosla
Conference Paper, Proceedings of (IROS) IEEE/RSJ International Conference on Intelligent Robots and Systems, Vol. 2, pp. 716-722, July 1993

Abstract

This paper presents a coherent approach to visual servoing tasks in which a hand-eye robotic system follows the contours of unknown objects. The motion of the contour relative to the camera sensor is estimated in real time by processing the measured optical flow of a set of relevant feature points. The desired motion of the end-effector is then computed to minimize the displacements of the feature points with respect to a reference configuration in the image. Control schemes are used to stabilize the robot and enhance tracking performance. The approach is illustrated with experimental results obtained in a real-time environment, in which the CMU Direct-Drive Arm II follows the boundaries of motionless objects positioned in a plane parallel to the image plane. Three degrees of freedom of planar motion are thus servoed to perform the contour-following task.
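For readers unfamiliar with the underlying idea, the sketch below illustrates a classical image-based visual servoing law of the kind the abstract describes: the camera velocity is chosen to reduce the displacement of tracked feature points from their reference configuration in the image. This is a generic textbook formulation, not the authors' exact controller; the interaction-matrix form, the gain, and all function and variable names (e.g. `ibvs_velocity`, the assumed feature depths) are illustrative assumptions.

```python
import numpy as np

def interaction_matrix(x, y, Z):
    """Standard interaction (image Jacobian) matrix for a point feature at
    normalized image coordinates (x, y) with depth Z. Maps the camera
    velocity screw (vx, vy, vz, wx, wy, wz) to the feature's image velocity."""
    return np.array([
        [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x**2), y],
        [0.0, -1.0 / Z, y / Z, 1.0 + y**2, -x * y, -x],
    ])

def ibvs_velocity(features, reference, depths, gain=0.5):
    """Compute a camera velocity screw that drives the measured feature
    points toward their reference configuration in the image."""
    error = (features - reference).reshape(-1)          # stacked image error
    L = np.vstack([interaction_matrix(x, y, Z)
                   for (x, y), Z in zip(features, depths)])
    # Classical image-based servoing law: v = -gain * pinv(L) * error
    return -gain * np.linalg.pinv(L) @ error

# Example: three feature points tracked on the contour, assumed depth 0.8 m
features  = np.array([[0.10, 0.05], [0.12, -0.02], [0.08, 0.00]])
reference = np.array([[0.10, 0.00], [0.12, 0.00], [0.08, 0.00]])
depths    = [0.8, 0.8, 0.8]
print("camera velocity screw:", ibvs_velocity(features, reference, depths))
```

In the setting reported in the paper, only three degrees of freedom of planar motion are servoed, so an analogous controller would retain only the columns of the interaction matrix corresponding to in-plane translation and rotation about the optical axis.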

BibTeX

@conference{Maniere-1993-13534,
author = {E. C. Maniere and P. Couvignou and Pradeep Khosla},
title = {Robotic contour following based on visual servoing},
booktitle = {Proceedings of (IROS) IEEE/RSJ International Conference on Intelligent Robots and Systems},
year = {1993},
month = {July},
volume = {2},
pages = {716 - 722},
}