Handheld micromanipulation with vision-based virtual fixtures

Conference Paper, Proceedings of (ICRA) International Conference on Robotics and Automation, pp. 4127-4132, May 2011

Abstract

Precise movement during micromanipulation becomes difficult in submillimeter workspaces, largely due to the destabilizing influence of tremor. Robotic aid combined with filtering techniques that suppress tremor frequency bands increases performance; however, if knowledge of the operator’s goals is available, virtual fixtures have been shown to greatly improve micromanipulator precision. In this paper, we derive a control law for position-based virtual fixtures within the framework of an active handheld micromanipulator, where the fixtures are generated in real time from microscope video. Additionally, we develop motion-scaling behavior centered on virtual fixtures as a simple and direct extension to our formulation. We demonstrate that hard and soft (motion-scaled) virtual fixtures outperform state-of-the-art tremor cancellation on a set of artificial but medically relevant tasks: holding, move-and-hold, curve tracing, and volume restriction.
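
As a rough illustration of the distinction between hard and soft (motion-scaled) virtual fixtures, the sketch below computes a commanded tip position from a tremor-attenuated handle position and a fixture geometry. This is not the control law derived in the paper; the segment fixture, the project_onto_segment helper, and the scaling gain gamma are illustrative assumptions, with gamma = 0 acting as a hard fixture and 0 < gamma < 1 as motion scaling about the fixture.

import numpy as np

def project_onto_segment(p, a, b):
    """Project point p onto the segment a-b (a stand-in for fixture
    geometry that would be extracted from microscope video)."""
    ab = b - a
    t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
    return a + t * ab

def virtual_fixture_target(tip, a, b, gamma=0.0):
    """Commanded tip position under a position-based virtual fixture.

    gamma = 0.0     -> hard fixture: the tip is held on the fixture.
    0 < gamma < 1   -> soft fixture: deviation from the fixture is
                       scaled down (motion scaling), attenuating
                       tremor and drift while still allowing motion.
    gamma = 1.0     -> fixture disabled: tip follows the handle.
    """
    anchor = project_onto_segment(tip, a, b)
    return anchor + gamma * (tip - anchor)

# Example: curve tracing, with the curve locally approximated by a segment.
a, b = np.array([0.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0])
noisy_tip = np.array([0.4, 0.03, -0.02])   # handle pose with residual tremor
print(virtual_fixture_target(noisy_tip, a, b, gamma=0.0))  # hard: snapped to segment
print(virtual_fixture_target(noisy_tip, a, b, gamma=0.2))  # soft: error scaled by 0.2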

BibTeX

@conference{Becker-2011-7269,
author = {Brian Becker and Robert MacLachlan and Gregory Hager and Cameron Riviere},
title = {Handheld micromanipulation with vision-based virtual fixtures},
booktitle = {Proceedings of (ICRA) International Conference on Robotics and Automation},
year = {2011},
month = {May},
pages = {4127--4132},
keywords = {medical robotics, microsurgery, virtual fixtures},
}