Vision-Based Control of a Handheld Surgical Micromanipulator with Virtual Fixtures
Abstract
Performing micromanipulation and delicate operations in submillimeter workspaces is difficult because of destabilizing tremor and imprecise targeting. Accurate micromanipulation is especially important for microsurgical procedures, such as vitreoretinal surgery, to maximize successful outcomes and minimize collateral damage. Robotic aid combined with filtering techniques that suppress tremor frequency bands increases performance; however, if knowledge of the operator’s goals is available, virtual fixtures have been shown to further improve performance. In this paper, we derive a virtual fixture framework for active handheld micromanipulators that is based on high-bandwidth position measurements rather than forces applied to a robot handle. For applicability in surgical environments, the fixtures are generated in real-time from microscope video during the procedure. Additionally, we develop motion scaling behavior around virtual fixtures as a simple and direct extension to the proposed framework. We demonstrate that virtual fixtures significantly outperform tremor cancellation algorithms on a set of synthetic tracing tasks (p < 0.05). In more medically relevant experiments of vein tracing and membrane peeling in eye phantoms, virtual fixtures can significantly reduce both positioning error and forces applied to tissue (p < 0.05).
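To make the virtual-fixture idea concrete, the sketch below (my own illustration, not the authors' implementation; the function name, polyline representation, and gains `k_perp` and `k_scale` are all hypothetical) attenuates the component of commanded tip motion perpendicular to a polyline fixture while passing motion along it, and applies a uniform gain as a simple form of motion scaling:

```python
import numpy as np

def apply_virtual_fixture(tip, fixture_pts, motion, k_perp=0.2, k_scale=0.5):
    """Illustrative virtual fixture: attenuate motion off a polyline fixture.

    tip         -- current tool-tip position (3-vector)
    fixture_pts -- (N, 3) array of polyline vertices defining the fixture
    motion      -- commanded tip displacement (3-vector)
    k_perp      -- gain on motion perpendicular to the fixture (0 = hard fixture)
    k_scale     -- uniform motion-scaling gain applied to the result
    """
    best = None
    # Find the closest point on the fixture polyline and its local tangent.
    for a, b in zip(fixture_pts[:-1], fixture_pts[1:]):
        ab = b - a
        t = np.clip(np.dot(tip - a, ab) / np.dot(ab, ab), 0.0, 1.0)
        p = a + t * ab
        d = np.linalg.norm(tip - p)
        if best is None or d < best[0]:
            best = (d, ab / np.linalg.norm(ab))
    _, tangent = best
    # Decompose commanded motion into along-fixture and off-fixture parts.
    along = np.dot(motion, tangent) * tangent
    perp = motion - along
    # Pass motion along the fixture, attenuate deviation, then scale uniformly.
    return k_scale * (along + k_perp * perp)
```

For a straight fixture along the x-axis, a commanded motion of `[1, 1, 0]` with these gains is filtered to `[0.5, 0.1, 0]`: progress along the fixture is preserved (up to scaling) while deviation toward tissue is strongly suppressed.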
BibTeX
@article{Becker-2013-7668,
  author   = {Brian Becker and Robert MacLachlan and Louis A. Lobes Jr. and Gregory Hager and Cameron Riviere},
  title    = {Vision-Based Control of a Handheld Surgical Micromanipulator with Virtual Fixtures},
  journal  = {IEEE Transactions on Robotics},
  year     = {2013},
  month    = {February},
  volume   = {29},
  number   = {3},
  pages    = {674--683},
  keywords = {Micro/nano robots, dexterous manipulation, motion control, medical robotics, vision-based control, microsurgery, tremor canceling},
}