Multidimensional Capacitive Sensing for Robot-Assisted Dressing and Bathing
Abstract
Robotic assistance presents an opportunity to benefit the lives of many people with physical disabilities, yet accurately sensing the human body and tracking human motion remain difficult for robots. We present a multidimensional capacitive sensing technique that estimates the local pose of a human limb in real time. A key benefit of this sensing method is that it can sense the limb through opaque materials, including fabrics and wet cloth. Our method uses a multielectrode capacitive sensor mounted to a robot's end effector. A neural network model estimates the position of the closest point on a person's limb and the orientation of the limb's central axis relative to the sensor's frame of reference. These pose estimates enable the robot to move its end effector with respect to the limb using feedback control. We demonstrate that a PR2 robot can use this approach with a custom six-electrode capacitive sensor to assist with two activities of daily living: dressing and bathing. The robot pulled the sleeve of a hospital gown onto able-bodied participants' right arms while tracking human motion. When assisting with bathing, the robot moved a soft, wet washcloth to follow the contours of able-bodied participants' limbs, cleaning their surfaces. Overall, we found that multidimensional capacitive sensing presents a promising approach for robots to sense and track the human body during assistive tasks that require physical human-robot interaction.
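The pipeline the abstract describes (six capacitance readings → neural network pose estimate → feedback control of the end effector) can be sketched as follows. This is a minimal illustration, not the paper's trained model: the network weights are random placeholders, and the pose parameterization (a 2D offset to the closest point on the limb plus pitch and yaw of the limb's central axis) and the proportional controller are assumptions for the sake of the example.

```python
import numpy as np

rng = np.random.default_rng(0)

N_ELECTRODES = 6   # the paper's custom sensor has six electrodes
POSE_DIM = 4       # assumed: (y, z) offset to closest point + (pitch, yaw) of limb axis

# Placeholder two-layer MLP weights; a trained model would be loaded instead.
W1 = rng.standard_normal((N_ELECTRODES, 32)) * 0.1
b1 = np.zeros(32)
W2 = rng.standard_normal((32, POSE_DIM)) * 0.1
b2 = np.zeros(POSE_DIM)

def estimate_pose(capacitance: np.ndarray) -> np.ndarray:
    """Map one frame of raw capacitance readings to a local limb-pose estimate."""
    h = np.tanh(capacitance @ W1 + b1)
    return h @ W2 + b2

def control_step(pose: np.ndarray, target: np.ndarray, gain: float = 0.5) -> np.ndarray:
    """Proportional feedback: velocity command that moves the end effector
    toward the target pose relative to the limb."""
    return gain * (target - pose)

# Stand-in sensor frame; on the robot this would come from the electrode array.
readings = rng.standard_normal(N_ELECTRODES)
pose = estimate_pose(readings)
cmd = control_step(pose, target=np.zeros(POSE_DIM))
```

In the real system this loop would run at the sensor's sampling rate, with the command sent to the end-effector velocity controller each frame.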
Best Student Paper Award
BibTeX
@conference{Erickson-2019-127578,
  author = {Zackory Erickson and Henry M. Clever and Vamsee Gangaram and Greg Turk and C. Karen Liu and Charles C. Kemp},
  title = {Multidimensional Capacitive Sensing for Robot-Assisted Dressing and Bathing},
  booktitle = {Proceedings of IEEE 16th International Conference on Rehabilitation Robotics (ICORR '19)},
  year = {2019},
  month = {June},
  pages = {224--231},
}