
3D SLAM for Powered Lower Limb Prosthesis

Master's Thesis, Tech. Report CMU-RI-TR-21-21, Robotics Institute, Carnegie Mellon University, June 2021

Abstract

During locomotion, humans use visual feedback to adjust their leg movements while navigating the environment. This natural behavior is lost for lower-limb amputees, however, as current control strategies for prosthetic legs do not typically consider environment perception. With the ultimate goal of achieving environment awareness and adaptability for prosthetic legs, we take an initial step by leveraging the rapid development of affordable hardware and advanced algorithms for simultaneous localization and mapping (SLAM), which has fueled the integration of perception into the control of mobile robots, including legged machines. We propose, implement, and analyze a SLAM integration for a lower-limb prosthesis. To this end, we first simulate the motion of a range sensor mounted on a prosthesis and investigate the performance of state-of-the-art SLAM algorithms subjected to the rapid motions seen in lower-limb movement. Our simulation results highlight the challenges of drift and registration errors stemming from the sensor's dynamic motion. Based on these observations and on knowledge of the walking gait, we then implement a modified SLAM pipeline in hardware that uses gait-phase information to bypass these challenges by resetting the global map and odometry at the beginning of each stride. The pipeline uses an RGB-D camera to perform a dense reconstruction of the terrain directly in front of the prosthesis using a colored point cloud registration algorithm. In preliminary tests with one able-bodied subject, we find that the algorithm creates dense representations of multiple obstacles with an accuracy of 11 mm while simultaneously tracking the camera pose with an accuracy of 19 mm.
Although we conducted the hardware experiments with the registration algorithms running offline, our results suggest that SLAM methods can be implemented on lower-limb prostheses with sufficient accuracy to enable environment perception, opening up avenues for advanced control strategies for prosthetic legs that proactively adapt to changes in the environment and thus unburden their amputee users.
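The core idea of the modified pipeline, resetting the local map and odometry at the start of each stride so that per-stride drift cannot accumulate into a global map, can be sketched as below. This is a minimal illustrative skeleton, not the thesis implementation: the class name, the heel-strike trigger, and the toy "registration" step (simple accumulation of an estimated inter-frame motion) are all assumptions standing in for the actual colored point cloud registration.

```python
# Hypothetical sketch of the per-stride reset strategy from the abstract:
# frames are registered into a stride-local map, and both the map and the
# odometry are cleared whenever a new stride begins (e.g., at heel strike).
class StrideLocalSLAM:
    def __init__(self):
        self.reset()

    def reset(self):
        """Clear the stride-local map and odometry at the start of a stride."""
        self.map_points = []          # dense reconstruction for this stride only
        self.pose = [0.0, 0.0, 0.0]  # camera translation relative to stride start

    def process_frame(self, points, odom_delta, heel_strike):
        # Gait-phase information bypasses long-term drift: a new stride
        # starts a fresh local map instead of correcting a global one.
        if heel_strike:
            self.reset()
        # Stand-in for colored point cloud registration: accumulate the
        # estimated inter-frame motion, then merge the transformed points.
        self.pose = [p + d for p, d in zip(self.pose, odom_delta)]
        self.map_points.extend(
            tuple(c + o for c, o in zip(pt, self.pose)) for pt in points
        )
        return self.pose


# Toy usage: two frames within one stride, then a heel strike resets the state.
slam = StrideLocalSLAM()
slam.process_frame([(0.0, 0.0, 1.0)], (0.1, 0.0, 0.0), heel_strike=False)
slam.process_frame([(0.0, 0.0, 1.0)], (0.1, 0.0, 0.0), heel_strike=False)
slam.process_frame([(0.0, 0.0, 1.0)], (0.0, 0.0, 0.0), heel_strike=True)
```

Because each stride's map covers only the terrain directly ahead of the prosthesis, the reconstruction stays small and the registration errors observed in the simulations cannot compound across strides.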

BibTeX

@mastersthesis{Shah-2021-127969,
author = {Manan Shah},
title = {3D SLAM for Powered Lower Limb Prosthesis},
year = {2021},
month = {June},
school = {Carnegie Mellon University},
address = {Pittsburgh, PA},
number = {CMU-RI-TR-21-21},
}