RCA: Ride Comfort-Aware First-Person Navigation via Self-Supervised Learning
Abstract
Under shared autonomy, wheelchair users expect vehicles to provide safe and comfortable rides while following users' high-level navigation plans. To find such a path, a vehicle must negotiate different terrains and assess their traversal difficulty. Most prior work models the surroundings through either geometric representations or semantic classifications, neither of which reflects perceived motion intensity or ride comfort in downstream navigation tasks. We propose to model ride comfort explicitly in traversability analysis using proprioceptive sensing. We develop a self-supervised learning framework that predicts a traversability costmap from first-person-view images, leveraging recorded vehicle states as training signals. Our approach estimates, from terrain appearance alone, how the vehicle would "feel" if it were to traverse that terrain. We then show through robot experiments and a human evaluation study that our navigation system provides the ride comfort humans prefer.
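To make the self-supervised setup concrete, the sketch below illustrates one plausible instantiation of the idea the abstract describes: a network predicts a per-pixel cost from a first-person-view image, and supervision comes only from terrain the vehicle actually drove over, labeled with a discomfort proxy derived from proprioceptive (IMU) readings. All names, shapes, and the choice of comfort label are illustrative assumptions, not the thesis's actual implementation.

```python
# Minimal sketch of self-supervised costmap learning from FPV images,
# with vehicle states as training signals. Hypothetical design choices
# throughout; not the method's actual architecture or label definition.
import torch
import torch.nn as nn
import torch.nn.functional as F

class CostmapNet(nn.Module):
    """Tiny encoder-decoder mapping an FPV image to a per-pixel cost."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(32, 16, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(16, 1, 4, stride=2, padding=1),
        )

    def forward(self, img):
        return self.decoder(self.encoder(img)).squeeze(1)  # (B, H, W)

def comfort_label(imu_accel_z):
    """Scalar ride-discomfort proxy: std of vertical acceleration over
    the traversal window (one of many plausible choices)."""
    return imu_accel_z.std(dim=-1)

def self_supervised_loss(pred_cost, footprint_mask, label):
    """Supervise only pixels the vehicle traversed (projected footprint);
    untraversed terrain receives no gradient."""
    traversed = (pred_cost * footprint_mask).sum(dim=(1, 2))
    traversed = traversed / footprint_mask.sum(dim=(1, 2)).clamp(min=1.0)
    return F.mse_loss(traversed, label)

# One dummy training step on synthetic data.
model = CostmapNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
img = torch.rand(4, 3, 64, 64)                # FPV images
mask = (torch.rand(4, 64, 64) > 0.9).float()  # traversed-footprint mask
labels = comfort_label(torch.randn(4, 100))   # from logged IMU data
loss = self_supervised_loss(model(img), mask, labels)
opt.zero_grad(); loss.backward(); opt.step()
```

The key property this sketch captures is that no manual terrain annotation is needed: labels come for free from driving, and at deployment the network can score terrain the vehicle has never touched, based on appearance alone.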
BibTeX
@mastersthesis{Yao-2022-131679,
author = {Xinjie Yao},
title = {RCA: Ride Comfort-Aware First-Person Navigation via Self-Supervised Learning},
year = {2022},
month = {May},
school = {Carnegie Mellon University},
address = {Pittsburgh, PA},
number = {CMU-RI-TR-22-15},
keywords = {Autonomous Vehicle Navigation; Vision-Based Navigation; Self-Supervised Learning},
}