Bimanual Cloth Manipulation using Flow
Abstract
In this work, we address the problem of goal-directed cloth manipulation, a challenging task due to the deformability of cloth. Our insight is that optical flow, a technique normally used for motion estimation in video, can also provide an effective representation for corresponding cloth poses across observation and goal images. We introduce FabricFlowNet (FFN), a cloth manipulation policy that leverages flow both as an input and as an action representation to improve performance on folding tasks. FabricFlowNet also switches elegantly between dual-arm and single-arm actions based on the desired goal.
We show that FabricFlowNet outperforms state-of-the-art model-free and model-based cloth manipulation policies. We also present real-world experiments on a bimanual system, demonstrating effective sim-to-real transfer. In addition, we show that our method, trained only on a single square cloth, generalizes to other cloth shapes, such as rectangular cloths and t-shirts.
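The core idea of using flow as an action representation can be illustrated with a minimal sketch: given a dense flow field that corresponds observation pixels to goal pixels, the placement target for a chosen pick point can be read off directly from the flow at that pixel. The flow values and the `place_from_flow` helper below are illustrative stand-ins, not FFN's learned network output.

```python
import numpy as np

def place_from_flow(flow, pick):
    """Return the place location implied by the flow at a pick pixel.

    flow: (H, W, 2) array of per-pixel displacements (d_row, d_col)
    pick: (row, col) pixel chosen on the observed cloth
    """
    d_row, d_col = flow[pick[0], pick[1]]
    return (pick[0] + int(d_row), pick[1] + int(d_col))

# Toy flow field: every pixel moves 5 rows down and 3 columns right.
flow = np.zeros((64, 64, 2))
flow[..., 0] = 5.0
flow[..., 1] = 3.0

print(place_from_flow(flow, (10, 20)))  # -> (15, 23)
```

In this framing, a pick-and-place action is fully determined by the pick location plus the flow vector at that location, which is what lets the same representation serve as both the policy's input and its action output.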
BibTeX
@mastersthesis{Bajracharya-2021-128575,
author = {Sujay Bajracharya},
title = {Bimanual Cloth Manipulation using Flow},
year = {2021},
month = {August},
school = {Carnegie Mellon University},
address = {Pittsburgh, PA},
number = {CMU-RI-TR-21-27},
}