Robots with human-level dexterity have the potential to revolutionize object manipulation and reconstruction tasks, but achieving this level of dexterity is challenging, particularly when robots must handle a wide variety of low-texture objects with high precision.
To address this challenge and push the boundaries of current robotic tactile sensing, RI Ph.D. student Hung-Jui (Joe) Huang and his advisors Michael Kaess and Wenzhen Yuan introduced NormalFlow, a real-time tactile-based 6DoF (six degrees of freedom) tracking algorithm. NormalFlow focuses on improving tracking to enhance robot manipulation, enabling reliable handling of low-texture objects while also delivering more robust and accurate tracking for other objects.
NormalFlow outperforms current baseline tracking methods through its use of surface normals instead of surface point clouds. While registering point clouds between frames can help track an object's movement, surface normal maps describe how the object's surface is tilted at every point, which allows the robot to better understand how the object is rotated and translated, even when it lacks distinct geometric features.
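The intuition, that a field of surface normals pins down rotation even when the surface has few distinct features, can be illustrated with a short sketch. The snippet below is not the NormalFlow algorithm itself (which solves the full 6DoF pose by optimizing directly over tactile normal maps); it is a hypothetical, minimal example showing how corresponding unit normals from two frames determine the relative rotation via a standard Kabsch-style SVD alignment.

```python
import numpy as np

def rotation_from_normals(normals_a, normals_b):
    """Estimate the rotation R such that normals_b[i] ≈ R @ normals_a[i].

    normals_a, normals_b: (N, 3) arrays of corresponding unit surface
    normals from two frames. Illustrative only -- not NormalFlow itself.
    """
    # 3x3 covariance of the two direction fields
    H = normals_a.T @ normals_b
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection solution (det = -1)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R
```

Note that normals alone constrain only rotation; translation must come from elsewhere, e.g. from how the normal map shifts across the sensor image, which is part of what a full tactile tracker like NormalFlow handles.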
Huang gave an example of one of the many situations in which NormalFlow could be applied.
“Imagine that a robot is required to insert a key to open up a door. Before inserting the key, the robot needs to track the slight displacement of the key in its fingers when trying to insert the key into the keyhole. When rotating the key in the keyhole, the robot needs to tell if it gets stuck during rotation and in which way. All of this requires precise and robust tracking of novel objects.”
The team conducted several experiments to evaluate the speed and accuracy of the NormalFlow method for tracking objects. They employed GelSight, a vision-based tactile sensor, to capture surface normals of several objects, such as an avocado, a can, a table, a wrench and a baseball. To gather pose data, they moved each object on the sensor's surface, applying both rotational and translational motions in several directions. Through these experiments, they found that NormalFlow's accuracy and speed surpassed traditional methods, particularly on objects with minimal texture.
Alongside tracking, the team applied NormalFlow to a tactile-based 3D reconstruction task. Using a bead as the test subject, they found that the high precision of their algorithm allowed for excellent reconstructions that capture the bead's intricate details.
“While this project focuses on tracking, it is still exciting to see positive results in a brief 3D reconstruction test,” said Huang. “It proves that our algorithm has future potential for more detailed reconstruction tasks.”
NormalFlow was accepted to IEEE Robotics and Automation Letters, and Huang will present the project at the 2025 IEEE International Conference on Robotics and Automation (ICRA) with a live demonstration of the algorithm's tracking capabilities. In future work, Huang aims to build on NormalFlow's initial success to further develop tactile sensing technologies.
Visuals, open-source code and the research paper on NormalFlow can be found on the project website.
For More Information: Aaron Aupperlee | 412-268-9068 | aaupperlee@cmu.edu