Building Robot Hands and Teaching Dexterity
Abstract
Try typing on your keyboard, hammering a nail, or using chopsticks. Our hands
are the key to manipulating the world around us. They combine remarkable
fingertip strength, letting us pinch and grasp in over 70 different ways, with an
unparalleled sense of touch. This sensing and adaptation is orchestrated by our
brains; in fact, the development of the human brain is often attributed to the
need to manipulate the world around us with our hands. In robotics, however,
manipulation has mostly been limited to claw grippers and suction cups for
pick-and-place in factories. Yet our shared dream is for humanoid robots to
cohabit with us and complete the same tasks that humans do. Why don't
humanoid robots with useful robot hands exist?
While a few robot hands are available today, the popular opinion is that they
are difficult to use, expensive, and hard to obtain. We argue that this is not an
inherent problem of robot hands, but rather that existing hands are not designed
with the right principles. In this thesis, we introduce a new class of robot hands
designed for machine learning that are more dexterous, lower cost, and easier
to use than prevailing robot hands. They serve as a democratized entry point
for people entering dexterous manipulation research and foster conversations
about robot hand design and customization in the open-source community.
Next, how can robot hands imitate the human brain's ability to complete
dexterous tasks in a human-like fashion? For instance, a knife must be grasped
firmly by the handle, not the blade. While most robots used today have fewer
than 10 degrees of freedom, a humanoid with two hands has over 50 degrees
of freedom and many points of contact with the environment. This high
dimensionality makes data-efficient learning extremely difficult. To address this,
we leverage internet-scale human experience from the web as training data.
Because robot hands have a morphology similar to human hands, we can learn
directly from retargeted human motion and teach robots with significantly more
data. In this thesis, we find that this unlocks the generalizable, human-like
behavior we seek.
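
To make the retargeting idea concrete, the sketch below shows one simple way human
fingertip motion could be mapped onto a robot finger: scale the fingertip target by the
ratio of finger lengths, then solve inverse kinematics for the robot joints. The 2-link
planar finger, link lengths, and SciPy optimizer are illustrative assumptions, not the
hand model or retargeting pipeline used in the thesis.

# Minimal retargeting sketch: map a human fingertip position (relative to the
# wrist) onto a simplified 2-link robot finger by scaling the target to robot
# proportions and solving inverse kinematics with a small optimization.
# Link lengths, scales, and the planar model are illustrative assumptions.
import numpy as np
from scipy.optimize import minimize

HUMAN_FINGER_LENGTH = 0.08            # assumed human index-finger length (m)
ROBOT_LINKS = np.array([0.05, 0.04])  # assumed robot finger link lengths (m)

def forward_kinematics(q):
    """Planar 2-link fingertip position for joint angles q = [q1, q2]."""
    x = ROBOT_LINKS[0] * np.cos(q[0]) + ROBOT_LINKS[1] * np.cos(q[0] + q[1])
    y = ROBOT_LINKS[0] * np.sin(q[0]) + ROBOT_LINKS[1] * np.sin(q[0] + q[1])
    return np.array([x, y])

def retarget(human_tip_xy):
    """Scale the human fingertip target to robot proportions, then solve IK."""
    scale = ROBOT_LINKS.sum() / HUMAN_FINGER_LENGTH
    target = scale * np.asarray(human_tip_xy)
    cost = lambda q: np.sum((forward_kinematics(q) - target) ** 2)
    result = minimize(cost, x0=np.array([0.3, 0.3]), bounds=[(0.0, 1.6)] * 2)
    return result.x

# Example: a human fingertip pose (metres, relative to the wrist) captured from
# video becomes a pair of robot joint angles.
print(retarget([0.05, 0.04]))

Running this per fingertip and per frame would turn a stream of human hand poses into
robot joint trajectories, which is the sense in which morphological similarity lets human
data be used directly as training data.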
BibTeX
@mastersthesis{Shaw-2024-140681,
author = {Kenneth Shaw},
title = {Building Robot Hands and Teaching Dexterity},
year = {2024},
month = {May},
school = {Carnegie Mellon University},
address = {Pittsburgh, PA},
number = {CMU-RI-TR-24-02},
}