Abstract:
We are on the cusp of a fundamental revolution in how robots will be integrated into and assimilated by our everyday lives. In the next decade, state-of-the-art robot platforms will become easier to deploy, more affordable to purchase, and readily available to non-expert users. An increasing body of work has found that developing algorithms for robots to engage in close, natural, and extended interactions with humans presents distinct challenges. Existing approaches to robot control and motion planning, originally developed for robots that work in structured environments and in isolation from humans, fall short when dealing with unexpected or unobservable human behavior; conversely, current technologies for human interaction remain confined to rigid algorithms and opaque systems. Developing effective and efficient human-robot interactions therefore calls for a paradigm shift away from traditional approaches to robotics problems. In my talk, I review my work on developing robot algorithms for physical and human interaction, with a specific focus on applications in human-robot collaboration and advanced manufacturing. My vision is to place human-robot collaborative interaction at the center of a larger scope of contributions to problems that have traditionally been dealt with in isolation. To this end, my future research agenda will focus on: i) distributed, reactive, whole-body controllers for physical interaction; ii) tactile technologies (i.e. artificial skins) as a key enabler of richer physical and human interactions; iii) algorithmic approaches to the broad HRI problem, i.e. building human-aware, transparent robot controllers that tackle the human interaction problem at the same level of abstraction as the physical interaction problem.
Bio:
Alessandro Roncone is a Postdoctoral Associate at the Social Robotics Lab at Yale University, under the supervision of Prof. Brian Scassellati. His current research falls within the larger field of human-robot collaboration and advanced manufacturing. He implements transparent, human-centered technologies in which robots transition from being recipients of human instructions to becoming proficient and proactive collaborators. He incorporates natural language into classical task planning algorithms, with the goal of developing robots that are able to: i) provide effective support to humans when they need it the most; ii) learn complex hierarchical representations from single instructions; iii) proactively ask questions and provide contextual information to query and share internal states and intents.
Alessandro Roncone received his B.Sc. summa cum laude in Biomedical Engineering in February 2008, and his M.Sc. summa cum laude in NeuroEngineering in July 2011, both from the University of Genoa, Italy. In April 2015 he completed his Ph.D. in Robotics, Cognition and Interaction Technologies at the University of Genoa and the Italian Institute of Technology (IIT), working in the Robotics, Brain and Cognitive Sciences department and the iCub Facility under the supervision of Prof. Giorgio Metta. The goal of his Ph.D. project was to exploit insights from neuroscience to improve the sensorimotor capabilities of the iCub humanoid robot, by implementing a bio-inspired system able to learn a multisensory representation of the space around the robot’s body (or peripersonal space). His research also contributed to the field of optimization-based approaches to inverse kinematics and robot control. Specifically, he implemented a state-of-the-art gaze stabilization framework, later integrated with an existing gaze controller. His work on this topic formally solved the problem of controlling a binocular head to foveate toward an arbitrary 3D point in space while concurrently exploiting redundancy to stabilize gaze.