Seminar
State Estimation and Localization for ROV-Based Reactor Pressure Vessel Inspection Using a Pan-Tilt-Zoom Camera
Event Location: NSH 1305
Bio: Timothy E. Lee is an M.S. in Robotics graduate student at Carnegie Mellon University, advised by Prof. Nathan Michael. Timothy's field robotics research seeks to enable robust, efficient, and autonomous inspection of critical infrastructure. Specifically, he is working towards improving the efficiency of nuclear power by enabling camera-based navigation of underwater [...]
Detecting and Grasping Sorghum Stalks in Outdoor Occluded Environments
Event Location: GHC 6501
Bio: Merritt Jenkins is an M.S. student in the Robotics Institute at Carnegie Mellon University, advised by Dr. George Kantor. Merritt's field robotics research focuses on perception and intelligent manipulation of plants in outdoor environments, enabling plant breeders and geneticists to make better-informed breeding decisions. Prior to CMU, Merritt received a B.E. [...]
Adaptive Spectroscopic Exploration Driven by Science Hypotheses for Geologic Mapping
Event Location: NSH 1507
Bio: Alberto Candela Garza is an M.S. in Robotics student at Carnegie Mellon University, advised by Prof. David Wettergreen. Alberto is affiliated with the Field Robotics Center and is interested in science autonomy for planetary rovers. Prior to CMU, Alberto received a B.S. in Mechatronics Engineering and a B.S. in Industrial Engineering [...]
Sven Koenig: Progress on Multi-Robot Path Finding
Abstract: Teams of robots often have to assign target locations among themselves and then plan collision-free paths to their target locations. Examples include autonomous aircraft towing vehicles and automated warehouse systems. For example, in the near future, autonomous aircraft towing vehicles might tow aircraft all the way from the runways to their gates (and vice [...]
David Held: Robots Learning to Understand Environmental Changes
Abstract: Robots today are typically confined to operate in relatively simple, controlled environments. One reason for this limitation is that current methods for robotic perception and control tend to break down when faced with occlusions, viewpoint changes, poor lighting, unmodeled dynamics, and other challenging but common situations that occur when robots are placed in the [...]