VASC Seminar

Andrew Johnson, Principal Member of Technical Staff, NASA
Monday, October 24
3:00 pm
Safe and Precise Planetary Landing Technologies

Event Location: NSH 3305
Bio: Dr. Andrew E. Johnson graduated with Highest Distinction from the University of Kansas in 1991 with a BS in Engineering Physics and a BS in Mathematics. In 1997, he received his Ph.D. from the Robotics Institute at Carnegie Mellon University, where he developed the spin-image surface signature for three-dimensional object recognition and surface matching. Currently, he is a Principal Member of Technical Staff at NASA’s Jet Propulsion Laboratory, where he is developing image-based techniques for autonomous navigation and mapping during descent to planets, moons, comets, and asteroids. At JPL, Dr. Johnson has worked on technology development tasks as well as flight projects. For the Mars Exploration Rover Project, Dr. Johnson was the lead algorithm developer for the Descent Image Motion Estimation Subsystem (DIMES), the first autonomous machine vision system used during planetary landing. Following the successful development and execution of DIMES, he is now moving back to the development of systems for landing hazard avoidance, pinpoint landing, and rover navigation. He is currently the Task Manager for the Mars 2018 Lander Vision System and the Supervisor for the Guidance, Navigation and Control Hardware and Testbed Development Group. In 2011, Dr. Johnson was awarded a NASA Exceptional Technology Achievement Medal for “significant contributions in evolving Terrain Relative Navigation and Hazard Detection and Avoidance technologies for NASA missions.”

Abstract: Landing hazards are surface features that could damage a planetary lander during touchdown; they include tall rocks, steep slopes, scarps, cliffs, and craters. The purpose of Hazard Detection and Avoidance (HDA) is to autonomously detect these hazards near the landing site and then select a new landing site that is safe. Once a safe site has been selected, Hazard Relative Navigation (HRN) estimates the position of the lander relative to that site so it can be targeted accurately.
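(For illustration only, not material from the talk: the minimal Python sketch below captures the HDA idea on a gridded elevation map, flagging a cell as hazardous when its local slope or roughness exceeds a limit and then picking the safe cell nearest the original target. The thresholds, window size, and function names are assumptions made for the example, not JPL's flight algorithm.)

```python
import numpy as np

def hazard_map(dem, cell_size=1.0, slope_limit_deg=10.0, rough_limit=0.3):
    """Toy hazard detection on a gridded elevation map (DEM).

    A cell is hazardous if the local slope or local roughness (deviation
    from the mean elevation of its 3x3 neighborhood) exceeds a limit.
    Conceptual sketch only.
    """
    # Local slope from finite differences of the elevation grid.
    gy, gx = np.gradient(dem, cell_size)
    slope_deg = np.degrees(np.arctan(np.hypot(gx, gy)))

    # Local roughness: deviation of each cell from its 3x3 window mean.
    pad = np.pad(dem, 1, mode="edge")
    windows = np.lib.stride_tricks.sliding_window_view(pad, (3, 3))
    rough = np.abs(dem - windows.mean(axis=(2, 3)))

    return (slope_deg > slope_limit_deg) | (rough > rough_limit)

def select_safe_site(hazards, target_rc):
    """Pick the safe cell closest to the originally targeted cell."""
    safe_rows, safe_cols = np.where(~hazards)
    if safe_rows.size == 0:
        raise RuntimeError("no safe site in the mapped area")
    d2 = (safe_rows - target_rc[0]) ** 2 + (safe_cols - target_rc[1]) ** 2
    i = int(np.argmin(d2))
    return int(safe_rows[i]), int(safe_cols[i])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    dem = rng.normal(0.0, 0.05, (64, 64))   # mostly flat terrain
    dem[30:34, 30:34] += 1.5                 # a tall rock / scarp near the target
    hazards = hazard_map(dem)
    print("new landing site:", select_safe_site(hazards, target_rc=(32, 32)))
```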

In contrast, Terrain Relative Navigation (TRN) estimates the position of the lander relative to a map built from a priori orbital reconnaissance. If the lander has enough fuel, TRN can be used to guide it to a single landing site of particular importance (e.g., the sample cache for Mars Sample Return or a lunar cave for exploration). TRN can also be used to avoid large hazards, enabling the science community to select more hazardous landing ellipses (e.g., Multi-X for Mars 2018).
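(Again purely as an illustration of the concept: the sketch below frames TRN as matching a descent-camera patch against an a priori orbital map by exhaustive normalized cross-correlation. The map sizes, noise model, and the function name trn_fix are hypothetical; real TRN systems use more efficient matching combined with a navigation filter.)

```python
import numpy as np

def trn_fix(orbital_map, descent_patch):
    """Estimate where a descent-camera patch lies within an orbital map.

    Exhaustive normalized cross-correlation over all patch-sized windows;
    returns the best (row, col) offset and its correlation score.
    """
    ph, pw = descent_patch.shape
    mh, mw = orbital_map.shape
    p = descent_patch - descent_patch.mean()
    best_score, best_rc = -np.inf, (0, 0)
    for r in range(mh - ph + 1):
        for c in range(mw - pw + 1):
            w = orbital_map[r:r + ph, c:c + pw]
            w = w - w.mean()
            denom = np.linalg.norm(p) * np.linalg.norm(w)
            score = float(p.ravel() @ w.ravel()) / denom if denom > 0 else 0.0
            if score > best_score:
                best_score, best_rc = score, (r, c)
    return best_rc, best_score

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    orbital_map = rng.normal(size=(128, 128))   # stand-in for an orbital mosaic
    true_rc = (40, 70)
    # Noisy "descent image" patch cut from the known true location.
    patch = orbital_map[40:72, 70:102] + rng.normal(0.0, 0.1, (32, 32))
    est_rc, score = trn_fix(orbital_map, patch)
    print("true:", true_rc, "estimated:", est_rc, "score: %.2f" % score)
```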

Over the past decade, JPL has been developing HDA, HRN, and TRN systems using both passive visible and active lidar sensing modalities. This talk will describe the sensors and algorithms developed, along with performance results from aircraft and sounding-rocket field test campaigns. These results show that lidar-based TRN can achieve 90 m accuracy under any lighting conditions, passive visible TRN can achieve 10 m accuracy during the day, lidar-based HDA can detect 40 cm hazards from 1 km, and lidar-based HRN can navigate to within 1 m of a specified site.