July 17, 2024    Mallory Lindahl

Carnegie Mellon systems scientist Ji Zhang has received the prestigious Robotics: Science and Systems (RSS) 2024 Test of Time Award for his work on LOAM: Lidar Odometry and Mapping in Real-Time.

RSS introduced the Test of Time Award to recognize papers published at least ten years ago that have had the greatest impact on robotics design or on new approaches to problem solving. The award aims to cultivate discussion about developments in robotics by reflecting on the past and looking forward to future work in the field. As this year's awardee, Zhang will present a virtual keynote on July 18, with a Test of Time panel session devoted to his work on LOAM at the RSS 2024 conference. Zhang is also organizing a workshop at the conference.

“It’s called AI Meets Autonomy,” said Zhang. “The workshop is to bring researchers together from around the world to share their work at the intersection of natural language understanding, computer vision scene understanding, and autonomous navigation.”

Zhang’s work on LOAM began back in 2013, when he joined his PhD advisor, Sanjiv Singh, on an agricultural robotics project. Their goal was to use simultaneous localization and mapping (SLAM) to automate an electric utility vehicle and have it drive between tree rows in orchards. To achieve this, Singh and Zhang converted a 2D lidar into a 3D sensor by adding a spinning mechanism that swept the beam through the surrounding environment.

The next year, their first LOAM paper was published at Robotics: Science and Systems, presenting a real-time method for odometry and mapping with a 2-axis lidar moving in 6-DOF (six degrees of freedom). The method addressed the challenge of real-time mapping by splitting the problem into two parallel processes: estimating the sensor's motion at high frequency, and registering scans into the map at lower frequency, allowing a robot moving freely through an environment to map it efficiently with 360-degree horizontal coverage. The paper has remained foundational to simultaneous localization and mapping research, earning it the Test of Time Award.

In 2015 and 2016, Zhang and Singh added a camera to the initial system to make its sensing more robust. By combining three sensing modalities (the new camera, the lidar, and an inertial measurement unit, or IMU), the device could move much faster and map larger areas at a time. Soon after, the team ventured into building a flying device to create maps from both the air and the ground.

By 2017, the team had turned to autonomous systems, using LOAM as a building block for collision avoidance and path planning in autonomous vehicles. LOAM helped the robot understand where it was and build a map of its surroundings; the autonomy modules then used this information to guide the robot toward specific goal points or to explore and map an environment on its own.

As work on autonomous exploration continued over the next five years, the research was recognized by major robotics conferences. One paper became the first ever to win both Best Paper and Best Systems Paper at RSS 2021, and follow-up work won Best Student Paper at IROS 2022. In 2023, Science Robotics published the team's most recent paper, which presents a dual-resolution scheme for time-efficient autonomous exploration.

Today, Zhang is working with RI faculty member Wenshan Wang to take the fully built system and move it further toward the intersection of computer vision, natural language understanding, and autonomous navigation. Zhang and Wang are currently developing high-level AI modules that understand environmental scenes and language while guiding the robot autonomously.

“Now we have the full autonomy stack, which is a full autonomous navigation system that can guide the vehicle to go to goal points or to explore unknown environments and build a map,” said Zhang. “We’re trying to build a high level AI module that can understand people’s natural language and understand the scene with camera vision technologies to guide the vehicle to navigate autonomously.”

From its early applications in agricultural robots to its current integration with natural language understanding and computer vision, LOAM has helped set the standard for real-time mapping and autonomous navigation. The Test of Time Award highlights the lasting impact of Zhang's work and its role in driving future robotics research.

Read the award-winning 2014 paper and watch the featured video.

For More Information: Aaron Aupperlee | 412-268-9068 | aaupperlee@cmu.edu