Introspective Perception through Identifying Blur, Light Direction, and Angle-of-View
Abstract
Robotic perception has achieved strong performance, especially in autonomous vehicles and assistive robots. However, we still often do not understand how and when perception fails. Researchers have had some success building introspective perception systems that detect when a perception task will fail, but these systems are usually tuned to specific, connected perception tasks and do not identify the reasons for failure, such as blur, changes in light direction, or changes in angle-of-view. To address this shortcoming, we work on the use case of a perception task that determines non-destructive block removal from a scene. We design a combined introspective perception system that detects three common failure types: blur, light direction, and angle-of-view. We split these failure cases into two target areas of research: Blur Detection & Classification, and Light Direction & Angle-of-View Failure Prediction.
One of the main failure cases for perception tasks is blur in the camera image. Blur can arise from problems with motion, lighting, or focus. Properly classifying the blur type is important for determining corrective robot action (e.g., slow down, turn on a light), but most blur classification techniques do not classify lighting-related blur properly. Here, we present a new approach that adds two new types of blur to blur detection and classification, and we evaluate its performance on an extension of a standard image dataset with an eye toward informing subsequent robot action.
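To make the blur-detection step concrete, the sketch below shows one common baseline: scoring sharpness by the variance of the Laplacian response, where blurred images produce a low score. This is a minimal numpy-only illustration of the general idea, not the classification method developed in this thesis; the function names and the threshold value are assumptions for the example.

```python
import numpy as np

def laplacian_variance(image: np.ndarray) -> float:
    """Blur score: variance of the 4-connected Laplacian response.

    Sharp images have strong edges, so the Laplacian response varies
    widely; blurred images yield a low variance. (Illustrative
    baseline, not the thesis's classifier.)
    """
    kernel = np.array([[0,  1, 0],
                       [1, -4, 1],
                       [0,  1, 0]], dtype=float)
    h, w = image.shape
    resp = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            resp[i, j] = np.sum(image[i:i + 3, j:j + 3] * kernel)
    return float(resp.var())

def is_blurred(image: np.ndarray, threshold: float = 100.0) -> bool:
    # The threshold is dataset-dependent (assumed value here);
    # in practice it is tuned on held-out sharp/blurred pairs.
    return laplacian_variance(image) < threshold
```

For example, a flat (featureless) image scores 0 and would be flagged as blurred, while a high-contrast checkerboard scores far above the threshold. A score alone cannot distinguish motion blur from defocus or lighting-related blur, which is why classifying the blur *type* matters for choosing a corrective action.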
Another key challenge in robot perception is changes in light direction and angle-of-view. To this end, we identify whether certain angles-of-view or light directions are more likely to cause failures than others. Here, we present a failure prediction network, trained on a new dataset, that demonstrates how perception task performance can be predicted and explained through light direction and angle-of-view.
We combine these subsystems into a single introspective perception system that allows us to identify failures in the perception task. We show that an introspective perception system that can identify at least these three failure types provides risk-mitigation benefits for a perception task.
BibTeX
@mastersthesis{Hatfalvi-2022-132821,
author = {Mary Theresa Hatfalvi},
title = {Introspective Perception through Identifying Blur, Light Direction, and Angle-of-View},
year = {2022},
month = {August},
school = {Carnegie Mellon University},
address = {Pittsburgh, PA},
number = {CMU-RI-TR-22-33},
keywords = {Computer Vision, Introspective Perception, Blur Detection, Blur Classification, Self-Assessment},
}