Putting Image Manipulations in Context: Robustness Testing for Safe Perception
Abstract
We introduce a method to evaluate the robustness of perception systems to the wide variety of conditions a deployed system will encounter. Using person detection as a sample safety-critical application, we evaluate the robustness of several state-of-the-art perception systems to a variety of common image perturbations and degradations. We introduce two novel image perturbations, which we call "contextual mutators", that use contextual information (in the form of stereo image data) to perform more physically realistic simulation of haze and defocus effects. For both standard and contextual mutations, we show cases where performance drops catastrophically in response to barely perceptible changes. We also show how robustness to contextual mutators can be predicted without the associated contextual information in some cases.
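To illustrate how a contextual mutator can exploit stereo-derived depth, the sketch below applies depth-dependent haze using the standard atmospheric scattering model, I' = I·t + A·(1 − t) with transmission t = exp(−β·d). This is a minimal illustration of the general idea, not the paper's exact procedure; the function name, parameter values, and the assumption of a constant airlight are all illustrative choices.

import numpy as np

def haze_mutator(image, depth, beta=0.05, airlight=0.9):
    """Apply depth-dependent haze to an image (illustrative sketch).

    image:    float array in [0, 1], shape (H, W, 3)
    depth:    per-pixel distance in meters (e.g. from stereo), shape (H, W)
    beta:     scattering coefficient; larger values give denser haze
    airlight: atmospheric light intensity, assumed constant over the image
    """
    # Transmission falls off exponentially with distance.
    t = np.exp(-beta * depth)[..., np.newaxis]
    # Blend the scene radiance with the airlight according to transmission.
    return image * t + airlight * (1.0 - t)

Because the haze density follows scene depth rather than being applied uniformly, distant pedestrians are obscured more than nearby ones, which is the kind of physically grounded degradation the contextual mutators are intended to capture.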
BibTeX
@conference{Pezzementi-2018-122258,
  author    = {Zachary Pezzementi and Trenton Tabor and Samuel Yim and Jonathan K. Chang and Bill Drozd and David Guttendorf and Michael Wagner and Philip Koopman},
  title     = {Putting Image Manipulations in Context: Robustness Testing for Safe Perception},
  booktitle = {Proceedings of IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR '18)},
  year      = {2018},
  month     = {August},
}