Capturing good videos outdoors can be challenging due to harsh lighting, unpredictable scene changes, and, most relevant to this work, dynamic weather. Particulate weather, such as rain and snow, creates complex flickering effects that are irritating to people and confusing to vision algorithms. Although each raindrop or snowflake affects only a small number of pixels, collections of them have predictable global spatio-temporal properties. In this paper, we formulate a model of these global dynamic weather frequencies. To begin, we derive a physical model of raindrops and snowflakes that is used to determine the general shape and brightness of a single streak. This streak model is combined with the statistical properties of rain and snow to determine how they affect the spatio-temporal frequencies of an image sequence. Once detected, these frequencies can then be suppressed. At a small scale, many phenomena appear similar to rain and snow, but by treating them as global phenomena, we achieve better performance than with just a local analysis. We show the effectiveness of removal on a variety of complex video sequences.
Rain and Snow Removal via Spatio-Temporal Frequency Analysis
Project Head: Takeo Kanade
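
The following is a minimal Python sketch of the frequency-domain suppression idea described in the abstract, not the authors' implementation. The function names (streak_frequency_model, suppress_precipitation) and the Gaussian-shaped frequency weights are illustrative assumptions standing in for the paper's physically derived streak model and rain/snow statistics: the sketch simply attenuates spatio-temporal frequency components that an assumed streak model flags as precipitation-like.

import numpy as np

def streak_frequency_model(shape, sigma_fy=0.05, sigma_fx=0.25, sigma_ft=0.05):
    """Hypothetical weight in (fy, fx, ft) space for streak-like energy.

    Assumptions (placeholders, not the paper's derived model):
      * streaks are roughly vertical, so their spectrum is narrow in
        vertical frequency fy and broad in horizontal frequency fx;
      * streaks flicker from frame to frame, so only components with
        non-negligible temporal frequency ft are flagged.
    Returns weights in [0, 1]; 0 means "leave this component untouched".
    """
    ny, nx, nt = shape
    fy = np.fft.fftfreq(ny)[:, None, None]
    fx = np.fft.fftfreq(nx)[None, :, None]
    ft = np.fft.fftfreq(nt)[None, None, :]
    spatial = np.exp(-0.5 * ((fy / sigma_fy) ** 2 + (fx / sigma_fx) ** 2))
    temporal = 1.0 - np.exp(-0.5 * (ft / sigma_ft) ** 2)  # 0 at ft = 0, so static content is preserved
    return spatial * temporal

def suppress_precipitation(video, strength=0.9):
    """Attenuate the spatio-temporal frequencies flagged by the streak model.

    video: float array of shape (height, width, num_frames), grayscale.
    strength: fraction by which flagged components are reduced.
    """
    spectrum = np.fft.fftn(video)
    weight = streak_frequency_model(video.shape)
    return np.real(np.fft.ifftn(spectrum * (1.0 - strength * weight)))

# Toy usage: a static gradient scene corrupted by short, bright vertical streaks.
rng = np.random.default_rng(0)
scene = np.tile(np.linspace(0.2, 0.8, 64)[:, None], (1, 64))
video = np.repeat(scene[:, :, None], 16, axis=2)
for t in range(16):
    ys, xs = rng.integers(0, 48, 20), rng.integers(0, 64, 20)
    for y, x in zip(ys, xs):
        video[y:y + 8, x, t] += 0.5       # one synthetic streak per (y, x)
clean = suppress_precipitation(video, strength=0.95)

Because the temporal factor vanishes at zero temporal frequency, the static background passes through the filter unchanged; only components that both flicker in time and match the assumed streak orientation are reduced.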