Sensory Attention: Computational Sensor Paradigm for Low-Latency Adaptive Vision

Workshop Paper, DARPA Image Understanding Workshop (IUW '97), pp. 177-183, May 1997

Abstract

The need for robust, self-contained, low-latency vision systems is growing in applications such as high-speed visual servoing and vision-based human-computer interfaces. Conventional vision systems can hardly meet this need because 1) latency is incurred in data-transfer and computational bottlenecks, and 2) there is no top-down feedback to adapt sensor performance for improved robustness. In this paper we present a tracking computational sensor - a VLSI implementation of sensory attention. The tracking sensor focuses attention on a salient feature in its receptive field and maintains this attention in world coordinates. Using both low-latency massively parallel processing and top-down sensory adaptation, the sensor reliably tracks features of interest while suppressing other irrelevant features that may interfere with the task at hand.
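To make the attention mechanism concrete, the sketch below illustrates the idea in software: a winner-take-all selection of the most salient (here, brightest) pixel, restricted by a top-down attention window centered on the previously attended location. This is only a minimal illustrative analogue of the behavior described in the abstract, not the VLSI circuit presented in the paper; the function and parameter names are hypothetical.

```python
import numpy as np

def track_salient_feature(frames, init_pos, window=15):
    """Software analogue of attention-based tracking.

    frames   : iterable of 2D numpy arrays (saliency or intensity maps)
    init_pos : (row, col) of the initially attended feature
    window   : side length of the attention window (top-down feedback)

    Each step performs a winner-take-all selection, but only inside the
    attention window around the previous winner, so irrelevant salient
    features elsewhere in the receptive field are suppressed.
    """
    half = window // 2
    pos = np.array(init_pos, dtype=int)
    trajectory = [tuple(pos)]
    for frame in frames:
        r0 = max(0, pos[0] - half)
        r1 = min(frame.shape[0], pos[0] + half + 1)
        c0 = max(0, pos[1] - half)
        c1 = min(frame.shape[1], pos[1] + half + 1)
        roi = frame[r0:r1, c0:c1]
        # Winner-take-all restricted to the attention window.
        dr, dc = np.unravel_index(np.argmax(roi), roi.shape)
        pos = np.array([r0 + dr, c0 + dc])
        trajectory.append(tuple(pos))
    return trajectory

# Example: a bright spot drifting across synthetic frames.
frames = []
for t in range(10):
    img = np.zeros((64, 64))
    img[20 + t, 30 + t] = 1.0   # feature of interest
    img[5, 5] = 1.0             # distractor outside the attention window
    frames.append(img)

print(track_salient_feature(frames, init_pos=(20, 30)))
```

In the sensor itself this selection and suppression happen in parallel analog hardware at the focal plane, which is what removes the data-transfer and computation latency of a conventional camera-plus-processor pipeline.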

BibTeX

@workshop{Brajovic-1997-14371,
author = {Vladimir Brajovic and Takeo Kanade},
title = {Sensory Attention: Computational Sensor Paradigm for Low-Latency Adaptive Vision},
booktitle = {Proceedings of DARPA Image Understanding Workshop (IUW '97)},
year = {1997},
month = {May},
pages = {177--183},
}