Neural network simulation at Warp speed: How we got 17 million connections per second
Conference Paper, Proceedings of IEEE International Joint Conference on Neural Networks, Vol. 2, pp. 143-150, July 1988
Abstract
A fast back-propagation algorithm for a linear array of processors is described. Results of an implementation of this algorithm on Warp, a ten-processor programmable systolic array computer, are reviewed and compared with back-propagation implementations on other machines. The current Warp simulator is about eight times faster at simulating the NETtalk text-to-speech network than the fastest back-propagation simulator previously reported in the literature. The Warp simulator is being used routinely in a road-recognition experiment for robot navigation. These results indicate that linear systolic array machines can be efficient neural network simulators. Planned extensions and improvements to the current algorithm are discussed.
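For intuition only, here is a minimal sketch of one way back-propagation for a fully connected layer can be partitioned across the cells of a linear processor array: each cell holds a column slice of the weight matrix, computes the net input for its block of output units, and applies its weight updates locally. This is not the paper's Warp implementation; the layer sizes, names, and partitioning scheme are illustrative assumptions.

```python
# Hypothetical sketch of column-wise partitioning of back-propagation across
# P cells of a linear processor array. NOT the paper's Warp code; all sizes
# and the partitioning scheme are assumptions for illustration.
import numpy as np

P = 10                       # number of cells (Warp had ten)
n_in, n_out = 203, 26        # NETtalk-like layer dimensions (illustrative)
rng = np.random.default_rng(0)

W = rng.normal(scale=0.1, size=(n_in, n_out))
W_slices = np.array_split(W, P, axis=1)        # one column block per cell

def forward(x):
    # Each cell computes the net input for its own block of output units;
    # the blocks are collected as they stream off the end of the array.
    return np.concatenate([x @ Wp for Wp in W_slices])

def backward(x, delta, lr=0.1):
    # delta: error signal at the output units, split the same way as W.
    delta_blocks = np.array_split(delta, P)
    for Wp, dp in zip(W_slices, delta_blocks):
        Wp += lr * np.outer(x, dp)             # local weight update on each cell
    # Error propagated back to the inputs: each cell contributes a partial
    # sum that would be accumulated as it passes through the array.
    return sum(Wp @ dp for Wp, dp in zip(W_slices, delta_blocks))

x = rng.normal(size=n_in)
y = 1.0 / (1.0 + np.exp(-forward(x)))          # sigmoid output activations
```

Because every weight is both stored and updated on a single cell, only activations and error signals travel through the array, which is the property that makes linear systolic machines attractive for this workload.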
BibTeX
@conference{Pomerleau-1988-15414,
  author    = {Dean Pomerleau and G. L. Gusciora and David S. Touretzky and H. T. Kung},
  title     = {Neural network simulation at Warp speed: How we got 17 million connections per second},
  booktitle = {Proceedings of IEEE International Joint Conference on Neural Networks},
  year      = {1988},
  month     = {July},
  volume    = {2},
  pages     = {143--150},
}
Copyright notice: This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. These works may not be reposted without the explicit permission of the copyright holder.