Stochastic Neural Networks with Monotonic Activation Functions
Conference Paper, Proceedings of the 19th International Conference on Artificial Intelligence and Statistics (AISTATS '16), Vol. 51, pp. 809-818, May 2016
Abstract
We propose a Laplace approximation that creates a stochastic unit from any smooth monotonic activation function, using only Gaussian noise. This paper investigates the application of this stochastic approximation to training a family of Restricted Boltzmann Machines (RBMs) that are closely linked to Bregman divergences. This family, which we call exponential family RBMs (Exp-RBMs), is a subset of the exponential family Harmoniums that expresses family members through a choice of smooth monotonic non-linearity for each neuron. Using contrastive divergence along with our Gaussian approximation, we show that Exp-RBMs can learn useful representations using novel stochastic units.
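As a rough illustration of the idea in the abstract, the sketch below builds a Gaussian stochastic unit from a smooth monotonic activation. The specific parameterization used here (mean equal to the activation output, variance taken from its derivative) is an assumption for illustration; the paper derives the exact form of the Laplace approximation. The function names `stochastic_unit`, `softplus`, and `sigmoid` are hypothetical helpers, not the authors' code.

```python
import numpy as np

def stochastic_unit(activation, activation_grad, eta, rng):
    """Sample from a Gaussian stochastic unit built from a smooth
    monotonic activation f.

    Assumed form (an illustration, not the paper's exact derivation):
        y ~ N(f(eta), f'(eta))
    Monotonicity of f guarantees f'(eta) >= 0, so it is a valid variance.
    """
    mean = activation(eta)
    var = activation_grad(eta)
    return mean + np.sqrt(var) * rng.standard_normal(np.shape(eta))

# Example: a softplus unit; the derivative of softplus is the logistic sigmoid.
def softplus(x):
    return np.log1p(np.exp(x))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
eta = np.array([-2.0, 0.0, 2.0])
samples = stochastic_unit(softplus, sigmoid, eta, rng)
```

Averaging many such samples at a fixed `eta` recovers the deterministic activation `softplus(eta)`, so the unit behaves like its underlying non-linearity in expectation while injecting only Gaussian noise.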
BibTeX
@conference{Ravanbakhsh-2016-119753,
author = {S. Ravanbakhsh and B. Poczos and J. Schneider and D. Schuurmans and R. Greiner},
title = {Stochastic Neural Networks with Monotonic Activation Functions},
booktitle = {Proceedings of 19th International Conference on Artificial Intelligence and Statistics (AISTATS '16)},
year = {2016},
month = {May},
volume = {51},
pages = {809--818},
}
Copyright notice: This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. These works may not be reposted without the explicit permission of the copyright holder.