Stochastic Neural Networks with Monotonic Activation Functions - Robotics Institute Carnegie Mellon University

Stochastic Neural Networks with Monotonic Activation Functions

S. Ravanbakhsh, B. Poczos, J. Schneider, D. Schuurmans, and R. Greiner
Conference Paper, Proceedings of the 19th International Conference on Artificial Intelligence and Statistics (AISTATS '16), Vol. 51, pp. 809-818, May 2016

Abstract

We propose a Laplace approximation that creates a stochastic unit from any smooth monotonic activation function, using only Gaussian noise. This paper investigates the application of this stochastic approximation in training a family of Restricted Boltzmann Machines (RBM) that are closely linked to Bregman divergences. This family, which we call the exponential family RBM (Exp-RBM), is a subset of the exponential family Harmoniums that expresses family members through a choice of smooth monotonic non-linearity for each neuron. Using contrastive divergence along with our Gaussian approximation, we show that Exp-RBM can learn useful representations using novel stochastic units.
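The abstract's core idea, a stochastic unit built from a monotonic activation using only Gaussian noise, can be sketched minimally: the unit's sample has mean equal to the activation value and variance equal to the activation's derivative, matching the mean/variance relationship of exponential family units. The softplus activation and the helper names below are illustrative assumptions, not the paper's code.

```python
import numpy as np

def stochastic_unit(eta, f, f_prime, rng):
    """Hedged sketch of a Gaussian (Laplace-style) stochastic unit:
    sample ~ N(f(eta), f'(eta)), where f is a smooth monotonic
    activation and f_prime its derivative. Names are illustrative."""
    mean = f(eta)           # activation gives the mean
    var = f_prime(eta)      # derivative gives the variance
    return mean + np.sqrt(var) * rng.standard_normal(np.shape(eta))

# Example choice: softplus activation, whose derivative is the sigmoid.
softplus = lambda x: np.log1p(np.exp(x))
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
eta = np.array([-2.0, 0.0, 2.0])
samples = stochastic_unit(eta, softplus, sigmoid, rng)
```

Averaged over many draws, the samples recover the deterministic activation `softplus(eta)`, which is what lets such units stand in for their deterministic counterparts during contrastive-divergence training.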

BibTeX

@conference{Ravanbakhsh-2016-119753,
author = {S. Ravanbakhsh and B. Poczos and J. Schneider and D. Schuurmans and R. Greiner},
title = {Stochastic Neural Networks with Monotonic Activation Functions},
booktitle = {Proceedings of 19th International Conference on Artificial Intelligence and Statistics (AISTATS '16)},
year = {2016},
month = {May},
volume = {51},
pages = {809 - 818},
}