Preprint 16/2014

Expressive Power of Conditional Restricted Boltzmann Machines

Guido Montúfar, Nihat Ay, and Keyan Ghazi-Zahedi

Submission date: 14 February 2014 (revised version: July 2014)
Pages: 26
Published in: Journal of Machine Learning Research, 16 (2015), pp. 2405-2436, under the different title: Geometry and Expressive Power of Conditional Restricted Boltzmann Machines
MSC-Numbers: 68T05, 60K99, 97R40
PACS-Numbers: 02.50.Cw, 07.05.Mh
Keywords and phrases: conditional restricted Boltzmann machine, universal approximation, Kullback-Leibler approximation error, expected dimension
Download full preprint: PDF (428 kB)

Abstract:
Conditional restricted Boltzmann machines are undirected stochastic neural networks with a layer of input and output units connected bipartitely to a layer of hidden units. These networks define models of conditional probability distributions on the states of the output units given the states of the input units, parametrized by interaction weights and biases. We address the representational power of these models, proving results on the minimal size of universal approximators of conditional probability distributions, on the minimal size of universal approximators of deterministic functions, on the maximal model approximation errors, and on the dimension of the set of representable conditional distributions. We contribute new tools for investigating conditional models and obtain significant improvements over the results that can be derived directly from existing work on restricted Boltzmann machine probability models.
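For orientation, the conditional model described above is commonly parametrized as follows (a sketch of the standard form; the symbols W, V, b, c below are illustrative and not quoted from the preprint). With binary input units x, output units y, and hidden units h, a conditional restricted Boltzmann machine defines

p(y \mid x) \;=\; \frac{1}{Z(x)} \sum_{h} \exp\!\big( x^\top W h + y^\top V h + b^\top y + c^\top h \big),
\qquad
Z(x) \;=\; \sum_{y',\, h} \exp\!\big( x^\top W h + {y'}^\top V h + b^\top y' + c^\top h \big),

where W and V are the input-hidden and output-hidden interaction weight matrices, and b and c are the biases of the output and hidden units; terms depending only on x cancel in the conditional and are omitted.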
