MiS Preprint Repository

Delve into the future of research at MiS with our preprint repository. Our scientists are making groundbreaking discoveries and sharing their latest findings before they are published. Explore the repository to stay up to date on the newest developments and breakthroughs.

MiS Preprint
59/2020

On the Locality of the Natural Gradient for Deep Learning

Nihat Ay

Abstract

We study the natural gradient method for learning in deep Bayesian networks, including neural networks. There are two natural geometries associated with such learning systems, which consist of visible and hidden units. One geometry is related to the full system, the other to the visible sub-system. These two geometries imply different natural gradients. As a first step, we demonstrate a great simplification of the natural gradient with respect to the first geometry, due to locality properties of the Fisher information matrix. This simplification does not directly translate to a corresponding simplification with respect to the second geometry. We develop the theory for studying the relation between the two versions of the natural gradient and outline a method for simplifying the natural gradient with respect to the second geometry based on the first one. This method suggests incorporating a recognition model as an auxiliary model for the efficient application of the natural gradient method in deep networks.
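The natural gradient preconditions the ordinary (Euclidean) gradient with the inverse of the Fisher information matrix of the model's parametrized distribution. As a minimal, hedged illustration of the update rule itself (not the paper's deep Bayesian network construction), the sketch below uses a univariate Gaussian in the coordinates (mu, log sigma), where the Fisher matrix has the well-known closed form diag(1/sigma^2, 2); all function names are illustrative.

```python
import numpy as np

def fisher_gaussian(mu, log_sigma):
    """Fisher information of N(mu, sigma^2) in (mu, log sigma) coordinates.

    Standard closed form: F = diag(1 / sigma^2, 2).
    """
    sigma2 = np.exp(2.0 * log_sigma)
    return np.diag([1.0 / sigma2, 2.0])

def natural_gradient_step(params, euclidean_grad, lr=0.1):
    """One natural-gradient update: theta <- theta - lr * F^{-1} grad."""
    mu, log_sigma = params
    F = fisher_gaussian(mu, log_sigma)
    # Solve F x = grad instead of forming F^{-1} explicitly.
    nat_grad = np.linalg.solve(F, euclidean_grad)
    return params - lr * nat_grad

# Usage: at sigma = 1 (log_sigma = 0), F = diag(1, 2), so the log_sigma
# component of the gradient is halved relative to plain gradient descent.
params = np.array([0.0, 0.0])
grad = np.array([1.0, 1.0])
new_params = natural_gradient_step(params, grad, lr=0.1)
```

The point of the paper is that for deep networks F is large and structured; the locality results make the analogous solve tractable for the first geometry, while the second geometry requires the auxiliary recognition model mentioned above.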

Received:
May 21, 2020
Published:
May 22, 2020
Keywords:
natural gradient, Fisher-Rao metric, deep learning, Helmholtz machines, wake-sleep algorithm

Related publications

In journal
2023 · Journal · Open Access
Nihat Ay

On the locality of the natural gradient for learning in deep Bayesian networks

In: Information Geometry, 6 (2023) 1, pp. 1-49