Geometric Aspects of Graphical Models and Neural Networks

  • Nihat Ay
  • Guido Montúfar
Location
A3 02 (Seminar room)

Abstract

The first part of this course introduces the basic theory of finite random fields and the corresponding geometric structures studied in information geometry. Topics include sampling algorithms and their convergence properties. Graphical models, an important class of random fields, will be discussed in more detail. The second part of the course deals with neural networks. We discuss Boltzmann machines, a kind of stochastic neural network, and elaborate on related architectures that have become prominent in machine learning applications, including restricted Boltzmann machines, deep belief networks, and deep Boltzmann machines. We concentrate on geometric aspects of the different networks, addressing, in particular, the geometry of their parametrization and their expressive power.
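As a flavor of the sampling algorithms and network models mentioned above, here is a minimal sketch of block Gibbs sampling in a toy restricted Boltzmann machine. All sizes and parameter values are illustrative assumptions, not material from the course:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy RBM: 4 binary visible units, 3 binary hidden units.
n_visible, n_hidden = 4, 3
W = rng.normal(scale=0.1, size=(n_visible, n_hidden))  # interaction weights
b = np.zeros(n_visible)  # visible biases
c = np.zeros(n_hidden)   # hidden biases

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gibbs_step(v):
    """One block-Gibbs sweep: sample hidden given visible, then visible
    given hidden. The bipartite graph of the RBM makes each conditional
    distribution factorize over units, so a whole layer is sampled at once."""
    p_h = sigmoid(c + v @ W)                      # P(h_j = 1 | v)
    h = (rng.random(n_hidden) < p_h).astype(float)
    p_v = sigmoid(b + W @ h)                      # P(v_i = 1 | h)
    return (rng.random(n_visible) < p_v).astype(float)

# Run a short chain; under mild conditions the chain converges to the
# model distribution, which is the convergence question studied in the course.
v = rng.integers(0, 2, size=n_visible).astype(float)
for _ in range(100):
    v = gibbs_step(v)
print(v)
```

The factorized conditionals are exactly what distinguishes restricted Boltzmann machines from general Boltzmann machines, whose within-layer connections force unit-by-unit Gibbs updates.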

Date and time info
Wednesday, 10:15 - 11:45

Keywords
Information theory, Boltzmann machines, geometric aspects of networks

Audience
MSc students, PhD students, Postdocs

Language
English

Lecture
01.10.2014 - 31.01.2015

Regular lectures Winter semester 2014-2015

MPI for Mathematics in the Sciences / University of Leipzig (see the lecture detail pages)

Contact
Katharina Matschke, MPI for Mathematics in the Sciences (via email)