
Artificial Neural Networks and Machine Learning: Theoretical Foundations II

Abstract

This course continues the one I taught during the winter term 2018/2019. My general aim is to review core mathematical concepts and results that play an important role in the fields of neural networks and machine learning. In the first part, I presented various models and architectures of neural networks, together with their corresponding universal approximation properties. I will continue by addressing aspects of learning and generalisation within the framework of statistical learning theory. The generality of this theory will be exemplified in the context of neural networks and support vector machines. Further theoretical approaches to learning, in particular gradient-based methods, will also be reviewed, with emphasis on the information-geometric perspective of the natural gradient method.
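As a small illustration of the last point, the natural gradient preconditions the ordinary gradient by the inverse Fisher information matrix, which makes the update invariant under reparametrisation. The sketch below uses generic placeholder notation (loss L, step size \eta, parametrised model p_\theta), not notation taken from the course:

  % Natural gradient descent step on a statistical model p_\theta
  \[
    \theta_{t+1} \;=\; \theta_t \;-\; \eta\, F(\theta_t)^{-1}\, \nabla_\theta L(\theta_t),
    \qquad
    F(\theta) \;=\; \mathbb{E}_{x \sim p_\theta}\!\left[
      \nabla_\theta \log p_\theta(x)\, \nabla_\theta \log p_\theta(x)^{\top}
    \right].
  \]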

Date and time info
Thursday 11:15 - 12:45

Keywords
Neural networks, universal approximation, statistical learning theory, support vector machines, deep learning, information geometry

Audience
MSc students, PhD students, Postdocs

Language
English

Type
Lecture

Dates
01.04.2019 - 31.07.2019

Series
Regular lectures, Summer semester 2019

Location
MPI for Mathematics in the Sciences / University of Leipzig (see the lecture detail pages)

Contact
Katharina Matschke, MPI for Mathematics in the Sciences (contact via mail)