Conference on Mathematics of Machine Learning

Program

All talks will be held in the Plenarsaal,
Center for Interdisciplinary Research (ZiF), Bielefeld University (Methoden 1, Bielefeld), and broadcast via Zoom.

 
Wednesday, August 04, 2021
10:00 - 10:30  Welcome
10:30 - 11:00  Coffee Break
11:00 - 12:00  Lenka Zdeborová (EPFL - École Polytechnique Fédérale de Lausanne, Switzerland)
Understanding machine learning via exactly solvable statistical physics models
12:00 - 12:30  Youness Boutaib (RWTH Aachen University, Germany)
Path classification by stochastic linear RNNs
12:30 - 13:00  Yuguang Wang (Max Planck Institute for Mathematics in the Sciences (Leipzig), Germany)
How framelets enhance graph neural networks
13:00 - 14:00  Lunch Break
14:00 - 14:30  Luca Ratti (University of Genoa (Genova), Italy)
Learning the optimal regularizer for linear inverse problems
14:30 - 15:00  Florent Krzakala (EPFL (Lausanne), Switzerland)
Generalization & Overparametrization in Machine Learning: Rigorous Insights from Simple Models
15:00 - 15:30  Coffee Break
15:30 - 16:00  Oxana Manita (Eindhoven University of Technology, Netherlands)
Dropout regularization viewed from the large deviations perspective
16:00 - 16:30  Jochen Merker (HTWK Leipzig, Germany)
Complexity-reduced data models beyond the classical bias-variance trade-off
16:30 - 17:00  Coffee Break
17:00 - 18:00  Peter Bartlett (University of California, Berkeley, USA)
Benign overfitting
18:00 - 19:00  Stefano Soatto (University of California, Los Angeles, USA)
The Information in Optimal Representations
19:00  Dinner @ ZiF
 
Thursday, August 05, 2021
09:30 - 10:30  Ingo Steinwart (University of Stuttgart, Germany)
to be announced
10:30 - 11:00  Coffee Break
11:00 - 12:00  Dmitry Yarotsky (Skoltech, Russia)
Universal scaling laws in the gradient descent training of neural networks
12:00 - 12:30  Andreas Habring (University of Graz, Austria)
A Generative Variational Model for Inverse Problems in Imaging
12:30 - 14:00  Lunch Break
14:00 - 15:00  Poster Session
15:00 - 15:30  Coffee Break
15:30 - 16:30  Stefanie Jegelka (Massachusetts Institute of Technology, USA)
Learning and Generalization in Graph Neural Networks
16:30 - 17:00  Coffee Break
17:00 - 18:00  Pierre Baldi (University of California, Irvine, USA)
Neural Capacity and Attention
18:00 - 19:00  Yasaman Bahri (Google Brain, USA)
Dynamics & Phase Transitions in Wide, Deep Neural Networks
 
Friday, August 06, 2021
09:30 - 10:30  Matthias Hein (University of Tübingen, Germany)
Towards neural networks which know when they don't know
10:30 - 11:00  Coffee Break
11:00 - 12:00  Kathlén Kohn (KTH - Royal Institute of Technology, Sweden)
The Geometry of Linear Convolutional Networks
12:00 - 12:30  Luigi Malagò (Transylvanian Institute of Neuroscience (TINS) (Cluj-Napoca), Romania)
Lagrangian and Hamiltonian Mechanics for Probabilities on the Statistical Bundle
12:30 - 13:00  Kirandeep Kour (Max Planck Institute for Dynamics of Complex Technical Systems (Magdeburg), Germany)
A Low-rank Support Tensor Network
13:00 - 14:00  Lunch Break
14:00 - 14:30  Matthias Löffler (ETH Zürich (Zurich), Switzerland)
AdaBoost and robust one-bit compressed sensing
14:30 - 15:00  Benjamin Fehrman (University of Oxford, United Kingdom)
Convergence rates for stochastic gradient descent algorithms in non-convex loss landscapes
15:00 - 15:30  Coffee Break
15:30 - 16:30  Terry Lyons (University of Oxford, United Kingdom)
From rough paths to streamed data
16:30 - 17:00  Coffee Break
17:00 - 17:30  Michael Murray (University of Oxford, United Kingdom)
Activation Function Design for Deep Networks: Linearity and Effective Initialisation
17:30 - 18:00  Daniel McKenzie (University of California, Los Angeles, USA)
Learning to predict equilibria from data using fixed point networks
18:00 - 19:00  Jeffrey Pennington (Google Brain, USA)
Demystifying deep learning through high-dimensional statistics
19:00  Conference Dinner
 
Saturday, August 07, 2021
09:30 - 10:30  Arnulf Jentzen (Universität Münster, Germany)
Convergence analysis for gradient flows in the training of artificial neural networks with ReLU activation
10:30 - 11:00  Coffee Break
11:00 - 12:00  Minh Hà Quang (RIKEN-AIP, Japan)
Regularized information geometric and optimal transport distances between covariance operators and Gaussian processes
12:00 - 12:30  Hanyuan Hang (University of Twente (Enschede), Netherlands)
A Combination of Ensemble Methods for Large-Scale Regression
12:30 - 13:00  Soon Hoe Lim (Nordita, KTH Royal Institute of Technology and Stockholm University, Sweden)
Noisy Recurrent Neural Networks
13:00 - 14:00  Lunch Break
14:00 - 14:30  Guido Montúfar (Max Planck Institute for Mathematics in the Sciences (Leipzig), Germany, and UCLA, USA)
Implicit bias of gradient descent for mean squared error regression with wide neural networks
14:30 - 15:00  Michael Schmischke (Chemnitz University of Technology, Germany)
High-Dimensional Explainable ANOVA Approximation
15:00 - 15:30  Coffee Break
15:30 - 16:00  Sebastian Kassing (WWU Münster, Germany)
Convergence of Stochastic Gradient Descent for Analytic Target Functions
16:00 - 16:30  Burim Ramosaj (TU Dortmund, Germany)
Interpretable Machines - Constructing valid Prediction Intervals with Random Forest
16:30 - 17:00  Coffee Break
17:00 - 18:00  Jason Lee (Princeton University, USA)
Representation Learning
18:00 - 19:00  Věra Kůrková (Czech Academy of Sciences, Czech Republic)
Some implications of high-dimensional geometry for neurocomputing

Date and Location

August 04 - 07, 2021 (previously planned for February 22 - 25, 2021)
Center for Interdisciplinary Research (ZiF), Bielefeld University
Methoden 1
33615 Bielefeld

Scientific Organizers

Benjamin Gess, MPI for Mathematics in the Sciences & Universität Bielefeld

Guido Montúfar, MPI for Mathematics in the Sciences & UCLA

Nihat Ay, Hamburg University of Technology, Institute of Data Science Foundations
