
Mathematical Machine Learning
Head:
Guido Montúfar (Email)
Phone:
+49 (0) 341 - 9959 - 880
Fax:
+49 (0) 341 - 9959 - 658
Address:
Inselstr. 22
04103 Leipzig

Math Machine Learning seminar MPI MiS + UCLA
The Math Machine Learning seminar MPI MiS + UCLA is an online seminar series (via Zoom) organized by Guido Montúfar's research group at the Max Planck Institute for Mathematics in the Sciences and UCLA. The seminar usually takes place on Thursdays (occasionally on Fridays) at 5 pm CEST (GMT+2; 8-9 am PDT). Talks are about 50 minutes long, with additional time for questions and discussion.
If you would like to participate in the video broadcast, please register using our registration form (linked in the abstract of each talk). One day before each seminar, an announcement with the Zoom link is mailed to the Math ML seminar e-mail list and to the registered participants. Some talks are recorded, at the discretion of the speaker. Enthusiasts of mathematical machine learning are very welcome to attend the sessions!
You can find titles, abstracts and recordings of upcoming and previous seminar sessions below.
Upcoming Seminars
28.01.2021, 17:00:
- Suriya Gunasekar (Microsoft Research, Redmond)
- Rethinking the role of optimization in learning
- Video broadcast, registration required; check abstract for details
04.02.2021, 17:00:
- Tailin Wu (Stanford University)
- to be announced
- Video broadcast, registration required; check abstract for details
11.02.2021, 17:00:
- Samuel L. Smith (DeepMind)
- to be announced
- Video broadcast, registration required; check abstract for details
18.02.2021, 17:00:
- Umut Simsekli (INRIA - École Normale Supérieure (Paris))
- to be announced
- Video broadcast, registration required; check abstract for details
25.02.2021, 17:00:
- Zhiyuan Li (Princeton University)
- to be announced
- Video broadcast, registration required; check abstract for details
04.03.2021, 17:00:
- Omar Rivasplata (DeepMind and University College London)
- to be announced
- Video broadcast, registration required
Previous Seminars
02.04.2020, 11:00:
- Yu Guang Wang (University of New South Wales, Sydney)
- Haar Graph Pooling
- see PDF Abstract
- see video
09.04.2020, 17:00:
- Michael Arbel (University College London)
- Kernelized Wasserstein Natural Gradient
- see video
16.04.2020, 17:00:
- Anton Mallasto (University of Copenhagen)
- Estimation of Optimal Transport in Generative Models
- see video
23.04.2020, 17:00:
- Johannes Mueller (MPI MiS, Leipzig)
- Deep Ritz Revisited
- see video
30.04.2020, 17:00:
- Quynh Nguyen (Saarland University, Saarbrücken)
- Loss surface of deep and wide neural networks
- see video
07.05.2020, 17:00:
- Kathlén Kohn (KTH Royal Institute of Technology)
- The geometry of neural networks
- see video
14.05.2020, 17:00:
- Benjamin Fehrman (University of Oxford)
- Convergence rates for the stochastic gradient descent method for non-convex objective functions
- see video
21.05.2020, 17:00:
- Dennis Elbrächter (Universität Wien)
- How degenerate is the parametrization of (ReLU) neural networks?
- see video
28.05.2020, 17:00:
- Jonathan Frankle (Massachusetts Institute of Technology)
- The Lottery Ticket Hypothesis: On Sparse, Trainable Neural Networks
04.06.2020, 17:00:
- Ulrich Terstiege (Rheinisch-Westfälische Technische Hochschule Aachen)
- Learning deep linear neural networks: Riemannian gradient flows and convergence to global minimizers
11.06.2020, 17:00:
- Poorya Mianjy (Johns Hopkins University)
- Understanding the Algorithmic Regularization due to Dropout
- see video
18.06.2020, 17:00:
- Alessandro Achille (University of California, Los Angeles)
- Structure of Learning Tasks and the Information in the Weights of a Deep Network
- see video
25.06.2020, 17:00:
- Adam Gaier (INRIA)
- Weight Agnostic Neural Networks
02.07.2020, 17:00:
- Wenda Zhou (Columbia University)
- New perspectives on cross-validation
- see video
03.07.2020, 17:00:
- Kai Fong Ernest Chong (Singapore University of Technology and Design)
- The approximation capabilities of neural networks
- see video
10.07.2020, 17:00:
- Nasim Rahaman (MPI-IS Tübingen, and Mila, Montréal)
- On the Spectral Bias of Neural Networks
- see video
23.07.2020, 17:00:
- Léonard Blier (Facebook AI Research, Université Paris Saclay, Inria)
- The Description Length of Deep Learning Models
30.07.2020, 17:00:
- Robert Peharz (TU Eindhoven)
- Minimal Random Code Learning: Getting Bits Back from Compressed Model Parameters
- see video
31.07.2020, 17:00:
- Greg Ongie (University of Chicago)
- A function space view of overparameterized neural networks
- see video
06.08.2020, 17:00:
- Simon S. Du (University of Washington)
- Ultra-wide Neural Network and Neural Tangent Kernel
- see video
13.08.2020, 17:00:
- Lénaïc Chizat (CNRS - Laboratoire de Mathématiques d'Orsay)
- Analysis of Gradient Descent on Wide Two-Layer ReLU Neural Networks
- see video
14.08.2020, 17:00:
- Ido Nachum (École Polytechnique Fédérale de Lausanne)
- Regularization by Misclassification in ReLU Neural Networks
- see video
20.08.2020, 17:00:
- Arthur Jacot (École Polytechnique Fédérale de Lausanne)
- Neural Tangent Kernel: Convergence and Generalization of DNNs
- see video
21.08.2020, 17:00:
- Preetum Nakkiran (Harvard University)
- Distributional Generalization: A New Kind of Generalization
- see video
27.08.2020, 17:00:
- Felix Draxler (Heidelberg University)
- Characterizing The Role of A Single Coupling Layer in Affine Normalizing Flows
- see video
03.09.2020, 17:00:
- Niladri S. Chatterji (UC Berkeley)
- Upper and lower bounds for gradient based sampling methods
- see video
04.09.2020, 17:00:
- Nadav Cohen (Tel Aviv University)
- Analyzing Optimization and Generalization in Deep Learning via Dynamics of Gradient Descent
- see video
17.09.2020, 17:00:
- Mahito Sugiyama (National Institute of Informatics, JST, PRESTO)
- Learning with Dually Flat Structure and Incidence Algebra
18.09.2020, 17:00:
- Mert Pilanci (Stanford University)
- Exact Polynomial-time Convex Formulations for Training Multilayer Neural Networks: The Hidden Convex Optimization Landscape
24.09.2020, 17:00:
- Liwen Zhang (University of Chicago)
- Tropical Geometry of Deep Neural Networks
- see video
25.09.2020, 17:00:
- Randall Balestriero (Rice University)
- Max-Affine Spline Insights into Deep Networks
- see video
01.10.2020, 17:00:
- David Rolnick (McGill University & Mila)
- Expressivity and learnability: linear regions in deep ReLU networks
- see video
08.10.2020, 17:00:
- Yaoyu Zhang (Shanghai Jiao Tong University)
- Impact of initialization on generalization of deep neural networks
- see video
15.10.2020, 17:00:
- Guy Blanc (Stanford University)
- Provable Guarantees for Decision Tree Induction
- see video
22.10.2020, 17:00:
- Boris Hanin (Princeton University)
- Neural Networks at Finite Width and Large Depth
- see video
29.10.2020, 17:00:
- Yaim Cooper (Institute for Advanced Study, Princeton)
- The geometry of the loss function of deep neural networks
05.11.2020, 17:00:
- Kenji Kawaguchi (MIT)
- Deep learning: theoretical results on optimization and mixup
12.11.2020, 17:00:
- Yasaman Bahri (Google Brain)
- The Large Learning Rate Phase of Wide, Deep Neural Networks
- see video
20.11.2020, 17:00:
- Amartya Sanyal (University of Oxford)
- How benign is benign overfitting?
- see video
17.12.2020, 17:00:
- Tim G. J. Rudner (University of Oxford)
- Outcome-Driven Reinforcement Learning via Variational Inference
14.01.2021, 17:00:
- Mahdi Soltanolkotabi (University of Southern California)
- Learning via early stopping and untrained neural nets
- see video
21.01.2021, 17:00:
- Taiji Suzuki (The University of Tokyo, and Center for Advanced Intelligence Project, RIKEN, Tokyo)
- Statistical efficiency and optimization of deep learning from the viewpoint of non-convexity