Abstract for the talk on 29.10.2020 (17:00 h), Math Machine Learning seminar MPI MIS + UCLA
Yaim Cooper (Institute for Advanced Study, Princeton)
The geometry of the loss function of deep neural networks
The mathematical heart of deep learning is gradient descent on a loss function L. If gradient descent converges, it does so to a critical point of L. Thus the geometry of the locus of critical points is of great interest. We will discuss what is known about the critical points of L, including dimension estimates and connectedness results.
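As a minimal illustration of the abstract's premise (not part of the talk itself), the sketch below runs gradient descent on a toy loss L(w) = (w1² − w2)², whose critical locus {w2 = w1²} is a one-dimensional curve rather than a set of isolated points, loosely analogous to the positive-dimensional critical loci that arise for deep networks. The loss, learning rate, and starting point are all assumptions chosen for illustration.

```python
def grad_L(w1, w2):
    """Analytic gradient of the toy loss L(w) = (w1^2 - w2)^2."""
    r = w1 * w1 - w2
    return (4.0 * w1 * r, -2.0 * r)

def gradient_descent(w, lr=0.05, steps=2000):
    """Plain gradient descent; parameters are illustrative, not tuned."""
    w1, w2 = w
    for _ in range(steps):
        g1, g2 = grad_L(w1, w2)
        w1 -= lr * g1
        w2 -= lr * g2
    return w1, w2

w1, w2 = gradient_descent((1.5, -0.5))
g1, g2 = grad_L(w1, w2)
# At convergence the gradient vanishes, so the iterate sits (numerically)
# on the critical curve w2 = w1^2; which point on the curve is reached
# depends on the initialization.
```

Note that every point of the curve w2 = w1² is a global minimizer here, so different initializations converge to different critical points of the same connected critical locus.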