Summary of the talk on 31.07.2020 (17:00), Math Machine Learning seminar MPI MIS + UCLA
Greg Ongie (University of Chicago)
A function space view of overparameterized neural networks
Contrary to classical bias/variance trade-offs, deep learning practitioners have observed that vastly overparameterized neural networks, with the capacity to fit virtually any labels, nevertheless generalize well when trained on real data. One possible explanation of this phenomenon is that complexity control is achieved by implicitly or explicitly controlling the magnitude of the weights of the network. This raises the question: what functions are well-approximated by neural networks whose weights are bounded in norm? In this talk, I will give some partial answers to this question. In particular, I give a precise characterization of the space of functions realizable as a two-layer (i.e., one hidden layer) neural network with ReLU activations having an unbounded number of units, but where the Euclidean norm of the weights in the network remains bounded. Surprisingly, this characterization is naturally posed in terms of the Radon transform as used in computational imaging, and I will show how tools from Radon transform analysis yield novel insights about learning with two- and three-layer ReLU networks.
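To make the setting concrete, the following is a minimal NumPy sketch (not the speaker's code) of the object the abstract describes: a one-hidden-layer ReLU network of width K together with the squared Euclidean norm of its trainable weights, which is the quantity assumed to stay bounded as K grows. The variable names and the choice to leave biases unpenalized are illustrative assumptions, not taken from the talk.

```python
import numpy as np

def relu(z):
    """ReLU activation, applied elementwise."""
    return np.maximum(0.0, z)

def two_layer_net(x, W, b, a):
    """f(x) = sum_k a_k * relu(<w_k, x> + b_k): a width-K, one-hidden-layer ReLU network.

    W has shape (K, d), b has shape (K,), a has shape (K,).
    """
    return a @ relu(W @ x + b)

def weight_norm_sq(W, a):
    """Squared Euclidean norm of the inner and outer weights.

    Biases are left out here (a common convention; an illustrative choice,
    not necessarily the exact norm used in the talk).
    """
    return np.sum(W ** 2) + np.sum(a ** 2)

# Illustrative sizes: width K = 3, input dimension d = 2.
rng = np.random.default_rng(0)
W = rng.standard_normal((3, 2))
b = rng.standard_normal(3)
a = rng.standard_normal(3)
x = np.array([1.0, -0.5])

y = two_layer_net(x, W, b, a)      # scalar network output at x
C = weight_norm_sq(W, a)           # the norm kept bounded as width grows
```

The function-space question in the abstract then asks: as K tends to infinity while `weight_norm_sq` stays bounded, which functions f can arise as limits of such networks?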
One day before the seminar, an announcement with the Zoom link will be sent to those who registered with Valeria Hünniger. To register, please contact her.