Abstract for the talk on 08.10.2020 (17:00 h)
Math Machine Learning seminar MPI MIS + UCLA
Yaoyu Zhang (Shanghai Jiao Tong University)
Impact of initialization on generalization of deep neural networks
See the video of this talk.
It is well known that initialization can have a huge impact on the performance of deep neural networks (DNNs). In this talk, focusing on regression problems, I will present our empirical and theoretical studies of two types of influence of initialization on the generalization of DNNs. The first type acts through a biased initial DNN output function, whereas the second acts by changing the behavior of the training dynamics. I will also discuss the anti-symmetrical initialization (ASI) trick and other practical implications of our results.
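One common formulation of the ASI trick is to duplicate each hidden unit of the network and negate the output weight of the copy, so that the widened network's output function is identically zero at initialization, removing the initial-output bias. Below is a minimal NumPy sketch of this construction for a one-hidden-layer tanh network; the function names and the specific initialization scheme are illustrative assumptions, not taken from the talk:

```python
import numpy as np

rng = np.random.default_rng(0)

def init_net(d_in, width):
    # Standard random initialization of a one-hidden-layer tanh network
    # f(x) = sum_k a_k * tanh(w_k . x + b_k).
    W = rng.normal(size=(width, d_in))
    b = rng.normal(size=width)
    a = rng.normal(size=width)
    return W, b, a

def forward(params, x):
    W, b, a = params
    return np.tanh(x @ W.T + b) @ a

def asi(params):
    # Anti-symmetrical initialization (sketch): duplicate every hidden
    # unit and negate the output weight of the copy. The paired units
    # cancel exactly, so the doubled network outputs zero everywhere at
    # initialization, yet trains as an ordinary network afterwards.
    W, b, a = params
    return (np.concatenate([W, W]),
            np.concatenate([b, b]),
            np.concatenate([a, -a]))

params = init_net(2, 50)
x = rng.normal(size=(5, 2))
print(np.abs(forward(params, x)).max())       # generically nonzero
print(np.abs(forward(asi(params), x)).max())  # zero up to rounding
```

After this construction, gradient descent is run on the doubled parameter set as usual; the two copies decouple during training, so expressivity is not lost.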