Are activation functions required for learning in all deep networks?

  • Grigoris Chrysos (UW Madison)

Abstract

Activation functions play a pivotal role in deep neural networks, enabling them to tackle complex tasks such as image recognition. However, they also introduce significant challenges for deep learning theory, for the analysis of network dynamics, and for properties such as interpretability and privacy preservation. In this talk, we revisit the necessity of activation functions across various scenarios. Specifically, we explore expressing network outputs through high-degree interactions among input elements using multilinear algebra. This approach allows us to attain the required expressivity through these intricate interactions. Yet the question remains: is this expressivity alone sufficient for effective learning? Our recent research, presented at ICLR’24, introduces carefully designed networks built from multilinear operations. Remarkably, these networks achieve strong performance even on demanding tasks such as ImageNet image recognition.
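To make the multilinear idea concrete, the sketch below (a minimal illustration, not the architecture from the ICLR’24 paper) shows how a network can build high-degree interactions among input elements without any activation function: each block multiplies two linear projections of its input elementwise, so stacking k blocks yields a polynomial of degree 2^k in the input. The class and parameter names (MultilinearBlock, dim_in, dim_hidden) are illustrative assumptions, not names from the paper.

```python
import torch
import torch.nn as nn

class MultilinearBlock(nn.Module):
    """Degree-2 multilinear block with no activation function.

    The Hadamard product of two linear projections captures pairwise
    (second-order) interactions among input elements; the linear skip
    term retains first-order information. Names here are illustrative.
    """
    def __init__(self, dim_in: int, dim_hidden: int):
        super().__init__()
        self.proj_a = nn.Linear(dim_in, dim_hidden)
        self.proj_b = nn.Linear(dim_in, dim_hidden)
        self.skip = nn.Linear(dim_in, dim_hidden)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # (A x) * (B x) is quadratic in x; adding the skip term C x
        # gives a degree-2 polynomial of the input, with no pointwise
        # nonlinearity anywhere.
        return self.proj_a(x) * self.proj_b(x) + self.skip(x)

# Stacking k blocks composes the polynomials: the output is a
# polynomial of degree 2**k in the input features.
net = nn.Sequential(
    MultilinearBlock(784, 256),
    MultilinearBlock(256, 128),
    nn.Linear(128, 10),  # linear classifier head
)

x = torch.randn(32, 784)  # a batch of flattened inputs
logits = net(x)           # shape: (32, 10)
```

Expressivity of this kind is what the abstract refers to; whether such purely multilinear stacks also learn effectively at scale is precisely the question the talk addresses.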

Seminar talk: 23.05.24

Math Machine Learning seminar MPI MIS + UCLA

Venue: MPI for Mathematics in the Sciences (live stream)

Contact: Katharina Matschke, MPI for Mathematics in the Sciences (via email)

Upcoming event of this seminar: 13.06.24