Talk

Tensors, deep learning: new results and open problems

  • Ivan Oseledets (Skoltech Moscow)
E1 05 (Leibniz-Saal)

Abstract

Deep neural networks and tensors are different forms of approximation of multivariate functions. In this talk, I will give an overview of our recent results on tensor and matrix analysis, deep learning, and the connections between them:

1) Desingularization of low-rank matrix manifolds (joint with V. Khrulkov)

2) The expressive power of recurrent neural networks (joint with V. Khrulkov and A. Novikov)

3) Universal adversarial examples and singular vectors (joint with V. Khrulkov)

4) Geometry score: a way to compare generative adversarial networks (joint with V. Khrulkov)

Contact: Mirke Olschewski, MPI for Mathematics in the Sciences (via email)
