Talk

Compressed Sensing and Neural Networks

  • Ekkehard Schnoor (RWTH Aachen)
G3 10 (Lecture hall)

Abstract

This talk will give an introduction to dimensionality reduction with random projections, their use in compressed sensing, and connections to deep learning. In the first part, we begin by discussing distance-preserving linear embeddings as in the classical Johnson-Lindenstrauss lemma. We then introduce compressed sensing (inverse problems with sparsity constraints) and show how it relies on random measurement matrices satisfying the so-called restricted isometry property (RIP), a sufficient condition for sparse recovery. In the second part, we relate these techniques to neural networks, covering topics such as random initialization, (sparse) signal recovery with neural networks, and compressed sensing with generative models.

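As a small, self-contained illustration of the first ingredient mentioned in the abstract (a sketch only, not material from the talk): a suitably scaled random Gaussian matrix approximately preserves the pairwise distances of a finite point set, which is the content of the Johnson-Lindenstrauss lemma. The dimensions and the NumPy-based check below are illustrative choices, not taken from the source.

import numpy as np

rng = np.random.default_rng(0)
n, d, m = 50, 5000, 300   # number of points, ambient dimension, embedding dimension

X = rng.normal(size=(n, d))               # point cloud in R^d
A = rng.normal(size=(m, d)) / np.sqrt(m)  # scaled Gaussian random projection
Y = X @ A.T                               # embedded points in R^m

# Compare all pairwise distances before and after the projection;
# for m on the order of log(n) / eps^2 the ratios concentrate near 1.
i, j = np.triu_indices(n, k=1)
orig = np.linalg.norm(X[i] - X[j], axis=1)
proj = np.linalg.norm(Y[i] - Y[j], axis=1)
print(f"distance ratios lie in [{(proj / orig).min():.3f}, {(proj / orig).max():.3f}]")
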
Contact: Katharina Matschke, MPI for Mathematics in the Sciences