Talk

Spectral clustering, Laplacian Eigenmaps, neural network approximation and beyond

  • Chenghui Li (University of Wisconsin-Madison)
E2 10 (Leon-Lichtenstein)

Abstract

Spectral clustering is a widely used technique for uncovering underlying data structures, and researchers have explored various tools to analyze its theoretical properties. Recently, a variational method has been employed to show that spectral clustering results converge to a continuum operator that captures the geometry of the underlying manifold. This line of research not only offers an effective explanation of spectral clustering but also inspires algorithm design.
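
To make the setup concrete, here is a minimal sketch (not taken from the talk) of spectral clustering via a Laplacian eigenmap embedding in Python; the Gaussian similarity graph, the unnormalized Laplacian, the bandwidth, and the two-circles toy data are illustrative assumptions rather than the speaker's construction.

```python
import numpy as np
from scipy.spatial.distance import cdist
from sklearn.cluster import KMeans

def spectral_clustering(X, n_clusters=2, bandwidth=0.5):
    """Cluster points X (n x d) using eigenvectors of a graph Laplacian."""
    # Gaussian similarity graph built on the point cloud.
    W = np.exp(-cdist(X, X, "sqeuclidean") / (2 * bandwidth ** 2))
    np.fill_diagonal(W, 0.0)
    # Unnormalized graph Laplacian L = D - W; its low-frequency eigenvectors
    # serve as the Laplacian eigenmap coordinates.
    L = np.diag(W.sum(axis=1)) - W
    _, eigvecs = np.linalg.eigh(L)
    embedding = eigvecs[:, :n_clusters]
    return KMeans(n_clusters, n_init=10).fit_predict(embedding)

# Toy example: two noisy concentric circles, a simple multi-manifold setting.
rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 400)
radii = np.repeat([1.0, 3.0], 200)
X = np.c_[radii * np.cos(theta), radii * np.sin(theta)]
X += 0.05 * rng.normal(size=X.shape)
labels = spectral_clustering(X, n_clusters=2, bandwidth=0.3)
```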

In this talk, I will present two significant applications of this convergence, focusing on algorithm design and interpretation. First, I will discuss two sufficient conditions for designing an algorithm capable of solving multi-manifold clustering problems. I will then present an explicit algorithm that satisfies these conditions.

Next, I will introduce Spectral Neural Networks, a recent technique that achieves state-of-the-art performance in contrastive learning. I will show that the number of neurons required to approximate the manifold's geometry does not suffer from the curse of dimensionality. I will also examine Spectral Neural Networks from an optimization perspective, showing that the ambient problem is convex in a quotient geometry and that the parameterized problem admits a locally strongly convex regime. To support these theoretical findings, I will present simulation results.
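
As a rough illustration of the idea of approximating a spectral embedding with a neural network, the sketch below trains a small network whose outputs mimic the leading Laplacian eigenvectors; the specific loss (a Dirichlet-energy term plus an orthogonality penalty), the architecture, and the hyperparameters are assumptions made for illustration, not the speaker's formulation of Spectral Neural Networks.

```python
import numpy as np
import torch
import torch.nn as nn
from scipy.spatial.distance import cdist

# Toy data and graph Laplacian (same construction as the sketch above).
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 2))
W = np.exp(-cdist(X, X, "sqeuclidean") / (2 * 0.5 ** 2))
np.fill_diagonal(W, 0.0)
L = np.diag(W.sum(axis=1)) - W

X_t = torch.tensor(X, dtype=torch.float32)
L_t = torch.tensor(L, dtype=torch.float32)

def spectral_loss(F, L, ortho_weight=10.0):
    """Dirichlet energy tr(F^T L F)/n plus a penalty keeping F^T F / n close
    to the identity, so the network outputs approximate an orthonormal set of
    low-frequency Laplacian eigenvectors."""
    n = F.shape[0]
    dirichlet = torch.trace(F.T @ L @ F) / n
    ortho = ((F.T @ F / n - torch.eye(F.shape[1])) ** 2).sum()
    return dirichlet + ortho_weight * ortho

net = nn.Sequential(nn.Linear(2, 64), nn.ReLU(),
                    nn.Linear(64, 64), nn.ReLU(),
                    nn.Linear(64, 2))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for _ in range(2000):
    opt.zero_grad()
    loss = spectral_loss(net(X_t), L_t)
    loss.backward()
    opt.step()
# net(X_t) now parameterizes an approximate two-dimensional spectral embedding.
```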

Antje Vandenberg

MPI for Mathematics in the Sciences