Talk

Mode Collapse in Self-Consuming Generative Models

  • Nate Gillman (Brown University)

Abstract

As synthetic data becomes higher quality and proliferates on the internet, machine learning models are increasingly trained on a mix of human- and machine-generated data. Despite the success stories of using synthetic data for representation learning, using synthetic data for generative model training creates "self-consuming loops," which may lead to training instability or even collapse unless certain conditions are met. We provide an overview of research in this new and growing field, which began in July 2023 when a group of researchers showed that "Self-Consuming Generative Models Go MAD." Since then, many papers have studied this self-consuming loop problem, but very few have tried to solve it. We will discuss several of these papers, including our recent work, "Self-Correcting Self-Consuming Loops for Generative Model Training" (ICML 2024). Our paper aims to stabilize self-consuming generative model training by introducing a "correction" function, which ensures that the synthetic training data is of higher quality. We empirically validate the effectiveness of self-correcting self-consuming loops on the challenging task of human motion synthesis, and observe that they successfully avoid model collapse, even when the ratio of synthetic data to real data is as high as 100%.
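
To make the setup concrete, here is a minimal Python sketch of a self-consuming training loop with a correction step, in the spirit of the abstract. It is an illustration only: the callables (train_model, sample_model, correct) and the mixing scheme are hypothetical placeholders, not the interface or method from the paper.

import random

def self_consuming_loop(real_data, train_model, sample_model, correct,
                        n_generations=10, synthetic_ratio=1.0):
    # Illustrative sketch only; all callables are hypothetical placeholders.
    # Generation 0: train on real data alone.
    model = train_model(real_data)
    for generation in range(1, n_generations + 1):
        # Draw synthetic samples from the previous generation's model.
        n_synthetic = int(synthetic_ratio * len(real_data))  # 1.0 -> as much synthetic as real
        synthetic = [sample_model(model) for _ in range(n_synthetic)]
        # Apply a correction function so the synthetic data is of higher
        # quality before it re-enters the training set.
        corrected = [correct(x) for x in synthetic]
        # Retrain on the mix of real and corrected synthetic data.
        mixed = list(real_data) + corrected
        random.shuffle(mixed)
        model = train_model(mixed)
    return model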

Seminar: Math Machine Learning seminar MPI MIS + UCLA
Dates: 16.01.25, 30.01.25
Venue: MPI for Mathematics in the Sciences (Live Stream)
Contact: Katharina Matschke, MPI for Mathematics in the Sciences (via mail)