Wasserstein Proximal of GANs
Alex Tong Lin, Wuchen Li, Stanley Osher, and Guido Montúfar
Submission date: 06. Oct. 2018
Keywords and phrases: optimal transport, natural gradient, Generative Adversarial Network
We introduce a new method for training GANs by applying the Wasserstein-2 metric proximal operator to the generator. This approach is based on the gradient operator induced by optimal transport theory, which connects the geometry of the sample space and the parameter space in implicit deep generative models. From this theory, we obtain an easy-to-implement regularizer for the parameter updates. Our experiments demonstrate that this method improves both the speed and the stability of GAN training, as measured by wall-clock time and Fréchet Inception Distance (FID) learning curves.
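To make the idea concrete, the sketch below illustrates one plausible form of such a proximal regularizer on a toy problem: the generator's ordinary loss is augmented by a penalty on the mean squared displacement of generated samples between the old and new parameters, which approximates the Wasserstein-2 distance between the two generated distributions. The linear generator, the loss, and the coefficient `lam` are all hypothetical stand-ins chosen for illustration, not the paper's exact architecture or hyperparameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy generator g(z; theta) = theta * z, standing in for a deep generator.
def generator(theta, z):
    return theta * z

# Hypothetical generator loss: push g(z; theta) toward the target map z -> 3z.
def loss(theta, z):
    return np.mean((generator(theta, z) - 3.0 * z) ** 2)

# Proximal-regularized objective (sketch): loss plus a penalty on the mean squared
# displacement of generated samples between old and new parameters, an
# approximation of the squared Wasserstein-2 distance between the two
# generated distributions.
def proximal_objective(theta, theta_old, z, lam=0.1):
    displacement = generator(theta, z) - generator(theta_old, z)
    return loss(theta, z) + (1.0 / (2.0 * lam)) * np.mean(displacement ** 2)

# One regularized update step via a finite-difference gradient (for clarity;
# a real implementation would use automatic differentiation).
z = rng.normal(size=1000)
theta_old = 1.0
eps, lr = 1e-5, 0.1
grad = (proximal_objective(theta_old + eps, theta_old, z)
        - proximal_objective(theta_old - eps, theta_old, z)) / (2.0 * eps)
theta_new = theta_old - lr * grad
```

The proximal penalty leaves the gradient at `theta_old` unchanged but discourages large jumps in the generated distribution across an update, which is the mechanism the abstract credits for the improved training stability.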