Preprint 21/2021

PAC-Bayes and Information Complexity

Pradeep Kumar Banerjee and Guido Montúfar

Submission date: 20 September 2021
Pages: 17
MSC-Numbers: 68Q32, 68T05, 94A15
Keywords and phrases: PAC-Bayes generalization bounds, Gibbs algorithm, flat minima

Abstract:
We point out that a number of well-known PAC-Bayesian-style and information-theoretic generalization bounds for randomized learning algorithms can be derived within a common framework, starting from a fundamental information exponential inequality. We also obtain new bounds for data-dependent priors and unbounded loss functions. Optimizing these bounds naturally gives rise to a method called Information Complexity Minimization, for which we discuss two practical examples for learning with neural networks, namely Entropy-SGD and PAC-Bayes SGD.
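For orientation, the change-of-measure step underlying such derivations can be sketched as follows; the notation here ($g$, $\rho$, $\pi$, $S$, $\delta$) is illustrative and need not match the paper's. By the Donsker-Varadhan variational formula, for any prior $\pi$, posterior $\rho$, and measurable function $g$,
\[
\mathbb{E}_{w \sim \rho}\big[g(w,S)\big] \;\le\; \mathrm{KL}(\rho \,\|\, \pi) + \log \mathbb{E}_{w \sim \pi}\big[e^{g(w,S)}\big],
\]
and controlling the last term in expectation over the sample $S$ via Markov's inequality yields, with probability at least $1-\delta$, simultaneously for all posteriors $\rho$,
\[
\mathbb{E}_{w \sim \rho}\big[g(w,S)\big] \;\le\; \mathrm{KL}(\rho \,\|\, \pi) + \log\frac{1}{\delta} + \log \mathbb{E}_{S}\, \mathbb{E}_{w \sim \pi}\big[e^{g(w,S)}\big].
\]
Choosing $g$ as a scaled gap between population and empirical risk recovers standard PAC-Bayes bounds, and minimizing the right-hand side over $\rho$ leads to a Gibbs posterior, which is the sense in which optimizing such bounds yields an information-complexity-minimization method.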
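As a concrete illustration of the first practical example, below is a minimal sketch of the Entropy-SGD idea (Chaudhari et al.) on a toy problem; it is not the specific variant analyzed in the preprint, and all function names and hyperparameter values are our own illustrative choices. The inner SGLD loop samples from a local Gibbs measure around the current iterate, and the outer update follows the resulting local-entropy gradient.

    import numpy as np

    rng = np.random.default_rng(0)

    def loss_grad(w, X, y):
        # Gradient of the mean squared error of a linear model; a toy
        # stand-in for a stochastic neural-network gradient.
        return 2.0 * X.T @ (X @ w - y) / len(y)

    def entropy_sgd_step(w, X, y, gamma=0.1, eta_outer=1.0,
                         eta_inner=0.05, thermal=1e-3, L=20, alpha=0.75):
        # One outer step. The inner loop runs SGLD on the local Gibbs
        # measure exp(-f(w') - gamma/2 * ||w' - w||^2); the running mean
        # mu of the inner iterates estimates its expectation, and the
        # outer update follows the local-entropy gradient gamma * (w - mu).
        w_in, mu = w.copy(), w.copy()
        for _ in range(L):
            g = loss_grad(w_in, X, y) + gamma * (w_in - w)
            w_in = (w_in - eta_inner * g
                    + np.sqrt(eta_inner) * thermal * rng.standard_normal(w.shape))
            mu = (1.0 - alpha) * mu + alpha * w_in
        return w - eta_outer * gamma * (w - mu)

    # Toy usage: recover a 5-dimensional linear model.
    X = rng.standard_normal((200, 5))
    w_true = rng.standard_normal(5)
    y = X @ w_true + 0.1 * rng.standard_normal(200)

    w = np.zeros(5)
    for _ in range(200):
        w = entropy_sgd_step(w, X, y)
    print("parameter error:", np.linalg.norm(w - w_true))

The appeal of this scheme, and the connection to the keywords above, is that the local-entropy objective biases the search toward flat minima: wide valleys of the loss carry more mass under the local Gibbs measure than sharp ones.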
