MiS Preprint Repository

We have decided to discontinue the publication of preprints on our preprint server as of 1 March 2024. The publication culture within mathematics has changed so much due to the rise of repositories such as arXiv (www.arxiv.org) that we are encouraging all institute members to make their preprints available there. An institute repository in its previous form is therefore unnecessary. The preprints published to date will remain available here, but no new preprints will be added.

MiS Preprint
21/2021

PAC-Bayes and Information Complexity

Pradeep Kumar Banerjee and Guido Montúfar

Abstract

We point out that a number of well-known PAC-Bayesian-style and information-theoretic generalization bounds for randomized learning algorithms can be derived under a common framework, starting from a fundamental information exponential inequality. We also obtain new bounds for data-dependent priors and unbounded loss functions. Optimizing these bounds naturally gives rise to a method called Information Complexity Minimization, for which we discuss two practical examples for learning with neural networks, namely Entropy-SGD and PAC-Bayes-SGD.
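For context, derivations of this kind typically rest on the change-of-measure (Donsker-Varadhan) inequality; the following is the standard form, not necessarily the exact statement used in the preprint. For every prior \pi, every posterior \rho, and every measurable f,

    \mathbb{E}_{w \sim \rho}[f(w)] \;\le\; \mathrm{KL}(\rho \,\|\, \pi) \;+\; \log \mathbb{E}_{w \sim \pi}\big[e^{f(w)}\big].

Choosing f proportional to the gap between population and empirical risk and controlling the exponential moment yields PAC-Bayesian bounds of the familiar form "empirical risk + complexity term", where the complexity is driven by the KL divergence between posterior and prior.

As an illustration of the optimization step, below is a minimal sketch of a PAC-Bayes-SGD-style training objective, assuming a diagonal Gaussian posterior N(mu, diag(sigma^2)), a zero-mean Gaussian prior, and a McAllester-style bound; the function name and signature are hypothetical and not taken from the preprint.

    import math
    import torch

    def pac_bayes_sgd_objective(emp_loss, mu, log_sigma, prior_sigma, n, delta=0.05):
        # emp_loss: empirical risk of the randomized predictor, e.g. evaluated
        # on a reparameterized sample w = mu + sigma * eps (a scalar tensor).
        # KL between the diagonal Gaussian posterior N(mu, diag(sigma^2))
        # and the prior N(0, prior_sigma^2 I), summed over parameter dimensions.
        sigma2 = torch.exp(2.0 * log_sigma)
        kl = 0.5 * torch.sum(
            (sigma2 + mu ** 2) / prior_sigma ** 2
            - 1.0
            - 2.0 * log_sigma
            + 2.0 * math.log(prior_sigma)
        )
        # McAllester-style complexity term: with probability >= 1 - delta the
        # population risk is bounded by emp_loss + sqrt(complexity / 2).
        complexity = (kl + math.log(2.0 * math.sqrt(n) / delta)) / n
        return emp_loss + torch.sqrt(0.5 * complexity)

Running SGD on this objective jointly over mu and log_sigma trades empirical risk against the KL term, which is the sense in which optimizing such a bound becomes a training method.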

Received:
Sep 20, 2021
Published:
Sep 20, 2021
MSC Codes:
68Q32, 68T05, 94A15
Keywords:
PAC-Bayes generalization bounds, Gibbs algorithm, flat minima

Related publications

inBook
2021 Repository Open Access
Pradeep Kumar Banerjee and Guido Montúfar

PAC-Bayes and information complexity

In: ICLR 2021 Workshop on Neural Compression: From Information Theory to Applications. ICLR, 2021, pp. 1-15