PAC-Bayes Un-Expected Bernstein Inequality - INRIA - Institut National de Recherche en Informatique et en Automatique
Conference paper — Year: 2019

PAC-Bayes Un-Expected Bernstein Inequality

Abstract

We present a new PAC-Bayesian generalization bound. Standard bounds contain a $\sqrt{L_n \cdot \mathrm{KL}/n}$ complexity term which dominates unless $L_n$, the empirical error of the learning algorithm's randomized predictions, vanishes. We manage to replace $L_n$ by a term which vanishes in many more situations, essentially whenever the employed learning algorithm is sufficiently stable on the dataset at hand. Our new bound consistently beats state-of-the-art bounds both on a toy example and on UCI datasets (with large enough $n$). Theoretically, unlike existing bounds, our new bound can be expected to converge to $0$ faster whenever a Bernstein/Tsybakov condition holds, thus connecting PAC-Bayesian generalization and {\em excess risk} bounds; for the latter it has long been known that faster convergence can be obtained under Bernstein conditions. Our main technical tool is a new concentration inequality which is like Bernstein's but with $X^2$ taken outside its expectation.
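The abstract's last sentence can be sketched schematically. This is an illustrative shape only, not the paper's exact statement: the constant $c$ and the admissible range of $\eta$ are placeholders here, and the paper's precise constants differ. Classical Bernstein-type moment bounds control the exponential moment via the variance proxy $\mathbb{E}[X^2]$, whereas the "un-expected" variant moves $X^2$ out of the expectation and into the exponent as an empirical (random) quantity:

```latex
% Classical Bernstein-type exponential-moment bound (schematic):
% the second-moment term appears as an expectation, \mathbb{E}[X^2].
\mathbb{E}\!\left[e^{\eta\,(\mathbb{E}[X] - X)}\right]
  \;\le\; e^{c\,\eta^2\, \mathbb{E}[X^2]}

% "Un-expected" variant (schematic): X^2 is taken outside its
% expectation, entering the exponent as a random quantity, so the
% resulting bound involves empirical rather than expected second moments.
\mathbb{E}\!\left[e^{\eta\,(\mathbb{E}[X] - X)\; -\; c\,\eta^2 X^2}\right]
  \;\le\; 1
```

In a PAC-Bayesian bound, the second form is what lets an empirical second-moment term replace $L_n$, which is why the resulting complexity term can vanish whenever the learning algorithm is stable on the data, rather than only when the empirical error itself vanishes.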
Main file: 1905.13367.pdf (537.42 KB)
Origin: files produced by the author(s)

Dates and versions

hal-02401295, version 1 (09-12-2019)

Identifiers

Cite

Zakaria Mhammedi, Peter Grünwald, Benjamin Guedj. PAC-Bayes Un-Expected Bernstein Inequality. NeurIPS 2019, Dec 2019, Vancouver, Canada. ⟨hal-02401295⟩
