

Poster in Workshop: Information-Theoretic Methods for Rigorous, Responsible, and Reliable Machine Learning (ITR3)

Data-Dependent PAC-Bayesian Bounds in the Random-Subset Setting with Applications to Neural Networks

Fredrik Hellström · Giuseppe Durisi


Abstract:

The PAC-Bayesian framework has proven to be a useful tool for obtaining nonvacuous generalization bounds for modern learning algorithms, such as overparameterized neural networks. A known heuristic for tightening such bounds is to use data-dependent priors. In this paper, we show how the information-theoretically motivated random-subset setting introduced by Steinke & Zakynthinou (2020) enables the derivation of PAC-Bayesian bounds that naturally involve a data-dependent prior. We evaluate these bounds for neural networks trained on MNIST and Fashion-MNIST, and study how they depend on the training set size and the achieved training accuracy, as well as the effect of randomized labels.
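To make the mechanism concrete, here is a minimal sketch of why the random-subset setting admits a data-dependent prior, using standard PAC-Bayes machinery (a fixed-λ linear bound via the Donsker-Varadhan change of measure) rather than the paper's actual bounds; all notation below is illustrative, not the authors'.

% Illustrative only: a standard linear PAC-Bayes bound applied
% conditionally on the supersample; the paper's exact bounds differ.
% Setting: supersample $\tilde{Z} \in \mathcal{Z}^{n \times 2}$, selection
% vector $S \sim \mathrm{Unif}(\{0,1\}^n)$ independent of $\tilde{Z}$;
% the training set is $Z(S) = (\tilde{Z}_{i,S_i})_{i=1}^n$ and the
% held-out "ghost" set is $Z(\bar{S}) = (\tilde{Z}_{i,1-S_i})_{i=1}^n$.
% For a loss $\ell \in [0,1]$, any fixed $\lambda > 0$, and any prior
% $P(\tilde{Z})$ chosen as a function of the supersample alone, with
% probability at least $1-\delta$ over $S$ (for every $\tilde{Z}$),
% simultaneously for all posteriors $Q$:
\[
  \hat{L}_{Z(\bar{S})}(Q) - \hat{L}_{Z(S)}(Q)
  \le \frac{\mathrm{KL}\!\left(Q \,\middle\|\, P(\tilde{Z})\right) + \ln(1/\delta)}{\lambda n}
      + \frac{\lambda}{2}.
\]
% Sketch: conditioned on $\tilde{Z}$, each per-pair loss difference is a
% mean-zero random variable in $[-1,1]$ (flipping $S_i$ swaps the two
% samples of pair $i$), hence $1$-sub-Gaussian; summing over the $n$
% independent pairs, applying the Donsker-Varadhan change of measure with
% prior $P(\tilde{Z})$, and using Markov's inequality on the exponential
% moment yields the display above.

The structural point this sketch is meant to convey: the prior may depend on all 2n supersample examples because the selection S, which determines the training set, is independent of the supersample; this is what makes the prior data-dependent without invalidating the change-of-measure step.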
