Poster
A Unified View on PAC-Bayes Bounds for Meta-Learning
Arezou Rezazadeh
Hall E #630
Keywords: [ T: Deep Learning ] [ PM: Bayesian Models and Methods ] [ T: Probabilistic Methods ] [ MISC: Supervised Learning ] [ OPT: Convex ] [ T: Miscellaneous Aspects of Machine Learning ] [ T: Learning Theory ] [ MISC: Transfer, Multitask and Meta-learning ]
Meta-learning automatically infers an inductive bias, which includes the hyperparameters of the base learning algorithm, by observing data from a finite number of related tasks. This paper studies PAC-Bayes bounds on the meta-generalization gap. The meta-generalization gap comprises two sources of generalization gap: the environment-level and task-level gaps, which result from observing a finite number of tasks and a finite number of data samples per task, respectively. By upper bounding arbitrary convex functions that link the expected and empirical losses at both the environment level and the per-task level, we obtain new PAC-Bayes bounds. Using these bounds, we develop new PAC-Bayes meta-learning algorithms. Numerical examples demonstrate the merits of the proposed novel bounds and algorithms in comparison to prior PAC-Bayes bounds for meta-learning.
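For context, a minimal sketch of the single-task template that convex-function PAC-Bayes bounds of this kind build on: the classical inequality of Germain et al. (2009). The symbols below ($\pi$, $\rho$, $S$, $m$, $D$, $\delta$) are generic placeholders and not necessarily the paper's notation; per the abstract, the meta-learning bounds apply this style of argument at both the environment level and the per-task level.

% Single-task convex-function PAC-Bayes template (Germain et al., 2009).
% pi: prior over hypotheses; rho: any posterior; S: i.i.d. sample of size m;
% \hat{L}_S and L: empirical and expected losses; D: any convex function of
% the pair of losses. With probability at least 1 - \delta over S, for all rho:
\[
  D\big(\hat{L}_S(\rho),\, L(\rho)\big)
  \;\le\;
  \frac{1}{m}\left[
    \mathrm{KL}(\rho \,\|\, \pi)
    \;+\;
    \ln \frac{1}{\delta}\,
    \mathbb{E}_{h \sim \pi}\,
    \mathbb{E}_{S' \sim \mathcal{D}^m}\!
    \left[ e^{\, m\, D\left(\hat{L}_{S'}(h),\, L(h)\right)} \right]
  \right].
\]

Convexity of $D$ is what allows Jensen's inequality to move the posterior expectation inside $D$ after the change of measure from $\rho$ to $\pi$; choosing different $D$ (e.g., the binary KL divergence or a quadratic distance) recovers different known bounds as special cases.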