Abstract:

Meta-learning automatically infers an inductive bias, which includes the hyperparameters of the base learning algorithm, by observing data from a finite number of related tasks. This paper studies PAC-Bayes bounds on the meta-generalization gap. The meta-generalization gap comprises two sources of generalization error: the environment-level and task-level gaps, which result from observing a finite number of tasks and a finite number of data samples per task, respectively. In this paper, by upper bounding arbitrary convex functions that link the expected and empirical losses at the environment level and at the per-task level, we obtain new PAC-Bayes bounds. Using these bounds, we develop new PAC-Bayes meta-learning algorithms. Numerical examples demonstrate the merits of the proposed bounds and algorithms in comparison to prior PAC-Bayes bounds for meta-learning.
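As a point of reference, and not spelled out in the abstract itself, a standard single-task PAC-Bayes template built from an arbitrary convex function (in the style of Germain et al., 2009, and Bégin et al., 2016) is sketched below; the abstract indicates that the paper's bounds apply this idea at both the task and environment levels. Here $\hat{L}_S(w)$ denotes the empirical loss of hypothesis $w$ on the $m$-sample $S$, $L(w)$ its expected loss, $P$ a data-free prior, and $Q$ any posterior.

\[
\Delta\!\Big(\mathbb{E}_{w\sim Q}\big[\hat{L}_S(w)\big],\, \mathbb{E}_{w\sim Q}\big[L(w)\big]\Big)
\;\le\; \frac{1}{m}\left[\mathrm{KL}(Q\,\|\,P) + \ln\frac{I_\Delta(m)}{\delta}\right],
\qquad
I_\Delta(m) \;=\; \mathbb{E}_{S}\,\mathbb{E}_{w\sim P}\Big[e^{\,m\,\Delta(\hat{L}_S(w),\,L(w))}\Big],
\]

holding with probability at least $1-\delta$ over the draw of $S$, simultaneously for all posteriors $Q$ and for any convex $\Delta:[0,1]\times[0,1]\to\mathbb{R}$. For instance, choosing $\Delta(q,p)=\mathrm{kl}(q\,\|\,p)$, the binary KL divergence, recovers the Seeger-Maurer bound via $I_\Delta(m)\le 2\sqrt{m}$.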
