

Poster in Workshop: Information-Theoretic Methods for Rigorous, Responsible, and Reliable Machine Learning (ITR3)

Excess Risk Analysis of Learning Problems via Entropy Continuity

Aolin Xu


Abstract:

This paper gives an overview of a recently developed framework for the excess risk analysis of learning problems via the distributional continuity of the generalized entropy. Three paradigms of learning problems are considered: 1) frequentist learning; 2) Bayesian learning; and 3) learning by first fitting the empirical distribution to a predefined family of distributions and then designing the decision rule under the fitted distribution. It is shown that the excess risks in the first two paradigms can be studied via the continuity of various forms of generalized unconditional entropy, while the third paradigm can be studied via the continuity of generalized conditional entropy. Representative bounds derived for each learning problem are presented and discussed.
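For context, a minimal sketch of the central definitions, with notation that is an assumption of this summary rather than taken from the paper: given a loss function $\ell : \mathcal{X} \times \mathcal{A} \to \mathbb{R}$, the generalized entropy of a distribution $P$ on $\mathcal{X}$ is the minimum achievable expected loss

$$H_\ell(P) \;=\; \inf_{a \in \mathcal{A}} \mathbb{E}_{X \sim P}\big[\ell(X, a)\big],$$

and the generalized conditional entropy given an observation $Y$ is $\mathbb{E}_Y\big[H_\ell(P_{X|Y})\big]$. If $\hat{a}$ is a decision designed under an estimate $\hat{P}$ of $P$ (empirical, posterior, or fitted, matching the three paradigms above), its excess risk is

$$\mathbb{E}_{X \sim P}\big[\ell(X, \hat{a})\big] \;-\; H_\ell(P),$$

and bounds on this quantity follow from quantifying how $H_\ell(\cdot)$ varies as its argument moves from $\hat{P}$ to $P$, i.e. from the continuity of the generalized entropy in the distribution.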
