Poster

Optimal Bounds between f-Divergences and Integral Probability Metrics

Rohit Agrawal · Thibaut Horel

Virtual

Keywords: [ Convex Optimization ] [ Information Theory and Estimation ] [ Learning Theory ]


Abstract:

The families of f-divergences (e.g. the Kullback-Leibler divergence) and Integral Probability Metrics (e.g. total variation distance or maximum mean discrepancies) are commonly used in optimization and estimation. In this work, we systematically study the relationship between these two families from the perspective of convex duality. Starting from a tight variational representation of the f-divergence, we derive a generalization of the moment generating function, which we show exactly characterizes the best lower bound of the f-divergence as a function of a given IPM. Using this characterization, we obtain new bounds on IPMs defined by classes of unbounded functions, while also recovering in a unified manner well-known results for bounded and subgaussian functions (e.g. Pinsker's inequality and Hoeffding's lemma).
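For reference, and not taken verbatim from the paper: the classical convex-duality variational representation of an f-divergence, of which the work above uses a tightened version, can be written (in standard notation, with f^* the convex conjugate of f) as

\[
D_f(P \,\|\, Q) \;=\; \sup_{g} \Big\{ \mathbb{E}_{P}[g(X)] \;-\; \mathbb{E}_{Q}\big[f^*(g(X))\big] \Big\},
\]

where the supremum ranges over measurable functions g for which both expectations are finite. Pinsker's inequality, one of the classical bounds the abstract says is recovered, relates the total variation distance to the Kullback-Leibler divergence:

\[
\sup_{A} \big| P(A) - Q(A) \big| \;\le\; \sqrt{\tfrac{1}{2}\, D_{\mathrm{KL}}(P \,\|\, Q)}.
\]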
