

Poster

Towards Generalization beyond Pointwise Learning: A Unified Information-theoretic Perspective

Yuxin Dong · Tieliang Gong · Hong Chen · Zhongjiang He · Shiquan Wang · Shuangyong Song · Chen Li

Hall C 4-9 #1412
[ Project Page ] [ Paper PDF ]
Thu 25 Jul 2:30 a.m. PDT — 4 a.m. PDT

Abstract:

The recent surge in contrastive learning has intensified interest in understanding the generalization of non-pointwise learning paradigms. While information-theoretic analysis achieves remarkable success in characterizing the generalization behavior of learning algorithms, its applicability is largely confined to pointwise learning, with extensions even to the simplest pairwise settings remaining unexplored due to the challenges of non-i.i.d. losses and dimensionality explosion. In this paper, we develop the first series of information-theoretic bounds extending beyond pointwise scenarios, encompassing pointwise, pairwise, triplet, quadruplet, and higher-order scenarios, all within a unified framework. Specifically, our hypothesis-based bounds elucidate the generalization behavior of iterative and noisy learning algorithms via gradient covariance analysis, and our prediction-based bounds accurately estimate the generalization gap with computationally tractable low-dimensional information metrics. Comprehensive numerical studies then demonstrate the effectiveness of our bounds in capturing the generalization dynamics across diverse learning scenarios.
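For background (not part of the abstract above), the classical pointwise information-theoretic bound that such analyses build on can be sketched as follows; the notation here (hypothesis W, i.i.d. sample S of size n, sigma-sub-Gaussian loss) is assumed for illustration and is not taken from this page.

% Classical pointwise information-theoretic generalization bound:
% the expected generalization gap of an algorithm mapping an i.i.d.
% sample S = (Z_1, ..., Z_n) to a hypothesis W is controlled by the
% mutual information I(W; S), assuming a sigma-sub-Gaussian loss.
\[
  \bigl|\mathbb{E}\!\left[L_\mu(W) - L_S(W)\right]\bigr|
  \;\le\;
  \sqrt{\frac{2\sigma^2}{n}\, I(W; S)},
\]
% where L_mu is the population risk and L_S the empirical risk.

In pairwise and higher-order settings, the loss couples multiple sample points, so the sum of per-example losses no longer consists of independent terms and the relevant information measures become high-dimensional; these are the non-i.i.d. and dimensionality-explosion challenges the abstract refers to.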
