

Poster

CLUB: A Contrastive Log-ratio Upper Bound of Mutual Information

Pengyu Cheng · Weituo Hao · Shuyang Dai · Jiachang Liu · Zhe Gan · Lawrence Carin

Keywords: [ Deep Learning - General ] [ Bayesian Deep Learning ]


Abstract:

There has been considerable recent interest in mutual information (MI) minimization for various machine learning tasks. However, estimating and minimizing MI in high-dimensional spaces remains a challenging problem, especially when only samples, rather than the underlying distribution forms, are accessible. Previous works mainly focus on MI lower-bound approximation, which is not applicable to MI minimization problems. In this paper, we propose a novel Contrastive Log-ratio Upper Bound (CLUB) of mutual information. We provide a theoretical analysis of the properties of CLUB and its variational approximation. Based on this upper bound, we introduce an accelerated MI minimization training scheme that bridges MI minimization with negative sampling. Simulation studies on Gaussian distributions show that CLUB provides reliable MI estimates. Real-world MI minimization experiments, including domain adaptation and the information bottleneck, further demonstrate the effectiveness of the proposed method.
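As a rough illustration of the idea in the abstract, the sketch below implements a CLUB-style upper-bound estimator of the form E_{p(x,y)}[log q(y|x)] - E_{p(x)}E_{p(y)}[log q(y|x)], where q(y|x) is a variational approximation to the conditional. The Gaussian parameterization of q, the network sizes, and the single-shuffle negative term are assumptions chosen for a compact example, not the authors' exact configuration.

```python
# Minimal sketch of a CLUB-style MI upper-bound estimator (assumptions:
# diagonal-Gaussian variational q_theta(y|x), illustrative network sizes,
# one shuffled negative sample per positive pair).
import torch
import torch.nn as nn


class CLUBEstimator(nn.Module):
    """Estimates an upper bound on I(X;Y) as
    E_{p(x,y)}[log q(y|x)] - E_{p(x)p(y)}[log q(y|x)],
    with q(y|x) a diagonal Gaussian whose mean and log-variance
    are predicted from x."""

    def __init__(self, x_dim, y_dim, hidden_dim=64):
        super().__init__()
        self.mu_net = nn.Sequential(
            nn.Linear(x_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, y_dim))
        self.logvar_net = nn.Sequential(
            nn.Linear(x_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, y_dim))

    def log_likelihood(self, x, y):
        # log q(y|x) under the diagonal Gaussian, summed over dimensions.
        # The additive -0.5*d*log(2*pi) constant is dropped: it cancels
        # between the positive and negative terms of the bound.
        mu, logvar = self.mu_net(x), self.logvar_net(x)
        return (-(y - mu) ** 2 / logvar.exp() - logvar).sum(dim=-1) / 2.0

    def forward(self, x, y):
        # Positive term: jointly drawn pairs (x_i, y_i).
        positive = self.log_likelihood(x, y)
        # Negative term: break the pairing by shuffling y within the batch,
        # approximating samples from the product of marginals p(x)p(y);
        # this is the negative-sampling step that makes the bound contrastive.
        y_shuffled = y[torch.randperm(y.size(0))]
        negative = self.log_likelihood(x, y_shuffled)
        return (positive - negative).mean()


# Usage sketch: fit q_theta by maximizing log-likelihood on paired samples,
# then use the forward pass as the MI penalty to minimize.
x, y = torch.randn(128, 10), torch.randn(128, 10)
club = CLUBEstimator(x_dim=10, y_dim=10)
q_loss = -club.log_likelihood(x, y).mean()  # step 1: train the estimator
mi_upper = club(x, y)                       # step 2: penalty for the model
```

In an MI minimization loop, these two steps would typically alternate: update q_theta to track the current conditional, then minimize the returned estimate with respect to the model producing x and y.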
