

Poster

Marginal Distribution Adaptation for Discrete Sets via Module-Oriented Divergence Minimization

Hanjun Dai · Mengjiao Yang · Yuan Xue · Dale Schuurmans · Bo Dai

Hall E #333

Keywords: [ MISC: Transfer, Multitask and Meta-learning ] [ MISC: Representation Learning ] [ DL: Generative Models and Autoencoders ]


Abstract:

Distributions over discrete sets capture essential statistics, including high-order correlations among elements. Such information provides powerful insight for decision making across various application domains, e.g., product assortment based on the distribution of products in shopping carts. While deep generative models trained on pre-collected data can capture an existing distribution, such pre-trained models usually cannot align with a target domain under distribution shift, whether caused by temporal drift or a change in the population mix. We develop a general framework that adapts a generative model to a (possibly counterfactual) target data distribution with both sampling and computational efficiency. Concretely, instead of re-training a full model from scratch, we reuse the learned modules to preserve the correlations between set elements, while adjusting only the corresponding components to align with target marginal constraints. We instantiate the approach for three commonly used forms of discrete set distribution---latent variable, autoregressive, and energy-based models---and provide efficient solutions for marginal-constrained optimization in either primal or dual form. Experiments on synthetic data and real-world e-commerce and EHR datasets show that the proposed framework can practically align a generative model with marginal constraints under distribution shift.
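One way to picture the dual form of marginal-constrained adaptation described above is as an exponential tilting of the pretrained model: per-item dual variables reweight samples from the base model until the reweighted marginals match the target, leaving the base model's correlations intact. The sketch below is illustrative only (it is not the paper's actual algorithm): it assumes sets are encoded as binary indicator vectors, uses samples from the base model as a fixed empirical approximation, and the function name, learning rate, and step count are hypothetical choices.

```python
import numpy as np

def adapt_marginals(samples, target_marginals, steps=500, lr=0.5):
    """Hypothetical dual-ascent sketch of marginal adaptation.

    samples: (n, d) binary matrix; each row is a set drawn from a
        pretrained model p, with samples[k, i] = 1 iff item i is in set k.
    target_marginals: (d,) desired per-item inclusion probabilities.

    Returns per-item tilting weights lam defining the adapted model
        p_lam(S) proportional to p(S) * exp(sum_{i in S} lam_i),
    which reuses p's learned correlations while matching the marginals.
    """
    n, d = samples.shape
    lam = np.zeros(d)
    for _ in range(steps):
        # Self-normalized importance weights of the samples under p_lam.
        logw = samples @ lam
        w = np.exp(logw - logw.max())
        w /= w.sum()
        # Marginals of the tilted model: E_{p_lam}[ 1[i in S] ].
        model_marginals = w @ samples
        # Dual ascent on the concave dual: gradient is the marginal gap.
        lam += lr * (target_marginals - model_marginals)
    return lam

# Illustrative usage: tilt samples with marginals ~(0.2, 0.5, 0.8)
# toward target marginals (0.4, 0.4, 0.6).
rng = np.random.default_rng(0)
base = np.array([0.2, 0.5, 0.8])
samples = (rng.random((5000, 3)) < base).astype(float)
target = np.array([0.4, 0.4, 0.6])
lam = adapt_marginals(samples, target)
```

Because the dual objective is concave in the tilting weights, plain gradient ascent converges, and at the optimum the reweighted marginals equal the targets on the sampled support.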
