

Poster

Incremental Sampling Without Replacement for Sequence Models

Kensen Shi · David Bieber · Charles Sutton

Virtual

Keywords: [ Sequential, Network, and Time-Series Modeling ] [ Structured Prediction ] [ Generative Models ] [ Combinatorial Optimization ]


Abstract:

Sampling is a fundamental technique, and sampling without replacement is often desirable when duplicate samples are not beneficial. Within machine learning, sampling is useful for generating diverse outputs from a trained model. We present an elegant procedure for sampling without replacement from a broad class of randomized programs, including generative neural models that construct outputs sequentially. Our procedure is efficient even for exponentially large output spaces. Unlike prior work, our approach is incremental, i.e., samples can be drawn one at a time, allowing for increased flexibility. We also present a new estimator for computing expectations from samples drawn without replacement. We show that incremental sampling without replacement is applicable to many domains, e.g., program synthesis and combinatorial optimization.
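To make the incremental behavior concrete, here is a minimal sketch of one way to sample sequences without replacement, one at a time: a trie records the unsampled probability mass beneath each prefix, each draw descends the trie proportionally to that mass, and the drawn sequence's mass is then subtracted along its path. This follows the high-level description in the abstract, but `TrieNode`, `sample_without_replacement`, and the toy `conditionals` model are illustrative stand-ins, not the authors' implementation or API.

```python
import random

class TrieNode:
    """Trie node storing the unsampled probability mass below each child token."""
    def __init__(self):
        self.children = {}  # token -> TrieNode
        self.mass = None    # token -> remaining unsampled mass (filled lazily)

def sample_without_replacement(root, conditionals):
    """Draw one sequence this trie has not produced before.

    Descends the trie choosing each token proportional to the unsampled
    probability mass beneath it, then subtracts the drawn sequence's mass
    along the path so the same sequence can never be drawn again.
    Returns the sequence and its probability under the model.
    """
    node, prefix, prefix_prob, path = root, (), 1.0, []
    while True:
        dist = conditionals(prefix)
        if dist is None:       # prefix is a complete sequence
            break
        if node.mass is None:
            # First visit: nothing below this prefix has been sampled yet,
            # so each child's unsampled mass is exactly P(prefix + token).
            node.mass = {t: prefix_prob * p for t, p in dist.items()}
        tokens = list(node.mass)
        token = random.choices(tokens, weights=[node.mass[t] for t in tokens])[0]
        path.append((node, token))
        prefix += (token,)
        prefix_prob *= dist[token]
        node = node.children.setdefault(token, TrieNode())
    # A complete sequence is drawn at most once, so its remaining mass on the
    # final edge is its full model probability; remove it along the whole path.
    last_node, last_token = path[-1]
    seq_prob = last_node.mass[last_token]
    for n, t in path:
        n.mass[t] -= seq_prob
    return prefix, seq_prob

# Toy "model": length-2 sequences over {"a", "b"} with fixed conditionals,
# standing in for a trained sequence model's next-token distribution.
def conditionals(prefix):
    return None if len(prefix) == 2 else {"a": 0.7, "b": 0.3}

root = TrieNode()
for _ in range(4):  # exactly the number of distinct sequences
    print(sample_without_replacement(root, conditionals))
# Each of ("a","a"), ("a","b"), ("b","a"), ("b","b") appears exactly once,
# and each draw follows the model distribution restricted to unseen sequences.
```

Because the trie persists between calls, samples really are drawn one at a time rather than as a fixed batch. The returned probabilities could feed an expectation estimator such as the one the abstract mentions, though the paper's specific estimator is not reproduced here.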
