Poster
Incremental Sampling Without Replacement for Sequence Models
Kensen Shi · David Bieber · Charles Sutton

Tue Jul 14 10:00 AM -- 10:45 AM & Tue Jul 14 09:00 PM -- 09:45 PM (PDT) @ Virtual

Sampling is a fundamental technique, and sampling without replacement is often desirable when duplicate samples are not beneficial. Within machine learning, sampling is useful for generating diverse outputs from a trained model. We present an elegant procedure for sampling without replacement from a broad class of randomized programs, including generative neural models that construct outputs sequentially. Our procedure is efficient even for exponentially large output spaces. Unlike prior work, our approach is incremental, i.e., samples can be drawn one at a time, allowing for increased flexibility. We also present a new estimator for computing expectations from samples drawn without replacement. We show that incremental sampling without replacement is applicable to many domains, e.g., program synthesis and combinatorial optimization.
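
To make the idea concrete, below is a minimal sketch of one way incremental sampling without replacement can work for a model that builds outputs token by token. It is an illustration of the general idea, not the authors' exact algorithm: the names incremental_swor, TrieNode, and the step_probs callback (a hypothetical stand-in for a trained sequence model, returning next-token probabilities for a prefix, or an empty dict when the prefix is complete) are all assumptions for this sketch. The scheme records the probability mass of previously drawn sequences in a trie and subtracts it before each new draw, so samples can be requested one at a time and never repeat.

import random

class TrieNode:
    """Trie node tracking the absolute probability mass of sequences
    already sampled in the subtree rooted here."""
    __slots__ = ("sampled", "children")

    def __init__(self):
        self.sampled = 0.0
        self.children = {}

def incremental_swor(step_probs, num_samples):
    """Draws up to num_samples distinct sequences, one at a time.

    step_probs(prefix) is a hypothetical stand-in for a sequence model:
    it returns {token: conditional probability} for the next step, or {}
    when prefix is a complete sequence. Mass of previously drawn
    sequences is stored in a trie and subtracted before each draw, so
    no sequence can be drawn twice."""
    root = TrieNode()
    samples = []
    for _ in range(num_samples):
        node, prefix, prefix_p = root, [], 1.0
        path = [root]
        while True:
            probs = step_probs(tuple(prefix))
            if not probs:  # prefix is a complete sequence
                break
            # Unsampled mass remaining in each branch of the trie.
            weights = {}
            for tok, p in probs.items():
                child = node.children.get(tok)
                used = child.sampled if child is not None else 0.0
                weights[tok] = max(prefix_p * p - used, 0.0)
            total = sum(weights.values())
            if total <= 0.0:  # every sequence under this prefix is exhausted
                return samples
            r = random.random() * total
            for tok, w in weights.items():
                r -= w
                if r <= 0.0:
                    break
            prefix.append(tok)
            prefix_p *= probs[tok]
            node = node.children.setdefault(tok, TrieNode())
            path.append(node)
        for n in path:  # mark this sequence's mass as consumed
            n.sampled += prefix_p
        samples.append((tuple(prefix), prefix_p))
    return samples

# Toy model: sequences of two coin flips, biased toward 1.
def step_probs(prefix):
    return {} if len(prefix) == 2 else {0: 0.3, 1: 0.7}

print(incremental_swor(step_probs, 4))  # yields all four distinct sequences

Because each draw only renormalizes over the mass not yet consumed, each remaining sequence is selected with probability proportional to its model probability, and the caller can stop after any number of samples, which is the incremental property the abstract describes.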

Author Information

Kensen Shi (Google)
David Bieber (Google Research)
Charles Sutton (Google)
