Deep Adaptive Design: Amortizing Sequential Bayesian Experimental Design
Adam Foster · Desi Ivanova · Ilyas Malik · Tom Rainforth

Thu Jul 22 06:00 AM -- 06:20 AM (PDT)

We introduce Deep Adaptive Design (DAD), a method for amortizing the cost of adaptive Bayesian experimental design that allows experiments to be run in real-time. Traditional sequential Bayesian optimal experimental design approaches require substantial computation at each stage of the experiment. This makes them unsuitable for most real-world applications, where decisions must typically be made quickly. DAD addresses this restriction by learning an amortized design network upfront and then using this to rapidly run (multiple) adaptive experiments at deployment time. This network represents a design policy which takes as input the data from previous steps, and outputs the next design using a single forward pass; these design decisions can be made in milliseconds during the live experiment. To train the network, we introduce contrastive information bounds that are suitable objectives for the sequential setting, and propose a customized network architecture that exploits key symmetries. We demonstrate that DAD successfully amortizes the process of experimental design, outperforming alternative strategies on a number of problems.
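The abstract describes a policy network that maps the history of (design, outcome) pairs to the next design in a single forward pass, using an architecture that exploits key symmetries such as permutation invariance of the history. The following is a toy sketch of that idea, not the authors' implementation: it uses fixed random weights in place of a trained network, and the layer sizes and sum-pooling encoder are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions, not from the paper).
D_DESIGN, D_OUT, D_HID = 2, 1, 16

# Random fixed weights stand in for a trained design network.
W_enc = rng.normal(size=(D_DESIGN + D_OUT, D_HID))
W_head = rng.normal(size=(D_HID, D_DESIGN))

def policy(history):
    """Map a list of (design, outcome) pairs to the next design.

    Each past experiment is encoded independently, then the encodings
    are sum-pooled, making the policy invariant to the order of the
    history -- one of the symmetries a DAD-style architecture exploits.
    """
    if not history:
        pooled = np.zeros(D_HID)  # representation of the empty history
    else:
        pairs = np.stack([np.concatenate([d, y]) for d, y in history])
        pooled = np.tanh(pairs @ W_enc).sum(axis=0)  # permutation-invariant pooling
    return np.tanh(pooled @ W_head)  # next design, one cheap forward pass

# Three past experiments with random designs and outcomes.
h = [(rng.normal(size=D_DESIGN), rng.normal(size=D_OUT)) for _ in range(3)]
xi = policy(h)

# Reordering the history leaves the proposed design unchanged.
assert np.allclose(xi, policy(h[::-1]))
```

Because design selection reduces to a forward pass through this network, decisions at deployment time take milliseconds, whereas traditional sequential methods would re-run posterior inference and design optimization at every step.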

Author Information

Adam Foster (University of Oxford)
Desi Ivanova (University of Oxford)
Ilyas Malik
Tom Rainforth (University of Oxford)
