Tutorial
Active Hypothesis Testing: An Information Theoretic (re)View
Tara Javidi

Mon Jun 10th 03:45 -- 06:00 PM @ Hall B

This tutorial revisits the problem of active hypothesis testing: a classical problem in statistics in which a decision maker is responsible for actively and dynamically collecting data/samples so as to enhance the information about an underlying phenomenon of interest while accounting for the cost of communication, sensing, or data collection. The decision maker must rely on the current information state to constantly (re-)evaluate the trade-off between the precision and the cost of various actions. This tutorial explores an often overlooked connection between active hypothesis testing and feedback information theory. This connection, we argue, has significant implications for the next generation of information acquisition and machine learning algorithms, where data is collected actively and/or by cooperative yet local agents.
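The sequential decision loop described above can be sketched in a few lines: maintain a Bayesian belief over the hypotheses, choose the next sensing action based on the current belief, and stop once the belief concentrates. The discrete observation model, the greedy total-variation action rule, and the stopping threshold below are illustrative assumptions for this sketch, not the information-theoretic schemes covered in the tutorial.

```python
def posterior_update(belief, action, obs, likelihood):
    """Bayes update of the belief over hypotheses after taking `action` and
    observing `obs`. Here likelihood[h][action][obs] is an assumed discrete
    observation model P(obs | hypothesis h, action)."""
    post = [b * likelihood[h][action][obs] for h, b in enumerate(belief)]
    z = sum(post)
    return [p / z for p in post]

def active_test(belief, likelihood, sample_obs, threshold=0.99, max_steps=1000):
    """Sequential test: at each step pick the action whose observation
    distributions best separate the two currently most likely hypotheses
    (a simple stand-in for an information-theoretic selection rule), update
    the belief with the resulting observation, and stop once one hypothesis
    dominates. `sample_obs(action)` draws an observation from the world."""
    for _ in range(max_steps):
        if max(belief) >= threshold:
            break
        # Rank hypotheses by current belief; focus on the two leaders.
        top, second = sorted(range(len(belief)), key=lambda h: -belief[h])[:2]
        # Greedy choice: maximize total variation between the leaders' models.
        n_actions = len(likelihood[0])
        action = max(range(n_actions),
                     key=lambda a: sum(abs(p - q) for p, q in
                                       zip(likelihood[top][a], likelihood[second][a])))
        obs = sample_obs(action)
        belief = posterior_update(belief, action, obs, likelihood)
    return belief
```

With two hypotheses and one informative action, the belief typically concentrates on the truth within a handful of samples, illustrating the precision/cost trade-off: each extra sample buys a multiplicative gain in confidence.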

In the first part of the talk, we discuss the history of active hypothesis testing (and experiment design) in statistics and the seminal contributions by Blackwell, Chernoff, DeGroot, and Stein. In the second part of the talk, we discuss the information-theoretic notions of acquisition rate and reliability (and their fundamental trade-off), as well as the Extrinsic Jensen-Shannon divergence. We also discuss a class of algorithms based on posterior matching, a capacity-achieving feedback scheme for channel coding. We will illustrate the utility of these information-theoretic notions, analyses, insights, and algorithms on a number of practically relevant problems, such as measurement-dependent noisy search and decentralized Bayesian federated learning.
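As a concrete illustration of one notion mentioned above, the following sketch computes the Extrinsic Jensen-Shannon divergence of a belief over finitely many hypotheses with discrete observation distributions: each hypothesis's distribution is compared, via KL divergence, against the reweighted mixture of the remaining hypotheses' distributions (hence "extrinsic"). The finite-alphabet setting and the skipping of degenerate beliefs are simplifying assumptions of this sketch.

```python
import math

def kl(p, q):
    """Discrete KL divergence D(p || q) in nats; assumes q[y] > 0 wherever p[y] > 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def ejs(prior, likelihoods):
    """Extrinsic Jensen-Shannon divergence of belief `prior` over hypotheses,
    where likelihoods[i][y] = P(observation y | hypothesis i). Unlike the
    ordinary Jensen-Shannon divergence, hypothesis i is excluded from the
    mixture it is compared against, and the mixture is renormalized."""
    n = len(prior)
    total = 0.0
    for i in range(n):
        if prior[i] in (0.0, 1.0):
            continue  # degenerate belief: skip for simplicity in this sketch
        mix = [sum(prior[j] * likelihoods[j][y] for j in range(n) if j != i)
               / (1 - prior[i])
               for y in range(len(likelihoods[i]))]
        total += prior[i] * kl(likelihoods[i], mix)
    return total
```

For two equally likely hypotheses, the "extrinsic" mixture for each hypothesis is simply the other hypothesis's distribution, so the EJS reduces to the average of the two pairwise KL divergences; identical observation distributions yield an EJS of zero, reflecting that no experiment outcome can discriminate between them.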

Author Information

Tara Javidi (University of California San Diego)
