## Adaptive Data Analysis with Correlated Observations

### Aryeh Kontorovich · Menachem Sadigurschi · Uri Stemmer

##### Hall E #1003

Keywords: [ SA: Privacy-preserving Statistics and Machine Learning ] [ SA: Trustworthy Machine Learning ] [ T: Learning Theory ]

Wed 20 Jul 3:30 p.m. PDT — 5:30 p.m. PDT

Spotlight presentation: T: Learning Theory/Domain Adaptation
Wed 20 Jul 7:30 a.m. PDT — 9 a.m. PDT

Abstract:

The vast majority of the work on adaptive data analysis focuses on the case where the samples in the dataset are independent. Several approaches and tools have been successfully applied in this context, such as *differential privacy*, *max-information*, *compression arguments*, and more. The situation is far less well-understood without the independence assumption. We embark on a systematic study of the possibilities of adaptive data analysis with correlated observations. First, we show that, in some cases, differential privacy guarantees generalization even when there are dependencies within the sample, which we quantify using a notion we call *Gibbs-dependence*. We complement this result with a tight negative example. Second, we show that the connection between transcript-compression and adaptive data analysis can be extended to the non-i.i.d. setting.
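To illustrate the phenomenon the abstract builds on, the sketch below is a standard (i.i.d.) demonstration, not taken from the paper: an analyst first asks for the empirical mean of many coordinates, then adaptively asks for a sign-weighted combination of them. Reusing the exact empirical answers overfits (the true population value of the final query is 0), while answering the base queries with Laplace noise, as in differential privacy, largely removes the bias. All parameter values (`n`, `d`, the noise scale `b`) are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

n, d = 100, 1000  # sample size and number of base queries (illustrative values)
X = rng.choice([-1.0, 1.0], size=(n, d))  # i.i.d. Rademacher data, population mean 0

coord_means = X.mean(axis=0)  # exact empirical answers to the d base queries

def final_query_value(signs):
    """Empirical mean of the adaptive query x -> (1/sqrt(d)) * sum_j signs[j] * x[j].

    Its population value is exactly 0, so any large output is pure overfitting."""
    return float((X @ signs).mean() / np.sqrt(d))

# 1) Naive reuse: pick the signs from the exact empirical answers -> overfits.
naive = final_query_value(np.sign(coord_means))

# 2) Private reuse: answer each base query with Laplace noise before choosing
#    the signs (a mean over [-1, 1] has per-query sensitivity 2/n).
b = 1.0  # illustrative noise scale
noisy_means = coord_means + rng.laplace(scale=b, size=d)
private = final_query_value(np.sign(noisy_means))

print(f"population truth: 0.0, naive: {naive:.2f}, private: {private:.2f}")
```

The naive answer concentrates around sqrt(2d / (pi * n)) (about 2.5 here) even though the truth is 0, while the noisy version stays close to the truth. The paper's question is when such differential-privacy-based guarantees survive once the rows of `X` are no longer independent.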
