Oral
Adversarially Learned Representations for Information Obfuscation and Inference
Martin A Bertran · Natalia Martinez · Afroditi Papadaki · Qiang Qiu · Miguel Rodrigues · Galen Reeves · Guillermo Sapiro

Wed Jun 12th 04:00 -- 04:20 PM @ Grand Ballroom

Data collection and sharing are pervasive aspects of modern society. This process can either be voluntary, as in the case of a person taking a facial image to unlock their phone, or incidental, such as traffic cameras collecting videos of pedestrians. An undesirable side effect of these processes is that shared data can carry information about attributes that users might consider sensitive, even when such information is of limited use for the task. It is therefore desirable for both data collectors and users to design procedures that minimize sensitive information leakage. Balancing the competing objectives of providing meaningful, individualized service and inference while obfuscating sensitive information is still an open problem. In this work, we take an information-theoretic approach that is implemented as an unconstrained adversarial game between deep neural networks in a principled, data-driven manner. This approach enables us to learn domain-preserving stochastic transformations that maintain performance on existing algorithms while minimizing sensitive information leakage.
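
The abstract describes an unconstrained adversarial game between deep neural networks that learns a domain-preserving stochastic transformation. Below is a minimal sketch of one way such a game can be set up, assuming a PyTorch-style training loop; the module names (Obfuscator, Classifier), architectures, optimizers, and the trade-off weight lam are illustrative assumptions, not the authors' actual implementation.

```python
# Minimal sketch of an adversarial obfuscation game (assumed PyTorch setup;
# all architectures and hyperparameters here are illustrative).
import torch
import torch.nn as nn
import torch.nn.functional as F

class Obfuscator(nn.Module):
    """Stochastic, domain-preserving transform: image in, image out."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 3, 3, padding=1),
        )
    def forward(self, x):
        noise = torch.randn_like(x) * 0.1   # source of stochasticity
        return torch.sigmoid(self.net(x + noise))  # stays in image domain

class Classifier(nn.Module):
    """Small CNN used both for the utility task and as the adversary."""
    def __init__(self, n_classes):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, n_classes),
        )
    def forward(self, x):
        return self.net(x)

obf = Obfuscator()
utility = Classifier(n_classes=10)   # desired task (hypothetical label set)
adversary = Classifier(n_classes=2)  # sensitive attribute (hypothetical)

opt_obf = torch.optim.Adam(list(obf.parameters()) + list(utility.parameters()), lr=1e-4)
opt_adv = torch.optim.Adam(adversary.parameters(), lr=1e-4)
lam = 1.0  # illustrative trade-off between task performance and obfuscation

def train_step(x, y_task, y_sensitive):
    # 1) Adversary step: best-respond to the current obfuscator.
    with torch.no_grad():
        x_obf = obf(x)
    adv_loss = F.cross_entropy(adversary(x_obf), y_sensitive)
    opt_adv.zero_grad(); adv_loss.backward(); opt_adv.step()

    # 2) Obfuscator/utility step: keep task accuracy, reduce leakage
    #    by pushing the adversary's loss up.
    x_obf = obf(x)
    task_loss = F.cross_entropy(utility(x_obf), y_task)
    leak_loss = F.cross_entropy(adversary(x_obf), y_sensitive)
    loss = task_loss - lam * leak_loss
    opt_obf.zero_grad(); loss.backward(); opt_obf.step()
    return task_loss.item(), leak_loss.item()

# Example usage with random data (batch of 8 RGB 32x32 images).
x = torch.rand(8, 3, 32, 32)
y_task = torch.randint(0, 10, (8,))
y_sensitive = torch.randint(0, 2, (8,))
print(train_step(x, y_task, y_sensitive))
```

Because the obfuscator's output lives in the original image domain, existing downstream models can consume it unchanged, which is the sense in which the transformation is "domain-preserving" in the abstract.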

Author Information

Martin A Bertran (Duke University)
Natalia Martinez (Duke University)
Afroditi Papadaki (University College London)
Qiang Qiu (Duke University)
Miguel Rodrigues (University College London)
Galen Reeves (Duke University)
Guillermo Sapiro (Duke University)
