Poster
Augment with Care: Contrastive Learning for Combinatorial Problems
Haonan Duan · Pashootan Vaezipoor · Max Paulus · Yangjun Ruan · Chris Maddison

Thu Jul 21 03:00 PM -- 05:00 PM (PDT) @ Hall E #403

Supervised learning can improve the design of state-of-the-art solvers for combinatorial problems, but labelling large numbers of combinatorial instances is often impractical due to exponential worst-case complexity. Inspired by the recent success of contrastive pre-training for images, we conduct a scientific study of the effect of augmentation design on contrastive pre-training for the Boolean satisfiability problem. While typical graph contrastive pre-training uses label-agnostic augmentations, our key insight is that many combinatorial problems have well-studied invariances, which allow for the design of label-preserving augmentations. We find that label-preserving augmentations are critical for the success of contrastive pre-training. We show that our representations achieve test accuracy comparable to fully-supervised learning while using only 1% of the labels. We also demonstrate that our representations are more transferable to larger problems from unseen domains. Our code is available at https://github.com/h4duan/contrastive-sat.
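To make the notion of a label-preserving augmentation concrete, the sketch below shows three well-known satisfiability-preserving transforms of a CNF formula (clause permutation, variable renaming, and polarity flipping). This is an illustrative assumption about what such augmentations can look like, not the paper's released code or its exact augmentation set; all function names and the DIMACS-style list-of-clauses encoding are hypothetical choices for this example.

```python
import random

# A CNF formula in DIMACS-style encoding: a list of clauses, each clause a list
# of non-zero ints, where k means variable x_k and -k means its negation.
Formula = list[list[int]]


def permute_clauses(formula: Formula, rng: random.Random) -> Formula:
    """Shuffle clause order; conjunction is order-independent, so SAT/UNSAT is preserved."""
    clauses = [list(c) for c in formula]
    rng.shuffle(clauses)
    return clauses


def rename_variables(formula: Formula, rng: random.Random) -> Formula:
    """Apply a random permutation to variable names; satisfiability is invariant."""
    variables = sorted({abs(lit) for clause in formula for lit in clause})
    shuffled = variables[:]
    rng.shuffle(shuffled)
    mapping = dict(zip(variables, shuffled))
    return [[mapping[abs(l)] if l > 0 else -mapping[abs(l)] for l in clause]
            for clause in formula]


def flip_polarity(formula: Formula, rng: random.Random, p: float = 0.2) -> Formula:
    """Negate every occurrence of randomly chosen variables; any satisfying
    assignment of the original maps to one of the new formula by flipping them."""
    variables = {abs(lit) for clause in formula for lit in clause}
    flipped = {v for v in variables if rng.random() < p}
    return [[-l if abs(l) in flipped else l for l in clause] for clause in formula]


if __name__ == "__main__":
    rng = random.Random(0)
    # (x1 or not x2) and (x2 or x3) and (not x1 or not x3)
    phi = [[1, -2], [2, 3], [-1, -3]]
    view_a = permute_clauses(rename_variables(phi, rng), rng)
    view_b = flip_polarity(phi, rng)
    print(view_a, view_b)  # two label-preserving "views" of the same instance
```

In a contrastive setup, two such views of the same instance would be treated as a positive pair; because each transform preserves satisfiability, the pair is guaranteed to share the same label, unlike generic label-agnostic graph augmentations such as edge dropping.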

Author Information

Haonan Duan (University of Toronto)
Pashootan Vaezipoor (University of Toronto and Vector Institute)
Max Paulus (ETH Zurich)
Yangjun Ruan (University of Toronto)
Chris Maddison (University of Toronto)
