The optimization of zero-sum games, multi-objective agent training, and, more generally, variational inequality (VI) problems is notoriously unstable on general problem instances. As the need to train such models in machine learning has grown, this observation has attracted significant research attention in recent years. Substantial progress has been made toward understanding how VI optimization differs qualitatively from single-objective minimization by casting optimization methods in terms of their corresponding continuous-time dynamics, and toward obtaining convergence guarantees and rates for some classes of VIs, since such continuous-time guarantees often guide the corresponding proofs for the discrete-time counterparts. Most notably, continuous-time tools have enabled the analysis of complex non-convex problems, which in some cases cannot be carried out with standard discrete-time tools. This paper provides an overview of these ideas for the broad VI problem class and of the insights gained by applying continuous-time tools to VI problems. We conclude by describing fundamental open questions on the path toward optimization methods that work for general VIs, and we argue that tackling them requires an understanding of the associated continuous-time dynamics.
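As a concrete illustration of the instability mentioned above (a standard textbook example, not taken from this work): on the bilinear zero-sum game min_x max_y xy, the continuous-time flow of the joint vector field orbits the equilibrium (0, 0), simultaneous gradient descent-ascent (its explicit Euler discretization) spirals outward, and the extragradient method spirals inward. The sketch below, in plain NumPy with illustrative step size and iteration count, reproduces this behavior.

```python
# Illustrative sketch: simultaneous gradient descent-ascent (GDA) versus
# extragradient on the bilinear zero-sum game min_x max_y x*y.
# The joint vector field is F(x, y) = (y, -x); its continuous-time flow
# conserves the distance to the equilibrium (0, 0), GDA increases it,
# and extragradient decreases it.
import numpy as np

def F(z):
    """Joint vector field of min_x max_y x*y, i.e. (grad_x f, -grad_y f)."""
    x, y = z
    return np.array([y, -x])

def gda_step(z, eta=0.1):
    """Simultaneous GDA: explicit Euler step on dz/dt = -F(z)."""
    return z - eta * F(z)

def extragradient_step(z, eta=0.1):
    """Extragradient: evaluate F at an extrapolated (look-ahead) point."""
    z_half = z - eta * F(z)
    return z - eta * F(z_half)

z_gda = z_eg = np.array([1.0, 1.0])
for _ in range(500):
    z_gda = gda_step(z_gda)
    z_eg = extragradient_step(z_eg)

# GDA grows by a factor sqrt(1 + eta^2) per step; extragradient shrinks.
print("GDA distance to equilibrium:          ", np.linalg.norm(z_gda))
print("Extragradient distance to equilibrium:", np.linalg.norm(z_eg))
```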
Author Information
Tatjana Chavdarova (UC Berkeley)
Ya-Ping Hsieh (ETH)
More from the Same Authors
- 2022 : Recovering Stochastic Dynamics via Gaussian Schrödinger Bridges
  Ya-Ping Hsieh · Charlotte Bunne · Marco Cuturi · Andreas Krause
- 2022 : Recovering Stochastic Dynamics via Gaussian Schrödinger Bridges
  Charlotte Bunne · Ya-Ping Hsieh · Marco Cuturi · Andreas Krause
- 2023 Affinity Workshop: Women in Machine Learning (WiML) Un-Workshop for ICML 2023
  Mandana Samiei · Tatjana Chavdarova
- 2022 Affinity Workshop: Women in Machine Learning (WiML) Un-Workshop
  Vinitra Swamy · Paula Gradu · Mojgan Saeidi · Noor Sajid · Shweta Khushu · Giulia Clerici · Tatjana Chavdarova
- 2021 Poster: The Limits of Min-Max Optimization Algorithms: Convergence to Spurious Non-Critical Sets
  Ya-Ping Hsieh · Panayotis Mertikopoulos · Volkan Cevher
- 2021 Affinity Workshop: Women in Machine Learning (WiML) Un-Workshop
  Wenshuo Guo · Beliz Gokkaya · Arushi G K Majha · Vaidheeswaran Archana · Berivan Isik · Olivia Choudhury · Liyue Shen · Hadia Samil · Tatjana Chavdarova
- 2021 Oral: The Limits of Min-Max Optimization Algorithms: Convergence to Spurious Non-Critical Sets
  Ya-Ping Hsieh · Panayotis Mertikopoulos · Volkan Cevher
- 2020 : Introduction and Opening Remarks
  Tatjana Chavdarova
- 2020 Affinity Workshop: Women in Machine Learning Un-Workshop
  Tatjana Chavdarova · Caroline Weis · Amy Zhang · Fariba Yousefi · Mandana Samiei · Larissa Schiavo