Poster
Towards Constituting Mathematical Structures for Learning to Optimize
Jialin Liu · Xiaohan Chen · Zhangyang “Atlas” Wang · Wotao Yin · HanQin Cai

Tue Jul 25 05:00 PM -- 06:30 PM (PDT) @ Exhibit Hall 1 #640

Learning to Optimize (L2O), a technique that uses machine learning to learn an optimization algorithm automatically from data, has gained increasing attention in recent years. A generic L2O approach parameterizes the iterative update rule and learns the update direction as a black-box network. While the generic approach is widely applicable, the learned model can overfit and may not generalize well to out-of-distribution test sets. In this paper, we derive the basic mathematical conditions that successful update rules commonly satisfy. Consequently, we propose a novel L2O model with a mathematics-inspired structure that is broadly applicable and generalizes well to out-of-distribution problems. Numerical simulations validate our theoretical findings and demonstrate the superior empirical performance of the proposed L2O model.
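For readers unfamiliar with the generic L2O setup mentioned in the abstract, the sketch below illustrates the idea of parameterizing the update direction with a small network and meta-training it over an unrolled trajectory. This is an illustration only, not the authors' mathematics-inspired model; the names (UpdateNet, l2o_rollout, K) and the least-squares training instances are assumptions made for the example.

```python
# Minimal sketch of a generic (black-box) L2O update rule, not the paper's method.
# A small network maps each gradient coordinate to an update direction,
# unrolled for K steps on f(x) = 0.5 * ||A x - b||^2.
import torch
import torch.nn as nn

class UpdateNet(nn.Module):
    """Black-box update rule: d_k = net(grad_k), applied coordinate-wise."""
    def __init__(self, hidden=16):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(1, hidden), nn.ReLU(), nn.Linear(hidden, 1))

    def forward(self, grad):
        # Treat each coordinate of the gradient independently.
        return self.net(grad.unsqueeze(-1)).squeeze(-1)

def l2o_rollout(update_net, A, b, K=20):
    """Unroll K learned updates and return a meta-loss over the trajectory."""
    x = torch.zeros(A.shape[1], requires_grad=True)
    losses = []
    for _ in range(K):
        f = 0.5 * ((A @ x - b) ** 2).sum()
        (grad,) = torch.autograd.grad(f, x, create_graph=True)
        x = x - update_net(grad)              # learned update direction
        losses.append(f)
    return torch.stack(losses).mean()

# Meta-training: optimize the update rule's parameters over sampled problems.
update_net = UpdateNet()
meta_opt = torch.optim.Adam(update_net.parameters(), lr=1e-3)
for _ in range(100):
    A, b = torch.randn(8, 5), torch.randn(8)  # a random training instance
    meta_loss = l2o_rollout(update_net, A, b)
    meta_opt.zero_grad()
    meta_loss.backward()
    meta_opt.step()
```

Because the update rule here is an unconstrained black box, it can overfit the training distribution, which is the generalization issue the paper addresses by imposing mathematical structure on the learned update.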

Author Information

Jialin Liu (Alibaba Group US)
Xiaohan Chen (The University of Texas at Austin)
Zhangyang “Atlas” Wang (University of Texas at Austin)
Wotao Yin (Alibaba US)
HanQin Cai (University of Central Florida)
