Hom-PGD+: Fast Reparameterized Optimization over Non-convex Ball-Homeomorphic Sets
Chenghao Liu ⋅ Enming Liang ⋅ Minghua Chen
Abstract
We study optimization over non-convex constraint sets that are homeomorphic to a ball, encompassing important problem classes such as star-shaped sets that frequently arise in machine learning and engineering applications. We propose Hom-PGD$^+$, a learning-based and projection-efficient first-order method that solves such problems without requiring expensive projection or optimization oracles. Our approach leverages an invertible neural network (INN) to learn the homeomorphism between the non-convex constraint set and a unit ball, transforming the original problem into an equivalent ball-constrained optimization where projections admit closed-form solutions. We establish that Hom-PGD$^+$ achieves an $\mathcal{O}(\epsilon^{-2})$ convergence rate to an ($\epsilon + \mathcal{O}(\sqrt{\epsilon_{\rm inn}})$)-approximate stationary solution, where $\epsilon_{\rm inn}$ denotes the homeomorphism learning error. This rate significantly improves upon existing methods for optimization over non-convex sets, while maintaining a per-iteration complexity of only $\mathcal{O}(W)$ for $W$ INN parameters. Experiments on chance-constrained optimization problems in power systems demonstrate that Hom-PGD$^+$ achieves convergence rates comparable to state-of-the-art methods while delivering speedups of up to one order of magnitude.
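The core idea in the abstract — reparameterize the non-convex feasible set through an invertible map so that projections reduce to the closed-form projection onto a unit ball — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the toy invertible affine map `phi` stands in for a pre-trained INN, and the objective `f` is an arbitrary smooth example.

```python
import numpy as np

# Stand-in for a learned INN: a toy invertible affine map phi(z) = A z + b
# from the latent unit ball to the (here, hypothetical) constraint set.
A = np.array([[2.0, 0.5], [0.0, 1.5]])
b = np.array([1.0, -0.5])

def phi(z):
    """Toy homeomorphism: latent ball -> constraint set (INN stand-in)."""
    return A @ z + b

def proj_ball(z, radius=1.0):
    """Closed-form Euclidean projection onto the ball of the given radius."""
    norm = np.linalg.norm(z)
    return z if norm <= radius else z * (radius / norm)

# Example smooth objective on the original variable x and its gradient.
target = np.array([3.0, 2.0])
def f(x):
    return np.sum((x - target) ** 2)

def grad_f(x):
    return 2.0 * (x - target)

# Projected gradient descent in the latent (ball-constrained) space:
#   z_{k+1} = Proj_B( z_k - eta * J_phi(z_k)^T grad f(phi(z_k)) )
z = np.zeros(2)
eta = 0.05
for _ in range(200):
    g = A.T @ grad_f(phi(z))      # chain rule through the affine phi
    z = proj_ball(z - eta * g)

x_final = phi(z)                  # feasible point in the original set
```

Because the projection onto the ball is closed-form, each iteration costs only gradient evaluations plus one pass through the invertible map, which is the source of the $\mathcal{O}(W)$ per-iteration complexity claimed for $W$ INN parameters.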