Certificate-Guided Pruning for Stochastic Lipschitz Optimization
Ibne Farabi Shihab ⋅ Sanjeda Akter ⋅ Anuj Sharma
Abstract
We study black-box optimization of Lipschitz functions under noisy evaluations. Existing adaptive discretization methods implicitly avoid suboptimal regions but provide neither explicit certificates of optimality nor measurable progress guarantees. We introduce **Certificate-Guided Pruning (CGP)**, which maintains an explicit *active set* $A_t$ of potentially optimal points via confidence-adjusted Lipschitz envelopes. Any point outside $A_t$ is certifiably suboptimal with high probability, and under a margin condition with near-optimality dimension $\alpha$, we prove that $\mathrm{Vol}(A_t)$ shrinks at a controlled rate, yielding sample complexity $\tilde{O}(\varepsilon^{-(2+\alpha)})$. We develop three extensions: CGP-Adaptive learns $L$ online with $O(\log T)$ overhead; CGP-TR scales to $d > 50$ via trust regions with local certificates; and CGP-Hybrid switches to GP refinement when local smoothness is detected. Experiments on 12 benchmarks ($d \in [2, 100]$) show that CGP variants match or exceed strong baselines while providing principled stopping criteria via certificate volume.
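The pruning mechanism sketched in the abstract can be illustrated in a few lines. The following is a minimal 1-D toy sketch, not the paper's implementation: the objective `f`, noise level `sigma`, batch size `m`, and confidence radius `beta` are all illustrative assumptions. Each round evaluates one active point with averaged noisy samples, builds a confidence-adjusted Lipschitz upper envelope over the active set, and prunes every point whose envelope value falls below the certified lower bound on the optimum.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D instance: maximize a Lipschitz function on [0, 1]
# under Gaussian evaluation noise (all constants are illustrative).
f = lambda x: -abs(x - 0.3)      # true objective, Lipschitz with L = 1
L, sigma, m, T = 1.0, 0.05, 20, 40

A = np.linspace(0.0, 1.0, 401)   # active set A_t, initialized to a grid
xs, mus = [], []                 # evaluated points and mean estimates
# Per-point confidence radius from m averaged noisy evaluations.
beta = sigma * np.sqrt(2 * np.log(4 * T) / m)

for t in range(T):
    # Evaluate one point of the current active set m times and average.
    x = rng.choice(A)
    mu = f(x) + sigma * rng.normal(size=m).mean()
    xs.append(x)
    mus.append(mu)

    xs_a, mus_a = np.array(xs), np.array(mus)
    # Confidence-adjusted Lipschitz upper envelope at every active point:
    # U(z) = min_i ( mu_i + beta + L * |z - x_i| ).
    U = np.min(mus_a[None, :] + beta
               + L * np.abs(A[:, None] - xs_a[None, :]), axis=1)
    # Certified high-probability lower bound on the optimum value.
    low = np.max(mus_a - beta)
    # Certificate-guided pruning: points with U(z) < low are certifiably
    # suboptimal with high probability and leave the active set.
    A = A[U >= low]

# The surviving fraction of the grid plays the role of certificate volume.
print(f"active-set fraction: {len(A) / 401:.3f}")
```

In this sketch the certificate volume (the surviving fraction of the initial grid) shrinks as evaluations accumulate while the true maximizer at $x = 0.3$ remains in the active set, mirroring the stopping criterion described in the abstract.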