In Generalized Linear Estimation (GLE) problems, we seek to estimate a signal that is observed through a linear transform followed by a component-wise, possibly nonlinear and noisy, channel. In the Bayesian optimal setting, Generalized Approximate Message Passing (GAMP) is known to achieve optimal performance for GLE. However, its performance can significantly deteriorate whenever there is a mismatch between the assumed and the true generative model, a situation frequently encountered in practice. In this paper, we propose a new algorithm, named Generalized Approximate Survey Propagation (GASP), for solving GLE in the presence of prior or model misspecifications. As a prototypical example, we consider the phase retrieval problem, where we show that GASP outperforms the corresponding GAMP, reducing the reconstruction threshold and, for certain choices of its parameters, approaching Bayesian optimal performance. Furthermore, we present a set of state evolution equations that can precisely characterize the performance of GASP in the high-dimensional limit.
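The phase retrieval setting mentioned above fits the GLE template: a linear transform of the signal followed by a component-wise nonlinearity. A minimal sketch of the (noiseless) measurement model, with illustrative dimensions and an i.i.d. Gaussian sensing matrix assumed for concreteness:

```python
import numpy as np

# Hedged sketch of the phase retrieval measurement model: the signal x is
# passed through a linear transform A, then through the component-wise
# modulus channel, discarding sign/phase information. All names and sizes
# here are illustrative, not taken from the paper.
rng = np.random.default_rng(0)
n, m = 200, 600                               # signal dimension, number of measurements
x = rng.standard_normal(n)                    # unknown signal to be recovered
A = rng.standard_normal((m, n)) / np.sqrt(n)  # i.i.d. Gaussian sensing matrix
y = np.abs(A @ x)                             # phaseless measurements y_i = |(Ax)_i|
```

An estimation algorithm such as GAMP or GASP would take only `A` and `y` as input and attempt to reconstruct `x` up to a global sign.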
Author Information
Carlo Lucibello (Bocconi University)
Luca Saglietti (Microsoft Research)
Yue Lu (Harvard University, USA)
Related Events (a corresponding poster, oral, or spotlight)
- 2019 Poster: Generalized Approximate Survey Propagation for High-Dimensional Estimation
  Wed. Jun 12th 01:30 -- 04:00 AM, Pacific Ballroom #160
More from the Same Authors
- 2021 Poster: On the Inherent Regularization Effects of Noise Injection During Training
  Oussama Dhifallah · Yue Lu
- 2021 Spotlight: On the Inherent Regularization Effects of Noise Injection During Training
  Oussama Dhifallah · Yue Lu
- 2020 Poster: The Role of Regularization in Classification of High-dimensional Noisy Gaussian Mixture
  Francesca Mignacco · Florent Krzakala · Yue Lu · Pierfrancesco Urbani · Lenka Zdeborova