Quantized Maximum Likelihood Estimation under Normal Mean-Variance Mixture Model
Abstract
Estimating statistical parameters from quantized data has received significant attention in recent years, owing to its numerous applications in signal processing, communications, and data analysis. In this work, we focus on maximum likelihood (ML) estimation of statistical parameters from quantized samples. Directly solving the ML problem is challenging because the likelihood function involves multiple integrals that are difficult to evaluate. To address this challenge, we propose an expectation-conditional-maximization (ECM) algorithm under a general distributional framework: the quantization model covers multi-bit settings, and the underlying signal may follow any distribution in the normal mean-variance mixture family. By designing suitable surrogate functions, the ECM algorithm updates all model parameters in closed form at each iteration. Leveraging the ECM framework, we establish convergence guarantees and, under specific distributional assumptions, further derive bounds on the convergence rate and the statistical error. Extensive experiments demonstrate the effectiveness of our method in recovering statistical parameters from quantized data.
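For context, the normal mean-variance mixture family referenced above admits a standard stochastic representation, sketched below; the notation ($\boldsymbol{\mu}$, $\boldsymbol{\gamma}$, $\boldsymbol{\Sigma}$, $w$) is illustrative and may differ from the paper's own.

```latex
% Normal mean-variance mixture: a signal y in R^d with
%   y = mu + w * gamma + sqrt(w) * Sigma^{1/2} z,
% where z ~ N(0, I_d) and w >= 0 is a scalar mixing variable
% independent of z.
\[
  \mathbf{y} \;=\; \boldsymbol{\mu} \;+\; w\,\boldsymbol{\gamma}
  \;+\; \sqrt{w}\,\boldsymbol{\Sigma}^{1/2}\mathbf{z},
  \qquad
  \mathbf{z} \sim \mathcal{N}(\mathbf{0}, \mathbf{I}_d),
  \quad w \ge 0 .
\]
% Conditionally on w, y | w ~ N(mu + w*gamma, w*Sigma).
% Special cases: Gaussian (w = 1), Student's t (inverse-gamma w),
% and Laplace (exponential w) distributions.
```

Under a quantizer $\mathcal{Q}$ mapping each sample to one of finitely many cells, the likelihood of an observed quantized sample is the integral of the signal density over the corresponding cell, which is why the ML objective involves multiple intractable integrals that the ECM algorithm is designed to avoid evaluating directly.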