We propose a novel and flexible rank-breaking-then-composite-marginal-likelihood (RBCML) framework for learning random utility models (RUMs), which include the Plackett-Luce model. We characterize conditions under which the RBCML objective function is strictly log-concave, by proving that strict log-concavity is preserved under convolution and marginalization. We also characterize necessary and sufficient conditions for RBCML to satisfy consistency and asymptotic normality. Experiments on synthetic data show that RBCML for Gaussian RUMs achieves better statistical and computational efficiency than the state-of-the-art algorithm, and that RBCML for the Plackett-Luce model provides a flexible tradeoff between running time and statistical efficiency.
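To make the two ingredients of the framework concrete, the sketch below illustrates (full) rank breaking, which decomposes each observed ranking into its implied pairwise comparisons, followed by maximizing a pairwise composite marginal likelihood under a Plackett-Luce model via gradient ascent. This is a minimal illustrative example, not the paper's RBCML algorithm: the function names, the uniform weighting of broken pairs, and the plain gradient-ascent optimizer are all assumptions made here for simplicity.

```python
import itertools
import numpy as np

def rank_break(ranking):
    """Full rank breaking: decompose a ranking (listed best to worst)
    into all implied (winner, loser) pairwise comparisons."""
    return list(itertools.combinations(ranking, 2))

def fit_pairwise_cml(rankings, m, steps=2000, lr=0.5):
    """Maximize a pairwise composite marginal likelihood for a
    Plackett-Luce model over m alternatives by gradient ascent.
    theta[i] is the log-utility of alternative i.
    Illustrative sketch; uniform weights on all broken pairs."""
    theta = np.zeros(m)
    pairs = [p for r in rankings for p in rank_break(r)]
    for _ in range(steps):
        grad = np.zeros(m)
        for w, l in pairs:
            # P(w beats l) under Plackett-Luce (logistic in theta)
            p_w = 1.0 / (1.0 + np.exp(theta[l] - theta[w]))
            grad[w] += 1.0 - p_w
            grad[l] -= 1.0 - p_w
        theta += lr * grad / len(pairs)
        theta -= theta.mean()  # fix scale: log-utilities sum to zero
    return theta

# Toy data: rankings over 3 alternatives where 0 is usually best
rankings = [(0, 1, 2), (0, 2, 1), (0, 1, 2), (1, 0, 2)]
theta = fit_pairwise_cml(rankings, m=3)
# the estimated log-utilities should order alternative 0 above 1 above 2
```

The composite likelihood here treats the broken pairs as if they were independent; the paper's framework additionally studies how to weight these pairs so that consistency and asymptotic normality hold.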
Zhibing Zhao (Rensselaer Polytechnic Institute)
Lirong Xia (RPI)
2018 Oral: Composite Marginal Likelihood Methods for Random Utility Models
Thu Jul 12th 11:50 AM -- 12:00 PM Room A5