

Poster

Partial Trace Regression and Low-Rank Kraus Decomposition

Hachem Kadri · Stephane Ayache · Riikka Huusari · Alain Rakotomamonjy · Liva Ralaivola

Virtual

Keywords: [ Supervised Learning ] [ Matrix/Tensor Methods ] [ General Machine Learning Techniques ]


Abstract:

The trace regression model, a direct extension of the well-studied linear regression model, allows one to map matrices to real-valued outputs. We here introduce an even more general model, namely the partial-trace regression model, a family of linear mappings from matrix-valued inputs to matrix-valued outputs; this model subsumes the trace regression model and thus the linear regression model. Borrowing tools from quantum information theory, where partial trace operators have been extensively studied, we propose a framework for learning partial trace regression models from data by taking advantage of the so-called low-rank Kraus representation of completely positive maps. We show the relevance of our framework with synthetic and real-world experiments conducted for both i) matrix-to-matrix regression and ii) positive semidefinite matrix completion, two tasks which can be formulated as partial trace regression problems.
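To make the model concrete, below is a minimal sketch (not the authors' code) of a predictor parameterized by a low-rank Kraus decomposition, $\Phi(X) = \sum_{r=1}^{R} A_r X A_r^\top$, the standard form of a completely positive map referenced in the abstract. The dimensions, Kraus rank, and variable names are illustrative, and the fitting procedure from the paper is omitted; the snippet only shows how such a map sends PSD matrix inputs to PSD matrix outputs, and how the trace regression model is recovered when the output size is 1.

```python
# Minimal illustrative sketch of a partial trace regression predictor
# parameterized by a rank-R Kraus decomposition:
#     Phi(X) = sum_r A_r @ X @ A_r.T
# mapping a p x p PSD input matrix to a q x q PSD output matrix.
# With q = 1, each A_r is a row vector and Phi(X) reduces to trace
# regression, tr(B X) with B = sum_r A_r.T @ A_r.
# All dimensions and the random factors below are illustrative choices,
# not values from the paper.

import numpy as np

rng = np.random.default_rng(0)

p, q, R = 4, 2, 3                    # input size, output size, Kraus rank
A = rng.standard_normal((R, q, p))   # Kraus factors A_1, ..., A_R

def partial_trace_regression(X, A):
    """Apply the Kraus-parameterized map Phi(X) = sum_r A_r X A_r^T."""
    return sum(A_r @ X @ A_r.T for A_r in A)

# A PSD input is mapped to a PSD output of the target size.
M = rng.standard_normal((p, p))
X = M @ M.T                          # random PSD input matrix
Y = partial_trace_regression(X, A)
print(Y.shape)                                   # (q, q)
print(np.all(np.linalg.eigvalsh(Y) >= -1e-10))   # True: output stays PSD
```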
