Partial Trace Regression and Low-Rank Kraus Decomposition

Hachem Kadri*, Stéphane Ayache, Riikka Huusari, Alain Rakotomamonjy, Liva Ralaivola

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding, Conference article in proceedings, Scientific, peer-reviewed



The trace regression model, a direct extension of the well-studied linear regression model, allows one to map matrices to real-valued outputs. We here introduce an even more general model, namely the partial-trace regression model, a family of linear mappings from matrix-valued inputs to matrix-valued outputs; this model subsumes the trace regression model and thus the linear regression model. Borrowing tools from quantum information theory, where partial trace operators have been extensively studied, we propose a framework for learning partial trace regression models from data by taking advantage of the so-called low-rank Kraus representation of completely positive maps. We show the relevance of our framework with synthetic and real-world experiments conducted for both i) matrix-to-matrix regression and ii) positive semidefinite matrix completion, two tasks which can be formulated as partial trace regression problems.
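The low-rank Kraus representation referred to in the abstract writes a completely positive map as Φ(X) = Σ_j A_j X A_jᵀ (real case), where the number of Kraus operators A_j is the rank of the representation. The sketch below is illustrative only, not the authors' implementation; the dimensions and operator names are arbitrary. It shows why such a map is a natural matrix-to-matrix regression model: it is linear in X and sends positive semidefinite inputs to positive semidefinite outputs.

```python
import numpy as np

rng = np.random.default_rng(0)

def kraus_map(X, kraus_ops):
    """Apply a completely positive map in Kraus form:
    Phi(X) = sum_j A_j @ X @ A_j.T  (real case)."""
    return sum(A @ X @ A.T for A in kraus_ops)

# Illustrative sizes: d = input dim, p = output dim, r = Kraus rank.
d, p, r = 4, 3, 2
kraus_ops = [rng.standard_normal((p, d)) for _ in range(r)]

# A random positive semidefinite (PSD) input matrix.
B = rng.standard_normal((d, d))
X = B @ B.T

Y = kraus_map(X, kraus_ops)

# Complete positivity: a PSD input yields a PSD output.
print(Y.shape, np.linalg.eigvalsh(Y).min() >= -1e-10)  # prints: (3, 3) True
```

When p = 1 each A_j is a row vector and Φ(X) is a scalar, recovering the trace regression model as a special case; a small rank r keeps the number of parameters low, which is the point of the low-rank decomposition.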
Original language: English
Title of host publication: 37th International Conference on Machine Learning, ICML 2020
Publisher: International Machine Learning Society
Number of pages: 11
ISBN (Electronic): 9781713821120
Publication status: Published - 2020
MoE publication type: A4 Conference publication
Event: International Conference on Machine Learning - Vienna, Austria
Duration: 12 Jul 2020 - 18 Jul 2020
Conference number: 37

Publication series

Name: Proceedings of Machine Learning Research
ISSN (Electronic): 2640-3498


Conference: International Conference on Machine Learning
Abbreviated title: ICML


