Magnitude-Preserving Ranking for Structured Outputs

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › peer-review


Original language: English
Title of host publication: Proceedings of the Ninth Asian Conference on Machine Learning
Editors: Min-Ling Zhang, Yung-Kyun Noh
Publication status: Published - 3 Nov 2017
MoE publication type: A4 Article in a conference publication
Event: Asian Conference on Machine Learning - Yonsei University, Seoul, Korea, Republic of
Duration: 15 Nov 2017 - 17 Nov 2017
Conference number: 9

Publication series

Name: Proceedings of Machine Learning Research
ISSN (Electronic): 1938-7228


Conference: Asian Conference on Machine Learning
Abbreviated title: ACML
Country: Korea, Republic of


Research units

  • Friedrich Schiller University Jena


In this paper, we present a novel method for solving structured prediction problems, based on combining Input Output Kernel Regression (IOKR) with an extension of magnitude-preserving ranking to structured output spaces. In particular, we concentrate on the case where a set of candidate outputs has been given, and the associated pre-image problem calls for ranking the set of candidate outputs. Our method, called magnitude-preserving IOKR, aims both to produce a good approximation of the output feature vectors and to preserve the magnitude differences of the output features in the candidate sets. For the case where the candidate set does not contain the corresponding 'correct' inputs, we propose a method for approximating the inputs through application of IOKR in the reverse direction. We apply our method to two learning problems: cross-lingual document retrieval and metabolite identification. Experiments show that the proposed approach improves performance over IOKR, and in the latter application obtains the current state-of-the-art accuracy.
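The candidate-ranking setup described in the abstract can be illustrated with a minimal sketch: learn a kernel ridge regression map from inputs to output feature vectors, then score each candidate output by its similarity to the predicted feature vector. All variable names and the toy data below are hypothetical, and this shows only the plain IOKR scoring step, not the magnitude-preserving extension proposed in the paper.

```python
import numpy as np

# Toy data standing in for training inputs and output feature vectors phi(y_i).
rng = np.random.default_rng(0)
n_train, d_in, d_out = 50, 10, 8
X = rng.normal(size=(n_train, d_in))        # training inputs
Phi_Y = rng.normal(size=(n_train, d_out))   # output feature vectors

lam = 0.1                                   # ridge regularisation parameter
K = X @ X.T                                 # linear input kernel Gram matrix

def predict_output_features(x_new):
    """Kernel ridge regression estimate of the output feature vector."""
    k = X @ x_new                           # kernel evaluations k(x_new, x_i)
    alpha = np.linalg.solve(K + lam * np.eye(n_train), k)
    return Phi_Y.T @ alpha

def rank_candidates(x_new, candidates):
    """Rank candidate output feature vectors by inner-product score."""
    h = predict_output_features(x_new)
    scores = candidates @ h
    return np.argsort(-scores), scores      # best-scoring candidate first

x_new = rng.normal(size=d_in)
cands = rng.normal(size=(5, d_out))         # feature vectors of 5 candidates
order, scores = rank_candidates(x_new, cands)
print(order)
```

The magnitude-preserving variant would additionally penalise deviations between predicted and true score differences within each candidate set, rather than only approximating the output feature vectors.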
