Magnitude-Preserving Ranking for Structured Outputs

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Details

Original language: English
Title of host publication: Proceedings of the Ninth Asian Conference on Machine Learning
Editors: Min-Ling Zhang, Yung-Kyun Noh
Pages: 407-422
Number of pages: 16
State: Published - 3 Nov 2017
MoE publication type: A4 Article in a conference publication
Event: Asian Conference on Machine Learning, Yonsei University, Seoul, Korea, Republic of
Duration: 15 Nov 2017 - 17 Nov 2017
Conference number: 9
http://www.acml-conf.org/2017/

Publication series

Name: Proceedings of Machine Learning Research
Publisher: PMLR
Volume: 77
ISSN (Electronic): 1938-7228

Conference

Conference: Asian Conference on Machine Learning
Abbreviated title: ACML
Country: Korea, Republic of
City: Seoul
Period: 15/11/2017 - 17/11/2017
Internet address: http://www.acml-conf.org/2017/

Research units

  • Friedrich Schiller University Jena

Abstract

In this paper, we present a novel method for solving structured prediction problems by combining Input Output Kernel Regression (IOKR) with an extension of magnitude-preserving ranking to structured output spaces. In particular, we concentrate on the case where a set of candidate outputs is given and the associated pre-image problem calls for ranking this candidate set. Our method, called magnitude-preserving IOKR, aims both to produce a good approximation of the output feature vectors and to preserve the magnitude differences of the output features within the candidate sets. For the case where the candidate set does not contain the corresponding 'correct' inputs, we propose a method for approximating the inputs by applying IOKR in the reverse direction. We apply our method to two learning problems: cross-lingual document retrieval and metabolite identification. Experiments show that the proposed approach improves performance over IOKR and, in the latter application, obtains the current state-of-the-art accuracy.
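To make the setting concrete, the following is a minimal sketch of the base IOKR candidate-ranking pipeline that the paper extends (not the magnitude-preserving variant itself): kernel ridge regression maps an input into the output feature space, and each candidate output is then scored by its inner product with the predicted output feature vector. All data, kernel choices, and parameter values below are illustrative assumptions.

```python
import numpy as np

# Synthetic illustration of IOKR-style candidate ranking.
rng = np.random.default_rng(0)
n, d = 20, 5                       # training examples, output feature dimension

X = rng.normal(size=(n, 3))        # toy input representations x_i
Psi = rng.normal(size=(n, d))      # toy output feature vectors psi(y_i)

def rbf_kernel(A, B, gamma=0.5):
    """Gaussian input kernel k(x, x') (an assumed choice for this sketch)."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

lam = 0.1                          # ridge regularisation parameter
K = rbf_kernel(X, X)
# Kernel ridge regression into the output feature space:
#   h(x) = Psi^T (K + lam I)^{-1} k_x
alpha = np.linalg.solve(K + lam * np.eye(n), Psi)   # (n, d) dual coefficients

def predict_output_features(x_new):
    """Approximate psi(y) for the output associated with x_new."""
    k_x = rbf_kernel(x_new[None, :], X)[0]           # (n,) kernel evaluations
    return k_x @ alpha                               # (d,) predicted features

def rank_candidates(x_new, candidate_features):
    """Score each candidate output feature vector by its inner product with
    the predicted output features; return candidate indices, best first."""
    h = predict_output_features(x_new)
    scores = candidate_features @ h
    return np.argsort(-scores)

x_test = rng.normal(size=3)
candidates = rng.normal(size=(8, d))  # psi(y) for 8 candidate outputs
order = rank_candidates(x_test, candidates)
```

The magnitude-preserving extension described in the abstract additionally penalises, during training, deviations between predicted and true score *differences* over the candidate set, rather than only fitting the output feature vectors.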

ID: 16201639