Incremental ELMVIS for Unsupervised Learning

Anton Akusok, Emil Eirola, Yoan Miche, Ian Oliver, Kaj-Mikael Björk, Andrey Gritsenko, Stephen Baek, Amaury Lendasse

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › peer-review


An incremental version of the ELMVIS+ method is proposed in this paper. It iteratively selects the few best-fitting data samples from a large pool and adds them to the model. The method retains the high speed of ELMVIS+ while allowing much larger sample pools due to lower memory requirements. The extension is useful for reaching a better local optimum with the greedy optimization of ELMVIS, and the data structure can be specified in semi-supervised optimization. The major new application of incremental ELMVIS is not visualization but general dataset processing. The method is capable of learning dependencies from non-organized unsupervised data, either reconstructing a shuffled dataset or learning dependencies in a complex high-dimensional space. The results are interesting and promising, although there is room for improvement.
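The greedy incremental selection described in the abstract can be sketched as follows. This is a minimal illustrative toy, not the authors' implementation: the ELM hidden layer, the least-squares error criterion, and all data shapes here are simplified assumptions made for the sake of the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_features(V, W, b):
    """Random-projection hidden layer of an ELM (tanh activation)."""
    return np.tanh(V @ W + b)

# Toy setup (all sizes are arbitrary assumptions): 2-D visualization
# space, 5-D data space, 20 hidden neurons with fixed random weights.
d_vis, d_data, n_hidden = 2, 5, 20
W = rng.normal(size=(d_vis, n_hidden))
b = rng.normal(size=n_hidden)

# A small seed set already assigned to visualization coordinates,
# plus a large pool of unassigned candidate samples.
V_seed = rng.normal(size=(10, d_vis))
X_seed = rng.normal(size=(10, d_data))
pool = list(rng.normal(size=(50, d_data)))

# Free visualization slots to fill, one per incremental step.
V_new = rng.normal(size=(5, d_vis))

V, X = V_seed.copy(), X_seed.copy()
for v in V_new:
    H = elm_features(np.vstack([V, v]), W, b)
    best_err, best_i = np.inf, -1
    for i, x in enumerate(pool):
        X_try = np.vstack([X, x])
        # Least-squares output weights and reconstruction error of the
        # ELM mapping visualization coordinates to data.
        B, *_ = np.linalg.lstsq(H, X_try, rcond=None)
        err = np.linalg.norm(H @ B - X_try)
        if err < best_err:
            best_err, best_i = err, i
    # Greedily commit the best-fitting pool sample to this slot.
    X = np.vstack([X, pool.pop(best_i)])
    V = np.vstack([V, v])

print(V.shape, X.shape)  # both grow by one row per step
```

Only a few samples move from the pool into the model at each step, so memory scales with the placed set rather than the full pool, which is the property the abstract highlights.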
Original language: English
Title of host publication: Proceedings of ELM-2016
Editors: Jiuwen Cao, Erik Cambria, Amaury Lendasse, Yoan Miche, Chi Man Vong
Place of Publication: Cham
Number of pages: 11
Publication status: Published - 2018
MoE publication type: A4 Article in a conference publication
Event: International Conference on Extreme Learning Machines - Singapore, Singapore
Duration: 13 Dec 2016 – 15 Dec 2016

Publication series

Name: Proceedings in Adaptation, Learning and Optimization
ISSN (Print): 2363-6084


Conference: International Conference on Extreme Learning Machines
Abbreviated title: ELM


  • engineering
  • artificial intelligence
  • computational intelligence

