Efficient approximate online convolutional dictionary learning

Farshad Ghorbani Veshki, Sergiy Vorobyov

Research output: Contribution to journal › Article › Scientific › peer-review


Abstract

Most existing convolutional dictionary learning (CDL) algorithms are based on batch learning, where the dictionary filters and the convolutional sparse representations are optimized in an alternating manner using a training dataset. When large training datasets are used, batch CDL algorithms become prohibitively memory-intensive. An online-learning technique is used to reduce the memory requirements of CDL by optimizing the dictionary incrementally after finding the sparse representations of each training sample. Nevertheless, learning large dictionaries using the existing online CDL (OCDL) algorithms remains highly computationally expensive. In this paper, we present a novel approximate OCDL method that incorporates sparse decomposition of the training samples. The resulting optimization problems are addressed using the alternating direction method of multipliers. Extensive experimental evaluations using several image datasets and based on an image fusion task show that the proposed method substantially reduces computational costs while preserving the effectiveness of the state-of-the-art OCDL algorithms.
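To make the contrast between batch and online CDL concrete, the following is a minimal, illustrative sketch of the generic online loop described in the abstract: each training image is sparse-coded against the current dictionary, and the dictionary is then updated incrementally from that single sample, so the full dataset never needs to be held in memory. It is not the paper's algorithm; the paper uses ADMM and an approximate sparse decomposition of the training samples, whereas this sketch substitutes ISTA for the sparse-coding step and a projected-gradient step for the dictionary update. All filter sizes, step sizes, and the random stand-in data are placeholders.

# Illustrative online convolutional dictionary learning loop (not the paper's method).
import numpy as np
from scipy.signal import fftconvolve

def soft_threshold(v, t):
    # Element-wise soft-thresholding: proximal operator of the l1 norm.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def csc_ista(s, D, lam=0.1, step=0.05, n_iter=50):
    # Convolutional sparse coding of one image s with filters D via ISTA.
    M = D.shape[0]
    X = np.zeros((M,) + s.shape)            # one coefficient map per filter
    for _ in range(n_iter):
        recon = sum(fftconvolve(X[m], D[m], mode="same") for m in range(M))
        resid = recon - s
        for m in range(M):
            # Gradient of the data term w.r.t. X[m]: correlation of the
            # residual with the filter (convolution with the flipped filter).
            grad = fftconvolve(resid, D[m][::-1, ::-1], mode="same")
            X[m] = soft_threshold(X[m] - step * grad, step * lam)
    return X

def dict_update(s, D, X, eta=0.01):
    # One incremental (per-sample) dictionary update by projected gradient.
    M, fh, fw = D.shape
    recon = sum(fftconvolve(X[m], D[m], mode="same") for m in range(M))
    resid = recon - s
    for m in range(M):
        # Gradient w.r.t. the filter: correlation of residual with the code
        # map, cropped to the filter-sized window of lags around zero.
        g = fftconvolve(resid, X[m][::-1, ::-1], mode="same")
        ch, cw = g.shape[0] // 2, g.shape[1] // 2
        g = g[ch - fh // 2: ch - fh // 2 + fh, cw - fw // 2: cw - fw // 2 + fw]
        D[m] -= eta * g
        D[m] /= max(np.linalg.norm(D[m]), 1.0)   # project onto the unit ball
    return D

# Online loop: the dictionary is refined one training image at a time,
# so only the current sample (not the whole dataset) is kept in memory.
rng = np.random.default_rng(0)
D = rng.standard_normal((8, 7, 7))               # 8 filters of size 7x7
D /= np.linalg.norm(D, axis=(1, 2), keepdims=True)
for _ in range(5):                               # stream of training images
    s = rng.standard_normal((64, 64))            # stand-in for a real image
    X = csc_ista(s, D)
    D = dict_update(s, D, X)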
Original language: English
Pages (from-to): 1165-1175
Number of pages: 11
Journal: IEEE Transactions on Computational Imaging
Volume: 9
DOIs
Publication status: Published - 15 Dec 2023
MoE publication type: A1 Journal article-refereed

