Noise2Noise: Learning image restoration without clean data

Jaakko Lehtinen*, Jacob Munkberg, Jon Hasselgren, Samuli Laine, Tero Karras, Miika Aittala, Timo Aila

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › peer-review

40 Citations (Scopus)
13 Downloads (Pure)


We apply basic statistical reasoning to signal reconstruction by machine learning - learning to map corrupted observations to clean signals - with a simple and powerful conclusion: It is possible to learn to restore images by only looking at corrupted examples, at performance at and sometimes exceeding training using clean data, without explicit image priors or likelihood models of the corruption. In practice, we show that a single model learns photographic noise removal, denoising synthetic Monte Carlo images, and reconstruction of undersampled MRI scans - all corrupted by different processes - based on noisy data only.
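The statistical reasoning behind the abstract can be illustrated with a minimal numpy sketch (this is an illustrative example, not the authors' code): because the L2-optimal point estimate is the mean of the targets, and the corruption is zero-mean, minimizing squared error against purely noisy targets converges to the same estimate as training against the clean signal.

```python
import numpy as np

# Hypothetical 64-pixel "clean image" -- used here only to verify the result,
# never to compute the estimate.
rng = np.random.default_rng(0)
clean = rng.uniform(0.0, 1.0, size=64)

# Many independent noisy observations of the same signal (zero-mean noise),
# standing in for the corrupted training targets.
noisy_targets = clean + rng.normal(0.0, 0.3, size=(10_000, 64))

# The estimate minimizing mean squared error over the noisy targets is their
# per-pixel mean; no clean data is consulted when forming it.
estimate = noisy_targets.mean(axis=0)

# The noisy-target estimate lands close to the clean signal.
print(np.abs(estimate - clean).max())
```

This is the one-pixel version of the paper's argument; in the full method the per-pixel mean is replaced by a neural network trained with an L2 loss on pairs of independently corrupted images.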

Original language: English
Title of host publication: 35th International Conference on Machine Learning, ICML 2018
Editors: Jennifer Dy, Andreas Krause
Number of pages: 12
ISBN (Electronic): 9781510867963
Publication status: Published - 1 Jan 2018
MoE publication type: A4 Article in a conference publication
Event: International Conference on Machine Learning - Stockholm, Sweden
Duration: 10 Jul 2018 - 15 Jul 2018
Conference number: 35

Publication series

Name: Proceedings of Machine Learning Research
ISSN (Electronic): 1938-7228


Conference: International Conference on Machine Learning
Abbreviated title: ICML


  • Cite this

    Lehtinen, J., Munkberg, J., Hasselgren, J., Laine, S., Karras, T., Aittala, M., & Aila, T. (2018). Noise2Noise: Learning image restoration without clean data. In J. Dy, & A. Krause (Eds.), 35th International Conference on Machine Learning, ICML 2018 (Vol. 7, pp. 4620-4631). (Proceedings of Machine Learning Research; No. 80).