Subtractive Mixture Models via Squaring: Representation and Learning

Lorenzo Loconte, Aleksanteri Sladek, Stefan Mengel, Martin Trapp, Arno Solin, Nicolas Gillis, Antonio Vergari

Research output: Chapter in Book/Report/Conference proceeding › Conference article in proceedings › Scientific › peer-review

Abstract

Mixture models are traditionally represented and learned by adding several distributions as components. Allowing mixtures to subtract probability mass or density can drastically reduce the number of components needed to model complex distributions. However, learning such subtractive mixtures while ensuring they still encode a non-negative function is challenging. We investigate how to learn and perform inference on deep subtractive mixtures by squaring them. We do this in the framework of probabilistic circuits, which enable us to represent tensorized mixtures and generalize several other subtractive models. We theoretically prove that the class of squared circuits allowing subtractions can be exponentially more expressive than traditional additive mixtures, and we empirically show this increased expressiveness on a series of real-world distribution estimation tasks.
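
The core idea in the abstract can be made concrete with a small worked example. Below is a minimal sketch, not the authors' probabilistic-circuits implementation: a one-dimensional squared subtractive mixture of two Gaussians, where all names and parameter values are illustrative. The components are combined with weights that may be negative, the result is squared so the function stays non-negative, and the normalizing constant is available in closed form via the Gaussian product integral ∫ N(x; μ_i, σ_i²) N(x; μ_j, σ_j²) dx = N(μ_i; μ_j, σ_i² + σ_j²).

```python
# Minimal sketch of a 1-D squared subtractive mixture (illustrative only;
# the paper works with deep tensorized probabilistic circuits instead).
import numpy as np
from scipy.stats import norm

w = np.array([1.0, -0.6])   # the second component *subtracts* mass
mu = np.array([0.0, 0.0])
sd = np.array([1.0, 0.3])

# Normalizer of f(x)^2 with f(x) = sum_i w_i N(x; mu_i, sd_i^2):
# Z = sum_{i,j} w_i w_j * N(mu_i; mu_j, sd_i^2 + sd_j^2)
cross = norm.pdf(mu[:, None], loc=mu[None, :],
                 scale=np.sqrt(sd[:, None] ** 2 + sd[None, :] ** 2))
Z = float(w @ cross @ w)

def density(x):
    """p(x) = f(x)^2 / Z: non-negative even though one weight is negative."""
    f = (w * norm.pdf(np.asarray(x)[..., None], loc=mu, scale=sd)).sum(-1)
    return f ** 2 / Z

# Sanity check: the squared mixture integrates to ~1 on a fine grid.
xs = np.linspace(-6.0, 6.0, 4001)
print((density(xs) * (xs[1] - xs[0])).sum())  # ≈ 1.0
```

The negative weight lets the density hit exact zeros at the points where the two weighted components cancel, with mass on both sides of each zero; a purely additive mixture of two Gaussians is strictly positive everywhere and cannot produce such a shape. This is the kind of saving in components that the abstract's expressiveness claim refers to.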
Original language: English
Title of host publication: 12th International Conference on Learning Representations (ICLR 2024)
Publisher: Curran Associates Inc.
ISBN (Print): 978-1-7138-9865-8
Publication status: Published - 2024
MoE publication type: A4 Conference publication
Event: International Conference on Learning Representations - Messe Wien Exhibition and Congress Center, Vienna, Austria
Duration: 7 May 2024 – 11 May 2024
Conference number: 12
https://iclr.cc/

Conference

Conference: International Conference on Learning Representations
Abbreviated title: ICLR
Country/Territory: Austria
City: Vienna
Period: 07/05/2024 – 11/05/2024
Internet address: https://iclr.cc/
