Abstract

High training costs of generative models and the need to fine-tune them for specific tasks have created a strong interest in model reuse and composition. A key challenge in composing iterative generative processes, such as GFlowNets and diffusion models, is that realizing the desired target distribution requires all steps of the generative process to be coordinated and to satisfy delicate balance conditions. In this work, we propose Compositional Sculpting: a general approach for defining compositions of iterative generative processes. We then introduce a method, built on classifier guidance, for sampling from these compositions. We showcase ways to accomplish compositional sculpting in both GFlowNets and diffusion models. We highlight two binary operations between pairs of distributions, the harmonic mean and the contrast, and the generalization of these operations to multiple component distributions. We offer empirical results on image and molecular generation tasks. Project codebase: https://github.com/timgaripov/compositional-sculpting.
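As a concrete illustration of the two binary operations named above, the short NumPy sketch below evaluates them pointwise on toy 1D Gaussian densities. The closed forms used here, p1(x)p2(x)/(p1(x)+p2(x)) for the harmonic mean and p1(x)^2/(p1(x)+p2(x)) for the contrast, are our reading of the paper's definitions; the grid, densities, and function names are illustrative assumptions, not part of the released codebase.

    import numpy as np
    from scipy.stats import norm

    # Toy stand-ins for two component distributions (assumed for illustration).
    x = np.linspace(-6.0, 6.0, 1001)
    p1 = norm.pdf(x, loc=-1.0, scale=1.0)
    p2 = norm.pdf(x, loc=1.5, scale=0.7)

    def normalize(p):
        # Normalize an unnormalized density on the grid x (trapezoidal rule).
        return p / np.trapz(p, x)

    # Harmonic mean: concentrates mass where BOTH components have high density.
    harmonic = normalize(p1 * p2 / (p1 + p2))

    # Contrast: concentrates mass where p1 is high but p2 is low.
    contrast = normalize(p1 ** 2 / (p1 + p2))

    print("harmonic-mean mode:", x[np.argmax(harmonic)])
    print("contrast mode:     ", x[np.argmax(contrast)])

Note that this pointwise evaluation is only possible because the toy densities have closed forms; for trained GFlowNets and diffusion models, the abstract's classifier-guidance method is what realizes these compositions at the level of the sampling process.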
Original language: English
Title of host publication: Advances in Neural Information Processing Systems 36 - 37th Conference on Neural Information Processing Systems, NeurIPS 2023
Publisher: Curran Associates Inc.
Pages: 12665-12702
ISBN (Electronic): 978-1-7138-9992-1
Publication status: Published - 2024
MoE publication type: A4 Conference publication
Event: Conference on Neural Information Processing Systems - Ernest N. Morial Convention Center, New Orleans, United States
Duration: 10 Dec 2023 - 16 Dec 2023
Conference number: 37
https://nips.cc/

Publication series

Name: Advances in Neural Information Processing Systems
Publisher: Morgan Kaufmann Publishers
Volume: 36
ISSN (Print): 1049-5258

Conference

Conference: Conference on Neural Information Processing Systems
Abbreviated title: NeurIPS
Country/Territory: United States
City: New Orleans
Period: 10/12/2023 - 16/12/2023
Internet address: https://nips.cc/
