Challenges and Opportunities in High-dimensional Variational Inference

Akash Kumar Dhaka, Alejandro Catalina, Manushi Welandawe, Michael Riis Andersen, Jonathan H. Huggins, Aki Vehtari

Research output: Chapter in Book/Report/Conference proceeding > Conference article in proceedings > Scientific > peer-reviewed

20 Citations (Scopus)

Abstract

Current black-box variational inference (BBVI) methods require the user to make numerous design choices—such as the selection of variational objective and approximating family—yet there is little principled guidance on how to do so. We develop a conceptual framework and set of experimental tools to understand the effects of these choices, which we leverage to propose best practices for maximizing posterior approximation accuracy. Our approach is based on studying the pre-asymptotic tail behavior of the density ratios between the joint distribution and the variational approximation, then exploiting insights and tools from the importance sampling literature. Our framework and supporting experiments help to distinguish between the behavior of BBVI methods for approximating low-dimensional versus moderate-to-high-dimensional posteriors. In the latter case, we show that mass-covering variational objectives are difficult to optimize and do not improve accuracy, but flexible variational families can improve accuracy and the effectiveness of importance sampling—at the cost of additional optimization challenges. Therefore, for moderate-to-high-dimensional posteriors we recommend using the (mode-seeking) exclusive KL divergence since it is the easiest to optimize, and improving the variational family or using model parameter transformations to make the posterior and optimal variational approximation more similar. On the other hand, in low-dimensional settings, we show that heavy-tailed variational families and mass-covering divergences are effective and can increase the chances that the approximation can be improved by importance sampling.
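The abstract's diagnostic idea, studying the tail behavior of the density ratios between the target and the variational approximation, can be illustrated with a small self-contained sketch. The example below (an illustration, not code from the paper) uses a light-tailed Gaussian proposal for a heavier-tailed Student-t target and reports the relative effective sample size of the self-normalized importance weights, which collapses as the dimension grows:

```python
import numpy as np

def ess_fraction(d, n=20_000, df=5, seed=0):
    """Relative effective sample size (ESS / n) when a standard normal
    proposal q approximates a product of Student-t(df) marginals p in
    d dimensions, using self-normalized importance weights."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal((n, d))              # draws from q = N(0, I)
    # unnormalized log densities (constants cancel in self-normalized IS)
    logp = np.sum(-0.5 * (df + 1) * np.log1p(x**2 / df), axis=1)
    logq = np.sum(-0.5 * x**2, axis=1)
    logw = logp - logq                           # log density ratios
    logw -= logw.max()                           # stabilize before exp
    w = np.exp(logw)
    return (w.sum() ** 2) / (n * (w**2).sum())   # ESS / n, in (0, 1]

for d in (1, 10, 100):
    print(f"d = {d:3d}: ESS fraction = {ess_fraction(d):.3f}")
```

The sharp drop in ESS fraction at higher dimensions mirrors the paper's point that importance sampling corrections become ineffective when the approximation's tails mismatch the posterior's, which is why the authors recommend improving the variational family or reparameterizing the model rather than switching to mass-covering objectives in moderate-to-high dimensions.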

Original language: English
Title of host publication: Advances in Neural Information Processing Systems 34 - 35th Conference on Neural Information Processing Systems, NeurIPS 2021
Editors: Marc'Aurelio Ranzato, Alina Beygelzimer, Yann Dauphin, Percy S. Liang, Jenn Wortman Vaughan
Publisher: Neural Information Processing Systems Foundation
Pages: 7787-7798
Number of pages: 12
ISBN (Electronic): 9781713845393
Publication status: Published - 2021
MoE publication type: A4 Conference publication
Event: Conference on Neural Information Processing Systems - Virtual, Online
Duration: 6 Dec 2021 - 14 Dec 2021
Conference number: 35
Internet address: https://neurips.cc

Publication series

Name: Advances in Neural Information Processing Systems
Publisher: Neural Information Processing Systems Foundation
Volume: 34
ISSN (Print): 1049-5258

Conference

Conference: Conference on Neural Information Processing Systems
Abbreviated title: NeurIPS
City: Virtual, Online
Period: 06/12/2021 - 14/12/2021
Internet address: https://neurips.cc

