Abstract

Conditional Neural Processes (CNPs) are a class of meta-learning models popular for combining the runtime efficiency of amortized inference with reliable uncertainty quantification. Many relevant machine learning tasks, such as spatio-temporal modeling, Bayesian optimization and continuous control, contain equivariances -- for example to translation -- which the model can exploit for maximal performance. However, prior attempts to include equivariances in CNPs do not scale effectively beyond two input dimensions. In this work, we propose Relational Conditional Neural Processes (RCNPs), an effective approach to incorporate equivariances into any neural process model. Our proposed method extends the applicability and impact of equivariant neural processes to higher dimensions. We empirically demonstrate the competitive performance of RCNPs on a large array of tasks naturally containing equivariances.
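To make the abstract's central idea concrete, below is a minimal, hypothetical sketch of how a relational encoding can yield translation equivariance in a CNP-style model: the network only ever sees differences between context and target inputs, so shifting all inputs by the same vector cannot change its predictions. This is an illustration of the general principle, not the authors' implementation; all names (ToyRelationalCNP, pair_encoder, and so on) are assumptions made for this sketch.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyRelationalCNP(nn.Module):
    """Toy translation-equivariant conditional neural process (sketch).

    Each (target, context) pair is encoded through the relative position
    x_ctx - x_tgt together with the context value y_ctx, never through
    absolute locations, so predictions are invariant to a common shift
    of all inputs. Hypothetical illustration, not the paper's code.
    """

    def __init__(self, x_dim: int, y_dim: int, hidden: int = 128):
        super().__init__()
        # Pair encoder sees only relative positions and context values.
        self.pair_encoder = nn.Sequential(
            nn.Linear(x_dim + y_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden),
        )
        # Decoder maps the aggregated representation to Gaussian parameters.
        self.decoder = nn.Sequential(
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * y_dim),
        )

    def forward(self, x_ctx, y_ctx, x_tgt):
        # x_ctx: (N, x_dim), y_ctx: (N, y_dim), x_tgt: (M, x_dim)
        diff = x_ctx.unsqueeze(0) - x_tgt.unsqueeze(1)          # (M, N, x_dim)
        y_rep = y_ctx.unsqueeze(0).expand(diff.shape[0], -1, -1)  # (M, N, y_dim)
        pairs = torch.cat([diff, y_rep], dim=-1)                 # (M, N, x_dim+y_dim)
        rep = self.pair_encoder(pairs).mean(dim=1)               # aggregate over context
        mean, raw_sigma = self.decoder(rep).chunk(2, dim=-1)     # (M, y_dim) each
        return mean, F.softplus(raw_sigma) + 1e-3                # positive std. dev.

# Quick equivariance check: shifting all inputs by the same vector
# leaves the predictions unchanged (up to floating-point error).
model = ToyRelationalCNP(x_dim=2, y_dim=1)
x_ctx, y_ctx, x_tgt = torch.randn(10, 2), torch.randn(10, 1), torch.randn(5, 2)
shift = torch.randn(1, 2)
m1, _ = model(x_ctx, y_ctx, x_tgt)
m2, _ = model(x_ctx + shift, y_ctx, x_tgt + shift)
assert torch.allclose(m1, m2, atol=1e-5)
```

Because the encoder input depends on context and target locations only through their difference, equivariance holds by construction rather than being learned from data, which is the property the abstract highlights.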
Original language: English
Number of pages: 38
Publication status: Accepted/In press - 24 Sept 2023
MoE publication type: Not Eligible
Event: Conference on Neural Information Processing Systems - Ernest N. Morial Convention Center, New Orleans, United States
Duration: 10 Dec 2023 – 16 Dec 2023
Conference number: 37
https://nips.cc/

Conference

Conference: Conference on Neural Information Processing Systems
Abbreviated title: NeurIPS
Country/Territory: United States
City: New Orleans
Period: 10/12/2023 – 16/12/2023
Internet address: https://nips.cc/
