Abstract

Conditional Neural Processes (CNPs) are a class of metalearning models popular for combining the runtime efficiency of amortized inference with reliable uncertainty quantification. Many relevant machine learning tasks, such as spatio-temporal modeling, Bayesian optimization and continuous control, contain equivariances -- for example to translation -- which the model can exploit for maximal performance. However, prior attempts to include equivariances in CNPs do not scale effectively beyond two input dimensions. In this work, we propose Relational Conditional Neural Processes (RCNPs), an effective approach to incorporate equivariances into any neural process model. Our proposed method extends the applicability and impact of equivariant neural processes to higher dimensions. We empirically demonstrate the competitive performance of RCNPs on a large array of tasks naturally containing equivariances.
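The mechanism suggested by the abstract is that context data are encoded relative to the target inputs, so that an equivariance such as translation holds by construction. Below is a minimal illustrative sketch of a translation-invariant relational encoding in NumPy; it is a toy under assumed names (the function relational_encoding is hypothetical), not the authors' implementation or architecture.

```python
import numpy as np

def relational_encoding(x_context, y_context, x_target):
    """Toy translation-invariant comparison of context and target points.

    Each context input is represented only through its difference to a
    target input, so shifting every input by the same constant vector
    leaves the encoding unchanged.
    """
    # diffs has shape (num_target, num_context, input_dim)
    diffs = x_context[None, :, :] - x_target[:, None, :]
    # Pair each relative position with the corresponding context output value.
    y_rep = np.broadcast_to(
        y_context[None, :, :], diffs.shape[:2] + (y_context.shape[-1],)
    )
    return np.concatenate([diffs, y_rep], axis=-1)

# Quick check of translation invariance: shifting all inputs by the same
# vector does not change the relational representation.
rng = np.random.default_rng(0)
xc = rng.normal(size=(5, 3))   # 5 context inputs in 3 dimensions
yc = rng.normal(size=(5, 1))   # scalar context outputs
xt = rng.normal(size=(2, 3))   # 2 target inputs
shift = rng.normal(size=(3,))
assert np.allclose(
    relational_encoding(xc, yc, xt),
    relational_encoding(xc + shift, yc, xt + shift),
)
```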
Original language: English
Title of host publication: Advances in Neural Information Processing Systems 36 - 37th Conference on Neural Information Processing Systems, NeurIPS 2023
Publisher: Curran Associates Inc.
Number of pages: 38
ISBN (electronic): 978-1-7138-9992-1
Publication status: Published - 2024
Publication type (OKM): A4 Article in a conference publication
Event: Conference on Neural Information Processing Systems - Ernest N. Morial Convention Center, New Orleans, United States
Duration: 10 Dec 2023 - 16 Dec 2023
Conference number: 37
https://nips.cc/

Publication series

Name: Advances in Neural Information Processing Systems
Publisher: Morgan Kaufmann Publishers
Volume: 36
ISSN (print): 1049-5258

Conference

Conference: Conference on Neural Information Processing Systems
Abbreviated title: NeurIPS
Country/Territory: United States
City: New Orleans
Period: 10/12/2023 - 16/12/2023
Web address
