Algebraic Positional Encodings

Kokos Kogkalidis, Jean-Philippe Bernardy, Vikas Garg

Research output: Chapter in Book/Report/Conference proceeding › Conference article in proceedings › Scientific › peer-review

Abstract

We introduce a novel positional encoding strategy for Transformer-style models, addressing the shortcomings of existing, often ad hoc, approaches. Our framework implements a flexible mapping from the algebraic specification of a domain to a positional encoding scheme where positions are interpreted as orthogonal operators. This design preserves the structural properties of the source domain, thereby ensuring that the end model upholds them. The framework can accommodate various structures, including sequences, grids, and trees, as well as their compositions. We conduct a series of experiments demonstrating the practical applicability of our method. Our results suggest performance on par with or surpassing the current state of the art, without hyper-parameter optimization or "task search" of any kind. Code is available at https://aalto-quml.github.io/ape/.
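
As a rough illustration of the abstract's central idea (a sketch only, not the authors' implementation; see the repository linked above for the actual code), sequence positions can be interpreted as integer powers of a single orthogonal matrix. Because the matrix is orthogonal, the score between a query at position m and a key at position n depends only on the offset n - m, so the encoding respects the translation structure of sequences. All names below (random_orthogonal, position_op, attention_score) are illustrative.

    import numpy as np

    def random_orthogonal(d, rng):
        # Sample a random d x d orthogonal matrix via QR decomposition.
        q, r = np.linalg.qr(rng.standard_normal((d, d)))
        return q * np.sign(np.diag(r))

    def position_op(w, n):
        # Interpret position n as the n-th power of the generator w.
        return np.linalg.matrix_power(w, n)

    def attention_score(q_vec, k_vec, w, m, n):
        # (w^m q) . (w^n k) = q^T w^(n-m) k, since w is orthogonal:
        # the score depends on the relative offset n - m alone.
        return (position_op(w, m) @ q_vec) @ (position_op(w, n) @ k_vec)

    rng = np.random.default_rng(0)
    w = random_orthogonal(8, rng)
    q_vec, k_vec = rng.standard_normal(8), rng.standard_normal(8)
    # Equal offsets yield equal scores: the structural property is preserved.
    assert np.isclose(attention_score(q_vec, k_vec, w, 3, 5),
                      attention_score(q_vec, k_vec, w, 10, 12))

In the paper's framework, other domains such as grids and trees correspond to different algebraic specifications with additional generators; the sketch above covers only the sequential case.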
Original language: English
Title of host publication: Advances in Neural Information Processing Systems 37 (NeurIPS 2024)
Editors: A. Globerson, L. Mackey, D. Belgrave, A. Fan, U. Paquet, J. Tomczak, C. Zhang
Publisher: Curran Associates Inc.
ISBN (Print): 9798331314385
Publication status: Published - 2025
MoE publication type: A4 Conference publication
Event: Conference on Neural Information Processing Systems, Vancouver, Canada
Duration: 10 Dec 2024 - 15 Dec 2024
Conference number: 38
https://neurips.cc/Conferences/2024

Publication series

Name: Advances in Neural Information Processing Systems
Publisher: Curran Associates Inc.
Volume: 37
ISSN (Print): 1049-5258

Conference

Conference: Conference on Neural Information Processing Systems
Abbreviated title: NeurIPS
Country/Territory: Canada
City: Vancouver
Period: 10/12/2024 - 15/12/2024
Internet address: https://neurips.cc/Conferences/2024
