Abstract
Bidirectional masked Transformers have become central to the current NLP landscape. Despite their impressive benchmarks, a recurring theme in recent research has been to question such models’ capacity for syntactic generalization. In this work, we seek to address this question by adding a supervised, token-level supertagging objective to standard unsupervised pretraining, enabling the explicit incorporation of syntactic biases into the network’s training dynamics. Our approach is straightforward to implement, incurs only marginal computational overhead, and is general enough to adapt to a variety of settings. We apply our methodology to Lassy Large, an automatically annotated corpus of written Dutch. Our experiments suggest that our syntax-aware model performs on par with established baselines, despite Lassy Large being an order of magnitude smaller than commonly used corpora.
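The approach described above amounts to a multi-task training objective: the usual masked-language-modelling loss is combined with a supervised cross-entropy loss over per-token supertags. A minimal sketch of such a combined loss is given below; the function names and the mixing weight `lam` are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def cross_entropy(logits, targets):
    """Mean softmax cross-entropy over positions.

    logits: (positions, classes); targets: (positions,) integer labels.
    """
    z = logits - logits.max(axis=-1, keepdims=True)  # stabilize softmax
    log_probs = z - np.log(np.exp(z).sum(axis=-1, keepdims=True))
    return -log_probs[np.arange(len(targets)), targets].mean()

def joint_loss(mlm_logits, mlm_targets, tag_logits, tag_targets, lam=1.0):
    """Combined objective: MLM loss on masked positions plus a supervised
    supertagging loss on the annotated tokens.

    `lam` is a hypothetical mixing weight; the abstract does not specify
    how the two terms are weighted.
    """
    return cross_entropy(mlm_logits, mlm_targets) + lam * cross_entropy(tag_logits, tag_targets)
```

Because the supertag labels come from an automatically annotated corpus (here, Lassy Large), the supervised term adds essentially no data-collection cost and only one extra classification head at training time.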
Original language | English |
---|---|
Title of host publication | Proceedings of the 2023 CLASP Conference on Learning with Small Data |
Publisher | Association for Computational Linguistics |
Pages | 176-184 |
ISBN (Electronic) | 979-8-89176-000-4 |
Publication status | Published - 2023 |
MoE publication type | A4 Conference publication |
Event | Learning with Small Data, Gothenburg, Sweden. Duration: 11 Sept 2023 → 12 Sept 2023 |
Publication series
Name | CLASP Papers in Computational Linguistics |
---|---|
Publisher | Association for Computational Linguistics |
Volume | 5 |
ISSN (Electronic) | 2002-9764 |
Conference
Conference | Learning with Small Data |
---|---|
Abbreviated title | LSD |
Country/Territory | Sweden |
City | Gothenburg |
Period | 11/09/2023 → 12/09/2023 |