Scheduled Sampling for Transformers

Tsvetomila Mihaylova, André F. T. Martins

Research output: Chapter in Book/Report/Conference proceeding › Conference article in proceedings › Scientific › peer-review

Abstract

Scheduled sampling is a technique for mitigating a well-known problem in sequence-to-sequence generation: exposure bias. It consists of feeding the model, at training time, a mix of teacher-forced embeddings and the model's own predictions from the previous step. The technique has been used to improve model performance with recurrent neural networks (RNNs). In the Transformer model, unlike the RNN, the generation of a new word attends to the full sentence generated so far, not only to the last word, so scheduled sampling cannot be applied in a straightforward way. We propose structural changes that allow scheduled sampling to be applied to Transformer architectures via a two-pass decoding strategy. Experiments on two language pairs achieve performance close to a teacher-forcing baseline and show that this technique is promising for further exploration.
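The two-pass decoding idea can be illustrated in code. The sketch below is a minimal PyTorch illustration, not the authors' implementation: it assumes a standard nn.TransformerDecoder, and names such as TwoPassDecoder and mix_prob are hypothetical. The first pass decodes the gold prefix with teacher forcing, the first-pass predictions are embedded, and a position-wise coin flip with probability mix_prob replaces gold embeddings by predicted ones before the second, loss-bearing pass.

```python
import torch
import torch.nn as nn


class TwoPassDecoder(nn.Module):
    """Sketch of a Transformer decoder trained with two-pass scheduled sampling."""

    def __init__(self, vocab_size=1000, d_model=64, nhead=4, num_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerDecoderLayer(d_model, nhead, batch_first=True)
        self.decoder = nn.TransformerDecoder(layer, num_layers)
        self.out = nn.Linear(d_model, vocab_size)

    def forward(self, gold_tokens, memory, mix_prob=0.25):
        # Causal mask: each position may only attend to earlier positions.
        seq_len = gold_tokens.size(1)
        causal = torch.triu(
            torch.full((seq_len, seq_len), float("-inf"), device=gold_tokens.device),
            diagonal=1,
        )

        # First pass: ordinary teacher forcing to obtain model predictions.
        gold_emb = self.embed(gold_tokens)
        first_logits = self.out(self.decoder(gold_emb, memory, tgt_mask=causal))

        # Embed the first-pass predictions (greedy argmax here; a soft mixture
        # over the vocabulary is a differentiable alternative). Detach so no
        # gradient flows back through the first pass.
        pred_emb = self.embed(first_logits.argmax(dim=-1)).detach()

        # Position-wise coin flip: with probability mix_prob, replace the gold
        # embedding with the model's own prediction from the first pass.
        coin = torch.rand(gold_tokens.shape, device=gold_tokens.device) < mix_prob
        mixed_emb = torch.where(coin.unsqueeze(-1), pred_emb, gold_emb)

        # Second pass: decode from the mixed inputs; the training loss is
        # computed on these logits against the gold targets.
        return self.out(self.decoder(mixed_emb, memory, tgt_mask=causal))


# Usage sketch: memory is the encoder output, shape (batch, src_len, d_model).
model = TwoPassDecoder()
gold = torch.randint(0, 1000, (2, 7))       # (batch, tgt_len) gold tokens
memory = torch.randn(2, 5, 64)              # placeholder encoder states
logits = model(gold, memory, mix_prob=0.5)  # (2, 7, 1000)
loss = nn.functional.cross_entropy(logits.reshape(-1, 1000), gold.reshape(-1))
```

In scheduled sampling the mixing probability typically follows a schedule, starting close to pure teacher forcing and gradually increasing the share of model predictions as training progresses.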
Original language: English
Title of host publication: Proceedings of the 57th Conference of the Association for Computational Linguistics, ACL 2019
Publisher: Association for Computational Linguistics
Pages: 351-356
ISBN (Electronic): 9781950737475
Publication status: Published - 2019
MoE publication type: A4 Conference publication
Event: Annual Meeting of the Association for Computational Linguistics: Student Research Workshop - Florence, Italy
Duration: 28 Jul 2019 - 2 Aug 2019

Workshop

Workshop: Annual Meeting of the Association for Computational Linguistics
Abbreviated title: SRW
Country/Territory: Italy
City: Florence
Period: 28/07/2019 - 02/08/2019
