TheanoLM - An extensible toolkit for neural network language modeling

Research output: Article in a book/conference proceedings, peer-reviewed

Researchers

Organisations

Description

We present a new tool for training neural network language models (NNLMs), scoring sentences, and generating text. The tool is written using the Python library Theano, which allows researchers to easily extend it and tune any aspect of the training process. Despite this flexibility, Theano is able to generate extremely fast native code that can utilize a GPU or multiple CPU cores to parallelize the heavy numerical computations. The tool has been evaluated in difficult Finnish and English conversational speech recognition tasks, and significant improvement was obtained over our best back-off n-gram models. The results that we obtained in the Finnish task were compared to those from the existing RNNLM and RWTHLM toolkits and found to be as good as or better, while training times were an order of magnitude shorter.
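As an illustration of the Theano workflow the abstract refers to, the sketch below (written for this page, not code taken from the TheanoLM toolkit; variable names and dimensions are arbitrary) defines a symbolic expression graph and compiles it into a callable function, which is the mechanism that lets Theano emit fast native code for a GPU or multiple CPU cores.

    # Minimal sketch of the Theano compile workflow (illustrative only).
    import numpy as np
    import theano
    import theano.tensor as T

    # Symbolic input: a minibatch of feature vectors (nothing is computed yet).
    x = T.matrix('x')  # shape: (batch_size, input_dim)

    # Shared weight matrix, e.g. a projection layer of a neural network LM.
    W = theano.shared(
        np.random.randn(100, 50).astype(theano.config.floatX), name='W')

    # Symbolic expression describing the computation.
    hidden = T.tanh(T.dot(x, W))

    # Compiling the graph produces optimized native code behind this function.
    forward = theano.function(inputs=[x], outputs=hidden)

    batch = np.random.randn(16, 100).astype(theano.config.floatX)
    print(forward(batch).shape)  # (16, 50)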

Details

Original language: English
Title: Proceedings of the 17th Annual Conference of the International Speech Communication Association (INTERSPEECH)
Subtitle: San Francisco, USA, Sept. 8-12
Status: Published - 2016
OKM publication type: A4 Article in conference proceedings
Event: Interspeech - San Francisco, United States
Duration: 8 September 2016 - 12 September 2016
Conference number: 17

Publication series

Name: Proceedings of the Annual Conference of the International Speech Communication Association
Publisher: International Speech Communication Association
ISSN (print): 1990-9770
ISSN (electronic): 2308-457X

Conference

Conference: Interspeech
Country: United States
City: San Francisco
Period: 08/09/2016 - 12/09/2016

