TheanoLM - An extensible toolkit for neural network language modeling

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › peer-review




We present a new tool for training neural network language models (NNLMs), scoring sentences, and generating text. The tool has been written using the Python library Theano, which allows researchers to easily extend it and tune any aspect of the training process. Despite this flexibility, Theano is able to generate extremely fast native code that can utilize a GPU or multiple CPU cores to parallelize the heavy numerical computations. The tool has been evaluated in difficult Finnish and English conversational speech recognition tasks, and a significant improvement was obtained over our best back-off n-gram models. The results we obtained on the Finnish task were compared to those from the existing RNNLM and RWTHLM toolkits and were found to be as good or better, while training times were an order of magnitude shorter.
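As a rough illustration of the sentence-scoring task described above (a generic sketch, not TheanoLM's actual API or model): a language model assigns each word a conditional probability given its history, and the sentence score is the sum of the per-word log-probabilities. The `cond_prob` callback and the toy uniform model below are hypothetical stand-ins for a trained network's softmax output.

```python
import math

def sentence_log_prob(words, cond_prob):
    """Score a sentence as the sum of per-word conditional
    log-probabilities log P(w_i | w_1 .. w_{i-1}).

    cond_prob(history, word) is any language-model callback; in an
    NNLM it would be the softmax output for `word` given `history`.
    """
    total = 0.0
    history = []
    for word in words:
        total += math.log(cond_prob(tuple(history), word))
        history.append(word)
    return total

# Toy uniform model over a 4-word vocabulary (illustrative stand-in).
uniform = lambda history, word: 0.25

score = sentence_log_prob(["this", "is", "a", "test"], uniform)
# Four words, each with probability 0.25, so score = 4 * log(0.25)
```

In a real toolkit the per-word probabilities come from the network, and the log-domain sum avoids the numerical underflow that multiplying many small probabilities would cause.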


Original language: English
Title of host publication: Proceedings of the 17th Annual Conference of the International Speech Communication Association (INTERSPEECH)
Subtitle of host publication: San Francisco, USA, Sept. 8-12
Publication status: Published - 2016
MoE publication type: A4 Article in a conference publication
Event: Interspeech - San Francisco, United States
Duration: 8 Sep 2016 - 12 Sep 2016
Conference number: 17

Publication series

Name: Proceedings of the Annual Conference of the International Speech Communication Association
Publisher: International Speech Communication Association
ISSN (Print): 1990-9770
ISSN (Electronic): 2308-457X


Country: United States
City: San Francisco

Research areas

  • Artificial neural networks
  • Automatic speech recognition
  • Conversational language
  • Language modeling

