dc.contributor.author: Zuo, L.
dc.contributor.author: Wan, X.
dc.contributor.author: Liu, Jian
dc.date.accessioned: 2018-12-13T09:08:43Z
dc.date.available: 2018-12-13T09:08:43Z
dc.date.created: 2018-12-12T02:46:48Z
dc.date.issued: 2016
dc.identifier.citation: Zuo, L. and Wan, X. and Liu, J. 2016. Comparison of Various Neural Network Language Models in Speech Recognition, pp. 894-898.
dc.identifier.uri: http://hdl.handle.net/20.500.11937/71074
dc.identifier.doi: 10.1109/ICISCE.2016.195
dc.description.abstract:

© 2016 IEEE. In recent years, research on language modeling for speech recognition has increasingly focused on the application of neural networks. However, the performance of neural network language models depends strongly on their architecture. Three competing concepts have been developed: first, feed-forward neural networks representing an n-gram approach; second, recurrent neural networks that can learn context dependencies spanning more than a fixed number of predecessor words; and third, long short-term memory (LSTM) neural networks, which can fully exploit long-range correlations in a telephone conversation corpus. In this paper, we compare count models to feed-forward, recurrent, and LSTM neural network language models on conversational telephone speech recognition tasks. Furthermore, we put forward a language model estimation method that incorporates information from preceding (history) sentences. We evaluate the models in terms of perplexity and word error rate, experimentally validating the strong correlation between the two quantities, which we find to hold regardless of the underlying type of language model. The experimental results show that the LSTM neural network language model performs best in n-best list rescoring. Compared to first-pass decoding, the relative reduction in average word error rate is 4.3% when rescoring with ten candidate results on conversational telephone speech recognition tasks.
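
As a rough illustration of the n-best list rescoring described in the abstract, the sketch below combines a first-pass acoustic score with a language-model score and keeps the best-scoring hypothesis. It is not the authors' implementation: the names lm_logprob, perplexity, rescore_nbest and the lm_weight parameter are hypothetical stand-ins, and the stand-in language model assigns a fixed per-word cost where a trained LSTM would normally be queried.

import math

def lm_logprob(words):
    # Stand-in for an LSTM language model: a real system would return the
    # log-probability of the word sequence from the trained network.
    return -2.0 * len(words)  # placeholder: constant per-word log-cost

def perplexity(words):
    # Perplexity of a word sequence under the (stand-in) language model.
    return math.exp(-lm_logprob(words) / max(len(words), 1))

def rescore_nbest(nbest, lm_weight=0.5):
    # nbest: list of (words, acoustic_logprob) pairs, e.g. the ten
    # first-pass candidates per utterance mentioned in the abstract.
    # Returns the pair maximising acoustic score + weighted LM score.
    return max(nbest, key=lambda item: item[1] + lm_weight * lm_logprob(item[0]))

if __name__ == "__main__":
    nbest = [(["i", "want", "to", "go"], -12.3),
             (["i", "want", "to", "know"], -12.9)]
    best_words, _ = rescore_nbest(nbest)
    print(" ".join(best_words), round(perplexity(best_words), 2))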

dc.title: Comparison of Various Neural Network Language Models in Speech Recognition
dc.type: Conference Paper
dcterms.source.startPage: 894
dcterms.source.endPage: 898
dcterms.source.title: Proceedings - 2016 3rd International Conference on Information Science and Control Engineering, ICISCE 2016
dcterms.source.series: Proceedings - 2016 3rd International Conference on Information Science and Control Engineering, ICISCE 2016
dcterms.source.isbn: 9781509025350
curtin.department: WASM: Minerals, Energy and Chemical Engineering (WASM-MECE)
curtin.accessStatus: Fulltext not available

