
Recurrent Neural Network based Probabilistic Language Model

Speech Recognition with Probabilistic Language Model

AV Akademikerverlag (25.10.2017)

€ 23,90

Statistical n-gram language models are widely used and achieve state-of-the-art performance in continuous speech recognition systems. In a domain-specific scenario, speakers use widely varying word sequences to express the same context, yet holding every possible sequence in the training corpora to estimate n-gram probabilities is practically infeasible. Capturing long-distance dependencies from a sequence is an important feature of a language model, as it allows a non-zero probability to be assigned to a sparse sequence during recognition. A simple back-off n-gram model struggles to estimate probabilities for sparse data as the n-gram order increases. Moreover, deducing knowledge from training patterns can help a language model generalize to an unknown sequence or word through its linguistic properties, such as part of speech (noun), number (singular or plural), or position in a sentence. Because its generalization is weak, an n-gram model needs a very large corpus for training. A simple recurrent neural network based language model is proposed here to efficiently overcome these difficulties for domain-based corpora.
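To make the back-off idea concrete, here is a minimal sketch (not the book's implementation) of a "stupid back-off" bigram scorer in plain Python: when a bigram was seen in training it uses the bigram estimate, and for a sparse (unseen) bigram it backs off to a discounted unigram estimate, so the score is never zero. The function names and the discount factor `alpha` are illustrative assumptions.

```python
from collections import Counter

def train_counts(corpus):
    """Count unigrams and bigrams from a list of tokenised sentences,
    padding each sentence with start/end markers."""
    uni, bi = Counter(), Counter()
    for sent in corpus:
        toks = ["<s>"] + sent + ["</s>"]
        uni.update(toks)
        bi.update(zip(toks, toks[1:]))
    return uni, bi

def backoff_score(w_prev, w, uni, bi, alpha=0.4):
    """Stupid-backoff score: relative bigram frequency when the bigram
    was observed, otherwise a discounted unigram frequency (non-zero)."""
    if bi[(w_prev, w)] > 0:
        return bi[(w_prev, w)] / uni[w_prev]
    return alpha * uni[w] / sum(uni.values())

corpus = [["the", "cat", "sat"], ["the", "dog", "sat"]]
uni, bi = train_counts(corpus)
seen = backoff_score("cat", "sat", uni, bi)    # observed bigram
unseen = backoff_score("dog", "cat", uni, bi)  # backs off, still > 0
```

Even this crude scheme never returns zero for an unseen pair, but the back-off estimate ignores context entirely, which is exactly the weakness a recurrent neural network language model addresses by conditioning on the full history through its hidden state.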

Book details:

ISBN-13: 978-620-2-20544-3
ISBN-10: 620220544X
EAN: 9786202205443
Book language: English
By (author): Sathyanarayanan Kuppusami
Number of pages: 60
Published on: 25.10.2017
Category: Internet