Presentation 1998/6/12
Estimating Entropy of Language from Word Insertion Penalty
Atsunori OGAWA, Kazuya TAKEDA, Fumitada ITAKURA
Abstract(in English) Optimization of the Word Insertion Penalty (WIP) is discussed from the viewpoint of the entropy of a language model, by 1) relating the optimal value of the WIP to the difference between the true and model entropies, and 2) confirming this relation through recognition experiments. The difference between the 'true' entropies estimated from the bigram and trigram models, i.e. 0.04 [bit], is much smaller than the difference between their test-set entropies, i.e. 0.2 [bit]. This result confirms that the WIP functions to compensate for the entropy of the model.
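The relation stated in the abstract can be illustrated with the usual log-domain decoding score used in continuous speech recognition. The following is a minimal sketch, not taken from the report: the LM weight \lambda, the hypothesis length N(W), the sign convention of the per-word term, and the per-word entropies H_{test}, H_{true} (in bits) are all assumed notation.

\hat{W} = \arg\max_{W}\Bigl[\log P(X \mid W) + \lambda \log P_{\mathrm{LM}}(W) + N(W)\,\mathrm{WIP}\Bigr]

If the per-word term offsets the model's excess code length relative to the language's true entropy, i.e. \log P_{\mathrm{LM}}(W) \approx -N(W)\,H_{\mathrm{test}}\ln 2 while the true distribution would give \approx -N(W)\,H_{\mathrm{true}}\ln 2, one would roughly expect

\mathrm{WIP}_{\mathrm{opt}} \approx \lambda \ln 2\,\bigl(H_{\mathrm{test}} - H_{\mathrm{true}}\bigr)
\quad\Longrightarrow\quad
H_{\mathrm{true}} \approx H_{\mathrm{test}} - \frac{\mathrm{WIP}_{\mathrm{opt}}}{\lambda \ln 2}.

Under this reading, the 'true' entropy recovered from the bigram and from the trigram should nearly coincide even though their test-set entropies differ by about 0.2 [bit], which is consistent with the 0.04 [bit] gap reported in the abstract.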
Keyword(in English) Entropy of a Language / Ergodicity / Continuous Speech Recognition
Paper # SP98-31

Conference Information
Committee SP
Conference Date 1998/6/12 (1 day)

Paper Information
Registration To Speech (SP)
Language JPN
Title (in English) Estimating Entropy of Language from Word Insertion Penalty
Keyword(1) Entropy of a Language
Keyword(2) Ergodicity
Keyword(3) Continuous Speech Recognition
1st Author's Name Atsunori OGAWA
1st Author's Affiliation Graduate School of Engineering, Nagoya University
2nd Author's Name Kazuya TAKEDA
2nd Author's Affiliation Graduate School of Engineering, Nagoya University
3rd Author's Name Fumitada ITAKURA
3rd Author's Affiliation Center for Information and Media Studies, Nagoya University
Date 1998/6/12
Paper # SP98-31
Volume (vol) vol.98
Number (no) 106
Page pp.-
#Pages 6