Presentation 1999/12/20
Phonetic Tied-Mixture Model for LVCSR
Akinobu Lee, Tatsuya Kawahara, Kazuya Takeda, Kiyohiro Shikano
Abstract(in Japanese) (See Japanese page)
Abstract(in English) A phonetic tied-mixture (PTM) model for efficient large vocabulary continuous speech recognition is presented. It is synthesized from context-independent phone models with 64 mixture components per state by assigning different mixture weights according to the shared states of triphones. The mixtures are then re-estimated for optimization. The model achieves a word error rate of 7.0% on 20k-word dictation of a newspaper corpus, which is comparable to the best figure obtained by triphones of much higher resolution. Compared with conventional PTMs that share Gaussians across all states, the proposed model is easily trained and reliably estimated. Furthermore, the model enables the decoder to perform efficient Gaussian pruning. It is found that computing only two out of the 64 components does not cause any loss of accuracy. Several methods for the pruning are proposed and compared, and the best one reduced the computation to about 20%.
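The abstract describes a shared-Gaussian output model combined with top-k Gaussian pruning. Below is a minimal sketch, not the authors' implementation, of how a PTM state likelihood could be evaluated under that scheme: every triphone state of a base phone reuses one 64-component Gaussian codebook and differs only in its mixture weights, and only the best-scoring components are summed. All function names, array shapes, and the NumPy formulation are assumptions for illustration.

import numpy as np

def log_gaussians(x, means, inv_vars, log_norms):
    # Per-component diagonal-covariance Gaussian log-likelihoods of frame x.
    # means, inv_vars: (num_mix, dim); log_norms: (num_mix,) precomputed
    # normalization terms -0.5 * (dim * log(2*pi) + log|Sigma|).
    diff = x - means
    return log_norms - 0.5 * np.sum(diff * diff * inv_vars, axis=1)

def ptm_state_loglik(x, codebook, log_weights, k=2):
    # Log-likelihood of one tied-mixture HMM state.
    # codebook: (means, inv_vars, log_norms) shared by all states of a base phone.
    # log_weights: state-specific log mixture weights, shape (num_mix,).
    # k: number of top Gaussian components actually summed (Gaussian pruning);
    #    k = 2 out of 64 is the setting reported in the abstract.
    comp = log_gaussians(x, *codebook)           # scores of all 64 shared components
    top = np.argpartition(comp, -k)[-k:]         # indices of the k best components
    vals = comp[top] + log_weights[top]
    m = vals.max()
    return m + np.log(np.sum(np.exp(vals - m)))  # log-sum-exp over the survivors

In an actual decoder the per-codebook component scores would presumably be computed once per frame and cached across all triphone states sharing that codebook, which, together with the pruning, is where the saving over a full triphone mixture model comes from.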
Keyword(in Japanese) (See Japanese page)
Keyword(in English)
Paper # NLC99-100
Date of Issue

Conference Information
Committee NLC
Conference Date 1999/12/20 (1 day)
Place (in Japanese) (See Japanese page)
Place (in English)
Topics (in Japanese) (See Japanese page)
Topics (in English)
Chair
Vice Chair
Secretary
Assistant

Paper Information
Registration To Natural Language Understanding and Models of Communication (NLC)
Language JPN
Title (in Japanese) (See Japanese page)
Sub Title (in Japanese) (See Japanese page)
Title (in English) Phonetic Tied-Mixture Model for LVCSR
Sub Title (in English)
Keyword(1)
1st Author's Name Akinobu Lee
1st Author's Affiliation
2nd Author's Name Tatsuya Kawahara
2nd Author's Affiliation
3rd Author's Name Kazuya Takeda
3rd Author's Affiliation
4th Author's Name Kiyohiro Shikano
4th Author's Affiliation
Date 1999/12/20
Paper # NLC99-100
Volume (vol) vol.99
Number (no) 523
Page pp.-
#Pages 6
Date of Issue