Presentation 2002/1/22
A variational Bayes learning for normalized Gaussian network
Shin Ishii, Masa-aki Sato
Abstract(in Japanese) (See Japanese page)
Abstract(in English) The normalized Gaussian network (NGnet), which is a function approximator, can be formulated as a mixture model of exponential distributions. Its inference can then be carried out by variational Bayes (VB) learning, a natural extension of the expectation-maximization (EM) algorithm for maximum likelihood inference. This study introduces a hierarchical prior distribution and a confidence parameter into the variational Bayes learning. In addition, a hierarchical model selection heuristic is introduced. When applied to a simple two-dimensional function approximation problem and to the reconstruction of a chaotic dynamical system, the modified VB method achieves good results.
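Note: as background for the abstract above, the following is a minimal NumPy sketch (not taken from the paper) of how a normalized Gaussian network produces its output as a set of local linear models gated by normalized Gaussian activations. All variable names (mus, covs, Ws, bs) are illustrative placeholders, not the authors' notation.

```python
import numpy as np

def ngnet_output(x, mus, covs, Ws, bs):
    """Evaluate an NGnet with M units at a single d-dimensional input x.

    mus:  (M, d) Gaussian centers
    covs: (M, d, d) Gaussian covariances
    Ws:   (M, k, d) linear weights of each unit
    bs:   (M, k) biases of each unit
    """
    M, d = mus.shape
    dens = np.empty(M)
    for i in range(M):
        diff = x - mus[i]
        inv = np.linalg.inv(covs[i])
        norm = np.sqrt(((2 * np.pi) ** d) * np.linalg.det(covs[i]))
        dens[i] = np.exp(-0.5 * diff @ inv @ diff) / norm
    weights = dens / dens.sum()                    # normalized Gaussian activations
    outputs = np.einsum('ikd,d->ik', Ws, x) + bs   # per-unit linear predictions
    return weights @ outputs                       # softly gated combination
```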
Keyword(in Japanese) (See Japanese page)
Keyword(in English) Bayes learning / Variational Bayes learning / EM algorithm / Normalized Gaussian network / Model selection
Paper #
Date of Issue

Conference Information
Committee NC
Conference Date 2002/1/22 (1 day)
Place (in Japanese) (See Japanese page)
Place (in English)
Topics (in Japanese) (See Japanese page)
Topics (in English)
Chair
Vice Chair
Secretary
Assistant

Paper Information
Registration To Neurocomputing (NC)
Language JPN
Title (in Japanese) (See Japanese page)
Sub Title (in Japanese) (See Japanese page)
Title (in English) A variational Bayes learning for normalized Gaussian network
Sub Title (in English)
Keyword(1) Bayes learning
Keyword(2) Variational Bayes learning
Keyword(3) EM algorithm
Keyword(4) Normalized Gaussian network
Keyword(5) Model selection
1st Author's Name Shin Ishii
1st Author's Affiliation Nara Institute of Science and Technology:CREST Doya Project, Japan Science and Technology Corporation
2nd Author's Name Masa-aki Sato
2nd Author's Affiliation Advanced Telecommunication Research Institute International (ATR):CREST Doya Project, Japan Science and Technology Corporation
Date 2002/1/22
Paper #
Volume (vol) vol.101
Number (no) 616
Page
#Pages 8
Date of Issue