Presentation 2001/3/16
H_∞-Learning : Global Optimization Approach
Kiyoshi NISIYAMA, Kiyohiko SUZUKI
Abstract(in Japanese) (See Japanese page)
Abstract(in English) The back propagation (BP) method is widely known as a learning algorithm for layered neural networks. However, its learning speed is too slow, and it is strongly affected by the initial values of the weight coefficients and thresholds. In this paper, H_∞-learning of layered neural networks is proposed, and a new learning algorithm, called the g-EHF algorithm, is derived from the H_∞-learning and compared with the back propagation (BP) and extended Kalman filter (EKF) learning algorithms. The robustness of H_∞-learning to variations in both the initial weights and incorrect teacher data is verified by computer simulations. (An illustrative H_∞-filter-style weight update is sketched below, after this record header.)
Keyword(in Japanese) (See Japanese page)
Keyword(in English) learning algorithm / H_∞ theory / neural network / robust / back propagation / Kalman filter
Paper # NC2000-158
Date of Issue
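
The abstract describes training layered neural networks with an H_∞ recursion instead of back propagation or the EKF. The following is a minimal sketch of a generic extended H_∞-filter-style weight update for a one-hidden-layer network on a toy regression task. It is not the paper's g-EHF algorithm; the network size, the teacher signal, and the tuning constants gamma, q, and r are arbitrary assumptions for illustration. The -gamma^{-2} term in the Riccati-like recursion is what distinguishes the H_∞ gain from the Kalman (EKF) gain.

import numpy as np

# Illustrative sketch only: a generic extended H-infinity-filter-style weight
# update for a one-hidden-layer network, in the spirit of EKF training of
# neural networks.  This is NOT the paper's g-EHF algorithm; the network
# size, the toy teacher signal, and the constants gamma, q, r below are
# arbitrary assumptions, and the H-infinity existence condition on P is not
# checked.

rng = np.random.default_rng(0)

n_in, n_hid = 1, 5
n_w = n_hid * (n_in + 1) + (n_hid + 1)      # flattened weights and biases

def unpack(w):
    W1 = w[:n_hid * n_in].reshape(n_hid, n_in)
    b1 = w[n_hid * n_in:n_hid * (n_in + 1)]
    W2 = w[n_hid * (n_in + 1):-1]
    b2 = w[-1]
    return W1, b1, W2, b2

def forward(w, x):
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(W1 @ x + b1)
    return W2 @ h + b2

def jacobian(w, x):
    # Analytic d(output)/d(weights) for the scalar-output network.
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(W1 @ x + b1)
    dh = 1.0 - h ** 2                        # derivative of tanh
    dW1 = np.outer(W2 * dh, x).ravel()
    db1 = W2 * dh
    dW2 = h
    db2 = np.array([1.0])
    return np.concatenate([dW1, db1, dW2, db2])

gamma, q, r = 5.0, 1e-4, 1e-2                # assumed tuning constants
w = 0.1 * rng.standard_normal(n_w)           # initial weights
P = np.eye(n_w)

for step in range(2000):
    x = rng.uniform(-1.0, 1.0, size=n_in)
    y = np.sin(np.pi * x[0])                 # toy teacher signal
    y_hat = forward(w, x)
    H = jacobian(w, x)[None, :]              # 1 x n_w Jacobian row

    # The -gamma^{-2} term below is what distinguishes the H-infinity
    # recursion from the Kalman (EKF) recursion.
    M = np.eye(n_w) - (gamma ** -2) * P + (H.T @ H / r) @ P
    P_bar = P @ np.linalg.inv(M)
    K = P_bar @ H.T / r                      # gain vector
    w = w + (K * (y - y_hat)).ravel()        # weight update
    P = P_bar + q * np.eye(n_w)

for xv in (-0.5, 0.0, 0.5):                  # quick check of the fit
    print(xv, float(forward(w, np.array([xv]))))

Taking gamma to infinity removes the -gamma^{-2} term and recovers an EKF/recursive-least-squares style update, which is one way to see the two recursions as members of the same family.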

Conference Information
Committee NC
Conference Date 2001/3/16 (1 day)
Place (in Japanese) (See Japanese page)
Place (in English)
Topics (in Japanese) (See Japanese page)
Topics (in English)
Chair
Vice Chair
Secretary
Assistant

Paper Information
Registration To Neurocomputing (NC)
Language JPN
Title (in Japanese) (See Japanese page)
Sub Title (in Japanese) (See Japanese page)
Title (in English) H_∞-Learning : Global Optimization Approach
Sub Title (in English)
Keyword(1) learning algorithm
Keyword(2) H_∞ theory
Keyword(3) neural network
Keyword(4) robust
Keyword(5) back propagation
Keyword(6) Kalman filter
1st Author's Name Kiyoshi NISIYAMA
1st Author's Affiliation Department of Computer and Information Science, Iwate University
2nd Author's Name Kiyohiko SUZUKI
2nd Author's Affiliation Department of Computer and Information Science, Iwate University
Date 2001/3/16
Paper # NC2000-158
Volume (vol) vol.100
Number (no) 688
Page pp.-
#Pages 6
Date of Issue