Presentation 1997/10/20
Learning Processes of Neural Networks and Error Measures
Sumiyoshi Fujiki, M. Nahomi Fujiki, Mitsuyuki Nakao
Abstract(in Japanese) (See Japanese page)
Abstract(in English) In the learning process of the conventional back-propagation algorithm, trapping in metastable states occurs when the quadratic error measure is used. Numerical simulations show that using the Kullback divergence as the error measure improves the learning efficiency significantly, regardless of the number of internal neurons, and that an optimal network size is achieved automatically. The Kullback divergence is thus a superior error measure in two senses of scalability: with respect to the complexity of the problem, and with respect to excess network size.
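As a point of reference (not part of the report itself), the following is a minimal Python sketch of the two error measures compared in the abstract: the quadratic error and the Kullback divergence for sigmoid output units. The function names and the single-layer-output setup are illustrative assumptions, not the authors' code.

import numpy as np

def quadratic_error(t, o):
    # Quadratic error measure: E = 1/2 * sum_k (t_k - o_k)^2
    return 0.5 * np.sum((t - o) ** 2)

def kullback_divergence(t, o, eps=1e-12):
    # Kullback divergence between target t and sigmoid output o, per unit:
    # E = sum_k [ t_k ln(t_k/o_k) + (1 - t_k) ln((1 - t_k)/(1 - o_k)) ]
    # (eps-clipping added here only to avoid log(0) in this sketch)
    o = np.clip(o, eps, 1.0 - eps)
    t = np.clip(t, eps, 1.0 - eps)
    return np.sum(t * np.log(t / o) + (1.0 - t) * np.log((1.0 - t) / (1.0 - o)))

# With a sigmoid output o = 1/(1 + exp(-a)), the output-layer delta is:
#   quadratic error:      delta = (o - t) * o * (1 - o)
#   Kullback divergence:  delta = o - t
# Under the quadratic measure the factor o*(1-o) vanishes for saturated
# units, one known mechanism behind trapping in flat (metastable) regions;
# the Kullback divergence cancels this factor.
t = np.array([1.0, 0.0])
o = np.array([0.6, 0.4])  # hypothetical network outputs
print(quadratic_error(t, o), kullback_divergence(t, o))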
Keyword(in Japanese) (See Japanese page)
Keyword(in English) Layered Neural Network / Learning Process / Error back-propagation / Kullback's Measure
Paper # NC97-43
Date of Issue

Conference Information
Committee NC
Conference Date 1997/10/20 (1 day)
Place (in Japanese) (See Japanese page)
Place (in English)
Topics (in Japanese) (See Japanese page)
Topics (in English)
Chair
Vice Chair
Secretary
Assistant

Paper Information
Registration To Neurocomputing (NC)
Language JPN
Title (in Japanese) (See Japanese page)
Sub Title (in Japanese) (See Japanese page)
Title (in English) Learning Processes of Neural Networks and Error Measures
Sub Title (in English)
Keyword(1) Layered Neural Network
Keyword(2) Learning Process
Keyword(3) Error back-propagation
Keyword(4) Kullback's Measure
1st Author's Name Sumiyoshi Fujiki
1st Author's Affiliation GSIS, Tohoku University
2nd Author's Name M. Nahomi Fujiki
2nd Author's Affiliation Sendai National College of Technology
3rd Author's Name Mitsuyuki Nakao
3rd Author's Affiliation GSIS, Tohoku University
Date 1997/10/20
Paper # NC97-43
Volume (vol) vol.97
Number (no) 332
Page pp.-
#Pages 8
Date of Issue