Presentation 1997/11/17
Noise Suppression of Training Data and Generalization Ability
Akiko NAKASHIMA, Hidemitsu OGAWA
Abstract(in English) Multi-layer feedforward neural networks are trained by the error back-propagation (BP) algorithm. The algorithm minimizes the training error, so in the case of noisy training data a trained network memorizes the noisy outputs for the given inputs. In order to suppress the noise in training data, we previously proposed error-correcting memorization learning (CML). In this paper, we evaluate the generalization ability of CML by comparing it with projection learning (PL). It is theoretically proved that, although CML merely suppresses noise in the training data, it provides the same generalization ability as PL, and a necessary and sufficient condition for this equivalence is given.
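As a rough illustration of the phenomenon the abstract starts from (not of CML or PL themselves, whose details the abstract does not give), the following minimal NumPy sketch trains a one-hidden-layer network by plain back-propagation on noisy samples of sin(2πx). The network size, target function, noise level, and learning rate are all illustrative assumptions, not values from the paper.

```python
# Minimal sketch (illustrative assumptions, not the paper's setup):
# plain BP minimizes the training error, so it fits the noise in the
# training data. CML and PL are NOT implemented here.
import numpy as np

rng = np.random.default_rng(0)

# Noisy training data: y = sin(2*pi*x) + Gaussian noise (assumed example)
x = rng.uniform(0.0, 1.0, size=(20, 1))
y = np.sin(2 * np.pi * x) + 0.3 * rng.normal(size=(20, 1))

# One-hidden-layer tanh network trained by gradient descent (BP)
W1 = rng.normal(scale=1.0, size=(1, 50)); b1 = np.zeros(50)
W2 = rng.normal(scale=0.1, size=(50, 1)); b2 = np.zeros(1)
lr = 0.05

for step in range(20000):
    h = np.tanh(x @ W1 + b1)            # forward pass
    pred = h @ W2 + b2
    err = pred - y                      # the training error BP minimizes
    # gradients of the mean squared error (up to a constant factor)
    gW2 = h.T @ err / len(x); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)    # back-propagate through tanh
    gW1 = x.T @ dh / len(x); gb1 = dh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

train_mse = float((err ** 2).mean())
print(f"training MSE after BP: {train_mse:.4f}")   # small: noise is being fit

# Generalization check against the noiseless target function
xt = np.linspace(0.0, 1.0, 200)[:, None]
test_err = np.tanh(xt @ W1 + b1) @ W2 + b2 - np.sin(2 * np.pi * xt)
print(f"MSE against noiseless target: {float((test_err ** 2).mean()):.4f}")
```

Driving the training error toward zero while the error against the noiseless target stays inflated is exactly the memorization of noise that, per the abstract, CML is designed to suppress.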
Keyword(in English) generalization / memorization learning / suppression of noise / admissibility / back-propagation
Paper # NC97-52
Date of Issue

Conference Information
Committee NC
Conference Date 1997/11/17 (1 day)
Place (in English)
Topics (in English)
Chair
Vice Chair
Secretary
Assistant

Paper Information
Registration To Neurocomputing (NC)
Language JPN
Title (in English) Noise Suppression of Training Data and Generalization Ability
Sub Title (in English)
Keyword(1) generalization
Keyword(2) memorization learning
Keyword(3) suppression of noise
Keyword(4) admissibility
Keyword(5) back-propagation
1st Author's Name Akiko NAKASHIMA
1st Author's Affiliation Department of Computer Science, Graduate School of Information Science and Engineering, Tokyo Institute of Technology
2nd Author's Name Hidemitsu OGAWA
2nd Author's Affiliation Department of Computer Science, Graduate School of Information Science and Engineering, Tokyo Institute of Technology
Date 1997/11/17
Paper # NC97-52
Volume (vol) vol.97
Number (no) 379
Page pp.-
#Pages 8
Date of Issue