Presentation 1994/6/17
Learning on perceptron: Newton's method based on improved second-order back-propagation
Takae Ohta, Yoshiaki Kawamura
Abstract(in Japanese) (See Japanese page)
Abstract(in English) If Newton's method is applied to each layer of a neural network, its excellent convergence property is usually lost. An improved method is proposed in this paper in which corrections to the back-propagation calculation are made according to the weight changes of the other layers. The back-propagation formulas for the first and second sensitivity functions are derived. Numerical simulations on the XOR problem and the 3-bit parity problem show that the improved method obtains correct networks within about 10 learning trials without fail.
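For context, the Newton update underlying the abstract is w ← w − H⁻¹∇E, where H is the Hessian of the error E. A minimal sketch on a smooth convex toy function (not the paper's per-layer scheme with second-order back-propagation; the function f, its gradient, and its Hessian below are illustrative assumptions):

```python
import math

# Illustrative Newton's-method sketch: minimize f(x, y) = cosh(x) + (y - 1)^2,
# whose unique minimum is at (0, 1). The Hessian of this f is diagonal,
# so the Newton step w <- w - H^{-1} grad(f) is element-wise here.

def grad(x, y):
    # Gradient of f: (sinh(x), 2(y - 1))
    return (math.sinh(x), 2.0 * (y - 1.0))

def hess_diag(x, y):
    # Diagonal of the Hessian of f: (cosh(x), 2)
    return (math.cosh(x), 2.0)

def newton(x, y, iters=10):
    for _ in range(iters):
        gx, gy = grad(x, y)
        hx, hy = hess_diag(x, y)
        x -= gx / hx  # Newton step per coordinate
        y -= gy / hy
    return x, y

x, y = newton(2.0, -3.0)
```

Near the minimum the error shrinks quadratically per step, which is the "excellent convergence property" the abstract refers to; the paper's contribution is preserving it when the update is applied layer by layer.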
Keyword(in Japanese) (See Japanese page)
Keyword(in English) neural networks / perceptron / back-propagation / Newton's method / sensitivity
Paper # NLP94-32
Date of Issue

Conference Information
Committee NLP
Conference Date 1994/6/17 (1 day)
Place (in Japanese) (See Japanese page)
Place (in English)
Topics (in Japanese) (See Japanese page)
Topics (in English)
Chair
Vice Chair
Secretary
Assistant

Paper Information
Registration To Nonlinear Problems (NLP)
Language JPN
Title (in Japanese) (See Japanese page)
Sub Title (in Japanese) (See Japanese page)
Title (in English) Learning on perceptron: Newton's method based on improved second-order back-propagation
Sub Title (in English)
Keyword(1) neural networks
Keyword(2) perceptron
Keyword(3) back-propagation
Keyword(4) Newton's method
Keyword(5) sensitivity
1st Author's Name Takae Ohta
1st Author's Affiliation College of Engineering, University of Osaka Prefecture
2nd Author's Name Yoshiaki Kawamura
2nd Author's Affiliation College of Engineering, University of Osaka Prefecture
Date 1994/6/17
Paper # NLP94-32
Volume (vol) vol.94
Number (no) 98
Page
#Pages 8
Date of Issue