Presentation 2011-03-11
Adaptive Learning of Activation Function of Layer Neural Network
Daisuke SEI, Masahiro NAKAMURA
Abstract(in Japanese) (See Japanese page)
Abstract(in English) In this paper, we study the effect of adaptively learning the activation function, in addition to the weights and thresholds, in back-propagation training of a layered neural network. We prepare two nonlinear separation problems and measure the learning success rate and the mean number of learning iterations, counting the iterations needed for the mean squared error (MSE) to fall below a fixed threshold within a fixed iteration limit. The results confirm that adaptive learning converges faster than the standard method, so adaptive learning of the activation function is effective for accelerating learning.
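The abstract does not spell out which activation-function parameters are adapted or how. The Python sketch below shows one common realization of the idea, assuming each neuron's sigmoid gain (slope) is trained by gradient descent together with the weights and thresholds, with the XOR problem standing in for the nonlinear separation problems mentioned above; the network size, learning rate, and stopping criterion are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR: a standard nonlinearly separable problem (stand-in for the paper's tasks)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# 2-2-1 network: weights, thresholds (biases), and per-neuron sigmoid gains
W1 = rng.normal(0, 1, (2, 2)); b1 = np.zeros(2); a1 = np.ones(2)
W2 = rng.normal(0, 1, (2, 1)); b2 = np.zeros(1); a2 = np.ones(1)

lr, max_epochs, target_mse = 1.0, 50000, 1e-3   # assumed hyperparameters
N = len(X)

for epoch in range(max_epochs):
    # forward pass with gained sigmoid f(z) = 1 / (1 + exp(-a * z))
    z1 = X @ W1 + b1
    h = sigmoid(a1 * z1)
    z2 = h @ W2 + b2
    y = sigmoid(a2 * z2)

    mse = np.mean((y - T) ** 2)
    if mse < target_mse:
        break

    # backward pass: gradients w.r.t. weights, thresholds, and gains
    err = (y - T) / N                      # d(MSE/2)/dy
    delta2 = err * a2 * y * (1 - y)        # d(MSE/2)/dz2
    grad_W2 = h.T @ delta2
    grad_b2 = delta2.sum(axis=0)
    grad_a2 = (err * y * (1 - y) * z2).sum(axis=0)

    back = delta2 @ W2.T                   # error propagated to the hidden layer
    delta1 = back * a1 * h * (1 - h)
    grad_W1 = X.T @ delta1
    grad_b1 = delta1.sum(axis=0)
    grad_a1 = (back * h * (1 - h) * z1).sum(axis=0)

    # gradient-descent update of all adaptive parameters, gains included
    W1 -= lr * grad_W1; b1 -= lr * grad_b1; a1 -= lr * grad_a1
    W2 -= lr * grad_W2; b2 -= lr * grad_b2; a2 -= lr * grad_a2

print(f"stopped at epoch {epoch}, MSE = {mse:.5f}")
```

Counting the epoch at which the MSE first drops below the threshold, over many random initializations, gives the kind of success-rate and mean-iteration comparison described in the abstract; fixing the gains at 1 recovers the standard back-propagation baseline.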
Keyword(in Japanese) (See Japanese page)
Keyword(in English) Layer Neural Network / Back Propagation
Paper # NLP2010-191
Date of Issue

Conference Information
Committee NLP
Conference Date 2011/3/3 (1 day)
Place (in Japanese) (See Japanese page)
Place (in English)
Topics (in Japanese) (See Japanese page)
Topics (in English)
Chair
Vice Chair
Secretary
Assistant

Paper Information
Registration To Nonlinear Problems (NLP)
Language JPN
Title (in Japanese) (See Japanese page)
Sub Title (in Japanese) (See Japanese page)
Title (in English) Adaptive Learning of Activation Function of Layer Neural Network
Sub Title (in English)
Keyword(1) Layer Neural Network
Keyword(2) Back Propagation
1st Author's Name Daisuke SEI
1st Author's Affiliation Nagaoka University of Technology
2nd Author's Name Masahiro NAKAMURA
2nd Author's Affiliation Nagaoka University of Technology
Date 2011-03-11
Paper # NLP2010-191
Volume (vol) vol.110
Number (no) 465
Page pp.-
#Pages 4
Date of Issue