Presentation 2008-03-12
Conversion of inner parameters to outer parameters
Yoshifusa Ito, Hiroyuki Izumi
Abstract(in Japanese) (See Japanese page)
Abstract(in English) To overcome difficulties in the learning of a three-layer neural network, the construction of a network whose hidden-layer units have smaller numbers of inner parameters has recently been proposed. Since that paper consists substantially of approximation theorems, in this paper we discuss the application of the theorems to actual neural networks, incorporating our simulation results. Although the individual hidden units of the network have fewer parameters, the number of hidden units is increased. Hence, the result implies that the inner parameters of the network are converted to outer parameters.
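The trade-off described in the abstract can be illustrated with a minimal numerical sketch (an assumption for illustration only, not the paper's actual construction): the hidden units' inner parameters (the weights and biases inside the activation) are fixed rather than trained, the number of hidden units is increased, and only the outer (output-layer) weights are fitted, here by least squares.

```python
import numpy as np

# Hedged sketch: approximate f(x) = sin(x) with a three-layer network
# whose hidden units have FIXED inner parameters (weights and biases
# inside the sigmoid).  Only the outer (output-layer) weights are
# trained, by linear least squares.  The reduced inner learning is
# compensated by a larger number of hidden units.
rng = np.random.default_rng(0)
x = np.linspace(-np.pi, np.pi, 200)
y = np.sin(x)

n_hidden = 50                      # more units, but no inner learning
w = rng.uniform(-3, 3, n_hidden)   # fixed inner weights (not trained)
b = rng.uniform(-3, 3, n_hidden)   # fixed inner biases (not trained)

# Hidden-layer activations: sigmoid(w_j * x + b_j) for each unit j
H = 1.0 / (1.0 + np.exp(-(np.outer(x, w) + b)))

# Fit only the outer weights c, so all learning is "outer"
c, *_ = np.linalg.lstsq(H, y, rcond=None)
err = np.max(np.abs(H @ c - y))
print(f"max approximation error: {err:.2e}")
```

With inner parameters frozen, training reduces to a linear problem in the outer weights, which is the sense in which learning effort moves from inner to outer parameters; the specific activation, parameter ranges, and unit count above are illustrative choices, not taken from the paper.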
Keyword(in Japanese) (See Japanese page)
Keyword(in English) Dichotomic random variables / learning / hidden-layer unit / parameter / layered neural networks
Paper # NC2007-127
Date of Issue

Conference Information
Committee NC
Conference Date 2008/3/5 (1 day)
Place (in Japanese) (See Japanese page)
Place (in English)
Topics (in Japanese) (See Japanese page)
Topics (in English)
Chair
Vice Chair
Secretary
Assistant

Paper Information
Registration To Neurocomputing (NC)
Language JPN
Title (in Japanese) (See Japanese page)
Sub Title (in Japanese) (See Japanese page)
Title (in English) Conversion of inner parameters to outer parameters
Sub Title (in English)
Keyword(1) Dichotomic random variables
Keyword(2) learning
Keyword(3) hidden-layer unit
Keyword(4) parameter
Keyword(5) layered neural networks
1st Author's Name Yoshifusa Ito
1st Author's Affiliation Department of Information and Policy Studies Aichi-Gakuin University
2nd Author's Name Hiroyuki Izumi
2nd Author's Affiliation Department of Information and Policy Studies Aichi-Gakuin University
Date 2008-03-12
Paper # NC2007-127
Volume (vol) vol.107
Number (no) 542
Page pp.-
#Pages 6
Date of Issue