Presentation 2007-12-22
Rank Reduction by Cross-Validated Backpropagation
Masashi SEKINO, Katsumi NITTA
Abstract(in Japanese) (See Japanese page)
Abstract(in English) In this paper, we propose cross-validated backpropagation for training neural networks. The experimental results show that the average generalization error of the proposed method is smaller than those of ordinary backpropagation, early stopping, and Bayes estimation; that when the rank of the true function is small, the proposed method gives the same results with large-rank models as with small-rank models; that the learning plateau observed when backpropagation is applied is also observed when the proposed method is applied; and that the proposed method also gives good approximation performance when the true function is almost unidentifiable.
Keyword(in Japanese) (See Japanese page)
Keyword(in English) Neural Network / Backpropagation / Overfitting / Cross-Validation / Reduced Rank Regression
Paper # NC2007-76
Date of Issue

Conference Information
Committee NC
Conference Date 2007/12/15 (1 day)
Place (in Japanese) (See Japanese page)
Place (in English)
Topics (in Japanese) (See Japanese page)
Topics (in English)
Chair
Vice Chair
Secretary
Assistant

Paper Information
Registration To Neurocomputing (NC)
Language JPN
Title (in Japanese) (See Japanese page)
Sub Title (in Japanese) (See Japanese page)
Title (in English) Rank Reduction by Cross-Validated Backpropagation
Sub Title (in English)
Keyword(1) Neural Network
Keyword(2) Backpropagation
Keyword(3) Overfitting
Keyword(4) Cross-Validation
Keyword(5) Reduced Rank Regression
1st Author's Name Masashi SEKINO
1st Author's Affiliation ()
2nd Author's Name Katsumi NITTA
2nd Author's Affiliation Interdisciplinary Graduate School of Science and Engineering, Tokyo Institute of Technology
Date 2007-12-22
Paper # NC2007-76
Volume (vol) vol.107
Number (no) 410
Page pp.-
#Pages 6
Date of Issue