Presentation 1998/5/25
Block Size Optimization for the Block Orthogonal Projection Algorithm
Kazushi IKEDA, Seiji MIYOSHI, Kenji NAKAYAMA,
Abstract(in English) The block orthogonal projection algorithm, originally developed for transversal filters, can be applied to the linear dichotomy (the so-called perceptron), which is nonlinear. When the block size, that is, the number of examples used in one update, is one, the algorithm is equivalent to the normalized LMS algorithm and is proven to stop in a finite number of iterations when the learning rate is unity. This report derives the block size that maximizes the convergence speed when the learning rate is unity, and confirms the result by computer simulations. The results show that a larger block size is not necessarily better.
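The update described in the abstract, a block generalization of the unit-learning-rate NLMS rule, can be sketched as below. This is an illustrative reconstruction, not the authors' exact formulation: the function name, the choice of which misclassified examples form a block, and the epoch-based stopping criterion are all assumptions.

```python
import numpy as np

def block_projection_perceptron(X, y, block_size, max_epochs=100):
    """Train a perceptron with a block orthogonal projection update
    (learning rate fixed at unity). Hypothetical sketch only."""
    n_samples, dim = X.shape
    w = np.zeros(dim)
    for _ in range(max_epochs):
        # collect indices of currently misclassified examples
        mis = [i for i in range(n_samples) if y[i] * (w @ X[i]) <= 0]
        if not mis:
            return w  # all examples classified: the algorithm stops
        blk = mis[:block_size]
        A = X[blk]                  # (k, dim) block of input vectors
        e = y[blk] - A @ w          # block error vector
        # project w orthogonally onto the affine set {v : A v = y_blk};
        # pinv handles rank-deficient blocks gracefully
        w = w + np.linalg.pinv(A) @ e
    return w
```

With `block_size = 1` each update reduces to the normalized LMS step `w += (y_i - w @ x_i) * x_i / ||x_i||**2`, matching the equivalence stated in the abstract.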
Keyword(in English) Perceptron / Block Orthogonal Projection Algorithm / Convergence Rate
Paper #
Date of Issue

Conference Information
Committee NC
Conference Date 1998/5/25 (1 day)
Place (in Japanese)
Place (in English)
Topics (in Japanese)
Topics (in English)
Chair
Vice Chair
Secretary
Assistant

Paper Information
Registration To Neurocomputing (NC)
Language JPN
Title (in Japanese)
Sub Title (in Japanese)
Title (in English) Block Size Optimization for the Block Orthogonal Projection Algorithm
Sub Title (in English)
Keyword(1) Perceptron
Keyword(2) Block Orthogonal Projection Algorithm
Keyword(3) Convergence Rate
1st Author's Name Kazushi IKEDA
1st Author's Affiliation Grad. Sch. Informatics, Kyoto Univ.
2nd Author's Name Seiji MIYOSHI
2nd Author's Affiliation Kobe City College of Tech.
3rd Author's Name Kenji NAKAYAMA
3rd Author's Affiliation Fac.Eng., Kanazawa Univ.
Date 1998/5/25
Paper #
Volume (vol) vol.98
Number (no) 77
Page pp.-
#Pages 8
Date of Issue