Presentation 1999/5/20
A Larger Family of Objective Functions to which Hopfield Neural Networks can give Globally Optimal Solutions
Yoshinori Uesaka,
Abstract(in Japanese) (See Japanese page)
Abstract(in English) A family of objective functions is discussed for which the conjecture that the globally optimal solution (not merely a local one) of a Hopfield neural network can be obtained by starting from an initial point sufficiently close to the origin truly holds. For this conjecture, Nishi (1998) gave an interesting family of objective functions through eigenvector analysis. The present paper extends his result to a larger family, suggesting that the conjecture holds for a wide range of objective functions.
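The setting described in the abstract can be illustrated with a minimal sketch (this is not the paper's construction; the weights, gain, and step size are hypothetical choices): a two-unit continuous Hopfield network with symmetric weights whose dynamics descend the quadratic objective E(x) = -(1/2) xᵀWx, started from a point close to the origin.

```python
import math

# Illustrative sketch only: a two-unit continuous Hopfield network with
# symmetric weights W, tanh activation, and gradient-style dynamics
#   du_i/dt = -u_i + sum_j W_ij * x_j,   x_i = tanh(gain * u_i),
# minimizing E(x) = -1/2 * x^T W x. The trajectory starts near the
# origin, as in the conjecture discussed in the abstract.

W = [[0.0, 1.0],
     [1.0, 0.0]]  # symmetric coupling, zero diagonal

def energy(x):
    return -0.5 * sum(W[i][j] * x[i] * x[j]
                      for i in range(2) for j in range(2))

def run(u0, steps=200, dt=0.1, gain=2.0):
    u = list(u0)  # internal (pre-activation) state
    for _ in range(steps):
        x = [math.tanh(gain * ui) for ui in u]
        # Euler step of du_i/dt = -u_i + sum_j W_ij x_j
        u = [ui + dt * (-ui + sum(W[i][j] * x[j] for j in range(2)))
             for i, ui in enumerate(u)]
    return [math.tanh(gain * ui) for ui in u]

# Start from a small perturbation of the origin; for this W the
# trajectory settles into the low-energy state with both units positive.
x_final = run([0.01, 0.02])
```

With this W the two energy minima sit near (1, 1) and (-1, -1); the component of the initial point along the dominant eigenvector of W selects which one is reached, which is the kind of eigenvector-based behavior the abstract attributes to Nishi's analysis.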
Keyword(in Japanese) (See Japanese page)
Keyword(in English) Combinatorial optimization / Global optimum / Hopfield neural networks / Dynamical system
Paper # NC99-3
Date of Issue

Conference Information
Committee NC
Conference Date 1999/5/20 (1 day)
Place (in Japanese) (See Japanese page)
Place (in English)
Topics (in Japanese) (See Japanese page)
Topics (in English)
Chair
Vice Chair
Secretary
Assistant

Paper Information
Registration To Neurocomputing (NC)
Language JPN
Title (in Japanese) (See Japanese page)
Sub Title (in Japanese) (See Japanese page)
Title (in English) A Larger Family of Objective Functions to which Hopfield Neural Networks can give Globally Optimal Solutions
Sub Title (in English)
Keyword(1) Combinatorial optimization
Keyword(2) Global optimum
Keyword(3) Hopfield neural networks
Keyword(4) Dynamical system
1st Author's Name Yoshinori Uesaka
1st Author's Affiliation Department of Information Sciences, Faculty of Science and Technology, Science University of Tokyo
Date 1999/5/20
Paper # NC99-3
Volume (vol) vol.99
Number (no) 58
Page pp.-
#Pages 8
Date of Issue