Presentation 2000/6/16
Generalization of Complete Stability Conditions of Neural Networks with Piecewise Linear Output Function
Norikazu Takahashi
Abstract(in English) A recurrent neural network is said to be completely stable if its state trajectory converges to an equilibrium point for any initial condition. The author recently derived a sufficient condition for recurrent neural networks with the piecewise linear output function to be completely stable. In this report, a new complete stability condition is given that generalizes the above sufficient condition. Whereas most conventional stability criteria were obtained by constructing Lyapunov functions, the derivation here relies on a convergence theorem for the Gauss-Seidel method, a well-known iterative technique for solving linear algebraic equations.
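The following is a minimal Python sketch, not taken from the report, that illustrates the two ingredients the abstract refers to: the standard piecewise linear (saturation) output function f(x) = (|x+1| - |x-1|)/2 and the Gauss-Seidel iteration for solving linear equations. The network model dx/dt = -x + T f(x) + b and the example matrices T, A and bias b are assumptions chosen for illustration only, not the author's exact formulation or conditions.

import numpy as np

# Piecewise linear (saturation) output function, f(x) = (|x+1| - |x-1|) / 2.
def piecewise_linear(x):
    return 0.5 * (np.abs(x + 1.0) - np.abs(x - 1.0))

# Gauss-Seidel iteration for A x = b: sweep through the components,
# always using the most recently updated values.
def gauss_seidel(A, b, x0=None, tol=1e-10, max_iter=1000):
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float)
    x = np.zeros_like(b) if x0 is None else np.asarray(x0, dtype=float).copy()
    for _ in range(max_iter):
        x_prev = x.copy()
        for i in range(len(b)):
            s = A[i, :i] @ x[:i] + A[i, i + 1:] @ x[i + 1:]
            x[i] = (b[i] - s) / A[i, i]
        if np.max(np.abs(x - x_prev)) < tol:
            break
    return x

# Euler simulation of an assumed network model dx/dt = -x + T f(x) + b.
# Complete stability means every such trajectory settles at an equilibrium point.
def simulate(T, b, x0, dt=0.01, steps=20000):
    T = np.asarray(T, dtype=float)
    b = np.asarray(b, dtype=float)
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(steps):
        x += dt * (-x + T @ piecewise_linear(x) + b)
    return x

if __name__ == "__main__":
    A = np.array([[4.0, 1.0], [2.0, 5.0]])   # diagonally dominant, so Gauss-Seidel converges
    print(gauss_seidel(A, np.array([1.0, 2.0])))
    T = np.array([[0.5, 0.2], [0.2, 0.5]])   # hypothetical symmetric connection matrix
    print(simulate(T, np.array([0.1, -0.1]), np.array([0.3, -0.7])))

Running the script solves the small linear system by Gauss-Seidel sweeps and shows one trajectory of the piecewise linear network settling toward a fixed point; the report's contribution is a sufficient condition guaranteeing such convergence for every initial state.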
Keyword(in English) recurrent neural networks / complete stability / Gauss-Seidel method
Paper # CAS2000-29,VLD2000-38,DSP2000-50

Conference Information
Committee CAS
Conference Date 2000/6/16 (1 day)

Paper Information
Registration To Circuits and Systems (CAS)
Language JPN
Title (in English) Generalization of Complete Stability Conditions of Neural Networks with Piecewise Linear Output Function
Keyword(1) recurrent neural networks
Keyword(2) complete stability
Keyword(3) Gauss-Seidel method
1st Author's Name Norikazu Takahashi
1st Author's Affiliation Department of Computer Science and Communication Engineering, Kyushu University
Date 2000/6/16
Paper # CAS2000-29,VLD2000-38,DSP2000-50
Volume (vol) vol.100
Number (no) 119
Page pp.-
#Pages 6