Presentation 1993/9/22
On the realization of automata by recurrent higher order neural networks
Ken Tanaka, Itsuo Kumazawa
Abstract(in Japanese) (See Japanese page)
Abstract(in English) In this paper we discuss the learning of formal languages by recurrent higher-order neural networks. First, we survey some past studies and point out a fundamental difficulty in the learning process. Second, we apply a neural network model that includes higher-order connection weights, and show that our model overcomes a shortcoming of past models in terms of its ability to realize any state transition function. Third, we derive a typical learning algorithm using the gradient descent method. Finally, to demonstrate the effectiveness of our model, we present computer simulation results for several examples of regular languages.
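The abstract's claim that higher-order connections can realize any state transition function can be illustrated with a small sketch. In a second-order recurrent network, the next state unit s_j is computed as sigmoid(Σ_{i,k} W[j][i][k]·s_i·x_k), so with one-hot state and input vectors a single weight W[j][i][k] directly encodes the DFA transition δ(i, k) = j. The parity-language example and all names below are illustrative, not taken from the paper:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Example DFA: parity of 1s over {0,1}.
# State 0 = even number of 1s (accepting), state 1 = odd.
delta = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}

N, M, H = 2, 2, 10.0  # number of states, input symbols, weight magnitude
# Second-order weights: large positive iff delta(i, k) == j, else large negative,
# so sigmoid saturates toward the one-hot encoding of the next state.
W = [[[H if delta[(i, k)] == j else -H for k in range(M)]
      for i in range(N)] for j in range(N)]

def run(string):
    s = [1.0, 0.0]  # start in state 0 (approximately one-hot)
    for ch in string:
        x = [1.0, 0.0] if ch == '0' else [0.0, 1.0]  # one-hot input
        # Second-order state update: s_j <- sigmoid(sum_{i,k} W[j][i][k] s_i x_k)
        s = [sigmoid(sum(W[j][i][k] * s[i] * x[k]
                         for i in range(N) for k in range(M)))
             for j in range(N)]
    return s[0] > 0.5  # accept if the even-parity unit is active

print(run("1101"))  # three 1s, odd parity  -> False
print(run("1100"))  # two 1s, even parity   -> True
```

Here the weights are set by hand to show realizability; in the paper's setting they would instead be learned by gradient descent on example strings.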
Keyword(in Japanese) (See Japanese page)
Keyword(in English) higher order connections / recurrent neural networks / regular language / Automaton
Paper # NC93-33
Date of Issue

Conference Information
Committee NC
Conference Date 1993/9/22(1days)
Place (in Japanese) (See Japanese page)
Place (in English)
Topics (in Japanese) (See Japanese page)
Topics (in English)
Chair
Vice Chair
Secretary
Assistant

Paper Information
Registration To Neurocomputing (NC)
Language JPN
Title (in Japanese) (See Japanese page)
Sub Title (in Japanese) (See Japanese page)
Title (in English) On the realization of automata by recurrent higher order neural networks
Sub Title (in English)
Keyword(1) higher order connections
Keyword(2) recurrent neural networks
Keyword(3) regular language
Keyword(4) Automaton
1st Author's Name Ken Tanaka
1st Author's Affiliation Department of Computer Science, Faculty of Engineering, Tokyo Institute of Technology
2nd Author's Name Itsuo Kumazawa
2nd Author's Affiliation Department of Computer Science, Faculty of Engineering, Tokyo Institute of Technology
Date 1993/9/22
Paper # NC93-33
Volume (vol) vol.93
Number (no) 247
Page pp.-
#Pages 8
Date of Issue