International Symposium on Nonlinear Theory and its Applications

2017

Session Number: A1L-E

Number: A1L-E-1

Sign-Invariant Unsupervised Learning Facilitates Weighted-Sum Computation in Analog Neural-Network Devices

Itaru Hida, Kodai Ueyoshi, Shinya Takamaeda-Yamazaki, Masayuki Ikebe, Masato Motomura, Tetsuya Asai

pp.82-82

Publication Date: 2017/12/4

Online ISSN: 2188-5079

DOI: 10.34385/proc.29.A1L-E-1

Summary:
Research on neural networks is growing rapidly, despite two periods of stagnation over the last half century. While theories and methods for neural networks are often developed through computer science or large-scale simulation, bringing findings from neuroscience and neurophysiology into neural networks is also an attractive approach. We propose a novel method that constrains the weights between neuron units to be sign-invariant, based on the functional classification of synapses, i.e., the distinction between excitatory and inhibitory synapses. Since weights are implemented as resistors in analog neural circuits, the proposed method of fixing the sign of each weight promises an effective simplification in the implementation of neural devices. The purpose of this study is to elucidate the effect of the proposed method on network performance. Unsupervised pre-training was first performed on a discrimination model to which the proposed method was applied, and the model was then fine-tuned. The resulting trained network showed accuracy in category classification comparable to that of ordinary networks without the sign-invariance constraint.
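The abstract does not specify how sign invariance is enforced during learning. As one plausible reading, a minimal NumPy sketch is shown below: each weight's sign is fixed at initialization (excitatory positive, inhibitory negative), and any update that would flip a sign is projected to zero. The variable names and the projection rule are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical single layer: the sign of each weight is fixed at
# initialization, mimicking excitatory (+) vs. inhibitory (-) synapses.
n_in, n_out = 8, 4
W = rng.normal(size=(n_in, n_out))
sign_mask = np.sign(W)  # fixed functional class of each synapse

def project_signs(W, sign_mask):
    """Keep each weight only if it retains its assigned sign; zero it otherwise."""
    return np.where(np.sign(W) == sign_mask, W, 0.0)

# One toy gradient step followed by the sign-invariance projection
# (the gradient here is random, standing in for backpropagation).
grad = rng.normal(size=W.shape)
W = project_signs(W - 0.1 * grad, sign_mask)

# Every surviving weight respects its assigned sign.
assert np.all((np.sign(W) == sign_mask) | (W == 0.0))
```

Under this reading, the projection plays the same role for training that a fixed-polarity resistor plays in the analog circuit: the magnitude of a connection can adapt, but its excitatory or inhibitory character cannot.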