
The 2014 International Symposium on Nonlinear Theory and its Applications

Session Number: D2L-D

Number: D2L-D4

Liquid State Machine with Heterogeneous Connections for Information Networks

Kaku Yamaguchi, Jun-nosuke Teramae, Naoki Wakamiya

pp. 799-802

Publication Date: 2014/9/14

Online ISSN: 2188-5079

DOI: 10.34385/proc.46.D2L-D4

Summary:
The Liquid State Machine (LSM) is a recently proposed model of cortical computation. The model consists of a random network of a large number of neurons interconnected with almost uniform connection strengths. Despite the task-independent topology of the network, the LSM successfully performs computational tasks as long as the connection strengths to a readout unit are tuned properly. This implies that an information network based on the LSM does not need costly topology maintenance. Recent experiments, however, have revealed that the strengths of synaptic connections in cortical networks are far from uniform: they follow a highly heterogeneous, heavy-tailed distribution, which can support reliable spike information transmission. In this paper, we first introduce highly heterogeneous connection strengths into the LSM and show that this heterogeneity actually improves its ability to store input sequences. We then limit the number of output neurons that directly project to the readout unit. Although this degrades communication ability across the LSM, the degradation can be partly compensated for by utilizing a finite history of the states of the projection neurons.
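
To make the setup concrete, the following is a minimal sketch under stated assumptions, not the authors' implementation: a rate-based reservoir stands in for the spiking LSM, the heavy-tailed heterogeneity is modeled with log-normal weights, only a small set of projection neurons feeds a ridge-regression readout, and a finite window of their past states serves as the readout's input. All parameter values and the memory task are illustrative choices.

```python
# Minimal sketch (assumptions, not the authors' code): rate-based reservoir,
# log-normal (heavy-tailed) connection strengths, limited projection neurons,
# finite history of their states, readout tuned by ridge regression.
import numpy as np

rng = np.random.default_rng(0)

N = 200        # reservoir neurons
T = 1000       # length of the input sequence
p_conn = 0.1   # connection probability of the random network
n_proj = 20    # output neurons directly projecting to the readout
history = 5    # finite history of projection-neuron states fed to the readout
delay = 3      # memory task: recall the input from `delay` steps ago

# Random topology with heterogeneous (log-normal, heavy-tailed) strengths.
mask = rng.random((N, N)) < p_conn
W = np.where(mask, rng.lognormal(mean=-1.0, sigma=1.0, size=(N, N)), 0.0)
W *= rng.choice([-1.0, 1.0], size=(N, N))        # mix excitation and inhibition
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # keep the dynamics stable
w_in = rng.normal(size=N)

# Drive the reservoir with a random input sequence and record its states.
u = rng.normal(size=T)
x = np.zeros(N)
states = np.zeros((T, N))
for t in range(T):
    x = np.tanh(W @ x + w_in * u[t])
    states[t] = x

# The readout sees only `n_proj` neurons, but over a window of past states.
proj = states[:, :n_proj]
warmup = max(delay, history)
features = np.hstack([proj[warmup - k:T - k] for k in range(history)])
target = u[warmup - delay:T - delay]

# Tune only the readout weights, here by ridge regression.
lam = 1e-3
A = features.T @ features + lam * np.eye(features.shape[1])
w_out = np.linalg.solve(A, features.T @ target)
pred = features @ w_out
print("memory-task correlation:", np.corrcoef(pred, target)[0, 1])
```

Replacing the log-normal draw with a single constant value of comparable magnitude gives the near-uniform baseline against which the effect of heterogeneity can be compared, and shrinking `history` to 1 shows how much the finite history of projection-neuron states compensates for the limited number of projection neurons.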