Proceedings of the 2012 International Symposium on Nonlinear Theory and its Applications

Session Number: B4L-D

Number: 513

Information processing with recurrent dynamical systems: theory, characterization and experiment.

S. Massar, Y. Paquot, F. Duport, A. Smerieri, M. Massar, J. Dambre, M. Haelterman, B. Schrauwen

pp.513-514

Publication Date:

Online ISSN: 2188-5079

DOI: 10.15248/proc.1.513

Summary:
Reservoir computing is a novel machine learning technique which uses recurrent dynamical systems to process time-dependent information. It offers a promising opportunity for cross-fertilisation between machine learning and the physicists and mathematicians working on nonlinear dynamical systems. We report progress on three questions related to reservoir computing, with the aim of bridging the gap between these communities. First, we show that the mean field theory of statistical mechanics provides a simple framework for understanding the dynamics of the recurrent systems typically used for reservoir computing. Second, we ask whether the information processing carried out by a dynamical system can be characterized quantitatively. We show that the linear memory capacity of dynamical systems introduced by Jaeger can be extended to a nonlinear capacity. As an application of this concept, we show that all dynamical systems comprising the same number of internal variables have the same total capacity to process information, provided their internal variables are linearly independent and the system has fading memory. Third, we ask whether the concept of reservoir computing can be used for experimental realizations of analog information processing systems. We report an opto-electronic experimental implementation of reservoir computing which is simple to build, yet has performance comparable to state-of-the-art digital implementations.
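To make the linear memory capacity discussed above concrete, the following sketch estimates it for a small random echo state network: the reservoir is driven by i.i.d. input, linear readouts are trained to reconstruct delayed copies of the input, and the squared correlations are summed over delays. All parameters (reservoir size, spectral radius, input scaling) are illustrative assumptions, not values taken from the paper.

```python
# Minimal sketch: estimating Jaeger's linear memory capacity of a random
# reservoir. Hypothetical parameters chosen for illustration only.
import numpy as np

rng = np.random.default_rng(0)
N, T, washout = 50, 5000, 200        # reservoir size, run length, transient

# Random recurrent weights, rescaled to spectral radius 0.9 (assumption).
W = rng.normal(size=(N, N))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))
w_in = rng.uniform(-0.5, 0.5, size=N)

# Drive the reservoir with i.i.d. uniform input.
u = rng.uniform(-1, 1, size=T)
x = np.zeros((T, N))
for t in range(1, T):
    x[t] = np.tanh(W @ x[t - 1] + w_in * u[t])

X = x[washout:]                       # discard the initial transient
capacity = 0.0
for k in range(1, 2 * N):             # memory function at delay k
    y = u[washout - k:T - k]          # target: the input k steps ago
    w = np.linalg.lstsq(X, y, rcond=None)[0]   # linear readout
    r2 = np.corrcoef(X @ w, y)[0, 1] ** 2      # squared correlation
    capacity += r2
print(f"estimated linear memory capacity: {capacity:.1f} (at most N = {N})")
```

In this construction the total capacity is bounded by the number of internal variables N, consistent with the total-capacity result stated in the summary; the nonlinear capacities are obtained analogously by replacing the delayed-input targets with nonlinear (e.g. polynomial) functions of past inputs.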

References:

[1] H. Jaeger, The echo state approach to analysing and training recurrent neural networks, Fraunhofer Institute for Autonomous Intelligent Systems, Technical report: GMD Report 148, 2001.

[2] W. Maass, T. Natschlager, H. Markram, Real-time computing without stable states: a new framework for neural computation based on perturbations, Neural Computation 14, 2531-2560, 2002.

[3] H. Jaeger, H. Haas, Harnessing nonlinearity: predicting chaotic systems and saving energy in wireless communication, Science 304, 78-80, 2004.

[4] S. Massar and M. Massar, Mean Field Theory of Dynamical Systems Driven by External Signals, in preparation.

[5] M. Hermans and B. Schrauwen, Memory in linear recurrent neural networks in continuous time, Neural Networks 23, 341-355, 2010.

[6] J. Dambre, D. Verstraeten, B. Schrauwen and S. Massar, Information Processing Capacity of Dynamical Systems, submitted.

[7] H. Jaeger, Short Term Memory in Echo State Networks, Fraunhofer Institute for Autonomous Intelligent Systems, Technical report: GMD Report 152, 2002.

[8] Y. Paquot, B. Schrauwen, J. Dambre, M. Haelterman, S. Massar, Reservoir Computing: a Photonic Neural Network for Information Processing, Proceedings of SPIE 7728, 77280B, 2010.

[9] L. Appeltant et al., Information processing using a single dynamical node as complex system, Nature Communications 2, 468-472, 2011.

[10] Y. Paquot et al., Optoelectronic Reservoir Computing, Scientific Reports 2, 287, 2012.

[11] L. Larger et al., Photonic information processing beyond Turing: an optoelectronic implementation of reservoir computing, Optics Express 20, 3241-3249, 2012.