
2023

Session Number: C4L-4

Number: C4L-44

A Study on Visualizing the Effectiveness of the Parameter Set for Forecasting Time Series Using the ReLU Chaotic Neural Network Reservoir

Saito Tatsuya, Fujita Misa

pp.649-652

Publication Date: 2023-09-21

Online ISSN: 2188-5079

DOI: 10.34385/proc.76.C4L-44


Summary:
The chaotic neural network reservoir (CNNR) is a type of reservoir computing in which the neuron models of the reservoir layer are replaced with the chaotic neuron model. The CNNR can be trained quickly and accurately. However, tuning the CNNR is difficult because it has many parameters. To overcome this issue, we propose using the complexity-entropy causality plane (CECP) to visualize the effectiveness of a parameter set. Numerical experiments revealed that the distance on the CECP between the features of the original time series and those of the time series of the internal states of the neurons in the reservoir layer of the ReLU CNNR corresponds to the root mean square error between the two time series.
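As a rough illustration of the CECP feature referred to above (not the authors' implementation), the sketch below computes the two coordinates of a time series on the complexity-entropy causality plane: the normalized permutation entropy H and the Jensen-Shannon statistical complexity C. The embedding dimension, delay, and example signals are illustrative assumptions.

```python
# Minimal sketch: place a time series on the complexity-entropy causality
# plane (CECP) via ordinal (permutation) patterns.  Embedding dimension,
# delay, and the example series are illustrative assumptions.
import math
from itertools import permutations
import numpy as np

def ordinal_distribution(x, dim=4, delay=1):
    """Relative frequencies of the dim! ordinal patterns of the series."""
    counts = {p: 0 for p in permutations(range(dim))}
    n = len(x) - (dim - 1) * delay
    for i in range(n):
        window = x[i:i + dim * delay:delay]
        counts[tuple(int(k) for k in np.argsort(window))] += 1
    return np.array(list(counts.values()), dtype=float) / n

def shannon(p):
    """Shannon entropy (natural log) of a probability vector."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def cecp_point(x, dim=4, delay=1):
    """Return (normalized permutation entropy H, statistical complexity C)."""
    p = ordinal_distribution(np.asarray(x, dtype=float), dim, delay)
    n_states = math.factorial(dim)
    h = shannon(p) / np.log(n_states)            # normalized entropy H
    u = np.full(n_states, 1.0 / n_states)        # uniform reference distribution
    m = 0.5 * (p + u)
    js = shannon(m) - 0.5 * shannon(p) - 0.5 * shannon(u)
    # Maximum possible Jensen-Shannon divergence, used as normalization
    q0 = -0.5 * ((n_states + 1) / n_states * np.log(n_states + 1)
                 - 2 * np.log(2 * n_states) + np.log(n_states))
    c = (js / q0) * h                            # statistical complexity C
    return h, c

# Example: a chaotic logistic-map series versus white noise.
rng = np.random.default_rng(0)
logistic = [0.4]
for _ in range(5000):
    logistic.append(4.0 * logistic[-1] * (1.0 - logistic[-1]))
print("logistic map:", cecp_point(logistic))
print("white noise :", cecp_point(rng.random(5000)))
```

Under this reading, the original time series and the internal-state series of the reservoir neurons would each map to a point (H, C) on the plane, and the distance between those points is the CECP feature distance discussed in the summary.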