Delay-based reservoir computing is a physical implementation scheme for machine learning that could enable more energy-efficient and faster processing than traditional digital computing. Reservoir computing based on echo-state networks exhibits a well-known trade-off between memory and nonlinearity. However, this trade-off has not been extensively studied in the context of delay-based reservoir computing, and methods for adjusting memory and nonlinearity have not yet been explored. In this study, we show that delay-based reservoir computing exhibits the same trade-off: a reservoir with a small delay time produces higher nonlinearity and lower memory, whereas one with a large delay time produces the opposite properties. Our results indicate that nonlinearity and memory can be controlled by varying the feedback delay time. Moreover, we show that a delay-based reservoir with a small delay time can produce nodes with differing nonlinearity and memory properties. We evaluate reservoir computing performance on two chaotic time-series prediction tasks and find that a small-delay reservoir yields high prediction performance for multivariate time-series analysis. The proposed method allows effective control of reservoir nonlinearity without the need to tune physical nodes. These findings have the potential to significantly enhance the performance and efficiency of delay-based reservoir computing.
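To make the mechanism concrete, below is a minimal, illustrative Python sketch of a time-multiplexed delay reservoir: a single nonlinear node (here a tanh map, standing in for whatever physical node is used) driven by masked input and delayed feedback. All names and parameter values (n_virtual, delay_steps, eta, gamma) are assumptions chosen for illustration, not the paper's model; the point is only that the feedback delay length is an explicit, tunable parameter, which is the knob the abstract describes.

```python
import numpy as np

def delay_reservoir(u, n_virtual=50, delay_steps=None, eta=0.5, gamma=0.05, seed=0):
    """Illustrative time-multiplexed delay reservoir (sketch, not the paper's model).

    u           : 1-D input sequence.
    n_virtual   : virtual nodes per input step (the delay line is time-multiplexed).
    delay_steps : feedback delay measured in virtual-node steps; defaults to
                  n_virtual (delay synchronized with one input period).
    eta, gamma  : feedback and input scaling (hypothetical values).
    """
    rng = np.random.default_rng(seed)
    mask = rng.choice([-1.0, 1.0], size=n_virtual)  # fixed binary input mask
    if delay_steps is None:
        delay_steps = n_virtual
    buf = np.zeros(delay_steps)                     # circular buffer = delay line
    states = np.zeros((len(u), n_virtual))
    k = 0
    for t, ut in enumerate(u):
        for i in range(n_virtual):
            j = gamma * mask[i] * ut                # masked (time-multiplexed) input
            x = np.tanh(eta * buf[k] + j)           # nonlinear node + delayed feedback
            buf[k] = x                              # overwrite oldest slot, so the
            k = (k + 1) % delay_steps               # value read back is delay_steps old
            states[t, i] = x
    return states

# Example: a "small delay" regime, with the feedback delay shorter than
# one input period (delay_steps < n_virtual).
u = np.random.default_rng(1).standard_normal(200)
S = delay_reservoir(u, n_virtual=50, delay_steps=20)
# A linear readout (e.g., ridge regression) would be trained on S.
```

In this sketch, varying delay_steps relative to n_virtual is how one would probe the memory-nonlinearity trade-off: a short delay feeds each state back through the nonlinearity more often per input period, while a long delay retains inputs longer before they are transformed. This mirrors the abstract's claim, though the paper's actual node dynamics and parameterization may differ.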