The study presents an experimental validation of an optimization technique for reservoir computing on an optoelectronic setup. The technique uses a delayed version of the input signal to locate the reservoir's optimal operating region, which simplifies the traditionally time-consuming task of hyperparameter tuning. The effectiveness of the approach is verified across several benchmark tasks and reservoir operating conditions. The study also highlights the potential of reservoir computing for processing temporal data and the need for efficient optimization approaches.
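As a rough illustration of the idea, the sketch below scores a reservoir's operating point with a delayed-copy-of-the-input task: a linear readout is trained to reconstruct u(t - d) from the reservoir states, and the hyperparameter value giving the lowest error is kept. The software echo state network, the choice of input gain as the scanned hyperparameter, and the specific delay are all assumptions standing in for the paper's optoelectronic hardware and its actual tuning procedure, which this summary does not detail.

```python
import numpy as np

def run_reservoir(u, n_nodes=200, input_gain=1.0, feedback=0.9, seed=0):
    """Drive a random echo-state reservoir with input u and return its state trajectory."""
    rng = np.random.default_rng(seed)                # fixed seed: same weights for every scanned gain
    w_in = rng.uniform(-1.0, 1.0, n_nodes)
    w = rng.normal(0.0, 1.0, (n_nodes, n_nodes))
    w *= feedback / max(abs(np.linalg.eigvals(w)))   # rescale spectral radius to `feedback`
    x = np.zeros(n_nodes)
    states = np.empty((len(u), n_nodes))
    for t, u_t in enumerate(u):
        x = np.tanh(w @ x + input_gain * w_in * u_t)
        states[t] = x
    return states

def delayed_input_nmse(u, states, delay=5, washout=100):
    """Ridge-regress the states onto the delayed input u(t - delay); return the NMSE."""
    X = states[washout:]
    y = np.roll(u, delay)[washout:]                  # target: delayed copy of the input
    w_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(X.shape[1]), X.T @ y)
    return np.mean((X @ w_out - y) ** 2) / np.var(y)

rng = np.random.default_rng(42)
u = rng.uniform(-1.0, 1.0, 2000)                     # random probe input sequence
gains = np.linspace(0.1, 2.0, 10)                    # candidate input gains to scan
scores = [delayed_input_nmse(u, run_reservoir(u, input_gain=g)) for g in gains]
print(f"best input gain on the delayed-input task: {gains[int(np.argmin(scores))]:.2f}")
```

On hardware, the same delayed-input probe could in principle be evaluated on measured reservoir responses instead of a simulated network; the cheap probe task replaces a full grid search over task-specific training runs.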

Publication date: 25 Jan 2024
Project Page: N/A
Paper: https://arxiv.org/pdf/2401.14371