
Sparse Reservoir Topologies for Physical Implementations of Random Oscillators Networks

Cossu A.; Ceni A.; Bacciu D.; Gallicchio C.
2025-01-01

Abstract

Physical implementation of recurrent neural networks is hindered by the fact that hidden units need to be trained and are often fully-connected. We propose to relieve both constraints by adopting and improving on an oscillator-based reservoir computing model called Random Oscillators Network (RON). RON is a recurrent neural network composed of damped oscillatory units that has shown excellent performance on many sequence processing tasks. RON does not require training of its hidden parameters, since it leverages a random heterogeneous reservoir. However, the reservoir of RON relies on a fully-connected set of oscillators. In this paper, we propose six sparse topologies for RON and study the performance of the model across different levels of sparsity and different numbers of hidden units. Our experiments highlight that RON tolerates large levels of sparsity without harming its expressive power, in most cases even outperforming its fully-connected counterpart. RON also clearly surpasses the Leaky ESN (both sparse and fully-connected) and the LSTM on all benchmarks. We believe RON to be an ideal candidate for realizing and studying physical neural networks in the real world.
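
To make the idea concrete, below is a minimal sketch of a RON-style reservoir with a sparse coupling matrix, assuming the damped-oscillator discretization used in the RON literature (coupled second-order units integrated with an explicit Euler step). All names here (gamma, epsilon, density, run_reservoir) are illustrative assumptions, not the authors' reference implementation, and the random-sparsity topology shown is only one example of the kinds of sparse topologies the paper studies.

    import numpy as np

    rng = np.random.default_rng(0)

    n_units, n_in = 100, 1
    dt = 0.01
    density = 0.1  # fraction of nonzero reservoir couplings (sparsity level)

    # Sparse random coupling matrix: plain random sparsification, shown
    # here as a stand-in for the paper's sparse reservoir topologies.
    W_h = rng.uniform(-1.0, 1.0, (n_units, n_units))
    W_h *= rng.random((n_units, n_units)) < density

    # Untrained, randomly initialized input weights and bias.
    W_x = rng.uniform(-1.0, 1.0, (n_units, n_in))
    b = rng.uniform(-1.0, 1.0, n_units)

    # Heterogeneous (per-unit) random stiffness and damping coefficients,
    # giving the "random heterogeneous reservoir" of damped oscillators.
    gamma = rng.uniform(0.5, 1.5, n_units)
    epsilon = rng.uniform(0.5, 1.5, n_units)

    def run_reservoir(inputs):
        """Collect reservoir states for an input sequence of shape (T, n_in)."""
        y = np.zeros(n_units)  # oscillator positions (hidden state)
        z = np.zeros(n_units)  # oscillator velocities
        states = []
        for x in inputs:
            # Damped-oscillator update: forcing minus stiffness and damping.
            z = z + dt * (np.tanh(W_h @ y + W_x @ x + b) - gamma * y - epsilon * z)
            y = y + dt * z
            states.append(y.copy())
        return np.asarray(states)

    states = run_reservoir(rng.standard_normal((200, n_in)))

As in standard reservoir computing, only a linear readout fitted on the collected states (e.g., by ridge regression) would be trained; every hidden parameter above stays at its random initialization.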

Use this identifier to cite or link to this document: https://hdl.handle.net/11568/1313108