Residual Echo State Networks: Residual recurrent neural networks with stable dynamics and fast learning
Ceni A.; Gallicchio C.
2024-01-01
Abstract
Residual connections have become a staple of modern deep learning architectures. To date, however, most of their applications concern feedforward computation. In this paper, we study the architectural bias of residual connections in the context of recurrent neural networks (RNNs), specifically along the temporal dimension. We frame our discussion from the perspective of Reservoir Computing and dynamical systems theory, focusing on key aspects of neural computation such as memory capacity, long-term information processing, stability, and nonlinear computation capability. Experiments corroborate the striking advantage brought by temporal residual connections across a variety of time series processing tasks, encompassing memory-based, forecasting, and classification problems.
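To make the idea concrete, the sketch below shows a minimal Echo State Network whose state update includes a temporal residual (skip) connection, with only the linear readout trained in closed form by ridge regression (the "fast learning" aspect of Reservoir Computing). The specific update rule h_t = alpha*h_{t-1} + beta*tanh(W h_{t-1} + W_in u_t), the coefficients alpha and beta, and all hyperparameter values are illustrative assumptions, not necessarily the exact formulation proposed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_res = 1, 100       # input and reservoir dimensions (illustrative)
rho, omega = 0.9, 1.0      # spectral radius and input scaling
alpha, beta = 1.0, 0.5     # skip-branch and nonlinear-branch weights (assumed form)

# Random, untrained reservoir weights, as in a standard Echo State Network.
W = rng.uniform(-1, 1, (n_res, n_res))
W *= rho / max(abs(np.linalg.eigvals(W)))          # rescale to spectral radius rho
W_in = rng.uniform(-omega, omega, (n_res, n_in))

def run_reservoir(U):
    """Collect states h_t = alpha*h_{t-1} + beta*tanh(W h_{t-1} + W_in u_t)."""
    h = np.zeros(n_res)
    states = []
    for u in U:
        # Temporal residual connection: the previous state is carried over
        # directly, and the nonlinear branch only adds a correction to it.
        h = alpha * h + beta * np.tanh(W @ h + W_in @ u)
        states.append(h.copy())
    return np.array(states)

# Toy memory-based task: reproduce the input from 5 steps in the past.
U = rng.uniform(-1, 1, (500, n_in))
Y = np.roll(U, 5, axis=0)
H = run_reservoir(U)

# Fast learning: ridge regression gives the readout in closed form.
lam = 1e-6
W_out = np.linalg.solve(H.T @ H + lam * np.eye(n_res), H.T @ Y).T
print("train MSE:", np.mean((H @ W_out.T - Y) ** 2))
```

With alpha = 1 the skip branch carries past states forward unchanged, which is one intuitive way the temporal residual path can help memory capacity and long-term information processing; the stability analysis of such dynamics is the subject of the paper itself.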