Local Lyapunov Exponents of Deep RNN

Gallicchio, Claudio; Micheli, Alessio
2017-01-01

Abstract

The study of deep Recurrent Neural Network (RNN) models is a research topic of increasing interest. In this paper we investigate layered recurrent architectures from a dynamical systems point of view, focusing on characterizing the fundamental aspect of stability. To this end, we provide a framework for analyzing the dynamical regimes of deep RNNs through the study of the maximum among the local Lyapunov exponents. Applied to the case of Reservoir Computing networks, our investigation also provides insights into the true merits of layering in RNN architectures, showing that increasing the number of layers eventually results in progressively less stable global dynamics.
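
To make the quantity in the abstract concrete, the following is a minimal Python sketch, not the authors' implementation. It assumes a deep echo-state-style network in which each layer is driven by the freshly computed state of the layer below, and it takes local Lyapunov exponents, as is common in the Reservoir Computing literature, to be time-averaged logarithms of the moduli of the eigenvalues of the instantaneous Jacobian of the global state map. With this layering the global Jacobian is block lower-triangular, so its spectrum is the union of the spectra of the diagonal blocks diag(1 - x_l^2) W_l. All function names and hyperparameters below are illustrative.

    import numpy as np

    rng = np.random.default_rng(0)

    def init_deep_reservoir(n_layers, units, in_dim, rho=0.9, scale=1.0):
        # Random recurrent matrices rescaled to spectral radius `rho`, plus
        # input / inter-layer matrices with uniform weights in [-scale, scale].
        Ws, Vs = [], []
        for layer in range(n_layers):
            W = rng.uniform(-1.0, 1.0, (units, units))
            W *= rho / np.max(np.abs(np.linalg.eigvals(W)))
            fan_in = in_dim if layer == 0 else units
            Vs.append(rng.uniform(-scale, scale, (units, fan_in)))
            Ws.append(W)
        return Ws, Vs

    def max_local_lyapunov(Ws, Vs, inputs, washout=100):
        # Drive the stacked reservoir with `inputs` and average, over time, the
        # log spectral radius of the instantaneous Jacobian. Because that
        # Jacobian is block lower-triangular, its eigenvalues are exactly those
        # of the per-layer diagonal blocks diag(1 - x_l**2) @ W_l.
        units = Ws[0].shape[0]
        states = [np.zeros(units) for _ in Ws]
        total, steps = 0.0, 0
        for t, u in enumerate(inputs):
            drive, radii = u, []
            for l, (W, V) in enumerate(zip(Ws, Vs)):
                states[l] = np.tanh(W @ states[l] + V @ drive)
                drive = states[l]                  # feeds the next layer up
                D = 1.0 - states[l] ** 2           # tanh'(z) at the new state
                radii.append(np.max(np.abs(np.linalg.eigvals(D[:, None] * W))))
            if t >= washout:                       # skip the initial transient
                total += np.log(max(radii))
                steps += 1
        return total / steps

    Ws, Vs = init_deep_reservoir(n_layers=3, units=50, in_dim=1)
    u_seq = rng.uniform(-0.8, 0.8, (1000, 1))
    print("estimated MLLE:", max_local_lyapunov(Ws, Vs, u_seq))

Re-running the estimate with an increasing number of layers, keeping the per-layer spectral radius fixed, gives a quick empirical check of the paper's observation that deeper stacks drift toward progressively less stable global dynamics (the estimated MLLE grows with depth).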
ISBN: 978-287587038-4
Files in this item:
There are no files associated with this item.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/11568/851919

Citations
  • Scopus: 8