
Augmenting Recurrent Neural Networks Resilience by Dropout

Bacciu, Davide; Crecchi, Francesco
2020-01-01

Abstract

This brief discusses the simple idea that dropout regularization can be used to efficiently induce resiliency to missing inputs at prediction time in a generic neural network. We show how the approach can be effective on tasks where imputation strategies often fail, namely, tasks involving recurrent neural networks and scenarios where whole sequences of input observations are missing. The experimental analysis provides an assessment of the accuracy-resiliency tradeoff in multiple recurrent models, including reservoir computing methods, on real-world ambient intelligence and biomedical time series.
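The mechanism described in the abstract can be illustrated with a short sketch: applying dropout to whole input channels during training teaches a recurrent network to produce sensible predictions when those channels are absent at prediction time. Below is a minimal PyTorch-style sketch under that reading; the class name DropInGRU, the GRU backbone, the per-channel masking scheme, and all hyperparameters are illustrative assumptions, not the authors' published implementation.

```python
import torch
import torch.nn as nn

class DropInGRU(nn.Module):
    """Recurrent model trained with whole-channel input dropout so that it
    tolerates missing input streams at prediction time. The class name and
    the masking scheme are illustrative, not the paper's exact method."""

    def __init__(self, n_inputs, n_hidden, n_outputs, p_drop=0.25):
        super().__init__()
        self.p_drop = p_drop
        self.rnn = nn.GRU(n_inputs, n_hidden, batch_first=True)
        self.readout = nn.Linear(n_hidden, n_outputs)

    def forward(self, x):
        # x: (batch, time, channels). During training, zero entire input
        # channels for the whole sequence, emulating sensors that go silent.
        if self.training and self.p_drop > 0:
            keep = (torch.rand(x.size(0), 1, x.size(2), device=x.device)
                    > self.p_drop).float()
            x = x * keep / (1.0 - self.p_drop)  # inverted-dropout rescaling
        h, _ = self.rnn(x)
        return self.readout(h[:, -1])  # predict from the final time step

# Training sees randomly silenced channels; at prediction time the network
# is queried with some channels genuinely missing (set to zero).
model = DropInGRU(n_inputs=8, n_hidden=64, n_outputs=2)
x = torch.randn(16, 100, 8)     # 16 sequences, 100 steps, 8 input channels
logits = model(x)               # forward pass with random channel dropout
model.eval()
x[:, :, :3] = 0.0               # simulate three missing sensors
logits_missing = model(x)       # prediction under missing inputs
```

Masking whole channels across the entire sequence, rather than individual time steps, mirrors the scenario the abstract emphasizes, where whole sequences of input observations are missing; the inverted-dropout rescaling keeps the expected input magnitude consistent between training and inference.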
Files in this record:
  • File: dropinjournal.pdf (open access)
  • Type: Pre-print
  • License: All rights reserved
  • Size: 667.72 kB
  • Format: Adobe PDF

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11568/1000157
Citations
  • PMC: 1
  • Scopus: 28
  • Web of Science: 22