Augmenting Recurrent Neural Networks Resilience by Dropout
Bacciu, Davide; Crecchi, Francesco
2020-01-01
Abstract
This brief discusses the simple idea that dropout regularization can be used to efficiently induce resiliency to missing inputs at prediction time in a generic neural network. We show how the approach can be effective on tasks where imputation strategies often fail, namely, tasks involving recurrent neural networks and scenarios where whole sequences of input observations are missing. The experimental analysis provides an assessment of the accuracy-resiliency tradeoff in multiple recurrent models, including reservoir computing methods, and comprising real-world ambient intelligence and biomedical time series.
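As a rough illustration of the idea summarized in the abstract, the sketch below applies dropout directly to the input features during training, so that a recurrent model learns to make predictions when some inputs are absent at test time. This is a minimal sketch assuming a PyTorch setup; the class name `DropInGRU`, the layer sizes, and the exact dropout placement are illustrative assumptions, not the authors' actual implementation.

```python
# Minimal sketch (assumption: PyTorch). Input-level dropout during training
# teaches a recurrent model to tolerate missing inputs at prediction time.
# DropInGRU and p_drop are illustrative names, not the paper's exact code.
import torch
import torch.nn as nn

class DropInGRU(nn.Module):
    def __init__(self, n_features, n_hidden, n_classes, p_drop=0.2):
        super().__init__()
        # Dropout on the *inputs*: during training each feature is zeroed
        # with probability p_drop, simulating missing observations.
        self.input_dropout = nn.Dropout(p=p_drop)
        self.rnn = nn.GRU(n_features, n_hidden, batch_first=True)
        self.readout = nn.Linear(n_hidden, n_classes)

    def forward(self, x):
        # x: (batch, time, features). At prediction time, dropout is inactive
        # (eval mode) and truly missing inputs can be fed as zeros, matching
        # the zeroed-feature patterns the network saw during training.
        h = self.input_dropout(x)
        _, h_last = self.rnn(h)           # h_last: (1, batch, n_hidden)
        return self.readout(h_last.squeeze(0))

model = DropInGRU(n_features=8, n_hidden=64, n_classes=2)
model.train()                             # dropout active during training
x = torch.randn(4, 50, 8)                 # toy batch: 4 sequences, 50 steps
logits = model(x)                         # (4, 2) class scores
```

Note that standard dropout rescales the surviving inputs during training; whether and how the paper handles this scaling, and how dropout interacts with the recurrent state, are details to check in the preprint itself.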
Files in this item:

File | Size | Format | |
---|---|---|---|
dropinjournal.pdf | 667.72 kB | Adobe PDF | View/Open

Access: open access. Type: Pre-print document. License: All rights reserved.
Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.