Actigraphic sleep detection: an artificial intelligence approach

D'Ascanio, Paola; Bonuccelli, U.; Bonanni, Enrica; Faraguna, Ugo
2016-01-01

Abstract

Objective: Polysomnography is the gold standard for sleep monitoring, despite its many drawbacks: it is complex, costly and rather invasive. Medical-grade actigraphy represents an acceptably accurate alternative for the estimation of sleep patterns in normal, healthy adult populations and in patients suspected of certain sleep disorders. An increasing number of consumer-grade accelerometric devices populate the “quantified-self” market, but the lack of validation significantly limits their reliability. Our aim was to prototype and validate a platform-free artificial neural network (ANN) based algorithm applied to a high-performance, open-source device (Axivity AX3), to achieve accurate actigraphic sleep detection. Methods: 14 healthy subjects (29.35 ± 14.40 yrs, 7 females) were equipped for 13.3 ± 2.58 h with portable polysomnography (pPSG) while wearing the Axivity AX3. The AX3 was set to record 3D accelerations at 100 Hz, with a dynamic range of 8 g coded at 10 bit. For the automatic actigraphy-based sleep detection, a 4-layer artificial neural network was trained, validated and tested against the pPSG-based expert visual sleep-wake scoring. Results: When compared to the pPSG gold-standard scoring, the ANN-based algorithm reached high concordance (85.3 ± 0.06%), specificity (87.3 ± 0.04%) and sensitivity (84.6 ± 0.1%) in the detection of sleep over 30-s epochs. Moreover, there were no statistically significant differences between pPSG- and actigraphy-based Total Sleep Time and Sleep Efficiency measurements (Wilcoxon test). Conclusions: The high concordance between the ANN-based actigraphy scoring and the standard visual pPSG scoring suggests that this approach could represent a viable method for collecting objective sleep-wake data using a high-performance, open-source actigraph.
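To make the epoch-wise pipeline concrete, the minimal Python sketch below shows one way such an approach could be set up: raw 100 Hz tri-axial AX3 accelerations are collapsed into simple movement features per 30-s epoch, a small feed-forward network is trained against pPSG-derived sleep/wake labels, and concordance, sensitivity and specificity are computed epoch by epoch. The feature set, layer sizes, file names (ax3_train.npy, ppsg_labels_train.npy, ...) and the use of scikit-learn are illustrative assumptions, not the authors' actual implementation.

# Hypothetical sketch of epoch-wise sleep/wake classification from wrist
# accelerometry, loosely following the pipeline described in the abstract.
# Features, layer sizes and file names are assumptions for illustration only.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import confusion_matrix

FS = 100          # AX3 sampling rate (Hz), as stated in the abstract
EPOCH_S = 30      # scoring epoch length (s), as stated in the abstract

def epoch_features(acc):
    """acc: array of shape (n_samples, 3), tri-axial acceleration in g.
    Returns one row of simple movement descriptors per 30-s epoch."""
    n_epoch = acc.shape[0] // (FS * EPOCH_S)
    acc = acc[: n_epoch * FS * EPOCH_S]
    # Vector magnitude minus 1 g removes the static gravity component.
    vm = np.abs(np.linalg.norm(acc, axis=1) - 1.0)
    vm = vm.reshape(n_epoch, FS * EPOCH_S)
    return np.column_stack([vm.mean(axis=1), vm.std(axis=1), vm.max(axis=1)])

# Assumed pre-extracted arrays: accelerations plus pPSG sleep (1) / wake (0)
# labels from expert visual scoring, aligned to the same 30-s epochs.
X_train = epoch_features(np.load("ax3_train.npy"))
y_train = np.load("ppsg_labels_train.npy")

# Small feed-forward network; hidden-layer sizes are placeholders standing in
# for the 4-layer architecture mentioned in the abstract.
clf = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=1000, random_state=0)
clf.fit(X_train, y_train)

# Epoch-by-epoch agreement with pPSG on held-out recordings.
X_test = epoch_features(np.load("ax3_test.npy"))
y_test = np.load("ppsg_labels_test.npy")
pred = clf.predict(X_test)
tn, fp, fn, tp = confusion_matrix(y_test, pred).ravel()
print("concordance:", (tp + tn) / len(y_test))
print("sensitivity:", tp / (tp + fn))   # sleep epochs detected as sleep
print("specificity:", tn / (tn + fp))   # wake epochs detected as wake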
https://onlinelibrary.wiley.com/doi/10.1111/jsr.12446
File attached to this record:
AbstractESRSBologna.pdf — open access; type: pre-print; license: Creative Commons; 98.52 kB, Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11568/820573