Device-Agnostic Augmented Reality Rendering Pipeline for AR in Medicine

Cutolo, Fabrizio; Cattari, Nadia; Carbone, Marina; D'Amato, Renzo; Ferrari, Vincenzo
2021-01-01

Abstract

Visual augmented reality (AR) headsets have the potential to enhance surgical navigation by providing physicians with an egocentric visualization interface capable of seamlessly blending the virtual navigation aid with the real surgical scenario. However, technological and human-factor limitations still hinder the routine use of commercial AR headsets in clinical practice. The aim of this work is to present the AR rendering pipeline of a device-agnostic software framework conceived to fulfill strict requirements towards the realization of a functional and reliable AR-based surgical navigator and capable of supporting the deployment of AR applications for image-guided surgery on different AR headsets. The AR rendering pipeline provides highly accurate AR overlay under both video see-through (VST) and optical see-through (OST) modalities, with almost no perceivable difference in the perception of relative distances and depths when used in the peripersonal space. The rendering pipeline allows the setting of the intrinsic and extrinsic projection parameters of the virtual rendering cameras offline and at runtime: under VST modality, the rendering pipeline can be modified to adapt the warping of the camera frames and pursue an orthostereoscopic and almost natural perception of the real scene in the peripersonal space. Similarly, under OST modality, the calibrated intrinsic and extrinsic parameters of the eye-display model can be updated by the user to account for the user's actual eye position. The results of the performance tests with an eye-replacement camera show an average motion-to-photon latency of around 110 ms for both AR rendering modalities. The AR platform for surgical navigation has already proven its efficacy and reliability under VST modality during real surgical operations in craniomaxillofacial surgery.
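The framework's own implementation is not part of this record, but the central mechanism the abstract describes (driving the virtual rendering cameras with calibrated intrinsic and extrinsic parameters that can be set offline and updated at runtime) can be illustrated with a minimal, generic sketch. The conventions used below (OpenCV-style pixel intrinsics, OpenGL-style clip space) and the function names are illustrative assumptions, not the API of the described framework.

import numpy as np

def projection_from_intrinsics(fx, fy, cx, cy, width, height, near, far):
    # Off-axis projection matrix for a virtual rendering camera, built from
    # pinhole intrinsics given in pixels, with the principal point (cx, cy)
    # measured from the top-left corner (OpenCV convention) and mapped to an
    # OpenGL-style clip space (y up, camera looking down -z).
    return np.array([
        [2.0 * fx / width, 0.0, 1.0 - 2.0 * cx / width, 0.0],
        [0.0, 2.0 * fy / height, 2.0 * cy / height - 1.0, 0.0],
        [0.0, 0.0, -(far + near) / (far - near), -2.0 * far * near / (far - near)],
        [0.0, 0.0, -1.0, 0.0],
    ])

def view_from_extrinsics(rotation, translation):
    # World-to-camera (view) matrix from a 3x3 rotation and a 3-vector
    # translation, e.g. a display-to-eye transform re-estimated for the
    # user's actual eye position under OST, or a camera pose under VST.
    view = np.eye(4)
    view[:3, :3] = rotation
    view[:3, 3] = translation
    return view

# Parameters may be loaded offline from a calibration file and overwritten at
# runtime; the values below are placeholders, not calibration data.
proj = projection_from_intrinsics(fx=1050.0, fy=1050.0, cx=640.0, cy=360.0,
                                  width=1280, height=720, near=0.1, far=10.0)
view = view_from_extrinsics(np.eye(3), np.array([0.0, 0.0, 0.0]))
model_view_projection = proj @ view

In a sketch of this kind, updating the projection at runtime amounts to recomputing proj with the new intrinsics while the rest of the rendering loop stays unchanged.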
ISBN: 978-1-6654-1298-8

Use this identifier to cite or link to this item: https://hdl.handle.net/11568/1110471