Visualization modality for augmented reality guidance of in-depth tumour enucleation procedures
Cattari N.; Cutolo F.; Ferrari V.
2023-01-01
Abstract
Recent research studies have reported that the use of wearable augmented reality (AR) systems such as head-mounted displays for the in situ visualisation of ultrasound (US) images can improve the outcomes of US-guided biopsies through reduced procedure completion times and improved accuracy. Here, the authors continue in the direction of these recent developments and present the first AR system for guiding an in-depth tumour enucleation procedure under US guidance. The system features an innovative visualisation modality with cutting trajectories that 'sink' into the tissue according to the depth reached by the electric scalpel, tracked in real time, and a virtual-to-virtual alignment between the scalpel's tip and the trajectory. The system estimates the scalpel's tip position with high accuracy (mean depth error of 0.4 mm and mean radial error of 1.34 mm). Furthermore, a preliminary user study demonstrated that the system can successfully guide an in-depth tumour enucleation procedure (i.e. preserving the safety margin around the lesion). The innovative visualisation modality provides the operator with correct guidance at all depths, regardless of the point of view: tracking of the surgical tool allows the cutting trajectory to adapt automatically to the depth reached by the instrument. In addition, the virtual-to-virtual alignment, combined with the chromatic information provided through a semaphore, ensures that the operator never loses sight of the instrument tip, even when it is deep in the tissue.