Vision-Based Relative Pose Estimation of Space Objects for Proximity Operations Using Lab-Validated Deep Learning

Matteo FORASASSI; Giordana BUCCHIONI; Lorenzo POLLINI
2025-01-01

Abstract

This paper presents a scalable framework for real-time, 6-DOF pose estimation of uncooperative space objects using monocular cameras. We address dataset scarcity by generating mission-specific synthetic data augmented with style randomization to mitigate domain shift. A lightweight YOLOv11n-pose model extracts keypoints, which feed into a RANSAC-PnP solver for initial pose estimation, refined through an Extended Kalman Filter leveraging rigid body dynamics. Validation against synthetic sequences and laboratory experiments with a 1:1 replica demonstrates mean orientation errors of 3.9° and translation errors of 1.1%, matching state-of-the-art performance. The pipeline maintains robustness under challenging lighting conditions and domain shift, achieving ~50 fps on GPU and ~5 fps on CPU, enabling deployment on resource-constrained platforms for on-orbit servicing and debris removal missions.
Files in this item:
No files are associated with this item.

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11568/1345007
Warning: the displayed data have not been validated by the university.
