
Detection and classification of man-made objects for the autonomy of underwater robots

Gentili, A.; Bresciani, M.; Ruscio, F.; Tani, S.; Caiti, A.; Costanzi, R.
2023-01-01

Abstract

Recent developments in marine technologies allow underwater vehicles to perform survey missions for data collection automatically. The scientific community is now focusing on endowing these vehicles with strong perception capabilities, aiming at full autonomy and decision-making skills. Such abilities would benefit a wide range of field applications, e.g. Inspection and Maintenance (I&M) of man-made structures, port security, and marine rescue. Indeed, most of these tasks are currently carried out with remotely operated vehicles, making direct human involvement necessary. Projects like Metrological Evaluation and Testing of Robots in International CompetitionS (METRICS), funded by the European Commission, are promoting research in this field by organising events such as the Robotics for Asset Maintenance and Inspection (RAMI) competition. In particular, this competition requires participants to develop perception techniques capable of identifying a set of specific targets. Within this context, this paper presents an algorithm able to detect and classify Objects of Potential Interest (OPIs) in underwater camera images. First, the proposed solution compensates for the quality degradation of underwater images by applying color enhancement and restoration procedures. Then, it exploits deep-learning techniques, as well as color- and shape-based methods, to recognize and correctly label the predefined OPIs. Preliminary results of the implemented neural network on restored images are provided: a mean Average Precision (mAP) of about 92% was achieved on the dataset provided to the RAMI participating teams by the NATO Science and Technology Organization Centre for Maritime Research and Experimentation (STO CMRE).
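
The abstract does not detail the restoration procedure, so the following Python sketch is only a plausible illustration of the kind of color enhancement step it describes: a gray-world white balance followed by CLAHE contrast equalization, both common choices for underwater imagery. The function name and parameter values are hypothetical, not taken from the paper.

    # Hypothetical sketch of an underwater image restoration step in the
    # spirit of the color enhancement described in the abstract; gray-world
    # white balance + CLAHE are assumptions, not the authors' exact method.
    import cv2
    import numpy as np

    def restore_underwater_image(bgr: np.ndarray) -> np.ndarray:
        """Compensate the blue-green color cast and low contrast of an underwater frame."""
        # Gray-world white balance: rescale each channel so its mean
        # matches the mean over all channels, removing the color cast.
        img = bgr.astype(np.float32)
        channel_means = img.reshape(-1, 3).mean(axis=0)
        img *= channel_means.mean() / (channel_means + 1e-6)
        img = np.clip(img, 0, 255).astype(np.uint8)

        # Contrast restoration: apply CLAHE on the luminance channel only,
        # leaving chromaticity untouched.
        lab = cv2.cvtColor(img, cv2.COLOR_BGR2LAB)
        l, a, b = cv2.split(lab)
        clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
        lab = cv2.merge((clahe.apply(l), a, b))
        return cv2.cvtColor(lab, cv2.COLOR_LAB2BGR)

    # Example usage (hypothetical file name); the restored frame would then
    # be passed to the object detector that localizes and classifies the OPIs.
    # frame = cv2.imread("opi_frame.png")
    # detector_input = restore_underwater_image(frame)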
Files in this product:

Detection.pdf

Access: open access
Type: Final published version
License: Creative Commons
Size: 883.3 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11568/1216814
Citations
  • PMC: ND
  • Scopus: 1
  • Web of Science: 1