Advancements in instance segmentation for Forward-Looking Sonar imagery: a comparative analysis of state-of-the-art Convolutional Neural Network models / Cecchi, Lorenzo; Topini, Alberto; Bucci, Alessandro; Ridolfi, Alessandro. - Electronic. - (2024), pp. 1-6. (2024 IEEE/OES Autonomous Underwater Vehicles Symposium, AUV 2024, Boston, MA, USA, 18-20 September 2024) [10.1109/auv61864.2024.11030408].
Advancements in instance segmentation for Forward-Looking Sonar imagery: a comparative analysis of state-of-the-art Convolutional Neural Network models
Cecchi, Lorenzo; Topini, Alberto; Bucci, Alessandro; Ridolfi, Alessandro
2024
Abstract
In recent years, Deep Learning (DL) methodologies have achieved outstanding results in digital image processing, performing automatic target classification and object detection in increasingly challenging scenarios. In particular, Convolutional Neural Network (CNN)-based architectures are currently the most widely used solution for automatic target recognition (ATR). With regard to marine robotics, there has been a massive use of underwater imaging for a wide variety of applications, highlighting a compelling demand for DL-based ATRs. In this context, acoustic sensing devices such as Forward-Looking Sonar (FLS) are usually preferred for inspection and exploration activities since, unlike optical cameras, they are unaffected by lighting and turbidity conditions. Motivated by these considerations, this research work presents a comprehensive comparative analysis of several state-of-the-art CNNs for instance segmentation in FLS imagery. Specifically, we first trained cutting-edge architectures, such as You Only Look Once (YOLO), Mask Region-based CNN (Mask R-CNN), You Only Look At CoefficienTs (YOLACT), and Segmenting Objects by Locations (SOLO), on a custom-collected training dataset. Subsequently, a benchmark dataset was exploited to evaluate the effectiveness of the selected CNNs in addressing the challenges posed by FLS imagery through standardized evaluation metrics for instance segmentation.
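The "standardized evaluation metrics for instance segmentation" mentioned above most commonly refers to COCO-style mask Average Precision (AP, AP50, AP75). As an illustration only (the paper does not publish its evaluation code), the sketch below shows how such metrics are typically computed with pycocotools; the annotation and prediction file names are hypothetical placeholders.

```python
# Illustrative sketch of COCO-style instance-segmentation evaluation,
# the de facto standard metric suite for comparing segmentation models.
# NOTE: file names are hypothetical placeholders, not from the paper.
from pycocotools.coco import COCO
from pycocotools.cocoeval import COCOeval

# Ground-truth FLS annotations and one model's predictions, both in
# COCO JSON format (predicted masks RLE-encoded, each with a score).
coco_gt = COCO("fls_benchmark_annotations.json")
coco_dt = coco_gt.loadRes("model_predictions.json")

# iouType="segm" scores mask overlap (IoU) rather than bounding boxes.
evaluator = COCOeval(coco_gt, coco_dt, iouType="segm")
evaluator.evaluate()
evaluator.accumulate()
evaluator.summarize()  # prints AP averaged over IoU 0.50:0.95, AP50, AP75, ...
```

Running the same protocol on the predictions of each trained network (YOLO, Mask R-CNN, YOLACT, SOLO) is what makes such a comparison standardized.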
| File | Description | Type | License | Size | Format | Access |
|---|---|---|---|---|---|---|
| Advancements_in_instance_segmentation_for_Forward-Looking_Sonar_imagery_a_comparative_analysis_of_state-of-the-art_Convolutional_Neural_Network_models.pdf | Main article | Publisher's PDF (Version of record) | All rights reserved | 8.2 MB | Adobe PDF | Closed access (request a copy) |
Documents in FLORE are protected by copyright and all rights are reserved, unless otherwise indicated.



