DEVELOPMENT OF A GRAPHICAL USER INTERFACE TO SUPPORT THE SEMI-AUTOMATIC SEMANTIC SEGMENTATION OF UAS-IMAGES / Masiero A.; Cortesi I.; Tucci G. - In: INTERNATIONAL ARCHIVES OF THE PHOTOGRAMMETRY, REMOTE SENSING AND SPATIAL INFORMATION SCIENCES. - ISSN 1682-1750. - Electronic. - 48 (2022), pp. 295-300. (Paper presented at FOSS4G 2022) [10.5194/isprs-archives-XLVIII-4-W1-2022-295-2022].

DEVELOPMENT OF A GRAPHICAL USER INTERFACE TO SUPPORT THE SEMI-AUTOMATIC SEMANTIC SEGMENTATION OF UAS-IMAGES

Masiero A.; Cortesi I.; Tucci G.
2022

Abstract

The development of remote sensing techniques has dramatically improved human knowledge of natural phenomena, as well as the real-time monitoring and interpretation of events happening in the environment. Recently developed terrestrial, aerial and satellite remote sensors have made huge amounts of data available. The large size of such data is leading the research community to search for efficient methods for real-time information extraction and, more generally, for understanding the collected data. Nowadays, this is typically done by means of artificial intelligence-based methods and, more specifically, machine learning tools. Focusing on semantic segmentation, which is clearly related to a proper interpretation of the acquired remote sensing data, supervised machine learning is often used: it relies on the availability of a set of ground truth labeled data, which are used to properly train a machine learning classifier. Although such a classifier, after a proper training phase, usually yields quite effective segmentation results, the production of ground truth labeled data is typically a very laborious and time-consuming task, performed by a human operator. Motivated by this consideration, this work introduces a graphical interface developed to support the semi-automatic semantic segmentation of images acquired by a UAS. Some of the potentialities of the proposed graphical interface are shown in the specific case of plastic litter detection in multi-spectral images.
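
The supervised pixel-classification workflow the abstract alludes to can be sketched as follows: the operator paints a few pixels per class in the graphical interface, a classifier is trained on those labeled pixels, and its predictions are propagated to the whole image. The Python sketch below uses a random forest on raw band values purely for illustration; the function name, variable names and classifier choice are assumptions, not details taken from the paper.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    def semi_automatic_segmentation(image, labels):
        """Segment a multispectral image from sparse user-provided labels.

        image:  (H, W, C) array of band values.
        labels: (H, W) int array; 0 = unlabeled, >0 = class painted by the user.
        """
        h, w, c = image.shape
        pixels = image.reshape(-1, c)      # one feature vector per pixel
        flat = labels.reshape(-1)

        # Train only on the pixels the operator labeled in the GUI.
        mask = flat > 0
        clf = RandomForestClassifier(n_estimators=100, random_state=0)
        clf.fit(pixels[mask], flat[mask])

        # Propagate the learned classes to every pixel of the image.
        return clf.predict(pixels).reshape(h, w)

    # Hypothetical usage: a 4-band image with two small painted regions.
    rng = np.random.default_rng(0)
    img = rng.random((64, 64, 4))
    lbl = np.zeros((64, 64), dtype=int)
    lbl[0:5, 0:5] = 1        # e.g. plastic litter
    lbl[30:35, 30:35] = 2    # e.g. background
    seg = semi_automatic_segmentation(img, lbl)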
2022 Free and Open Source Software for Geospatial (FOSS4G 2022)
Files in this product:
File: 2022 C FOSS4G isprs-archives-XLVIII-4-W1-2022-295.pdf
Access: open access
Type: Publisher's PDF (Version of record)
License: Open Access
Size: 3.45 MB
Format: Adobe PDF

Documents in FLORE are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this resource: https://hdl.handle.net/2158/1285590
Citations
  • PMC: ND
  • Scopus: 0
  • Web of Science (ISI): ND