Machine vision system for automatic defect detection of ultrasound probes / Profili, Andrea; Magherini, Roberto; Servi, Michaela; Spezia, Fabrizio; Gemmiti, Daniele; Volpe, Yary. - In: THE INTERNATIONAL JOURNAL OF ADVANCED MANUFACTURING TECHNOLOGY. - ISSN 0268-3768. - ELECTRONIC. - 135:(2024), pp. 3421-3435. [10.1007/s00170-024-14701-6]

Machine vision system for automatic defect detection of ultrasound probes

Profili, Andrea; Magherini, Roberto; Servi, Michaela; Volpe, Yary
2024

Abstract

Industry 4.0 conceptualizes the automation of processes through the introduction of technologies such as artificial intelligence and advanced robotics, resulting in significant improvements in production. Detecting defects in the production process, predicting mechanical malfunctions in the assembly line, and identifying defects in the final product are just a few examples of applications of these technologies. In this context, this work addresses the detection of surface defects of ultrasound probes, focusing on the probes manufactured on Esaote S.p.A.'s production line. To date, this inspection is performed manually and is therefore affected by many factors, such as surface morphology, color, size of the defect, and lighting conditions (which can cause reflections that prevent detection). To overcome these shortfalls, this work proposes a fully automatic machine vision system for surface acquisition of ultrasound probes, coupled with an automated defect detection system that leverages artificial intelligence. The paper addresses two crucial steps: (i) the development of the acquisition system (i.e., selection of the acquisition device, analysis of the illumination system, and design of the camera handling system); (ii) the analysis of neural network models for defect detection and classification by comparing three possible solutions (i.e., MMSD-Net, ResNet, EfficientNet). The results suggest that the developed system has the potential to be used as a defect detection tool in the production line (a full image acquisition cycle takes ~200 s), with the best performance obtained by the EfficientNet model: a detection accuracy of 98.63% and a classification accuracy of 81.90%.
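As a purely illustrative sketch (not the authors' code), the defect detection step described in the abstract could be framed as a binary defect/no-defect classifier obtained by fine-tuning a pretrained EfficientNet-B0 backbone with PyTorch/torchvision; the folder layout, image size, and hyperparameters below are assumptions made for the example, not details taken from the paper.

```python
# Hedged sketch: binary surface-defect detection via transfer learning on
# EfficientNet-B0. Paths, image size, and hyperparameters are illustrative.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms
from torch.utils.data import DataLoader

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Basic preprocessing for probe surface images (normalization uses ImageNet stats).
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

# Hypothetical dataset layout: probe_images/train/{defect,no_defect}/*.png
train_set = datasets.ImageFolder("probe_images/train", transform=preprocess)
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)

# Pretrained EfficientNet-B0 with its classifier head replaced for 2 classes.
model = models.efficientnet_b0(weights=models.EfficientNet_B0_Weights.DEFAULT)
model.classifier[1] = nn.Linear(model.classifier[1].in_features, 2)
model = model.to(device)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

# Short fine-tuning loop; detection is treated as defect vs. no-defect.
model.train()
for epoch in range(10):
    for images, labels in train_loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```

The same setup extends to the defect classification task by swapping the two-class head for one output per defect category; ResNet or MMSD-Net could be substituted as the backbone for the comparison the paper describes.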
Year: 2024
Volume: 135
Pages: 3421-3435
Authors: Profili, Andrea; Magherini, Roberto; Servi, Michaela; Spezia, Fabrizio; Gemmiti, Daniele; Volpe, Yary
Files in this record:
There are no files associated with this record.

Documents in FLORE are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this resource: https://hdl.handle.net/2158/1401734