
Comparing Techniques for Temporal Explainable Artificial Intelligence / Edoardo Canti; Enrico Collini; Luciano Alessandro Ipsaro Palesi; Paolo Nesi. - PRINT. - (2024), pp. 87-91. (Paper presented at the 2024 IEEE 10th International Conference on Big Data Computing Service and Machine Learning Applications) [10.1109/BigDataService62917.2024.00019].

Comparing Techniques for Temporal Explainable Artificial Intelligence

Edoardo Canti; Enrico Collini; Luciano Alessandro Ipsaro Palesi; Paolo Nesi
2024

Abstract

Artificial Intelligence models have been employed in a wide range of fields, leading to growing interest both in the subject and in the development of the models. The direct involvement of complex AI models in decision-making processes has stressed the need to explain the rationale behind their results, both globally and locally for each prediction, via eXplainable Artificial Intelligence (XAI) techniques. This paper compares three XAI techniques (SHAP, LIME and IG) with the aim of using them for the temporal explainability of predictive results on time series, in order to understand whether these methods can provide temporal explanations of deep learning AI models. The comparison is both qualitative and quantitative and also addresses computational performance. This work has been partially supported by the CN MOST, the Italian national center on sustainable mobility, by CAI4DSA of FAIR, and has been developed on the Snap4City platform.
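As an illustration of the kind of per-time-step attribution the paper compares, the following is a minimal sketch, not taken from the paper, of computing temporal attributions for a time-series deep learning model with Integrated Gradients using PyTorch and Captum. The model, tensor shapes, and data are illustrative assumptions; SHAP or LIME could be applied to the same model in an analogous way.

# Illustrative sketch only: per-time-step attributions for a time-series
# forecaster via Integrated Gradients (PyTorch + Captum). Model, shapes,
# and data are assumptions, not the paper's setup.
import torch
import torch.nn as nn
from captum.attr import IntegratedGradients

class SimpleForecaster(nn.Module):
    """Toy LSTM regressor: (batch, time, features) -> (batch,) prediction."""
    def __init__(self, n_features=3, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):
        out, _ = self.lstm(x)                      # (batch, time, hidden)
        return self.head(out[:, -1]).squeeze(-1)   # (batch,)

model = SimpleForecaster().eval()
x = torch.randn(8, 24, 3)              # 8 series, 24 time steps, 3 features
baseline = torch.zeros_like(x)         # "no signal" reference input

ig = IntegratedGradients(model)
attr = ig.attribute(x, baselines=baseline, n_steps=50)  # same shape as x

# Aggregate over features to obtain a temporal importance profile per series.
temporal_importance = attr.abs().sum(dim=-1)    # (batch, time)
print(temporal_importance[0])                   # importance of each time step, series 0

Aggregating the attributions over the input features yields an importance score per time step, which is the kind of temporal explanation such a comparison would evaluate qualitatively, quantitatively, and in terms of computational cost.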
2024 IEEE 10th International Conference on Big Data Computing Service and Machine Learning Applications
Files in this item:
File: Comparing_Techniques_for_Temporal_Explainable_Artificial_Intelligence.pdf
Access: Closed access
Type: Publisher's PDF (Version of record)
License: All rights reserved
Size: 969.45 kB
Format: Adobe PDF

Documents in FLORE are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/2158/1401193