
Explainability of radiomics through formal methods

Varriano G.; Guerriero P.; Santone A.; Mercaldo F.; Brunese L.
2022-01-01

Abstract

Background and Objective: Artificial Intelligence has proven effective in radiomics. The main problem in using Artificial Intelligence is that researchers and practitioners cannot know how its predictions are generated. This is currently an open issue, because the explainability of results is advantageous for understanding the reasoning behind the model, both for patients and for implementing a feedback mechanism for medical specialists who use decision support systems. Methods: To address transparency issues in the Artificial Intelligence field, the innovative technique of Formal methods uses mathematical logic reasoning to produce an automatic, quick and reliable diagnosis. In this paper we analyze the results obtained by adopting Formal methods for the diagnosis of Coronavirus disease: specifically, we want to analyze and understand, from a more medical point of view, the meaning of some radiomic features and to connect them with clinical or radiological evidence. Results: In particular, the use of Formal methods allows the authors to perform statistical analysis on the feature value distributions, to perform pattern recognition on disease models, to generalize the model of a disease, and to achieve high performance and interpretability of the results. A further step towards explainability is the localization and selection of the most important slices in a multi-slice approach. Conclusions: We confirmed the clinical significance of some First-order features, such as Skewness and Kurtosis. On the other hand, we suggest avoiding the use of the Minimum feature because of its intrinsic connection with the Computed Tomography examination of the lung.
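As an aside on the First-order features the abstract highlights: Skewness and Kurtosis summarize the shape of the voxel-intensity histogram inside a region of interest. The following is a minimal illustrative sketch (not the authors' pipeline) of how these two statistics can be computed from an intensity array using `scipy.stats`; the function name `first_order_shape` and the sample data are hypothetical.

```python
# Illustrative sketch: First-order Skewness and Kurtosis of voxel intensities.
# This is NOT the authors' method, only a generic example of the statistics.
import numpy as np
from scipy.stats import skew, kurtosis

def first_order_shape(intensities):
    """Return (skewness, excess kurtosis) of a 1-D array of voxel intensities.

    Skewness measures histogram asymmetry; excess kurtosis (Fisher definition,
    0 for a normal distribution) measures tail heaviness.
    """
    x = np.asarray(intensities, dtype=float)
    return skew(x), kurtosis(x)

# A perfectly symmetric intensity distribution has zero skewness.
s, k = first_order_shape([1, 2, 3, 4, 5])
```

For this symmetric sample, `s` is 0 and `k` is negative (flatter than a normal distribution), which is the kind of distributional evidence the paper relates back to clinical and radiological findings.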

Use this identifier to cite or link to this document: https://hdl.handle.net/11695/109549
Citations
  • PMC: ND
  • Scopus: 3
  • Web of Science (ISI): 2