An Explainable Brain Tumor Classification System based on Fuzzy Decision Trees
Ducange, Pietro; Fazzolari, Michela; Marcelloni, Francesco; Miglionico, Giustino Claudio; Ruffini, Fabrizio
2025-01-01
Abstract
Brain Tumor Classification (BTC) using Magnetic Resonance Imaging (MRI) has seen significant advancements with the adoption of Deep Learning (DL) models, particularly Convolutional Neural Networks (CNNs). However, the black-box nature of these models raises concerns regarding explainability, which is crucial in the healthcare domain. For this reason, research in BTC has increasingly focused on the application of Explainable Artificial Intelligence (XAI) techniques, mainly through post-hoc methods, with the aim of providing insights into model decisions. This work explores an alternative approach based on inherently interpretable models, specifically Fuzzy Decision Trees (FDTs). We analyze different types of FDTs, showing that they achieve good performance while ensuring high explainability levels, without the need for post-hoc explanations.
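For readers unfamiliar with FDTs, the minimal sketch below (not taken from the paper; plain Python with NumPy, and all feature indices, membership parameters, and class probabilities are hypothetical) illustrates the mechanism that makes such models interpretable by construction: each split tests a feature against linguistic fuzzy terms, and a sample descends every branch with a firing strength given by its membership degrees, so the final decision traces back to a small set of readable rules.

import numpy as np

def tri(x, a, b, c):
    # Triangular membership degree of x, with feet at a and c and peak at b.
    return float(np.clip(min((x - a) / (b - a + 1e-12),
                             (c - x) / (c - b + 1e-12)), 0.0, 1.0))

class FuzzyNode:
    def __init__(self, feature=None, terms=None, children=None, class_probs=None):
        self.feature = feature          # index of the feature tested at this node
        self.terms = terms              # fuzzy terms (a, b, c), one per child branch
        self.children = children        # child nodes aligned with the terms
        self.class_probs = class_probs  # set only at leaves: class probability vector

def infer(node, x, strength=1.0):
    # Fuzzy inference: unlike a crisp tree, a sample descends ALL branches,
    # each weighted by its membership degree (product t-norm along the path).
    if node.class_probs is not None:
        return strength * node.class_probs
    out = 0.0
    for (a, b, c), child in zip(node.terms, node.children):
        mu = tri(x[node.feature], a, b, c)
        if mu > 0.0:
            out = out + infer(child, x, strength * mu)
    return out

# Toy usage: one root split on a hypothetical normalized image feature,
# with linguistic terms "low" and "high" leading to two class-probability leaves.
low, high = (-0.5, 0.0, 1.0), (0.0, 1.0, 1.5)
root = FuzzyNode(feature=0, terms=[low, high],
                 children=[FuzzyNode(class_probs=np.array([0.9, 0.1])),
                           FuzzyNode(class_probs=np.array([0.2, 0.8]))])
scores = infer(root, [0.4])           # both branches fire partially
print(scores, "-> predicted class", int(np.argmax(scores)))

In this toy run the sample has membership 0.6 in "low" and 0.4 in "high", so the prediction blends both leaves; the traced membership degrees themselves constitute the explanation, which is why, as the abstract notes, no post-hoc XAI method is required.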


