Fuzzy Decision Trees for Explainable Brain Tumor Classification: A Comparative Study with Deep Neural Networks and Classical Binary Decision Trees

Pietro Ducange; Michela Fazzolari; Francesco Marcelloni; Giustino Claudio Miglionico; Fabrizio Ruffini
2026-01-01

Abstract

Brain Tumor Classification (BTC) using Magnetic Resonance Imaging (MRI) has achieved remarkable progress through Deep Learning (DL) models, particularly Convolutional Neural Networks (CNNs). However, the opaque nature of these models raises concerns regarding explainability, which is critical in clinical decision support. To address this, most research has focused on post-hoc Explainable AI (XAI) methods that provide after-the-fact interpretations of CNN predictions. In contrast, this work investigates an inherently explainable alternative based on Fuzzy Decision Trees (FDTs), which combine the interpretability of rule-based reasoning with the expressiveness of fuzzy logic. Moreover, we enhance model transparency by integrating radiomic features that capture clinically meaningful tumor characteristics such as shape, texture, and intensity. To the best of our knowledge, this is among the first studies to apply FDTs to brain tumor classification from MRI, explicitly coupling radiomics with multi-way FDT architectures. We perform a comprehensive evaluation comparing FDTs against four state-of-the-art CNNs, namely ConvNeXt, ResNet18, ResNet50, and EfficientNetB0, as well as classical binary Decision Trees (DTs). We provide an explicit analysis of the trade-off between accuracy, complexity, and interpretability of the models. Results show that FDTs achieve competitive performance (overall F1-score ≈ 0.84) compared to the best CNN baseline (ResNet50, F1-score ≈ 0.86), while offering substantially higher explainability and interpretability. Overall, this study demonstrates that FDTs can bridge the gap between accuracy and explainability, offering a viable explainable-by-design alternative to deep learning in medical imaging. Future work will focus on validating the generalizability of this approach across different imaging domains and dataset variations.
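The pipeline the abstract describes pairs tabular radiomic descriptors with a rule-based classifier. The sketch below is a minimal illustration, not the authors' code: it contrasts the hard threshold splits of a classical binary DT with the graded membership an FDT node uses, assuming PyRadiomics for feature extraction and scikit-learn for the crisp baseline; all paths, feature names, and cut points are hypothetical.

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    # Radiomic extraction (PyRadiomics) yields one feature vector per MRI ROI:
    #   from radiomics import featureextractor
    #   fv = featureextractor.RadiomicsFeatureExtractor().execute(image_path, mask_path)
    # Here synthetic radiomic-like features stand in for 100 cases.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 8))        # e.g. shape/texture/intensity descriptors
    y = rng.integers(0, 3, size=100)     # e.g. three tumor classes

    # Classical binary DT baseline: hard, axis-aligned splits, fully traceable.
    dt = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X, y)

    # Fuzzy ingredient of an FDT node: a membership grade in [0, 1] replaces
    # the hard test, so one case can activate several branches with partial strength.
    def triangular(x, a, b, c):
        """Triangular fuzzy set: 0 at a, 1 at b, 0 again at c."""
        return float(np.clip(min((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0))

    mu_high = triangular(6.2, a=4.0, b=7.0, c=10.0)  # degree "texture entropy is HIGH"
    print(f"crisp DT depth: {dt.get_depth()}, membership mu_high = {mu_high:.2f}")

Because each linguistic term (e.g. "low"/"medium"/"high" entropy) maps to such a membership function, an FDT rule reads as a clinically legible statement over radiomic features rather than a bare numeric threshold, which is the interpretability advantage the abstract claims.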
Files in this product:
There are no files associated with this product.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11568/1341110
