
Explainability of a CNN for breast density assessment

Scapicchio, C.; Lizzi, F.; Fantacci, M. E.
2021-01-01

Abstract

Deep neural network explainability is a critical issue in Artificial Intelligence (AI). This work aims to develop a method to explain a deep residual Convolutional Neural Network (CNN) that automatically classifies mammograms into breast density classes. Breast density, a risk factor for breast cancer, is defined as the amount of fibroglandular tissue relative to fat tissue visible on a mammogram. We studied the explainability of the classifier to understand the reasons behind its predictions: with its deep multi-layer structure, it acts like a black box. As there is no well-established method, we explored several candidate analyses and visualization techniques. The main results were an improvement in classification accuracy and a contribution to assessing trust in the model, which is fundamental for a potential application in clinical practice.
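The abstract does not specify which visualization techniques were explored. As a hedged illustration only (not the authors' method), one common model-agnostic explainability approach is an occlusion-sensitivity map: slide a blanked patch over the input and record how much the classifier's score drops. The sketch below uses a toy stand-in score function in place of the residual CNN; every name in it is an assumption for illustration.

```python
def toy_density_score(img):
    # Stand-in "classifier": mean pixel intensity as a crude proxy for
    # dense tissue. In the paper this would be the residual CNN's class
    # score for the predicted density class (assumption, for illustration).
    flat = [p for row in img for p in row]
    return sum(flat) / len(flat)

def occlusion_map(img, score_fn, patch=4):
    """Slide a zeroed patch over the image and record the score drop.

    A larger drop means the occluded region contributed more to the
    classifier's decision.
    """
    base = score_fn(img)
    h, w = len(img), len(img[0])
    heat = []
    for i in range(0, h, patch):
        row = []
        for j in range(0, w, patch):
            occluded = [r[:] for r in img]  # deep-copy the rows
            for y in range(i, min(i + patch, h)):
                for x in range(j, min(j + patch, w)):
                    occluded[y][x] = 0.0
            row.append(base - score_fn(occluded))
        heat.append(row)
    return heat

# Synthetic 16x16 "mammogram" whose intensity grows toward the
# bottom-right corner, so that region should dominate the heat map.
img = [[(y * 16 + x) / 255.0 for x in range(16)] for y in range(16)]
heat = occlusion_map(img, toy_density_score)
```

Overlaying such a heat map on the mammogram is one way to check whether a density classifier attends to fibroglandular regions rather than artifacts, which is the kind of trust assessment the abstract refers to.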
2021
Scapicchio, C.; Lizzi, F.; Fantacci, M. E.

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11568/1123302
