Demo: an Interactive Visualization Combining Rule-Based and Feature Importance Explanations

Cappuccio, Eleonora; Fadda, Daniele; Rinzivillo, Salvatore
2023-01-01

Abstract

The Human-Computer Interaction (HCI) community has long stressed the need for a more user-centered approach to Explainable Artificial Intelligence (XAI), a research area that aims to define algorithms and tools for illustrating the predictions of so-called black-box models. This approach can benefit from the fields of user interface design, user experience, and visual analytics. In this demo, we propose a visualization-based tool, "F.I.P.E.R.", that shows interactive explanations combining rules and feature importance.
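The following minimal sketch is a hypothetical illustration of the two explanation styles the abstract combines, not the authors' F.I.P.E.R. implementation. It assumes a scikit-learn stack: permutation importance provides a feature-importance view of a black-box classifier, and the decision path of a shallow surrogate tree is read out as an if-then rule for one instance.

# Hypothetical sketch of combining feature-importance and rule-based explanations.
# This is not the F.I.P.E.R. tool; dataset and models are illustrative assumptions.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.inspection import permutation_importance

data = load_breast_cancer()
X, y, feature_names = data.data, data.target, data.feature_names

# Black-box model to be explained.
black_box = RandomForestClassifier(random_state=0).fit(X, y)

# Feature-importance explanation: permutation importance of the black box.
imp = permutation_importance(black_box, X, y, n_repeats=5, random_state=0)
top5 = sorted(zip(feature_names, imp.importances_mean), key=lambda t: -t[1])[:5]
print("Top features:", top5)

# Rule-based explanation: a shallow surrogate tree trained on the black-box
# predictions; the decision path of one instance is read as an if-then rule.
surrogate = DecisionTreeClassifier(max_depth=3, random_state=0)
surrogate.fit(X, black_box.predict(X))

instance = X[0:1]
path_nodes = surrogate.decision_path(instance).indices
leaf = surrogate.apply(instance)[0]
conditions = []
for node in path_nodes:
    if node == leaf:
        continue
    f = surrogate.tree_.feature[node]
    thr = surrogate.tree_.threshold[node]
    op = "<=" if instance[0, f] <= thr else ">"
    conditions.append(f"{feature_names[f]} {op} {thr:.2f}")
print("Rule:", " AND ".join(conditions))

In an interactive tool, the two outputs above would be rendered as coordinated views, so that selecting a feature in the importance chart highlights the rule conditions that involve it.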
Files in this record:
No files are associated with this record.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11568/1327127
Notice

The data displayed have not been validated by the university.

Citations
  • PMC: not available
  • Scopus: 0
  • Web of Science (ISI): not available