Evaluating Explainable Artificial Intelligence (XAI) Techniques in Chest Radiology Imaging Through a Human-centered Lens

Abstract

The field of radiology imaging has seen a remarkable increase in the use of deep learning (DL) algorithms to support diagnostic and treatment decisions. This rise has led to the development of Explainable AI (XAI) systems that aim to improve the transparency of, and trust in, complex DL methods. However, XAI systems face challenges in gaining acceptance within the healthcare sector, mainly due to technical hurdles in utilizing them in practice and the lack of human-centered evaluation and validation. In this study, we focus on visual XAI systems applied to DL-enabled diagnostic systems in chest radiography. In particular, we conduct a user study to evaluate two prominent visual XAI techniques from the human perspective. To this end, we created two clinical scenarios for diagnosing pneumonia and COVID-19 using DL techniques applied to chest X-ray and CT scans, achieving accuracy rates of 90% for pneumonia and 98% for COVID-19. We then employed two well-known XAI methods, Grad-CAM (Gradient-weighted Class Activation Mapping) and LIME (Local Interpretable Model-agnostic Explanations), to generate visual explanations elucidating the AI decision-making process. The resulting visual explanations were evaluated by medical professionals in a user study in terms of clinical relevance, coherency, and user trust. Overall, participants expressed a positive perception of the use of XAI systems in chest radiography, but there was a noticeable lack of awareness regarding their value and practical aspects. In terms of preference, Grad-CAM outperformed LIME on coherency and trust, although concerns were raised about its clinical usability. Our findings highlight key user-driven explainability requirements, emphasizing the importance of multi-modal explainability and the need to raise awareness of XAI systems among medical practitioners. Inclusive design was also identified as crucial to ensure better alignment of these systems with user needs.
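
For orientation, the sketch below shows how a Grad-CAM heatmap of the kind described above can be computed for a Keras convolutional chest X-ray classifier. It is a minimal, illustrative sketch rather than the authors' released implementation (see the Colab notebooks in the Data Access Statement); the model, the convolutional layer name, and the prediction function used in the LIME analogue are assumptions.

import numpy as np
import tensorflow as tf

def grad_cam_heatmap(model, image, conv_layer_name, class_index):
    # Map the input image to the target convolutional layer's activations
    # and to the model's final predictions in a single forward pass.
    grad_model = tf.keras.models.Model(
        model.inputs,
        [model.get_layer(conv_layer_name).output, model.output],
    )
    with tf.GradientTape() as tape:
        conv_out, preds = grad_model(image[np.newaxis, ...])
        class_score = preds[:, class_index]
    # Gradients of the class score with respect to the feature maps.
    grads = tape.gradient(class_score, conv_out)
    # Channel weights: global average of the gradients (Grad-CAM weighting).
    weights = tf.reduce_mean(grads, axis=(0, 1, 2))
    # Weighted combination of feature maps, ReLU, then normalisation to [0, 1].
    cam = tf.nn.relu(tf.reduce_sum(conv_out[0] * weights, axis=-1))
    cam = cam / (tf.reduce_max(cam) + 1e-8)
    return cam.numpy()  # low-resolution heatmap

# LIME analogue (assumes the lime package and a predict_fn returning class
# probabilities for a batch of RGB images):
# from lime import lime_image
# explanation = lime_image.LimeImageExplainer().explain_instance(
#     image, predict_fn, top_labels=1, num_samples=1000)

The returned heatmap can be resized to the input resolution and overlaid on the radiograph to highlight the regions that drove the prediction, which is the form of explanation shown to participants in the user study.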

Publication DOI: https://doi.org/10.1371/journal.pone.0308758
Divisions: College of Engineering & Physical Sciences > School of Computer Science and Digital Technologies
College of Engineering & Physical Sciences
Funding Information: 2021/22 RKE Pump Priming Fund (£14K)
Additional Information: Copyright © 2024 E. Ihongbe et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Publication ISSN: 1932-6203
Data Access Statement: The data underlying the results presented in the study are available from: (1) Kermany, D., Zhang, K. and Goldbaum, M., 2018. Labeled optical coherence tomography (OCT) and chest X-ray images for classification. Mendeley Data, 2(2), p.651. https://www.kaggle.com/datasets/paultimothymooney/chest-xray-pneumonia (2) Soares, E. and Angelov, P., 2020. A large dataset of CT scans for SARS-CoV-2 (COVID-19) identification, collected from real patients in hospitals in Sao Paulo, Brazil. https://www.kaggle.com/datasets/plameneduardo/sarscov2-ctscan-dataset (3) The author-generated open-access code on which the manuscript is based is provided as supporting information: S1 Supporting Information, Colaboratory Python code for clinical case study 1, using chest X-ray images (https://colab.research.google.com/drive/1v7RSS-_Prgujr-BrAGeDR_vygX_Tf-7r?usp=sharing); S2 Supporting Information, Colaboratory Python code for clinical case study 2, using chest CT images (https://colab.research.google.com/drive/1Y1wjd9-sKLD6MaZDw4QVleSfAV22Ldb4?usp=sharing). The code is shared in a way that follows best practice and facilitates reproducibility and reuse.
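
As a convenience, the following minimal sketch shows how the two public Kaggle datasets cited above could be retrieved programmatically; it assumes the official kaggle package with a configured kaggle.json credential file, and the target paths are illustrative.

import kaggle

# Chest X-ray pneumonia dataset (clinical case study 1).
kaggle.api.dataset_download_files(
    "paultimothymooney/chest-xray-pneumonia", path="data/chest_xray", unzip=True)

# SARS-CoV-2 CT-scan dataset (clinical case study 2).
kaggle.api.dataset_download_files(
    "plameneduardo/sarscov2-ctscan-dataset", path="data/covid_ct", unzip=True)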
Last Modified: 18 Oct 2024 07:06
Date Deposited: 15 Aug 2024 13:17
Related URLs: https://journal ... al.pone.0308758 (Publisher URL)
PURE Output Type: Article
Published Date: 2024-10-09
Published Online Date: 2024-10-09
Accepted Date: 2024-07-30
Authors: Ihongbe, Izegbua E.
Fouad, Shereen (ORCID Profile 0000-0002-4965-7017)
Mahmoud, Taha F.
Rajasekaran, Arvind
Bhatia, Bahadar

Version: Published Version

License: Creative Commons Attribution
