Human-Centered User Interface Design for Explainable AI in Chest Radiology: A Multi-Phase Co-Design Approach

Abstract

AI-powered computer vision is transforming medical imaging by improving analysis, diagnosis, and treatment. However, the opaque nature of deep learning often limits its adoption in critical clinical settings, where interpretability and trust are paramount. Explainable AI (XAI) aims to mitigate these limitations by producing visual, interpretable explanations of model decisions, yet poor user interface (UI) design frequently hampers the usability and clinical integration of these explanations. This paper addresses the critical need for human-centered UI design in XAI systems for chest radiology, a vital diagnostic domain for diseases such as pneumonia, lung cancer, and pulmonary embolism. Two deep learning–based XAI systems were developed, one for detecting pneumonia from chest X-rays and one for diagnosing COVID-19 from chest CT scans, using the post-hoc explanation methods Gradient-weighted Class Activation Mapping (Grad-CAM) and Local Interpretable Model-Agnostic Explanations (LIME). Building on these systems, we introduce a novel multi-phase Human-Centered Design (HCD) methodology that actively involves radiologists and clinicians through participatory co-design, iterative prototyping, and a multidisciplinary evaluation workshop. This process identified fifteen preliminary UI features tailored to clinical needs and led to a prototype XAI interface. Empirical evaluation from the multidisciplinary workshop revealed that radiologists preferred diagnostic prediction displays that combine the original and AI-annotated images, shown side by side or with adjustable overlays, accompanied by explanatory text tailored to different audiences, including radiologists, clinicians, and patients. Participants agreed that displaying confidence scores for AI outputs, aligned with clinical reasoning, enhances perceived trust, diagnostic efficiency, and willingness to adopt XAI in daily practice. Our study demonstrates that collaborative, multidisciplinary co-design is essential for bridging technical innovation and clinical utility, offering valuable insights for the future development of effective, trustworthy XAI user interfaces in medical imaging.
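For illustration only, the sketch below shows how a Grad-CAM heatmap of the kind described in the abstract is commonly computed; it is not the authors' implementation. It assumes a torchvision ResNet-18 as a stand-in for the chest X-ray classifier, an assumed target layer (model.layer4[-1]), and a random placeholder tensor in place of a preprocessed radiograph.

```python
# Minimal, illustrative Grad-CAM sketch (not the paper's code).
# Assumes a PyTorch CNN classifier and a chosen convolutional target layer.
import torch
import torch.nn.functional as F
from torchvision import models

def grad_cam(model, image, target_layer, class_idx=None):
    """Return a normalised Grad-CAM heatmap for one image of shape [1, 3, H, W]."""
    activations, gradients = [], []

    # Capture the target layer's feature maps and their gradients via hooks.
    fwd = target_layer.register_forward_hook(lambda m, i, o: activations.append(o))
    bwd = target_layer.register_full_backward_hook(lambda m, gi, go: gradients.append(go[0]))

    logits = model(image)
    if class_idx is None:
        class_idx = logits.argmax(dim=1).item()
    model.zero_grad()
    logits[0, class_idx].backward()

    fwd.remove()
    bwd.remove()

    # Weight each feature map by its spatially averaged gradient, sum, and rectify.
    weights = gradients[0].mean(dim=(2, 3), keepdim=True)        # [1, C, 1, 1]
    cam = F.relu((weights * activations[0]).sum(dim=1))          # [1, h, w]
    cam = F.interpolate(cam.unsqueeze(1), size=image.shape[2:],
                        mode="bilinear", align_corners=False)
    cam = (cam - cam.min()) / (cam.max() - cam.min() + 1e-8)     # scale to [0, 1]
    return cam.squeeze().detach(), class_idx

if __name__ == "__main__":
    model = models.resnet18(weights=None).eval()   # stand-in for a trained CXR classifier
    x = torch.randn(1, 3, 224, 224)                # placeholder for a preprocessed chest X-ray
    heatmap, predicted = grad_cam(model, x, model.layer4[-1])
    print(heatmap.shape, predicted)
```

In a UI such as the one the paper prototypes, this heatmap would typically be alpha-blended over the original radiograph, with the overlay opacity exposed as an adjustable control.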

Publication DOI: https://doi.org/10.1109/ACCESS.2026.3653233
Divisions: College of Engineering & Physical Sciences > School of Computer Science and Digital Technologies
College of Engineering & Physical Sciences > Aston Centre for Artificial Intelligence Research and Application
College of Engineering & Physical Sciences
College of Engineering & Physical Sciences > School of Computer Science and Digital Technologies > Software Engineering & Cybersecurity
College of Engineering & Physical Sciences > Engineering for Health
College of Health & Life Sciences
College of Engineering & Physical Sciences > Aston Digital Futures Institute
College of Business and Social Sciences > School of Social Sciences & Humanities > Centre for Health and Society
College of Business and Social Sciences > Aston Institute for Forensic Linguistics
College of Business and Social Sciences > School of Social Sciences & Humanities
Aston University (General)
Additional Information: This article has been accepted for publication in IEEE Access. This is the author's version, which has not been fully edited, and content may change prior to final publication. Citation information: DOI 10.1109/ACCESS.2026.3653233. This work is licensed under a Creative Commons Attribution 4.0 License. For more information, see https://creativecommons.org/licenses/by/4.0/
Uncontrolled Keywords: Explainable AI, Visualization, Computed tomography, Radiology, User interfaces, Usability, Pneumonia, Medical diagnostic imaging
Publication ISSN: 2169-3536
Last Modified: 28 Jan 2026 08:39
Date Deposited: 27 Jan 2026 15:07
Full Text Link:
Related URLs: https://ieeexpl ... cument/11355720 (Publisher URL)
PURE Output Type: Article
Published Date: 2026-01-15
Accepted Date: 2026-01-01
Authors: Fouad, Shereen (ORCID Profile 0000-0002-4965-7017)
Hakobyan, Lilit (ORCID Profile 0000-0001-9518-4997)
E. Ihongbe, Izegbua
Kavakli, Manolya (ORCID Profile 0000-0003-3241-6839)
Atkins, Sarah (ORCID Profile 0000-0003-3481-5681)
Bhatia, Bahadar

Version: Accepted Version

License: Creative Commons Attribution
