Fouad, Shereen, Hakobyan, Lilit, Ihongbe, Izegbua E., Kavakli, Manolya, Atkins, Sarah and Bhatia, Bahadar (2026). Human-Centered User Interface Design for Explainable AI in Chest Radiology: A Multi-Phase Co-Design Approach. IEEE Access.
Abstract
AI-powered computer vision is transforming medical imaging by improving analysis, diagnosis, and treatment. However, the opaque nature of deep learning often limits its adoption in critical clinical settings, where interpretability and trust are paramount. Explainable AI (XAI) aims to mitigate these limitations by creating visual and interpretable explanations of model decisions. However, poor user interface (UI) design frequently hampers the usability and clinical integration of these explanations. This paper addresses the critical need for human-centered UI design in XAI systems for chest radiology, a vital diagnostic domain for diseases such as pneumonia, lung cancer, and pulmonary embolism. Two deep learning–based XAI systems were developed for detecting pneumonia from chest X-rays and diagnosing COVID-19 from chest CT scans using post-hoc explanation methods, namely Gradient-weighted Class Activation Mapping (Grad-CAM) and Local Interpretable Model-Agnostic Explanations (LIME). Building on these systems, we introduce a novel multi-phase Human-Centered Design (HCD) methodology that actively involves radiologists and clinicians through participatory co-design, iterative prototyping, and a multidisciplinary evaluation workshop. This process identified fifteen preliminary UI features tailored to clinical needs and led to a prototype XAI interface. Empirical evaluation from the multidisciplinary workshop revealed that radiologists preferred diagnostic prediction displays that combine the original and AI-annotated images, shown side by side or with adjustable overlays, accompanied by explanatory text tailored to various audiences, including radiologists, clinicians, and patients. Participants agreed that displaying the confidence scores of AI outputs aligned with clinical reasoning enhances perceived trust, diagnostic efficiency, and willingness to adopt XAI in daily practice. Our study demonstrates that collaborative multidisciplinary co-design is essential for bridging technical innovation and clinical utility, offering valuable insights for the future development of effective, trustworthy XAI user interfaces in medical imaging.
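For readers unfamiliar with the explanation method named above, the following is a minimal Grad-CAM sketch in PyTorch, not the authors' implementation: the resnet18 backbone, layer choice, and function names are illustrative stand-ins for whichever chest X-ray classifier the described systems use. It produces the kind of class-activation heatmap that the prototype UI overlays on the original image.

```python
# Minimal Grad-CAM sketch (illustrative; model and layer names are placeholders).
import numpy as np
import torch
import torch.nn.functional as F
from torchvision import models

model = models.resnet18(weights="IMAGENET1K_V1")   # stand-in for a chest X-ray classifier
model.eval()
target_layer = model.layer4[-1]                    # last convolutional block

activations, gradients = {}, {}

def fwd_hook(module, inputs, output):
    activations["value"] = output.detach()

def bwd_hook(module, grad_input, grad_output):
    gradients["value"] = grad_output[0].detach()

target_layer.register_forward_hook(fwd_hook)
target_layer.register_full_backward_hook(bwd_hook)

def grad_cam(image: torch.Tensor, class_idx=None) -> np.ndarray:
    """Return a heatmap in [0, 1], resized to the input image (shape 1x3xHxW)."""
    logits = model(image)
    if class_idx is None:
        class_idx = logits.argmax(dim=1).item()
    model.zero_grad()
    logits[0, class_idx].backward()

    # Weight each feature map by its average gradient, then ReLU-combine.
    weights = gradients["value"].mean(dim=(2, 3), keepdim=True)
    cam = F.relu((weights * activations["value"]).sum(dim=1, keepdim=True))
    cam = F.interpolate(cam, size=image.shape[2:], mode="bilinear", align_corners=False)
    cam = (cam - cam.min()) / (cam.max() - cam.min() + 1e-8)
    return cam.squeeze().cpu().numpy()
```

A UI along the lines described in the abstract would render this heatmap as an adjustable-opacity overlay next to the unannotated radiograph, together with the model's confidence score and audience-specific explanatory text.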