
Comparison of SHAP and clinician friendly explanations reveals effects on clinical decision behaviour

  • Sungkyunkwan University
  • AvoMD
  • University of Ulsan
  • Boston University
  • Beth Israel Deaconess Medical Center
  • Soongsil University

Research output: Contribution to journal › Article › peer-review

Abstract

Clinical decision-making substantially impacts patients’ lives and their quality of life. However, the black-box nature of AI-powered clinical decision support systems (CDSSs) complicates the interpretation of how decisions are derived. Explainable AI (XAI) improves acceptance and trust by providing explanations, but the effectiveness of different explanatory methods remains uncertain. We compared the acceptance, trust, satisfaction and usability of various explanatory methods among clinicians. We also explored the factors associated with acceptance levels for each item using trust, satisfaction and usability score questionnaires. Surgeons and physicians (N = 63), who had prescribed blood products before surgery, made decisions before and after receiving one of three CDSS explanation methods, each comprising six vignettes, in a counterbalanced design. We found empirical evidence indicating that providing a clinical explanation enhances clinicians’ acceptance more than presenting ‘results only’ or ‘results with SHapley Additive exPlanations (SHAP)’. Additionally, trust, satisfaction and usability were correlated with acceptance. This study suggests best practices for the strategic application of the XAI–CDSS in the medical field.
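For readers unfamiliar with the ‘results with SHAP’ condition, SHAP attributes a model’s prediction to individual input features using Shapley values from cooperative game theory. The sketch below, which is illustrative rather than the study’s actual pipeline (the model, feature values and background data are all hypothetical), computes exact Shapley values for a tiny model by enumerating every feature coalition; practical SHAP tools approximate this same quantity for large models.

```python
from itertools import combinations
from math import factorial

def shapley_values(predict, x, background):
    """Exact Shapley values by enumerating all feature coalitions.

    Features absent from a coalition are replaced with their
    background (reference) value, a common way to 'remove' a feature.
    """
    n = len(x)
    phi = [0.0] * n
    for i in range(n):
        others = [j for j in range(n) if j != i]
        for size in range(n):
            for subset in combinations(others, size):
                # Shapley kernel weight for a coalition of this size
                weight = factorial(size) * factorial(n - size - 1) / factorial(n)
                with_i = [x[j] if (j in subset or j == i) else background[j]
                          for j in range(n)]
                without_i = [x[j] if j in subset else background[j]
                             for j in range(n)]
                # Marginal contribution of feature i to this coalition
                phi[i] += weight * (predict(with_i) - predict(without_i))
    return phi

# Hypothetical linear "risk" model over three features
model = lambda v: 2.0 * v[0] - 1.0 * v[1] + 0.5 * v[2] + 1.0
phi = shapley_values(model, x=[3.0, 0.0, 5.0], background=[1.0, 2.0, 3.0])
print(phi)  # for a linear model these equal w_i * (x_i - background_i)
```

By construction the attributions sum to the difference between the prediction for this instance and the prediction for the background, which is the property that makes SHAP plots additive.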

Original language: English
Article number: 578
Journal: npj Digital Medicine
Volume: 8
Issue number: 1
DOIs
State: Published - Dec 2025
