Explainable Clinical Decision Support Systems for Post-COVID Care Pathways

Authors

  • Daniel Harper, University of Gloucestershire, United Kingdom
  • Sophie Langford, University of Gloucestershire, United Kingdom

DOI:

https://doi.org/10.5281/ZENODO.18071027

Keywords:

Explainable artificial intelligence, clinical decision support systems, post-COVID care, healthcare analytics, interpretable machine learning

Abstract

Clinical decision support systems have become a foundational component of modern healthcare delivery, particularly in contexts where patient trajectories are complex, uncertain, and long-running. Post-COVID care represents such a setting, characterised by heterogeneous symptoms, fluctuating recovery patterns, and multi-organ involvement. While machine-learning-driven decision support systems demonstrate strong predictive capability, their limited transparency poses barriers to clinical trust, accountability, and safe adoption. This article presents a comprehensive framework for explainable clinical decision support tailored to post-COVID care pathways. The proposed approach integrates interpretable predictive modelling, uncertainty-aware inference, and clinician-oriented explanation layers to support decision making across diagnosis, monitoring, and care planning. Through systematic architectural design and empirical evaluation, the study demonstrates how explainability can be embedded as a core system property rather than an afterthought, enabling reliable and actionable clinical insights in complex care environments.
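The three layers named in the abstract (interpretable prediction, uncertainty-aware inference, and a clinician-oriented explanation layer) can be illustrated with a minimal sketch. The sketch below is not the authors' implementation: it assumes scikit-learn, uses synthetic data in place of the study's cohort, and stands in bootstrap risk intervals and signed coefficient contributions for whatever uncertainty and explanation methods the paper actually evaluates. The feature names (`fatigue_score`, `spo2_drop`, `crp_level`, `symptom_weeks`) are hypothetical.

```python
# Hypothetical sketch of the three layers described in the abstract.
# Features, data, and methods are illustrative, not the paper's own.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.utils import resample

rng = np.random.default_rng(0)
FEATURES = ["fatigue_score", "spo2_drop", "crp_level", "symptom_weeks"]

# Synthetic stand-in for a post-COVID cohort (the study's data is not shown here).
X = rng.normal(size=(500, len(FEATURES)))
y = (X @ np.array([1.2, 0.8, 0.6, 0.4])
     + rng.normal(scale=0.5, size=500) > 0).astype(int)

# 1. Interpretable predictive model: a sparse logistic regression whose
#    coefficients map directly onto clinical features.
model = LogisticRegression(penalty="l1", solver="liblinear", C=0.5).fit(X, y)

# 2. Uncertainty-aware inference: bootstrap the training set to attach an
#    interval to the predicted risk rather than reporting a point estimate.
def risk_with_interval(x, n_boot=200):
    probs = []
    for _ in range(n_boot):
        Xb, yb = resample(X, y)
        m = LogisticRegression(penalty="l1", solver="liblinear", C=0.5).fit(Xb, yb)
        probs.append(m.predict_proba(x.reshape(1, -1))[0, 1])
    point = model.predict_proba(x.reshape(1, -1))[0, 1]
    return point, np.percentile(probs, [2.5, 97.5])

# 3. Clinician-oriented explanation layer: rank each feature's signed
#    contribution (coefficient * feature value) for this patient.
def explain(x):
    contrib = model.coef_[0] * x
    order = np.argsort(-np.abs(contrib))
    return [(FEATURES[i], round(contrib[i], 3)) for i in order]

patient = X[0]
risk, (lo, hi) = risk_with_interval(patient)
print(f"Predicted risk: {risk:.2f} (95% bootstrap interval {lo:.2f}-{hi:.2f})")
print("Top contributing features:", explain(patient))
```

The design choice the sketch tries to capture is the abstract's claim that explainability is a core system property: the explanation layer reads directly off the interpretable model's parameters instead of being bolted on as a post-hoc approximation.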

Published

2021-12-18