Explainable Clinical Decision Support Systems for Post-COVID Care Pathways
DOI: https://doi.org/10.5281/ZENODO.18071027

Keywords: explainable artificial intelligence, clinical decision support systems, post-COVID care, healthcare analytics, interpretable machine learning

Abstract
Clinical decision support systems have become a foundational component of modern healthcare delivery, particularly in contexts where patient trajectories are complex, uncertain, and long-running. Post-COVID care represents such a setting, characterised by heterogeneous symptoms, fluctuating recovery patterns, and multi-organ involvement. While machine-learning-driven decision support systems demonstrate strong predictive capability, their limited transparency poses barriers to clinical trust, accountability, and safe adoption. This article presents a comprehensive framework for explainable clinical decision support tailored to post-COVID care pathways. The proposed approach integrates interpretable predictive modelling, uncertainty-aware inference, and clinician-oriented explanation layers to support decision making across diagnosis, monitoring, and care planning. Through systematic architectural design and empirical evaluation, the study demonstrates how explainability can be embedded as a core system property rather than an afterthought, enabling reliable and actionable clinical insights in complex care environments.
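The abstract names three layers (interpretable prediction, uncertainty-aware inference, clinician-oriented explanation) but gives no implementation detail. As a rough illustration only, the following Python sketch shows one way such a stack could be composed: a logistic-regression predictor whose coefficients are directly auditable, a bootstrap ensemble whose spread serves as a simple uncertainty estimate, and a per-feature contribution readout as the explanation layer. The feature names, the synthetic data, and the "escalate care" task are all hypothetical assumptions, not the authors' method.

```python
# Illustrative sketch only -- NOT the paper's implementation. Shows the three
# layers the abstract names: an interpretable predictor, uncertainty-aware
# inference (bootstrap ensemble), and a clinician-oriented explanation.
# Feature names and the "escalate care" task are hypothetical placeholders.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

FEATURES = ["fatigue_score", "dyspnoea_score", "crp_level", "age", "spo2_rest"]

# Synthetic stand-in for post-COVID follow-up data (binary escalation label).
X, y = make_classification(n_samples=500, n_features=len(FEATURES), random_state=0)

# Interpretable predictive layer: L2 logistic regression, refit on bootstrap
# resamples so the ensemble spread doubles as a simple uncertainty estimate.
rng = np.random.default_rng(0)
ensemble = []
for _ in range(25):
    idx = rng.integers(0, len(X), size=len(X))
    model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    ensemble.append(model.fit(X[idx], y[idx]))

def predict_with_explanation(x):
    """Return risk, a 90% ensemble interval, and per-feature contributions."""
    x = x.reshape(1, -1)
    probs = np.array([m.predict_proba(x)[0, 1] for m in ensemble])
    # Uncertainty-aware inference: mean risk plus the ensemble spread.
    risk, lo, hi = probs.mean(), np.percentile(probs, 5), np.percentile(probs, 95)
    # Clinician-oriented explanation: coefficient * standardised value gives
    # each feature's (averaged) contribution to the log-odds of escalation.
    contrib = np.mean(
        [m[-1].coef_[0] * m[0].transform(x)[0] for m in ensemble], axis=0
    )
    return risk, (lo, hi), dict(zip(FEATURES, contrib))

risk, (lo, hi), why = predict_with_explanation(X[0])
print(f"Escalation risk {risk:.2f} (90% ensemble interval {lo:.2f}-{hi:.2f})")
for name, c in sorted(why.items(), key=lambda kv: -abs(kv[1]))[:3]:
    print(f"  {name}: {c:+.2f} toward log-odds")
```

A linear model is used here because its contributions can be read off directly and audited by a clinician, which matches the abstract's emphasis on explainability as a core system property rather than a post-hoc addition; a gradient-boosted model with SHAP values would be a heavier-weight alternative.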
License
Copyright (c) 2021 The Artificial Intelligence Journal

This work is licensed under a Creative Commons Attribution 4.0 International License.