The COVID-19 pandemic continues to have a devastating effect on the health and well-being of the global population. A critical step in the fight against COVID-19 is effective clinical decision support for infected patients, ranging from screening to risk stratification to treatment planning. Motivated by this need and inspired by the open source efforts of the research community, in this study we introduce COVID-Net, a global open source initiative for AI-assisted COVID-19 clinical decision support that is freely available to the general public. As part of the initiative, we introduce not only deep learning models for different stages of clinical decision support, but also some of the largest COVID-19-related open access datasets to support these models. Furthermore, we investigate how the COVID-Net models make predictions using an explainability method, both to gain deeper insight into the critical factors associated with COVID-19 cases, which can aid clinicians in improved diagnosis and prognosis, and to audit the COVID-Net models in a responsible and transparent manner, validating that they make decisions based on relevant information. Our hope is that the open source COVID-Net initiative will help researchers and citizen data scientists alike to accelerate the development of highly accurate yet practical deep learning solutions for COVID-19 clinical decision support and speed treatment to those who need it most.
Learning Objectives:
1. Discuss the potential role of artificial intelligence in the context of clinical decision support.
2. Identify risk factors related to the use of artificial intelligence for clinical decision support and ways to mitigate such risks.
3. Explain the role of explainability in artificial intelligence development and deployment.
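As a concrete illustration of the explainability audits discussed above, the sketch below implements occlusion sensitivity, one simple model-agnostic explainability technique: each image patch is masked in turn, and the drop in the model's predicted score indicates how much that region influenced the decision. This is a minimal hypothetical example, not the specific explainability method used in the COVID-Net work; the `toy_predict` model and all parameter choices are assumptions for illustration only.

```python
import numpy as np

def occlusion_map(image, predict, patch=4, baseline=0.0):
    """Importance map: drop in predicted score when each patch is occluded."""
    h, w = image.shape
    base_score = predict(image)
    heat = np.zeros((h // patch, w // patch))
    for i in range(0, h, patch):
        for j in range(0, w, patch):
            occluded = image.copy()
            # Replace one patch with the baseline value and re-score.
            occluded[i:i + patch, j:j + patch] = baseline
            heat[i // patch, j // patch] = base_score - predict(occluded)
    return heat

# Toy stand-in "model" (assumption): its score is the mean intensity of the
# top-left quadrant, mimicking a classifier that attends to one image region.
def toy_predict(img):
    return img[:8, :8].mean()

img = np.ones((16, 16))
heat = occlusion_map(img, toy_predict)
# Patches inside the top-left quadrant produce a larger score drop,
# correctly flagging the region the toy model actually relies on.
```

An audit of this kind helps validate that a model's decisions are driven by clinically relevant image regions rather than artifacts such as annotations or markers.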