Artificial intelligence (AI)-based computer-aided diagnostics (CAD) is a promising technique for making the diagnostic process more efficient and accessible to the general public. Deep learning is the most widely used AI technology for a variety of tasks, including medical imaging: it is the state of the art for many computer vision problems and has been applied to medical imaging tasks such as Alzheimer's disease classification, lung cancer detection, and retinal disease detection. Despite these remarkable results, the lack of tools to inspect the behavior of black-box models limits the adoption of deep learning in the medical field, where explainability and reliability are key to earning the trust of medical professionals.

Moreover, newer regulations, such as the European General Data Protection Regulation (GDPR), are making black-box models more difficult to apply in all sectors, including healthcare, because traceability of automated decisions is now required.

In such situations, explainable AI (XAI) can be a promising tool to interpret, interlink, and understand the reasons behind a black-box model's decisions.
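To make this concrete, one of the simplest model-agnostic XAI techniques is permutation importance: shuffle one input feature at a time and measure how much the model's accuracy drops. The sketch below is a minimal illustration with a hypothetical stand-in "black-box" predictor and synthetic data (both invented for this example, not from the article); in practice the predictor would be a trained deep network.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical black-box model: predicts class 1 when feature 0 exceeds 0.5.
# In a real CAD setting this would be a trained deep network.
def black_box_predict(X):
    return (X[:, 0] > 0.5).astype(int)

# Synthetic data: feature 0 is informative, feature 1 is pure noise.
X = rng.random((200, 2))
y = (X[:, 0] > 0.5).astype(int)

def permutation_importance(predict, X, y, n_repeats=10):
    """Mean drop in accuracy when each feature is shuffled independently."""
    baseline = (predict(X) == y).mean()
    importances = []
    for j in range(X.shape[1]):
        drops = []
        for _ in range(n_repeats):
            Xp = X.copy()
            Xp[:, j] = rng.permutation(Xp[:, j])  # destroy feature j's signal
            drops.append(baseline - (predict(Xp) == y).mean())
        importances.append(np.mean(drops))
    return np.array(importances)

imp = permutation_importance(black_box_predict, X, y)
print(imp)  # feature 0 should show a large drop, feature 1 near zero
```

A large accuracy drop for a feature indicates the model relies on it; this is the kind of inspection that lets a clinician check whether a model attends to medically meaningful inputs rather than spurious ones.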

Article written by Akshay Daydar.
