Explainable Artificial Intelligence and its role in supporting medical diagnosis

The use of deep learning algorithms in the clinical context is hindered by their lack of interpretability. One way of increasing the acceptance of such complex algorithms is to explain their decisions through the presentation of similar examples. Besides helping to understand model behaviour, the presentation of similar disease-related examples also supports the decision-making process of the radiologist or clinician in challenging diagnosis scenarios. In this talk, the speaker will discuss and present his work on strategies to provide decisions and case-based explanations in the medical domain. In particular, he will discuss work developed in several clinical applications, such as the aesthetic evaluation of breast cancer treatments, melanoma detection in dermoscopic images, and pleural effusion diagnosis in chest X-ray images.

Wilson Silva


Wilson Silva is a PhD candidate in Electrical and Computer Engineering at the Faculty of Engineering of the University of Porto (FEUP) and a research assistant at INESC TEC, where he is associated with the Visual Computing and Machine Intelligence and Breast research groups. Wilson is also currently an Invited Teaching Assistant at FEUP, where he teaches courses related to Machine Learning, Computer-aided Diagnosis, and Programming. He holds an integrated master's degree (BSc + MSc) in Electrical and Computer Engineering, obtained from FEUP in 2016. He was a visiting master's student for one year at the Karlsruhe Institute of Technology (KIT, Karlsruhe, Germany) and a visiting PhD student for six months at Inselspital (University Hospital Bern, Bern, Switzerland). His main research interests include Machine Learning and Computer Vision, with a particular focus on Explainable Artificial Intelligence and Medical Image Analysis.