East Campus USI-SUPSI - Room C1.02
The achievements of contemporary machine learning (ML) methods highlight the enormous potential of integrating AI systems into various domains of medicine, ranging from the analysis of diagnostic images in radiology and dermatology to increasingly complex applications such as forecasting in intensive care units or the diagnosis of psychiatric disorders. Despite this potential, however, many medical professionals remain sceptical toward the integration of machine learning tools into their practice. This scepticism stems largely from the opacity, or so-called black-box, problem: the difficulty humans have in understanding the reasoning behind the outcomes of ML models, and hence in deciding whether to trust them.
Much effort has been dedicated in recent years to overcoming this difficulty, from policy and ethical as well as engineering and design perspectives. Nevertheless, there is still much disagreement among scholars about the real effectiveness of the various proposed solutions.
Local organisers
- Alessandro Facchini (alessandro.facchini@idsia.ch)
- Alberto Termine (alberto.termine@idsia.ch)
Department of Innovative Technologies
Dalle Molle Institute for Artificial Intelligence USI-SUPSI
Polo universitario Lugano - Campus Est, Via la Santa 1
CH-6962 Lugano-Viganello
T +41 (0)58 666 66 66
info@idsia.ch