Team

Academic Staff

M.Sc. Luca Gemballa

Room: R09 R02 H21
Email:
Consultation Hour: by arrangement
Author Profiles: ORCID, ResearchGate

Bio:

Luca Gemballa is a research assistant at the Chair for Information Systems and Artificial Intelligence (AI) in Marketing at the University of Duisburg-Essen, Germany. During his Master's degree in Computer Science at TU Dortmund University, he worked as a student assistant at the Chair for Enterprise Computing of the Faculty of Computer Science. In addition to his Master's degree, he holds a Bachelor's degree in Computer Science from TU Dortmund University.

As a PhD student, Luca Gemballa conducts research in the field of explainable AI (XAI).

Supervised Theses:

  • Interactive Interfaces for Medical Treatment Effect Prediction Explanations (Bachelor's thesis, Business Information Systems, in progress)

    The application of deep learning can increase prediction performance in a wide variety of use cases. However, in high-stakes decision contexts such as medicine, a performance increase on its own can be insufficient to foster trust in a predictive model. If a lack of trust means that the decision support system built around the deep learning model goes unused, nothing is gained from its high performance. Developing an interface with the intention of convincing users of the model's plausibility, on the other hand, risks overtrust and an abandonment of critical thinking, especially among less experienced professionals. XAI methods support appropriate levels of trust by giving users tools to detect faulty reasoning in an otherwise inscrutable deep learning model. We have developed visualizations as explanations of treatment effect predictions. During our evaluation of these visualizations, several experts voiced their interest in interactive components for exploring the model's reasoning and the underlying data in order to develop a better understanding.

    To build on our visualizations, this Bachelor's thesis conducts a structured literature review (SLR) on interactivity in decision support systems. On the basis of this SLR, the student develops an interactive explanation interface for a medical treatment effect prediction case and evaluates it in a series of expert interviews.

  • AI Explanations in the Context of Medical Decision Support Systems (Bachelor's thesis, Business Information Systems, in progress)

    In order to properly utilize the performance improvements that the adoption of AI models can bring, a number of conditions must be met. Since modern deep learning systems are opaque and inscrutable to human users, problems of mistrust and corresponding non-use can arise. But even if such barriers do not hinder the adoption of AI technology into clinical practice, problems may arise from overconfidence in and overreliance on AI results. The XAI community strives to develop methods that help create an appropriate level of trust in AI systems. Such methods are particularly important in the medical application context, as incorrect diagnostic and prognostic decisions can have significant negative consequences for the patients concerned. We intend to research XAI in the context of medical decision support systems. This includes developing an understanding of how XAI is applied to different data types and diseases, and whether the impact of XAI in AI-based decision support has been evaluated experimentally.

    To develop a better understanding of XAI in the context of medical decision support systems, a structured literature review is carried out in this Bachelor's thesis. To collect additional data and deepen knowledge of XAI use cases in medical practice, the student conducts a series of expert interviews for requirements elicitation.