Turing-Roche Translational Science Methods Club: Multimodal Data Integration

Published 2024-05-28
The Translational Science Methods Club event series is a project led by Sarah Buehler as part of the Turing-Roche Community Scholar Scheme; you can find out more here: www.turing.ac.uk/events/translational-science-meth…

This is a recording of the third session of the Methods Club, held on May 23rd 2024 and focusing on the topic of Multimodal Data Integration. The session was hosted and moderated by Sarah Buehler, currently a PhD student in Cognitive Neuroscience at University College London (UCL), and featured two speakers: Yuhan Wang, a PhD student at the Centre for Robotics Research, Department of Engineering, King’s College London and an Enrichment Student at the Alan Turing Institute, and Hui Xin Ng, a PhD candidate in Cognitive Science at UC San Diego and a visiting researcher at the Centre for Medical Image Computing at UCL.

In the first talk, Yuhan provided a concise introduction to integrating multimodal data in medical applications. Using Alzheimer’s disease as a focal point, she illustrated how imaging, genetic, and tabular data can be combined to detect the disease, and outlined the challenges of using multimodal data for Alzheimer’s disease detection.

In the second talk, Hui Xin introduced interpretability methods, discussing their applications and limitations. She also gave a short demo of the Python library Captum, showing how to identify which brain features from a neuroimaging dataset were most important to a model's prediction and how the model reached it. Demo materials can be found here: github.com/nghuixin/TSciM-Club_May2024
