Paper: https://arxiv.org/abs/2411.16346
Recent Publications
A Foundation Model for Intensive Care: Unlocking Generalization across Tasks and Domains at Scale
(Spotlight at the ML4H Findings Track; invited talk at the AI in Medicine Workshop, BIFOLD Institute)
Manuel Burger, Daphné Chopard, Malte Londschien, Fedor Sergeev, Hugo Yèche, Rita Kuznetsova, Martin Faltys, Eike Gerdes, Polina Leshetkina, Peter Bühlmann, Gunnar Rätsch
Working Paper on medRxiv
Towards Foundation Models for Critical Care Time Series
(Best Paper Award)
Manuel Burger*, Fedor Sergeev*, Malte Londschien, Daphné Chopard, Hugo Yèche, Eike Gerdes, Polina Leshetkina, Alexander Morgenroth, Zeynep Babür, Jasmina Bogojeska, Martin Faltys, Rita Kuznetsova, Gunnar Rätsch
AIM-FM @ NeurIPS 2024
Multi-Modal Contrastive Learning for Online Clinical Time-Series Applications
Fabian Baldenweg, Manuel Burger, Gunnar Rätsch, Rita Kuznetsova
TS4H @ ICLR 2024
Recent Posts

Clinicians are increasingly looking towards machine learning to gain insights about patient trajectories. We propose a novel approach named Multi-Modal UMLS Graph Learning (MMUGL) for learning meaningful representations of medical concepts using graph neural networks over knowledge graphs derived from the Unified Medical Language System (UMLS). These concept representations are aggregated to represent entire patient visits and then fed into a sequence model to perform predictions at the granularity of multiple hospital visits of a patient. We improve performance by incorporating prior medical knowledge and by considering multiple modalities. We compare our method against existing architectures that learn representations at different granularities on the MIMIC-III dataset and show that our approach outperforms them. The results demonstrate the significance of multi-modal medical concept representations grounded in prior medical knowledge.
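The pipeline described above (concept representations from a graph neural network over a knowledge graph, mean-pooled into visit representations, then passed through a sequence model for patient-level prediction) can be sketched as follows. This is a minimal toy illustration, not the MMUGL implementation: the graph, visit sets, and the simple recurrent read-out are all hypothetical stand-ins for the actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: 6 medical concepts, embedding dimension 4.
n_concepts, d = 6, 4
concept_emb = rng.normal(size=(n_concepts, d))

# Toy knowledge-graph adjacency (symmetric, with self-loops) over concepts.
adj = np.eye(n_concepts)
for i, j in [(0, 1), (1, 2), (3, 4), (4, 5)]:
    adj[i, j] = adj[j, i] = 1.0

# One graph-convolution step: mean over neighbours, then a nonlinearity.
deg = adj.sum(axis=1, keepdims=True)
concept_repr = np.tanh((adj @ concept_emb) / deg)

# Each visit is a set of concept indices; aggregate by mean pooling.
visits = [[0, 1, 2], [3, 4]]
visit_repr = np.stack([concept_repr[v].mean(axis=0) for v in visits])

# Minimal recurrent read-out over the visit sequence (a plain RNN cell,
# standing in for the paper's sequence model).
W_h, W_x = rng.normal(size=(d, d)), rng.normal(size=(d, d))
h = np.zeros(d)
for x in visit_repr:
    h = np.tanh(W_h @ h + W_x @ x)

# Linear head producing a per-patient probability (e.g. a risk score).
w_out = rng.normal(size=d)
score = float(1 / (1 + np.exp(-w_out @ h)))
print(round(score, 3))
```

The key design point this sketch mirrors is that prior medical knowledge enters through the graph structure: concept embeddings are smoothed over UMLS neighbours before any patient data is aggregated, so related concepts share statistical strength.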
Recent Talks
Invited Talk
Scaling Critical Care AI: Towards Conversational Interactions with Patient Data
AI in Medicine Workshop
@ Charité Berlin and BIFOLD Institute, Berlin, Germany
Nov 2025
Spotlight Talk
Towards Foundation Models for Critical Care Time Series
AIM-FM Workshop
@ NeurIPS 2024, Vancouver, Canada
Dec 2024
Awards & Honors
Best Paper
Best Paper Award
@ AIM-FM Workshop, NeurIPS 2024
Dec 2024
Best Reviewer
Best Reviewer Award
@ ML4H 2024
Dec 2024

