Title: Accelerating Physical Simulations with Efficient Generative Models
DNr: Berzelius-2025-246
Project Type: LiU Berzelius
Principal Investigator: Sebastian Dalleiger <sdall@kth.se>
Affiliation: Kungliga Tekniska högskolan
Duration: 2025-08-05 – 2026-03-01
Classification: 10210
Keywords:

Abstract

Machine learning (ML) has recently emerged as a viable approach for a wide variety of scientific applications and is used to solve problems in chemistry, the life sciences, and materials science, particularly in computational simulation. Prominent research problems include simulating molecular dynamics, assessing molecular conformations, computing the energy landscapes of materials, and folding proteins. Most research on ML-based simulation is dedicated to improving simulation speed, providing automated end-to-end experimental design pipelines, or enhancing the interpretability of these models. Although these advances are impressive, ML-based simulations struggle with scalability and efficiency: they are significantly faster than solving ODEs at atomic scale only when accurate force/energy landscapes have been pre-trained and inference remains within the interpolation region. Outside of these conditions, generalization and energy drift remain problematic. Grounded in principles from optimal transport, geometric machine learning, and physics, we address these issues, and ultimately aim to produce more efficient ML-based computational simulations, by developing novel objective functions that explicitly encode geometric and physical constraints as well as global topological structure into the training and sampling process. Moreover, our models leverage coarse-graining for efficiency, while Schrödinger-Bridge-inspired or Fokker-Planck-based regularization ensures stable and physically consistent sampling. Developing these methods will not only enhance the stability and accuracy of ML-based simulations over long time horizons, but also extend their applicability beyond the training distribution, enabling reliable, scalable models that retain physical fidelity across diverse settings in the molecular, materials, and life sciences.
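The objective-design idea above, augmenting a data-fitting loss with a term that encodes a known physical constraint, can be illustrated with a deliberately simple toy. The sketch below is an assumption-laden illustration, not the project's actual formulation: a linear model f(x) = a*x + b is fit to noisy samples of a 1D harmonic force, while a penalty on b encodes the physical knowledge that a central force is antisymmetric, f(-x) = -f(x). The names, the weight `lam`, and the learning rate are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 1D harmonic force f(x) = -k*x with k = 2.0, plus small noise.
k_true = 2.0
x = rng.uniform(-1.0, 1.0, size=256)
f = -k_true * x + 0.01 * rng.normal(size=x.size)

def loss_and_grad(a, b, lam):
    """Composite objective: data MSE + lam * physics penalty.

    Model: f_theta(x) = a*x + b. The physics term penalizes b**2,
    encoding the antisymmetry f(-x) = -f(x) of a central force,
    which forces the bias toward zero.
    """
    pred = a * x + b
    resid = pred - f
    data = np.mean(resid ** 2)
    grad_a = 2.0 * np.mean(resid * x)
    grad_b = 2.0 * np.mean(resid) + 2.0 * lam * b
    return data + lam * b ** 2, grad_a, grad_b

# Plain gradient descent on the regularized objective.
a, b, lam, lr = 0.0, 0.5, 10.0, 0.05
for _ in range(500):
    _, ga, gb = loss_and_grad(a, b, lam)
    a -= lr * ga
    b -= lr * gb

print(f"estimated k = {-a:.3f}, bias b = {b:.5f}")
```

The same structure, a likelihood or score-matching term plus a weighted physics residual, carries over to the far richer constraints named in the abstract (geometric, topological, or Fokker-Planck-based), where the penalty is evaluated on the model's drift or sampling dynamics rather than on a scalar bias.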