Title: Graph-based, spatial, temporal, and generative machine learning
DNr: Berzelius-2024-114
Project Type: LiU Berzelius
Principal Investigator: Fredrik Lindsten
Affiliation: Linköpings universitet
Duration: 2024-04-01 – 2024-10-01
Classification: 10201


This is a joint proposal for four separate projects with the same PI. The projects are outlined below.

Generative modeling of discrete data:
This project aims to develop methods for the generation of discrete data. Beyond general method development, the aim is to apply these methods to the generation of molecules and proteins, two important and promising applications of machine learning. The project extends previous work [1], potentially combining discriminator guidance with previously developed methods for conditional sampling, such as classifier guidance.

[1] Discriminator Guidance for Autoregressive Diffusion Models. Ekström Kelvinius and Lindsten. AISTATS, 2024.

Deep Learning for Weather Forecasting:
In this project we investigate the use of deep learning for Numerical Weather Prediction (NWP). Recent works have shown that machine learning models can learn highly accurate approximations of NWP systems while producing predictions in a fraction of the time required by the original system [1]. When incorporating real observations, deep learning models can even surpass the accuracy of the original NWP systems. Still, many questions about these models remain, in particular related to forecast uncertainty, regional forecasting, and moving from the weather to the climate timescale. This project is a continuation of our existing work in this area [2]. We aim to continue developing machine learning models with new capabilities for regional and global weather forecasting. The compute and storage resources of Berzelius allow us to perform this research at the cutting edge of this fast-moving area. The main future activities planned in this project include:
1) Running final experiments for the publication on our newly developed probabilistic weather forecasting model.
2) Further developing and evaluating regional machine learning weather forecasting models.
3) Extending the probabilistic forecasting model using a diffusion model formulation.
In this project we are working with collaborators at both meteorological institutes and academic institutions, including SMHI, DMI, MeteoSwiss, and ETH Zürich. We additionally have two incoming PhD students starting work on this project, allowing us to accelerate the research and naturally also increasing our use of compute resources. We aim to publish results from this research in both the machine learning and NWP literature.

[1] The Rise of Data-Driven Weather Forecasting. Ben-Bouallegue et al. Preprint, 2023.
[2] Graph-based Neural Weather Prediction for Limited Area Modeling. Oskarsson et al. NeurIPS 2023 Workshop on Tackling Climate Change with Machine Learning, 2023.

Denoising Diffusion-based Sequential Monte Carlo Sampler:
Denoising diffusion models are a class of generative models known for their state-of-the-art performance across various domains [1, 2]. The core idea is to employ a noising diffusion process that transforms the data distribution into a Gaussian distribution. Samples from diffusion models are obtained by simulating an approximation of the time reversal of this diffusion process, typically initialized with Gaussian noise. One common approach is to optimize a denoising score matching objective, which learns a score function corresponding to the gradient of an intractable marginal distribution. We aim to extend this idea to sample approximately from a given target distribution. Specifically, we consider a noising diffusion process under which the target distribution gradually diffuses towards a Gaussian. However, denoising score matching is not applicable in this setting, as we cannot sample directly from the target distribution. We therefore aim to develop a sampler that integrates the denoising diffusion idea with the sequential Monte Carlo (SMC) algorithm [3].

[1] Denoising Diffusion Probabilistic Models. Jonathan Ho et al.
NeurIPS, 2020.
[2] Score-Based Generative Modeling through Stochastic Differential Equations. Yang Song et al. ICLR, 2021.
[3] Graphical Model Inference: Sequential Monte Carlo Meets Deterministic Approximations. Fredrik Lindsten et al. NeurIPS, 2018.

Learning Geometry-aware Representations:
This project aims to bring geometrical interpretations to learned representation spaces. Both generative modeling and unsupervised representation learning rely on abstract learned (latent) representations to perform their tasks: generating data in the former and facilitating downstream tasks in the latter. Previous work [1, 2] has shown that ignoring the metric induced on these representations can lead to wrong conclusions about them. This can be alleviated by equipping them with a metric "pulled back" from the observable data space via, for instance, a decoder that maps latent representations to observable data. We aim to achieve a similar goal, but by inducing the metric explicitly. This allows for greater flexibility and awareness of the metric, and facilitates use of the latent space by choosing a convenient metric from the start, instead of letting the model induce one during normal training. Preliminary results [3] show the promising capabilities of this approach.

[1] Latent Space Oddity: On the Curvature of Deep Generative Models. Georgios Arvanitidis, Lars Kai Hansen, and Søren Hauberg. ICLR, 2018.
[2] Geometrically Enriched Latent Spaces. Georgios Arvanitidis, Søren Hauberg, and Bernhard Schölkopf. AISTATS, 2021.
[3] Exploring the Poincaré Ellipsis. S. G. Fadel, T. Paulsen, and U. Brefeld. International Workshop on Mining and Learning with Graphs (ECML/PKDD), 2023.
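As a concrete illustration of the pulled-back metric idea described above, the following Python sketch computes the pullback metric G(z) = J(z)^T J(z) from the Jacobian J of a decoder, and uses it to measure the length of a latent curve. The toy decoder and all function names are our own illustrative choices, not code from the cited works; any smooth trained decoder could take the toy decoder's place.

```python
import numpy as np

# Hypothetical toy decoder f: R^2 -> R^3 mapping latent codes to data space.
# A trained neural network decoder would play this role in practice.
def decoder(z):
    x, y = z
    return np.array([np.sin(x), np.cos(y), x * y])

def jacobian(f, z, eps=1e-6):
    """Finite-difference Jacobian of f at z (rows: outputs, cols: latents)."""
    z = np.asarray(z, dtype=float)
    f0 = f(z)
    J = np.zeros((f0.size, z.size))
    for i in range(z.size):
        dz = np.zeros_like(z)
        dz[i] = eps
        J[:, i] = (f(z + dz) - f0) / eps
    return J

def pullback_metric(f, z):
    """Riemannian metric on latent space pulled back through the decoder:
    G(z) = J(z)^T J(z), a symmetric positive semi-definite matrix."""
    J = jacobian(f, z)
    return J.T @ J

def curve_length(f, z_path):
    """Length of a discretized latent curve measured under the pullback metric,
    summing sqrt(dz^T G(z) dz) over segments with G evaluated at midpoints."""
    length = 0.0
    for a, b in zip(z_path[:-1], z_path[1:]):
        G = pullback_metric(f, 0.5 * (a + b))
        d = b - a
        length += np.sqrt(d @ G @ d)
    return length

z = np.array([0.3, -0.8])
G = pullback_metric(decoder, z)
# G generally differs from the Euclidean identity metric, so straight
# latent-space lines are not geodesics of the decoded data manifold.
straight = np.linspace([0.0, 0.0], [1.0, 1.0], 50)
print(curve_length(decoder, straight))
```

Replacing the automatically induced G with an explicitly chosen metric, as the project proposes, amounts to designing this matrix field directly rather than deriving it from a decoder's Jacobian.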
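Returning to the Denoising Diffusion-based Sequential Monte Carlo Sampler project above: to make the SMC ingredient concrete, here is a minimal annealed SMC sampler in plain Python. It moves particles from an easy reference (a standard normal) to a target through tempered intermediate distributions, with importance reweighting, adaptive resampling, and a Metropolis move at each step. This is a classical annealed SMC sketch under our own illustrative choices (the mixture target, schedule, and step sizes), not the diffusion-based sampler the project proposes, where the bridging distributions would instead come from a noising diffusion process.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D target: an unnormalized two-mode Gaussian mixture
# (our choice for illustration only).
def log_target(x):
    return np.logaddexp(-0.5 * (x - 2.0) ** 2, -0.5 * (x + 2.0) ** 2)

def log_prior(x):
    """Easy-to-sample reference distribution: standard normal."""
    return -0.5 * x ** 2

N, n_steps = 2000, 20
betas = np.linspace(0.0, 1.0, n_steps + 1)  # annealing schedule beta: 0 -> 1

x = rng.standard_normal(N)  # particles initialized from the reference
logw = np.zeros(N)
for b_prev, b in zip(betas[:-1], betas[1:]):
    # Reweight: incremental weights between successive tempered targets
    # pi_b(x) proportional to prior(x)^(1-b) * target(x)^b.
    logw += (b - b_prev) * (log_target(x) - log_prior(x))
    w = np.exp(logw - logw.max()); w /= w.sum()
    # Resample when the effective sample size drops below N/2.
    if 1.0 / np.sum(w ** 2) < N / 2:
        x = rng.choice(x, size=N, p=w)
        logw = np.zeros(N)
    # Move: one random-walk Metropolis step targeting pi_b.
    prop = x + 0.5 * rng.standard_normal(N)
    log_ratio = ((1 - b) * (log_prior(prop) - log_prior(x))
                 + b * (log_target(prop) - log_target(x)))
    accept = np.log(rng.random(N)) < log_ratio
    x = np.where(accept, prop, x)

w = np.exp(logw - logw.max()); w /= w.sum()
est = np.sum(w * x)  # weighted posterior mean; should be near 0 by symmetry
print(est)
```

In the proposed method, the hand-picked tempering path would be replaced by the marginals of a diffusion that noises the target towards a Gaussian, with learned scores guiding the particle moves.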