Title: Diffusion Based Representation Learning
SNIC Project: Berzelius-2021-89
Project Type: LiU Berzelius
Principal Investigator: Stefan Bauer <stefan.a.bauer@gmail.com>
Affiliation: Kungliga Tekniska högskolan
Duration: 2021-12-09 – 2022-07-01
Classification: 10201
Homepage: https://arxiv.org/abs/2105.14257
Keywords:

Abstract

Score-based generative models, formulated as stochastic differential equations on a continuous time domain, have recently proven successful as non-adversarial generative models. In particular, they have achieved new state-of-the-art performance on image generation while offering theoretical guarantees. Training such models relies on denoising score matching, which can be interpreted as a multi-scale denoising autoencoder. Here, we augment the denoising score-matching framework to enable representation learning without any supervised signal. GANs and VAEs learn representations by directly transforming latent codes into data samples. In contrast, the introduced diffusion-based representation learning relies on a new formulation of the denoising score-matching objective and therefore encodes the information needed for denoising. We illustrate how this difference allows for manual control of the level of detail encoded in the representation. Using the same approach, we propose to learn an infinite-dimensional latent code, which achieves improvements over state-of-the-art models on semi-supervised image classification. As a side contribution, we show how adversarial training in score-based models can improve sample quality, and how a new approximation of the prior at smaller noise scales can improve sampling speed.
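To make the core idea concrete, the sketch below shows a Monte-Carlo estimate of a denoising score-matching loss in which the score network is additionally conditioned on a representation produced by an encoder from the clean input, so that the encoder must capture the information useful for denoising. This is a minimal NumPy illustration, not the paper's implementation: the linear `encoder_fn` and `score_fn`, the fixed noise scale `sigma`, and all variable names are assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def dsm_loss(score_fn, encoder_fn, x, sigma, n_noise=64):
    """Monte-Carlo estimate of a representation-conditioned denoising
    score-matching objective:
        E_eps || s_theta(x + sigma*eps, z) + eps/sigma ||^2,
    where z = encoder(x) is computed from the *clean* input, so the
    encoder is pushed to carry information that helps denoising.
    (Hypothetical toy formulation, not the paper's exact objective.)"""
    losses = []
    for _ in range(n_noise):
        eps = rng.standard_normal(x.shape)
        x_noisy = x + sigma * eps
        z = encoder_fn(x)
        # Score of the Gaussian perturbation kernel N(x_noisy; x, sigma^2 I)
        # with respect to x_noisy is -(x_noisy - x)/sigma^2 = -eps/sigma.
        target = -eps / sigma
        pred = score_fn(x_noisy, z, sigma)
        losses.append(np.mean((pred - target) ** 2))
    return float(np.mean(losses))

# Toy linear maps standing in for the encoder and score networks.
W_enc = rng.standard_normal((4, 2)) * 0.1

def encoder_fn(x):
    return x @ W_enc  # 2-dimensional representation per sample

def score_fn(x_noisy, z, sigma):
    # Placeholder score: pull the noisy sample toward a point
    # reconstructed from the representation z.
    return -(x_noisy - z @ W_enc.T) / sigma**2

x = rng.standard_normal((8, 4))
loss = dsm_loss(score_fn, encoder_fn, x, sigma=0.5)
print(loss)
```

In a real training loop both maps would be neural networks and the loss would be averaged over noise scales and minimized by gradient descent; the sketch only shows how conditioning the score estimate on `z` ties the representation to the denoising task.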