Title: Memorization in Deep Networks
DNr: Berzelius-2021-52
Project Type: LiU Berzelius
Principal Investigator: Mårten Björkman <celle@kth.se>
Affiliation: Kungliga Tekniska högskolan
Duration: 2021-09-30 – 2022-04-01
Classification: 10207
Keywords:

Abstract

Deep networks achieve state-of-the-art performance on a wide range of real-world tasks. At the same time, modern architectures have enough capacity to perfectly fit, and thus shatter, common benchmark datasets. At present, the mechanisms governing learning and memorization in deep networks are not fully understood. In this work, we exploit the local geometry of convolutional and dense layers to empirically investigate which layers are responsible for memorization. Importantly, ReLU activations, the most popular non-linearities in feed-forward networks, make it possible to interpret a model through the lens of activation regions and hyperplane arrangements, opening an avenue for geometric analysis. In this project, we focus on activation regions to contrast learning with memorization at each individual layer of several trained networks, with the goal of developing a measure of the generality of the features learned by each layer.
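
To illustrate the activation-region view mentioned above: each input to a ReLU network induces a binary code recording which units fire, and inputs sharing a code lie in the same linear region. The sketch below, a minimal and purely illustrative PyTorch example (the toy `ToyMLP` model and the `activation_pattern` helper are assumptions, not the project's actual networks or method), extracts these codes and counts distinct regions crossed along a line segment between two inputs.

```python
import torch
import torch.nn as nn

# Hypothetical toy MLP; the project's actual architectures are not specified here.
class ToyMLP(nn.Module):
    def __init__(self, in_dim=2, hidden=16, depth=3):
        super().__init__()
        layers, d = [], in_dim
        for _ in range(depth):
            layers += [nn.Linear(d, hidden), nn.ReLU()]
            d = hidden
        self.features = nn.Sequential(*layers)

    def activation_pattern(self, x):
        """Binary code of which ReLU units fire; inputs sharing a
        code lie in the same linear (activation) region."""
        codes, h = [], x
        for layer in self.features:
            h = layer(h)
            if isinstance(layer, nn.ReLU):
                codes.append((h > 0).flatten(1))
        return torch.cat(codes, dim=1)

# Probe the local geometry by sampling points on a segment between
# two inputs and counting how many distinct activation codes appear.
model = ToyMLP()
a, b = torch.randn(1, 2), torch.randn(1, 2)
ts = torch.linspace(0, 1, 1000).unsqueeze(1)
points = (1 - ts) * a + ts * b  # 1000 points on the segment from a to b
with torch.no_grad():
    patterns = model.activation_pattern(points)
n_regions = len({tuple(p.tolist()) for p in patterns})
print(f"distinct activation regions along the segment: {n_regions}")
```

Counting codes along such one-dimensional probes is one common way to estimate the density of activation regions; comparing these counts across layers and between clean and corrupted labels is in the spirit of contrasting learning with memorization, though the project's precise measure is not described in this abstract.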