Beyond backpropagation
Title: Beyond backpropagation
DNr: Berzelius-2023-227
Project Type: LiU Berzelius
Principal Investigator: Rasmus Kjær Høier <hier@chalmers.se>
Affiliation: Chalmers tekniska högskola
Duration: 2023-09-07 – 2024-04-01
Classification: 10207
Keywords:

Abstract

Today, backpropagation is the primary learning algorithm used in deep learning. In recent years, however, interest in fully local learning algorithms has grown because of their potential for energy efficiency and speed on next-generation neuromorphic hardware. Among local learning algorithms, single-phase contrastive Hebbian learning algorithms are particularly promising: they do not require computing derivatives explicitly and, in certain cases, can learn with neurons operating asynchronously (a generic update of this kind is sketched below). In this project we plan to apply such algorithms to challenging sequence-learning tasks (e.g. audio and video), which has not previously been done. This will be computationally demanding, so we cannot run the experiments on our local machines.

This work is a continuation of previous work carried out in our research group:
- Høier & Zach, Lifted Regression/Reconstruction Networks, BMVC 2020
- Zach, Bilevel Programs Meet Deep Learning: A Unifying View on Inference Learning Methods, 2021
- Le, Høier, Lin & Zach, AdaSTE: An Adaptive Straight-Through Estimator to Train Binary Neural Networks, CVPR 2022
- Høier, Staudt & Zach, Dual Propagation: Accelerating Contrastive Hebbian Learning with Dyadic Neurons, ICML 2023 (this paper was made possible by a previous resource allocation on the Berzelius cluster)
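To make the locality claim above concrete, the following is a minimal sketch of a generic contrastive Hebbian weight update for one fully connected layer. It is an illustration only, not the project's actual algorithm: the variable names (W, x, s_pos, s_neg, lr) and the way the two neuron states are produced here are assumptions made purely for demonstration. The point is that the weight change depends only on locally available pre- and postsynaptic activity in two network states, with no explicit derivative of a global loss.

# Illustrative sketch of a generic contrastive Hebbian update (assumed setup,
# not the project's implementation).
import numpy as np

rng = np.random.default_rng(0)

n_in, n_out = 8, 4
W = rng.normal(scale=0.1, size=(n_out, n_in))   # layer weights
lr = 0.01                                       # learning rate

x = rng.normal(size=n_in)                       # presynaptic activity
s_neg = np.tanh(W @ x)                          # "negative" (free) state
s_pos = np.tanh(W @ x) + 0.05                   # "positive" (nudged) state, placeholder nudge

# Contrastive Hebbian rule: the update uses only the local pre- and
# postsynaptic activities of the two states; no backpropagated gradients.
W += lr * np.outer(s_pos - s_neg, x)

In a single-phase scheme such as the dyadic-neuron formulation cited above, the two states are maintained and updated together rather than in separate clamped and free phases; the two-state difference shown here is only meant to convey the locality of the resulting weight update.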