Beyond backpropagation
Title: |
Beyond backpropagation |
DNr: |
Berzelius-2024-358 |
Project Type: |
LiU Berzelius |
Principal Investigator: |
Rasmus Kjær Høier <hier@chalmers.se> |
Affiliation: |
Chalmers tekniska högskola |
Duration: |
2024-10-01 – 2025-04-01 |
Classification: |
10207 |
Homepage: |
https://www.chalmers.se/en/persons/hier/ |
Keywords: |
|
Abstract
Today, backpropagation is the primary learning algorithm used in deep learning. However, in recent years interest in fully local learning algorithms has increased due to their potential for energy efficiency and speed on next-generation neuromorphic hardware. Among the various local learning algorithms, single-phase contrastive Hebbian learning algorithms are particularly promising, as they do not require computing derivatives explicitly and, in certain cases, are capable of learning with neurons operating asynchronously. In this project we plan to apply such algorithms to challenging tasks.
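As background, the contrastive Hebbian principle can be sketched in its classic two-phase form: weights are updated from the difference between neural correlations in an output-clamped phase and a free phase, with no explicit derivatives. This is a minimal illustrative sketch only (the toy linear network, learning rate, and data are assumptions, not the project's method; the single-phase algorithms studied here avoid the separate second phase):

```python
import numpy as np

# Illustrative two-phase contrastive Hebbian update on a toy linear layer.
# All names and values here are assumptions for illustration only.
rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(2, 3))   # weights: 3 inputs -> 2 outputs

x = rng.normal(size=3)                   # input activity
y_free = W @ x                           # free phase: network's own output
y_clamped = np.array([1.0, -1.0])        # clamped phase: output held at target

# Hebbian contrast: reinforce clamped-phase correlations, weaken
# free-phase correlations -- a purely local update, no backprop.
lr = 0.1
W += lr * (np.outer(y_clamped, x) - np.outer(y_free, x))
```

After the update, the network's output on the same input moves toward the clamped target, which is how repeated local contrasts drive learning.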
This work is a continuation of previous work carried out in our research group:
- Høier & Zach, Lifted Regression/Reconstruction Networks, BMVC 2020
- Zach, Bilevel Programs Meet Deep Learning: A Unifying View on Inference Learning Methods, 2021
- Le, Høier, Lin & Zach, AdaSTE: An Adaptive Straight-Through Estimator to Train Binary Neural Networks, CVPR 2022
- Høier, Staudt & Zach, Dual Propagation: Accelerating Contrastive Hebbian Learning with Dyadic Neurons, ICML 2023
- Høier & Zach, A Lagrangian Perspective on Dual Propagation, MLNCP workshop @ NeurIPS
- Høier & Zach, Two Tales of Single-Phase Contrastive Hebbian Learning, ICML 2024
- Høier, Kalinin, Ernoult & Zach, Dyadic Learning in Recurrent and Feedforward Networks, in review at the Machine Learning with New Compute Paradigms (MLNCP) workshop at NeurIPS 2024