Scaling Fenchel Backpropagation
Rasmus Kjær Høier <firstname.lastname@example.org>
Chalmers tekniska högskola
2022-10-01 – 2023-04-01
The backpropagation of error algorithm has been the driving force behind recent progress in artificial intelligence, but it relies on a number of biologically implausible features. Research aiming to provide biologically plausible (or at least less implausible) alternatives typically accounts for some of these issues, but at the cost of reduced accuracy and/or greatly increased runtime. We plan to explore an alternative algorithm, Fenchel backpropagation (Zach, Bilevel Programs Meet Deep Learning: A Unifying View on Inference Learning Methods, 2021; Le et al., AdaSTE: An Adaptive Straight-Through Estimator to Train Binary Neural Networks, 2022; Høier & Zach, Lifted Regression/Reconstruction Networks, 2020), which incorporates the biologically plausible features typically found in contrastive Hebbian learning algorithms while remaining comparable to backpropagation in terms of accuracy and runtime. In this project we plan to scale Fenchel backpropagation to challenging datasets for the first time, and to evaluate the effect of biologically motivated synaptic constraints on the overall performance of the algorithm.
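To make the family of methods concrete, the cited works build on lifted (energy-based) formulations of network training, in which the hidden activations are treated as free optimization variables rather than being produced by a forward pass. A rough sketch of such an objective, with illustrative notation not taken verbatim from the cited papers, is:

```latex
% Hedged sketch of a lifted training objective; symbols are illustrative.
% z_1, ..., z_L are the (now free) layer activations, W_k the weights,
% \ell a task loss, and each E_k a layer-wise coupling energy.
\min_{\{W_k\},\, \{z_k\}} \;\; \ell(z_L, y)
  \;+\; \sum_{k=1}^{L-1} E_k\!\bigl(z_{k+1},\, W_k z_k\bigr)
```

In Fenchel-based variants, the coupling energies $E_k$ are constructed from convex (Fenchel) conjugates of the activation functions, so that minimizing over the activations recovers the usual forward pass. The appeal for biological plausibility is that the resulting weight update for $W_k$ depends only on the locally available quantities $z_k$ and $z_{k+1}$, rather than on a globally propagated error signal.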