Backpropagation Implementation on Progressive Learning Network
Title: |
Backpropagation Implementation on Progressive Learning Network |
DNr: |
SNIC 2018/7-74 |
Project Type: |
SNIC Small Compute |
Principal Investigator: |
Alireza Mahdavi Javid <almj@kth.se> |
Affiliation: |
Kungliga Tekniska högskolan |
Duration: |
2018-11-14 – 2019-12-01 |
Classification: |
10201 |
Keywords: |
|
Abstract
We design an algorithm for constructing the structure of a
feed-forward neural network. The number of layers and the number
of nodes in each individual layer are the main parameters of
a structure. Our algorithm provides these parameters using
a progressive learning approach. The algorithm starts with a
small-size neural network and progressively grows it into a
large-size neural network. Nodes and layers are added under a
forward-learning principle that ensures progressive improvement in
cost minimization. Progressive improvement guarantees a monotonically
decreasing cost with each new addition of either a node or a layer. We
show that the rectified linear unit (ReLU) and some of its variants
satisfy a progression property. Our neural network is built on
structured weight matrices and the progression property. One part
of a structured weight matrix is optimized for cost minimization
and the other part is chosen as a random instance. We formulate
a sequence of layer-wise cost optimization problems that are
convex under an appropriate regularizing constraint. Layer-wise
convex cost optimization allows an efficient computational solution
using the alternating direction method of multipliers (ADMM),
leading to fast execution of our algorithm. We explore further
optimization of the weight matrices using backpropagation.
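As a rough illustration of the growth principle described above, the sketch below (our own simplification, not the project's actual algorithm) adds blocks of random-weight ReLU nodes layer by layer and solves the convex output-layer problem in closed form via ridge regression, standing in for the ADMM solver mentioned in the abstract. An addition is kept only when the training cost does not increase, which enforces the monotonically decreasing cost:

```python
import numpy as np


def relu(x):
    return np.maximum(x, 0.0)


def layer_cost(H, T, lam=1e-2):
    """Solve the convex output-layer least-squares (ridge) problem
    for feature matrix H and targets T, and return (cost, weights).
    A closed-form solve is used here in place of ADMM."""
    O = np.linalg.solve(H.T @ H + lam * np.eye(H.shape[1]), H.T @ T)
    return float(np.mean((H @ O - T) ** 2)), O


def grow_network(X, T, max_layers=3, nodes_per_step=10,
                 steps_per_layer=5, seed=0):
    """Progressively grow a feed-forward network: hidden weights are
    random instances, output weights are optimized by the convex solve.
    A node block is accepted only if the cost does not increase, so the
    recorded cost sequence is monotonically non-increasing."""
    rng = np.random.default_rng(seed)
    H = X                       # current layer input (starts at the data)
    best_cost, _ = layer_cost(H, T)
    costs = []
    for _ in range(max_layers):
        Z = np.empty((X.shape[0], 0))     # features of the growing layer
        for _ in range(steps_per_layer):
            W = rng.standard_normal((H.shape[1], nodes_per_step))
            W /= np.sqrt(H.shape[1])      # simple scaling of random part
            Z_new = np.hstack([Z, relu(H @ W)])
            cost, _ = layer_cost(Z_new, T)
            if cost <= best_cost:         # progression: keep only if cost drops
                Z, best_cost = Z_new, cost
                costs.append(cost)
        if Z.shape[1] > 0:                # accepted nodes become next input
            H = Z
    return costs
```

The random/optimized split of each weight matrix mirrors the structured-matrix idea in the abstract; backpropagation fine-tuning of the accepted weights, as proposed in the project, would be applied after this constructive phase.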