Decoupling Backpropagation using Constrained Optimization Methods
We propose BlockProp, a neural network training algorithm. Unlike backpropagation, it does not rely on direct top-to-bottom propagation of an error signal. Instead, by interpreting backpropagation as a constrained optimization problem, we split the neural network into sets of layers (blocks) that must satisfy a consistency constraint: the output of one block must equal the input of the next. These decoupled blocks are then updated with the gradient of the constraint violation. The main advantage of this formulation is that it decouples the propagation of the error signal across the different subparts (blocks) of the network, making it particularly relevant for multi-device applications.
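The idea can be illustrated with a minimal sketch. Below, a two-block toy network is trained with a quadratic-penalty relaxation of the consistency constraint: an auxiliary variable `z` stands in for the output of block 1, block 2 fits the targets from `z`, and block 1 only sees the gradient of the constraint violation, never a globally backpropagated error. The toy setup, names, penalty weight `rho`, and simultaneous updates are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear-regression data.
X = rng.normal(size=(64, 8))
W_true = rng.normal(size=(8, 1))
Y = X @ W_true

# Two "blocks": block 1 maps 8 -> 4, block 2 maps 4 -> 1.
W1 = rng.normal(size=(8, 4)) * 0.1
W2 = rng.normal(size=(4, 1)) * 0.1

# Auxiliary variable z replaces the hard constraint z = block1(X);
# it is relaxed with a quadratic penalty (rho / 2) * ||X @ W1 - z||^2.
z = X @ W1
rho, lr, n = 1.0, 0.05, len(X)

for step in range(200):
    pred = z @ W2

    # Block 2 sees only (z, Y): plain local regression gradient.
    grad_W2 = z.T @ (pred - Y) / n
    # z balances block 2's loss against the consistency penalty.
    grad_z = (pred - Y) @ W2.T / n + rho * (z - X @ W1)
    # Block 1 sees only the constraint violation, no end-to-end error signal.
    grad_W1 = rho * X.T @ (X @ W1 - z) / n

    # Simultaneous updates: each block could run on a separate device.
    W2 -= lr * grad_W2
    z -= lr * grad_z
    W1 -= lr * grad_W1

mse = float(np.mean((X @ W1 @ W2 - Y) ** 2))
```

Because each gradient depends only on quantities local to one block (plus the shared auxiliary variable), the three updates can be computed independently, which is what makes the formulation attractive for multi-device training.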