Deep Neural Networks With Trainable Activations and Controlled Lipschitz Constant

We introduce a variational framework to learn the activation functions of deep neural networks. Our aim is to increase the capacity of the network while controlling an upper bound on the actual Lipschitz constant of the input-output relation. To that end, we first establish a global bound on the Lipschitz constant of neural networks. Based on this bound, we then formulate a variational problem for learning activation functions. The problem is infinite-dimensional and not computationally tractable as such. However, we prove that it always admits a solution with continuous and piecewise-linear (linear-spline) activations. This reduces the original problem to a finite-dimensional minimization in which an ℓ1 penalty on the parameters of the activations favors the learning of sparse nonlinearities. We numerically compare our scheme with standard ReLU networks and their variants, PReLU and LeakyReLU, and we empirically demonstrate the practical advantages of our framework.
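The sketch below illustrates the core idea described in the abstract: activations that are themselves trainable linear splines, regularized by an ℓ1 penalty on their coefficients to favor sparse nonlinearities. It is not the authors' implementation; the particular spline parameterization (identity plus a sum of shifted ReLUs on a fixed knot grid), the module names, and all hyperparameter values are illustrative assumptions.

```python
# Minimal sketch (assumed parameterization, not the paper's released code):
# each activation is a learnable continuous piecewise-linear function,
# written here as x + sum_k a_k * relu(x - t_k) over fixed knots t_k,
# with an l1 penalty on the coefficients a_k promoting sparse nonlinearities.

import torch
import torch.nn as nn


class LinearSplineActivation(nn.Module):
    """Trainable piecewise-linear activation: x + sum_k a_k * relu(x - t_k)."""

    def __init__(self, num_knots=21, knot_range=3.0):
        super().__init__()
        # Fixed, evenly spaced knots; only the coefficients a_k are learned.
        self.register_buffer("knots", torch.linspace(-knot_range, knot_range, num_knots))
        self.coeffs = nn.Parameter(torch.zeros(num_knots))  # zero init -> identity map

    def forward(self, x):
        # Broadcast over a trailing knot dimension, then sum the ReLU basis.
        relu_basis = torch.relu(x.unsqueeze(-1) - self.knots)
        return x + (relu_basis * self.coeffs).sum(dim=-1)

    def l1_penalty(self):
        # l1 norm of the spline coefficients (the sparsity-promoting term).
        return self.coeffs.abs().sum()


class SplineNet(nn.Module):
    """Tiny fully connected network with trainable spline activations."""

    def __init__(self, in_dim=2, hidden=16, out_dim=1):
        super().__init__()
        self.fc1, self.act1 = nn.Linear(in_dim, hidden), LinearSplineActivation()
        self.fc2, self.act2 = nn.Linear(hidden, hidden), LinearSplineActivation()
        self.fc3 = nn.Linear(hidden, out_dim)

    def forward(self, x):
        return self.fc3(self.act2(self.fc2(self.act1(self.fc1(x)))))

    def activation_l1(self):
        return sum(m.l1_penalty() for m in self.modules()
                   if isinstance(m, LinearSplineActivation))


# Usage: the training loss combines a data term with the l1 regularizer
# on the spline coefficients (lam is an illustrative weight).
model = SplineNet()
x, y = torch.randn(32, 2), torch.randn(32, 1)
lam = 1e-3
loss = nn.functional.mse_loss(model(x), y) + lam * model.activation_l1()
loss.backward()
```

In this toy parameterization, sparsity of the coefficients a_k translates into activations with few active knots, which is the mechanism the abstract refers to when it mentions learning sparse nonlinearities.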


Published in:
IEEE Transactions on Signal Processing, 68, 4688-4699
Year:
2020
Publisher:
IEEE (Institute of Electrical and Electronics Engineers), Piscataway, NJ
ISSN:
1053-587X
1941-0476

