Deep Spline Networks With Control Of Lipschitz Regularity

The motivation for this work is to improve the performance of deep neural networks by optimizing the individual activation functions. Since this results in an infinite-dimensional optimization problem, we resolve the ambiguity by searching for the sparsest and most regular solution in the Lipschitz sense. To that end, we first introduce a bound that relates the properties of the pointwise nonlinearities to the global Lipschitz constant of the network. Using the proposed bound as a regularizer, we then derive a representer theorem showing that the optimal configuration is achieved by a deep spline network: a variant of a conventional deep ReLU network in which each activation function is a piecewise-linear spline with adaptive knots. The practical interest is that the underlying spline activations can be expressed as linear combinations of ReLU units and optimized with ℓ1-minimization techniques.
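The parameterization described in the abstract — a piecewise-linear spline activation written as a linear combination of shifted ReLU units, with an ℓ1 penalty on the coefficients promoting sparsity (few active knots) — can be illustrated with a minimal NumPy sketch. The function and parameter names below are hypothetical, chosen for this illustration rather than taken from the paper's implementation:

```python
import numpy as np

def deep_spline_activation(x, knots, coeffs, bias=0.0, slope=0.0):
    """Evaluate a piecewise-linear spline activation expressed as
        sigma(x) = bias + slope * x + sum_k coeffs[k] * max(x - knots[k], 0),
    i.e. a linear term plus a linear combination of shifted ReLU units,
    one per knot (illustrative parameterization, not the paper's code)."""
    x = np.asarray(x, dtype=float)
    # Shape (..., K): one ReLU term per knot, broadcast over the input.
    relu_terms = np.maximum(x[..., None] - np.asarray(knots, dtype=float), 0.0)
    return bias + slope * x + relu_terms @ np.asarray(coeffs, dtype=float)

def l1_penalty(coeffs, lam=1.0):
    # Sparsity-promoting regularizer on the ReLU coefficients; driving
    # coefficients to zero removes the corresponding knots.
    return lam * np.abs(np.asarray(coeffs, dtype=float)).sum()

# Example: knots at -1, 0, 1 with coefficients (1, -2, 1) yield a "hat"
# function that rises on [-1, 0], falls on [0, 1], and is zero elsewhere.
sigma_at_0 = deep_spline_activation(0.0, [-1.0, 0.0, 1.0], [1.0, -2.0, 1.0])
sigma_at_2 = deep_spline_activation(2.0, [-1.0, 0.0, 1.0], [1.0, -2.0, 1.0])
```

During training, such coefficients would be optimized jointly with the network weights under the ℓ1 penalty, which is what makes the optimal activations both sparse (few knots) and of controlled variation.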


Published in:
2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 3242-3246
Presented at:
44th IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Brighton, England, May 12-17, 2019
Year:
2019
Publisher:
New York, IEEE
ISSN:
1520-6149
ISBN:
978-1-4799-8131-1




 Record created 2019-09-26, last modified 2019-09-27

