Title: Approximation of Lipschitz Functions Using Deep Spline Neural Networks
Authors: Neumayer, Sebastian; Goujon, Alexis; Bohra, Pakshal; Unser, Michael
Published: 2023-01-01 (deposited 2023-08-28)
DOI: 10.1137/22M1504573
Repository: https://infoscience.epfl.ch/handle/20.500.14299/200063
Web of Science: WOS:001043311200002
Type: Journal article (research article)
Subjects: Mathematics, Applied; Mathematics
Keywords: deep learning; learnable activations; universality; robustness; Lipschitz continuity; linear splines

Abstract: Although Lipschitz-constrained neural networks have many applications in machine learning, the design and training of expressive Lipschitz-constrained networks is very challenging. Since the popular rectified linear unit (ReLU) networks have provable disadvantages in this setting, we propose using learnable spline activation functions with at least three linear regions instead. We prove that our choice is universal among all componentwise 1-Lipschitz activation functions in the sense that no other weight-constrained architecture can approximate a larger class of functions. Additionally, our choice is at least as expressive as the recently introduced non-componentwise GroupSort activation function for spectral-norm-constrained weights. The theoretical findings of this paper are consistent with previously published numerical results.
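
To make the construction in the abstract concrete, here is a minimal PyTorch sketch, not the authors' implementation, of one 1-Lipschitz layer: a spectral-norm-constrained linear map followed by a learnable, componentwise linear-spline activation with three regions whose slopes are squashed into (-1, 1). The parameter names (t1, t2, raw_slopes) and the specific parametrization are hypothetical choices made for this illustration.

```python
# Sketch (illustrative, not the paper's code): a 1-Lipschitz layer built from
# a spectral-norm-constrained weight matrix and a learnable three-region
# continuous piecewise-linear activation.

import torch
import torch.nn as nn

class ThreeRegionSpline(nn.Module):
    """Learnable continuous piecewise-linear activation with three regions.

    Each region's slope is constrained to (-1, 1) via tanh, so the
    activation is componentwise 1-Lipschitz by construction.
    """
    def __init__(self):
        super().__init__()
        self.t1 = nn.Parameter(torch.tensor(-1.0))      # left knot (hypothetical init)
        self.t2 = nn.Parameter(torch.tensor(1.0))       # right knot
        self.raw_slopes = nn.Parameter(torch.zeros(3))  # unconstrained slope parameters

    def forward(self, x):
        s = torch.tanh(self.raw_slopes)          # slopes in (-1, 1) => 1-Lipschitz
        t1 = torch.minimum(self.t1, self.t2)     # keep knots ordered
        t2 = torch.maximum(self.t1, self.t2)
        # Sum of three clamped ramps: continuous, with slope s[0] left of t1,
        # s[1] between the knots, and s[2] right of t2.
        left = s[0] * torch.clamp(x - t1, max=0.0)
        mid = s[1] * (torch.clamp(x, min=t1, max=t2) - t1)
        right = s[2] * torch.clamp(x - t2, min=0.0)
        return left + mid + right

# Composition of a spectral-norm-constrained linear map (||W||_2 <= 1) with a
# componentwise 1-Lipschitz activation is itself 1-Lipschitz.
layer = nn.Sequential(
    nn.utils.parametrizations.spectral_norm(nn.Linear(16, 16)),
    ThreeRegionSpline(),
)

x = torch.randn(8, 16)
y = layer(x)  # forward pass through the 1-Lipschitz layer
```

Stacking such layers yields a weight-constrained network of the kind discussed in the paper; the key point the abstract makes is that giving the spline at least three learnable linear regions, rather than using a fixed ReLU, is what makes this architecture universal among componentwise 1-Lipschitz activations.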