Abstract

Deep neural networks can be fragile: small input perturbations may cause significant changes in the output. In this letter, we employ contraction theory to improve the robustness of neural ODEs (NODEs). A dynamical system is contractive if all solutions with different initial conditions converge to each other exponentially fast; as a consequence, the effect of perturbations in the initial conditions fades over time. Since in NODEs the input data corresponds to the initial condition of a dynamical system, we show that contractivity can mitigate the effect of input perturbations. More precisely, inspired by NODEs with Hamiltonian dynamics, we propose a class of contractive Hamiltonian NODEs (CH-NODEs). By suitably setting a single scalar parameter, CH-NODEs are contractive by design and can be trained using standard backpropagation. Moreover, CH-NODEs enjoy built-in guarantees of non-exploding gradients, which ensure a well-posed training process. Finally, we demonstrate the robustness of CH-NODEs on the MNIST image classification problem with noisy test data.
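To make the contraction property concrete, the following is the standard formulation from contraction theory (the overshoot constant c and rate \lambda are generic symbols, not notation from this letter): for any two solutions x_a(t) and x_b(t) of the same contractive dynamics \dot{x} = f(x, t),

\[
\|x_a(t) - x_b(t)\| \le c\, e^{-\lambda t}\, \|x_a(0) - x_b(0)\|, \qquad \forall t \ge 0,\ c \ge 1,\ \lambda > 0.
\]

Since a NODE encodes the input as the initial condition x(0) and reads the output from the terminal state x(T), this bound implies that an input perturbation \delta is attenuated to at most c\, e^{-\lambda T}\,\|\delta\| at the output of the ODE block, which is the mechanism behind the robustness claimed above.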
