Jones, Corinne; Roulet, Vincent; Harchaoui, Zaid

Dates: 2023-03-13; 2023-02-10
DOI: 10.1080/10618600.2022.2163649
Handle: https://infoscience.epfl.ch/handle/20.500.14299/195748
Web of Science: WOS:000936815900001

Abstract: Convolutional neural networks, like most artificial neural networks, are frequently viewed as methods different in essence from kernel-based methods. In this work we translate several classical convolutional neural networks into kernel-based counterparts. Each kernel-based counterpart is a statistical model called a convolutional kernel network, with parameters that can be learned from data. We provide an alternating minimization algorithm with mini-batch sampling and implicit partial differentiation to learn from data the parameters of each convolutional kernel network. We also show how to obtain inexact derivatives with respect to the parameters using an algorithm based on two intertwined Newton iterations. The models and the algorithms are illustrated on benchmark datasets in image classification. We find that the convolutional neural networks and their kernel counterparts often perform similarly. Supplementary materials and code for the article are available online.

Subjects: Statistics & Probability; Mathematics
Keywords: artificial neural networks; kernel-based methods; reproducing kernels; reverse-mode automatic differentiation; smoothing splines; stochastic optimization; mechanism; model

Title: Revisiting Convolutional Neural Networks from the Viewpoint of Kernel-Based Methods
Type: text::journal::journal article::research article
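The implicit partial differentiation mentioned in the abstract can be illustrated with a minimal sketch. This is not the paper's method: the toy inner objective, function names, and constants below are all hypothetical. The idea shown is generic — solve an inner minimization by Newton iteration, then differentiate its solution with respect to an outer parameter via the implicit function theorem rather than by unrolling the iterations.

```python
# Toy inner problem: z*(w) = argmin_z g(z, w) with
#   g(z, w) = 0.5*z**2 + 0.1*z**4 - w*z        (hypothetical objective)
# Stationarity: dg/dz = z + 0.4*z**3 - w = 0.
# Implicit function theorem: dz*/dw = -(d2g/dz2)^{-1} * (d2g/dzdw),
# and here d2g/dzdw = -1, so dz*/dw = 1 / (d2g/dz2) at z = z*(w).

def g_grad(z, w):
    # dg/dz
    return z + 0.4 * z**3 - w

def g_hess(z, w):
    # d2g/dz2 (strictly positive, so g is strictly convex in z)
    return 1.0 + 1.2 * z**2

def solve_inner(w, iters=50):
    # Newton iteration on the stationarity condition dg/dz = 0
    z = 0.0
    for _ in range(iters):
        z -= g_grad(z, w) / g_hess(z, w)
    return z

def implicit_dz_dw(w):
    # Differentiate the solution map w -> z*(w) implicitly:
    # no unrolling of the Newton iterations is needed.
    z = solve_inner(w)
    return 1.0 / g_hess(z, w)

if __name__ == "__main__":
    w = 2.0
    analytic = implicit_dz_dw(w)
    eps = 1e-6
    numeric = (solve_inner(w + eps) - solve_inner(w - eps)) / (2 * eps)
    print(abs(analytic - numeric) < 1e-6)
```

The same pattern scales to vector-valued parameters, where the Hessian inverse is applied by an iterative linear solver instead of a division; this is the sense in which the derivatives become inexact.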