Neural Tangent Kernel: Convergence and Generalization in Neural Networks (Invited Paper)
The Neural Tangent Kernel is a new way to understand gradient descent in deep neural networks, connecting them with kernel methods. In this talk, I will introduce this formalism, present a number of results on the Neural Tangent Kernel, and explain how they give us insight into the dynamics of neural networks during training and into their generalization properties.
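The kernel the abstract refers to can be computed empirically: for a network f with parameters θ, the entry Θ(x1, x2) is the inner product of the parameter gradients ∇θf(x1) · ∇θf(x2). A minimal illustrative sketch (not from the record; the toy network, its width, and all names are hypothetical), using finite-difference gradients for simplicity:

```python
import numpy as np

D_IN, WIDTH = 3, 64  # hypothetical toy dimensions

def init_params(seed=0):
    # One-hidden-layer network, NTK-style 1/sqrt(fan-in) scaling.
    rng = np.random.default_rng(seed)
    W1 = rng.standard_normal((WIDTH, D_IN)) / np.sqrt(D_IN)
    w2 = rng.standard_normal(WIDTH) / np.sqrt(WIDTH)
    return np.concatenate([W1.ravel(), w2])

def f(theta, x):
    # Scalar-output network f(theta, x) = w2 . tanh(W1 x).
    W1 = theta[: WIDTH * D_IN].reshape(WIDTH, D_IN)
    w2 = theta[WIDTH * D_IN :]
    return w2 @ np.tanh(W1 @ x)

def param_grad(theta, x, eps=1e-5):
    # Central finite differences of f with respect to every parameter.
    g = np.zeros_like(theta)
    for i in range(theta.size):
        tp, tm = theta.copy(), theta.copy()
        tp[i] += eps
        tm[i] -= eps
        g[i] = (f(tp, x) - f(tm, x)) / (2 * eps)
    return g

theta = init_params()
x1 = np.array([1.0, 0.0, -1.0])
x2 = np.array([0.5, 1.0, 0.0])
g1, g2 = param_grad(theta, x1), param_grad(theta, x2)

ntk_12 = g1 @ g2  # empirical NTK entry Theta(x1, x2)
ntk_11 = g1 @ g1  # diagonal entry Theta(x1, x1), always >= 0
```

In the infinite-width limit this empirical kernel concentrates around a fixed deterministic kernel and stays (nearly) constant during training, which is what lets the training dynamics be analyzed as kernel regression.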
Type
conference paper
Web of Science ID
WOS:000810492500004
Authors
Publication date
2021-01-01
Publisher
Published in
STOC '21: Proceedings of the 53rd Annual ACM SIGACT Symposium on Theory of Computing
ISBN of the book
978-1-4503-8053-9
Publisher place
New York
Series title/Series vol.
Annual ACM Symposium on Theory of Computing
Start page
6
End page
6
Peer reviewed
REVIEWED
EPFL units
Event name | Event place | Event date |
STOC '21 | ELECTR NETWORK | Jun 21-25, 2021 |
Available on Infoscience
July 18, 2022