Infoscience
EPFL, École polytechnique fédérale de Lausanne
Conference paper

Oblivious Sketching of High-Degree Polynomial Kernels

Ahle, Thomas D. • Kapralov, Michael • Knudsen, Jakob B. T. • Pagh, Rasmus • Velingker, Ameya • Woodruff, David P. • Zandieh, Amir
January 1, 2020
Proceedings of the Thirty-First Annual ACM-SIAM Symposium on Discrete Algorithms (SODA '20)
31st Annual ACM-SIAM Symposium on Discrete Algorithms (SODA)

Kernel methods are fundamental tools in machine learning that allow detection of non-linear dependencies between data without explicitly constructing feature vectors in high dimensional spaces. A major disadvantage of kernel methods is their poor scalability: primitives such as kernel PCA or kernel ridge regression generally take prohibitively large quadratic space and (at least) quadratic time, as kernel matrices are usually dense. Some methods for speeding up kernel linear algebra are known, but they all invariably take time exponential in either the dimension of the input point set (e.g., fast multipole methods suffer from the curse of dimensionality) or in the degree of the kernel function. Oblivious sketching has emerged as a powerful approach to speeding up numerical linear algebra over the past decade, but our understanding of oblivious sketching solutions for kernel matrices has remained quite limited, suffering from the aforementioned exponential dependence on input parameters. Our main contribution is a general method for applying sketching solutions developed in numerical linear algebra over the past decade to a tensoring of data points without forming the tensoring explicitly. This leads to the first oblivious sketch for the polynomial kernel with a target dimension that is only polynomially dependent on the degree of the kernel function, as well as the first oblivious sketch for the Gaussian kernel on bounded datasets that does not suffer from an exponential dependence on the dimensionality of input data points.

Details
Type
conference paper
DOI
10.1137/1.9781611975994.9
Web of Science ID
WOS:000554408100009

Author(s)
Ahle, Thomas D.
Kapralov, Michael  
Knudsen, Jakob B. T.
Pagh, Rasmus
Velingker, Ameya
Woodruff, David P.
Zandieh, Amir  
Date Issued
2020-01-01
Publisher
Association for Computing Machinery
Publisher place
New York
Published in
Proceedings of the Thirty-First Annual ACM-SIAM Symposium on Discrete Algorithms (SODA '20)
Start page
141
End page
160

Subjects
approximation • moments
Editorial or Peer reviewed
REVIEWED

Written at
EPFL
EPFL units
THL4
Event name
31st Annual ACM-SIAM Symposium on Discrete Algorithms (SODA)
Event place
Salt Lake City, UT
Event date
Jan 05-08, 2020

Available on Infoscience
August 21, 2020
Use this identifier to reference this record
https://infoscience.epfl.ch/handle/20.500.14299/171018
Contact: infoscience@epfl.ch


Infoscience is a service managed and provided by the Library and IT Services of EPFL. © EPFL, all rights reserved.