conference paper

Efficiently Learning Fourier Sparse Set Functions

Amrollahi, Andisheh
•
Zandieh, Amir
•
Kapralov, Michael
•
Krause, Andreas
January 1, 2019
Advances In Neural Information Processing Systems 32 (Nips 2019)
33rd Conference on Neural Information Processing Systems (NeurIPS)

Learning set functions is a key challenge arising in many domains, ranging from sketching graphs to black-box optimization with discrete parameters. In this paper we consider the problem of efficiently learning set functions that are defined over a ground set of size n and that are sparse (say k-sparse) in the Fourier domain. This is a wide class that includes graph and hypergraph cut functions, decision trees and more. Our central contribution is the first algorithm that allows learning functions whose Fourier support only contains low-degree (say degree d = o(n)) polynomials using O(kd log n) sample complexity and runtime O(kn log² k log n log d). This implies that sparse graphs with k edges can, for the first time, be learned from O(k log n) observations of cut values and in time linear in the number of vertices. Our algorithm can also efficiently learn (sums of) decision trees of small depth. The algorithm exploits techniques from the sparse Fourier transform literature and is easily implementable. Lastly, we also develop an efficient robust version of our algorithm and prove ℓ₂/ℓ₂ approximation guarantees without any statistical assumptions on the noise.
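The abstract's claim that graph cut functions are Fourier-sparse and low-degree can be checked directly on a toy instance. The sketch below is not the paper's algorithm (which avoids touching all 2^n subsets); it brute-forces the Walsh-Hadamard transform of a small cut function and verifies that the support has only 1 + k nonzero coefficients, all of degree at most 2. The graph and the size n = 4 are hypothetical choices for illustration.

```python
n = 4                               # ground set size (toy example, not from the paper)
edges = [(0, 1), (1, 2), (2, 3)]    # a sparse graph with k = 3 edges

def cut(mask):
    """Cut value of the vertex subset encoded by bitmask `mask`."""
    return sum(1 for (u, v) in edges if (mask >> u & 1) != (mask >> v & 1))

N = 2 ** n
f = [cut(x) for x in range(N)]      # the set function on all 2^n subsets

# Fourier (Walsh-Hadamard) coefficient at frequency T (also a bitmask):
#   f_hat[T] = 2^-n * sum_x f(x) * (-1)^{|x AND T|}
f_hat = [sum(f[x] * (-1) ** bin(x & t).count("1") for x in range(N)) / N
         for t in range(N)]

support = [t for t in range(N) if abs(f_hat[t]) > 1e-9]
degrees = [bin(t).count("1") for t in support]
print(len(support), max(degrees))   # prints "4 2": 1 + k coefficients, degree <= 2
```

Each edge (u, v) contributes (1 − χ_{u}χ_{v})/2 to the cut, so the Fourier support is exactly the empty set plus one degree-2 monomial per edge, which is what makes O(k log n) observations plausible for recovering a k-edge graph.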

Type
conference paper
Web of Science ID

WOS:000535866906074

Author(s)
Amrollahi, Andisheh
•
Zandieh, Amir  
•
Kapralov, Michael
•
Krause, Andreas
Date Issued

2019-01-01

Publisher

NEURAL INFORMATION PROCESSING SYSTEMS (NIPS)

Publisher place

La Jolla

Published in
Advances In Neural Information Processing Systems 32 (Nips 2019)
Series title/Series vol.

Advances in Neural Information Processing Systems

Volume

32

Subjects

Computer Science, Artificial Intelligence

•

Computer Science

Peer reviewed

REVIEWED

Written at

EPFL

EPFL units
THL4  
Event name

33rd Conference on Neural Information Processing Systems (NeurIPS)

Event place

Vancouver, CANADA

Event date

Dec 08-14, 2019

Available on Infoscience
July 10, 2020
Use this identifier to reference this record
https://infoscience.epfl.ch/handle/20.500.14299/169960