Infoscience

working paper

Trace norm regularization for multi-task learning with scarce data

Boursier, Etienne • Konobeev, Mikhail • Flammarion, Nicolas

2022

Multi-task learning leverages structural similarities between multiple tasks to learn despite very few samples. Motivated by the recent success of neural networks applied to data-scarce tasks, we consider a linear low-dimensional shared representation model. Despite an extensive literature, existing theoretical results either guarantee weak estimation rates or require a large number of samples per task. This work provides the first estimation error bound for the trace norm regularized estimator when the number of samples per task is small. The advantages of trace norm regularization for learning data-scarce tasks extend to meta-learning and are confirmed empirically on synthetic datasets.
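The estimator described in the abstract can be illustrated with a minimal sketch: multi-task linear regression where the task weight vectors form the columns of a matrix W, fit by proximal gradient descent with singular value thresholding (the proximal operator of the trace norm). All names, the step size, and the synthetic rank-1 setup below are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def svt(W, tau):
    # Singular value thresholding: proximal operator of tau * ||W||_*
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def trace_norm_multitask(Xs, ys, lam, lr=0.01, n_iters=500):
    """Minimize sum_t ||X_t w_t - y_t||^2 / (2 n_t) + lam * ||W||_*
    by proximal gradient descent; column t of W is the weight vector of task t."""
    d, T = Xs[0].shape[1], len(Xs)
    W = np.zeros((d, T))
    for _ in range(n_iters):
        G = np.zeros_like(W)
        for t, (X, y) in enumerate(zip(Xs, ys)):
            G[:, t] = X.T @ (X @ W[:, t] - y) / len(y)   # per-task least-squares gradient
        W = svt(W - lr * G, lr * lam)                    # gradient step, then trace norm prox
    return W

# Synthetic data-scarce setting: a shared one-dimensional representation,
# many tasks, and fewer samples per task than the ambient dimension.
rng = np.random.default_rng(0)
d, T, n = 20, 30, 5                                      # dimension, tasks, samples per task
u = rng.normal(size=d); u /= np.linalg.norm(u)
W_star = np.outer(u, rng.normal(size=T))                 # rank-1 ground-truth weight matrix
Xs = [rng.normal(size=(n, d)) for _ in range(T)]
ys = [X @ W_star[:, t] + 0.1 * rng.normal(size=n) for t, X in enumerate(Xs)]

W_hat = trace_norm_multitask(Xs, ys, lam=0.1)
```

Per-task least squares is ill-posed here (n = 5 samples in dimension d = 20); the trace norm penalty couples the tasks and biases the estimate toward low-rank matrices such as W_star.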

Files
Name: boursier22a.pdf
Version: Published version
Access type: restricted
License condition: copyright
Size: 446.19 KB
Format: Adobe PDF
Checksum (MD5): 56e15f18169ad769907335ace49d6bbe

Contact: infoscience@epfl.ch


Infoscience is a service managed and provided by the Library and IT Services of EPFL. © EPFL, all rights reserved