conference paper

Vision Transformer Adapters for Generalizable Multitask Learning

Bhattacharjee, Deblina • Süsstrunk, Sabine • Salzmann, Mathieu

2023

Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)
IEEE/CVF International Conference on Computer Vision (ICCV)

We introduce the first multitasking vision transformer adapters that learn generalizable task affinities which can be applied to novel tasks and domains. Integrated into an off-the-shelf vision transformer backbone, our adapters can simultaneously solve multiple dense vision tasks in a parameter-efficient manner, unlike existing multitasking transformers that are parametrically expensive. In contrast to concurrent methods, we do not require retraining or fine-tuning whenever a new task or domain is added. We introduce a task-adapted attention mechanism within our adapter framework that combines gradient-based task similarities with attention-based ones. The learned task affinities generalize to the following settings: zero-shot task transfer, unsupervised domain adaptation, and generalization without fine-tuning to novel domains. We demonstrate that our approach outperforms not only the existing convolutional neural network-based multitasking methods but also the vision transformer-based ones. Our project page is at https://ivrl.github.io/VTAGML.
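To make the adapter idea more concrete, the sketch below shows a minimal PyTorch-style bottleneck adapter that mixes per-task features through a learned task-affinity matrix before adding them back to frozen ViT token features. This is an illustrative assumption based only on the abstract, not the authors' implementation: the names MultitaskAdapter, task_affinity, and bottleneck are hypothetical, and the paper's actual task-adapted attention (which combines gradient-based and attention-based task similarities) is not reproduced here. See https://ivrl.github.io/VTAGML for the authors' code.

import torch
import torch.nn as nn
import torch.nn.functional as F

class MultitaskAdapter(nn.Module):
    # Bottleneck adapter that shares information across tasks via a learned
    # task-affinity matrix (a hypothetical stand-in for the paper's
    # task-adapted attention mechanism).
    def __init__(self, dim, num_tasks, bottleneck=64):
        super().__init__()
        self.down = nn.ModuleList([nn.Linear(dim, bottleneck) for _ in range(num_tasks)])
        self.up = nn.Linear(bottleneck, dim)
        self.task_affinity = nn.Parameter(torch.zeros(num_tasks, num_tasks))

    def forward(self, x, task_id):
        # x: (batch, tokens, dim) token features from a frozen ViT backbone.
        feats = torch.stack([F.gelu(proj(x)) for proj in self.down])  # (tasks, B, T, bottleneck)
        w = torch.softmax(self.task_affinity[task_id], dim=-1)        # affinity over tasks
        mixed = (w.view(-1, 1, 1, 1) * feats).sum(dim=0)              # (B, T, bottleneck)
        return x + self.up(mixed)                                     # residual adapter output

# Toy usage: one adapter serves several dense tasks on shared backbone features.
tokens = torch.randn(2, 197, 768)                 # stand-in for ViT token features
adapter = MultitaskAdapter(dim=768, num_tasks=4)
depth_feats = adapter(tokens, task_id=0)          # e.g. input to a depth head
seg_feats = adapter(tokens, task_id=1)            # e.g. input to a segmentation head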

Files
  • Name: 10950-main.pdf
  • Type: Publisher
  • Version: http://purl.org/coar/version/c_970fb48d4fbd8a85
  • Access type: openaccess
  • License Condition: MIT License
  • Size: 1.71 MB
  • Format: Adobe PDF
  • Checksum (MD5): 4bbc1de10a4ed6b33faa0bd84ac6065e

  • Name: 10950-supp.pdf
  • Type: Publisher
  • Version: http://purl.org/coar/version/c_970fb48d4fbd8a85
  • Access type: openaccess
  • License Condition: MIT License
  • Size: 3.75 MB
  • Format: Adobe PDF
  • Checksum (MD5): c07ca9bd22cbe9286a304dda2bb3fa7b
