Infoscience

conference paper

Lightweight Cross-Lingual Sentence Representation Learning

Mao, Zhuoyuan
•
Gupta, Prakhar  
•
Chu, Chenhui
•
Jaggi, Martin
•
Kurohashi, Sadao
January 1, 2021
59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (ACL-IJCNLP 2021)
Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics (ACL) / 11th International Joint Conference on Natural Language Processing (IJCNLP) / 6th Workshop on Representation Learning for NLP (RepL4NLP)

Large-scale models for learning fixed-dimensional cross-lingual sentence representations like LASER (Artetxe and Schwenk, 2019b) lead to significant improvement in performance on downstream tasks. However, further increases and modifications based on such large-scale models are usually impractical due to memory limitations. In this work, we introduce a lightweight dual-transformer architecture with just 2 layers for generating memory-efficient cross-lingual sentence representations. We explore different training tasks and observe that current cross-lingual training tasks leave a lot to be desired for this shallow architecture. To ameliorate this, we propose a novel cross-lingual language model, which combines the existing single-word masked language model with the newly proposed cross-lingual token-level reconstruction task. We further augment the training task by the introduction of two computationally-lite sentence-level contrastive learning tasks to enhance the alignment of cross-lingual sentence representation space, which compensates for the learning bottleneck of the lightweight transformer for generative tasks. Our comparisons with competing models on cross-lingual sentence retrieval and multilingual document classification confirm the effectiveness of the newly proposed training tasks for a shallow model.
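The abstract describes a shallow (2-layer) dual-transformer encoder trained with a combined masked-language-model / token-level reconstruction objective plus sentence-level contrastive alignment. As an illustration only, the following minimal PyTorch sketch shows two of those ingredients: a 2-layer transformer encoder that mean-pools token states into a fixed-dimensional sentence vector, and an in-batch contrastive loss that pulls parallel sentence pairs together. This is not the paper's released code; the class and function names, dimensions, and exact loss formulation are assumptions made for the example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class LightweightEncoder(nn.Module):
    """Hypothetical 2-layer transformer encoder with mean pooling (illustrative sketch)."""

    def __init__(self, vocab_size, d_model=512, nhead=8, num_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)

    def forward(self, token_ids, attention_mask):
        # attention_mask: 1 for real tokens, 0 for padding.
        hidden = self.encoder(
            self.embed(token_ids),
            src_key_padding_mask=~attention_mask.bool(),
        )
        # Mean-pool over non-padding positions to get a fixed-dimensional sentence vector.
        mask = attention_mask.unsqueeze(-1).float()
        return (hidden * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)


def contrastive_alignment_loss(src_vec, tgt_vec, temperature=0.05):
    """In-batch contrastive loss: each source sentence should be closest to its
    own translation among all target sentences in the batch (and vice versa)."""
    src = F.normalize(src_vec, dim=-1)
    tgt = F.normalize(tgt_vec, dim=-1)
    logits = src @ tgt.t() / temperature  # pairwise cosine similarities
    labels = torch.arange(src.size(0), device=src.device)
    # Symmetric cross-entropy over both retrieval directions.
    return (F.cross_entropy(logits, labels) + F.cross_entropy(logits.t(), labels)) / 2
```

In a full training step such a loss would be combined with the generative objectives mentioned in the abstract; the paper's specific single-word masked language model, cross-lingual token-level reconstruction task, and loss weighting are not reproduced here.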

Details
Type
conference paper
DOI
10.18653/v1/2021.acl-long.226
Web of Science ID

WOS:000698679200026

Author(s)
Mao, Zhuoyuan
•
Gupta, Prakhar  
•
Chu, Chenhui
•
Jaggi, Martin  
•
Kurohashi, Sadao
Date Issued

2021-01-01

Publisher

Association for Computational Linguistics (ACL)

Publisher place

Stroudsburg

Journal
59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (ACL-IJCNLP 2021)
ISBN of the book

978-1-954085-52-7

Volume

1

Start page

2902

End page

2913

Subjects

Computer Science, Artificial Intelligence

•

Computer Science, Interdisciplinary Applications

•

Linguistics

•

Computer Science

Peer reviewed

REVIEWED

Written at

EPFL

EPFL units
MLO  
Event name
Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics (ACL) / 11th International Joint Conference on Natural Language Processing (IJCNLP) / 6th Workshop on Representation Learning for NLP (RepL4NLP)
Event place
Online
Event date
Aug 01-06, 2021

Available on Infoscience
December 4, 2021
Use this identifier to reference this record
https://infoscience.epfl.ch/handle/20.500.14299/183530