Infoscience

conference paper

The EsnTorch Library: Efficient Implementation of Transformer-based Echo State Networks

Cabessa, Jeremie • Hernault, Hugo • Lamonato, Yves • Rochat, Mathieu • Levy, Yariv Z.

April 15, 2023
Neural Information Processing: 29th International Conference, ICONIP 2022, Virtual Event, November 22–26, 2022, Proceedings
29th International Conference on Neural Information Processing

Transformer-based models have revolutionized NLP, but they are generally highly resource-consuming. In this context, several reservoir computing approaches to NLP have shown promising results. We propose EsnTorch, a library that implements echo state networks (ESNs) with transformer-based embeddings for text classification. EsnTorch is developed in PyTorch, optimized to work on GPU, and compatible with the transformers and datasets libraries from Hugging Face, the major data science platform for NLP. Accordingly, our library can make use of all the models and datasets available from Hugging Face. A transformer-based ESN implemented in EsnTorch consists of four building blocks: (1) an embedding layer, which uses a transformer-based model to embed the input texts; (2) a reservoir layer, which implements one of three kinds of reservoirs: recurrent, linear, or null; (3) a pooling layer, which offers three pooling strategies: mean, last, or None; (4) a learning algorithm block, which provides six different supervised learning algorithms. Overall, this work falls within the context of sustainable models for NLP.
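The four building blocks can be illustrated with a minimal NumPy sketch. This is a conceptual toy, not the EsnTorch API: random token vectors stand in for transformer embeddings, the reservoir is the recurrent kind, pooling is mean, and the readout is closed-form ridge regression (one of many possible supervised learning algorithms). All names here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM, N_RES = 16, 64

# Reservoir weights are random and fixed: an ESN never trains them.
W_in = 0.1 * rng.normal(size=(N_RES, DIM))
W = rng.normal(size=(N_RES, N_RES))
W *= 0.9 / np.abs(np.linalg.eigvals(W)).max()  # spectral radius < 1 (echo state property)

def embed(label, n_tokens=10):
    """(1) Embedding-layer stand-in: random token vectors, shifted per class."""
    return rng.normal(size=(n_tokens, DIM)) + (2.0 if label else -2.0)

def pooled_state(embeddings):
    """(2) Recurrent reservoir x_t = tanh(W_in u_t + W x_{t-1}),
    then (3) mean pooling over the token states."""
    x = np.zeros(N_RES)
    states = []
    for u in embeddings:
        x = np.tanh(W_in @ u + W @ x)
        states.append(x)
    return np.mean(states, axis=0)

def ridge_fit(X, Y, alpha=1.0):
    """(4) Learning-algorithm block: closed-form ridge regression readout."""
    return np.linalg.solve(X.T @ X + alpha * np.eye(X.shape[1]), X.T @ Y)

# Toy two-class "text classification" run.
labels = np.array([0] * 20 + [1] * 20)
X = np.array([pooled_state(embed(lbl)) for lbl in labels])
Y = np.eye(2)[labels]                     # one-hot targets
W_out = ridge_fit(X, Y)
preds = (X @ W_out).argmax(axis=1)
print("training accuracy:", (preds == labels).mean())
```

Only the readout `W_out` is learned; the random reservoir acts as a fixed nonlinear feature expansion of the embeddings, which is why such models are far cheaper to train than a fine-tuned transformer.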

Type
conference paper
DOI
10.1007/978-981-99-1648-1_20
Web of Science ID

WOS:001420267300020

Author(s)
Cabessa, Jeremie (Czech Academy of Sciences)
Hernault, Hugo (Playtika Ltd)
Lamonato, Yves (Playtika Ltd)
Rochat, Mathieu (École Polytechnique Fédérale de Lausanne)
Levy, Yariv Z. (Playtika Ltd)

Editors
Tanveer, M • Agarwal, S • Ozawa, S • Ekbal, A • Jatowt, A
Date Issued

2023-04-15

Publisher

Springer Nature

Publisher place

Singapore

Published in
Neural Information Processing: 29th International Conference, ICONIP 2022, Virtual Event, November 22–26, 2022, Proceedings
ISBN of the book

978-981-99-1647-4

978-981-99-1648-1

Book part title

Part VII

Series title/Series vol.

Communications in Computer and Information Science; 1794

ISSN (of the series)

1865-0929

1865-0937

Start page

235

End page

246

Subjects

reservoir computing

•

echo state networks

•

natural language processing (NLP)

•

text classification

•

transformers

•

BERT

•

python library

•

Hugging Face

Editorial or Peer reviewed

REVIEWED

Written at

EPFL

EPFL units
EPFL  
Event name: 29th International Conference on Neural Information Processing
Event acronym: ICONIP 2022
Event place: Indore, India
Event date: 2022-11-22 - 2022-11-26

Funder / Funding(s): Czech Science Foundation, AppNeCo
Grant Number: RVO: 67985807

Available on Infoscience
May 27, 2025
Use this identifier to reference this record
https://infoscience.epfl.ch/handle/20.500.14299/250748
  • Contact: infoscience@epfl.ch

Infoscience is a service managed and provided by the Library and IT Services of EPFL. © EPFL, all rights reserved