The EsnTorch Library: Efficient Implementation of Transformer-based Echo State Networks
Transformer-based models have revolutionized NLP, but they are generally highly resource-intensive. Motivated by this consideration, several reservoir computing approaches to NLP have shown promising results. In this context, we propose EsnTorch, a library that implements echo state networks (ESNs) with transformer-based embeddings for text classification. EsnTorch is developed in PyTorch, optimized to run on GPU, and compatible with the transformers and datasets libraries from Hugging Face, the major data science platform for NLP. Accordingly, our library can make use of all the models and datasets available on Hugging Face. A transformer-based ESN implemented in EsnTorch consists of four building blocks: (1) an embedding layer, which uses a transformer-based model to embed the input texts; (2) a reservoir layer, which implements three kinds of reservoirs: recurrent, linear, or null; (3) a pooling layer, which offers three pooling strategies: mean, last, or none; and (4) a learning algorithm block, which provides six different supervised learning algorithms. Overall, this work falls within the context of sustainable models for NLP.
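The four-block pipeline described in the abstract can be illustrated with a minimal PyTorch sketch. This is not the EsnTorch API itself: the dimensions, the random inputs standing in for transformer token embeddings, and the ridge-regression readout (one common ESN learning algorithm) are all assumptions made for illustration.

```python
import torch

torch.manual_seed(0)

# Hypothetical dimensions; in practice the inputs would be transformer
# token embeddings (e.g. 768-dim BERT vectors), random here for brevity.
embed_dim, reservoir_dim, num_classes = 32, 100, 3

# Recurrent reservoir: fixed random weights, never trained.
W_in = torch.empty(reservoir_dim, embed_dim).uniform_(-0.1, 0.1)
W_res = torch.empty(reservoir_dim, reservoir_dim).uniform_(-1.0, 1.0)
# Rescale to spectral radius 0.9 so the echo state property holds.
W_res *= 0.9 / torch.linalg.eigvals(W_res).abs().max()

def reservoir_states(tokens):
    """Run one token-embedding sequence of shape (T, embed_dim) through
    the recurrent reservoir, returning all states, shape (T, reservoir_dim)."""
    x = torch.zeros(reservoir_dim)
    states = []
    for u in tokens:
        x = torch.tanh(W_in @ u + W_res @ x)
        states.append(x)
    return torch.stack(states)

def mean_pool(states):
    # "Mean" pooling strategy: average the reservoir states over time.
    return states.mean(dim=0)

# Toy dataset: 20 random "sentences" of 10 tokens with random labels.
X = torch.stack([mean_pool(reservoir_states(torch.randn(10, embed_dim)))
                 for _ in range(20)])
y = torch.randint(0, num_classes, (20,))

# Closed-form ridge-regression readout: the only trained component.
Y = torch.nn.functional.one_hot(y, num_classes).float()
lam = 1.0
W_out = torch.linalg.solve(X.T @ X + lam * torch.eye(reservoir_dim), X.T @ Y)
preds = (X @ W_out).argmax(dim=1)
```

Only the readout is fitted; the embedding and reservoir stay frozen, which is what makes ESN training cheap compared with fine-tuning the full transformer.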
WOS:001420267300020
Czech Academy of Sciences
Playtika Ltd
Playtika Ltd
École Polytechnique Fédérale de Lausanne
Playtika Ltd
2023-04-15
Singapore
978-981-99-1647-4
978-981-99-1648-1
Part VII
Communications in Computer and Information Science; 1794
1865-0929
1865-0937
235
246
REVIEWED
EPFL
| Event name | Event acronym | Event place | Event date |
|---|---|---|---|
| International Conference on Neural Information Processing | ICONIP 2022 | Indore, India | 2022-11-22 - 2022-11-26 |