Conference paper
The Unstoppable Rise of Computational Linguistics in Deep Learning
2020
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
In this paper, we trace the history of neural networks applied to natural language understanding tasks, and identify key contributions which the nature of language has made to the development of neural network architectures. We focus on the importance of variable binding and its instantiation in attention-based models, and argue that the Transformer is not a sequence model but an induced-structure model. This perspective leads to predictions of the challenges facing research in deep learning architectures for natural language understanding.
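The abstract's central claim, that attention instantiates variable binding, can be illustrated with a minimal sketch (not from the paper; the function and the toy vectors below are our own illustration): scaled dot-product attention acts as a soft key-value lookup, so each query "binds" to the value whose key it most closely matches.

```python
# Illustrative sketch of attention as soft variable binding (our example,
# not code from the paper): a query retrieves the value whose key it matches,
# analogous to binding a variable to its filler.
import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention.
    Shapes: Q (n_q, d), K (n_kv, d), V (n_kv, d_v)."""
    scores = Q @ K.T / np.sqrt(K.shape[-1])                   # query-key similarity
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)   # softmax over keys
    return weights @ V                                        # each query's bound value

# A query that strongly matches the second key retrieves (mostly) the second
# value -- a soft, differentiable form of variable binding.
K = np.array([[1.0, 0.0], [0.0, 1.0]])
V = np.array([[10.0, 0.0], [0.0, 10.0]])
Q = np.array([[0.0, 5.0]])   # strongly matches key 2
print(attention(Q, K, V))    # approx [[0.3, 9.7]]: mass concentrated on value 2
```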
Name: 2020.acl-main.561.pdf
Type: Publisher's version
Access type: open access
License Condition: CC BY
Size: 306.19 KB
Format: Adobe PDF
Checksum (MD5): a83eac421189f5fa97e76d6b5c45e599