Infoscience
EPFL, École polytechnique fédérale de Lausanne
conference paper

SKILL: Structured Knowledge Infusion for Large Language Models

Moiseev, Fedor • Dong, Zhe • Alfonseca, Enrique • Jaggi, Martin
January 1, 2022
NAACL 2022: The 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Conference of the North American Chapter of the Association for Computational Linguistics (NAACL): Human Language Technologies

Large language models (LLMs) have demonstrated human-level performance on a vast spectrum of natural language tasks. However, it is largely unexplored whether they can better internalize knowledge from structured data, such as a knowledge graph, or from text. In this work, we propose a method to infuse structured knowledge into LLMs by directly training T5 models on factual triples of knowledge graphs (KGs). We show that models pre-trained on the Wikidata KG with our method outperform the T5 baselines on FreebaseQA and WikiHop, as well as on the Wikidata-answerable subsets of TriviaQA and NaturalQuestions. Models pre-trained on factual triples perform competitively with those pre-trained on natural-language sentences containing the same knowledge. Trained on a smaller KG, WikiMovies, we observed a 3x improvement in exact-match score on the MetaQA task over the T5 baseline. The proposed method has the advantage that no alignment between the knowledge graph and a text corpus is required when curating training data. This makes our method particularly useful when working with industry-scale knowledge graphs.
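The core idea described in the abstract — training a text-to-text model directly on KG triples, with no KG-to-text alignment — can be illustrated by serializing each triple into a masked (source, target) pair in the style of T5's sentinel tokens. The function name, serialization format, and sample triples below are illustrative assumptions for a minimal sketch, not the authors' exact preprocessing code.

```python
# Hypothetical sketch: turning knowledge-graph triples into text-to-text
# training pairs. The model sees the subject and relation and must
# recover the masked object, so the factual knowledge is what it learns.

def triple_to_example(subject, relation, obj):
    """Serialize one (subject, relation, object) triple into a
    T5-style (source, target) pair using a sentinel mask."""
    source = f"{subject} {relation} <extra_id_0>"
    target = f"<extra_id_0> {obj}"
    return source, target

# A few Wikidata-like triples (illustrative data, not from the paper).
triples = [
    ("Lausanne", "country", "Switzerland"),
    ("EPFL", "located in", "Lausanne"),
]

for src, tgt in (triple_to_example(*t) for t in triples):
    print(src, "->", tgt)
```

Because each example is built from a single triple, curating training data needs only the KG itself — no parallel text corpus — which is the property the abstract highlights for industry-scale graphs.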

Details
Type
conference paper
DOI
10.18653/v1/2022.naacl-main.113
Web of Science ID

WOS:000859869501048

Author(s)
Moiseev, Fedor
Dong, Zhe
Alfonseca, Enrique
Jaggi, Martin  
Date Issued

2022-01-01

Publisher

Association for Computational Linguistics (ACL)

Publisher place

Stroudsburg

Published in
NAACL 2022: The 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
ISBN of the book

978-1-955917-71-1

Start page

1581

End page

1588

Subjects

Computer Science, Artificial Intelligence • Computer Science, Interdisciplinary Applications • Linguistics • Computer Science

Editorial or Peer reviewed

REVIEWED

Written at

EPFL

EPFL units
MLO  
Event name: Conference of the North American Chapter of the Association for Computational Linguistics (NAACL): Human Language Technologies
Event place: Seattle, WA
Event date: Jul 10-15, 2022

Available on Infoscience
November 21, 2022
Use this identifier to reference this record
https://infoscience.epfl.ch/handle/20.500.14299/192369
Contact: infoscience@epfl.ch
Infoscience is a service managed and provided by the Library and IT Services of EPFL. © EPFL, all rights reserved.