Article: Relational Memory-Augmented Language Models

Title: Relational Memory-Augmented Language Models
Authors: Liu, Qi; Yogatama, Dani; Blunsom, Phil
Issue Date: 2022
Citation: Transactions of the Association for Computational Linguistics, 2022, v. 10, p. 555-572
Abstract: We present a memory-augmented approach to condition an autoregressive language model on a knowledge graph. We represent the graph as a collection of relation triples and retrieve relevant relations for a given context to improve text generation. Experiments on the WikiText-103, WMT19, and enwik8 English datasets demonstrate that our approach produces a better language model in terms of perplexity and bits per character. We also show that relational memory improves coherence, is complementary to token-based memory, and enables causal interventions. Our model provides a simple yet effective way to combine an autoregressive language model and a knowledge graph for more coherent and logical generation.
Persistent Identifier: http://hdl.handle.net/10722/321997
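
The abstract describes a retrieve-and-condition loop: relation triples relevant to the current context are retrieved from a knowledge graph and the autoregressive language model is conditioned on them. The sketch below illustrates that idea in minimal form; the names (Triple, retrieve_relations, build_model_input), the lexical-overlap relevance score, and the prompt format are illustrative assumptions, not the paper's implementation, which would use learned representations for retrieval and conditioning.

```python
# Minimal sketch of conditioning an LM on retrieved relation triples.
# All names and the scoring scheme are hypothetical, for illustration only.
from dataclasses import dataclass


@dataclass(frozen=True)
class Triple:
    subject: str
    relation: str
    obj: str

    def surface(self) -> str:
        # Linearize the triple so it can be fed to a text model.
        return f"{self.subject} {self.relation} {self.obj}"


def score_triple(context_tokens: set[str], triple: Triple) -> int:
    # Toy relevance score: lexical overlap between the context and the
    # triple's surface form. A real system would score with learned
    # embeddings rather than word overlap.
    return len(context_tokens & set(triple.surface().lower().split()))


def retrieve_relations(context: str, graph: list[Triple], k: int = 3) -> list[Triple]:
    # Return the k triples most relevant to the current context.
    tokens = set(context.lower().split())
    ranked = sorted(graph, key=lambda t: score_triple(tokens, t), reverse=True)
    return ranked[:k]


def build_model_input(context: str, graph: list[Triple]) -> str:
    # Prepend the retrieved relations to the context so an autoregressive
    # LM is conditioned on them -- one simple way to realize a
    # "relational memory".
    memory = " ; ".join(t.surface() for t in retrieve_relations(context, graph))
    return f"[MEMORY] {memory} [CONTEXT] {context}"


if __name__ == "__main__":
    kg = [
        Triple("Alan Turing", "born in", "London"),
        Triple("Alan Turing", "field", "computer science"),
        Triple("London", "capital of", "United Kingdom"),
    ]
    print(build_model_input("Turing was born in", kg))
```

Because the memory is an explicit set of triples, editing or removing a triple directly changes what the model is conditioned on, which is what makes the causal interventions mentioned in the abstract possible.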

 

DC Field | Value | Language
dc.contributor.author | Liu, Qi | -
dc.contributor.author | Yogatama, Dani | -
dc.contributor.author | Blunsom, Phil | -
dc.date.accessioned | 2022-11-03T02:22:54Z | -
dc.date.available | 2022-11-03T02:22:54Z | -
dc.date.issued | 2022 | -
dc.identifier.citation | Transactions of the Association for Computational Linguistics, 2022, v. 10, p. 555-572 | -
dc.identifier.uri | http://hdl.handle.net/10722/321997 | -
dc.description.abstract | We present a memory-augmented approach to condition an autoregressive language model on a knowledge graph. We represent the graph as a collection of relation triples and retrieve relevant relations for a given context to improve text generation. Experiments on the WikiText-103, WMT19, and enwik8 English datasets demonstrate that our approach produces a better language model in terms of perplexity and bits per character. We also show that relational memory improves coherence, is complementary to token-based memory, and enables causal interventions. Our model provides a simple yet effective way to combine an autoregressive language model and a knowledge graph for more coherent and logical generation. | -
dc.language | eng | -
dc.relation.ispartof | Transactions of the Association for Computational Linguistics | -
dc.title | Relational Memory-Augmented Language Models | -
dc.type | Article | -
dc.description.nature | link_to_subscribed_fulltext | -
dc.identifier.doi | 10.1162/tacl_a_00476 | -
dc.identifier.scopus | eid_2-s2.0-85132612784 | -
dc.identifier.volume | 10 | -
dc.identifier.spage | 555 | -
dc.identifier.epage | 572 | -
dc.identifier.eissn | 2307-387X | -
