Conference Paper: Episodic memory in lifelong language learning

Title: Episodic memory in lifelong language learning
Authors: de Masson D'Autume, Cyprien; Ruder, Sebastian; Kong, Lingpeng; Yogatama, Dani
Issue Date: 2019
Citation: 33rd Conference on Neural Information Processing Systems (NeurIPS 2019), Vancouver, Canada, 8-14 December 2019. In Advances in Neural Information Processing Systems 32 (NeurIPS 2019), 2019
Abstract: © 2019 Neural information processing systems foundation. All rights reserved. We introduce a lifelong language learning setup where a model needs to learn from a stream of text examples without any dataset identifier. We propose an episodic memory model that performs sparse experience replay and local adaptation to mitigate catastrophic forgetting in this setup. Experiments on text classification and question answering demonstrate the complementary benefits of sparse experience replay and local adaptation, which allow the model to continuously learn from new datasets. We also show that the space complexity of the episodic memory module can be reduced significantly (~50-90%) by randomly choosing which examples to store in memory, with only a minimal decrease in performance. We consider an episodic memory component a crucial building block of general linguistic intelligence and see our model as a first step in that direction.
Persistent Identifier: http://hdl.handle.net/10722/296219
ISSN: 1049-5258
2020 SCImago Journal Rankings: 1.399
ISI Accession Number ID: WOS:000535866904074
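The abstract above compresses three mechanisms: an episodic memory that is written to during training, sparse experience replay every so many new examples, and local adaptation on retrieved neighbours at inference time. Below is a minimal Python sketch of that loop, under stated assumptions. It is not the authors' code: model.train_step, model.encode, and model.predict are hypothetical interfaces standing in for the paper's encoder-based model, and every hyperparameter value (write_prob, replay_every, k, adapt_steps) is illustrative rather than taken from the paper.

import copy
import random

import numpy as np


class EpisodicMemory:
    """Key-value episodic memory over past training examples.

    Simplified sketch: the paper keys memory with encoder features of
    the input; here a key is just any fixed-size vector.
    """

    def __init__(self, write_prob=0.5):
        # Storing each example with probability < 1 mirrors the paper's
        # random write selection, which shrinks memory ~50-90%.
        self.write_prob = write_prob
        self.keys = []
        self.examples = []

    def write(self, key, example):
        if random.random() < self.write_prob:
            self.keys.append(np.asarray(key, dtype=np.float32))
            self.examples.append(example)

    def sample(self, n):
        """Uniformly sample up to n stored examples (for replay)."""
        n = min(n, len(self.examples))
        idx = np.random.choice(len(self.examples), size=n, replace=False)
        return [self.examples[i] for i in idx]

    def nearest(self, query, k):
        """Return the k stored examples whose keys are closest to query."""
        keys = np.stack(self.keys)
        dist = np.linalg.norm(keys - np.asarray(query, dtype=np.float32), axis=1)
        return [self.examples[i] for i in np.argsort(dist)[:k]]


def lifelong_train(model, stream, memory, replay_every=100, replay_batch=32):
    """Train on a stream without dataset identifiers, replaying sparsely."""
    for step, (x, y) in enumerate(stream, start=1):
        model.train_step(x, y)                 # ordinary update on the new example
        memory.write(model.encode(x), (x, y))  # maybe store it
        if step % replay_every == 0 and memory.examples:
            # Sparse experience replay: only occasionally revisit memory.
            for xr, yr in memory.sample(replay_batch):
                model.train_step(xr, yr)


def predict_with_local_adaptation(model, x, memory, k=32, adapt_steps=5):
    """Fine-tune a throwaway copy on x's neighbours, then predict.

    The paper additionally regularizes the adapted weights toward the
    base weights; that term is omitted here for brevity.
    """
    local = copy.deepcopy(model)  # base parameters stay untouched
    neighbours = memory.nearest(model.encode(x), k)
    for _ in range(adapt_steps):
        for xn, yn in neighbours:
            local.train_step(xn, yn)
    return local.predict(x)

Used this way, the base model sees each streamed example once, replay touches memory only every replay_every steps, and local adaptation never modifies the base parameters, which is what lets the two mechanisms complement each other.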

 

DC Field | Value | Language
dc.contributor.author | de Masson D'Autume, Cyprien | -
dc.contributor.author | Ruder, Sebastian | -
dc.contributor.author | Kong, Lingpeng | -
dc.contributor.author | Yogatama, Dani | -
dc.date.accessioned | 2021-02-11T04:53:05Z | -
dc.date.available | 2021-02-11T04:53:05Z | -
dc.date.issued | 2019 | -
dc.identifier.citation | 33rd Conference on Neural Information Processing Systems (NeurIPS 2019), Vancouver, Canada, 8-14 December 2019. In Advances in Neural Information Processing Systems 32 (NeurIPS 2019), 2019 | -
dc.identifier.issn | 1049-5258 | -
dc.identifier.uri | http://hdl.handle.net/10722/296219 | -
dc.description.abstract | © 2019 Neural information processing systems foundation. All rights reserved. We introduce a lifelong language learning setup where a model needs to learn from a stream of text examples without any dataset identifier. We propose an episodic memory model that performs sparse experience replay and local adaptation to mitigate catastrophic forgetting in this setup. Experiments on text classification and question answering demonstrate the complementary benefits of sparse experience replay and local adaptation to allow the model to continuously learn from new datasets. We also show that the space complexity of the episodic memory module can be reduced significantly (~50-90%) by randomly choosing which examples to store in memory with a minimal decrease in performance. We consider an episodic memory component as a crucial building block of general linguistic intelligence and see our model as a first step in that direction. | -
dc.language | eng | -
dc.relation.ispartof | Advances in Neural Information Processing Systems 32 (NeurIPS 2019) | -
dc.title | Episodic memory in lifelong language learning | -
dc.type | Conference_Paper | -
dc.description.nature | link_to_OA_fulltext | -
dc.identifier.scopus | eid_2-s2.0-85090174020 | -
dc.identifier.isi | WOS:000535866904074 | -
dc.identifier.issnl | 1049-5258 | -
