
Conference Paper: A Contrastive Framework for Neural Text Generation

Title: A Contrastive Framework for Neural Text Generation
Authors: Su, Y; Lan, T; Wang, Y; Yogatama, D; Kong, L; Collier, N
Keywords: Open-ended Text Generation; Decoding Method; Contrastive Learning
Issue Date: 2022
Publisher: Curran Associates, Inc.
Citation: Thirty-Sixth Conference on Neural Information Processing Systems (NeurIPS) (Hybrid), New Orleans, Louisiana, United States of America, November 28-December 9, 2022. In Advances in Neural Information Processing Systems 35 (NeurIPS 2022)
Abstract: Text generation is of great importance to many natural language processing applications. However, maximization-based decoding methods (e.g., beam search) of neural language models often lead to degenerate solutions---the generated text is unnatural and contains undesirable repetitions. Existing approaches introduce stochasticity via sampling or modify training objectives to decrease the probabilities of certain tokens (e.g., unlikelihood training). However, they often lead to solutions that lack coherence. In this work, we show that an underlying reason for model degeneration is the anisotropic distribution of token representations. We present a contrastive solution: (i) SimCTG, a contrastive training objective to calibrate the model's representation space, and (ii) a decoding method---contrastive search---to encourage diversity while maintaining coherence in the generated text. Extensive experiments and analyses on three benchmarks from two languages demonstrate that our proposed approach outperforms state-of-the-art text generation methods as evaluated by both human and automatic metrics.
Persistent Identifier: http://hdl.handle.net/10722/318256
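
For readers of this record, the following is a minimal PyTorch sketch of the two components the abstract names---the SimCTG contrastive objective and the contrastive search decoding rule---written from the abstract's description rather than the authors' released code. The function names, tensor shapes, `margin=0.5`, and `alpha=0.6` are illustrative assumptions, not values taken from this record.

```python
import torch
import torch.nn.functional as F


def simctg_contrastive_loss(hidden: torch.Tensor, margin: float = 0.5) -> torch.Tensor:
    """Token-level contrastive loss (a sketch of the paper's L_CL).

    hidden: (seq_len, dim) token representations of one training sequence.
    Requires s(h_i, h_i) to exceed s(h_i, h_j) by at least `margin`,
    pushing distinct token representations apart so the representation
    space becomes less anisotropic.
    """
    h = F.normalize(hidden, dim=-1)   # cosine similarity via dot products
    sim = h @ h.t()                   # (seq_len, seq_len) similarity matrix
    n = sim.size(0)
    # max(0, margin - s(h_i, h_i) + s(h_i, h_j)) over all pairs i != j;
    # after normalization, s(h_i, h_i) == 1.
    hinge = (margin - sim.diag().unsqueeze(1) + sim).clamp(min=0)
    off_diag = ~torch.eye(n, dtype=torch.bool)
    return hinge[off_diag].sum() / (n * (n - 1))


def contrastive_search_step(cand_probs, cand_hidden, prev_hidden, cand_ids,
                            alpha: float = 0.6):
    """One decoding step of contrastive search (sketch).

    cand_probs:  (k,)     model confidence p(v | x_<t) of the top-k candidates
    cand_hidden: (k, d)   candidate representations from a look-ahead pass
    prev_hidden: (t-1, d) representations of the tokens generated so far
    cand_ids:    (k,)     vocabulary ids of the candidates
    """
    # Degeneration penalty: a candidate's maximum cosine similarity to any
    # previously generated token representation.
    sim = F.cosine_similarity(cand_hidden.unsqueeze(1),
                              prev_hidden.unsqueeze(0), dim=-1)  # (k, t-1)
    penalty = sim.max(dim=1).values
    # Trade off model confidence against the degeneration penalty.
    score = (1.0 - alpha) * cand_probs - alpha * penalty
    return cand_ids[score.argmax()]


# Toy shapes only; in practice these tensors come from a language model,
# and cand_hidden requires one extra forward pass per candidate (omitted here).
k, t, d = 8, 5, 16
loss = simctg_contrastive_loss(torch.randn(t, d))
next_token = contrastive_search_step(torch.rand(k).softmax(-1),
                                     torch.randn(k, d),
                                     torch.randn(t - 1, d),
                                     torch.arange(k))
```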


DC Field | Value | Language
dc.contributor.author | Su, Y | -
dc.contributor.author | Lan, T | -
dc.contributor.author | Wang, Y | -
dc.contributor.author | Yogatama, D | -
dc.contributor.author | Kong, L | -
dc.contributor.author | Collier, N | -
dc.date.accessioned | 2022-10-07T10:35:30Z | -
dc.date.available | 2022-10-07T10:35:30Z | -
dc.date.issued | 2022 | -
dc.identifier.citation | Thirty-Sixth Conference on Neural Information Processing Systems (NeurIPS) (Hybrid), New Orleans, Louisiana, United States of America, November 28-December 9, 2022. In Advances in Neural Information Processing Systems 35 (NeurIPS 2022) | -
dc.identifier.uri | http://hdl.handle.net/10722/318256 | -
dc.description.abstract | Text generation is of great importance to many natural language processing applications. However, maximization-based decoding methods (e.g., beam search) of neural language models often lead to degenerate solutions---the generated text is unnatural and contains undesirable repetitions. Existing approaches introduce stochasticity via sampling or modify training objectives to decrease the probabilities of certain tokens (e.g., unlikelihood training). However, they often lead to solutions that lack coherence. In this work, we show that an underlying reason for model degeneration is the anisotropic distribution of token representations. We present a contrastive solution: (i) SimCTG, a contrastive training objective to calibrate the model's representation space, and (ii) a decoding method---contrastive search---to encourage diversity while maintaining coherence in the generated text. Extensive experiments and analyses on three benchmarks from two languages demonstrate that our proposed approach outperforms state-of-the-art text generation methods as evaluated by both human and automatic metrics. | -
dc.language | eng | -
dc.publisher | Curran Associates, Inc. | -
dc.subject | Open-ended Text Generation | -
dc.subject | Decoding Method | -
dc.subject | Contrastive Learning | -
dc.title | A Contrastive Framework for Neural Text Generation | -
dc.type | Conference_Paper | -
dc.identifier.email | Kong, L: lpk@cs.hku.hk | -
dc.identifier.authority | Kong, L=rp02775 | -
dc.identifier.hkuros | 337876 | -
dc.publisher.place | United States | -
