
postgraduate thesis: Enhanced neural machine translation with external resources

Title: Enhanced neural machine translation with external resources
Authors: Chen, Guanhua (陳冠華)
Advisors: Pan, J; Wang, WP
Issue Date: 2022
Publisher: The University of Hong Kong (Pokfulam, Hong Kong)
Citation: Chen, G. [陳冠華]. (2022). Enhanced neural machine translation with external resources. (Thesis). University of Hong Kong, Pokfulam, Hong Kong SAR.
Abstract: Neural machine translation (NMT) is the task of translating a source-language sentence into a target language with neural networks. NMT has achieved superior performance to conventional machine translation approaches and has become the dominant commercial translation technology, owing to its impressive performance as well as its advantages in training and deployment. However, NMT still faces several challenges. First, due to its end-to-end nature, it is hard to incorporate lexical constraints, i.e., pre-specified translation fragments, into NMT; such constraints are useful for domain adaptation and interactive NMT. Second, as a data-driven approach, adequate training of an NMT model requires large-scale parallel datasets, so for low-resource language pairs where parallel sentences are limited, NMT performance degrades significantly. In this thesis, we aim to enhance NMT with external resources such as lexical constraints and a multilingual pretrained encoder. We discuss the following topics thoroughly: (1) Incorporating lexical constraints into NMT. We propose two lexically constrained NMT methods, one based on data augmentation and the other on a novel decoding algorithm. The former trains the NMT model on training data augmented with sampled target phrases as constraints, while the latter modifies the decoding algorithm using more accurate word alignment. (2) Incorporating a multilingual pretrained encoder into NMT. We first propose SixT, a zero-shot multilingual NMT model that incorporates a multilingual pretrained encoder into NMT with a position-disentangled encoder and a capacity-enhanced decoder. SixT is trained with a novel two-stage transferability-enhanced training framework. We then extend SixT to multilingual fine-tuning and propose SixT+.
SixT+ is not only a multilingual NMT model but can also serve as a pretrained model for downstream cross-lingual text generation tasks, such as unsupervised machine translation for extremely low-resource languages and zero-shot cross-lingual abstractive summarization. We compare the proposed models against baselines on various datasets and language pairs in extensive experiments. The results show that our approaches significantly outperform the baselines, demonstrating that we effectively incorporate external resources to improve neural machine translation. Moreover, we conduct a series of in-depth analyses to better understand the proposed methods. All code for these works is publicly available.
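The data-augmentation approach summarized in the abstract can be illustrated with a minimal sketch: sampled target phrases are spliced into the source sentence as inline constraints, so the model learns to copy them into its output. The tag tokens (`<sep>`, `</sep>`), the `phrase_table` lookup, and the sampling policy below are illustrative assumptions for exposition, not the thesis's actual implementation.

```python
import random

def augment_with_constraints(src_tokens, phrase_table, max_constraints=2, seed=0):
    """Annotate up to `max_constraints` source tokens with a sampled
    target phrase, wrapped in tag tokens, to form an augmented source
    sentence for lexically constrained NMT training (a rough sketch)."""
    rng = random.Random(seed)
    # Positions whose source token has a known target phrase.
    candidates = [i for i, t in enumerate(src_tokens) if t in phrase_table]
    chosen = set(rng.sample(candidates, min(max_constraints, len(candidates))))
    out = []
    for i, tok in enumerate(src_tokens):
        if i in chosen:
            # Keep the source token and append its target phrase as a constraint.
            out.extend([tok, "<sep>"] + phrase_table[tok] + ["</sep>"])
        else:
            out.append(tok)
    return out
```

For example, augmenting the German source "das Haus ist grün" with the constraint "Haus → house" yields `["das", "Haus", "<sep>", "house", "</sep>", "ist", "grün"]`; training on such pairs encourages the decoder to emit the tagged phrase verbatim.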
Degree: Doctor of Philosophy
Subjects: Neural networks (Computer science); Machine translating
Dept/Program: Computer Science
Persistent Identifier: http://hdl.handle.net/10722/318423


DC Field | Value | Language
dc.contributor.advisor | Pan, J | -
dc.contributor.advisor | Wang, WP | -
dc.contributor.author | Chen, Guanhua | -
dc.contributor.author | 陳冠華 | -
dc.date.accessioned | 2022-10-10T08:18:57Z | -
dc.date.available | 2022-10-10T08:18:57Z | -
dc.date.issued | 2022 | -
dc.identifier.citation | Chen, G. [陳冠華]. (2022). Enhanced neural machine translation with external resources. (Thesis). University of Hong Kong, Pokfulam, Hong Kong SAR. | -
dc.identifier.uri | http://hdl.handle.net/10722/318423 | -
dc.description.abstract | (same as Abstract above) | -
dc.language | eng | -
dc.publisher | The University of Hong Kong (Pokfulam, Hong Kong) | -
dc.relation.ispartof | HKU Theses Online (HKUTO) | -
dc.rights | The author retains all proprietary rights (such as patent rights) and the right to use in future works. | -
dc.rights | This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License. | -
dc.subject.lcsh | Neural networks (Computer science) | -
dc.subject.lcsh | Machine translating | -
dc.title | Enhanced neural machine translation with external resources | -
dc.type | PG_Thesis | -
dc.description.thesisname | Doctor of Philosophy | -
dc.description.thesislevel | Doctoral | -
dc.description.thesisdiscipline | Computer Science | -
dc.description.nature | published_or_final_version | -
dc.date.hkucongregation | 2022 | -
dc.identifier.mmsid | 991044600200403414 | -
